We demonstrate that the problem of approximately interpolating a target function by a neural network is computationally intractable. In particular, the interpolation training problem for a neural network with two monotone Lipschitzian sigmoidal internal activation functions and one linear output node is shown to be NP-hard, and NP-complete if the internal nodes are, in addition, piecewise ratios of polynomials. This partially answers a question of Blum and Rivest concerning the NP-completeness of training a logistic sigmoidal 3-node network. An extension of the result is then given for networks with n monotone sigmoidal internal nodes and one convex output node. This indicates that many multivariate nonlinear regression problems may be computationally infeasible.
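For concreteness, the following is a plausible form of the 3-node hypothesis class referred to above; the parameter names and the exact statement of the decision problem are illustrative assumptions, not notation taken from the result itself:

% A sketch of the 3-node network class (two sigmoidal internal
% nodes, one linear output node); parameter names are illustrative.
\[
  f(x) \;=\; c_0 \;+\; c_1\,\sigma\!\bigl(a_1^{\top} x + b_1\bigr)
         \;+\; c_2\,\sigma\!\bigl(a_2^{\top} x + b_2\bigr),
\]
where $\sigma$ is a monotone Lipschitzian sigmoid (for example the logistic function $\sigma(t) = 1/(1+e^{-t})$), $a_1, a_2 \in \mathbb{R}^d$, and $b_i, c_i \in \mathbb{R}$. Under these assumptions, the interpolation training problem asks, given data $(x_j, y_j)$ and a tolerance $\varepsilon \ge 0$, whether parameters exist such that $\lvert f(x_j) - y_j \rvert \le \varepsilon$ for all $j$.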