Based on the self-consistent signal-to-noise analysis (SCSNA), which can deal with analog neural networks having a wide class of transfer functions, enhancement of the storage capacity of associative memory and the related statistical properties of neural networks are studied for random memory patterns. Two types of transfer functions with a threshold parameter θ are considered; both are derived from the sigmoidal one to represent the output of three-state neurons. Networks with the monotonically increasing transfer function F_M, given by F_M(u) = sgn(u) for |u| > θ and F_M(u) = 0 for |u| ≤ θ, are shown to make it impossible for the spin-glass state to coexist with retrieval states in a certain region of the parameters θ and α (the loading rate of memory patterns), implying a reduction in the number of spurious states. The behavior of the storage capacity as θ is varied is qualitatively the same as that of Ising-spin neural networks with varying temperature. On the other hand, the nonmonotonic transfer function F_NM, given by F_NM(u) = sgn(u) for |u| < θ and F_NM(u) = 0 for |u| ≥ θ, gives rise to remarkable features in several respects. First, it yields a large enhancement of the storage capacity over the Amit-Gutfreund-Sompolinsky (AGS) value: as θ is decreased from θ = ∞, the storage capacity α_c increases from the AGS value (≈ 0.14) to attain its maximum of ≈ 0.42 at θ ≈ 0.7, and thereafter decreases to vanish at θ = 0.
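The two transfer functions defined above can be written as a short sketch (NumPy is assumed; the names F_M and F_NM follow the abstract, and the vectorized form is an illustrative choice, not taken from the paper):

```python
import numpy as np

def F_M(u, theta):
    # Monotonic three-state transfer function:
    # F_M(u) = sgn(u) for |u| > theta, F_M(u) = 0 for |u| <= theta.
    return np.where(np.abs(u) > theta, np.sign(u), 0.0)

def F_NM(u, theta):
    # Nonmonotonic transfer function: the output is reversed to zero
    # for large inputs, F_NM(u) = sgn(u) for |u| < theta, 0 for |u| >= theta.
    return np.where(np.abs(u) < theta, np.sign(u), 0.0)
```

Both functions accept scalars or arrays of local fields; the only difference is which side of the threshold θ is mapped to the zero (quiescent) state, which is what makes F_NM nonmonotonic.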
Whereas for θ ≳ 1 the storage capacity α_c coincides with the value α̃_c determined by the SCSNA as the upper bound of α ensuring the existence of retrieval solutions, for θ ≲ 1 the α_c is shown to differ from α̃_c, with the result that the retrieval solutions claimed by the SCSNA are unstable for α_c < α < α̃_c. Second, in the case of θ < 1 the network can exhibit a new type of phase, which appears as a result of a phase transition with respect to the non-Gaussian distribution of the local fields of the neurons: the standard type of retrieval state with r ≠ 0 (i.e., a finite width of the local-field distribution), which is implied by the order-parameter equations of the SCSNA, disappears at a certain critical loading rate α₀, and for α ≤ α₀ a qualitatively different type of retrieval state comes into existence, in which the width of the local-field distribution vanishes (i.e., r = 0⁺). As a consequence, memory retrieval without errors becomes possible even in the saturation limit α ≠ 0. Results of computer simulations of the statistical properties of the novel phase with α ≤ α₀ are shown to be in satisfactory agreement with the theoretical results. The effect of introducing self-couplings on the storage capacity is also analyzed for the two types of networks. The effect is conspicuous for networks with F_NM, where the self-couplings increase the stability of the SCSNA retrieval solutions at small values of θ, leading to a remarkable enhancement of the storage capacity.
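As a minimal illustration of the kind of retrieval dynamics discussed above (a sketch only, not the paper's simulation protocol: it assumes Hebbian couplings with zero self-couplings, synchronous updates, and a loading rate far below capacity), one can store random patterns and check that a stored pattern is recalled with overlap near one:

```python
import numpy as np

def hebbian_couplings(patterns):
    # Hebb rule J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, self-couplings set to zero.
    p, N = patterns.shape
    J = patterns.T @ patterns / N
    np.fill_diagonal(J, 0.0)
    return J

def F_M(u, theta):
    # Monotonic three-state transfer function from the abstract.
    return np.where(np.abs(u) > theta, np.sign(u), 0.0)

def retrieve(J, s0, transfer, theta, sweeps=20):
    # Synchronous dynamics s <- F(J s); returns the state after `sweeps` updates.
    s = s0.copy()
    for _ in range(sweeps):
        s = transfer(J @ s, theta)
    return s

rng = np.random.default_rng(0)
N, p = 400, 8                          # alpha = p/N = 0.02, well below capacity
xi = rng.choice([-1.0, 1.0], size=(p, N))
J = hebbian_couplings(xi)

s = retrieve(J, xi[0], F_M, theta=0.3)
m = xi[0] @ s / N                      # overlap with the stored pattern
print(m)
```

At this small α the local fields concentrate near ±1, so with θ = 0.3 the stored pattern is a stable fixed point of the F_M dynamics; replacing F_M by the nonmonotonic F_NM (and tuning θ) is how the threshold-dependent effects described in the abstract would be explored.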