We analyze the attractors of associative-memory neural networks in which analog neurons compete locally. These networks are well suited for a variety of feature-extraction, pattern-classification, and data-compression tasks. For networks storing a finite number of patterns, we present bifurcation diagrams for the pattern overlaps. For networks storing an extensive number of patterns, we present phase diagrams showing attractor types as a function of pattern-storage fraction and neuron-transfer-function steepness. We also report results for the storage capacity of k-winner associative memories in the limit of infinite neuron gain. Numerical investigations of computer-generated networks confirm the phase diagrams.
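To make the k-winner setting concrete, the following is a minimal illustrative sketch (not the paper's model): a Hebbian associative memory over sparse binary patterns whose dynamics let only the k neurons with the largest local fields fire, corresponding to the infinite-gain limit of local competition. All sizes, the mean-subtracted Hebbian rule, and the synchronous update are assumptions chosen for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P, k = 200, 5, 20  # neurons, stored patterns, winners per step (illustrative sizes)

# Random sparse binary patterns, each with exactly k active units
patterns = np.zeros((P, N))
for p in range(P):
    patterns[p, rng.choice(N, size=k, replace=False)] = 1.0

# Hebbian-style couplings on mean-subtracted patterns; no self-coupling
a = k / N                      # mean activity
X = patterns - a
W = X.T @ X / N
np.fill_diagonal(W, 0.0)

def kwta_update(s, steps=10):
    """Synchronous k-winner-take-all dynamics: at each step only the k
    neurons with the largest local fields are set active."""
    for _ in range(steps):
        h = W @ (s - a)                 # local fields
        winners = np.argsort(h)[-k:]    # indices of the k largest fields
        s = np.zeros(N)
        s[winners] = 1.0
    return s

# Recall test: corrupt stored pattern 0 by moving a few active units
cue = patterns[0].copy()
on = np.flatnonzero(cue)
off = np.flatnonzero(cue == 0)
flip = 5
cue[rng.choice(on, size=flip, replace=False)] = 0.0
cue[rng.choice(off, size=flip, replace=False)] = 1.0

recalled = kwta_update(cue)
overlap = (recalled @ patterns[0]) / k  # fraction of pattern-0 units recovered
print(f"overlap with stored pattern: {overlap:.2f}")
```

At this low storage load the dynamics retrieve the stored pattern from the corrupted cue; the paper's phase diagrams characterize how such retrieval attractors change as the storage fraction and transfer-function steepness vary.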