Ensembles of artificial neural networks (ANN) have been used in recent years as classification/regression machines, showing improved generalization capabilities that outperform those of single networks. However, it has been recognized that for aggregation to be effective the individual networks must be as accurate and diverse as possible. An important problem, then, is how to tune the aggregate members in order to achieve an optimal compromise between these two conflicting conditions. Recently, we proposed a new method for constructing ANN ensembles, termed here the Stepwise Ensemble Construction Algorithm (SECA), which leads to overtrained aggregate members with an adequate balance between accuracy and diversity. We present here a more extensive evaluation of SECA and discuss a potential problem with this algorithm: the infrequent but damaging selection, through its heuristic, of particularly bad ensemble members. We introduce a modified version of SECA that copes with this problem by allowing individual weighting of the aggregate members. Both the original algorithm and its weighted modification compare favorably against other methods, producing an improvement in performance on the standard statistical databases used as benchmarks.
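
To make the stepwise construction and the weighted modification concrete, below is a minimal sketch of a SECA-style greedy loop. All specifics are illustrative assumptions, not the paper's implementation: candidate members are represented by their held-out predictions, the ensemble output is a (weighted) average, selection greedily minimizes validation mean squared error, and the weighting scheme (weights inversely proportional to each member's individual validation error) is one plausible way to down-weight a bad member; the function names are hypothetical.

```python
# Illustrative sketch only; names and the exact selection/weighting
# criteria are assumptions, not the authors' implementation.
import numpy as np

def stepwise_select(candidate_preds, y_val, max_members=20):
    """Greedy forward selection: at each step, add the candidate whose
    inclusion most lowers the ensemble's validation MSE."""
    selected = []
    running_sum = np.zeros_like(y_val, dtype=float)
    best_mse = np.inf
    for _ in range(max_members):
        best_idx = None
        for i, preds in enumerate(candidate_preds):
            if i in selected:
                continue
            trial = (running_sum + preds) / (len(selected) + 1)
            mse = np.mean((trial - y_val) ** 2)
            if mse < best_mse:
                best_mse, best_idx = mse, i
        if best_idx is None:
            break  # no remaining candidate improves the ensemble
        selected.append(best_idx)
        running_sum += candidate_preds[best_idx]
    return selected, best_mse

def weighted_average(candidate_preds, selected, y_val, eps=1e-12):
    """Weighted aggregation: down-weight members with high individual
    validation error (one plausible scheme; the paper's may differ)."""
    errors = np.array([np.mean((candidate_preds[i] - y_val) ** 2)
                       for i in selected])
    weights = 1.0 / (errors + eps)
    weights /= weights.sum()
    members = np.stack([candidate_preds[i] for i in selected])
    return weights @ members  # (k,) @ (k, n) -> (n,) ensemble prediction

# Small synthetic demo: a pool of noisy "networks" predicting a target.
rng = np.random.default_rng(0)
y_val = np.sin(np.linspace(0.0, 3.0, 50))
pool = [y_val + rng.normal(0.0, 0.3, size=y_val.shape) for _ in range(15)]
members, mse = stepwise_select(pool, y_val)
ensemble_pred = weighted_average(pool, members, y_val)
```

The weighting step reflects the modification described above: rather than letting an occasionally selected bad member contribute equally to the average, its influence is reduced in proportion to its (assumed) individual validation error.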