Abstract
The Elman Neural Network (ENN) is a type of recurrent network that contains a context layer acting as an internal, self-referenced layer. The ENN is trained in a supervised manner using the popular backpropagation algorithm, based on the inputs and targets presented to the network. Several parameters, such as the initialization of the weights, the type of inputs, the number of hidden neurons, the learning rate and the momentum factor, influence the training behaviour of the network. No established formula guarantees that the network will converge to an optimum solution, converge quickly, or converge at all. If these parameters are poorly chosen, the network may take a long time to train, or in some cases it may never converge. In this work, the performance of the network is analyzed through extensive tests in which the values of the above parameters are varied to find the optimum conditions for learning. A digital system, a Binary-to-ASCII converter, is used as the test case for the experiments. The optimum conditions are presented in this paper.
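To make the architecture and the parameters under study concrete, the following is a minimal Python sketch of an Elman network with a copied-back context layer, trained by single-step backpropagation with a learning rate and momentum factor. All layer sizes, activation choices, weight ranges, and the tiny usage example are illustrative assumptions, not the implementation used in the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ElmanNetwork:
    """Sketch of an Elman network: input -> hidden (+ context) -> output."""

    def __init__(self, n_in, n_hidden, n_out, lr=0.1, momentum=0.9, seed=0):
        rng = np.random.default_rng(seed)
        # Small random weight initialization (one of the parameters studied).
        self.W_xh = rng.uniform(-0.5, 0.5, (n_hidden, n_in))
        self.W_ch = rng.uniform(-0.5, 0.5, (n_hidden, n_hidden))  # context -> hidden
        self.W_hy = rng.uniform(-0.5, 0.5, (n_out, n_hidden))
        self.b_h = np.zeros(n_hidden)
        self.b_y = np.zeros(n_out)
        self.lr, self.momentum = lr, momentum
        self.context = np.zeros(n_hidden)            # self-referenced context layer
        self.vel = {k: 0.0 for k in ("W_xh", "W_ch", "W_hy", "b_h", "b_y")}

    def forward(self, x):
        # Hidden layer sees the current input and the previous hidden state (context).
        self.h = sigmoid(self.W_xh @ x + self.W_ch @ self.context + self.b_h)
        self.y = sigmoid(self.W_hy @ self.h + self.b_y)
        return self.y

    def train_step(self, x, target):
        y = self.forward(x)
        # Backpropagation through the current step only; the copied context is
        # treated as a fixed extra input (classic Elman-style training).
        delta_y = (y - target) * y * (1 - y)
        delta_h = (self.W_hy.T @ delta_y) * self.h * (1 - self.h)
        grads = {
            "W_hy": np.outer(delta_y, self.h),
            "b_y": delta_y,
            "W_xh": np.outer(delta_h, x),
            "W_ch": np.outer(delta_h, self.context),
            "b_h": delta_h,
        }
        # Gradient descent with momentum (learning rate and momentum factor
        # are two of the parameters whose settings affect convergence).
        for name, g in grads.items():
            self.vel[name] = self.momentum * self.vel[name] - self.lr * g
            setattr(self, name, getattr(self, name) + self.vel[name])
        self.context = self.h.copy()                 # copy hidden state back
        return float(np.mean((y - target) ** 2))


# Illustrative usage: learn one binary-in -> binary-out association,
# loosely in the spirit of a Binary-to-ASCII mapping task.
net = ElmanNetwork(n_in=4, n_hidden=8, n_out=8, lr=0.2, momentum=0.8)
x = np.array([0, 0, 1, 1], dtype=float)                   # binary "3"
t = np.array([0, 0, 1, 1, 0, 0, 1, 1], dtype=float)       # ASCII '3' = 0x33
for epoch in range(1000):
    err = net.train_step(x, t)
```

Whether such a sketch converges, and how quickly, depends on the same choices the paper investigates: the weight initialization range, the number of hidden neurons, and the learning rate and momentum values passed to the constructor.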