This directory contains a set of Java sources that implement a simple multilayer perceptron (a fully connected feed-forward artificial neural network) with steepest descent training. The computations are based on the layer-wise matrix formulation documented, for example, in the article "Robust Formulations for Training Multilayer Perceptrons" by Tommi Kärkkäinen and Erkki Heikkola, published in Neural Computation, April 1, 2004.

This code was used as material in one lecture of the course "Konenäkö" ("Computer Vision") held at the Department of Mathematical Information Technology, University of Jyväskylä. The sources were written fairly quickly to give the students some hands-on exercises in the Java language. Beware of bugs and thinkos; I'm sure there are many!

Because the course was about computer vision and image processing, I included some tentative classes that make it a bit easier for the students to do simple feature extraction and to load the MNIST digit images (not included here; they are downloadable from the MNIST website). The MNIST data was reduced and converted for the course exercises in Octave using ``mnist_reduce.m``, which is included here for convenience.

Author: Paavo Nieminen
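
As a quick orientation to the layer-wise matrix idea mentioned above, here is a minimal, hypothetical forward-pass sketch. The class and method names are made up for illustration and do not correspond to the actual sources in this directory; each layer computes f(W * [1; o]) with the bias folded into the first column of the weight matrix::

    // Minimal sketch of a layer-wise MLP forward pass (illustration only,
    // not one of the exercise classes in this directory).
    public class MlpForwardSketch {

        // Hypothetical activation: hyperbolic tangent applied element-wise.
        static double[] activate(double[] v) {
            double[] out = new double[v.length];
            for (int i = 0; i < v.length; i++) out[i] = Math.tanh(v[i]);
            return out;
        }

        // Multiply a weight matrix W (rows x (1+cols)) with the bias-augmented
        // vector [1; x]; the first column of W holds the bias terms.
        static double[] affine(double[][] W, double[] x) {
            double[] out = new double[W.length];
            for (int r = 0; r < W.length; r++) {
                double sum = W[r][0]; // bias term
                for (int c = 0; c < x.length; c++) sum += W[r][c + 1] * x[c];
                out[r] = sum;
            }
            return out;
        }

        // Forward pass through all layers; the output layer is kept linear here.
        static double[] forward(double[][][] weights, double[] input) {
            double[] o = input;
            for (int l = 0; l < weights.length; l++) {
                double[] z = affine(weights[l], o);
                o = (l < weights.length - 1) ? activate(z) : z;
            }
            return o;
        }

        public static void main(String[] args) {
            // Tiny 2-3-1 network with arbitrary example weights.
            double[][][] weights = {
                { {0.1, 0.5, -0.3}, {0.0, 0.8, 0.2}, {-0.2, -0.4, 0.7} }, // hidden: 3 x (1+2)
                { {0.05, 0.6, -0.1, 0.3} }                                 // output: 1 x (1+3)
            };
            double[] y = forward(weights, new double[]{1.0, -1.0});
            System.out.println("Network output: " + y[0]);
        }
    }

Steepest descent training then repeatedly updates each weight matrix in the direction of the negative error gradient; see the cited article for the exact formulation used in the exercises.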