NOTE: This is version 0.42, the patch release for the first public release, which was version 0.40. It includes several updates and fixes: more 'intelligent' learning, a smaller memory footprint, working loading and saving of networks, internal support for strings as network maps and targets, integrated support for directly loading PCX-format bitmap files, and 11 new examples.

AI::NeuralNet::BackProp is a simple back-propagation, feed-forward neural network designed to learn using a generalization of the Delta rule and a bit of Hopfield theory. It is still in the beta stages. Be sure to check out the ./examples/ directory for 17 different example scripts using the AI::NeuralNet::BackProp network. Use it, and let me know what you all think.

This is just a ground-up write of a neural network; no code was stolen or anything else. It uses the -IDEA- of back-propagation for error correction, with the -IDEA- of the delta rule and Hopfield theory, as I understand them. So don't expect a classicist view of neural networking here. I simply wrote from operating theory, not math theory. Any die-hard neural networking gurus out there? Let me know how far off I am with this code! :-)

Thank you all for your help.

~ Josiah
jdb@wcoil.com
http://www.josiah.countystart.com/modules/AI/cgi-bin/rec.pl - download latest dist
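
For readers who want a quick start before digging into the ./examples/ directory, here is a minimal usage sketch. The constructor arguments and the learn()/run() method names follow the module's documented interface as I understand it, but treat the exact signatures as assumptions and consult the module's POD (perldoc AI::NeuralNet::BackProp) before relying on them.

```perl
# Hedged sketch: assumes the new(layers, nodes-per-layer), learn(), and
# run() interface from the module's synopsis; verify against the POD.
use AI::NeuralNet::BackProp;

# Create a network with 2 layers of 3 nodes each (the argument order
# here is an assumption based on the module's documentation).
my $net = new AI::NeuralNet::BackProp(2, 3);

my @map    = (1, 2, 3);   # input pattern (the "map")
my @result = (3, 2, 1);   # desired output pattern (the "target")

# Train the network on this map/target pair.
$net->learn(\@map, \@result);

# Run the trained network on the same input and print its output.
my $out = $net->run(\@map);
print join(', ', @$out), "\n";
```

The examples shipped with the distribution exercise the same calls against larger maps, including the string and PCX-bitmap inputs mentioned above.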