Computer Standards & Interfaces 16 (1994) 183-184

Guest editorial

Artificial Neural Networks

The basic ideas behind Artificial Neural Networks (ANNs) are not new. McCulloch and Pitts developed their simplified single-neuron model over 50 years ago. Widrow developed his ADALINE, and Rosenblatt the Perceptron, during the 1960s. Multilayer feedforward networks (multilayer perceptrons, or MLPs) and the backpropagation algorithm were developed during the late 1970s, and Hopfield devised his recurrent (feedback) network during the early 1980s. The development of MLPs and Hopfield nets heralded a resurgence of worldwide interest in ANNs which has continued unabated ever since.

ANNs are new types of computers based on (inspired by) models of biological neural networks (brains). It should be emphasized that nobody fully understands how biological neural networks work, let alone artificial neural networks. Despite this, ANNs have captured the imagination of research scientists and the general public alike; the prospect of producing computers based on the workings of the human brain is truly awe-inspiring.

Despite a flurry of activity during the previous decade, ANNs remain a young field. Nevertheless, it has been repeatedly demonstrated that ANNs can be used to solve many real-world problems, and that they are particularly well suited to pattern recognition and classification tasks. It is generally acknowledged that the ANN models in common use today are over-simplified and biologically implausible; however, this does not detract from their applicability to many real-world problems.

ANNs consist of a large number of simple, slow, non-linear, analog processors (nodes or neurons), connected together to form a massively parallel, distributed computer: a 'Parallel Distributed Processor', or PDP, to use the terminology of Rumelhart and McClelland [1]. They are trained rather than programmed in the conventional sense. Thus ANNs are essentially hardware devices; to date, however, they have been implemented primarily as software simulations running on conventional computers. Even when implemented on parallel machines, ANNs do not map particularly well onto the underlying computer architecture, due in large part to their inherent massive parallelism. Tentative steps have been taken towards hardware implementations of ANNs, in either (analog or digital) VLSI form or using optical technology, but this work is very much in its infancy. Moreover, the scale of integration achievable with such implementations remains minuscule compared with that found in biological neural networks.
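For readers who have never seen such a network written down, the following Python fragment sketches the forward pass of a tiny multilayer perceptron built from simple non-linear nodes of this kind. The layer sizes, random weights and input pattern are arbitrary illustrative choices, not part of any standard; in practice the weights would be set by training (for example, with backpropagation) rather than at random.

```python
import math
import random

def sigmoid(x):
    # Smooth non-linear "squashing" function applied by each node
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    # Each node forms a weighted sum of its inputs, adds a bias,
    # and passes the result through the non-linearity
    return [sigmoid(sum(w * v for w, v in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

# A tiny 3-4-2 network with random, untrained parameters (illustrative only)
random.seed(0)
w_hidden = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(4)]
b_hidden = [random.uniform(-1, 1) for _ in range(4)]
w_out = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(2)]
b_out = [random.uniform(-1, 1) for _ in range(2)]

x = [0.2, 0.7, -0.4]                   # one input pattern
hidden = layer(x, w_hidden, b_hidden)  # hidden-layer activations
output = layer(hidden, w_out, b_out)   # network outputs
print(output)
```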

This special issue of Computer Standards & Interfaces is devoted to ANN 'standards'. This is a timely undertaking, since the field of ANN research is still in its infancy, despite its early origins. Indeed, some researchers regard it as premature to discuss standards in such an active, leading-edge area. 'Standards' is used here in its broadest sense: it includes formal standards, as set by professional bodies and learned societies, as well as de facto standard approaches and commonly accepted practices in the field. Papers are included in this special issue from both perspectives.

The international appeal of, and research activity in, ANNs is reflected in the papers selected for inclusion in this special issue of Computer Standards & Interfaces: papers appear from the USA, UK, Brazil, Germany, Switzerland, Austria, South Korea, and Australia. The enthusiasm characteristic of researchers active in a leading-edge field like ANNs was reflected in the manner in which submissions were forthcoming for this special issue, namely via email over the Internet, in direct response to calls for papers in the Neuron Digest and comp.ai.neural-nets newsgroups. The initial response to these calls for papers was nothing short of overwhelming. Of the papers submitted, eight have been selected for this special issue. Two of these papers were invited, and provide accounts of standards activities to date in both the USA (IEEE-NNC) and Europe (ESPRIT); these papers lead off the special issue. The remaining six papers cover standards issues as diverse as representation in connectionist networks, advanced supervised learning in MLPs, neo-statistical ANN methods, weightless ANNs, ANN classification, and commercial ANN software simulators.

It would be remiss of me not to thank those people who helped with the preparation of this special issue of Computer Standards & Interfaces. Floris van Drunen of Elsevier Science (North-Holland) and John Berg, Editor-in-Chief of Computer Standards & Interfaces, first suggested this special issue and invited me to serve as Guest Editor. Mary Lou Padgett and Walter Karplus have been supportive of the project at every stage of its development, and provided much assistance in their capacities as Vice-Chair and Chair, respectively, of the IEEE-NNC Standards Committee.

Likewise, George Dorffner took time off from his ESPRIT commitments to update us on developments within Europe. My special thanks go to all the reviewers consulted for this special issue; their tireless efforts have resulted in an impressive collection of papers (just how significant will only become apparent over the course of time).

It has been a pleasure to serve as Guest Editor for this special issue of Computer Standards & Interfaces. My sincere hope is that it will serve to introduce ANNs to those readers who have not previously encountered them (or have perhaps heard of them only in passing), or alternatively to raise standardization issues for those readers already conversant with the field. I hope you enjoy the following papers as much as I enjoyed coordinating the preparation of this special issue of Computer Standards & Interfaces. I heartily recommend these papers to you and wish you happy and enjoyable reading.

John Fulcher
Director, Neural Networks Research Group, Centre for Information Technology, and Senior Lecturer, Department of Computer Science, University of Wollongong, Australia

Reference

[1] D.E. Rumelhart and J.L. McClelland, Parallel Distributed Processing: Explorations in the Microstructure of Cognition (2 vols) (MIT Press, Cambridge, MA, 1986).