
Why Neural Networks?

Anurag

That is a valid question. Why Neural Networks?
Neural networks are a fascinating technology: more than 50 years old, yet still not fully exploited. And the question is, why? Why haven't neural networks progressed as fast as so many other technologies?
Let us first take a look back …
The concept of neural networks has been around since the early 1950s, but it lay mostly dormant until the mid-1980s. One of the first neural networks developed was the perceptron, created by the psychologist Frank Rosenblatt in 1958. The perceptron was a very simple system used to analyze data and visual patterns, and it generated a great deal of interest in the AI community.
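To make the idea concrete, here is a minimal sketch of a Rosenblatt-style perceptron in Python: a single threshold unit whose weights are nudged whenever its output disagrees with the target. The function name, learning rate, and the AND training set below are illustrative choices, not details of Rosenblatt's original machine.

# Minimal perceptron sketch (illustrative, textbook formulation).

def train_perceptron(samples, epochs=10, lr=0.1):
    """Learn weights and a bias for 2-input samples given as (x1, x2, target)."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x1, x2, target in samples:
            # Threshold activation: fire (1) only if the weighted sum exceeds 0.
            output = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            # Perceptron learning rule: nudge weights toward the desired output.
            error = target - output
            w[0] += lr * error * x1
            w[1] += lr * error * x2
            b += lr * error
    return w, b

# Logical AND is linearly separable, so the rule converges quickly.
and_data = [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)]
weights, bias = train_perceptron(and_data)
print(weights, bias)

Feeding the same loop the XOR truth table would never converge, which is exactly the kind of limitation discussed below.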
Unfortunately, these early successes led people to exaggerate the potential of neural networks, particularly in light of the limitations of the electronics then available.
Rosenblatt and other scientists claimed that eventually, with enough complexity and speed, the perceptron would be able to solve almost any problem.
In 1969, Marvin Minsky and Seymour Papert of MIT published an influential book, Perceptrons, which showed that the perceptron could never solve a whole class of problems (linearly non-separable ones, such as the XOR function) and hinted at several other fundamental flaws in the model.
Their analysis, combined with the unfulfilled and outrageous claims, convinced the AI community, and the bodies that fund it, of the fruitlessness of pursuing work with neural networks, and the majority of researchers turned away from the approach.
The result was that much of the funding dried up, and scientists working on neural-network-type devices found it almost impossible to secure support.
This period of stunted growth lasted into the 1980s, when several events caused renewed interest. In 1982, John Hopfield of Caltech presented a paper to the National Academy of Sciences. With clarity and mathematical rigor, he showed how such networks could work and what they could do.
By 1985, the American Institute of Physics had begun what has become an annual meeting, Neural Networks for Computing. By 1987, the first International Conference on Neural Networks held by the Institute of Electrical and Electronics Engineers (IEEE) drew more than 1,800 attendees.
In 1990, the US Department of Defense Small Business Innovation Research Program named 16 topics that specifically targeted neural networks.
By then the wheel had turned again and growth resumed, though not at the pace one would wish to see. Overshadowed by the Internet explosion, the field was also held back by the processing limitations of the time.
In the meantime, the Internet hype has settled down and processing power is no longer a showstopper. The computerization of business and personal transactions generates a flood of data that will certainly feed machine learning and other modern data analysis methods.
Thanks to the availability of cheap microprocessors and recent discoveries about DNA and the human brain, artificial intelligence has gone from being a fantasy to becoming a reality. In fact, most AI researchers believe that it is only a matter of 20 to 30 years before machines become at least as intelligent as humans.
Already, over 80% of Fortune 500 companies have neural-network R&D programs, and others are realizing their importance.
Now ... neural networks are back, and this time ... they are here to stay ...
Yet their future, indeed the very key to the whole technology, lies in commercial use.
