What are Artificial Neural Networks?
Artificial Neural Networks are
relatively crude electronic models based on the neural structure of the brain.
The brain learns from experience. It is natural proof that some
problems beyond the scope of current computers can indeed be solved by
small, energy-efficient packages. This brain modeling also promises a less
technical way to develop machine solutions. This new approach to computing also
degrades more gracefully during system overload than its more traditional
counterparts.
These biologically inspired methods of computing are thought to be the
next major advancement in the computing industry. Even simple animal brains are
capable of functions that are currently impossible for computers. Computers do
rote things well, like keeping ledgers or performing complex math. But
computers have trouble recognizing even simple patterns, much less generalizing
those patterns of the past into actions for the future.
Now, advances in biological research promise an initial understanding of
the natural thinking mechanism. This research shows that brains store
information as patterns. Some of these patterns are very complicated and allow
us to recognize individual faces from many different angles. This
process of storing information as patterns, utilizing those patterns, and then
solving problems encompasses a new field in computing. This field, as mentioned
before, does not utilize traditional programming but involves the creation of
massively parallel networks and the training of those networks to solve
specific problems. This field also uses a vocabulary very different from that
of traditional computing, with words like behave, react, self-organize, learn,
generalize, and forget.
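To make that vocabulary concrete, here is a minimal sketch, not part of the original article, of what "training" can look like: a single artificial neuron adjusted by the classic perceptron rule until it learns the logical AND function. The learning rate, epoch count, and function names are illustrative choices, not a prescribed method.

    import random

    def train_perceptron(samples, epochs=20, lr=0.1):
        """Learn weights and a bias so that step(w.x + b) matches the targets."""
        random.seed(0)                        # fixed seed, purely for repeatability
        w = [random.uniform(-0.5, 0.5) for _ in samples[0][0]]
        b = 0.0
        for _ in range(epochs):
            for x, target in samples:
                activation = sum(wi * xi for wi, xi in zip(w, x)) + b
                output = 1 if activation > 0 else 0
                error = target - output      # zero when the neuron is already right
                w = [wi + lr * error * xi for wi, xi in zip(w, x)]
                b += lr * error
        return w, b

    # Truth table for logical AND: the network is trained, not explicitly programmed.
    data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
    weights, bias = train_perceptron(data)
    for x, target in data:
        out = 1 if sum(wi * xi for wi, xi in zip(weights, x)) + bias > 0 else 0
        print(x, "->", out, "expected", target)

After a few passes over the data the neuron's outputs match the AND truth table, even though no one ever wrote an explicit rule for AND into the program. That is the sense in which such systems are said to learn rather than be programmed.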
Analogy to the Brain
The exact workings of the human
brain are still a mystery. Yet, some aspects of this amazing processor are
known. In particular, the most basic element of the human brain is a specific
type of cell which, unlike the rest of the body, doesn't appear to regenerate.
Because this type of cell is the only part of the body that isn't slowly
replaced, it is assumed that these cells are what provide us with our
abilities to remember, think, and apply previous experiences to our every
action. These cells, all 100 billion of them, are known as neurons. Each of
these neurons can connect with up to 200,000 other neurons, although 1,000 to
10,000 is typical.
The power of the human mind comes from the sheer numbers of these basic
components and the multiple connections between them. It also comes from
genetic programming and learning.
The individual neurons are complicated. They have a myriad of parts,
sub-systems, and control mechanisms. They convey information via a host of
electrochemical pathways. There are over one hundred different classes of
neurons, depending on the classification method used. Together these neurons
and their connections form a process which is not binary, not stable, and not
synchronous. In short, it is nothing like the currently available electronic
computers, or even artificial neural networks.
These artificial neural networks try to replicate only the most basic
elements of this complicated, versatile, and powerful organ. They do it in a
primitive way. But for the software engineer who is trying to solve problems,
neural computing was never about replicating human brains. It is about machines
and a new way to solve problems.
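As a rough illustration of how little of the biology survives the abstraction, the sketch below (again illustrative, not from the article) reduces a neuron to three ideas: weighted inputs standing in for synapse strengths, a summation, and a squashing function standing in for the firing decision. The specific weights and bias are hand-picked for demonstration only.

    import math

    def artificial_neuron(inputs, weights, bias):
        """Weighted sum of inputs passed through a sigmoid 'firing' function."""
        total = sum(w * x for w, x in zip(weights, inputs)) + bias
        return 1.0 / (1.0 + math.exp(-total))   # squashes the sum into (0, 1)

    # Hand-picked example values, purely for demonstration.
    print(artificial_neuron([0.2, 0.9, 0.4], weights=[1.5, -0.8, 0.3], bias=0.1))

Everything else described above, the electrochemical pathways, the hundred-plus classes of neurons, the asynchronous behavior, is simply left out of the model.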
Anurag