Tuesday, May 20, 2008

Humans and Computers

Man Vs Machine
Generally Speaking

Many of us think that computers are many times faster, more powerful and more capable than our brains, simply because they can perform calculations thousands of times faster, work out logical computations without error and store memory at incredible speeds with flawless accuracy. But is the computer really superior to the human brain in terms of ability, processing power and adaptability? We now give you the real comparison.

Processing Power and Speed

The human brain - We can only estimate the processing power of the average human brain, as there is no way to measure it quantitatively yet. But if the theory that processing power is proportional to nerve tissue volume holds, we can make a reasonable estimate.
It is fortunate that we understand the neural assemblies in the retina of the vertebrate eye quite well (structurally and functionally), because it gives us an idea of the human brain's capability.
The retina is the nerve tissue at the back of the eyeball that detects light and sends images to the brain. A human retina is about a square centimeter in area, half a millimeter thick, and made up of about 100 million neurons. Scientists say the retina sends the brain patches of image information indicating light intensity differences, transported via the optic nerve, a million-fiber cable that reaches deep into the brain.
Overall, the retina seems to process about ten one-million-point images per second.
Matching the retina's image processing in a robot vision program would take something on the order of 1,000 MIPS. Because the 1,500 cubic centimeter human brain is about 100,000 times as large as the retina, by simple scaling we can estimate the processing power of an average brain to be about 100 million MIPS (million instructions per second). In case you're wondering how much speed that is, let us give you an idea.
1999's fastest PC processor chip on the market was a 700 MHz Pentium that did 4,200 MIPS. By simple division, we would need at least 24,000 of these processors in one system to match the total speed of the brain (which makes the brain like a 16,800,000 MHz Pentium computer). And even then, other factors, such as memory and the complexity of a system that coordinates so many processors, would make it no simple task. Because of these factors, the figure we so casually calculated is most probably a serious underestimate.
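To make that arithmetic concrete, here is a minimal sketch in Python (the 1,000 MIPS retina-equivalent and the 100,000x scaling factor are the assumptions described above, not measured values):

    # Back-of-the-envelope estimate of the brain's processing power
    RETINA_MIPS = 1_000              # assumed: MIPS needed to match the retina
    BRAIN_TO_RETINA_RATIO = 100_000  # brain is ~100,000x the retina's nerve volume

    brain_mips = RETINA_MIPS * BRAIN_TO_RETINA_RATIO  # ~100,000,000 MIPS

    PENTIUM_MIPS = 4_200             # 1999-era 700 MHz Pentium
    PENTIUM_MHZ = 700

    chips_needed = brain_mips / PENTIUM_MIPS      # ~24,000 processors
    equivalent_mhz = chips_needed * PENTIUM_MHZ   # ~16,800,000 MHz

    print(f"Brain estimate:   {brain_mips:,} MIPS")
    print(f"Pentiums needed:  {chips_needed:,.0f}")
    print(f"Equivalent clock: {equivalent_mhz:,.0f} MHz")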

The computer - The most powerful experimental supercomputers of 1998, composed of thousands or tens of thousands of the fastest microprocessors and costing tens of millions of dollars, could do a few million MIPS. These systems were used mainly to simulate physical events for high-value scientific calculations.
Here, we have a chart of processor speeds for the past few years.
Year     Clock Speed (MHz)   Instruction Rate (MIPS)
1992     200                 200 (400)
1993.5   300                 300 (600)
1995     400                 800 (1600)
1996.5   500                 1000 (2000)
1998     600                 2400 (3600)
1999.5   700                 2800 (4200)
2000     1000                ?
From the chart above, we can observe some breakthroughs in microprocessor speeds. The techniques currently used by research labs should be able to sustain such improvements for about a decade. By then, prototype multiprocessor chips that finally reach MIPS figures matching the brain's may be cheap enough to develop.
Improvements in computer speed, however, have some limitations. The more memory a computer has, the slower it is, because it takes longer to run through its memory once. Computers with less memory hence have more effective MIPS, but less space in which to run big programs. The latest, greatest supercomputers can do a trillion calculations per second and hold a trillion bytes of memory. As computer memory and processors improve, the megabyte/MIPS ratio is a big factor to consider, and so far this ratio has remained roughly constant throughout the history of computers.
So who has more processing power? By estimation, the brain has about 100 million MIPS worth of processing power, while recent supercomputers have only a few million MIPS worth of processor speed. The brain is therefore still the winner of the race. Given the cost, enthusiasm and effort still required, computer technology has some way to go before it matches the human brain's processing power.

Counting the Memory

The human brain - So far, we have never heard of anybody's brain being "overloaded" because it has run out of memory. (So it seems as if the human brain has no limit on how much memory it can hold. That may not be true.)
Our best guess at the average human brain's capacity comes from counting the synapses connecting its neurons. Because each synapse can take on different molecular states, we estimate each to be capable of holding about one byte worth of memory. Since the brain has about 100 trillion synapses, we can say the average brain holds about 100 million megabytes of memory!
Remember what we said about the megabyte/MIPS ratio of a computer? Divide the two estimates: 100 million megabytes of memory over 100 million MIPS of processing power gives about one megabyte per MIPS, which matches the ratio of modern computers. The megabyte/MIPS ratio seems to hold for nervous systems too!
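Here is the same back-of-the-envelope style of calculation for memory (the synapse count and the one-byte-per-synapse figure are the assumptions stated above):

    # Rough estimate of the brain's memory capacity and its MB/MIPS ratio
    SYNAPSES = 100e12          # ~100 trillion synapses
    BYTES_PER_SYNAPSE = 1      # assumed: ~1 byte of state per synapse

    brain_megabytes = SYNAPSES * BYTES_PER_SYNAPSE / 1e6  # ~100,000,000 MB

    BRAIN_MIPS = 100e6         # estimate from the processing-power section
    ratio = brain_megabytes / BRAIN_MIPS                  # ~1 MB per MIPS

    print(f"{brain_megabytes:,.0f} MB, ratio = {ratio:.1f} MB/MIPS")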
However, we all know that the memory of the brain is not absolute. It does not have set files or directories that can be deleted, copied or archived like those of a computer. For example, a person who thought he had memorized a telephone number for good suddenly realizes he can't recall it, yet half a day later the number comes back to him. It is a strange phenomenon that we still can't really explain. A simple theory is that the brain treats parts and pieces of these ignored memories like an inactive "archive" section until they are required. How long a part of the brain retains a memory seems to depend on how often that memory is used. Even so, there is no such thing as deletion of data in a brain.

The computer - Computers have more than one form of memory. We can generally classify them into primary and secondary memory. Primary memory is temporary memory for calculation processes and for values that need rapid access or updating; its contents disappear when the power is turned off. Primary memory is important when executing programs, and bigger programs require more of it. (RAM (random access memory), caches and buffers are a few examples of primary memory.)
Secondary memory often comes in the form of hard disks, removable disk drives and tape drives. Secondary memory is used for the storage of most of a system's data, programs and all other permanent data that should stay there even when the power is turned off. As a computer is fed with bigger, smarter programs and more data, it would naturally need more secondary memory to hold them.
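As a loose illustration of the split (the file name here is invented for the example), values held in an ordinary Python variable live in primary memory and vanish when the program ends, while values written to disk persist:

    import json

    # Primary memory: fast and temporary -- gone when the power goes off
    working_values = {"subtotal": 42, "status": "in progress"}

    # Secondary memory: slower but persistent -- survives a restart
    with open("saved_state.json", "w") as f:   # hypothetical file name
        json.dump(working_values, f)

    # After a restart, the data can be read back from disk
    with open("saved_state.json") as f:
        restored = json.load(f)
    print(restored)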
The latest, greatest supercomputers (as of 1998) have a million megabytes of memory. The latest hard disk drives on the personal computer market (in early 2000) can hold about 40,000 megabytes (40 gigabytes).

So Who is Superior?


The brain is still the overall winner in many fields when it comes to raw numbers. However, because of its other commitments, the brain is less efficient when a person tries to use it for one specific function. The brain is, as we might put it, a general-purpose processor compared to the computer, so it loses out on efficiency and performance for any single task. We have given the estimate for total human performance as 100 million MIPS, but the fraction of that which can be applied efficiently to any one task may be small (the fraction depends on how well the brain is adapted to the task).
Deep Blue, the chess machine that bested world chess champion Garry Kasparov in 1997, used specialized chips to process chess moves at the speed equivalent of a 3 million MIPS universal computer. That is about 1/30 of our estimate of total human performance. Since it is plausible that Kasparov, probably the best human player ever, could apply his brain power to the strange problem of chess with an efficiency of about 1/30, Deep Blue's near parity with Kasparov's chess skill supports this picture of efficiency versus total performance. (Deep Blue won the six-game 1997 match by the narrow score of 3.5 to 2.5.)
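The fraction is easy to check (both MIPS figures are the estimates quoted in this article):

    DEEP_BLUE_MIPS = 3e6       # chess-specific hardware, universal equivalent
    BRAIN_TOTAL_MIPS = 100e6   # total human estimate from earlier

    efficiency = DEEP_BLUE_MIPS / BRAIN_TOTAL_MIPS
    print(f"{efficiency:.3f}")  # 0.030 -- roughly the 1/30 used above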

Comparison between conventional computers and neural networks

Parallel processing

One of the major advantages of the neural network is its ability to do many things at once. With traditional computers, processing is sequential--one task, then the next, then the next, and so on. The idea of threading makes it appear to the human user that many things are happening at one time. For instance, the Netscape throbber is shooting meteors at the same time that the page is loading. However, this is only an appearance; processes are not actually happening simultaneously.
The artificial neural network is an inherently multiprocessor-friendly architecture. Without much modification, it scales beyond the one or two processors of the von Neumann architecture, because it is designed from the outset to be parallel. Humans can listen to music at the same time they do their homework--at least, that's what we tried to convince our parents of in high school. With a massively parallel architecture, the neural network can accomplish a lot in less time. The tradeoff is that processors have to be specifically designed for the neural network.
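As a rough sketch of that parallelism (the layer sizes and random weights are invented for illustration), a layer of artificial neurons all compute their outputs from the same inputs at once, which maps onto a single vectorized operation rather than a neuron-by-neuron loop:

    import numpy as np

    rng = np.random.default_rng(0)
    inputs = rng.random(4)        # 4 input signals (illustrative)
    weights = rng.random((3, 4))  # 3 neurons, each with 4 input weights

    # Sequential, von Neumann style: one neuron at a time
    outputs_seq = np.array([w @ inputs for w in weights])

    # Parallel, neural network style: all neurons in one matrix operation
    # that parallel hardware can compute simultaneously
    outputs_par = weights @ inputs

    assert np.allclose(outputs_seq, outputs_par)
    print(outputs_par)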
The ways in which they function

Another fundamental difference between traditional computers and artificial neural networks is the way in which they function. While computers function logically with a set of rules and calculations, artificial neural networks can function via images, pictures, and concepts.
Based upon the way they function, traditional computers have to learn by rules, while artificial neural networks learn by example, by doing something and then learning from it. Because of these fundamental differences, the applications to which we can tailor them are extremely different. We will explore some of the applications later in the presentation.

Self-programming

The "connections" or concepts learned by each type of architecture is different as well. The von Neumann computers are programmable by higher level languages like C or Java and then translating that down to the machine's assembly language. Because of their style of learning, artificial neural networks can, in essence, "program themselves." While the conventional computers must learn only by doing different sequences or steps in an algorithm, neural networks are continuously adaptable by truly altering their own programming. It could be said that conventional computers are limited by their parts, while neural networks can work to become more than the sum of their parts.
Speed

The speed of each computer is dependent upon different aspects of the processor. Von Neumann machines require either big processors or the tedious, error-prone approach of parallel processing, while neural networks require multiple chips custom-built for the application.
