This is a very complex philosophical topic, since it's not clear what being conscious even really means.
Consciousness is often described as a level of self-awareness, subjectivity, sentience, sapience, and perception of one's surroundings.
Like intelligence, consciousness has come to mean something uniquely human, something animals or machines cannot have, or at least should not have; otherwise we'd have to come up with a new definition for the word just to exclude the animal or machine.
With intelligence, challenges were set, but when machines reached each level the bar was simply moved higher and higher. Finally, with Deep Blue's defeat of chess grandmaster Garry Kasparov, it became hard to argue that machines aren't intelligent.
But this is an old argument; before that it was speech and language. Again, it was discovered that animals have language, and machines can easily master speech these days, although we decided we didn't like hearing from them. "Your door is ajar" was cool at first but soon became very irritating.
We humans arrogantly prefer to think of ourselves as divine, endowed with a special God-given uniqueness and abilities that only we can have.
The thought of having a machine that could be considered our equal can be a truly terrifying concept.
To most people, predicting machines capable of competing with us is on par with predicting the end of the world.
The day a machine becomes superior is the day we cease to be the dominant species on this planet. From that point forward we would only be allowed to live on through their benevolence, which could take many forms, including humans ending up as some sort of house pets.
Fortunately, the dates in these predictions of impending doom and the end of human superiority keep passing by uneventfully, and the predictions keep getting pushed back.
In “The Age of Spiritual Machines,” published in 1999, Ray Kurzweil estimates about 20 years for the creation of conscious machines. I think this is far off the mark. In his book, Kurzweil fails to take into account several very important things and makes a number of wrong assumptions.
The Memory Bottleneck with current technologies.
Everyone knows Moore's Law, with CPU performance doubling every 18 months (1.5 years), an increase of roughly 60% per year. What few people realize is that memory performance is increasing only about 11% per year; it takes about 7 years to double memory throughput.
So memory capacity keeps increasing while its speed grows only about a sixth as fast. As things progress, memory speed falls further and further behind; the boot-time memory checks on PCs are becoming painfully slow as the amount of RAM increases.
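To make these rates concrete, here is a quick back-of-the-envelope sketch. The 59% and 11% annual growth figures are just the rough numbers above, nothing measured:

```python
import math

# Rough growth-rate assumptions from the discussion above: CPU performance
# doubles every 18 months (about 59% per year), while memory throughput grows
# only about 11% per year.
cpu_growth = 2 ** (1 / 1.5) - 1      # ~0.59 per year
mem_growth = 0.11

def doubling_time(annual_rate):
    """Years needed to double at a given compound annual growth rate."""
    return math.log(2) / math.log(1 + annual_rate)

print(f"CPU doubling time:    {doubling_time(cpu_growth):.1f} years")   # ~1.5 years
print(f"Memory doubling time: {doubling_time(mem_growth):.1f} years")   # ~6.6 years

# How far compute pulls ahead of memory throughput after N years:
for years in (5, 10, 20):
    gap = ((1 + cpu_growth) / (1 + mem_growth)) ** years
    print(f"after {years:2d} years the compute/memory gap has grown about {gap:,.0f}x")
```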
In applications with growing storage demands, brute-force scanning of memory is becoming slower and slower relative to the amount of computing power available.
To rephrase this: as RAM sizes increase, it takes more and more time to search through every byte stored in a computer. Even though you can hold more, and byte-for-byte memory access is faster, relative to the overall size of memory in a modern computer it takes longer to search through it all.
They can make bigger libraries, but not faster librarians.
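Here is a toy projection of what that means in practice. The 2004 baseline figures (1 GB of RAM, 1 GB/s of effective bandwidth) are made-up round numbers just to show the trend; only the growth rates matter:

```python
# Toy projection of the time needed to sweep all of RAM once, assuming capacity
# grows at Moore's-law rates (~59% per year) while memory bandwidth grows ~11%
# per year. The 2004 starting values are illustrative round numbers, not data.
capacity_bytes = 1e9       # assume ~1 GB of RAM in 2004
bandwidth_bps  = 1e9       # assume ~1 GB/s of effective memory bandwidth in 2004

for year in range(2004, 2025, 5):
    sweep_seconds = capacity_bytes / bandwidth_bps
    print(f"{year}: {capacity_bytes / 1e9:8.1f} GB at {bandwidth_bps / 1e9:5.1f} GB/s "
          f"-> one full sweep takes {sweep_seconds:7.1f} s")
    capacity_bytes *= 1.59 ** 5   # five years of ~59% annual growth
    bandwidth_bps  *= 1.11 ** 5   # five years of ~11% annual growth
```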
Now consider the brain: it runs at only about 100 Hz, but holds on the order of 10^11 neurons.
Assuming one logic decision per neuron per firing cycle, that gives roughly 10^13 logic decisions per second.
CPUs (in 2004) are at about 3 * 10^9 (3 GHz) logic decisions per second.
So following Moore's Law, in about 18 years we will have machines at the same level in terms of logical operations per second.
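The arithmetic behind that estimate, using the rough figures above (10^11 neurons, 100 Hz, a 3 GHz CPU, doubling every 18 months):

```python
import math

# Rough "logic decisions per second" comparison using the figures above.
neurons        = 1e11      # order-of-magnitude neuron count
firing_rate_hz = 100       # ~100 Hz
brain_ops      = neurons * firing_rate_hz    # ~1e13 decisions per second
cpu_ops_2004   = 3e9                         # ~3 GHz, one decision per cycle

# Years for the CPU to catch up if performance doubles every 1.5 years.
ratio = brain_ops / cpu_ops_2004
years = math.log2(ratio) * 1.5
print(f"brain/CPU ratio: {ratio:,.0f}x, parity in roughly {years:.0f} years")   # ~18
```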
But this misses the whole point of how the brain really works.
Even from common-sense experience it's obvious the brain is a terrible computation engine. Competing with the brain's computational abilities is pointless; even a meager computer from 1980 can easily beat the best humans at computation.
So what is the brain really good at?
The brain is also terrible at brute force memorization and retrieval of plain raw information.
What it is good at is being a self-learning feedback control system, also known as a servo.
It's also very good at pattern recognition and information retrieval of interrelated data.
On one hand, a $1 calculator can outperform a human at computation, and even a 20-year-old Apple II is far better at rote data storage and retrieval. On the other hand, when it comes to navigation and control, even the largest of computers are humbled by a simple insect, and they can't come close to understanding and deriving meaning from information.
So from a mechanical perspective, what the brain is really good at is effectively searching through its memory many times per second, cycling through memories to reach related memories, making comparisons, and weighing probabilities.
I believe the brain is not about computation but about memory speed.
It's all about loose associative memory and pattern recognition: an input pattern comes in and needs to be identified quickly to produce a response. Total accuracy is not critical; approximation is close enough.
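A minimal sketch of that "approximation is close enough" style of recall: store a few bit patterns and return whichever one is closest to a noisy input, rather than demanding an exact match. This is purely illustrative, not a model of real neurons; the patterns and labels are invented.

```python
# Toy associative recall: return the closest stored pattern, not an exact match.
def hamming(a: str, b: str) -> int:
    """Number of positions where two equal-length bit strings differ."""
    return sum(x != y for x, y in zip(a, b))

memory = {
    "1111000011110000": "stripes",
    "1010101010101010": "checker",
    "1111111100000000": "half",
}

def recall(noisy_pattern: str) -> str:
    # Pick the stored pattern with the smallest Hamming distance to the input.
    best = min(memory, key=lambda stored: hamming(stored, noisy_pattern))
    return memory[best]

print(recall("1111000011010000"))   # one bit flipped, still recalls "stripes"
```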
What is the brain's memory capacity?
In November 2001 I asked Don Knuth this question at one of his “Stump the Professor!” lectures at Xerox PARC. Well, he was stumped; even he didn't really have an answer, saying it was more of a medical/biology problem. Having had free run of the Stanford campus and asked the best academics I could find, I realized the biologists have even less of a clue than the computer and information science geeks.
Current science doesn't even know how the brain stores information, or how much it can store. I have heard many theories, from quantum effects to storing information in some other dimension outside of normal space-time. I think these are very unlikely, and best of all, no beyond-known-science explanation is needed to understand how the brain can store so much and retrieve it as fast as it does.
Keep in mind the brain is a control system and pattern-matching engine. That's what it does, and it has had billions of years of evolution to master it.
So a minimum estimate might be 1 bit per neuron. This gives about 10 GB, although that estimate seems very small and unlikely considering how much redundancy of information is known to be stored within the brain.
Another estimate might be 1 bit per dendrite. At roughly 10,000 dendrites per neuron, that gives about 100 TB. This is certainly a better estimate, but I suspect the real number is far greater.
If data were stored using permutable combinations, its capacity would be far greater. If data were stored in interconnection patterns, capacity would grow combinatorially, following the Bell numbers. See also: 1, 2.
In examining this, I realized it would be possible to have 10 to 1000 times more storage than a bit per dendrite. This is also in line with what we know about the biology, where the brain grows and breaks interconnections when learning.
This would explain many things, from the brain's extremely high memory performance at such low clock rates (100 Hz) to its having enough redundancy to tolerate massive damage and, for the most part, degrade gracefully.
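Here is the arithmetic behind these estimates, assuming the round figures above (10^11 neurons, about 10^4 dendrites per neuron). The Bell-number part only shows how fast the number of possible interconnection groupings grows; it is not a claim about how the brain actually encodes anything:

```python
# Back-of-the-envelope capacity estimates from the figures above.
NEURONS   = 1e11
DENDRITES = 1e4    # per neuron, assumed round figure

bits_per_neuron   = NEURONS                  # 1 bit per neuron
bits_per_dendrite = NEURONS * DENDRITES      # 1 bit per dendrite
print(f"1 bit/neuron   : {bits_per_neuron / 8e9:6.1f} GB")     # ~12 GB, call it 10 GB
print(f"1 bit/dendrite : {bits_per_dendrite / 8e12:6.1f} TB")  # ~125 TB, call it 100 TB

def bell(n: int) -> int:
    """Bell number B(n), the number of ways to partition n items (Bell triangle)."""
    row = [1]
    for _ in range(n - 1):
        new_row = [row[-1]]
        for value in row:
            new_row.append(new_row[-1] + value)
        row = new_row
    return row[-1]

# The number of ways to group even a handful of interconnections explodes:
for n in (5, 10, 15, 20):
    print(f"B({n}) = {bell(n):,}")
```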
How does this compare with computers?
Let's assume the best case. PCs are already limited to 4 GB (sorry, this was originally written in 2004); at Moore's-law rates we will be at 10 GB in no time at all, only a couple of years, and reaching 100 TB of RAM is about 20 years out.
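The projection behind those dates, assuming RAM capacity keeps doubling roughly every 18 months from a 4 GB 2004 baseline:

```python
import math

# Years until PC RAM reaches the capacity estimates above, if capacity keeps
# doubling about every 18 months.
start_bytes = 4e9                              # ~4 GB in a 2004 PC
targets = {"10 GB": 10e9, "100 TB": 100e12}

for label, target in targets.items():
    doublings = math.log2(target / start_bytes)
    print(f"{label:>7}: about {doublings * 1.5:4.1f} years out from 2004")
# 10 GB is only a couple of years away; 100 TB is roughly 20 years out.
```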
What is the brain's memory throughput?
The brain can access all of its memory as much as 100 times per second.
A real brain can do many things a digital system cannot, such as return fuzzy analog values or access all of its memory at once.
If we use the estimates of the brain's memory capacity, 10 GB to 100 TB, this gives a memory throughput somewhere between 1 terabyte per second and 10 petabytes per second (10^12 to 10^16 bytes per second). Even matching a single sweep of that capacity once per second would take 10^10 to 10^14 bytes per second.
Right now in 2004 we have an 833 MHz front-side bus (FSB) that is 32 bits wide.
This memory throughput is increasing at only about 11% a year (much slower than Moore's Law), so it takes about 7 years to double in speed. For our current PCs, with an effective 8.33 x 10^8 bytes per second, just reaching 10^10 bytes per second (a single sweep per second of the lower 10 GB estimate) is a 12x increase and would take about 25 years. A single sweep per second of the higher 100 TB estimate, on the order of 10^13 to 10^14 bytes per second, is roughly 90 to 115 years out. And if interconnections do store data, then (the short calculation after this list reproduces these figures):
10^14 in 114 years
10^15 in 136 years
10^16 in 157 years
10^17 in 180 years
10^18 in 200 years
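To keep these projections honest, here is the little compound-growth calculation behind this list. The 8.33 x 10^8 bytes/sec starting point and the 11% annual growth rate are the rough 2004 figures above; small differences from the rounded years listed are just rounding.

```python
import math

# Brain-side throughput targets: capacity swept up to 100 times per second.
for cap_bytes, label in ((10e9, "10 GB"), (100e12, "100 TB")):
    print(f"{label:>7} x 100 sweeps/sec = {cap_bytes * 100:.0e} bytes/sec")

# Years to reach various memory throughput levels, starting from ~8.33e8
# bytes/sec in 2004 and growing about 11% per year.
start_bps = 8.33e8
growth    = 0.11

def years_to_reach(target_bps: float) -> float:
    return math.log(target_bps / start_bps) / math.log(1 + growth)

for exponent in (10, 13, 14, 15, 16, 17, 18):
    print(f"10^{exponent}: about {years_to_reach(10.0 ** exponent):5.1f} years")
```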
So it's safe to say that the memory throughput of the human brain will exceed the best of our computer technology for at least 25 years, even assuming only 1 bit per neuron.
But if we assume its capacity is based on interconnections, then we will be well into the next century; somewhere from 2090 to 2150 is my estimate. At that point we will have only just matched minimum-wage-employee intelligence… We would still need to send such a potentially intelligent computer to school, since it would have to learn in much the same manner as a human does.
In the end, creating a truly sentient, human-like intelligence would require all of the energy and human interaction that raising an infant into adulthood does. Each entity would need to go through its own unique learning experiences to become a unique individual, and the outcome would be hit or miss, much as it is with humans now (we don't all go to Harvard). The one real advantage it would possess is that it could continue to grow and survive well beyond the lifespan of a single human. So I don't see such an artificial intelligence as being competition for humans for at least a hundred years. I would like to think that by then the major mission of mankind will be the exploration of space, and a HAL-like computer operating an interstellar spacecraft would be ideal.
-----
I wrote this back in 2004 but never published it. I have had many more thoughts since, and I still need to publish my ideas on data storage with permutable combinations.
I suspect we will soon start seeing many machines that pass the Turing test yet lack sentience. I am sure they will make excellent bill collectors and salespeople, but they probably will not make great programmers or innovators.
I have also learned much about the nature of evolution and how machines are evolving with us. It's really economics driving things. The first Luddite movement, around 1811, when mechanized looms started replacing human weavers, was a straight-out matter of economics: when the machines became cheaper than having humans do the work, we were pushed out and had to find other things to do. As we approach the 200th anniversary of the Luddites, we find that workers have been pushed out of one job after another. I recall when robots replaced auto workers and when spreadsheets replaced accountants; now the web is replacing print media. But we are in a symbiotic relationship with our technology, as without humans the looms have no customers to produce for. Our whole society oscillates back and forth with each step of progress, and in the process kills off many companies and makes a mess of people's lives. That seems the inevitable nature of things as they evolve. The alternative, stagnation, is something I would prefer not to even think about, as it tends to be unpleasant for all. At some point I may write more on that subject; history has many examples, the collapse of Rome for one.
So I am sure that as time progresses, and with luck, we will continue these oscillations, with machines taking over more and more and forcing humans to be more intellectual and creative.
We are in a symbiotic relationship with machines, and in the end it's the combination of the two that will win out. I have no doubt every possible combination and hybridization will be tried, but evolution will make the final decision. In the end we will have an almost unimaginable diversity of forms carrying on the struggle.