Recently, I posted this on Writer’s Beat, a forum where authors, poets, lyricists and bibliophiles gather to talk about all forms of literature. I wonder what the wider public makes of the question. The proposition is not intended to be religious in nature, but perhaps religion and philosophy are as much a part of the answer as science itself. What do you think?
‘As a writer of science-fiction, I have rather glibly used the notion that artificial intelligence will be a mundane reality at some point in the future. I have not done so without reservation. In my writing, I have avoided the use of such technology until at least several centuries have passed from the present day. Why so long? Surely computer advances are so rapid that true artificial intelligence is only a matter of years away, decades at the longest. I do not believe so, but would like other opinions.
Here are my doubts, listed in no particular order.
Firstly, over the last few years, several sources have been fairly consistent in their comparisons between computers and biological brains. A reasonably fast home PC has been likened to the brain of an ant in terms of calculations per second and memory. The human brain is far more complex than this. Nevertheless, for androids, this presents an obvious difficulty. The articles concerned don’t state which species of ant was considered, but ant brains peak in volume at 0.1 cubic millimetres and fall as low as 0.002 cubic millimetres. My computer is roughly 250 million to twelve billion times bigger than this. There is clearly a size issue.
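The size ratio above can be checked with simple arithmetic. The post doesn’t state the PC’s volume, so a mid-tower case of roughly 25 litres is my own assumption; the ant brain figures are the ones quoted:

```python
# Rough volume comparison between a desktop PC and an ant's brain.
# ASSUMPTION: a mid-tower PC case of about 25 litres (not stated in the post).
PC_VOLUME_LITRES = 25
PC_VOLUME_MM3 = PC_VOLUME_LITRES * 1_000_000  # 1 litre = 1,000,000 cubic mm

ANT_BRAIN_MAX_MM3 = 0.1    # largest ant brains, per the articles cited
ANT_BRAIN_MIN_MM3 = 0.002  # smallest ant brains

low_ratio = PC_VOLUME_MM3 / ANT_BRAIN_MAX_MM3   # ~250 million times bigger
high_ratio = PC_VOLUME_MM3 / ANT_BRAIN_MIN_MM3  # ~12.5 billion times bigger

print(f"PC is {low_ratio:.3g} to {high_ratio:.3g} times the volume of an ant brain")
```

Under that 25-litre assumption the figures come out at roughly 250 million to 12.5 billion, matching the range quoted in the post.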
Also, whilst assessing my electricity consumption recently, I found that my PC used 130 watts. I could reduce this to 110 watts by making it as slow as a Sinclair Spectrum from the eighties. If an ant’s brain consumed this much power, its head would glow incandescently and burn up instantly. The human brain uses around twenty watts. A cat’s brain operates at one or two watts. Recently, an attempt was made to simulate a cat’s brain using computers, based on memory and speed. Whilst the project claimed to be successful, critics claim that the device functioned almost 100 times slower than the brain of the humble tabby, and it consumed one megawatt of power in the attempt. It follows that simulating the complexity and speed of a human brain would need around a gigawatt of power. That’s fifty million times less efficient than the biological brain.
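The gigawatt figure follows from the numbers already given, plus one assumption I should flag: that a human brain is about ten times a cat’s in complexity (roughly the 20 W versus 2 W ratio). A quick back-of-envelope check:

```python
# Back-of-envelope check of the power argument.
SIM_POWER_W = 1_000_000   # the cat-brain simulation drew ~1 megawatt
SPEED_SHORTFALL = 100     # it ran ~100x slower than a real cat brain
HUMAN_VS_CAT = 10         # ASSUMPTION: human brain ~10x a cat's complexity
HUMAN_BRAIN_W = 20        # a human brain runs on ~20 watts

# Power to simulate a human brain at full speed: ~1 gigawatt
human_sim_power_w = SIM_POWER_W * SPEED_SHORTFALL * HUMAN_VS_CAT

# How many times less efficient than biology: ~50 million
efficiency_gap = human_sim_power_w / HUMAN_BRAIN_W

print(f"~{human_sim_power_w / 1e9:.0f} GW, {efficiency_gap:,.0f}x less efficient")
```

So one megawatt, times a hundred for speed, times ten for complexity, gives a gigawatt; dividing by the brain’s twenty watts yields the fifty-million-fold efficiency gap claimed above.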
Even if silicon-based technology can continue to advance at its current rate, it will take decades to achieve the speed, memory, volume and energy efficiency needed to simulate a human brain. I doubt that silicon is capable of this. Maybe the new material graphene holds the key. Graphene is a planar sheet of graphite a single atom thick. So far it is made by lifting graphite from a block with sticky tape…hardly a precise industrial technique. It is exceptionally strong and highly conductive. If it could be used in computer memory applications, it may hold the key to the massive increase in performance and efficiency required for an AI brain of comparable size to a human being’s.
Finally, I think that the concept of equating the memory and speed of an artificial brain to those of the equivalent biological unit is unrealistic. What we will end up with is something that still needs to be programmed, at a level of complexity that may mimic nature only if the programmers are operating at genius level. The result will still be a mindless number-crunching machine. No supercomputer has yet produced a single independent thought. We need to understand more than the simple ‘mechanics’ of the brain to be able to reproduce it correctly.’