Monday, August 4

ExTerminators: The Judgement Day & the GigaDeaths


Hugo de Garis is concerned that massively intelligent machines ("artilects") could become vastly smarter than human beings, leading to warring factions over the question: should humanity risk building artilects? The result: gigadeaths. Read his article "Building Gods or Building Our Potential Exterminators?"

Citing Moore's Law, he points out that machine intelligence is evolving a million times faster than human intelligence, as advancing chip technology keeps massively increasing electronic performance. He is himself involved in developing artificial brains, and believes that these artificial-brained "artilects" (artificial intellects) will soon be millions of times more intelligent than human beings.

For humans he predicts "gigadeaths", the result of a war between the pro-machine and anti-machine factions. He uses the prefix "giga" because we will be far better equipped, and far more capable, of killing each other on a massive scale.

I am basically anti-machine, since I believe there have to be limits and ethics to everything. And believe it or not, even if you are not a fan of Terminator, The Matrix, and similar sci-fi, the end of humanity at the hands of robots and cyborgs may be very near. But Mr. de Garis raises a valid, strange, and shocking point: in a very modern and futuristic world, cyborgs and robots will be the extension or evolution of human beings. If we stop at merely being human, he thinks that would be more tragic than being killed by robots.

Extending this debate further...
  • Can we non-artificial, natural, flesh-and-blood human beings take this planet further?

  • Can we truly say that we are moving towards a more civilized society and a better human civilization?

  • Are we safe from each other? Or would we be safer from robots and cyborgs than from other humans?

  • If we have not built a better society or made better use of our resources, what is the chance that our future will be safe in our own human hands?

  • Would it not be a better option to entrust these artificial robots with saving nature?

  • Even if this planet is destined to meet the same fate of destruction, wouldn't it be better if it met its end through super-intelligent robots rather than super-self-destructive humans?


  • Please remember...
    All our progress is measured in an artificial world; naturally speaking, we are a failure. The natural evolution of human beings has already stopped. - Santoshkumar