ET will 'conquer and colonise' humanity, warns Stephen Hawking
THE $100 MILLION HUNT FOR ALIEN LIFE BACKED BY STEPHEN HAWKING
Professor Stephen Hawking has previously said artificial intelligence could control humans within 100 years.
Now, in his latest dire warning, the physicist claims that if AI doesn't conquer humanity, an advanced alien civilisation may do so instead.
'If aliens visit us, the outcome could be much like when Columbus landed in America, which didn't turn out well for the Native Americans,' Professor Hawking said in a recent interview.
'Such advanced aliens would perhaps become nomads, looking to conquer and colonise whatever planets they can reach,' Hawking told El Pais.
Hawking is currently heading up a major search for intelligent alien life using two of the world's most powerful telescopes.
The telescopes will scour one million of the closest stars to Earth for faint signals thrown out into space by intelligent life beyond our own world.
Scientists taking part in the $100 million (£64 million) initiative will also scan the very centre of our galaxy along with 100 of the closest galaxies for low-power radio transmissions.
And the search is heating up. Last week, news that Mars may contain liquid water reignited the belief that alien life could soon be found.
'To my mathematical brain, the numbers alone make thinking about aliens perfectly rational,' said Hawking.
'The real challenge is to work out what aliens might actually be like.'
But Hawking says if aliens don't kill off the human race, then climate change or AI could do so instead.
'I think the survival of the human race will depend on its ability to find new homes elsewhere in the universe, because there's an increasing risk that a disaster will destroy Earth,' he said.
'Computers will overtake humans with AI at some point within the next 100 years. When that happens, we need to make sure the computers have goals aligned with ours.'
In July, Professor Hawking and Tesla founder Elon Musk led 1,000 robotics experts in an open letter warning that 'Autonomous weapons will become the Kalashnikovs of tomorrow'.
The strongly-worded letter called for an outright ban on 'offensive autonomous weapons beyond meaningful human control' in an effort to prevent a global AI arms race.
The experts point out that, unlike nuclear weapons, AI weapons require no costly or hard-to-obtain raw materials. This means they will become ubiquitous and cheap for all significant military powers to mass-produce.

'If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable,' the letter states.

The authors predict that it will only be a matter of time until smart weapons appear on the black market and in the hands of terrorists, dictators and warlords.

They claim AI technology has reached a point where the deployment of such systems is now feasible within years, rather than decades.

'Autonomous weapons are ideal for tasks such as assassinations, destabilising nations, subduing populations and selectively killing a particular ethnic group,' the letter states.

'We therefore believe that a military AI arms race would not be beneficial for humanity.'

The alien search, named the Breakthrough Listen Initiative, promises to cover 10 times more of the sky than previous attempts and is backed by Russian billionaire entrepreneur Yuri Milner, who set up the Breakthrough Prize for scientific endeavours. It will draw on the expertise of leading scientists, physicists and astronomers.

In a second initiative, an international competition will be held to generate messages representing humanity and planet Earth, which may one day be sent to alien civilisations.

Professor Hawking, who has in the past said there is certainly alien life out there but has warned humanity against trying to contact it, was among those to back the project.