Stephen Hawking Warns: Only 100 Years Left For Humankind Before Extinction

It’s no secret that physicist Stephen Hawking thinks humans are running out of time on planet Earth.

In a new BBC documentary, Hawking will test his theory that humankind must colonize another planet or perish within the next 100 years. The documentary, Stephen Hawking: Expedition New Earth, will air this summer as part of the BBC’s Tomorrow’s World season and will showcase that Hawking’s aspiration “isn’t as fantastical as it sounds,” according to the BBC.

For years, Hawking has warned that humankind faces a slew of threats ranging from climate change to destruction from nuclear war and genetically engineered viruses.

While things look bleak, there is some hope, according to Hawking. Humans must set their sights on another planet or perish on Earth.

“We must also continue to go into space for the future of humanity,” Hawking said during a 2016 speech at Britain’s Oxford University Union. In the past, Hawking has suggested that humankind might not survive another 1,000 years without escaping beyond our fragile planet. The BBC documentary hints at an adjusted timeframe for colonization, one that many may see in their lifetime.

A.I., Nanotechnology ‘threaten civilisation’

A report from the Global Challenges Foundation created the first list of global risks with impacts that, for all practical purposes, can be called infinite. It is also the first structured overview of key events related to such risks, and it attempts to provide initial rough quantifications of the probabilities of these impacts.
Besides the usual major risks – such as extreme climate change, nuclear war, supervolcanoes, or asteroid impacts – there are three emerging global risks: synthetic biology, nanotechnology, and artificial intelligence (A.I.).
The real focus is not on the almost unimaginable impacts of the risks the report outlines. Its fundamental purpose is to encourage global collaboration and to use this new category of risk as a driver for innovation.

In the case of AI, the report suggests that future machines and software with “human-level intelligence” could create new, dangerous challenges for humanity – although they could also help to combat many of the other risks cited in the report. “Such extreme intelligences could not easily be controlled (either by the groups creating them, or by some international regulatory regime), and would probably act to boost their own intelligence and acquire maximal resources for almost all initial AI motivations,” suggest authors Dennis Pamlin and Stuart Armstrong.
In the case of nanotechnology, the report notes that “atomically precise manufacturing” could have a range of benefits for humans. It could help to tackle challenges including depletion of natural resources, pollution and climate change. But it foresees risks too.
“It could create new products – such as smart or extremely resilient materials – and would allow many different groups or even individuals to manufacture a wide range of things,” suggests the report. “This could lead to the easy construction of large arsenals of conventional or more novel weapons made possible by atomically precise manufacturing.”

Source: http://globalchallenges.org/