Stephen Hawking Warns: Only 100 Years Left For Humankind Before Extinction

It’s no secret that physicist Stephen Hawking thinks humans are running out of time on planet Earth.

In a new BBC documentary, Hawking will test his theory that humankind must colonize another planet or perish within the next 100 years. The documentary, Stephen Hawking: Expedition New Earth, will air this summer as part of the BBC’s Tomorrow’s World season and will showcase that Hawking’s aspiration “isn’t as fantastical as it sounds,” according to the BBC.

For years, Hawking has warned that humankind faces a slew of threats ranging from climate change to destruction from nuclear war and genetically engineered viruses.

While things look bleak, there is some hope, according to Hawking: humans must set their sights on another planet, or perish on Earth.

“We must also continue to go into space for the future of humanity,” Hawking said during a 2016 speech at the Oxford Union. In the past, Hawking has suggested that humankind might not survive another 1,000 years “without escaping beyond our fragile planet.” The BBC documentary hints at an adjusted timeframe for colonization, one that many people alive today may see.

Stephen Hawking: highly intelligent machines, the “worst mistake in history”

Dismissing the implications of highly intelligent machines could be humankind’s “worst mistake in history,” write theoretical physicist Stephen Hawking, computer scientist Stuart Russell, and physicists Max Tegmark and Frank Wilczek in the Independent. Self-aware machines have received the Hollywood treatment in the Johnny Depp film Transcendence, they say, but the subject deserves serious consideration.

Successfully creating artificial intelligence would be “the biggest event in human history,” they write, and the potential benefits for everyday human life are enormous. There could come a time, however, when machines outpace human achievement. If and when that day arrives, they wonder, will the best interests of humans still factor into the machines’ calculations?
“One can imagine such technology outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders, and developing weapons we cannot even understand,” they write. “Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all.”

And what, they ask, are we humans doing to address these concerns? Nothing.

“All of us should ask ourselves what we can do now to improve the chances of reaping the benefits and avoiding the risks,” they conclude.

A while back, we wondered about the implications of machine journalists. But maybe we should just be thankful that at least something will be around to write long-form essays on the last days of humankind.

Source: http://www.bbc.com/