Legally Blind People Can See With A New Kind Of Glasses

A Toronto-based Canadian company has succeeded in building a kind of Google Glass that can give full sight back to legally blind people. The eSight is an augmented reality headset that houses a high-speed, high-definition camera that captures everything the user is looking at.

Algorithms enhance the video feed and display it on two OLED screens in front of the user’s eyes. The eSight user sees full-color video with unprecedented clarity and virtually no lag. With eSight’s patented Bioptic Tilt capability, users can adjust the device to the precise position that, for them, presents the best view of the video while maximizing peripheral vision. This preserves the user’s balance and prevents nausea – common problems with other immersive technologies. A blind individual can use both hands freely while using eSight to see. The device is lightweight, worn comfortably over the eyes and designed for use in various environments throughout the day.
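
The article does not describe eSight’s image-processing pipeline in detail, but the general idea of enhancing a live camera feed for a low-vision user can be sketched roughly as follows. This is a hypothetical illustration using OpenCV (local contrast boosting plus centre magnification), not eSight’s proprietary algorithms; the camera index, zoom factor and contrast settings are arbitrary assumptions.

```python
# Hypothetical low-vision enhancement loop (illustrative only, not eSight's
# algorithms): capture a frame, boost local contrast, magnify the centre of
# the image, and display it where the headset screens would be.
import cv2

clahe = cv2.createCLAHE(clipLimit=3.0, tileGridSize=(8, 8))  # local contrast booster
capture = cv2.VideoCapture(0)  # stand-in for the headset's HD camera
ZOOM = 2.0                     # assumed user-selected magnification

while True:
    ok, frame = capture.read()
    if not ok:
        break
    # Enhance contrast on the luminance channel only, preserving colour.
    lab = cv2.cvtColor(frame, cv2.COLOR_BGR2LAB)
    lab[:, :, 0] = clahe.apply(lab[:, :, 0])
    enhanced = cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)
    # Crop the central region and scale it back up to simulate magnification.
    h, w = enhanced.shape[:2]
    ch, cw = int(h / ZOOM), int(w / ZOOM)
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    magnified = cv2.resize(enhanced[y0:y0 + ch, x0:x0 + cw], (w, h))
    cv2.imshow("enhanced view", magnified)   # stands in for the two OLED screens
    if cv2.waitKey(1) == 27:                 # Esc to quit
        break

capture.release()
cv2.destroyAllWindows()
```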

eSight is a comprehensive, customized medical device that can replace the many single-task assistive devices currently available that do not provide actual sight (e.g. white canes, magnifying devices, service animals, Braille machines, CCTV scanners, text-to-speech software). It allows a user to instantly auto-focus from short-range vision (reading a book or text on a smartphone) to mid-range vision (seeing faces or watching TV) to long-range vision (looking down a hallway or outside a window). It is the only device for the legally blind that enables mobility without causing the imbalance or nausea common with other immersive options. A legally blind individual can use eSight not just to see while sitting down but while being independently mobile (e.g. walking, exercising, commuting, travelling).

According to The Wall Street Journal, the company is taking advantage of recent improvements in VR headset and smartphone technology that have trickled down into the latest version of the eSight. So far, the company has sold roughly a thousand units, but at $10,000 apiece they’re not cheap (and most insurers apparently don’t cover the product), although eSight’s chief executive Brian Mech tells the WSJ that getting devices to users is “a battle we are starting to wage.”

Source: https://www.esighteyewear.com/

How Brain Waves Can Control VR Video Games

Virtual reality is still so new that the best way for us to interact within it is not yet clear. One startup wants you to use your head, literally: it’s tracking brain waves and using the result to control VR video games.

Boston-based startup Neurable is focused on deciphering brain activity to determine a person’s intention, particularly in virtual and augmented reality. The company uses dry electrodes to record brain activity via electroencephalography (EEG); then software analyzes the signal and determines the action that should occur.
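
As a rough illustration of the pipeline described above (record EEG from the electrodes, filter the signal, then classify the user’s intention), here is a generic sketch using SciPy and scikit-learn. It is not Neurable’s software: the sampling rate, frequency band, features, classifier choice and the placeholder random data are all assumptions for demonstration.

```python
# Generic EEG intent-decoding sketch (illustrative, not Neurable's pipeline):
# band-pass filter a short EEG epoch, extract simple features, and let a
# linear classifier decide which intended action the signal corresponds to.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 250  # assumed sampling rate in Hz

def bandpass(epoch, low=1.0, high=30.0, fs=FS):
    """Keep the 1-30 Hz band where most event-related activity lives."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, epoch, axis=-1)

def features(epoch):
    """Very simple features: mean amplitude per channel in successive windows."""
    windows = np.array_split(bandpass(epoch), 8, axis=-1)
    return np.concatenate([w.mean(axis=-1) for w in windows])

# Training data stand-in: epochs recorded while the user attends to known targets.
rng = np.random.default_rng(0)
X_train = rng.standard_normal((40, 6, FS))   # 40 trials, 6 channels, 1 s each
y_train = rng.integers(0, 2, 40)             # e.g. "select" vs "rest"

clf = LinearDiscriminantAnalysis()
clf.fit(np.stack([features(e) for e in X_train]), y_train)

new_epoch = rng.standard_normal((6, FS))
print("decoded intention:", clf.predict([features(new_epoch)])[0])
```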

“You don’t really have to do anything,” says cofounder and CEO Ramses Alcaide, who developed the technology as a graduate student at the University of Michigan. “It’s a subconscious response, which is really cool.”

Neurable, which raised $2 million in venture funding late last year, is still in the early stages: its demo hardware looks like a bunch of electrodes attached to straps that span a user’s head, worn along with an HTC Vive virtual-reality headset. Unlike the headset, Neurable’s contraption is wireless—it sends data to a computer via Bluetooth. The startup expects to offer software tools for game development later this year, and it isn’t planning to build its own hardware; rather, Neurable hopes companies will be making headsets with sensors to support its technology in the next several years.

Source: https://www.technologyreview.com/
AND
http://neurable.com/

Virtual Images That Blend In And Interact With The Real World

Avegant, a Silicon Valley startup that sells a pair of headphones equipped with a VR-like portable screen, is breaking into augmented reality. The company today announced that it’s developed a new type of headset technology powered by a so-called light field display.

The research prototype, which Avegant eventually plans on turning into a consumer product, is based on the company’s previous work with its Glyph projector. That device was a visor of sorts that floated a virtual movie screen in front of your eyes, and developing it gave Avegant insight into how to build an AR headset of its own.

Like Microsoft’s HoloLens and the supposed prototype from secretive AR startup Magic Leap, Avegant’s new headset creates virtual images that blend in and interact with the real-world environment. In a demo, the company’s wired prototype proved superior in key ways to the developer version of the HoloLens. Avegant attributes this not to the power of its tethered PC, but to the device’s light field display – a technology that Magic Leap also claims to have developed but has never shown to the public.

The demo I experienced featured a tour of a virtual Solar System, an immersion within an ocean environment, and a conversation with a virtual life-sized human being standing in the same room. To be fair, Avegant was using a tethered and bulky headset that wasn’t all that comfortable, while the HoloLens developer version is a refined wireless device. Yet with that said, Avegant’s prototype managed to expand the field of view, so you’re looking through a window more the size of a Moleskine notebook instead of a pack of playing cards. The images it produced also felt sharper, richer, and more realistic.

In the Solar System demo, I was able to observe a satellite orbiting an Earth no larger than a bocce ball and identify the Great Red Spot on Jupiter. Avegant constructed its demo to show off how these objects could exist at different focal lengths in a fixed environment — in this case a converted conference room at the company’s Belmont, California office. So I was able to stand behind the Sun and squint until the star went out of focus in one corner of my vision and a virtual Saturn and its rings became crystal clear in the distance.
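
The reason virtual objects at different distances can pop in and out of focus comes down to basic eye optics: defocus, measured in dioptres, produces an angular blur roughly proportional to pupil diameter, and a light field display reproduces that depth-dependent blur. The sketch below works through the approximation; it is a simplified model for intuition, not a description of Avegant’s display, and the pupil size and example distances are assumed values.

```python
# Minimal defocus-blur sketch (illustrative optics only, not Avegant's tech):
# an eye accommodated to one distance sees objects at other distances blurred,
# with angular blur ~ pupil diameter x defocus in dioptres.
PUPIL_DIAMETER_M = 0.004  # assumed ~4 mm pupil

def angular_blur_arcmin(object_dist_m, focus_dist_m, pupil=PUPIL_DIAMETER_M):
    """Approximate angular blur (arc-minutes) of an object when the eye is
    focused at focus_dist_m (small-angle approximation)."""
    defocus_dioptres = abs(1.0 / object_dist_m - 1.0 / focus_dist_m)
    blur_rad = pupil * defocus_dioptres
    return blur_rad * (180.0 / 3.141592653589793) * 60.0

# Eye focused on a "Saturn" 3 m away: a "Sun" at 0.5 m is noticeably blurred.
print(angular_blur_arcmin(object_dist_m=0.5, focus_dist_m=3.0))   # ~23 arcmin
print(angular_blur_arcmin(object_dist_m=3.0, focus_dist_m=3.0))   # 0.0: in focus
```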

Source: http://www.theverge.com/

In 2029 Immortality May Be Possible

Scientist Ray Kurzweil (Google‘s Director of Engineering) reckons man could become immortal in just a few years’ time. The 61-year-old American – dubbed the smartest futurist on Earth by Microsoft founder Bill Gates – has consistently predicted new technologies many years before they arrived. Here, Ray explains why he believes today’s 60-year-olds could go on to live forever. We are living through the most exciting period of human history. Computer technology and our understanding of genes — our body’s software programs — are accelerating at an incredible rate. He and many other scientists now believe that in around 20 years we will have the means to reprogramme our bodies’ stone-age software so we can halt, then reverse, ageing. Then nanotechnology will let us live for ever.

Already, blood cell-sized submarines called nanobots are being tested in animals. These will soon be used to destroy tumours, unblock clots and perform operations without scars.

Ultimately, nanobots will replace blood cells and do their work thousands of times more effectively. Within 25 years we will be able to do an Olympic sprint for 15 minutes without taking a breath, or go scuba-diving for four hours without oxygen. Heart-attack victims — who haven’t taken advantage of widely available bionic hearts — will calmly drive to the doctors for a minor operation as their blood bots keep them alive. Nanotechnology will extend our mental capacities to such an extent we will be able to write books within minutes. If we want to go into virtual-reality mode, nanobots will shut down brain signals and take us wherever we want to go. Virtual sex will become commonplace. And in our daily lives, hologram-like figures will pop up in our brain to explain what is happening.

These technologies should not seem at all fanciful. Our phones now perform tasks we wouldn’t have dreamed possible 20 years ago. In 1965, a university’s only computer cost £7 million and was huge. Today your mobile phone is a million times less expensive and a thousand times more powerful. That’s a billion times more capable for the same price.

According to Kurzweil’s theory — the Law of Accelerating Returns — we will experience another billion-fold increase in technological capability for the same cost in the next 25 years. So we can look forward to a world where humans become cyborgs, with artificial limbs and organs. This might sound far-fetched, but remember, diabetics already have artificial pancreases and Parkinson’s patients have neural implants. As we approach the 21st Century’s second decade, stunning medical breakthroughs are a regular occurrence.
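
The numbers in the two paragraphs above are easy to sanity-check: a million-fold price improvement combined with a thousand-fold power improvement multiplies out to a billion-fold gain, and another billion-fold gain over 25 years implies roughly thirty doublings, about one every ten months. The snippet below simply works through that arithmetic; it illustrates the claim rather than validating Kurzweil’s forecast.

```python
# Arithmetic check of the price-performance claim and the implied doubling rate.
import math

price_improvement = 1_000_000     # "a million times less expensive"
power_improvement = 1_000         # "a thousand times more powerful"
print(price_improvement * power_improvement)   # 1,000,000,000: a billion-fold gain

doublings = math.log2(1e9)                     # ~29.9 doublings for a billion-fold gain
print(25 * 12 / doublings)                     # ~10 months per doubling over 25 years
```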

In 2008 we discovered skin cells can be transformed into the equivalent of embryonic cells. So organs will soon be repaired and eventually grown. In a few years most people will have their entire genetic sequences mapped. Before long, we will all know the diseases we are susceptible to, and gene therapies will mean there are virtually no genetic problems that can’t be erased. It’s important to ensure we get to take advantage of the upcoming technologies by living well and not getting hit by a bus.

By the middle of this century we will have back-up copies of the information in our bodies and brains that make us who we are. Then we really will be immortal.

Source: https://www.theguardian.com
AND
http://www.thesun.co.uk/

Google Glass Used For Coronary Artery Surgery

Doctors in Poland used a virtual reality system combining a custom mobile application and Google Glass to clear a blocked coronary artery, one of the first uses of the technology to assist with surgery. The imaging system was used with a patient who had chronic total occlusion, a complete blockage of the artery, which doctors said is difficult to clear using standard catheter-based percutaneous coronary intervention, or PCI.

The system provides three-dimensional reconstructions of the artery and includes a hands-free voice recognition system that lets doctors zoom and change images. The head-mounted display allows doctors to capture images and video while also interacting with the environment around them. In patients with chronic total occlusion, the standard procedure is not always successful, at least partially because of the difficulty of visualizing the blockage with conventional coronary computed tomography angiography, or CTA, imaging.
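
The report does not detail how the custom mobile application maps voice commands to the display, so the following is only a hypothetical sketch of such a hands-free control loop: recognised phrases drive zoom and image-selection actions on the reconstructed CTA data. The command strings and the CTAViewer class are invented for illustration.

```python
# Hypothetical hands-free control loop (illustrative only, not the team's app):
# recognised voice commands are mapped to actions on the 3D reconstruction
# shown in the head-mounted display.
class CTAViewer:
    """Stand-in for the renderer of the reconstructed coronary artery."""
    def __init__(self):
        self.zoom = 1.0
        self.image_index = 0

    def apply(self, command: str) -> None:
        if command == "zoom in":
            self.zoom *= 1.25
        elif command == "zoom out":
            self.zoom /= 1.25
        elif command == "next image":
            self.image_index += 1
        elif command == "previous image":
            self.image_index = max(0, self.image_index - 1)
        print(f"zoom={self.zoom:.2f}, image={self.image_index}")

viewer = CTAViewer()
# In the real system these strings would come from a speech-recognition engine.
for spoken in ["zoom in", "zoom in", "next image", "zoom out"]:
    viewer.apply(spoken)
```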

“This case demonstrates the novel application of wearable devices for display of CTA data sets in the catheterization laboratory that can be used for better planning and guidance of interventional procedures, and provides proof of concept that wearable devices can improve operator comfort and procedure efficiency in interventional cardiology,” Dr. Maksymilian Opolski, of the Department of Interventional Cardiology and Angiology at the Institute of Cardiology in Warsaw (Poland), said in a press release.

Source: http://www.onlinecjc.ca/
AND
http://www.upi.com/

How To Interact With Virtual Reality

An interactive swarm of flying 3D pixels (voxels) developed at Queen’s University’s Human Media Lab (Canada) is set to revolutionize the way people interact with virtual reality. The system, called BitDrones, allows users to explore virtual 3D information by interacting with physical self-levitating building blocks.

Queen’s professor Roel Vertegaal and his students have unveiled the BitDrones system at the ACM Symposium on User Interface Software and Technology in Charlotte, North Carolina. BitDrones is the first step towards creating interactive self-levitating programmable matter – materials capable of changing their 3D shape in a programmable fashion – using swarms of nano quadcopters. The work highlights many possible applications for the new technology, including real-reality 3D modeling, gaming, molecular modeling, medical imaging, robotics and online information visualization.

“BitDrones brings flying programmable matter, such as featured in the futuristic Disney movie Big Hero 6, closer to reality,” says Dr. Vertegaal. “It is a first step towards allowing people to interact with virtual 3D objects as real physical objects.”

Dr. Vertegaal and his team at the Human Media Lab created three types of BitDrones, each representing self-levitating displays of distinct resolutions. “PixelDrones” are equipped with one LED and a small dot matrix display. “ShapeDrones” are augmented with a lightweight mesh and a 3D-printed geometric frame, and serve as building blocks for complex 3D models. “DisplayDrones” are fitted with a curved flexible high-resolution touchscreen, a forward-facing video camera and an Android smartphone board. All three BitDrone types are equipped with reflective markers, allowing them to be individually tracked and positioned in real time via motion capture technology. The system also tracks the user’s hand motion and touch, allowing users to manipulate the voxels in space.
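
Conceptually, each BitDrone is a tracked voxel whose position comes from the motion-capture system and whose target position the user can change by hand. The sketch below illustrates that idea with a minimal data structure and update loop; it is an assumption-laden illustration, not the Human Media Lab’s code, and the drone fields, grab radius and coordinates are invented.

```python
# Conceptual BitDrones-style sketch (illustration only, not the lab's system):
# each drone is a tracked voxel, and a motion-capture update lets a tracked
# hand "grab" the nearest voxel and drag it to a new target position.
from dataclasses import dataclass

@dataclass
class BitDrone:
    drone_id: int
    kind: str                            # "pixel", "shape" or "display"
    position: tuple = (0.0, 0.0, 1.0)    # metres, from the motion-capture system
    target: tuple = (0.0, 0.0, 1.0)      # where the flight controller should go

GRAB_RADIUS = 0.10  # assumed: hand closer than 10 cm "holds" the voxel

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def update(drones, hand_position):
    """One tick: if the tracked hand is near a drone, drag that drone along."""
    nearest = min(drones, key=lambda d: distance(d.position, hand_position))
    if distance(nearest.position, hand_position) < GRAB_RADIUS:
        nearest.target = hand_position   # the flight controller flies it there
    return nearest

swarm = [BitDrone(0, "pixel"), BitDrone(1, "shape", (0.3, 0.0, 1.0)),
         BitDrone(2, "display", (0.6, 0.0, 1.0))]
held = update(swarm, hand_position=(0.28, 0.0, 1.02))
print(held.drone_id, held.target)
```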

“We call this a Real Reality interface rather than a Virtual Reality interface. This is what distinguishes it from technologies such as Microsoft HoloLens and the Oculus Rift: you can actually touch these pixels, and see them without a headset,” says Dr. Vertegaal.

Source: http://www.hml.queensu.ca/

Invisible Drum Kit To Stay On Good Terms With Neighbours

What you’re seeing is not a trick … this is an invisible drum kit. Aerodrums uses real-time motion-tracking technology to accurately translate the movements of the drummer into sounds. But unlike a traditional drum kit, this one is small enough to carry in a bag and – when used with headphones – won’t annoy the neighbours. Bright light illuminates retro-reflective markers on the drumsticks and feet, with a high-speed camera tracking the motion.
A drummer himself since he was nine, co-inventor Richard Lee said complex algorithms make the Aerodrums experience realistic.
“It’s very velocity sensitive, so if you hit quiet you get very quiet sounds, if you hit loud, you hear loud sounds. It’s very responsive. It took a lot of research to get the latency as low as possible because that’s the thing that kind of breaks the illusion of ‘aerodrumming‘. If there’s any latency at all then drummers can perceive it and it doesn’t feel right,” says Richard Lee.
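
A velocity-sensitive air-drum trigger can be illustrated very simply: differentiate the tracked stick-tip height between camera frames and, when the tip crosses a virtual drum plane moving downwards, map its speed to loudness. The sketch below shows that idea only; the frame rate, drum-plane height and loudness scaling are invented values, and this is not Aerodrums’ actual algorithm.

```python
# Illustrative velocity-sensitive trigger (not Aerodrums' code): stick-tip
# heights from a high-speed camera are differenced to get speed, and a hit
# fires when the tip crosses a virtual drum plane moving downwards.
FRAME_RATE = 180.0        # assumed frames per second of the tracking camera
DRUM_PLANE_Y = 0.0        # assumed height of the virtual drum surface (metres)

def detect_hit(prev_y, curr_y, frame_rate=FRAME_RATE):
    """Return a MIDI-style loudness (1-127) if the stick tip crossed the
    virtual drum plane moving downwards this frame, else None."""
    if prev_y > DRUM_PLANE_Y >= curr_y:               # crossed the plane going down
        speed = (prev_y - curr_y) * frame_rate        # metres per second
        return max(1, min(127, int(speed * 12)))      # louder for faster strokes
    return None

# Two consecutive frames of stick-tip height: a fast downward stroke.
print(detect_hit(prev_y=0.03, curr_y=-0.01))   # prints 86: a fairly loud hit
```
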
“If you were backstage and warming up before you went out and played, you could choose those. If you were in a hotel room, you can just put your headphones on. And it’s great, it’s a great tool,” adds professional drummer Mike Dolbear, who has been performing and teaching the drums for 35 years. Here, he tries out Aerodrums for the first time… The makers have toyed with the idea of Aerodrums gloves for playing hand drums and even combining the system with a virtual reality headset. But co-inventor Yann Morvan said their priority was making Aerodrums a viable alternative for drummers, and not a novelty.
“We didn’t want Aerodrums to be a fad. We didn’t want it to be the gadget of the year and then it’s forgotten. We wanted it to be a proper musical instrument, that is introducing air drumming as a legitimate way to drum. And keeps going, keeps improving until it’s fine to air drum live, it’s fine to record using air drums because it’s a real musical instrument.”
Traditional drummers may take some convincing before they make the switch. But for those without the space, or who want to stay on good terms with their neighbours, Aerodrums could hit the spot.
Source: http://aerodrums.com/
AND
http://www.reuters.com/