Posts belonging to Category Augmented Reality



Green Solar Panels And Other Colors

Researchers from AMOLF, the University of Amsterdam (UvA) and the Energy Research Centre of the Netherlands (ECN) have developed a technology to create efficient, bright green solar panels. Arrays of silicon nanoparticles integrated in the front module glass of a silicon heterojunction solar cell scatter a narrow band of the solar spectrum and create a green appearance over a wide range of angles. The remainder of the solar spectrum is efficiently coupled into the solar cell, so the current generated by the panel is reduced by only 10%. The realization of efficient, colorful solar panels is an important step toward integrating solar panels into the built environment and landscape.
Photovoltaic research has long focused on maximizing the electricity yield of solar panels: nowadays, commercial panels convert sunlight into electricity with a maximum efficiency of around 22%. To reach such high efficiency, silicon solar cells are equipped with a textured surface and an antireflection layer to absorb as much light as possible, which gives the panels a dark blue or black appearance.
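A back-of-the-envelope sketch ties the two numbers in this post together. Assuming (an assumption on our part, not a claim from the researchers) that module efficiency scales linearly with the generated photocurrent, the 10% current loss from the green front glass would turn a ~22%-efficient panel into a roughly 19.8%-efficient one:

```python
# Back-of-the-envelope estimate (assumes efficiency scales linearly
# with photocurrent, which is only a first-order approximation):
# the green front glass cuts the generated current by 10%, so a module
# that would otherwise convert ~22% of sunlight drops to roughly 19.8%.

base_efficiency = 0.22   # typical high-end commercial module (from the text)
current_loss = 0.10      # current reduction reported for the green panels

green_efficiency = base_efficiency * (1 - current_loss)
print(f"Estimated green-panel efficiency: {green_efficiency:.1%}")
```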

To create the colored solar panels, the researchers exploited Mie scattering: the resonant backscattering of light of a particular color by nanoparticles. They integrated dense arrays of silicon nanocylinders with a diameter of 100 nm into the top module cover slide of a high-efficiency silicon heterojunction solar cell. Due to the resonant nature of the light-scattering effect, only the green part of the spectrum is reflected; the other colors are fully coupled into the solar cell. The current generated by the mini solar panel (0.7 × 0.7 cm²) is reduced by only 10%. The panel appears green over a broad range of angles, up to 75 degrees. The nanoparticles are fabricated using soft-imprint lithography, a technique that can readily be scaled up to large-area fabrication.
The light-scattering effect of the Mie resonances is easily controllable: changing the size of the nanoparticles tunes the wavelength of the resonant scattering. Following this principle, the researchers are now working to realize solar cells in other colors, and on combining different colors to create solar panels with a white appearance. For the large-scale application of solar panels, it is essential that their color can be tailored.
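To make the size-to-color tuning concrete, here is an illustrative toy model (not the authors' actual design method): to first order, the Mie resonance wavelength of a high-index nanoparticle scales roughly linearly with its diameter. We calibrate that linear scaling to the single data point the article reports (100 nm cylinders scattering green light, taken here as ~530 nm — an assumed value) and estimate the diameters needed for other colors:

```python
# Illustrative first-order sketch, NOT the researchers' model:
# assume the Mie resonance wavelength scales linearly with particle
# diameter, calibrated to the article's data point (100 nm -> green).

GREEN_NM = 530          # assumed center of the green band
CAL_DIAMETER_NM = 100   # cylinder diameter reported in the article

scale = GREEN_NM / CAL_DIAMETER_NM  # nm of wavelength per nm of diameter

def diameter_for_wavelength(target_nm):
    """Estimate the cylinder diameter that would scatter a given wavelength."""
    return target_nm / scale

for color, wavelength in [("blue", 470), ("green", 530), ("red", 630)]:
    print(f"{color:5s} (~{wavelength} nm): d \u2248 {diameter_for_wavelength(wavelength):.0f} nm")
```

Under this crude linear scaling, red panels would need somewhat larger cylinders and blue panels somewhat smaller ones, which matches the qualitative tuning direction described in the text.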

The new design was published online in the journal Applied Physics Letters.

Source: https://amolf.nl/

Use The Phone And See 3D Content Without 3D Glasses

RED, the company known for making some truly outstanding high-end cinema cameras, is set to release a smartphone in Q1 of 2018 called the HYDROGEN ONE. RED says that it is a standalone, unlocked and fully-featured smartphone “operating on Android OS that just happens to add a few additional features that shatter the mold of conventional thinking.” Yes, you read that right. This phone will blow your mind, or something – and it will even make phone calls.

In a press release riddled with buzzwords broken up by linking verbs, RED praises its yet-to-be-released smartphone with some serious adjectives. If this press release weren’t hosted on RED’s own server, we would swear it was satire. Here is a smattering of phrases found in the release.

  • Incredible retina-riveting display
  • Nanotechnology
  • Holographic multi-view content
  • RED Hydrogen 4-View content
  • Assault your senses
  • Proprietary H3O algorithm
  • Multi-dimensional audio

There are two models of the phone, which come at different prices. The Aluminum model will cost $1,195, but anyone worth their salt is going to go for the $1,595 Titanium version. Gotta shed that extra weight, you know?

Those are snippets from just the first three sections, of which there are nine. I get hyping a product, but this reads like a catalog seen in the background of a science-fiction comedy, meant to sound ridiculous – especially in the context of a fictitious universe.

Except that this is real life.

After spending a few minutes stripping all the glitter words from this release, it looks like the HYDROGEN ONE will use a display similar to the Nintendo 3DS’s, or, as The Verge points out, perhaps a better take on what the flopped Amazon Fire Phone attempted. Essentially, you should be able to use the phone and see 3D content without 3D glasses. Nintendo has already proven that can work, though it can really tire out your eyes. As an owner of three different Nintendo 3DS consoles, I can say that I rarely use the 3D feature because of how it makes my eyes hurt. It’s an odd sensation. It is probably why Nintendo has released a new handheld that has the same power as the 3DS but drops the 3D feature altogether.

Anyway, back to the HYDROGEN ONE: RED says it will work in tandem with the company’s cameras as a user interface and monitor. It will also display what RED is calling “holographic content,” which isn’t well described in this release. We can assume it is some sort of mixed-dimensional view that makes certain parts of a video or image stand out over the others.

Source: http://www.red.com/
AND
http://www.imaging-resource.com/

Legally Blind People Can See With A New Kind Of Glasses

A Canadian company based in Toronto has succeeded in building a kind of Google Glass that can give sight back to legally blind people. The eSight is an augmented reality headset that houses a high-speed, high-definition camera to capture everything the user is looking at.



Algorithms enhance the video feed and display it on two OLED screens in front of the user’s eyes. Full-color video images are clearly seen by the eSight user with unprecedented visual clarity and virtually no lag. With eSight’s patented Bioptic Tilt capability, users can adjust the device to the precise position that, for them, presents the best view of the video while maximizing side peripheral vision. This ensures the user’s balance and prevents nausea – common problems with other immersive technologies. A blind individual can use both hands while using eSight to see. The device is lightweight, worn comfortably around the eyes, and designed for various environments and all-day use.
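The article doesn't say what eSight's proprietary enhancement algorithms actually do, but a minimal sketch of one common building block — a linear contrast stretch that remaps a low-contrast camera frame onto the display's full intensity range — gives a feel for the camera-to-screen pipeline (this is our illustrative stand-in, not eSight's method):

```python
# Illustrative sketch of one generic enhancement step (a linear contrast
# stretch); eSight's actual algorithms are proprietary and undisclosed.

def contrast_stretch(frame, out_min=0, out_max=255):
    """Linearly remap pixel intensities so they span [out_min, out_max]."""
    lo, hi = min(frame), max(frame)
    if hi == lo:                      # flat frame: nothing to stretch
        return [out_min] * len(frame)
    scale = (out_max - out_min) / (hi - lo)
    return [round(out_min + (p - lo) * scale) for p in frame]

# A low-contrast 8-bit "frame" (flattened): values cluster in 100..140
frame = [100, 110, 120, 130, 140, 125, 115, 105]
print(contrast_stretch(frame))  # remapped to span the full 0..255 range
```

In a real headset this kind of per-frame remapping (plus magnification, edge enhancement, and so on) would run continuously on the live video feed, which is why the near-zero lag the article mentions matters so much.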

eSight is a comprehensive, customized medical device that can replace the many single-task assistive devices currently available that do not provide actual sight (e.g. white canes, magnifying devices, service animals, Braille machines, CCTV scanners, text-to-speech software). It allows a user to instantly auto-focus from short-range vision (reading a book or text on a smartphone) to mid-range vision (seeing faces or watching TV) to long-range vision (looking down a hallway or outside a window). It is the only device for the legally blind that enables mobility without causing imbalance or nausea (common with other immersive options). A legally blind individual can use eSight not just to see while sitting down but while being independently mobile (e.g. walking, exercising, commuting, travelling, etc.).

According to The Wall Street Journal, the company is taking advantage of recent improvements in VR-headset and smartphone technology that have trickled down into the latest version of the eSight. So far, the company has sold roughly a thousand units, but at $10,000 apiece they’re not cheap (and most insurers apparently don’t cover the product), though eSight’s chief executive Brian Mech tells the WSJ that getting devices to users is “a battle we are starting to wage.”

Source: https://www.esighteyewear.com/

A Brain-computer Interface To Combat The Rise of AI

Elon Musk is attempting to combat the rise of artificial intelligence (AI) with the launch of his latest venture, brain-computer interface company Neuralink. Little is known about the startup beyond what has been revealed in a Wall Street Journal report, but sources have described a “neural lace” technology the company is engineering to allow humans to communicate seamlessly with technology without the need for an actual, physical interface. The company has also been registered in California as a medical research entity, because Neuralink’s initial focus will be on using the interface to help with the symptoms of chronic conditions, from epilepsy to depression. This is said to be similar to how deep brain stimulation controlled by an implant helps Matt Eagles, who has Parkinson’s, manage his symptoms effectively.

This is far from the first time Musk has shown an interest in merging man and machine. At a Tesla launch in Dubai earlier this year, the billionaire spoke about the need for humans to become cyborgs if we are to survive the rise of artificial intelligence.


“Over time I think we will probably see a closer merger of biological intelligence and digital intelligence,” CNBC reported him as saying at the time. “It’s mostly about the bandwidth, the speed of the connection between your brain and the digital version of yourself, particularly output.” Transhumanism, the enhancement of humanity’s capabilities through science and technology, is already a living reality for many people, to varying degrees. Documentary-maker Rob Spence replaced one of his own eyes with a video camera in 2008; amputees are using prosthetics connected to their own nerves and controlled using electrical signals from the brain; implants are helping tetraplegics regain independence through the BrainGate project.

Former director of the United States Defense Advanced Research Projects Agency (DARPA), Arati Prabhakar, comments: “From my perspective, which embraces a wide swathe of research disciplines, it seems clear that we humans are on a path to a more symbiotic union with our machines.”

Source: http://www.wired.co.uk/

How Brain Waves Can Control VR Video Games

Virtual reality is still so new that the best way for us to interact within it is not yet clear. One startup wants you to use your head, literally: it’s tracking brain waves and using the result to control VR video games.

Boston-based startup Neurable is focused on deciphering brain activity to determine a person’s intention, particularly in virtual and augmented reality. The company uses dry electrodes to record brain activity via electroencephalography (EEG); then software analyzes the signal and determines the action that should occur.
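Neurable's actual pipeline is proprietary, but the general idea of "decipher the EEG signal, then trigger an action" can be sketched with a crude, hypothetical detector: average short signal epochs that are time-locked to a stimulus, and fire when the averaged response crosses a threshold (averaging suppresses uncorrelated background noise, so a consistent brain response stands out). This is a toy stand-in for illustration only, not Neurable's method:

```python
# Toy sketch of the general EEG-intent idea, NOT Neurable's algorithm:
# average several stimulus-locked epochs sample-by-sample and fire an
# action when the averaged response peaks above a threshold.  Random
# background noise tends to cancel in the average; a genuine, repeated
# brain response does not.

def detect_intent(epochs, threshold=5.0):
    """Return True if the epoch-averaged signal peaks above threshold."""
    n = len(epochs)
    avg = [sum(e[i] for e in epochs) / n for i in range(len(epochs[0]))]
    return max(avg) > threshold

# Three simulated epochs: a consistent bump around sample 2 plus noise
epochs = [
    [0.5, 1.0, 7.0, 1.5, 0.2],
    [0.1, 0.8, 6.5, 1.0, 0.4],
    [0.3, 1.2, 7.5, 0.9, 0.1],
]
print(detect_intent(epochs))  # the shared bump survives averaging -> True
```

A real system would add filtering, artifact rejection, and a trained classifier, but the "subconscious response" Alcaide describes below is exactly this kind of involuntary, repeatable signal that software can pick out.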


“You don’t really have to do anything,” says cofounder and CEO Ramses Alcaide, who developed the technology as a graduate student at the University of Michigan. “It’s a subconscious response, which is really cool.”

Neurable, which raised $2 million in venture funding late last year, is still in the early stages: its demo hardware looks like a bunch of electrodes attached to straps that span a user’s head, worn along with an HTC Vive virtual-reality headset. Unlike the headset, Neurable’s contraption is wireless—it sends data to a computer via Bluetooth. The startup expects to offer software tools for game development later this year, and it isn’t planning to build its own hardware; rather, Neurable hopes companies will be making headsets with sensors to support its technology in the next several years.

Source: https://www.technologyreview.com/
AND
http://neurable.com/

Virtual Images that Blend In And Interact With The Real-World

Avegant, a Silicon Valley startup that sells a pair of headphones equipped with a VR-like portable screen, is breaking into augmented reality. The company today announced that it’s developed a new type of headset technology powered by a so-called light field display.


The research prototype, which Avegant eventually plans on turning into a consumer product, is based on the company’s previous work with its Glyph projector. That device was a visor of sorts that floats a virtual movie screen in front of your eyes, and developing it gave Avegant insight into how to build an AR headset of its own.

Like Microsoft’s HoloLens and the supposed prototype from secretive AR startup Magic Leap, Avegant’s new headset creates virtual images that blend in and interact with the real-world environment. In a demo, the company’s wired prototype proved superior in key ways to the developer version of the HoloLens. Avegant attributes this not to the power of its tethered PC but to the device’s light field display – a technology Magic Leap also claims to have developed but has never shown to the public.

The demo I experienced featured a tour of a virtual Solar System, an immersion within an ocean environment, and a conversation with a virtual life-sized human being standing in the same room. To be fair, Avegant was using a tethered and bulky headset that wasn’t all that comfortable, while the HoloLens developer version is a refined wireless device. Yet with that said, Avegant’s prototype managed to expand the field of view, so you’re looking through a window more the size of a Moleskine notebook instead of a pack of playing cards. The images it produced also felt sharper, richer, and more realistic.

In the Solar System demo, I was able to observe a satellite orbiting an Earth no larger than a bocce ball and identify the Great Red Spot on Jupiter. Avegant constructed its demo to show off how these objects could exist at different focal lengths in a fixed environment — in this case a converted conference room at the company’s Belmont, California office. So I was able to stand behind the Sun and squint until the star went out of focus in one corner of my vision and a virtual Saturn and its rings became crystal clear in the distance.

Source: http://www.theverge.com/

Apple Testing Augmented Reality ‘Smart Glasses’

As part of its effort to expand further into wearable devices, Apple is working on a set of smart glasses, reports Bloomberg. Citing sources familiar with Apple‘s plans, the site says the smart glasses would connect wirelessly to the iPhone, much like the Apple Watch, and would display “images and other information” to the wearer. Apple has contacted potential suppliers about its glasses project and has ordered “small quantities” of near-eye displays, suggesting the project is in the exploratory prototyping phase of development. If work on the glasses progresses, they could be released in 2018.


“AR can be really great,” said Tim Cook, Apple’s CEO, in July. “We have been and continue to invest a lot in this. We’re high on AR in the long run.”

Apple’s glasses sound similar to Google Glass, the head-mounted display that Google first introduced in 2013. Google Glass used augmented reality and voice commands to let users check the weather, make phone calls, and capture photographs; Apple’s product could be similar in functionality. The glasses may be Apple’s first hardware product targeted directly at AR, one of the people said. Cook has beefed up Apple’s AR capabilities through acquisitions: in 2013, Apple bought PrimeSense, which developed the motion-sensing technology used in Microsoft’s Kinect gaming system, and purchases of software startups in the field, Metaio Inc. and Flyby Media Inc., followed in 2015 and 2016.

Google Glass was highly criticized because of privacy concerns, and as a result, it never really caught on with consumers. Google eventually stopped developing Google Glass in January of 2015. It is not clear how Apple would overcome the privacy and safety issues that Google faced, nor if the project will progress, but Apple CEO Tim Cook has expressed Apple‘s deep interest in augmented reality multiple times over the last few months, suggesting something big is in the works.

Past rumors have also indicated Apple is exploring a number of virtual and augmented reality projects, including a full VR headset. Apple has a full team dedicated to AR and VR research and how the technologies can be incorporated into future Apple products. Cook recently said that he believes augmented reality would be more useful and interesting to people than virtual reality.

Source: http://www.macrumors.com/