Computer Reads Body Language

Researchers at Carnegie Mellon University’s Robotics Institute have enabled a computer to understand body poses and movements of multiple people from video in real time — including, for the first time, the pose of each individual’s hands and fingers. This new method was developed with the help of the Panoptic Studio — a two-story dome embedded with 500 video cameras — and the insights gained from experiments in that facility now make it possible to detect the pose of a group of people using a single camera and a laptop computer.

Yaser Sheikh, associate professor of robotics, said these methods for tracking 2-D human form and motion open up new ways for people and machines to interact with each other and for people to use machines to better understand the world around them. The ability to recognize hand poses, for instance, will make it possible for people to interact with computers in new and more natural ways, such as communicating with computers simply by pointing at things.

Detecting the nuances of nonverbal communication between individuals will allow robots to serve in social spaces, allowing robots to perceive what people around them are doing, what moods they are in and whether they can be interrupted. A self-driving car could get an early warning that a pedestrian is about to step into the street by monitoring body language. Enabling machines to understand human behavior also could enable new approaches to behavioral diagnosis and rehabilitation, for conditions such as autism, dyslexia and depression.


“We communicate almost as much with the movement of our bodies as we do with our voice,” Sheikh said. “But computers are more or less blind to it.”

In sports analytics, real-time pose detection will make it possible for computers to track not only the position of each player on the field of play, as is now the case, but to know what players are doing with their arms, legs and heads at each point in time. The methods can be used for live events or applied to existing videos.

To encourage more research and applications, the researchers have released their computer code for both multi-person and hand pose estimation. It is being widely used by research groups, and more than 20 commercial groups, including automotive companies, have expressed interest in licensing the technology, Sheikh said.
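For developers picking up the released code, the output of a multi-person pose estimator is typically a list of detected people, each with 2-D keypoint coordinates and confidence scores. The sketch below shows how such output might be consumed; the JSON layout, field names and threshold here are illustrative assumptions, not the exact format of the CMU release.

```python
import json

# Hypothetical output from a multi-person pose detector: one entry per person,
# keypoints stored as flat [x, y, confidence] triples (format is illustrative).
detections = json.loads("""
{
  "people": [
    {"pose_keypoints_2d": [412.0, 160.5, 0.93, 405.2, 210.0, 0.88, 350.1, 215.7, 0.71]},
    {"pose_keypoints_2d": [120.4, 300.2, 0.45, 118.9, 352.6, 0.20, 90.3, 360.0, 0.05]}
  ]
}
""")

CONF_THRESHOLD = 0.3  # ignore low-confidence joints

def reliable_keypoints(flat):
    """Group a flat [x, y, c, x, y, c, ...] list into (x, y) points above threshold."""
    triples = zip(flat[0::3], flat[1::3], flat[2::3])
    return [(x, y) for x, y, c in triples if c >= CONF_THRESHOLD]

for i, person in enumerate(detections["people"]):
    pts = reliable_keypoints(person["pose_keypoints_2d"])
    print(f"person {i}: {len(pts)} reliable joints")
```

An application (sports analytics, say) would then track these per-person point sets frame to frame.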

Sheikh and his colleagues presented their multi-person and hand pose detection methods at CVPR 2017, the Computer Vision and Pattern Recognition Conference, in Honolulu.

Source: https://www.cmu.edu/

Nano Robots Build Molecules

Scientists at The University of Manchester have created the world’s first ‘molecular robot’ that is capable of performing basic tasks including building other molecules.

The tiny robots, which are a millionth of a millimetre in size, can be programmed to move and build molecular cargo, using a tiny robotic arm.

Each individual robot is capable of manipulating a single molecule and is made up of just 150 carbon, hydrogen, oxygen and nitrogen atoms. To put that size into context, a billion billion of these robots piled on top of each other would still only be the same size as a single grain of salt. The robots operate by carrying out chemical reactions in special solutions which can then be controlled and programmed by scientists to perform the basic tasks.

In the future such robots could be used for medical purposes, advanced manufacturing processes and even building molecular factories and assembly lines.

“All matter is made up of atoms and these are the basic building blocks that form molecules. Our robot is literally a molecular robot constructed of atoms, just like you can build a very simple robot out of Lego bricks,” explains Professor David Leigh, who led the research at the University’s School of Chemistry. “The robot then responds to a series of simple commands that are programmed with chemical inputs by a scientist. It is similar to the way robots are used on a car assembly line. Those robots pick up a panel and position it so that it can be riveted in the correct way to build the bodywork of a car. So, just like the robot in the factory, our molecular version can be programmed to position and rivet components in different ways to build different products, just on a much smaller scale at a molecular level.”

The research has been published in Nature.

Source: http://www.manchester.ac.uk/

Robots With The Sense Of Touch

A team of researchers from the University of Houston (UH) has reported a breakthrough in stretchable electronics that can serve as an artificial skin, allowing a robotic hand to sense the difference between hot and cold, while also offering advantages for a wide range of biomedical devices.

Cunjiang Yu, Bill D. Cook Assistant Professor of mechanical engineering and lead author for the paper, said the work is the first to create a semiconductor in a rubber composite format, designed to allow the electronic components to retain functionality even after the material is stretched by 50 percent. The semiconductor in rubber composite format enables stretchability without any special mechanical structure. Yu noted that traditional semiconductors are brittle and using them in otherwise stretchable materials has required a complicated system of mechanical accommodations. “That’s both more complex and less stable than the new discovery, as well as more expensive.”

“Our strategy has advantages for simple fabrication, scalable manufacturing, high-density integration, large strain tolerance and low cost,” he said.

Yu and the rest of the team – co-authors include first author Hae-Jin Kim, Kyoseung Sim and Anish Thukral, all with the UH Cullen College of Engineering – created the electronic skin and used it to demonstrate that a robotic hand could sense the temperature of hot and iced water in a cup. The skin also was able to interpret computer signals sent to the hand and reproduce the signals as sign language gestures.

“The robotic skin can translate the gesture to readable letters that a person like me can understand and read,” Yu said.

The work is reported in the journal Science Advances.

Source: http://www.uh.edu/

NanoRobots With Grippers Travel Through the Bloodstream To Capture Cancer Cells

A research team at Worcester Polytechnic Institute (WPI) has developed a revolutionary, light-activated semiconductor nanocomposite material that can be used in a variety of applications, including microscopic actuators and grippers for surgical robots, light-powered micro-mirrors for optical telecommunications systems, and more efficient solar cells and photodetectors.

“This is a new area of science,” said Balaji Panchapakesan, associate professor of mechanical engineering at WPI and lead author of a paper about the new material published in Scientific Reports, an open access journal from the publishers of Nature. “Very few materials are able to convert photons directly into mechanical motion. In this paper, we present the first semiconductor nanocomposite material known to do so. It is a fascinating material that is also distinguished by its high strength and its enhanced optical absorption when placed under mechanical stress.”

“Tiny grippers and actuators made with this material could be used on Mars rovers to capture fine dust particles,” Panchapakesan noted. “They could travel through the bloodstream on tiny robots to capture cancer cells or take minute tissue samples. The material could be used to make micro-actuators for rotating mirrors in optical telecommunications systems; they would operate strictly with light, and would require no other power source.”

Like other semiconductor materials, molybdenum disulfide, the material described in the Scientific Reports paper, is characterized by the way electrons are arranged and move about within its atoms.

Source: https://www.wpi.edu/

Robots Can Speak Like Real Humans

Generating speech from a piece of text is a common and important task undertaken by computers, but it’s pretty rare that the result could be mistaken for ordinary speech. A new technique from researchers at Alphabet’s DeepMind (Google) takes a completely different approach, producing speech and even music that sounds eerily like the real thing.


Early systems used a large library of the parts of speech (phonemes and morphemes) and a large ruleset that described all the ways letters combined to produce those sounds. The pieces were joined, or concatenated, creating functional speech synthesis that can handle most words, albeit with unconvincing cadence and tone. Later systems parameterized the generation of sound, making a library of speech fragments unnecessary. More compact — but often less effective.

WaveNet, as the system is called, takes things deeper. It simulates the sound of speech at as low a level as possible: one sample at a time. That means building the waveform from scratch, at 16,000 samples per second.

Each dot is a separately calculated sample; the aggregate is the digital waveform.

You already know from the headline, but if you don’t, you probably would have guessed what makes this possible: neural networks. In this case, the researchers fed a ton of ordinary recorded speech to a convolutional neural network, which created a complex set of rules that determined which tones follow other tones in every common context of speech.

Each sample is determined not just by the sample before it, but the thousands of samples that came before it. They all feed into the neural network’s algorithm; it knows that certain tones or samples will almost always follow each other, and certain others will almost never. People don’t speak in square waves, for instance.
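The autoregressive loop described above can be sketched in miniature. The `toy_model` below is a fabricated placeholder for the trained network (a real WaveNet outputs a probability distribution over amplitude values and samples from it); only the one-sample-at-a-time structure is the point.

```python
import math

SAMPLE_RATE = 16000  # samples per second, as in the article
FREQ = 440           # toy target: a 440 Hz tone

def toy_model(context):
    """Stand-in for the trained network: predicts the next sample from past ones.

    This placeholder uses the linear recurrence x[t] = 2*cos(w)*x[t-1] - x[t-2],
    which continues a pure sine wave from its two most recent samples -- enough
    to show how each sample is conditioned on the samples before it.
    """
    w = 2 * math.pi * FREQ / SAMPLE_RATE
    if len(context) < 2:
        return 0.5 * math.sin(w * len(context))
    return 2 * math.cos(w) * context[-1] - context[-2]

def generate(n_samples):
    """Autoregressive generation: every new sample depends on all previous ones."""
    waveform = []
    for _ in range(n_samples):
        waveform.append(toy_model(waveform))
    return waveform

audio = generate(160)  # 10 ms of audio at 16 kHz, built one sample at a time
```

The expensive part in the real system is that the "next sample" function is a deep convolutional network with thousands of past samples in its receptive field, run 16,000 times per second of audio.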

Source: https://techcrunch.com/tone

Robots That Feel And Touch Like Humans

Smart synthetic skins have the potential to allow robots to touch and sense what’s around them, but keeping them powered up and highly sensitive at low cost has been a challenge. Now scientists report in the journal ACS Nano a self-powered, transparent smart skin that is simpler and less costly than many other versions that have been developed.


Endowing robots and prosthetics with a human-like sense of touch could dramatically advance these technologies. Toward this goal, scientists have come up with various smart skins to layer onto devices. But boosting their sensitivity has involved increasing the numbers of electrodes, depending on the size of the skin. This leads to a rise in costs. Other systems require external batteries and wires to operate, which adds to their bulk. Haixia Zhang and colleagues wanted to find a more practical solution.

The researchers created a smart skin out of ultra-thin plastic films and just four electrodes made from silver nanowires. Other prototypes contain up to 36 electrodes. Additionally, one component harvests mechanical energy — for example, from the movement of a prosthetic hand’s fingers — and turns it into an electric current.

Source: http://www.acs.org/

Perfect Artificial Skin For Robots

A pioneering new technique to produce high-quality, low cost graphene could pave the way for the development of the first truly flexible ‘electronic skin’ that could be used in robots.

Researchers from the University of Exeter (UK) have discovered an innovative new method to produce the wonder material graphene significantly more cheaply and easily than previously possible.

The research team, led by Professor Monica Craciun, has used this new technique to create the first transparent and flexible touch-sensor that could enable the development of artificial skin for use in robot manufacturing. Professor Craciun, from Exeter’s Engineering department, believes the new discovery could pave the way for “a graphene-driven industrial revolution” to take place.


“The vision for a ‘graphene-driven industrial revolution’ is motivating intensive research on the synthesis of high quality and low cost graphene. Currently, industrial graphene is produced using a technique called Chemical Vapour Deposition (CVD). Although there have been significant advances in recent years in this technique, it is still an expensive and time consuming process,” she said.

The Exeter researchers have now discovered a new technique, which grows graphene in an industrial cold wall CVD system, a state-of-the-art piece of equipment recently developed by UK graphene company Moorfield.

This so-called nanoCVD system is based on a concept already used for other manufacturing purposes in the semiconductor industry. It shows the semiconductor industry, for the first time, a way to potentially mass produce graphene with present facilities rather than requiring them to build new manufacturing plants. The new technique grows graphene 100 times faster than conventional methods, reduces costs by 99% and produces material with enhanced electronic quality.

These research findings are published in the journal Advanced Materials.

Source: http://www.exeter.ac.uk/

Brain Waves Control Robotic Hand’s Fingers

Easton LaChappelle was 14 when he first started taking apart toasters. Five years on, he’s being touted as a global leader in robotics, for his range of low-cost Anthromod robotic hands developed in his bedroom. Some can be controlled by a user’s mind.
“A good example is we actually had an amputee use the wireless brainwave headset to control a hand, and he was able to fluently control the robotic hand in right around 10 minutes, so the learning curve is hardly a learning curve anymore,” explains LaChappelle, who taught himself how to design, make and code his creations. Using a device that picks up on electrical impulses coming from the brain, he can manipulate his robotic hand’s fingers.
“We actually track patterns and try and convert that into movement. So with this I’m actually able to change grips, grip patterns, based on facial gestures, and then use the raw actual brainwaves and focus to actually close the hand or open the clamp or hand.” LaChappelle’s robotics aren’t the first to be controlled by brainwave frequencies – scientists in Austria fitted a truck driver with something similar in 2010. But that’s not where the magic ends.
“3D printing allows you to create something that’s human-like, something that’s extremely customised, again for a very low cost, which for certain applications such as prosthetics, is a really big part of it.” The hands cost as little as 600 dollars to make. LaChappelle wants others to use his work as a platform to create customised versions for themselves; he’s made his software open source. That could eventually mean robots being sent in on search and rescue missions, as well as improving the lives of amputees globally.
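The control pipeline LaChappelle describes (facial gestures select a grip mode, raw brainwave focus opens or closes the hand) can be sketched as follows. Every name, threshold and grip mode here is invented for illustration; this is not his actual software.

```python
# Illustrative sketch only: a consumer EEG headset typically reports a scalar
# "focus" level plus discrete facial-gesture events, and control software maps
# those to hand commands. Thresholds and grip names below are made up.

GRIPS = ["open palm", "pinch", "fist"]
CLOSE_THRESHOLD = 60   # focus level (0-100) above which the hand closes

class HandController:
    def __init__(self):
        self.grip_index = 0

    def on_gesture(self, gesture):
        """A facial gesture (e.g. a blink pattern) cycles through grip modes."""
        if gesture == "blink":
            self.grip_index = (self.grip_index + 1) % len(GRIPS)

    def on_focus(self, focus_level):
        """Raw focus drives the selected grip: concentrate to close, relax to open."""
        action = "close" if focus_level >= CLOSE_THRESHOLD else "open"
        return f"{action} ({GRIPS[self.grip_index]})"

controller = HandController()
print(controller.on_focus(75))   # concentrating -> "close (open palm)"
controller.on_gesture("blink")   # facial gesture switches grip mode
print(controller.on_focus(30))   # relaxing -> "open (pinch)"
```

The short learning curve he mentions fits this design: the user only has to learn to modulate one continuous signal and a couple of discrete gestures.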

Source: http://www.reuters.com/

Liquid-Metal Alloys For “Soft Robots”

New research shows how inkjet-printing technology can be used to mass-produce electronic circuits made of liquid-metal alloys for “soft robots” and flexible electronics. Elastic technologies could make possible a new class of pliable robots and stretchable garments that people might wear to interact with computers or for therapeutic purposes. However, new manufacturing techniques must be developed before soft machines become commercially feasible, said Rebecca Kramer, an assistant professor of mechanical engineering at Purdue University.
“We want to create stretchable electronics that might be compatible with soft machines, such as robots that need to squeeze through small spaces, or wearable technologies that aren’t restrictive of motion,” she said. “Conductors made from liquid metal can stretch and deform without breaking.”

A new potential manufacturing approach focuses on harnessing inkjet printing to create devices made of liquid alloys.

This artistic rendering depicts electronic devices created using the new inkjet-printing technology to produce circuits made of liquid-metal alloys for “soft robots” and flexible electronics.

“This process now allows us to print flexible and stretchable conductors onto anything, including elastic materials and fabrics,” Kramer said.

“Liquid metal in its native form is not inkjet-able,” she underscores. “So what we do is create liquid metal nanoparticles that are small enough to pass through an inkjet nozzle.”

After printing, the nanoparticles must be rejoined by applying light pressure, which renders the material conductive.

A research paper about the method will appear on April 18 in the journal Advanced Materials.

Source: http://www.purdue.edu/

A.I., Nanotechnology ‘threaten civilisation’

A report from the Global Challenges Foundation created the first list of global risks with impacts that for all practical purposes can be called infinite. It is also the first structured overview of key events related to such risks and has tried to provide initial rough quantifications for the probabilities of these impacts.
Besides the usual major risks such as extreme climate change, nuclear war, super volcanoes or asteroid impacts, there are three emerging new global risks: Synthetic Biology, Nanotechnology and Artificial Intelligence (A.I.).
The real focus is not on the almost unimaginable impacts of the risks the report outlines. Its fundamental purpose is to encourage global collaboration and to use this new category of risk as a driver for innovation.

In the case of AI, the report suggests that future machines and software with “human-level intelligence” could create new, dangerous challenges for humanity – although they could also help to combat many of the other risks cited in the report. “Such extreme intelligences could not easily be controlled (either by the groups creating them, or by some international regulatory regime), and would probably act to boost their own intelligence and acquire maximal resources for almost all initial AI motivations,” suggest authors Dennis Pamlin and Stuart Armstrong.
In the case of nanotechnology, the report notes that “atomically precise manufacturing” could have a range of benefits for humans. It could help to tackle challenges including depletion of natural resources, pollution and climate change. But it foresees risks too.
“It could create new products – such as smart or extremely resilient materials – and would allow many different groups or even individuals to manufacture a wide range of things,” suggests the report. “This could lead to the easy construction of large arsenals of conventional or more novel weapons made possible by atomically precise manufacturing.”

Source: http://globalchallenges.org/

Robot Lifting Loads 500 Times Its Own Weight

An engineering team from the National University of Singapore (NUS), led by Dr Arian Koh, has achieved a world record. They have designed an artificial muscle that can carry 80 times its own weight while extending to five times its original length. The team’s invention will pave the way for constructing life-like robots with superhuman strength and ability.
Artificial muscles have previously been known to extend to only three times their original length when similarly stressed. A muscle’s degree of extensibility is a significant factor in its efficiency, as it means it can perform a wider range of operations while carrying heavy loads. Robots, no matter how intelligent, are restricted by muscles that can lift loads only half their own weight – about equivalent to an average human’s strength (though some humans can lift loads up to three times their weight).
Dr Koh and his team used polymers which could be stretched over 10 times their original length. Translated scientifically, this means that these muscles have a strain displacement of 1,000 per cent. A good understanding of the fundamentals was largely the cause of their success, Dr Koh added.
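As a quick check on the 1,000 per cent figure: engineering strain is the change in length divided by the original length, so a muscle stretched to 11 times its original length (just over the 10-fold stretch mentioned) shows exactly 1,000 per cent strain.

```python
def engineering_strain_percent(original_length, stretched_length):
    """Engineering strain: change in length over original length, as a percentage."""
    return (stretched_length - original_length) / original_length * 100

# Stretched to 11x original length -> the quoted 1,000 per cent strain.
print(engineering_strain_percent(1.0, 11.0))  # 1000.0
# Stretched to exactly 10x original length -> 900 per cent strain.
print(engineering_strain_percent(1.0, 10.0))  # 900.0
```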

“We put theory to good use. Last year, we calculated theoretically that polymer muscles driven by electrical impulse could potentially have a strain displacement of 1,000 per cent, lifting a load of up to 500 times its own weight. So I asked my students to strive towards this Holy Grail, no matter how impossible it sounded,” he said.
Source: http://www.eng.nus.edu.sg/

Artificial Skin For Robots Like Human Skin

Researchers from the Georgia Institute of Technology have fabricated arrays of piezotronic transistors capable of converting mechanical motion directly into electronic controlling signals. The arrays could help give robots a more adaptive sense of touch. Mimicking the sense of touch electronically has been challenging, and is currently done by measuring changes in resistance prompted by mechanical touch. The devices developed by Georgia Tech scientists rely on a different physical phenomenon – tiny polarization charges formed when piezoelectric materials such as zinc oxide are moved or placed under strain. In the piezotronic transistors, the piezoelectric charges control the flow of current through the wires just as gate voltages do in conventional three-terminal transistors.


“Any mechanical motion, such as the movement of arms or the fingers of a robot, could be translated to control signals,” explained Zhong Lin Wang, a Regents’ professor and Hightower Chair in the School of Materials Science and Engineering at the Georgia Institute of Technology. “This could make artificial skin smarter and more like the human skin. It would allow the skin to feel activity on the surface.”
Source: http://www.gatech.edu/