Tag Archives: MIT

Sensor-packed Glove Coupled With AI

Wearing a sensor-packed glove while handling a variety of objects, MIT researchers have compiled a massive dataset that enables an AI system to recognize objects through touch alone. The information could be leveraged to help robots identify and manipulate objects, and may aid in prosthetics design.

The researchers developed a low-cost knitted glove, called “scalable tactile glove” (STAG), equipped with about 550 tiny sensors across nearly the entire hand. Each sensor captures pressure signals as humans interact with objects in various ways. A neural network processes the signals to “learn” a dataset of pressure-signal patterns related to specific objects. Then, the system uses that dataset to classify the objects and predict their weights by feel alone, with no visual input needed.

In a paper published today in Nature, the researchers describe a dataset they compiled using STAG for 26 common objects — including a soda can, scissors, tennis ball, spoon, pen, and mug. Using the dataset, the system predicted the objects’ identities with up to 76 percent accuracy. The system can also predict the correct weights of most objects within about 60 grams.
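The classification idea behind STAG can be sketched in a few lines. This is an illustrative toy, not the paper's method: the roughly 550-sensor glove is reduced here to a 4-sensor "hand," and the reference pressure signatures and object names are invented; a nearest-signature lookup stands in for the neural network.

```python
# Toy sketch: identify an object from a tactile pressure frame by finding
# the nearest stored pressure signature. All data here is illustrative.
import math

# Hypothetical averaged pressure signatures, one per object (4 "sensors").
REFERENCE = {
    "mug":  [0.9, 0.7, 0.1, 0.2],
    "pen":  [0.1, 0.2, 0.8, 0.1],
    "ball": [0.5, 0.5, 0.5, 0.5],
}

def classify(frame):
    """Return the object whose signature is nearest in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(REFERENCE, key=lambda name: dist(REFERENCE[name], frame))

print(classify([0.85, 0.75, 0.15, 0.25]))  # a mug-like grasp -> "mug"
```

The real system learns its pressure-signal patterns from many interaction frames rather than comparing against a single stored template per object.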

Similar sensor-based gloves used today run thousands of dollars and often contain only around 50 sensors that capture less information. Even though STAG produces very high-resolution data, it’s made from commercially available materials totaling around $10.
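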

The tactile sensing system could be used in combination with traditional computer vision and image-based datasets to give robots a more human-like understanding of interacting with objects.

“Humans can identify and handle objects well because we have tactile feedback. As we touch objects, we feel around and realize what they are. Robots don’t have that rich feedback,” says Subramanian Sundaram PhD ’18, a former graduate student in the Computer Science and Artificial Intelligence Laboratory (CSAIL). “We’ve always wanted robots to do what humans can do, like doing the dishes or other chores. If you want robots to do these things, they must be able to manipulate objects really well.”

The researchers also used the dataset to measure cooperation between regions of the hand during object interactions. For example, when someone uses the middle joint of their index finger, they rarely use their thumb, but using the tips of the index and middle fingers always coincides with thumb use. “We quantifiably show, for the first time, that, if I’m using one part of my hand, how likely I am to use another part of my hand,” Sundaram says.

Source: http://news.mit.edu/

Robots Sort Recycling, Detect If An Object Is Paper, Metal Or Plastic

Every year trash companies sift through an estimated 68 million tons of recycling, which is the weight equivalent of more than 30 million cars. A key step in the process happens on fast-moving conveyor belts, where workers have to sort items into categories like paper, plastic and glass. Such jobs are dull, dirty, and often unsafe, especially in facilities where workers also have to remove normal trash from the mix. With that in mind, a team led by researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has developed a robotic system that can detect if an object is paper, metal, or plastic.

The team’s “RoCycle” system includes a soft Teflon hand that uses tactile sensors on its fingertips to detect an object’s size and stiffness. Compatible with any robotic arm, RoCycle was found to be 85 percent accurate at detecting materials when stationary, and 63 percent accurate on a simulated conveyor belt. (Its most common error was identifying paper-covered metal tins as paper, which the team says could be fixed by adding more sensors along the contact surface.)
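The two cues the article names, size and stiffness, lend themselves to a simple decision rule. The sketch below is purely illustrative: the thresholds and the crush test are invented, and the real RoCycle fuses richer tactile signals than this.

```python
# Illustrative rule-based material guesser from two tactile cues.
# Thresholds are invented for illustration, not taken from RoCycle.

def guess_material(stiffness, crushes_when_squeezed):
    """stiffness: 0..1 from fingertip sensors; crushes_when_squeezed: bool."""
    if crushes_when_squeezed and stiffness < 0.3:
        return "paper"        # soft and crushable
    if stiffness > 0.7:
        return "metal"        # rigid under pressure
    return "plastic"          # in-between: deforms but springs back

print(guess_material(0.1, True))   # -> paper
print(guess_material(0.9, False))  # -> metal
```

A rule like this also makes the reported failure mode easy to see: a paper-covered metal tin presents a soft, crushable surface at the fingertips, so it reads as paper.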


“Our robot’s sensorized skin provides haptic feedback that allows it to differentiate between a wide range of objects, from the rigid to the squishy,” says MIT Professor Daniela Rus, senior author on a related paper that will be presented in April at the IEEE International Conference on Soft Robotics (RoboSoft) in Seoul, South Korea. “Computer vision alone will not be able to solve the problem of giving machines human-like perception, so being able to use tactile input is of vital importance.”

Developed in collaboration with Yale University, RoCycle directly demonstrates the limits of sight-based sorting: it can reliably distinguish between two visually similar Starbucks cups, one made of paper and one made of plastic, that would give vision systems trouble.

Source: http://news.mit.edu/

Artificial Intelligence Revolutionizes Farming

Researchers at MIT have used AI to improve the flavor of basil. It’s part of a trend that is seeing artificial intelligence revolutionize farming.
What makes basil so good? In some cases, it’s AI. Machine learning has been used to create basil plants that are extra-delicious. While we sadly cannot report firsthand on the herb’s taste, the effort reflects a broader trend of using data science and machine learning to improve agriculture.

The researchers behind the AI-optimized basil used machine learning to determine the growing conditions that would maximize the concentration of the volatile compounds responsible for basil’s flavor. The basil was grown in hydroponic units within modified shipping containers in Middleton, Massachusetts. Temperature, light, humidity, and other environmental factors inside the containers could be controlled automatically. The researchers tested the taste of the plants by looking for certain compounds using gas chromatography and mass spectrometry. And they fed the resulting data into machine-learning algorithms developed at MIT and a company called Cognizant.

The research showed, counterintuitively, that exposing plants to light 24 hours a day generated the best taste. The research group plans to study how the technology might improve the disease-fighting capabilities of plants as well as how different flora may respond to the effects of climate change.
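The optimization loop described above can be caricatured in a few lines. Everything in this sketch is invented for illustration: the stand-in "flavor model" simply scores candidate photoperiods, whereas the real work fed gas-chromatography and mass-spectrometry measurements into machine-learning models from MIT and Cognizant.

```python
# Toy sketch of condition optimization: score candidate photoperiods with a
# hypothetical surrogate flavor model, then pick the best one.

def predicted_flavor(light_hours):
    """Invented surrogate: flavor-compound score rises with photoperiod."""
    return light_hours / 24.0  # monotone toy model

candidates = range(12, 25, 2)            # test 12 h .. 24 h of light
best = max(candidates, key=predicted_flavor)
print(best)  # -> 24, echoing the counterintuitive 24-hour-light result
```

In the real study the surrogate is learned from measured compound concentrations, and the search covers temperature, humidity, and other container-controlled variables, not photoperiod alone.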

“We’re really interested in building networked tools that can take a plant’s experience, its phenotype, the set of stresses it encounters, and its genetics, and digitize that to allow us to understand the plant-environment interaction,” said Caleb Harper, head of the MIT Media Lab’s OpenAg group, in a press release. His lab worked with colleagues from the University of Texas at Austin on the paper.

The idea of using machine learning to optimize plant yield and properties is rapidly taking off in agriculture. Last year, Wageningen University in the Netherlands organized an “Autonomous Greenhouse” contest, in which different teams competed to develop algorithms that increased the yield of cucumber plants while minimizing the resources required. They worked with greenhouses where a variety of factors are controlled by computer systems.

The study appeared in the journal PLOS ONE.

Source: https://www.technologyreview.com/

How To Shrink Objects To The Nanoscale

MIT researchers have invented a way to fabricate nanoscale 3-D objects of nearly any shape. They can also pattern the objects with a variety of useful materials, including metals, quantum dots, and DNA.

MIT engineers have devised a way to create 3-D nanoscale objects by patterning a larger structure with a laser and then shrinking it. This image shows a complex structure prior to shrinking.

“It’s a way of putting nearly any kind of material into a 3-D pattern with nanoscale precision,” says Edward Boyden, the Y. Eva Tan Professor in Neurotechnology and an associate professor of biological engineering and of brain and cognitive sciences at MIT. Using the new technique, the researchers can create any shape and structure they want by patterning a polymer scaffold with a laser. After attaching other useful materials to the scaffold, they shrink it, generating structures one thousandth the volume of the original.
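The "one thousandth the volume" figure implies a tenfold shrink in each linear dimension, since volume scales with the cube of linear size:

```python
# Volume scales as the cube of linear size, so a 1/1000 volume ratio
# corresponds to a 10x shrink along each axis.
volume_ratio = 1 / 1000
linear_ratio = volume_ratio ** (1 / 3)
print(round(1 / linear_ratio))  # -> 10: each dimension shrinks tenfold
```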

These tiny structures could have applications in many fields, from optics to medicine to robotics, the researchers say. The technique uses equipment that many biology and materials science labs already have, making it widely accessible for researchers who want to try it. Boyden, who is also a member of MIT’s Media Lab, McGovern Institute for Brain Research, and Koch Institute for Integrative Cancer Research, is one of the senior authors of the paper, which appears in the Dec. 13 issue of Science. The other senior author is Adam Marblestone, a Media Lab research affiliate, and the paper’s lead authors are graduate students Daniel Oran and Samuel Rodriques.

As they did for expansion microscopy, the researchers used a very absorbent material made of polyacrylate, commonly found in diapers, as the scaffold for their nanofabrication process. The scaffold is bathed in a solution that contains molecules of fluorescein, which attach to the scaffold when they are activated by laser light.

Using two-photon microscopy, which allows for precise targeting of points deep within a structure, the researchers attach fluorescein molecules to specific locations within the gel. The fluorescein molecules act as anchors that can bind to other types of molecules that the researchers add.

“You attach the anchors where you want with light, and later you can attach whatever you want to the anchors,” Boyden says. “It could be a quantum dot, it could be a piece of DNA, it could be a gold nanoparticle.” “It’s a bit like film photography — a latent image is formed by exposing a sensitive material in a gel to light. Then, you can develop that latent image into a real image by attaching another material, silver, afterwards. In this way implosion fabrication can create all sorts of structures, including gradients, unconnected structures, and multimaterial patterns,” Oran explains.

Source: http://news.mit.edu/

How To Heal Arthritis

Osteoarthritis, a disease that causes severe joint pain, affects more than 20 million people in the United States. Some drug treatments can help alleviate the pain, but there are no treatments that can reverse or slow the cartilage breakdown associated with the disease.

In an advance that could improve the treatment options available for osteoarthritis, MIT engineers have designed a new material that can administer drugs directly to the cartilage. The material can penetrate deep into the cartilage, delivering drugs that could potentially heal damaged tissue.

Six days after treatment with IGF-1 carried by dendrimer nanoparticles (blue), the particles have penetrated through the cartilage of the knee joint.

“This is a way to get directly to the cells that are experiencing the damage, and introduce different kinds of therapeutics that might change their behavior,” says Paula Hammond, head of MIT’s Department of Chemical Engineering, a member of MIT’s Koch Institute for Integrative Cancer Research, and the senior author of the study. In studies on rats, the researchers showed that delivering an experimental drug called insulin-like growth factor 1 (IGF-1) with the new material prevented cartilage breakdown much more effectively than injecting the drug into the joint on its own.

Brett Geiger, an MIT graduate student, is the lead author of the paper, which appears in Science Translational Medicine.

Source: http://news.mit.edu/

Telepathy For Real Within 8 Years

Imagine if telepathy were real. If, for example, you could transmit your thoughts to a computer or to another person just by thinking them. In just eight years it will be, says Openwater founder Mary Lou Jepsen, thanks to technology her company is working on.

Jepsen is a former engineering executive at Facebook, Oculus, Google[x] (now called X) and Intel. She’s also been a professor at MIT and is an inventor on over 100 patents. And that’s the abbreviated version of her resume. Jepsen left Facebook to found Openwater in 2016. The San Francisco-based start-up is currently building technology to make medical imaging less expensive.

“I figured out how to put basically the functionality of an M.R.I. machine — a multimillion-dollar M.R.I. machine — into a wearable in the form of a ski hat,” Jepsen said, though she does not yet have a completed prototype.

Current M.R.I. technology can already see your thoughts: “If I threw [you] into an M.R.I. machine right now … I can tell you what words you’re about to say, what images are in your head. I can tell you what music you’re thinking of,” says Jepsen. “That’s today, and I’m talking about just shrinking that down.”

One day Jepsen’s tech hat could “literally be a thinking cap,” she says. Jepsen explains that the goal is for the technology to both read and output your own thoughts, as well as read the thoughts of others. In classic Google vocabulary, “the really big moonshot idea here is communication with thought — with telepathy,” adds Jepsen.

Want to Sound Like Barack Obama?

For your hair, there are wigs and hairstylists; for your skin, there are permanent and removable tattoos; for your eyes, there are contact lenses that disguise the shape of your pupils. In short, there’s a plethora of tools people can use if they want to give themselves a makeover—except for one of their signature features: their voice.

Sure, a Darth Vader voice-changing mask would do something about it, but people who want to sound like a celebrity or a person of the opposite sex need look no further than Boston-based startup Modulate.


Founded in August 2017 by two MIT grads, this self-funded startup is using machine learning to change your voice as you speak. The target could be a celebrity’s voice (like Barack Obama’s), the voice of a game character, or even a totally custom voice. With potential applications in the gaming and movie industries, Modulate has launched with a free online demo that allows users to play with the service.

The cool thing about Modulate is that the software doesn’t simply disguise your voice; it does something far more radical: it maps your speech onto somebody else’s vocal cords, changing the very identity of the voice while keeping cadence and word choice intact. As a result, you sound like you, but with someone else’s voice.

Source: https://www.americaninno.com/

Lulu And Nana, First Gene-Edited Babies

A Chinese researcher who claims to have created the first gene-edited babies, He Jiankui of the Southern University of Science and Technology (SUST) in Shenzhen, is now facing an investigation into whether the experiment broke Chinese laws or regulations. The children reportedly had their genomes modified to make them resistant to HIV.

He, who led that effort, later released a video statement in which he said that healthy twin girls, Lulu and Nana, had been born “a few weeks ago.”

He said the girls had been conceived using in vitro fertilization (IVF), but that his team had added “a little protein and some information” to the fertilized eggs. That was a reference to the ingredients of CRISPR, the gene-editing technology he apparently employed to delete a gene called CCR5.

The claim set off a wave of criticism in China and abroad from experts who said the experiment created unacceptable risks for a questionable medical purpose. Feng Zhang, one of the inventors of CRISPR, called for a moratorium on its use in editing embryos for IVF procedures.

Documents connected to the trial named the study’s sponsors as He along with Jinzhou Qin and said it was approved by the ethics committee of HarMoniCare Shenzhen Women and Children’s Hospital.

Source: https://www.technologyreview.com/

Plane Propelled Via Ionic Wind

MIT engineers fly first-ever plane with no moving parts. The silent, lightweight aircraft doesn’t depend on fossil fuels or batteries. Since the first airplane took flight over 100 years ago, virtually every aircraft in the sky has flown with the help of moving parts such as propellers, turbine blades, and fans, which are powered by the combustion of fossil fuels or by battery packs that produce a persistent, whining buzz.
Now MIT engineers have built and flown the first-ever plane with no moving parts. Instead of propellers or turbines, the light aircraft is powered by an “ionic wind” — a silent but mighty flow of ions produced aboard the plane that generates enough thrust for sustained, steady flight. Unlike turbine-powered planes, the aircraft does not depend on fossil fuels to fly. And unlike propeller-driven drones, the new design is completely silent.
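A common first-order model for electroaerodynamic thrust is T = I·d/μ: ion current times electrode gap, divided by the ion mobility in air. The sketch below uses that relation with invented operating-point numbers; they are illustrative guesses, not the published aircraft parameters.

```python
# Hedged back-of-envelope estimate of ionic-wind thrust, T = I * d / mu.
# Operating-point values are illustrative, not from the MIT aircraft.

I = 0.5e-3   # ion current, A (illustrative)
d = 0.6      # emitter-to-collector gap, m (illustrative)
mu = 2e-4    # positive-ion mobility in air, m^2/(V*s)

thrust = I * d / mu
print(f"{thrust:.2f} N")  # thrust at this toy operating point
```

The relation makes the design trade-off visible: thrust grows with ion current and electrode spacing, which is why the aircraft strings its electrodes along nearly the full wingspan.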

A new MIT plane is propelled via ionic wind. Batteries in the fuselage (tan compartment in front of plane) supply voltage to electrodes (blue/white horizontal lines) strung along the length of the plane, generating a wind of ions that propels the plane forward.

“This is the first-ever sustained flight of a plane with no moving parts in the propulsion system,” says Steven Barrett, associate professor of aeronautics and astronautics at MIT. “This has potentially opened new and unexplored possibilities for aircraft which are quieter, mechanically simpler, and do not emit combustion emissions.”
He expects that in the near-term, such ion wind propulsion systems could be used to fly less noisy drones. Further out, he envisions ion propulsion paired with more conventional combustion systems to create more fuel-efficient, hybrid passenger planes and other large aircraft.

Source: http://news.mit.edu/

MIT Artificial Intelligence System Detects 85 Percent Of Cyber Attacks

While the number of cyber attacks continues to increase, it is becoming ever more difficult to detect and mitigate them before they have serious consequences. A group of researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) is working on an ambitious project: a technology able to detect cyber attacks early. In collaboration with peers from the startup PatternEx, the experts have designed an artificial intelligence system that can detect 85 percent of attacks by processing data from more than 3.6 billion lines of log files each day.

The researchers have developed a system that combines an artificial intelligence engine with human input, which they call Analyst Intuition (AI); hence the name AI2. The AI2 system first performs an automatic scan of the content with machine-learning techniques and then reports the results to human analysts, who must pick out the events linked to genuine cyber attacks. According to the MIT experts, the approach implemented by AI2 is three times better than modern automated cyber attack detection systems.

“The team showed that AI2 can detect 85 percent of attacks, which is roughly three times better than previous benchmarks, while also reducing the number of false positives by a factor of five. The system was tested on 3.6 billion pieces of data known as ‘log lines,’ which were generated by millions of users over a period of three months,” states a description of AI2 published by MIT.

The more analyses the system carries out, the more accurate its subsequent estimates become, thanks to this feedback mechanism.
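The machine-plus-analyst loop can be sketched minimally. This is an invented toy, not AI2's algorithm: a score threshold stands in for the unsupervised detector, a score check stands in for the analyst's labels, and the feedback step merely nudges the threshold.

```python
# Toy human-in-the-loop detection cycle: machine flags outliers, an
# "analyst" labels them, and the labels refine the detector. All data
# and the 0.05 step size are invented for illustration.

def detect(events, threshold):
    """Flag events whose anomaly score exceeds the current threshold."""
    return [e for e in events if e["score"] > threshold]

def refine(threshold, labeled):
    """Feedback step: raise the threshold for each false positive seen."""
    false_pos = sum(1 for _, is_attack in labeled if not is_attack)
    return threshold + 0.05 * false_pos

events = [{"id": 1, "score": 0.9},
          {"id": 2, "score": 0.4},
          {"id": 3, "score": 0.7}]

threshold = 0.5
flagged = detect(events, threshold)                 # machine pass: ids 1, 3
labels = [(e, e["score"] > 0.8) for e in flagged]   # stand-in analyst review
threshold = refine(threshold, labels)               # one false positive seen

print(len(flagged), round(threshold, 2))  # -> 2 0.55
```

Each pass through the loop trades a little analyst time for a tighter detector, which is the mechanism behind the reported fivefold drop in false positives.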

“You can think about the system as a virtual analyst,” says CSAIL research scientist Kalyan Veeramachaneni, who developed AI2 with Ignacio Arnaldo, a chief data scientist at PatternEx and a former CSAIL postdoc. “It continuously generates new models that it can refine in as little as a few hours, meaning it can improve its detection rates significantly and rapidly.”

Source: http://ai2.appinventor.mit.edu/
AND
https://securityaffairs.co/