Artificial Synapse For “Brain-on-a-Chip”

When it comes to processing power, the human brain just can’t be beat. Packed within the squishy, football-sized organ are somewhere around 100 billion neurons. At any given moment, a single neuron can relay instructions to thousands of other neurons via synapses — the spaces between neurons, across which neurotransmitters are exchanged. There are more than 100 trillion synapses that mediate neuron signaling in the brain, strengthening some connections while pruning others, in a process that enables the brain to recognize patterns, remember facts, and carry out other learning tasks, at lightning speeds.

Researchers in the emerging field of “neuromorphic computing” have attempted to design computer chips that work like the human brain. Instead of carrying out computations based on binary, on/off signaling, like digital chips do today, the elements of a “brain on a chip” would work in an analog fashion, exchanging a gradient of signals, or “weights,” much like neurons that activate in various ways depending on the type and number of ions that flow across a synapse.

In this way, small neuromorphic chips could, like the brain, efficiently process millions of streams of parallel computations that are currently only possible with large banks of supercomputers. But one significant hang-up on the way to such portable artificial intelligence has been the neural synapse, which has been particularly tricky to reproduce in hardware.

Now engineers at MIT have designed an artificial synapse in such a way that they can precisely control the strength of an electric current flowing across it, similar to the way ions flow between neurons. The team has built a small chip with artificial synapses, made from silicon germanium. In simulations, the researchers found that the chip and its synapses could be used to recognize samples of handwriting, with 95 percent accuracy.
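
The paper itself does not come with code, but the gist of such a simulation is easy to sketch. The Python toy below is a schematic illustration only – analog synaptic weights snapped onto a fixed set of conductance levels, trained on synthetic stand-in "handwriting" – and makes no claim to reproduce the MIT team's device physics or their 95 percent figure; the 64-level quantization and the data generator are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for handwriting: 10 classes of noisy 8x8 "images".
n_classes, n_pixels = 10, 64
prototypes = rng.normal(0.0, 1.0, (n_classes, n_pixels))

def make_samples(n):
    labels = rng.integers(0, n_classes, n)
    images = prototypes[labels] + rng.normal(0.0, 0.5, (n, n_pixels))
    return images, labels

def quantize(w, levels=64):
    """Snap weights onto a fixed grid of analog conductance levels."""
    return np.round(np.clip(w, -1, 1) * (levels / 2)) / (levels / 2)

# One "crossbar": a single analog conductance per pixel-class pair.
weights = np.zeros((n_pixels, n_classes))
train_x, train_y = make_samples(2000)

lr = 0.01
for x, y in zip(train_x, train_y):
    pred = int(np.argmax(x @ weights))   # analog multiply-accumulate
    if pred != y:
        weights[:, y] += lr * x          # strengthen the correct synapses
        weights[:, pred] -= lr * x       # weaken the wrongly firing ones

# "Program" the trained weights onto the device: snap each one to the
# nearest of the 64 allowed conductance levels, then evaluate.
device_weights = quantize(weights)
test_x, test_y = make_samples(500)
accuracy = np.mean(np.argmax(test_x @ device_weights, axis=1) == test_y)
print(f"toy crossbar accuracy: {accuracy:.1%}")
```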

The design, published today in the journal Nature Materials, is a major step toward building portable, low-power neuromorphic chips for use in pattern recognition and other learning tasks.

Source: http://news.mit.edu/

Ultra-Thin Memory Storage For Nanocomputer

Engineers worldwide have been developing alternative ways to provide greater memory storage capacity on even smaller computer chips. Previous research into two-dimensional atomic sheets for memory storage has failed to uncover their potential — until now. A team of electrical engineers at The University of Texas at Austin, in collaboration with Peking University scientists, has developed the thinnest memory storage device with dense memory capacity, paving the way for faster, smaller and smarter computer chips for everything from consumer electronics to big data to brain-inspired computing.

“For a long time, the consensus was that it wasn’t possible to make memory devices from materials that were only one atomic layer thick,” said Deji Akinwande, associate professor in the Cockrell School of Engineering’s Department of Electrical and Computer Engineering. “With our new ‘atomristors,’ we have shown it is indeed possible.”

Made from 2-D nanomaterials, the “atomristors” — a term Akinwande coined — improve upon memristors, an emerging memory storage technology with lower memory scalability. He and his team published their findings in the January issue of Nano Letters.

“Atomristors will allow for the advancement of Moore’s Law at the system level by enabling the 3-D integration of nanoscale memory with nanoscale transistors on the same chip for advanced computing systems,” Akinwande said.

Memory storage and transistors have, to date, always been separate components on a microchip, but atomristors combine both functions in a single, more efficient device. By using metallic atomic sheets (graphene) as electrodes and semiconducting atomic sheets (molybdenum sulfide) as the active layer, the entire memory cell forms a sandwich about 1.5 nanometers thick, which makes it possible to densely pack atomristors layer by layer in a plane. This is a substantial advantage over conventional flash memory, which occupies far more space. In addition, the thinness allows for faster and more efficient electric current flow.
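
A quick back-of-the-envelope calculation shows why that 1.5-nanometre figure matters; the one-micrometre stack height below is an assumption chosen purely for illustration.

```python
# How many 1.5-nm-thick atomristor cells stack into a given height?
cell_thickness_nm = 1.5
stack_height_nm = 1_000        # assume a 1-micrometre-tall 3-D stack
layers = stack_height_nm / cell_thickness_nm
print(f"~{layers:.0f} memory layers per micrometre of stack height")
```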

Given their size, capacity and integration flexibility, atomristors can be packed together to make advanced 3-D chips that are crucial to the successful development of brain-inspired computing. One of the greatest challenges in this burgeoning field of engineering is how to make a memory architecture with 3-D connections akin to those found in the human brain.

“The sheer density of memory storage that can be made possible by layering these synthetic atomic sheets onto each other, coupled with integrated transistor design, means we can potentially make computers that learn and remember the same way our brains do,” Akinwande said.

Source: https://news.utexas.edu

Memristors Retain Data 10 Years Without Power

The Internet of Things (IoT) is coming, that much we know. But it won’t arrive until we have components and chips that can handle the explosion of data it brings. By 2020, there will already be 50 billion industrial internet sensors in place all around us. A single autonomous device – a smart watch, a cleaning robot, or a driverless car – can produce gigabytes of data each day, whereas an Airbus may have over 10,000 sensors in one wing alone.

Two hurdles need to be overcome. First, the transistors in today’s computer chips must be miniaturized to a size of only a few nanometres, but at that scale they no longer work. Second, analysing and storing unprecedented amounts of data will require equally huge amounts of energy. Sayani Majumdar, Academy Fellow at Aalto University (Finland), along with her colleagues, is designing technology to tackle both issues.

Majumdar and her colleagues have designed and fabricated the basic building blocks of future components for what are called “neuromorphic” computers inspired by the human brain. It is a field of research in which the world’s largest ICT companies, as well as the EU, are investing heavily. Still, no one has yet come up with a nano-scale hardware architecture that could be scaled up to industrial manufacture and use.

The probe-station device (the full instrument, left, and a closer view of the device connection, right) which measures the electrical responses of the basic components for computers mimicking the human brain. The tunnel junctions are on a thin film on the substrate plate.

“The technology and design of neuromorphic computing is advancing more rapidly than its rival revolution, quantum computing. There is already wide speculation both in academia and company R&D about ways to inscribe heavy computing capabilities in the hardware of smartphones, tablets and laptops. The key is to achieve the extreme energy-efficiency of a biological brain and mimic the way neural networks process information through electric impulses,” explains Majumdar.

In their recent article in Advanced Functional Materials, Majumdar and her team show how they have fabricated a new breed of “ferroelectric tunnel junctions”, that is, few-nanometre-thick ferroelectric thin films sandwiched between two electrodes. They have abilities beyond existing technologies and bode well for energy-efficient and stable neuromorphic computing.

The junctions operate at low voltages of less than five volts and with a variety of electrode materials – including the silicon used in the chips of most of our electronics. They can also retain data for more than 10 years without power and can be manufactured under normal conditions.

Until now, tunnel junctions have mostly been made of metal oxides, which require temperatures of 700 degrees Celsius and high vacuum to manufacture. These conventional ferroelectric materials also contain lead, which makes them – and all our computers – a serious environmental hazard.

“Our junctions are made out of organic hydrocarbon materials and would reduce the amount of toxic heavy-metal waste in electronics. We can also make thousands of junctions a day at room temperature without them suffering from the water or oxygen in the air,” explains Majumdar.

What makes ferroelectric thin film components great for neuromorphic computers is their ability to switch between not only binary states – 0 and 1 – but a large number of intermediate states as well. This allows them to ‘memorise’ information not unlike the brain: to store it for a long time with minute amounts of energy and to retain the information they have once received – even after being switched off and on again.
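
As a loose illustration of what those intermediate states buy, here is a toy model of such a junction. The eight polarization levels and the one-volt switching threshold are invented for the sketch – the article gives no device parameters – but the behaviour mimicked is the one described above: analog-like states that persist with no power applied.

```python
class FerroelectricJunction:
    """Toy multi-level memory cell; parameters are illustrative only."""

    def __init__(self, levels=8):
        self.levels = levels
        self.state = 0  # index of the current polarization level

    def pulse(self, voltage):
        """A strong positive pulse nudges polarization up one level,
        a strong negative pulse nudges it down; weak pulses (below a
        hypothetical 1 V threshold) leave the stored state untouched."""
        if voltage >= 1.0:
            self.state = min(self.state + 1, self.levels - 1)
        elif voltage <= -1.0:
            self.state = max(self.state - 1, 0)

    def read(self):
        """Non-destructive read: the state persists with zero power."""
        return self.state / (self.levels - 1)  # normalized weight, 0..1

junction = FerroelectricJunction()
for v in [2.0, 2.0, 2.0, -2.0]:   # three "strengthen" pulses, one "weaken"
    junction.pulse(v)
print(junction.read())            # an intermediate state between 0 and 1
```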

“We are no longer talking about transistors, but ‘memristors’. They are ideal for computation similar to that in biological brains. Take, for example, the Mars 2020 rover that will soon chart the composition of another planet. For the rover to work and process data on its own, using only a single solar panel as an energy source, the unsupervised algorithms in it will need an artificial brain in the hardware.

“What we are striving for now is to integrate millions of our tunnel junction memristors into a network on an area of one square centimetre. We can expect to pack so many into such a small space because we have now achieved a record-high difference in current between the on and off states of the junctions, which provides functional stability. The memristors could then perform complex tasks like image and pattern recognition and make decisions autonomously,” says Majumdar.

Source: http://www.aalto.fi/

DNA Origami, The New Revolution To Come For Nanotechnology

For the past few decades, some scientists have known that the shape of things to come in nanotechnology is tied to the molecule of life, DNA. This burgeoning field is called “DNA origami.” The moniker is borrowed from the art of conjuring up birds, flowers and other shapes by imaginatively folding a single sheet of paper. Similarly, DNA origami scientists are dreaming up a variety of shapes — at a scale one thousand times smaller than a human hair — that they hope will one day revolutionize computing, electronics and medicine.

Now, a team of Arizona State University and Harvard scientists has made a major new advance in DNA nanotechnology. Dubbed “single-stranded origami” (ssOrigami), their new strategy uses one long noodle-like strand of DNA, or its chemical cousin RNA, that can self-fold — without even a single knot — into the largest, most complex structures to date. And the strands forming these structures can be made inside living cells or using enzymes in a test tube, giving scientists the potential to plug-and-play with new designs and functions for nanomedicine: picture tiny nanobots playing doctor and delivering drugs within cells at the site of injury.

A DNA origami with an emoji-like smiley face

“I think this is an exciting breakthrough, and a great opportunity for synthetic biology as well,” said Hao Yan, a co-inventor of the technology, director of the ASU Biodesign Institute’s Center for Molecular Design and Biomimetics, and the Milton Glick Professor in the School of Molecular Sciences.

“We are always inspired by nature’s designs to make information-carrying molecules that can self-fold into the nanoscale shapes we want to make,” he said.

As proof of concept, they’ve pushed the envelope to make 18 shapes, including emoji-like smiley faces, hearts and triangles, that significantly expand the design studio space and material scalability for so-called “bottom-up” nanotechnology.

Source: https://asunow.asu.edu/

Artificial Intelligence Chip Analyzes Molecular-level Data In Real Time

Nano Global, an Austin-based molecular data company, today announced that it is developing a chip using intellectual property (IP) from Arm, the world’s leading semiconductor IP company. The technology will help redefine how global health challenges – from superbugs to infectious diseases to cancer – are conquered.

The pioneering system-on-chip (SoC) will yield highly-secure molecular data that can be used in the recognition and analysis of health threats caused by pathogens and other living organisms. Combined with the company’s scientific technology platform, the chip leverages advances in nanotechnology, optics, artificial intelligence (AI), blockchain authentication, and edge computing to access and analyze molecular-level data in real time.

“In partnership with Arm, we’re tackling the vast frontier of molecular data to unlock the unlimited potential of this universe,” said Steve Papermaster, Chairman and CEO of Nano Global. “The data our technology can acquire and process will enable us to create a safer and healthier world.”

“We believe the technology Nano Global is delivering will be an important step forward in the collective pursuit of care that improves lives through the application of technology,” explained Rene Haas, executive vice president and president of IPG, Arm. “By collaborating with Nano Global, Arm is taking an active role in developing and deploying the technologies that will move us one step closer to solving complex health challenges.”

Additionally, Nano Global will be partnering with several leading institutions, including Baylor College of Medicine and National University of Singapore, on broad research initiatives in clinical, laboratory, and population health environments to accelerate data collection, analysis, and product development.

The initial development of the chip is in progress, with first delivery expected by 2020. The company is already adding new partners to its platform.

Sources: https://nanoglobal.com/ and www.prnewswire.com

Optical Computer

Researchers at the University of Sydney (Australia) have dramatically slowed digital information carried as light waves by transferring the data into sound waves in an integrated circuit, or microchip. Transferring information from the optical to the acoustic domain and back again inside a chip is critical for the development of photonic integrated circuits: microchips that use light instead of electrons to manage data.

These chips are being developed for use in telecommunications, optical fibre networks and cloud computing data centers where traditional electronic devices are susceptible to electromagnetic interference, produce too much heat or use too much energy.

“The information in our chip in acoustic form travels at a velocity five orders of magnitude slower than in the optical domain,” said Dr Birgit Stiller, research fellow at the University of Sydney and supervisor of the project.

“It is like the difference between thunder and lightning,” she said.

This delay allows for the data to be briefly stored and managed inside the chip for processing, retrieval and further transmission as light waves. Light is an excellent carrier of information and is useful for taking data over long distances between continents through fibre-optic cables.

But this speed advantage can become a nuisance when information is being processed in computers and telecommunication systems.
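
The velocities below are assumptions for illustration (the article quotes only the five-orders-of-magnitude ratio), but they show how a light-to-sound detour turns a uselessly short optical transit into workable buffering time.

```python
# Compare transit times for a 1 cm path across the chip.
c = 3.0e8                        # speed of light in vacuum, m/s
v_light_chip = c / 2             # assumed optical group velocity on chip
v_sound_chip = 2.5e3             # assumed acoustic velocity, m/s

length_m = 0.01                  # 1 cm path
t_optical = length_m / v_light_chip
t_acoustic = length_m / v_sound_chip
print(f"optical transit:  {t_optical * 1e9:.3f} ns")
print(f"acoustic transit: {t_acoustic * 1e6:.1f} us")
print(f"ratio: ~{t_acoustic / t_optical:,.0f}x more time to work with")
```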

Source: https://sydney.edu.au/

How To Store Data At The Molecular Level

From smartphones to nanocomputers or supercomputers, the growing need for smaller and more energy efficient devices has made higher density data storage one of the most important technological quests. Now scientists at the University of Manchester have proved that storing data with a class of molecules known as single-molecule magnets is more feasible than previously thought. The research, led by Dr David Mills and Dr Nicholas Chilton, from the School of Chemistry, is being published in Nature. It shows that magnetic hysteresis, a memory effect that is a prerequisite of any data storage, is possible in individual molecules at -213 °C. This is tantalisingly close to the temperature of liquid nitrogen (-196 °C).

The result means that data storage with single molecules could become a reality because the data servers could be cooled using relatively cheap liquid nitrogen at -196°C instead of far more expensive liquid helium (-269 °C). The research provides proof-of-concept that such technologies could be achievable in the near future.

The potential for molecular data storage is huge. To put it into a consumer context, molecular technologies could store more than 200 terabits of data per square inch – that’s 25,000 GB of information stored in something approximately the size of a 50p coin, compared to Apple’s latest iPhone 7 with a maximum storage of 256 GB.
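
The arithmetic behind those headline numbers is easy to check. The coin’s face area is an assumption (a UK 50p coin measures roughly 27.3 mm across, or about 6 cm² of face); the 200 terabits per square inch comes from the article.

```python
density_tbit_per_in2 = 200
gb_per_in2 = density_tbit_per_in2 * 1e12 / 8 / 1e9   # bits -> gigabytes
print(f"{gb_per_in2:,.0f} GB per square inch")        # 25,000 GB

coin_area_cm2 = 6.0              # assumed 50p-coin face area
coin_area_in2 = coin_area_cm2 / 6.4516
print(f"~{gb_per_in2 * coin_area_in2:,.0f} GB on a coin-sized area")
```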

Single-molecule magnets display a magnetic memory effect that is a requirement of any data storage and molecules containing lanthanide atoms have exhibited this phenomenon at the highest temperatures to date. Lanthanides are rare earth metals used in all forms of everyday electronic devices such as smartphones, tablets and laptops. The team achieved their results using the lanthanide element dysprosium.

‘This is very exciting as magnetic hysteresis in single molecules implies the ability for binary data storage. Using single molecules for data storage could theoretically give 100 times higher data density than current technologies. Here we are approaching the temperature of liquid nitrogen, which would mean data storage in single molecules becomes much more viable from an economic point of view,’ explains Dr Chilton.

The practical applications of molecular-level data storage could lead to much smaller hard drives that require less energy, meaning data centres across the globe could become a lot more energy efficient.

Source: http://www.manchester.ac.uk/

AR Smart Glasses, Next Frontier Of Facebook

Facebook is hard at work on the technical breakthroughs needed to ship futuristic smart glasses that can let you see virtual objects in the real world. A patent application for a “waveguide display with two-dimensional scanner” was published on Thursday by three members from the advanced research division of Facebook’s virtual-reality subsidiary, Oculus.

The smart glasses being developed by Oculus will use a waveguide display to project light onto the wearer’s eyes instead of a more traditional display. The smart glasses would be able to display images and video, and work with connected speakers or headphones to play audio when worn. The display “may augment views of a physical, real-world environment with computer-generated elements” and “may be included in an eye-wear comprising a frame and a display assembly that presents media to a user’s eyes,” according to the filing.

By using waveguide technology, Facebook is taking a similar approach to Microsoft‘s HoloLens AR headset and the mysterious glasses being developed by the Google-backed startup Magic Leap.

One of the authors of the patent is, in fact, lead Oculus optical scientist Pasi Saarikko, who joined Facebook in 2015 after leading the optical design of the HoloLens at Microsoft.

While work is clearly being done on the underlying technology for Facebook‘s smart glasses now, don’t expect to see the device anytime soon. Michael Abrash, the chief scientist of Oculus, recently said that AR glasses won’t start replacing smartphones until 2022 at the earliest.

Facebook CEO Mark Zuckerberg has called virtual and augmented reality the next major computing platform capable of replacing smartphones and traditional PCs. Facebook purchased Oculus for $2 billion in 2014 and plans to spend billions more on developing the technology.

Sources: http://pdfaiw.uspto.gov/ and http://www.businessinsider.com

No More Batteries For Cellphones

University of Washington (UW) researchers have invented a cellphone that requires no batteries — a major leap forward in moving beyond chargers, cords and dying phones. Instead, the phone harvests the few microwatts of power it requires from either ambient radio signals or light.

The team also made Skype calls using its battery-free phone, demonstrating that the prototype made of commercial, off-the-shelf components can receive and transmit speech and communicate with a base station.

“We’ve built what we believe is the first functioning cellphone that consumes almost zero power,” said co-author Shyam Gollakota, an associate professor in the Paul G. Allen School of Computer Science & Engineering at the UW. “To achieve the really, really low power consumption that you need to run a phone by harvesting energy from the environment, we had to fundamentally rethink how these devices are designed.”

The team of UW computer scientists and electrical engineers eliminated a power-hungry step in most modern cellular transmissions – converting analog signals that convey sound into digital data that a phone can understand. This process consumes so much energy that it’s been impossible to design a phone that can rely on ambient power sources. Instead, the battery-free cellphone takes advantage of tiny vibrations in a phone’s microphone or speaker that occur when a person is talking into a phone or listening to a call.

An antenna connected to those components converts that motion into changes in a standard analog radio signal emitted by a cellular base station. This process essentially encodes speech patterns in reflected radio signals in a way that uses almost no power. To transmit speech, the phone uses vibrations from the device’s microphone to encode speech patterns in the reflected signals. To receive speech, it converts encoded radio signals into sound vibrations that are picked up by the phone’s speaker. In the prototype device, the user presses a button to switch between these two “transmitting” and “listening” modes.
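
The UW paper describes a real radio system; the sketch below is only a conceptual cartoon of analog backscatter, with all frequencies scaled far below real cellular bands so the simulation stays simple. It shows the core trick: the phone never generates a carrier of its own, it merely varies how strongly it reflects one, and the base station reads the voice back out of the reflection’s envelope.

```python
import numpy as np

fs = 1_000_000                                 # simulation sample rate, Hz
t = np.arange(0, 0.01, 1 / fs)                 # 10 ms of signal

voice = np.sin(2 * np.pi * 440 * t)            # stand-in for speech vibration
carrier = np.sin(2 * np.pi * 100_000 * t)      # base station's unmodulated tone

reflectance = 0.5 + 0.4 * voice                # antenna reflectivity, 0.1..0.9
backscattered = reflectance * carrier          # signal seen at the base station

# Envelope detection at the base station: rectify, then low-pass filter
# with a short moving average (~0.1 ms, i.e. ten carrier cycles).
window = fs // 10_000
envelope = np.convolve(np.abs(backscattered), np.ones(window) / window, "same")

corr = np.corrcoef(envelope, reflectance)[0, 1]
print(f"recovered envelope vs. voice signal, correlation: {corr:.3f}")
```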

The new technology is detailed in a paper published July 1 in the Proceedings of the Association for Computing Machinery on Interactive, Mobile, Wearable and Ubiquitous Technologies.

Sources: http://www.washington.edu/ and http://www.reuters.com/

How To Generate Any Cell Within The Patient’s Own Body

Researchers at The Ohio State University Wexner Medical Center and Ohio State’s College of Engineering have developed a new technology, Tissue Nanotransfection (TNT), that can generate any cell type of interest for treatment within the patient’s own body. This technology may be used to repair injured tissue or restore function of aging tissue, including organs, blood vessels and nerve cells.

“By using our novel nanochip technology (nanocomputer), injured or compromised organs can be replaced. We have shown that skin is a fertile land where we can grow the elements of any organ that is declining,” said Dr. Chandan Sen, director of Ohio State’s Center for Regenerative Medicine & Cell Based Therapies, who co-led the study with L. James Lee, professor of chemical and biomolecular engineering with Ohio State’s College of Engineering in collaboration with Ohio State’s Nanoscale Science and Engineering Center.

Researchers studied mice and pigs in these experiments. In the study, researchers were able to reprogram skin cells to become vascular cells in badly injured legs that lacked blood flow. Within one week, active blood vessels appeared in the injured leg, and by the second week, the leg was saved. In lab tests, this technology was also shown to reprogram skin cells in the live body into nerve cells that were injected into brain-injured mice to help them recover from stroke.

“This is difficult to imagine, but it is achievable, successfully working about 98 percent of the time. With this technology, we can convert skin cells into elements of any organ with just one touch. This process only takes less than a second and is non-invasive, and then you’re off. The chip does not stay with you, and the reprogramming of the cell starts. Our technology keeps the cells in the body under immune surveillance, so immune suppression is not necessary,” said Sen, who also is executive director of Ohio State’s Comprehensive Wound Center.

Results of the regenerative medicine study have been published in the journal Nature Nanotechnology.

Source: https://news.osu.edu/

Building Brain-Inspired AI Supercomputing System

IBM (NYSE: IBM) and the U.S. Air Force Research Laboratory (AFRL) today announced they are collaborating on a first-of-a-kind brain-inspired supercomputing system powered by a 64-chip array of the IBM TrueNorth Neurosynaptic System. The scalable platform IBM is building for AFRL will feature an end-to-end software ecosystem designed to enable deep neural-network learning and information discovery. The system’s advanced pattern recognition and sensory processing power will be the equivalent of 64 million neurons and 16 billion synapses, while the processor component will consume the energy equivalent of a dim light bulb – a mere 10 watts.

IBM researchers believe the brain-inspired, neural-network design of TrueNorth will be far more efficient for pattern recognition and integrated sensory processing than systems powered by conventional chips. AFRL is investigating applications of the system in embedded, mobile, autonomous settings where, today, size, weight and power (SWaP) are key limiting factors. The IBM TrueNorth Neurosynaptic System can efficiently convert data (such as images, video, audio and text) from multiple, distributed sensors into symbols in real time. AFRL will combine this “right-brain” perception capability of the system with the “left-brain” symbol-processing capabilities of conventional computer systems. The large scale of the system will enable both “data parallelism,” where multiple data sources can be run in parallel against the same neural network, and “model parallelism,” where independent neural networks form an ensemble that can be run in parallel on the same data.
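
In plain software terms, the two modes differ only in what is held fixed: the network or the data. The sketch below is a generic illustration with placeholder names, not IBM’s actual TrueNorth software ecosystem.

```python
from concurrent.futures import ThreadPoolExecutor

def network(name, data):
    """Stand-in for one neural network scoring one input."""
    return f"{name} processed {data}"

sensor_feeds = ["camera-1", "camera-2", "audio-1", "lidar-1"]
ensemble = ["net-A", "net-B", "net-C"]

with ThreadPoolExecutor() as pool:
    # Data parallelism: the SAME network runs on many inputs at once.
    data_parallel = list(pool.map(lambda d: network("net-A", d), sensor_feeds))
    # Model parallelism: DIFFERENT networks run on the same input at once.
    model_parallel = list(pool.map(lambda n: network(n, "camera-1"), ensemble))

print(*data_parallel, sep="\n")
print(*model_parallel, sep="\n")
```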

“AFRL was the earliest adopter of TrueNorth for converting data into decisions,” said Daniel S. Goddard, director, information directorate, U.S. Air Force Research Lab. “The new neurosynaptic system will be used to enable new computing capabilities important to AFRL’s mission to explore, prototype and demonstrate high-impact, game-changing technologies that enable the Air Force and the nation to maintain its superior technical advantage.”

“The evolution of the IBM TrueNorth Neurosynaptic System is a solid proof point in our quest to lead the industry in AI hardware innovation,” said Dharmendra S. Modha, IBM Fellow, chief scientist, brain-inspired computing, IBM Research – Almaden. “Over the last six years, IBM has expanded the number of neurons per system from 256 to more than 64 million – an 800 percent annual increase over six years.”

Source: https://www-03.ibm.com/

All-Carbon Spin Transistor Is Quicker And Smaller

A researcher with the Erik Jonsson School of Engineering and Computer Science at UT Dallas has designed a novel computing system made solely from carbon that might one day replace the silicon transistors that power today’s electronic devices.

“The concept brings together an assortment of existing nanoscale technologies and combines them in a new way,” said Dr. Joseph S. Friedman, assistant professor of electrical and computer engineering at UT Dallas who conducted much of the research while he was a doctoral student at Northwestern University.

The resulting all-carbon spin logic proposal, published by lead author Friedman and several collaborators in the June 5 edition of the online journal Nature Communications, is a computing system that Friedman believes could be made smaller than silicon transistors, with increased performance.

Today’s electronic devices are powered by transistors, which are tiny silicon structures that rely on negatively charged electrons moving through the silicon, forming an electric current. Transistors behave like switches, turning current on and off.

In addition to carrying a charge, electrons have another property called spin, which relates to their magnetic properties. In recent years, engineers have been investigating ways to exploit the spin characteristics of electrons to create a new class of transistors and devices called “spintronics.”

Friedman’s all-carbon, spintronic switch functions as a logic gate that relies on a basic tenet of electromagnetics: As an electric current moves through a wire, it creates a magnetic field that wraps around the wire. In addition, a magnetic field near a two-dimensional ribbon of carbon — called a graphene nanoribbon — affects the current flowing through the ribbon. In traditional, silicon-based computers, transistors cannot exploit this phenomenon. Instead, they are connected to one another by wires. The output from one transistor is connected by a wire to the input for the next transistor, and so on in a cascading fashion.
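
As a very loose software analogy (my abstraction, not Friedman’s published circuit), each nanoribbon can be modelled as a gate whose control input is the magnetic field of a neighbouring current rather than a wired voltage, so stages cascade without conventional interconnect.

```python
def nanoribbon(neighbour_current_on: bool) -> bool:
    """Toy inverting gate: the ribbon conducts unless the magnetic field
    from a neighbouring 'on' current switches its channel off."""
    return not neighbour_current_on

# Cascade three ribbons: the output current of each stage is itself the
# field-generating control for the next -- no wires between logic stages.
signal = True                     # input current flowing
for stage in range(1, 4):
    signal = nanoribbon(signal)
    print(f"stage {stage}: current {'on' if signal else 'off'}")
# An odd number of inverting stages computes NOT(input).
```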

Source: http://www.utdallas.edu/