Posts belonging to Category nanocomputer

How To Trap DNA molecules With Your Smartphone

Researchers from the University of Minnesota College of Science and Engineering have found yet another remarkable use for the wonder material graphene: tiny electronic “tweezers” that can grab biomolecules floating in water with incredible efficiency. This capability could lead to a revolutionary handheld disease diagnostic system that could run on a smartphone. Graphene, a material made of a single layer of carbon atoms, was discovered more than a decade ago and has enthralled researchers with its range of amazing properties, which have found uses in many new applications from microelectronics to solar cells. The graphene tweezers developed at the University of Minnesota trap particles far more effectively than earlier techniques because graphene is a single atom thick, less than one billionth of a meter.

The physical principle of tweezing or trapping nanometer-scale objects, known as dielectrophoresis, has been known for a long time and is typically practiced by using a pair of metal electrodes. From the viewpoint of grabbing molecules, however, metal electrodes are very blunt. They simply lack the “sharpness” to pick up and control nanometer-scale objects.
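The dielectrophoretic force on a small sphere has a standard textbook form: it scales with the particle volume and, crucially, with the gradient of the squared electric field, which is why electrode "sharpness" matters so much. A minimal sketch in Python (all numbers are illustrative assumptions, not values from the article):

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def dep_force(radius_m, grad_e2, eps_medium_rel, re_cm):
    """Time-averaged dielectrophoretic (DEP) force on a spherical particle.

    F = 2 * pi * eps_m * r^3 * Re[CM] * grad(|E|^2), in newtons.
    """
    return 2 * math.pi * EPS0 * eps_medium_rel * radius_m**3 * re_cm * grad_e2

# Illustrative numbers (assumptions, not from the article): a 10 nm particle
# in water (relative permittivity ~80), Clausius-Mossotti factor 1. A sharper
# electrode concentrates the field, so grad(|E|^2) grows and the trapping
# force grows proportionally with it.
f_blunt = dep_force(10e-9, 1e15, 80.0, 1.0)
f_sharp = dep_force(10e-9, 1e18, 80.0, 1.0)  # 1000x steeper field gradient
print(f_blunt, f_sharp)
```

The cubic dependence on radius is what makes nanometer-scale objects so hard to trap with blunt electrodes, and the gradient term is where atomically sharp graphene edges pay off.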

“Graphene is the thinnest material ever discovered, and it is this property that allows us to make these tweezers so efficient. No other material can come close,” said research team leader Sang-Hyun Oh, a professor at the University of Minnesota. “To build efficient electronic tweezers to grab biomolecules, we basically need to create miniaturized lightning rods and concentrate a huge amount of electrical flux on the sharp tip. The edges of graphene are the sharpest lightning rods.”

The team also showed that the graphene tweezers could be used for a wide range of physical and biological applications by trapping semiconductor nanocrystals, nanodiamond particles, and even DNA molecules. Normally this type of trapping would require high voltages, restricting it to a laboratory environment, but graphene tweezers can trap small DNA molecules at around 1 volt, meaning the technique could work on portable devices such as mobile phones.

The research study has been published in Nature Communications.


Nanotechnology Boosts Cybersecurity Against Hackers

The next generation of electronic hardware security may be at hand, as researchers at the New York University Tandon School of Engineering (NYU Tandon) introduce a new class of unclonable security primitives made of a low-cost nanomaterial with the highest possible level of structural randomness. Randomness is highly desirable for constructing the security primitives that encrypt and thereby secure computer hardware and data physically, rather than by programming.

In a paper published in the journal ACS Nano, Assistant Professor of Electrical and Computer Engineering Davood Shahrjerdi and his team at NYU Tandon offer the first proof of complete spatial randomness in atomically thin molybdenum disulfide (MoS2). The researchers grew the nanomaterial in layers, each roughly one million times thinner than a human hair. By varying the thickness of each layer, Shahrjerdi explained, they tuned the size and type of energy band structure, which in turn affects the properties of the material.

(a) At monolayer thickness, this material has the optical properties of a semiconductor that emits light. At multilayer, the properties change and the material doesn’t emit light. (b) Varying the thickness of each layer results in a thin film speckled with randomly occurring regions that alternately emit or block light. (c) Upon exposure to light, this pattern can be translated into a one-of-a-kind authentication key that could secure hardware components at minimal cost.

“This property is unique to this material,” underscores Shahrjerdi. By tuning the material growth process, the resulting thin film is speckled with randomly occurring regions that alternately emit or do not emit light. When exposed to light, this pattern translates into a one-of-a-kind authentication key that could secure hardware components at minimal cost.
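As a rough illustration of how such a random emission pattern could serve as an authentication key, one could read the emit/no-emit map as a bit string and condense it with a cryptographic hash. This is a hypothetical sketch, not the authors' actual protocol:

```python
import hashlib

def key_from_emission_map(emission_bits):
    """Condense a measured emit/no-emit pattern into a fixed-length key.

    emission_bits: list of 0/1 readings, one per probed region of the film.
    """
    return hashlib.sha256(bytes(emission_bits)).hexdigest()

# Hypothetical readout of a 4x4 grid of film regions (1 = region emits light).
pattern = [1, 0, 0, 1,
           0, 1, 1, 0,
           1, 1, 0, 0,
           0, 0, 1, 1]
key = key_from_emission_map(pattern)
print(key)  # 64 hex characters; flipping any single bit yields a new key
```

Because the underlying pattern arises from random growth rather than programming, no two films should produce the same key, which is the essence of a physically unclonable function.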


Artificial Intelligence Chip Analyzes Molecular-level Data In Real Time

Nano Global, an Austin-based molecular data company, today announced that it is developing a chip using intellectual property (IP) from Arm, the world’s leading semiconductor IP company. The technology will help redefine how global health challenges – from superbugs to infectious diseases to cancer – are conquered.

The pioneering system-on-chip (SoC) will yield highly-secure molecular data that can be used in the recognition and analysis of health threats caused by pathogens and other living organisms. Combined with the company’s scientific technology platform, the chip leverages advances in nanotechnology, optics, artificial intelligence (AI), blockchain authentication, and edge computing to access and analyze molecular-level data in real time.

“In partnership with Arm, we’re tackling the vast frontier of molecular data to unlock the unlimited potential of this universe,” said Steve Papermaster, Chairman and CEO of Nano Global. “The data our technology can acquire and process will enable us to create a safer and healthier world.”

“We believe the technology Nano Global is delivering will be an important step forward in the collective pursuit of care that improves lives through the application of technology,” explained Rene Haas, executive vice president and president of IPG, Arm. “By collaborating with Nano Global, Arm is taking an active role in developing and deploying the technologies that will move us one step closer to solving complex health challenges.”

Additionally, Nano Global will be partnering with several leading institutions, including Baylor College of Medicine and National University of Singapore, on broad research initiatives in clinical, laboratory, and population health environments to accelerate data collection, analysis, and product development.
The initial development of the chip is in progress, with first delivery expected by 2020. The company is already adding new partners to its platform.


Graphene Ripples, Clean And Limitless Energy Source

Graphene is a seemingly impossible material. For years, scientists had theorized that lifting a single layer of carbon atoms from a chunk of graphite could produce the first two-dimensional material, which they called graphene. Finally, in 2004, this was accomplished by two physicists at the University of Manchester, who earned the Nobel Prize in Physics for this breakthrough. There was a problem, however: two-dimensional materials violate the laws of physics. Without the support of a substrate, physics predicts they would tear apart or melt, even at a temperature of absolute zero. Physicists had to find a loophole to explain their existence.

That loophole turned out to be related to a phenomenon known as Brownian motion, small random fluctuations of the carbon atoms that make up graphene. This causes the material to ripple into the third dimension, similar to waves moving across the surface of the ocean. These movements in and out of the flat surface allow graphene to stay comfortably within the laws of physics.

Ever since Robert Brown discovered Brownian motion in 1827, scientists have wondered whether they could harvest this motion as a source of energy. The research of Paul Thibado, professor of physics at the University of Arkansas, provides strong evidence that the motion of graphene could indeed be used as a source of clean, limitless energy. Other researchers have theorized that temperature-induced curvature inversion in graphene could be used as an energy source, and even predicted the amount of energy they could produce. What sets Thibado’s work apart is his discovery that graphene has naturally occurring ripples that invert their curvature as the atoms vibrate in response to the ambient temperature.

“This is the key to using the motion of 2D materials as a source of harvestable energy,” Thibado said. Unlike atoms in a liquid, which move in random directions, atoms connected in a sheet of graphene move together. This means their energy can be collected using existing nanotechnology.
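For a sense of scale, the energy driving this motion is thermal: roughly kT per degree of freedom. A one-line estimate in Python (room temperature is an assumed value; the result is a textbook physics figure, not a number from Thibado's study):

```python
BOLTZMANN = 1.380649e-23  # J/K (exact, by SI definition)

def thermal_energy(temperature_k):
    """Characteristic thermal energy kT that drives Brownian motion, in joules."""
    return BOLTZMANN * temperature_k

kT_room = thermal_energy(300.0)
print(kT_room)  # about 4.1e-21 J per degree of freedom
```

This tiny per-atom energy is why coordinated motion matters: only because the atoms in a graphene ripple move together does the energy add up to something harvestable.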

These results have been published in the journal Physical Review Letters.


How To Use Computers’ Heat To Generate Electricity

Electronic devices such as computers generate heat that mostly goes to waste. Physicists at Bielefeld University (Germany) have found a way to use this energy: they apply the heat to generate magnetic signals known as ‘spin currents’. In the future, these signals could replace some of the electrical current in electronic components. In a new study, the physicists tested which materials can generate this spin current most effectively from heat. The research was carried out in cooperation with colleagues from the University of Greifswald, Gießen University, and the Leibniz Institute for Solid State and Materials Research in Dresden.

The Bielefeld physicists are working on the basic principles for making data processing more effective and energy-efficient in the young field of ‘spin caloritronics’. They are members of the ‘Thin Films & Physics of Nanostructures’ research group headed by Professor Dr. Günter Reiss. Their new study determines the strength of the spin current for various combinations of thin films.

A spin current is produced by differences in temperature between two ends of an electronic component. These components are extremely small and only one millionth of a millimetre thick. Because they are composed of magnetic materials such as iron, cobalt, or nickel, they are called magnetic nanostructures.

The physicists take two such nanofilms and place a layer of metal oxide between them that is only a few atoms thick. They heat up one of the external films – for example, with a hot nanowire or a focused laser. Electrons with a specific spin orientation then pass through the metal oxide. This produces the spin current. Spin can be pictured as an electron rotating about its own axis – either clockwise or anticlockwise.

Their findings have been published in the research journal ‘Nature Communications’.


Tesla Electric Truck Travels 500 Miles (805 km) On A Single Charge

The main course was expected: a pair of sleek silver Tesla semi-trucks that get 500 miles per charge, go from zero to 60 mph in five seconds and — if the hype is to be believed — promise to single-handedly transform the commercial trucking industry. But dessert was a surprise: A bright red prototype of the newest Tesla Roadster, a revamped version of the company’s debut vehicle that can travel from Los Angeles to San Francisco and back on a single charge and go from zero to 60 mph in under two seconds. If true, that would make the $200,000 sports car the fastest production car ever made.

On Thursday night, Tesla chief executive Elon Musk delivered both dishes to a packed crowd at the company’s design studio in Hawthorne, Calif.

“What does it feel like to drive this truck?” Musk asked the audience, shortly after his latest creations rolled onto the stage. “It’s amazing! It’s smooth, just like driving a Tesla.” “It’s unlike any truck that you’ve ever driven,” he added, noting that Tesla’s big rig puts the driver at the center of the vehicle like a race car, but surrounded with touchscreen displays like those found in the Model 3. “I can drive this thing and I have no idea how to drive a semi.”

Range anxiety has always been a key concern for anyone who is weighing the purchase of an electric vehicle. Musk sought to reassure potential buyers that the company’s big rigs can match — and surpass — the performance of a diesel engine, which he referred to as “economic suicide.” Musk did not reveal the truck’s exact price, but argued that a diesel truck would be 20 cents more expensive per mile than Tesla’s electric counterpart, which will be available for purchase in 2019.
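Musk's claimed $0.20-per-mile gap compounds quickly over a truck's working life. A back-of-the-envelope calculation (the 100,000 miles per year is an assumed figure for a long-haul truck, not a number from the presentation):

```python
def annual_savings(miles_per_year, cost_gap_per_mile=0.20):
    """Savings implied by the claimed $0.20/mile advantage over diesel."""
    return miles_per_year * cost_gap_per_mile

# Assumed figure: a long-haul truck covering 100,000 miles per year.
print(annual_savings(100_000))  # 20000.0 dollars per truck per year
```

At fleet scale, a per-mile gap of this size is the kind of arithmetic that underlies the "economic suicide" framing.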


AI, “worst event in the history of our civilisation” says Stephen Hawking

Stephen Hawking has sent a stark warning out to the world, stating that the invention of artificial intelligence (AI) could be the “worst event in the history of our civilisation”. Speaking at the Web Summit technology conference in Lisbon, Portugal, the theoretical physicist reiterated his warning against the rise of powerful, conscious machines.
While Prof Hawking admitted that AI could be used for good, he also stated that humans need to find a way to control it so that it does not become more powerful than us, as “computers can, in theory, emulate human intelligence, and exceed it.” Looking at the positives, the 75-year-old said AI could help undo some of the damage that humans have inflicted on the natural world, help beat disease and “transform” every aspect of society. But there are negatives that come with it.

“Success in creating effective AI could be the biggest event in the history of our civilisation. Or the worst. We just don’t know. So we cannot know if we will be infinitely helped by AI, or ignored by it and side-lined, or conceivably destroyed by it. Unless we learn how to prepare for, and avoid, the potential risks, AI could be the worst event in the history of our civilisation. It brings dangers, like powerful autonomous weapons, or new ways for the few to oppress the many. It could bring great disruption to our economy,” explains the University of Cambridge alumnus.

Prof Hawking added that to make sure AI is in line with our goals, creators need to “employ best practice and effective management.” But he still has hope: “I am an optimist and I believe that we can create AI for the good of the world. That it can work in harmony with us. We simply need to be aware of the dangers, identify them, employ the best possible practice and management, and prepare for its consequences well in advance.”

Just last week, Prof Hawking warned that AI will replace us as the dominant being on the planet.


Sophia The Robot Says: ‘I have feelings too’

Until recently, the most famous thing that Sophia the robot had ever done was beat Jimmy Fallon a little too easily in a nationally televised game of rock-paper-scissors.


But now, the advanced artificial intelligence robot — which looks like Audrey Hepburn, mimics human expressions and may be the grandmother of robots that solve the world’s most complex problems — has a new feather in her cap:


The kingdom of Saudi Arabia officially granted citizenship to the humanoid robot last week during a program at the Future Investment Initiative, a summit that links deep-pocketed Saudis with inventors hoping to shape the future.

Sophia’s recognition made international headlines — and sparked an outcry against a country with a shoddy human rights record that has been accused of making women second-class citizens.


Ultra-fast Data Processing At Nanoscale

Advancement in nanoelectronics – the use of nanotechnology in electronic components – has been fueled by the ever-increasing need to shrink the size of electronic devices like nanocomputers, in a bid to produce smaller, faster and smarter gadgets such as computers, memory storage devices, displays and medical diagnostic tools.

While most advanced electronic devices are powered by photonics – the use of photons to transmit information – photonic elements are usually large in size, and this greatly limits their use in many advanced nanoelectronic systems. Plasmons, which are waves of electrons that move along the surface of a metal after it is struck by photons, hold great promise for disruptive technologies in nanoelectronics. They are comparable to photons in terms of speed (they also travel at the speed of light), yet they are much smaller. This unique property of plasmons makes them ideal for integration with nanoelectronics. However, earlier attempts to harness plasmons as information carriers had little success.

Addressing this technological gap, a research team from the National University of Singapore (NUS) has recently invented a novel “converter” that can harness the speed and small size of plasmons for high frequency data processing and transmission in nanoelectronics.

“This innovative transducer can directly convert electrical signals into plasmonic signals, and vice versa, in a single step. By bridging plasmonics and nanoscale electronics, we can potentially make chips run faster and reduce power losses. Our plasmonic-electronic transducer is about 10,000 times smaller than optical elements. We believe it can be readily integrated into existing technologies and can potentially be used in a wide range of applications in the future,” explained Associate Professor Christian Nijhuis from the Department of Chemistry at the NUS Faculty of Science, who is the leader of the research team behind this breakthrough.

This novel discovery was first reported in the journal Nature Photonics.


The Ultra Smart Community Of The Future

Japan’s largest electronics show, CEATEC, is showcasing its version of our future: a connected world with intelligent robots and cars that know when the driver is falling asleep. This is Omron’s “Onboard Driving Monitoring Sensor,” which checks that its driver isn’t distracted.


“We are developing sensors that help the car judge what state the driver is in, with regards to driving. For example, if the driver has his eyes open and set on things he should be looking at, if the driver is distracted or looking at smartphones, and these types of situations,” explains Masaki Suwa, Omron Corp. Chief Technologist.

After 18 years of consumer electronics, CEATEC is changing focus to the Internet of Things and what it calls ‘the ultra-smart community of the future’. A future where machines take on more important roles – machines like Panasonic’s CaloRieco: pop in your plate, and it knows exactly what you are about to consume.

“By placing freshly cooked food inside the machine, you can measure total calories and the three main nutrients: protein, fat and carbohydrate. By using this machine, you can easily manage your diet,” says Panasonic staff engineer Ryota Sato.

Even playtime will see machines more involved – like Forpheus, the ping-pong-playing robot, here taking on an Olympic bronze medalist and learning with every stroke.
Rio Olympics table tennis bronze medalist Jun Mizutani reports: “It wasn’t any different from playing with a human being. The robot kept improving and getting better as we played, and to be honest, I wanted to play with it when it had reached its maximum level, to see how good it is.”

Optical Computer

Researchers at the University of Sydney (Australia) have dramatically slowed digital information carried as light waves by transferring the data into sound waves in an integrated circuit, or microchip. Transferring information from the optical to the acoustic domain and back again inside a chip is critical for the development of photonic integrated circuits: microchips that use light instead of electrons to manage data.

These chips are being developed for use in telecommunications, optical fibre networks and cloud computing data centers where traditional electronic devices are susceptible to electromagnetic interference, produce too much heat or use too much energy.

“The information in our chip in acoustic form travels at a velocity five orders of magnitude slower than in the optical domain,” said Dr Birgit Stiller, research fellow at the University of Sydney and supervisor of the project.

“It is like the difference between thunder and lightning,” she said.

This delay allows the data to be briefly stored and managed inside the chip for processing, retrieval and further transmission as light waves. Light is an excellent carrier of information and is useful for taking data over long distances between continents through fibre-optic cables.

But this speed advantage can become a nuisance when information is being processed in computers and telecommunication systems.
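The "thunder and lightning" analogy can be put in numbers. Taking the article's factor of five orders of magnitude and an assumed on-chip light velocity of roughly 2×10⁸ m/s (both velocities below are illustrative assumptions, not measurements from the study), a sketch of the transit times over a 1 cm path:

```python
def transit_time(length_m, velocity_m_s):
    """Time for a signal to traverse a path of the given length, in seconds."""
    return length_m / velocity_m_s

# Assumed velocities: light in an on-chip waveguide at roughly 2e8 m/s; the
# acoustic form is five orders of magnitude slower, per the article.
v_light = 2e8
v_sound = v_light / 1e5   # 2000 m/s, a typical speed of sound in a solid

length = 0.01  # a 1 cm on-chip path
t_optical = transit_time(length, v_light)    # tens of picoseconds
t_acoustic = transit_time(length, v_sound)   # a few microseconds
print(t_optical, t_acoustic)
```

Those extra microseconds are the buffer: long enough for on-chip processing, yet short enough that the data can be converted back to light and sent on its way.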


Very Fast Magnetic Data Storage

For almost seventy years now, magnetic tapes and hard disks have been used for data storage in computers. In spite of many new technologies that have been developed in the meantime, the controlled magnetization of a data storage medium remains the first choice for archiving information because of its longevity and low price. As a means of realizing random access memories (RAMs), however, which are used as the main memory for processing data in computers, magnetic storage technologies were long considered inadequate, mainly because of their low writing speed and relatively high energy consumption.

In 1956, IBM introduced the first magnetic hard disc, the RAMAC. ETH researchers have now tested a novel magnetic writing technology that could soon be used in the main memories of modern computers.

Pietro Gambardella, Professor at the Department of Materials of the Eidgenössische Technische Hochschule Zürich (ETHZ, Switzerland), and his colleagues, together with colleagues at the Physics Department and at the Paul Scherrer Institute (PSI), have now shown that using a novel technique, magnetic storage can still be achieved very fast and without wasting energy.

In 2011, Gambardella and his colleagues already demonstrated a technique that could do just that: An electric current passing through a specially coated semiconductor film inverted the magnetization in a tiny metal dot. This is made possible by a physical effect called spin-orbit-torque. In this effect, a current flowing in a conductor leads to an accumulation of electrons with opposite magnetic moment (spins) at the edges of the conductor. The electron spins, in turn, create a magnetic field that causes the atoms in a nearby magnetic material to change the orientation of their magnetic moments. In a new study the scientists have now investigated how this process works in detail and how fast it is.

The results were recently published in the scientific journal Nature Nanotechnology.