Innovation and Technology Weekly – No. 31, 2016

This is the online version of UNU-MERIT's I&T Weekly, which is sent out by email every Friday.


This week's headlines:

New theory of gravity might explain dark matter
Implants hack reflexes to let paralysed monkeys move their legs
New program reads lips with superhuman accuracy
Tiny fingertip camera helps blind people read without braille
First colour images produced by an electron microscope
Face electrodes let you taste and chew in virtual reality

New theory of gravity might explain dark matter
November 08, 2016

A new theory of gravity by physicist Erik Verlinde from the University of Amsterdam might explain the curious motions of stars in galaxies. Emergent gravity, as the new theory is called, predicts exactly the same deviations in motion that are usually explained by invoking dark matter.

In 2010, Verlinde surprised the world with a completely new theory of gravity, which says that gravity is not a fundamental force of nature but an emergent phenomenon. In the same way that temperature arises from the movement of microscopic particles, gravity emerges from changes in fundamental bits of information stored in the very structure of spacetime. Extending this work, Verlinde now shows how to understand the behaviour of stars in galaxies without adding the puzzling dark matter.
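
The flavour of the 2010 argument can be sketched in a few lines; what follows is the standard textbook reconstruction, not Verlinde's full derivation:

```latex
% Entropic-force sketch (standard reconstruction of the 2010 argument).
% A mass m approaching a holographic screen by \Delta x shifts the
% screen's entropy (Bekenstein):
\Delta S = 2\pi k_B\,\frac{m c}{\hbar}\,\Delta x
% An observer with acceleration a sees the Unruh temperature:
k_B T = \frac{\hbar a}{2\pi c}
% The entropic force F = T\,\Delta S/\Delta x then returns Newton's law:
F = \frac{\hbar a}{2\pi c\,k_B}\cdot 2\pi k_B\,\frac{m c}{\hbar} = m a
```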

The outer regions of galaxies rotate much faster around the centre than can be accounted for by the quantity of ordinary matter such as stars, planets and interstellar gases. Something else has to produce the required amount of gravitational force, so physicists proposed the existence of dark matter. However, Verlinde claims there is no need to add a dark matter particle: his theory of gravity accurately predicts the velocities at which stars rotate around the centre of the Milky Way, as well as the motion of stars inside other galaxies.
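
The discrepancy any such theory must explain can be stated in one line of Newtonian mechanics:

```latex
% Rotation-curve sketch: a star of mass m orbiting at radius r needs a
% centripetal force, supplied in Newtonian gravity by the enclosed mass M(r):
\frac{m v^2}{r} = \frac{G\,M(r)\,m}{r^2}
\qquad\Longrightarrow\qquad
v(r) = \sqrt{\frac{G\,M(r)}{r}}
% Beyond the visible disc M(r) barely grows, so v(r) should fall off as
% 1/\sqrt{r}; measured curves instead stay roughly flat - the gap usually
% attributed to dark matter.
```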

One of the ingredients in the new theory is an adaptation of the holographic principle, according to which all the information in the entire universe can be described on a giant imaginary sphere around it. Verlinde now shows that this idea is not quite correct - part of the information in our universe is contained in space itself.

This extra information is required to describe the other dark component of the universe: dark energy, which is believed to be responsible for the accelerated expansion of the universe. Investigating the effects of this additional information on ordinary matter, Verlinde comes to a stunning conclusion. Whereas ordinary gravity can be encoded using the information on the imaginary sphere around the universe, as he showed in his 2010 work, the additional information in the bulk of space produces a force that nicely matches the one attributed to dark matter.

Full story: PhysOrg / arXiv


Implants hack reflexes to let paralysed monkeys move their legs
November 09, 2016

Some animals have walking reflexes governed by nerves in their spine - it's why a chicken continues to run after its head has been cut off. Now these reflexes have let paralysed monkeys regain use of their legs after a week or two of practice. Previous methods have taken months.

We have no reliable means to reconnect severed nerves in people with injured spinal cords. One way to overcome paralysis might be to detect a person's desire to move and use this to stimulate nerves or muscles.

Last year, a paralysed man walked thanks to a cap of electrodes that read his brainwaves and implants that stimulated his leg muscles. But directly stimulating muscles in this way can make movements jerky and uncoordinated. Coordination is normally handled by circuits in the spine, which control walking once it has been initiated by the brain - as happens in headless chickens.

A team from the Swiss Federal Institute of Technology in Lausanne has found a way to exploit this using spinal implants in monkeys. The spinal cord was severed on one side above the implant, and a second implant was placed in the part of the brain that controls the affected leg. This implant detected when the monkey wanted to move and sent signals to the device in the spine. Within six days, the first monkey was crawling using both legs; a second animal did so within two weeks.
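
As a rough illustration of that control flow, here is a toy loop in Python; both device interfaces are hypothetical stand-ins, not the team's actual system:

```python
import random

def decoded_intention():
    """Stand-in for the brain implant: a decoded probability that the
    animal intends to move the affected leg (hypothetical interface)."""
    return random.random()

def stimulate_spine(phase):
    """Stand-in for the spinal implant, which recruits the cord's own
    locomotor circuits rather than driving individual muscles."""
    print(f"spinal stimulation: {phase}")

# The brain-side reading only gates *when* stimulation happens; the
# spinal circuits coordinate the stepping itself, as in the reflexes
# described above.
for cycle in range(6):
    if decoded_intention() > 0.5:              # movement intention detected
        stimulate_spine("swing" if cycle % 2 else "stance")
```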

Full story: New Scientist / Nature


New program reads lips with superhuman accuracy
November 08, 2016

Scientists at the University of Oxford have developed software that can read lips at a level that far surpasses the best professionals. The researchers said the program, dubbed 'LipNet', could be used to improve hearing aids, enable conversations in noisy places or add speech to silent movies.

The researchers, working with Google's artificial intelligence division DeepMind, trained the software on more than 30,000 videos of test subjects speaking sentences. Over time, it learned to match particular lip movements with the words being spoken.

The researchers then played it further videos of people speaking sentences; the LipNet software achieved 93.4% accuracy. This compares with 52.3% for hearing-impaired students and surpasses other lip-reading programs.

Unlike previous software, LipNet digests phrases as full sentences; putting words in context rather than deciphering them individually allows much greater accuracy. It also means the software does not need to split a video into separate clips for each word.
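
As a rough sketch of this sentence-level approach (illustrative only - not the authors' actual architecture, and all layer sizes are made up): a spatiotemporal convolution encodes lip motion, a recurrent layer adds sentence context, and a CTC loss aligns frame-level predictions to the transcript without any per-word segmentation.

```python
import torch
import torch.nn as nn

class SentenceLipReader(nn.Module):
    """Toy sentence-level lip reader: a spatiotemporal convolution encodes
    lip motion, a recurrent layer adds sentence context, and a per-frame
    character classifier feeds a CTC loss."""

    def __init__(self, num_chars=28, hidden=128):
        super().__init__()
        self.conv = nn.Sequential(
            # 3D convolution over (time, height, width) captures motion
            nn.Conv3d(1, 32, kernel_size=(3, 5, 5), padding=(1, 2, 2)),
            nn.ReLU(),
            nn.MaxPool3d(kernel_size=(1, 2, 2)),    # pool space, keep time
        )
        self.gru = nn.GRU(input_size=32 * 32 * 32, hidden_size=hidden,
                          bidirectional=True, batch_first=True)
        self.fc = nn.Linear(2 * hidden, num_chars)  # characters + CTC blank

    def forward(self, video):                 # video: (B, 1, T, 64, 64)
        x = self.conv(video)                  # -> (B, 32, T, 32, 32)
        b, c, t, h, w = x.shape
        x = x.permute(0, 2, 1, 3, 4).reshape(b, t, c * h * w)
        x, _ = self.gru(x)                    # sentence-level context
        return self.fc(x).log_softmax(-1)     # per-frame char log-probs

model = SentenceLipReader()
frames = torch.randn(2, 1, 16, 64, 64)        # two 16-frame mouth clips
log_probs = model(frames).permute(1, 0, 2)    # (T, B, C) as CTCLoss expects
targets = torch.randint(1, 28, (2, 6))        # dummy character transcripts
# CTC aligns frame predictions to whole transcripts - no word segmentation.
loss = nn.CTCLoss(blank=0)(log_probs, targets,
                           torch.full((2,), 16, dtype=torch.long),
                           torch.full((2,), 6, dtype=torch.long))
```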

Full story: Daily Telegraph


Tiny fingertip camera helps blind people read without braille
November 09, 2016

To read printed material, many visually impaired people rely on mobile apps like KNFB Reader that translate text to speech. Snap a picture and the app reads the page aloud. But users sometimes find it difficult to ensure that their photo captures all of the text, and these apps can have trouble parsing a complex layout, such as a newspaper or restaurant menu.

Now researchers from the University of Maryland have developed a device, nicknamed HandSight, that uses a tiny camera originally developed for endoscopies. Measuring just one millimetre across, the camera sits on the tip of the finger, while the rest of the device clasps onto the finger and wrist. As the user follows a line of text with their finger, a nearby computer reads it out. Audio cues or haptic buzzes help the user make their way through the text, for example by changing pitch or gently vibrating to nudge their finger into the correct position.
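
A toy version of such a feedback mapping might look like the following; the thresholds and the pitch rule are illustrative, not the published design:

```python
def guidance_cue(offset_mm, line_height_mm=4.0):
    """Map vertical fingertip drift from the current text line to an audio
    pitch and a vibration strength. All numbers here are illustrative, not
    the published system's values."""
    drift = offset_mm / line_height_mm      # normalised drift; + = above line
    if abs(drift) < 0.25:                   # close enough: no cue needed
        return {"tone_hz": None, "buzz": 0.0}
    tone_hz = 440.0 * 2.0 ** (-drift)       # pitch drops when the finger is
                                            # too high, rises when too low
    buzz = min(1.0, abs(drift))             # stronger buzz for larger drift
    return {"tone_hz": round(tone_hz, 1), "buzz": round(buzz, 2)}

print(guidance_cue(1.5))    # a bit above the line -> gentle, lower tone
print(guidance_cue(-4.0))   # a full line below -> strong cue, higher tone
```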

In a study published in October, 19 blind people tried out the technology, spending a couple of hours exploring passages from a school textbook and a magazine-style page. On average, they read between 63 and 81 words per minute and missed only a few words in each passage. The average reading speed for an expert braille reader is around 90 to 115 words per minute, while sighted individuals read at around 200 words per minute.

The creators of HandSight envisage a much more dynamic, smartwatch-like device that blind people could use not only to read text but also to discern other visual characteristics, like colours and patterns.

Full story: New Scientist / ACM Transactions on Accessible Computing


First colour images produced by an electron microscope
November 03, 2016

Imagine spending your whole life seeing the world in black and white, and then seeing a vase of roses in full colour for the first time. That's kind of what it was like for the scientists who have taken the first multicolour images of cells using an electron microscope.

Electron microscopes can magnify an object up to 10 million times, allowing researchers to peer into the inner workings of, say, a cell or a fly's eye, but until now they've only been able to see in black and white.

The new advance - 15 years in the making - uses three different rare earth metals, known as lanthanides, layered one by one over cells on a microscope slide. The microscope detects when each metal loses electrons and records each unique loss as an artificial colour. So far, the researchers can only produce the colours red, green and yellow.
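
As an illustration of the false-colour step, here is a toy compositing routine; the detection maps and colour assignments are made up, not the group's pipeline:

```python
import numpy as np

def false_colour_composite(grey_em, signal_maps, colours):
    """Toy false-colour overlay: start from the greyscale electron image
    and tint each pixel by the per-metal signal detected there."""
    rgb = np.repeat(grey_em[..., None], 3, axis=-1).astype(float)
    for signal, colour in zip(signal_maps, colours):
        alpha = np.clip(signal, 0.0, 1.0)[..., None]    # detection strength
        rgb = rgb * (1.0 - alpha) + np.asarray(colour) * alpha
    return np.clip(rgb, 0.0, 1.0)

grey = np.random.rand(64, 64)                           # dummy EM image
metal_a = (np.random.rand(64, 64) > 0.9).astype(float)  # first lanthanide map
metal_b = (np.random.rand(64, 64) > 0.9).astype(float)  # second lanthanide map
image = false_colour_composite(grey, [metal_a, metal_b],
                               [(1, 0, 0), (0, 1, 0)])  # red and green tints
```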

Still, the ability to use colour creates stark contrasts that greyscale images simply can't match. The team could, for example, see a string of proteins squeezing through a cell membrane in more detail than ever before. With a few more tweaks and added metal ions, the researchers hope to add three or four more colours to the mix and improve the images' resolution.

Full story: ScienceMag / Cell Chemical Biology


Face electrodes let you taste and chew in virtual reality
November 04, 2016

You're having dinner in a virtual reality game. The banquet scene in front of you looks so real that your mouth is watering. Normally, you would be disappointed, but not this time. You approach the food, stick out your tongue - and taste the flavours on display. You move your jaw to chew - and feel the food's texture between your teeth.

Experiments with 'virtual food' use electronics to emulate the taste and feel of the real thing, even when there's nothing in your mouth. This tech could add new sensory inputs to virtual reality or augment real-world dining experiences, especially for people with restricted diets or health issues that affect their ability to eat.

Researchers from the National University of Singapore started experimenting with thermal stimulation. Their project uses changes in temperature to mimic the sensation of sweetness on the tongue. The user places the tip of their tongue on a square of thermoelectric elements that are rapidly heated or cooled, hijacking thermally sensitive neurons that normally contribute to the sensory code for taste. The researchers envisage such a system embedded in a glass or mug to make low-sugar drinks taste sweeter.

However, texture is every bit as important. A team from the University of Tokyo has developed a device that uses electricity to simulate the experience of chewing foods of different textures. It also uses electrodes, but not on the tongue: instead, they are placed on the masseter muscle - a muscle in the jaw used for chewing - to give sensations of hardness or chewiness as a user bites down.

Full story: New Scientist


UNU-MERIT