SIM Card Man Even Closer


A little bit more about Nigel Kerner’s SIM Card Man:

[Image: a moth with a SIM computer chip attached to its back]


Sources:
‘Part moth, part machine: Cyborgs are on the move’
‘Cyborgs walk like a lamprey, dance like a moth’
by Duncan Graham-Rowe
New Scientist, 3 November 2010

By tapping into the mind of a sex-mad moth or the spine of a lamprey,
robots can track scents or walk like a living organism.

A MALE silk moth gets a whiff of pheromones and begins a complex search pattern to track down a potential mate – a brief surge forward, an intricate zigzag, a sweeping loop. For this deluded moth there is no female to find, and its movements are enacted by a wheeled robot plugged into its lovesick brain.

This cyborg moth is the latest demonstration of how scientists coax complex behaviour from robots by tapping into the nervous systems of living organisms, co-opting algorithms that already exist in nature. “Biological organisms can solve problems that are too difficult for computer engineers,” says Ferdinando Mussa-Ivaldi, a pioneer in the field at Northwestern University in Evanston, Illinois.

The mothborg’s dance is a strategy to locate the source of the pheromone. Chemical plume tracking, as mathematicians call it, could prove very useful for sniffing out explosives. Programming machines to track a chemical plume efficiently is still too challenging, says mothborg creator Atsushi Takashima at the Tokyo Institute of Technology in Japan. He recently presented the research at the International Conference on Intelligent Robots and Systems in Taipei, Taiwan. “Chemicals do not make a smooth gradient in the air,” he says. But these insects have evolved mechanisms for solving the task even when the weather is windy.
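
The surge-zigzag-loop pattern described above is straightforward to state as an algorithm, which makes it clear why it is attractive to roboticists. The sketch below is purely illustrative, not Takashima’s controller: the state names, timings and probabilities are assumptions. It surges upwind on a fresh odour hit, casts side to side when the plume is briefly lost, and falls back to wide loops when it is gone.

```python
import random

def plume_tracking_step(odor_detected, state):
    """One control step of the surge / cast / loop search pattern."""
    if odor_detected:
        state["since_hit"] = 0
        return "surge"                  # fresh hit: push straight upwind
    state["since_hit"] += 1
    if state["since_hit"] <= 10:
        # Plume only just lost: zigzag (cast) across the wind line,
        # flipping direction every few steps.
        if state["since_hit"] % 3 == 0:
            state["cast_dir"] *= -1
        return "cast_left" if state["cast_dir"] > 0 else "cast_right"
    return "loop"                       # plume gone: sweep a wide loop

# Toy run: random hits stand in for a real pheromone sensor.
state = {"since_hit": 0, "cast_dir": 1}
for _ in range(20):
    print(plume_tracking_step(random.random() < 0.3, state))
```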

Takashima and colleagues immobilised their moth on a small wheeled robot and placed two recording electrodes into nerves running down its neck to monitor commands the moth uses to steer. By rerouting these signals to motors in the robot, they found that they could emulate the moth’s plume-tracking behaviour.
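
The rerouting step is conceptually simple: treat the firing rate on each neck-nerve electrode as a steering signal and map the pair onto the speeds of the robot’s two wheels. The sketch below is one guess at that mapping for a differential-drive robot; the gain and function names are assumptions, not the team’s actual interface.

```python
def nerve_to_wheels(left_rate_hz, right_rate_hz, gain=0.01):
    """Map left/right motor-nerve firing rates to wheel speeds (m/s)."""
    forward = gain * (left_rate_hz + right_rate_hz) / 2  # shared drive
    turn = gain * (left_rate_hz - right_rate_hz)         # steering term
    # A stronger burst on the left nerve slows the left wheel relative
    # to the right one, so the robot veers left, mirroring the moth's turn.
    left_wheel = forward - turn / 2
    right_wheel = forward + turn / 2
    return left_wheel, right_wheel

# Example: a burst on the left nerve makes the robot veer left.
print(nerve_to_wheels(left_rate_hz=120, right_rate_hz=60))  # ~(0.6, 1.2)
```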

Over the last decade similar techniques have been used to create a menagerie of cyborgs, from fish-brained bots that can follow a light source to living automata, such as rats and cockroaches, that can be steered with a remote-controlled zap to the brain.

Piggy-backing a live organism on a robot is less than ideal, so the goal is to recreate biological circuits in silicon, says Mussa-Ivaldi. This is difficult, as it is not clear how individual neurons work, let alone vast circuits of them. But some progress has been made, in particular with central pattern generators (CPGs): self-contained oscillating circuits that exist in the spines of many vertebrates and which are involved in locomotion. CPGs are among many types of behavioural circuits in the brain and spine that carry out routine tasks for us, allowing us to walk or grasp an object with little or no conscious input.
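
A CPG can be approximated in a few lines as a pair of coupled oscillators that settle into anti-phase, producing the alternating left-right rhythm of locomotion with no external input. The minimal sketch below uses simple phase coupling; the parameters are illustrative, not taken from any biological recording.

```python
import math

def cpg_step(phases, dt=0.01, freq_hz=1.0, coupling=2.0):
    """Advance two coupled phase oscillators one time step."""
    p0, p1 = phases
    # Each oscillator runs at its natural frequency plus a coupling term
    # that pulls the pair toward a half-cycle (anti-phase) offset.
    dp0 = 2 * math.pi * freq_hz + coupling * math.sin(p1 - p0 - math.pi)
    dp1 = 2 * math.pi * freq_hz + coupling * math.sin(p0 - p1 - math.pi)
    return (p0 + dp0 * dt, p1 + dp1 * dt)

phases = (0.0, 0.5)                 # deliberately start near in-phase
for _ in range(2000):
    phases = cpg_step(phases)
# After settling, the two outputs alternate: one "leg" flexes while the
# other extends, with no external input driving the rhythm.
left, right = math.sin(phases[0]), math.sin(phases[1])
print(f"left drive {left:+.2f}, right drive {right:+.2f}")
```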

Ralph Etienne-Cummings at Johns Hopkins University in Baltimore, Maryland, has used recordings of CPGs taken from a lamprey to generate walking motions in a pair of robotic legs, which he calls RedBot. Lampreys can’t walk, of course, but their CPGs are similar to the human CPGs that create the rhythmic commands driving our leg muscles. In RedBot, Etienne-Cummings has demonstrated that lamprey CPG signals can be used to create natural gaits – walking, running and going up steps – adapting in real time to a changing environment.
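
One rough way to picture how a single rhythm generator could serve several gaits is a table of per-gait frequency and amplitude settings applied to the same underlying oscillation. The parameters and names below are invented for illustration; RedBot’s real mapping from lamprey recordings is more involved.

```python
import math

# Invented gait table: each gait rescales the same rhythm's frequency
# and leg-lift amplitude. Not RedBot's actual parameters.
GAITS = {
    "walk":   {"freq_hz": 1.0, "amplitude": 0.5},
    "run":    {"freq_hz": 2.5, "amplitude": 0.9},
    "stairs": {"freq_hz": 0.7, "amplitude": 1.0},  # slower, higher lift
}

def leg_drive(t, gait_name, leg="left"):
    """Rhythmic drive for one leg at time t; the legs run in anti-phase."""
    g = GAITS[gait_name]
    offset = 0.0 if leg == "left" else math.pi
    return g["amplitude"] * math.sin(2 * math.pi * g["freq_hz"] * t + offset)

# Switching gait just swaps the parameter set; the rhythm keeps running.
for t in (0.0, 0.1, 0.2):
    print(f"t={t:.1f}  walk={leg_drive(t, 'walk'):+.2f}  "
          f"run={leg_drive(t, 'run'):+.2f}")
```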

However, for Etienne-Cummings and colleagues, RedBot is merely a stepping stone. Their aim is to replicate the circuits in a chip and implant it into people with spinal injuries so that they can walk again. They have already shown this is possible in paralysed cats.

While reproducing human brain circuitry in any detail is still a long way off, Shigeru Sakurazawa and colleagues at the Future University Hakodate in Hokkaido, Japan, have found a way to tap into the human nervous system as a whole, and without so much as a scalpel. An on-screen robot navigating a simple maze takes inputs from skin sensors worn by an observer.

As the robot bumbles about its simulated environment, the skin sensors detect when the person anticipates an impending collision, and the software uses these signals to alter the robot’s behaviour.
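
As a rough illustration of that loop, a rise in skin conductance above its resting baseline can be read as “the observer flinched” and used to nudge the simulated robot’s heading. The thresholds and names below are assumptions, not the Future University Hakodate implementation.

```python
def update_robot(heading_deg, gsr_microsiemens, baseline, threshold=0.5):
    """Steer the on-screen robot away when skin conductance spikes."""
    arousal = gsr_microsiemens - baseline   # rise above resting level
    if arousal > threshold:
        # The observer flinched first: veer off before the collision.
        heading_deg += 30.0 * min(arousal / threshold, 2.0)
    return heading_deg % 360.0

# A calm reading leaves the course alone; a spike forces a turn.
print(update_robot(90.0, gsr_microsiemens=2.1, baseline=2.0))  # 90.0
print(update_robot(90.0, gsr_microsiemens=3.4, baseline=2.0))  # 150.0
```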

The distinction between first-person and third-person becomes very confusing for the volunteer who is rigged up to the robot, says Sakurazawa. To anyone else watching, the “skin-bot” seems, well, less robotic.


