Dec 22 2008

KAR – the Kitchen Assistant Robot does the dishes

KAR (Kitchen Assistant Robot) is a dishwashing robot from Japan. Developed by teams at Panasonic and the University of Tokyo, the one-armed robot can gingerly handle your most precious chinaware, wash the entire set and then place the dishes in a kitchen dish rack.

KAR accomplishes this feat with 18 different sensors in its hand. The scientists behind the robot hope to have KAR ready for commercial use in about five years. For now the KAR robot is just a prototype, but grizzled drifters counting on greasy spoon diner dishwashing work might want to take notice. You can check out video of the KAR robot in action here.


Dec 21 2008

Fleets of robotic aircraft could improve weather forecasts

Weather forecasters may not have the best reputation for accuracy, but with today’s computational modeling, it’s possible to make pretty reliable weather predictions up to 48 hours in advance. Researchers at MIT, however, believe that autonomous aircraft running smart storm-chasing algorithms could get that figure up to four days. Better weather forecasting could help farmers and transportation authorities with planning and even save lives by providing earlier warnings about storms and severe weather, says Jonathan How, principal investigator at MIT’s Department of Aeronautics and Astronautics.

Long-term predictions don’t necessarily go wrong because of the forecasting models, but rather because the initial conditions are inaccurately measured, says Martin Ralph, a research meteorologist at the National Oceanic and Atmospheric Administration’s earth systems laboratory, in Boulder, CO. Such inaccuracies come from gaps in the data, he says.

Ground-based sensors are already used to record temperature, wind speed, humidity, air density, and rainfall, but they gauge conditions only at ground level, says How. At sea, where many severe weather fronts originate, the coverage is much sparser. Satellite observations help build up a picture, but satellites are blind to a number of useful types of data, such as low-altitude wind speed and atmospheric boundary conditions, says Ralph.

To get the most accurate readings, you really want to get your sensors into the weather itself, says How. In theory, weather balloons can do this, but only if they happen to be in the right place at the right time. So weather services currently attempt to track down weather systems using piloted planes that fly prescribed routes, taking measurements along the way. The logistics of deploying such planes is so complicated, however, that it’s difficult to change their routes in response to changing weather conditions.

Consequently, says How, there has been a lot of interest in using unmanned aerial vehicles, or UAVs, instead. The idea is that there would be a constant number of UAVs in the air, continuously working together to position themselves in what would collectively be the most useful locations.

The problem, says How, is that calculating the most useful locations is an enormously complex task. It involves analyzing more than a million data states from hundreds of thousands of sensor locations, and using this data to predict the weather conditions six to eight hours from now. But that’s exactly the challenge that the MIT researchers tackled.
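The post doesn’t describe the MIT algorithms themselves, but the underlying idea, picking the few measurement sites that most reduce forecast uncertainty, can be sketched as a greedy information-gain loop. Everything below (the Gaussian belief, the one-sensor-per-site observation model, the function names) is an illustrative assumption, not the group’s published method:

```python
import numpy as np

# Hypothetical sketch of targeted observation: greedily pick the UAV
# measurement sites that most shrink the entropy of a Gaussian belief
# over the atmospheric state. Not the MIT group's actual algorithm.

def entropy(cov):
    """Differential entropy of a Gaussian with covariance cov (up to a constant)."""
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * logdet

def condition_on(cov, idx, noise=0.1):
    """Posterior covariance after observing state component idx with noise."""
    H = np.zeros((1, cov.shape[0]))
    H[0, idx] = 1.0
    S = H @ cov @ H.T + noise      # innovation variance
    K = cov @ H.T / S              # Kalman gain
    return cov - K @ H @ cov

def pick_sites(cov, n_uavs):
    """Choose n_uavs sites, one at a time, by largest entropy reduction."""
    chosen = []
    for _ in range(n_uavs):
        gains = [entropy(cov) - entropy(condition_on(cov, i))
                 if i not in chosen else -np.inf
                 for i in range(cov.shape[0])]
        best = int(np.argmax(gains))
        chosen.append(best)
        cov = condition_on(cov, best)  # update the belief before the next pick
    return chosen

# Toy example: a 12-dimensional state with random correlations.
rng = np.random.default_rng(1)
A = rng.normal(size=(12, 12))
print(pick_sites(A @ A.T + np.eye(12), n_uavs=4))
```

At the scale How describes, with hundreds of thousands of candidate sites and over a million data states, even this naive greedy loop becomes intractable, which is precisely the efficiency problem the MIT algorithms target.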

So far, the algorithms they developed have been used only in a simulation, as part of a National Science Foundation project. MIT’s Han-Lim Choi, who has been working on the algorithms as part of his PhD research, presented the latest results of the project last week at the IEEE Conference on Decision and Control in Cancun, Mexico. The work has attracted the interest of the U.S. Navy, and the MIT group is applying for funding to put the algorithms into practice, says How.

One of the challenges presented by the project is fuel management, says Dario Floreano, an expert in flying robotics and head of the Laboratory of Intelligent Systems at the École Polytechnique Fédérale de Lausanne, in Switzerland. The algorithms will need to be able to quickly and efficiently reroute the UAVs so that they maintain optimal coverage, he says. “This will have to take into account many variables, including energy requirements for different reallocation strategies.”

Another challenge is size, says Floreano. The UAVs need to be small and safe enough to not harm humans and objects if they are deployed in large numbers. He points out, however, that subkilogram UAVs are now becoming available.

In fact, How and his colleagues are more interested in testing their algorithms on the relatively large ScanEagle UAVs from Boeing, which weigh about 18 kilograms apiece. These would be capable of flying distances in excess of 1,000 miles, even laden with sensors and communications equipment. With this sort of range, a fleet of just four could reasonably cover a good-sized area, reducing the risk of collisions with man-made objects.


Dec 18 2008

British scientist warns we must protect the vulnerable from robots

Top robotics expert Professor Noel Sharkey, of the University of Sheffield, has called for international guidelines to be set for the ethical and safe application of robots before it is too late. Professor Sharkey, writing in the prestigious Science journal, believes that as the use of robots increases, decisions about their application will be left to the military, industry and busy parents instead of international legislative bodies.

Robots have been used in laboratories and factories for many years, but their uses are changing fast. Since the turn of the century, sales of professional and personal service robots have risen sharply, with an estimated total of 5.5 million in 2008; the International Federation of Robotics expects that figure to reach 11.5 million within the next two years. The price of robot manufacture is also falling: with robots 80% cheaper in 2006 than they were in 1990, they are set to enter our lives in unprecedented numbers.

Service robots are currently being used in all walks of life, from child-minding robots to robots that care for the elderly. These types of robots can be controlled by a mobile phone or from a PC, allowing input from camera “eyes” and remote talking from caregivers. Sophisticated elder-care robots like the Secom “My Spoon” automatic feeding robot; the Sanyo electric bathtub robot that automatically washes and rinses; and the Mitsubishi Wakamaru robot, used for reminding people to take their medicine, are already in widespread use.

Despite this, no international legislation or policy guidelines currently exist, except in terms of negligence. Negligence has yet to be tested in court for robot surrogates and may be difficult to prove in the home (relative to cases of physical abuse).

Professor Sharkey urges his fellow scientists and engineers working in robotics to be mindful of the unanticipated risks and the ethical problems linked to their work. He believes that robots for care represent just one of many ethically problematic areas that will soon arise from the increase in their use, and that policy guidelines for ethical and safe application need to be set before the guidelines set themselves.

He said: “Research into service robots has demonstrated close bonding and attachment by children, who, in most cases, prefer a robot to a teddy bear. Short-term exposure can provide an enjoyable and entertaining experience that creates interest and curiosity.

“However, because of the physical safety that robot minders provide, children could be left without human contact for many hours a day or perhaps for several days, and the possible psychological impact of the varying degrees of social isolation on development is unknown.

“At the other end of the age spectrum, the relative increase in many countries in the population of the elderly relative to available younger caregivers has spurred the development of elder-care robots. These robots can help the elderly to maintain independence in their own homes, but their presence could lead to the risk of leaving the elderly in the exclusive care of machines without sufficient human contact.”


Dec 17 2008

Researchers bring mind-controlled robotic limbs a few steps closer

Even the most fertile science fiction imagination might not see a link between the behaviour of insects and the development of lifelike robotic limbs, but that is the straightforward mathematical reality of research underway in the UTS Faculty of Engineering and Information Technology.

PhD student Rami Khushaba knows it takes some explaining, but the analysis of the behaviour of social insects like ants is helping find the best way to tap into the body’s electrical signals, so that a robotic prosthetic device can be operated like a flesh and blood limb, just by thinking about it.

“I don’t think the crossover from science fiction to science reality is that far away now,” Rami said. “It has been known for some time that human muscle activity, known as the Electromyogram (EMG), carries the distinct signature of the voluntary intent of the central nervous system.

“These so-called myoelectric signals are already being used to control prosthetic devices, but there is a lot of refining to do before a robotic arm will respond instantaneously and accurately to the intention to move.

“Right now the best that can be done is a few simple tasks with rather unsatisfactory performance, due to poor signal recognition and the high computational cost that leads to extra time delays.

“Improvement in analysing the myoelectric signals will spur improvement of the hardware, and that’s where our work is directed.”

Supervised by Dr Adel Al-Jumaily and Dr Ahmed Al-Ani of the School of Electrical, Mechanical and Mechatronic Systems, Rami is developing the mathematical basis for identifying what biosignals relate to particular arm movements and where electrodes should be placed to achieve the optimum result.

“This project presents novel ‘swarm intelligence’ based algorithms to tackle many of the problems associated with the current myoelectric control strategies,” Rami said.

“The way the members of a colony of ants will interact to achieve goals like finding food is a metaphor that can be expressed in algorithms that are powerful tools for pattern recognition.

“Current methods for capturing biosignals on the forearm can involve mounting up to 16 electrodes on the skin, generating a vast quantity of data to be processed. We have already demonstrated that applying swarm logic will both simplify that set-up and achieve significantly better results.

“Applying the algorithms on 16-channel EMG datasets from six people found patterns that made it clear only three surface electrodes were actually needed.

“These few electrode positions achieved 97 per cent accuracy in capturing the crucial biosignals for movement. This significantly reduced the number of channels to be used for a real-time problem, thus reducing the computational cost and enhancing the system’s performance.

“The result was confirmed on a second dataset consisting of eight channels of EMG data collected from the right arm of thirty normally limbed subjects (twelve males and eighteen females).

“We hope one accuracy will lead to another, and that in the very near future amputees, who can still imagine moving a lost limb, will have access to a device that can truly respond to their intentions.”
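The post doesn’t include the algorithms behind those numbers, but the channel-selection idea maps naturally onto swarm-based feature selection. Below is a minimal sketch using a binary particle swarm, one hypothetical member of the ‘swarm intelligence’ family Rami describes; the synthetic data, the nearest-centroid scorer, and all parameter values are my illustrative assumptions, not the UTS implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a 16-channel EMG dataset: only the first three
# channels carry class information, echoing the finding that three
# electrodes sufficed. The real work used recorded EMG, not this.
n_channels, n_classes, n_per_class = 16, 4, 40
X = rng.normal(size=(n_classes * n_per_class, n_channels))
y = np.repeat(np.arange(n_classes), n_per_class)
X[:, :3] += y[:, None] * 2.0

def fitness(mask):
    """Nearest-centroid accuracy on the selected channels, minus a small
    penalty per channel used, so leaner subsets win ties."""
    if not mask.any():
        return 0.0
    Xs = X[:, mask]
    centroids = np.stack([Xs[y == c].mean(axis=0) for c in range(n_classes)])
    pred = np.argmin(((Xs[:, None] - centroids) ** 2).sum(-1), axis=1)
    return (pred == y).mean() - 0.01 * mask.sum()

# Binary PSO: each particle holds a per-channel inclusion probability.
n_particles, iters = 20, 50
prob = rng.uniform(0.3, 0.7, size=(n_particles, n_channels))
vel = np.zeros_like(prob)
personal = [None] * n_particles
personal_fit = np.full(n_particles, -np.inf)
best_mask, best_fit = None, -np.inf

for _ in range(iters):
    masks = rng.uniform(size=prob.shape) < prob   # sample channel subsets
    for i, m in enumerate(masks):
        f = fitness(m)
        if f > personal_fit[i]:
            personal_fit[i], personal[i] = f, m
        if f > best_fit:
            best_fit, best_mask = f, m
    for i in range(n_particles):                  # standard PSO update
        vel[i] = (0.7 * vel[i]
                  + 1.5 * rng.uniform() * (personal[i] - prob[i])
                  + 1.5 * rng.uniform() * (best_mask - prob[i]))
    prob = np.clip(prob + vel, 0.05, 0.95)

print("selected channels:", np.flatnonzero(best_mask))
```

On this toy data the swarm tends to converge on the three informative channels, which is the same shape of result the UTS team reports for real EMG: fewer channels, less computation, little lost accuracy.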


Dec 11 2008

Rolling Jollbot

A robot that jumps like a grasshopper and rolls like a ball has been developed by a Bath University PhD researcher.

The Jollbot can jump over obstacles and roll over terrain, meaning it could be used for space exploration or land survey work.

One of the major challenges that faces robots designed for space exploration is being able to move over rough terrain. Robots with legs are generally very complex, expensive to build and control, and encounter problems if they fall over. Wheels are a simpler solution to this, but are limited by the size of obstacles they can overcome.

To solve the problem, Rhodri Armour and his colleagues at Bath University’s Centre for Biomimetic and Natural Technologies have been looking to nature for inspiration – designing a robot that jumps obstacles in its path like an insect.

The Jollbot is shaped like a spherical cage and can roll in any direction, giving it the manoeuvrability of wheels without the problem of overturning or getting stuck in potholes.

The robot is also flexible and small, weighing less than a kilogram, meaning it’s not damaged when landing after jumping and is therefore less expensive than conventional exploration robots.

‘In the past, others have made robots that jump and robots that roll; but we’ve made the first robot that can do both,’ said Armour.

In nature, there are two main types of jumping: hopping, like a kangaroo, which uses its fine control and direct muscle action to propel it along; and ‘pause and leap’, like a grasshopper, which stores muscle energy in spring-like elements and rapidly releases it to make the jump.

‘We’ve made a robot that jumps in a similar way to the grasshopper, but uses electrical motors to slowly store the energy needed to leap in its springy skeleton. Before jumping, the robot squashes its spherical shape. When it is ready, it releases the stored energy all at once to jump to heights of up to half a metre,’ added Armour.
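For a rough sense of scale (my back-of-the-envelope numbers, not the Bath team’s), storing a half-metre jump in a sub-kilogram robot takes only a few joules:

```python
# Minimal energy check, ignoring air drag and spring losses:
# raising the centre of mass by h requires E = m * g * h.
m = 1.0     # kg, an upper bound; the article says Jollbot weighs less
g = 9.81    # m/s^2
h = 0.5     # m, the quoted jump height

energy = m * g * h                    # ~4.9 J stored in the springy skeleton
takeoff_speed = (2 * g * h) ** 0.5    # ~3.1 m/s at the moment of release
print(f"{energy:.1f} J, {takeoff_speed:.1f} m/s")
```

A few joules is well within reach of a small electric motor winding up a spring over several seconds, which is exactly the slow-store, fast-release trade the grasshopper strategy exploits.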

Future prototypes could include a stretchy skin covered in solar cells on the outside of the robot, so it could power itself, and robotic control sensors to enable it to sense its environment.

The components of the robot were made by rapid prototyping technology, similar to that used by the RepRap machine pioneered by the university, which builds parts by ‘printing’ layers of plastic on top of each other to produce a 3D object.


Dec 7 2008

Mamoru-kun, a robot helper

Mamoru-kun, an immobile and diminutive robot from the University of Tokyo, could tell you where you last left those keys you keep losing. Developed in cooperation with Fujitsu and Toyota, Mamoru-kun measures a mere 40cm tall and weighs 3.8kg. It can find an object its owner is looking for within specified areas and alert the owner either verbally, through a speaker, or by pointing at the object with its jointed arm (it has another two joints at the neck). The technology is still at an early stage, though: Mamoru-kun’s lost-item-finding powers are limited to pre-programmed items in a specified area.

For the system to work, you’d have to input the items that often get misplaced so they can be identified later. The whole area you want Mamoru-kun to monitor would also have to be wired with cameras connected to the robot. While these requirements are severely limiting, Mamoru-kun’s developers believe it could provide great assistance, “especially beneficial for elderly and sick persons.” Commercialization of the technology isn’t expected until 2018.


Dec 4 2008

Playing Pong with HOAP-3 robot

Sylvain Calinon is a postdoctoral fellow working on humanoid robots and imitation learning in the Learning Algorithms and Systems Laboratory at the École Polytechnique Fédérale de Lausanne in Switzerland.

This video is a demonstration of a Human-Robot Interaction application where the robot takes the role of an opponent by playing Pong against a user on an old-school arcade game. The robot is connected to the game and thus knows the current state of the game. It then uses a prediction mechanism to handle the latency due to the game and to the dynamics of the robot’s hand motion (the robot controls the game only through the joystick and buttons on the arcade game).
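The post doesn’t spell out the prediction mechanism, but a minimal version of the problem is easy to sketch: extrapolate where the ball will cross the paddle line, folding the trajectory at the top and bottom walls, far enough ahead to cover the combined game and arm latency. The function below (constant ball velocity, elastic bounces, made-up coordinates) is an illustrative assumption, not Calinon’s code:

```python
def predict_ball_y(x, y, vx, vy, paddle_x, height):
    """Where will the ball cross paddle_x, assuming straight-line motion
    with elastic bounces off the walls at y=0 and y=height?"""
    if vx == 0:
        return y                     # degenerate case: no horizontal motion
    t = (paddle_x - x) / vx          # time until the ball reaches the paddle
    if t < 0:
        return y                     # ball moving away: hold position
    unfolded = y + vy * t            # position ignoring the walls
    m = unfolded % (2 * height)      # fold back into one bounce period
    return m if m <= height else 2 * height - m

# Ball at (10, 20) moving right and down on a 100-unit-tall screen;
# it bounces off the floor and ceiling before reaching the paddle at x=90.
print(predict_ball_y(10, 20, 5.0, -8.0, 90, 100))   # -> 92.0
```

To compensate for latency, the robot would aim the paddle at the crossing point predicted for the ball’s state a short time in the future rather than at its current position.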

By knowing the position of the ball on the screen, it mimics the visual tracking of this ball by turning the head appropriately through inverse kinematics. In a preliminary phase, the robot plays the game alone to refine its skill through self-calibration.


Dec 3 2008

Gaze-controlled robot demonstrated on video

A team of researchers from the Technical University of Denmark and the IT University of Copenhagen demonstrated their gaze-navigated robot at a recent Microsoft Robotics event.

The simplicity begins with the LEGO Mindstorms NXT kit that the bot is based around, which gets paired with a webcam and a laptop that’s connected to the bot via Bluetooth, and to a desktop PC via WiFi. The PC comes into the picture with an eye-tracking system that lets folks control the robot as they watch the live feed from its webcam, which, as you can see in the video after the break, appears to work remarkably well. The researchers apparently aren’t content with things just yet, however: they’re already looking to use the system to control a wheelchair, and to add some head-tracking to the mix for good measure.
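The post doesn’t say how gaze points become drive commands, but a minimal mapping for a differential-drive LEGO bot is easy to sketch. Everything below (the normalized gaze coordinates, the dead zone, the wheel mixing) is an illustrative assumption, not the Danish team’s code:

```python
def gaze_to_motors(gx, gy, deadzone=0.1):
    """Map a gaze point (gx, gy) in [-1, 1]^2, centred on the video feed,
    to left/right wheel powers for a differential-drive robot."""
    if abs(gx) < deadzone and abs(gy) < deadzone:
        return 0.0, 0.0              # looking at the centre: stop
    forward = gy                     # look up to drive forward
    turn = gx                        # look right to turn right
    left = max(-1.0, min(1.0, forward + turn))
    right = max(-1.0, min(1.0, forward - turn))
    return left, right

# Looking toward the upper right of the live feed: veer right at speed.
print(gaze_to_motors(0.5, 0.75))     # -> (1.0, 0.25)
```

The appeal of this arrangement is that the same mapping works for anything with two drive signals, which is presumably why a wheelchair is the group’s next target.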