Scale-up of Microbial Fuel Cells for Wastewater Treatment

Microbial Fuel Cells (MFCs) harness electrons from the metabolic reactions of microorganisms to generate an electrical current. The microbes utilise organic material in the medium, and successful power generation has been achieved using a range of sources including acetate, dead flies and domestic wastewater. Recent research using the latter has revealed the potential for application in the water industry, where wastewater can be treated at the same time as power is generated. This research venture aims to develop a continuous-flow MFC system capable of long-term use in a water-treatment plant.

There is already a solid platform of MFC research in place at BRL, and this project will build on that technology with the specific aim of wastewater treatment. This development will look at optimising the architectural design, the operational parameters and the selection of efficient, robust microbial consortia capable of metabolising the target compounds present in effluent samples. Wastewater is heterogeneous and will vary in organic content from location to location. In addition, the organisms’ environment will differ depending on where the MFC system operates, so it may transpire that microbial consortia need to be ‘customised’ to reflect the conditions they work in and the organic components they encounter.

This Great Western Research (GWR) project is a collaboration between UWE, Bristol University and Wessex Water.


Though humanoid robots are being produced in increasing quantity and variety, they are still often controlled by the same methods as their industrial counterparts, with large, unpredictable movements of redundant joints.

This project aims to generate more human-like motion from anthropomorphic (humanoid) robots by imposing an anatomically inspired control scheme based on human muscle location, grouping and response.

Running such a model alongside the robot’s primary motion controller constrains the robot’s motion for maximum efficiency. This produces movement trajectories that appear lifelike while still permitting the primary controller to execute useful tasks.

The control scheme is implemented as a novel nonlinear controller based on a dynamic model of the robot system. The result is a general approach: a practical application achieved through a controller grounded in rigorous mathematics.
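As one concrete, much-simplified illustration of this style of model-based nonlinear control, the sketch below applies a computed-torque law to a single-joint arm: the model cancels the plant’s nonlinear dynamics (gravity and damping) and imposes linear tracking-error dynamics. The parameters and gains here are illustrative assumptions, not those of the actual BERUL/BERT controllers.

```python
import numpy as np

# Hypothetical 1-DOF arm parameters (illustrative values only)
m, l, b, g0 = 1.0, 0.5, 0.1, 9.81
I = m * l**2  # inertia about the joint

def dynamics(q, qd, tau):
    # Plant: I*qdd + b*qd + m*g0*l*sin(q) = tau
    return (tau - b * qd - m * g0 * l * np.sin(q)) / I

def computed_torque(q, qd, q_des, Kp=100.0, Kd=20.0):
    # Cancel gravity and damping, then impose linear error dynamics
    # e'' + Kd*e' + Kp*e = 0 on the tracking error e = q_des - q
    v = Kp * (q_des - q) + Kd * (0.0 - qd)   # desired acceleration (setpoint)
    return I * v + b * qd + m * g0 * l * np.sin(q)

# Regulate to q_des = 1 rad with simple Euler integration
q, qd, dt = 0.0, 0.0, 1e-3
for _ in range(5000):
    tau = computed_torque(q, qd, 1.0)
    qd += dynamics(q, qd, tau) * dt
    q += qd * dt
print(round(q, 3))  # → 1.0
```

Because the gains satisfy Kd² = 4·Kp, the error dynamics are critically damped and the joint settles at the setpoint without overshoot.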

Designing a robot to move in a human-like way is intended to improve human-robot interaction for the next generation of robotic applications where robots are envisaged to work in close proximity to humans and elements of trust and predictability are linked to efficient working and safety.

This project uses the BERUL (Bristol Elumotion Robotic Upper Limb) and BERT (Bristol Elumotion Robotic Torso) robot systems that were designed and built by Elumotion Ltd.

Figure 2: Robots used for this study, BERUL and BERT


The principal investigators on this project are Adam Spiers, Guido Herrmann and Chris Melhuish.

This project is part of the research carried out by the Nonlinear Robotics Control Group.

Autonomous vehicle utility has reached a plateau due to mobility constraints on the current generation of robots. Of particular note is the inability of existing systems to manoeuvre in more than one substrate. Although no mature examples of morphing (flying/crawling) robots exist today, many animals possess the ability to locomote in more than one medium. For example, the preferred mode of locomotion of birds, bats and some insects is flight, but many are also quite efficient on the ground. Robotic performance typically occupies a similar design space to small animals; multiple locomotion modes could represent a generational leap in their utility. It is the aim of this research to facilitate such capabilities and provide a foundation for a new generation of mobile robots. Potential applications include surveillance, reconnaissance, exploration, search/rescue and remote inspection. Our first-generation robot (pictured), fabricated in pilot research in the USA (in collaboration with Case Western Reserve University, the University of Florida and the Naval Postgraduate School), is the only existing man-portable robot that can fly, land and crawl in a continuous sequence of actions. Our prototype can be launched manually, can walk itself off a rooftop to take flight, and can also be deployed from larger (air or ground) vehicles. Versions of the robot, all exploiting insect-inspired aerial and/or terrestrial locomotion mechanisms, have flown autonomously, incorporated wing-folding mechanisms, and taken off successfully from the ground. Video of the robot during (US) military field testing is available at: http://faculty.nps.edu/ravi/BioRobotics/Projects.htm

While vision supplies information about distant objects, touch is invaluable in sensing the nearby environment. However, in designing intelligent, life-like machines, such as robots, the touch modality has been largely overlooked. Current systems make only limited use of tactile sensors for simple tasks such as detecting physical contact. Biology, by contrast, reveals an abundant use of tactile sensing in the animal kingdom. Indeed, in nocturnal creatures, or those that inhabit poorly-lit places, touch is widely preferred to vision as a primary means of discovering the world. The tactile senses of many mammals are built around arrays of facial hairs known as “whiskers” or “vibrissae”. In this project we will develop new technologies inspired by the whisker morphology and neural processing systems of two such tactile specialists: the Norwegian rat and the Etruscan shrew. These animals sweep their whiskers back and forth at high speeds in a controlled and feedback-sensitive manner. This “active touch” capacity allows them to: (i) maximise their intake of useful information; (ii) solve perceptual tasks such as determining the position, shape and surface texture of encountered objects; (iii) encode tactile memories that allow recognition of familiar items; and (iv) track and capture prey animals using touch signals alone. Using our understanding of these natural vibrissal systems, we will develop two biomimetic artefacts endowed with similar sensing capabilities: a novel active tactile sensing array, termed a BIOTACT sensor, which will be mounted on a robot manipulator arm and incorporate a large number of whisker-like sensing elements; and an autonomous whiskered robot that can seek out, identify and track fast-moving target objects.

Overall, our project will bring about a step-change in the understanding of active touch sensing and in the use of whisker-like sensors in intelligent machines.

BRL is a partner in the BIOTACT consortium and is responsible for the majority of the robotics aspects of the project.

The main project web site is at http://www.biotact.org/.

See also Whiskered Robots: from Whiskerbot towards SCRATCHbot.

The BRL robot manipulator arm was designed and built by Elumotion Ltd (http://www.elumotion.com) in partnership with BRL.


CHRIS will address the fundamental issues that would enable safe Human-Robot Interaction (HRI). Specifically, this project addresses the problem of a human and a robot performing co-operative tasks in a co-located space, such as a kitchen where your service robot stirs the soup as you add the cream. These issues include communication of a shared goal (verbally and through gesture), perception and understanding of intention (from dextrous and gross movements), the cognition necessary for interaction, and active and passive compliance. These are the prerequisites for many applications in service robotics, and the research will provide the scientific foundations for engineering cognitive systems. The project is based on the essential premise that it will ultimately be beneficial to our socioeconomic welfare to create service robots capable of safe co-operative physical interaction with humans. The key hypothesis is that safe interaction between human and robot can be engineered, physically and cognitively, for joint physical tasks requiring co-operative manipulation of real-world objects. A diverse set of disciplines has been brought together to realise an interdisciplinary solution. The starting point for understanding co-operative cognition will be the basic building blocks of initial interactions: those of young children. Engineering principles of safe movement and dexterity will be explored on the three available robot platforms, and developed alongside principles of language, communication and decisional action planning in which the robot reasons explicitly with its human partner. Integration of cognition for safe co-operation in the same physical space will provide a significant advance in the area, and a step towards service robots in society.


  • BRL- Bristol Robotics Laboratory, UK
  • CNRS – Centre National de la Recherche Scientifique, France
  • IIT – Fondazione Istituto Italiano di Tecnologia, Italy
  • MPG – Max Planck Gesellschaft zur Förderung der Wissenschaften e.V., Germany

Equipment at BRL

Robotic Torso – BERT2 (Bristol Elumotion Robotic Torso 2):

We are currently developing an advanced robot torso featuring an expressive digital head, torque sensors, artificial ‘skin’ and agile limbs. At the same time, we are investigating novel adaptive control algorithms for safe human-robot interaction. The BERT torso was developed and built by Elumotion Ltd (www.elumotion.com) in partnership with BRL.

Human Eye and Gaze Tracking:

An eye-tracking system is employed to allow the robot to estimate the gaze of its human partner and infer aspects of their ‘state of mind’, such as intention and focus. This will allow for safer co-operation, as well as imitation, between the human and the robot.

3D Motion Tracking:

We employ a Vicon 3D tracking system to continuously monitor the position, body movement and gestures of the human whilst interacting with the robot. It is known that human infants undergo a long period of postnatal development that corresponds to the acquisition, by trial-and-error exploration (sometimes called motor babbling), of internal models of the body and of external objects. We are able to observe the robot’s joint positions in 3D in real time, and will apply similar learning systems to enable the robot to investigate its own ‘body’ and its interaction with the environment.
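The motor-babbling idea can be sketched in a few lines: issue random motor commands, observe the resulting body configuration (on the real robot the Vicon system would supply this observation), and build an inverse model from the recorded pairs. The example below uses a planar two-link arm with hypothetical link lengths and a simple nearest-neighbour lookup as a stand-in for whatever learning system the project actually employs.

```python
import numpy as np

rng = np.random.default_rng(0)
L1, L2 = 0.3, 0.25  # hypothetical link lengths, not BERT's geometry

def forward(q):
    # Planar 2-link forward kinematics: the "body" being explored
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

# 1. Motor babbling: random joint commands, observed hand positions
commands = rng.uniform(0.0, np.pi / 2, size=(2000, 2))
outcomes = np.array([forward(q) for q in commands])

# 2. Inverse model: nearest-neighbour lookup over the babbled data
def inverse(target):
    i = np.argmin(np.linalg.norm(outcomes - target, axis=1))
    return commands[i]

# 3. Use the learned model: reach for a point and measure the error
target = forward(np.array([0.6, 0.9]))
q_hat = inverse(target)
err = np.linalg.norm(forward(q_hat) - target)
print(err < 0.05)  # dense babbling makes the lookup accurate
```

With 2000 babbled samples the nearest recorded command lands the hand within a few centimetres of the target; denser exploration, or interpolating between neighbours, tightens this further.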

At the Bristol Robotics Laboratory in England, researchers are designing their newest bug-eating robot—Ecobot III.

The device is the latest in a series of small robots to emerge from the lab that are powered by a diet of insects and other biomass.

“We all talk about robots being able to do stuff on their own, being intelligent and autonomous,” said lab director Chris Melhuish.

“But the truth of the fact is that without energy, they can’t do anything at all.”

Most robots today draw that energy from electrical cords, solar panels, or batteries. But in the future some robots will need to operate beyond the reach of power grids, sunlight, or helping human hands.

Melhuish and his colleagues think such release-and-forget robots can satisfy their energy needs the same way wild animals do—by foraging for food.

“Animals are the proof that this is possible,” he said.

Bug-munching Bots

Over the last decade, Melhuish’s team has produced a string of bots powered by sugar, rotten apples, or dead flies.

The biomass is converted into electricity through a series of stomachlike microbial fuel cells, or MFCs.

Living batteries, MFCs generate power from colonies of bacteria that release electrons as the microorganisms digest plant and animal matter. (Electricity is simply a flow of electrons.)

The lab’s first device, named Slugbot, was an artificial predator that hunted for common garden slugs.

While Slugbot never digested its prey, it laid the groundwork for future bots powered by biomass.

In 2004 researchers unveiled Ecobot II. About the size of a dessert plate, the device could operate for 12 days on a diet of eight flies.

“The flies [were] given as a whole insect to each of the fuel cells on board the robot,” said Ioannis Ieropoulos, who co-developed Ecobot II as part of his Ph.D. research.

With its capacitors charged, the bot could roll 3 to 6 inches (8 to 16 centimeters) an hour, moving toward light while recording temperature. It sent data via a radio transmitter.
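The stop-start pace follows directly from the power budget: an MFC stack delivers milliwatts continuously, so the robot banks energy in capacitors and spends it in short bursts of motion. A back-of-the-envelope sketch, with all figures hypothetical rather than Ecobot II’s measured values, and charging losses ignored:

```python
# Illustrative pulsed-power budget for an MFC-driven robot.
# All numbers are assumptions, not Ecobot II's actual specifications.

P_mfc = 0.002            # continuous MFC stack output, watts (order of mW)
C = 1.0                  # capacitor bank, farads
V_on, V_off = 3.0, 2.0   # fire the motors at V_on, stop at V_off

# Energy delivered per pulse = change in stored capacitor energy
E_pulse = 0.5 * C * (V_on**2 - V_off**2)   # joules
t_charge = E_pulse / P_mfc                 # seconds of charging per burst

print(round(E_pulse, 2))   # → 2.5  (joules per burst of motion)
print(round(t_charge))     # → 1250 (roughly 20 minutes between bursts)
```

Milliwatt generation against joules per movement is why a biomass-powered robot creeps along at inches per hour.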

While hardly a speedster, Ecobot II was the first robot powered by biomass that could sense its world, process it, act in it, and communicate, Melhuish says.

The scientist sees analogs in the autonomously powered robots of the future.

“If you really do want robots that are going to … monitor fences, [oceans], pollution levels, or carbon dioxide—all of those things—what you need are very, very cheap little robots,” he said.

“Now our robots are quite big. But in 20 to 30 years time, they could be quite minuscule.”


More Power

Whether microbial fuel-cell technology can advance enough to power those robots, however, is unclear.

Stuart Wilkinson, a mechanical engineer at the University of South Florida in Tampa, developed the world’s first biomass-powered robot, a toy-train-like bot nicknamed Chew-Chew that ran on sugar cubes.

He says the major drawback of MFCs is that it takes a big fuel cell to produce a small amount of power.

Most AA batteries, for example, produce far more power than a single MFC.

“MFCs are capable of running low-power electronics, but are not well suited to power-hungry motors needed for machine motion,” Wilkinson said in an email interview.

He added that scientists “need to develop MFC technology further before it can be of much practical use for robot propulsion.”

Ieropoulos, Ecobot II’s co-developer, agrees that MFCs need a power boost.

He and his colleagues are exploring ways to improve the materials used in MFCs and to maintain resident microbes at their peak.

Bot Weaning

To date, the Bristol team has hand-fed its bots.

But if the researchers are going to realize their vision of autonomously powered robots, then the machines will need to start gathering their own food.

When Ecobot II debuted in 2004, Melhuish suggested one way that it might lure and capture its fly food-source: a combination fly-trap/suction pump baited with pheromones.

Whether the accessory will appear in Ecobot III is anyone’s guess. The BRL team remains tight-lipped about their current project, preferring to finish their work in secret before discussing it publicly.

Melhuish will say this, however: “What we’ve got to do is develop a better digestion system … . There are many, many problems that have to be overcome, and waste removal is one of them.”

Will the future bring robot restrooms? Watch this space.

by Laura June — Feb 23rd 2009 at 10:29AM

BERTI (built by a partnership of the Bristol Robotics Laboratory and Elumotion Ltd.) is a fully automated robotic torso designed to perform “credible conversational gestures.” The robot is capable of quite complex hand movements, and, in the demonstration video above, plays a game of Rock, Paper, Scissors with a fine gentleman wearing a Goldfinger t-shirt, becoming another addition to the long line of gaming bots. Hit the read link to find out more about BERTI and the project. [Via Robots.net]