ABSTRACT
Evolutionary algorithms can adapt the behavior of individual agents to maximize the fitness of populations of agents. We use a genetic algorithm (GA) to optimize behavior in a team of simulated robots that mimic foraging ants. We introduce positional and resource detection error models into this simulation, emulating the sensor error characterized by our physical iAnt robot platform. Increased positional error and detection error both decrease resource collection rates. However, they have different effects on GA behavior. Positional error causes the GA to reduce time spent searching for local resources and to reduce the likelihood of returning to locations where resources were previously found. Detection error causes the GA to select for more thorough local searching and a higher likelihood of communicating the location of found resources to other agents via pheromones. Agents that live in a world with error and use parameters evolved specifically for those worlds perform significantly better than agents in the same error-prone world using parameters evolved for an error-free world. This work demonstrates the utility of employing evolutionary methods to adapt robot behaviors that are robust to sensor errors.
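The mechanism the abstract describes can be illustrated with a toy sketch: a genetic algorithm tunes two foraging parameters (local search time and pheromone-laying probability) for agents whose fitness is evaluated under positional and detection noise. Everything here is an illustrative assumption, not the authors' actual iAnt simulation: the parameter names, the error constants, and the fitness model are all hypothetical simplifications.

```python
import random

random.seed(42)

# Assumed error levels, standing in for the paper's calibrated sensor models.
POS_ERROR = 0.2     # std. dev. of positional drift when returning to a site
DETECT_ERROR = 0.3  # probability that a resource in range goes undetected

def fitness(genome, trials=200):
    """Toy fitness: resources collected minus a search-time cost."""
    search_time, pheromone_prob = genome
    collected = 0
    for _ in range(trials):
        # Positional error: the agent's return point drifts from the true site.
        on_site = abs(random.gauss(0.0, POS_ERROR)) < 0.25
        # Detection error: each unit of search time is a noisy detection try.
        found = False
        for _ in range(int(search_time)):
            if on_site and random.random() > DETECT_ERROR:
                found = True
                break
        if found:
            collected += 1
            # Pheromone recruitment gives a chance of a second pickup.
            if random.random() < pheromone_prob:
                collected += 1
    # Opportunity cost of long local searches.
    return collected - 0.1 * search_time

def evolve(pop_size=30, generations=40):
    """Elitist GA with Gaussian mutation over the two foraging parameters."""
    pop = [(random.uniform(1, 10), random.random()) for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=fitness, reverse=True)[: pop_size // 3]
        pop = elite + [
            (min(12.0, max(1.0, p[0] + random.gauss(0, 0.5))),
             min(1.0, max(0.0, p[1] + random.gauss(0, 0.05))))
            for p in random.choices(elite, k=pop_size - len(elite))
        ]
    return max(pop, key=fitness)

best = evolve()
print(f"evolved search_time={best[0]:.2f}, pheromone_prob={best[1]:.2f}")
```

In this simplified setup, raising `DETECT_ERROR` tends to favor longer `search_time` and higher `pheromone_prob`, while raising `POS_ERROR` devalues returning to remembered sites, qualitatively mirroring the trends the abstract reports.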
Index Terms
- An evolutionary approach for robust adaptation of robot behavior to sensor error