ABSTRACT
We integrate a Denoising Autoencoder (DAE) into an Estimation of Distribution Algorithm (EDA) and evaluate the performance of the resulting DAE-EDA on several combinatorial optimization problems. We assess both the number of fitness evaluations and the required CPU times. Compared to the state-of-the-art Bayesian Optimization Algorithm (BOA), DAE-EDA needs more fitness evaluations but is considerably faster, sometimes by orders of magnitude. These results show that DAEs can be useful tools for problems with low but non-negligible fitness evaluation costs.
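To make the abstract's idea concrete, below is a minimal sketch of a DAE-EDA loop: in each generation, the better half of the population is selected, a small denoising autoencoder is trained on the selected individuals, and new candidates are sampled by corrupting selected individuals and passing them through the trained model. The one-hidden-layer sigmoid DAE, the truncation selection, the OneMax test function, and all hyperparameters here are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def onemax(pop):
    """Toy fitness: number of ones in each bitstring (row)."""
    return pop.sum(axis=1)

class DAE:
    """One-hidden-layer denoising autoencoder with sigmoid units."""
    def __init__(self, n_vis, n_hid, lr=0.1):
        self.W1 = rng.normal(0, 0.1, (n_vis, n_hid))
        self.b1 = np.zeros(n_hid)
        self.W2 = rng.normal(0, 0.1, (n_hid, n_vis))
        self.b2 = np.zeros(n_vis)
        self.lr = lr

    @staticmethod
    def _sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def _forward(self, x):
        h = self._sigmoid(x @ self.W1 + self.b1)
        y = self._sigmoid(h @ self.W2 + self.b2)
        return h, y

    def _corrupt(self, X, p):
        # flip each bit to a random value with probability p
        mask = rng.random(X.shape) < p
        return np.where(mask, rng.integers(0, 2, X.shape), X)

    def train(self, X, epochs=30, corrupt=0.1):
        for _ in range(epochs):
            Xc = self._corrupt(X, corrupt)
            h, y = self._forward(Xc)
            # cross-entropy loss: gradient at the output pre-activation is (y - X)
            d_y = (y - X) / len(X)
            d_h = (d_y @ self.W2.T) * h * (1.0 - h)
            self.W2 -= self.lr * h.T @ d_y
            self.b2 -= self.lr * d_y.sum(axis=0)
            self.W1 -= self.lr * Xc.T @ d_h
            self.b1 -= self.lr * d_h.sum(axis=0)

    def sample(self, seeds, corrupt=0.1):
        # corrupt good solutions, denoise, then sample bits from output probs
        _, y = self._forward(self._corrupt(seeds, corrupt))
        return (rng.random(seeds.shape) < y).astype(int)

def dae_eda(fitness, n_bits=30, pop_size=100, gens=40):
    pop = rng.integers(0, 2, (pop_size, n_bits))
    for _ in range(gens):
        f = fitness(pop)
        sel = pop[np.argsort(f)[-pop_size // 2:]]   # truncation selection
        model = DAE(n_bits, n_bits)
        model.train(sel)
        seeds = sel[rng.integers(0, len(sel), pop_size)]
        pop = model.sample(seeds)
    f = fitness(pop)
    return pop[np.argmax(f)], f.max()

best, best_f = dae_eda(onemax)
print(best_f)
```

Note the design choice that distinguishes this family of EDAs: sampling is done by denoising corrupted copies of good solutions rather than by drawing from an explicit factorized or Bayesian-network model, which is what makes the model-building step cheap compared to BOA.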
- Y. Bengio, L. Yao, G. Alain, and P. Vincent. Generalized Denoising Auto-Encoders as Generative Models. In Advances in Neural Information Processing Systems 26 (NIPS'13). NIPS Foundation, 2013.
- K. Deb and D. E. Goldberg. Analyzing Deception in Trap Functions. University of Illinois, Department of General Engineering, 1991.
- S. A. Kauffman and E. D. Weinberger. The NK Model of Rugged Fitness Landscapes and its Application to Maturation of the Immune Response. Journal of Theoretical Biology, 141(2):211--245, 1989.
- A. Krizhevsky, I. Sutskever, and G. E. Hinton. ImageNet Classification with Deep Convolutional Neural Networks. In Advances in Neural Information Processing Systems, pages 1097--1105, 2012.
- P. Larrañaga and J. A. Lozano. Estimation of Distribution Algorithms: A New Tool for Evolutionary Computation. Genetic Algorithms and Evolutionary Computation, 2. Kluwer Academic Publishers, 2002.
- B. L. Miller and D. E. Goldberg. Genetic Algorithms, Tournament Selection, and the Effects of Noise. Complex Systems, 9:193--212, 1995.
- J. Očenášek and J. Schwarz. The Parallel Bayesian Optimization Algorithm. In The State of the Art in Computational Intelligence, pages 61--67. Springer, 2000.
- M. Pelikan. Bayesian Optimization Algorithm. In Hierarchical Bayesian Optimization Algorithm, volume 170 of Studies in Fuzziness and Soft Computing, pages 31--48. Springer Berlin/Heidelberg, 2005.
- M. Probst. Denoising Autoencoders for Fast Combinatorial Black Box Optimization. Preprint, arXiv:1503.01954, 2015.
- M. Probst, F. Rothlauf, and J. Grahl. An Implicitly Parallel EDA Based on Restricted Boltzmann Machines. In Proceedings of the 2014 Conference on Genetic and Evolutionary Computation, GECCO '14, pages 1055--1062, New York, NY, USA, 2014. ACM.
- M. Probst, F. Rothlauf, and J. Grahl. Scalability of Using Restricted Boltzmann Machines for Combinatorial Optimization. Preprint, arXiv:1411.7542, 2014.
- P. Vincent, H. Larochelle, Y. Bengio, and P.-A. Manzagol. Extracting and Composing Robust Features with Denoising Autoencoders. In Proceedings of the 25th International Conference on Machine Learning, pages 1096--1103. ACM, 2008.
- R. A. Watson, G. S. Hornby, and J. B. Pollack. Modeling Building-Block Interdependency. In Parallel Problem Solving from Nature - PPSN V, pages 97--106. Springer, 1998.