Expanding From Discrete To Continuous Estimation Of Distribution Algorithms: The IDEA

Peter A.N. Bosman and Dirk Thierens

Department of Computer Science, Utrecht University, P.O. Box 80.089, 3508 TB Utrecht, The Netherlands
{peterb, Dirk.Thierens}@cs.uu.nl

Abstract. The direct application of statistics to stochastic optimization based on iterated density estimation has become increasingly prominent in evolutionary computation over the last few years. The estimation of densities over selected samples and the sampling from the resulting distributions is a combination of the recombination and mutation steps used in evolutionary algorithms. We introduce the framework named IDEA to formalize this notion. By combining continuous probability theory with techniques from existing algorithms, this framework allows us to define new continuous evolutionary optimization algorithms.

1 Introduction

Algorithms in evolutionary optimization guide their search through statistics based on a vector of samples, often called a population. By using this stochastic information, non-deterministic induction is performed in an attempt to exploit the structure of the search space and thereby aid the search for the optimal solution. To perform induction, these samples are combined so as to generate new solutions that will hopefully be closer to the optimum. As this process is iterated, convergence is intended to lead the algorithm to a final solution.

In the genetic algorithm [11, 14] and many variants thereof, values for problem variables are often exchanged and subsequently individually adapted. Another way of combining the samples is to regard them as being representative of some probability distribution. Estimating this probability distribution and sampling more solutions from it is a global, statistical type of inductive iterated search. Such algorithms have been proposed for discrete spaces [2-4, 12, 13, 15, 17, 19, 21], as well as, in a limited way, for continuous spaces [5, 10, 15, 22, 23]. An overview of this field has been given by Pelikan, Goldberg and Lobo [20].

Our goal in this paper is to apply the search for good probability density models to continuous spaces. To this end, we formalize the notion of building and using probabilistic models in a new framework named IDEA. We show how we can adjust existing techniques to be used in the continuous case. We thereby define new evolutionary optimization algorithms. Using a set of test functions, we validate their performance.

The remainder of this paper is organized as follows. In section 2, we present the IDEA framework. In section 3, we describe a few existing algorithms that build and use probabilistic models. In section 4, we state some derivations of probability density functions (pdfs). We use the described algorithms and pdfs within the IDEA in our experiments in section 5. Topics for further research are discussed in section 6 and our final conclusions are drawn in section 7.

2 The IDEA

We write $a = (a_0, a_1, \ldots, a_{|a|-1})$ if the ordering of the elements in a vector is relevant. We assume that $l$ random variables are available, meaning that each sample point is an $l$-dimensional vector. We introduce the notation $a\langle c\rangle = (a_{c_0}, a_{c_1}, \ldots, a_{c_{|c|-1}})$. Let $\mathcal{L} = (0, 1, \ldots, l-1)$ be a vector of $l$ numbers and let $Z = (Z_0, Z_1, \ldots, Z_{l-1})$ be a vector of $l$ random variables. Without prior information on the cost function $C(z\langle\mathcal{L}\rangle)$ to be minimized, we might as well assume a uniform distribution over $Z\langle\mathcal{L}\rangle$. Now denote by $P^\theta(Z\langle\mathcal{L}\rangle)$ a probability distribution that is uniformly distributed over all $z\langle\mathcal{L}\rangle$ with $C(z\langle\mathcal{L}\rangle) \leq \theta$ and that has a probability of 0 otherwise. In the discrete case we have:

$$P^\theta(Z\langle\mathcal{L}\rangle)(z\langle\mathcal{L}\rangle) = \begin{cases} \dfrac{1}{\left|\{z^*\langle\mathcal{L}\rangle \mid C(z^*\langle\mathcal{L}\rangle) \leq \theta\}\right|} & \text{if } C(z\langle\mathcal{L}\rangle) \leq \theta \\ 0 & \text{otherwise} \end{cases}$$
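To make the discrete case concrete, the following minimal Python sketch (our own illustration; the helper name p_theta and the toy cost function are not from the paper) enumerates a small discrete space and assigns each point its probability under $P^\theta(Z\langle\mathcal{L}\rangle)$: equal mass on every point whose cost is at most $\theta$, and zero mass elsewhere.

```python
from itertools import product

def p_theta(cost, space, theta):
    """Discrete P^theta: uniform over all points z with cost(z) <= theta,
    probability 0 for every other point."""
    feasible = [z for z in space if cost(z) <= theta]
    mass = 1.0 / len(feasible) if feasible else 0.0
    return {z: (mass if cost(z) <= theta else 0.0) for z in space}

# Toy example: l = 3 binary variables, cost = number of ones (to be minimized).
space = list(product((0, 1), repeat=3))
dist = p_theta(cost=sum, space=space, theta=1)

# Four of the eight points have cost <= 1, so each receives probability 1/4.
for z, p in dist.items():
    print(z, p)
```

Raising $\theta$ spreads the mass over more points; once $\theta$ exceeds the maximum cost, $P^\theta$ reduces to the uniform distribution over the whole space, matching the no-information case above.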
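More generally, the iterated density-estimation idea sketched in the introduction (select the better samples, estimate a probability distribution over them, and sample new solutions from that estimate) can be written as a short loop. The skeleton below is our own illustrative sketch rather than the paper's formal IDEA definition: the univariate Gaussian model, the function name idea_sketch, and all parameter values are placeholder assumptions.

```python
import random
import statistics

def idea_sketch(cost, l, pop_size=100, tau=0.3, generations=50):
    """Illustrative iterated density-estimation loop: each generation fits a
    density to the best fraction tau of the population and samples the next
    population from it (here one independent Gaussian per variable)."""
    pop = [[random.uniform(-5.0, 5.0) for _ in range(l)] for _ in range(pop_size)]
    best = min(pop, key=cost)
    for _ in range(generations):
        # Truncation selection: keep the lowest-cost fraction tau of the samples.
        pop.sort(key=cost)
        best = min(best, pop[0], key=cost)
        selected = pop[: max(2, int(tau * pop_size))]
        # Density estimation: fit an independent normal to each variable
        # (a placeholder model; richer continuous density models can be
        # plugged in at this step).
        params = [(statistics.mean(col), statistics.stdev(col))
                  for col in zip(*selected)]
        # Sampling: draw a fresh population from the estimated distribution.
        pop = [[random.gauss(m, s) for (m, s) in params] for _ in range(pop_size)]
    return best

# Example: minimize the 5-dimensional sphere function.
print(idea_sketch(cost=lambda z: sum(x * x for x in z), l=5))
```

The density-estimation step is deliberately isolated: swapping the per-variable Gaussians for a different continuous model changes the algorithm without touching the selection and sampling structure, which is the kind of substitution the framework is meant to make explicit.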