Genetic Algorithms Digest      Monday, January 27 1992      Volume 6 : Issue 2

 - Send submissions to GA-List@AIC.NRL.NAVY.MIL
 - Send administrative requests to GA-List-Request@AIC.NRL.NAVY.MIL

Today's Topics:
 - Re: Ordering information for GA proceedings and books
 - Re: Mutation, bitclimbing and test suites
 - Re: "Evolutionary Programming"
 - GAs and very fast simulated re-annealing: A comparison
 - Looking for information on Goldberg paper

**********************************************************************

CALENDAR OF GA-RELATED ACTIVITIES: (with GA-List issue reference)

  Canadian AI Conference, Vancouver, (CFP 1/7)                  May 11-15, 1992
  COGANN, Combinations of GAs and NNs, @ IJCNN-92 (v5n31)       Jun 6, 1992
  ARTIFICIAL LIFE III, Santa Fe, NM                             Jun 15-19, 1992
  10th National Conference on AI, San Jose, (CFP 1/15)          Jul 12-17, 1992
  FOGA-92, Foundations of Genetic Algorithms, Colorado (v5n32)  Jul 26-29, 1992
  COG SCI 92, Cognitive Science Conference, Indiana, (v5n39)    Jul 29-Aug 1, 1992
  ECAI 92, 10th European Conference on AI (v5n13)               Aug 3-7, 1992
  Parallel Problem Solving from Nature, Brussels, (v5n29)       Sep 28-30, 1992

(Send announcements of other activities to GA-List@aic.nrl.navy.mil)

**********************************************************************

----------------------------------------------------------------------

From: Alan C. Schultz (GA-List Moderator)
Date: Wed, 15 Jan 92 14:32:59 +0200
Subject: Re: Ordering information for GA proceedings and books

Well, here's one I missed. Add this to the list of information on
GA-related books and proceedings (v6n1):

Davidor, Yuval. (1991) Genetic algorithms and robotics,
World Scientific. ISBN 9810202172

------------------------------

From: David Davis <70461.1552@CompuServe.COM>
Date: 14 Jan 92 09:30:16 EST
Subject: Re: Mutation, bitclimbing and test suites

I intend for this to be my last posting on the discussion of mutation,
bit climbing, and test suites. I just wanted to make sure that my
conclusions were clear.
Larry Eshelman has made several important points in a recent GA-List,
and I would like to underscore them. There has been a tendency in the
GA field to discount mutation-based approaches to evolution, and my
last posting might lead a casual reader to continue to discount them.
But these approaches can be quite effective, as Larry has shown using
an ingenious algorithm of his own design on problems without the fatal
flaws of the F1-F7 test suite.

Larry also points out that I seem to have unjustly condemned all the
members of the test suite because of the flaws of a subset of them.
This is true. But I still think that one should never use F1-F7 as a
test suite, for another reason mentioned in my earlier posting: one
should never use any single test suite of problems in evaluating a GA
using binary (and, I think, Gray coded) representation. The reason is
that, in experiments I have run using binary, and in other binary runs
carried out with Gil Syswerda when I was at BBN on problems like F1
and F6, it seems that the amount of difficulty the GA has in solving a
problem depends critically on the amount of offset the problem has
from the primary frequencies of the representation. In a graph of GA
solution time as one shifts the representation, the result is not a
linear increase or decrease. It appears instead to be fractal. Thus it
is not simple to characterize a "representative" version of a
mathematical function optimization problem in binary, because the
difficulties jump around so much. This is why I recommend (and will
use in my own future work on binary representation, if I ever do any)
multiple runs of the GA over representations shifted by randomly
determined amounts. The result, in the absence of theory showing how
to predict the area under the difficulty graph, will be an
approximation to this area.
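[Ed. note: Davis's recommendation can be sketched concretely. The
following is a minimal illustration, not code from the posting: the
sphere function, the mutation-only bit climber standing in for a full
GA, the budget, and all names (decode, shifted_f1, trial) are
assumptions chosen for the example. It averages the best values
reached on a De Jong F1-style function whose optimum has been moved by
several randomly drawn offsets.]

```python
import random

BITS = 10                # bits per variable in the binary coding
LO, HI = -5.12, 5.12     # per-variable domain, as in De Jong's F1

def decode(bits):
    """Map a tuple of 0/1 bits to a real value in [LO, HI]."""
    n = int("".join(map(str, bits)), 2)
    return LO + (HI - LO) * n / (2 ** BITS - 1)

def f1(xs):
    """Sphere function: sum of squares, minimized (0) at the origin."""
    return sum(x * x for x in xs)

def shifted_f1(xs, offset):
    """F1 with its optimum moved to `offset` (wrapped into the domain),
    so the same problem lands at a different alignment of the coding."""
    return f1([LO + (x - offset - LO) % (HI - LO) for x in xs])

def trial(offset, dims=3, evals=2000, rng=None):
    """Crude stand-in for a GA run: a mutation-only bit climber.
    Returns the best value found within the evaluation budget."""
    rng = rng or random.Random(0)
    genome = [tuple(rng.randrange(2) for _ in range(BITS))
              for _ in range(dims)]
    best = shifted_f1([decode(g) for g in genome], offset)
    for _ in range(evals):
        i, j = rng.randrange(dims), rng.randrange(BITS)
        flipped = list(genome[i])
        flipped[j] ^= 1                       # single-bit mutation
        cand = genome[:i] + [tuple(flipped)] + genome[i + 1:]
        val = shifted_f1([decode(g) for g in cand], offset)
        if val <= best:
            genome, best = cand, val
    return best

# Average difficulty over several randomly chosen offsets.
rng = random.Random(42)
offsets = [rng.uniform(LO, HI) for _ in range(5)]
results = [trial(off) for off in offsets]
mean_difficulty = sum(results) / len(results)
```

Publishing the list of offsets alongside the averaged result is the
reproducibility step recommended in the posting.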
I also recommend that researchers who do this publish their offset
parameters with their results, so that other researchers can use the
same test suites on their own versions of the algorithm.

One nice feature of using numerical representation is that one doesn't
have to worry about the problem of offset amounts, since the operators
and representation are not sensitive to the alignment of the problem
representation to the problem itself. Another is that, in my
experience, solutions to hard problems come much more quickly.

With all this said, I would like to reiterate two points from my
posting and from Larry's posting, so that there is no mistake about
what our main conclusions are and no impression that we disagree about
them:

1. There are deep and pernicious problems with the test suite problems
   we have been using for 15 years; they have led us to conclude that
   binary representation is much more effective at numerical
   optimization than it really is.

2. Mutation-based approaches to numerical function optimization are
   effective over a wider range of problems than we GA researchers
   have given them credit for; Larry's work and the earlier work of
   David Ackley (and his posting to GA-List in 1991) help to point
   this out.

Lawrence "David" Davis
Tica Associates

------------------------------

From: Inman Harvey
Date: Wed, 15 Jan 92 14:43:23 GMT
Subject: Re: "Evolutionary Programming"

In GA Digest, Monday, January 13 1992, Volume 6 : Issue 1, Melanie
Mitchell writes

>Subject: "Evolutionary Programming"
>
> I picked this up from a news group. I'd never heard of this society
> before. Strange that there's no mention here of GAs!
> ...
>> ANNOUNCING
>> First Annual Conference on Evolutionary Programming
>> ...

On seeing that the Conference General Chairman is David B. Fogel, I
think I can speculate why there is no mention of GAs.

One of the early evolutionary programming approaches was that of
"Artificial Intelligence through Simulated Evolution" by Fogel, L.J., Owens, A.J.
and Walsh, M.J., published by John Wiley (1966). This basically used
(a) selection of the better finite-state machine in a population of
two, and (b) a single genetic operator applied to the offspring:
mutation of the state-transition table (no crossover). Hence this was
significantly different from GAs' later emphasis on crossover as the
major operator, and their use of sizeable populations.

According to Goldberg ("Genetic Algorithms in Search, Optimization and
Machine Learning", p. 105), "The rejection of this work by the
artificial intelligence community, more than any other single factor,
was responsible for the widespread skepticism faced by more
schema-friendly genetic algorithms of the late 1960s and mid-1970s."
I don't know how fair this comment is. If accurate, some might argue
that there are parallels in the rejection of Rosenblatt's and
colleagues' neural network ideas at the time Minsky and Papert
published "Perceptrons".

Anyway, there have recently been a number of papers by D.B. Fogel and
others in Biological Cybernetics advocating extensions of Evolutionary
Programming (in one paper a co-author is L.J. Fogel, which makes me
think they are related - father/son?), containing quotes such as

  "...Goldberg ... and Grefenstette et al. have addressed the
  Traveling Salesman problem through the use of the genetic algorithm
  proposed by Holland (1975). This algorithm is a derivative of the
  Evolutionary Programming concept offered by Fogel (1962, 1964;
  Fogel et al. 1966)"    [Ref. (1) below]

I enjoyed the use of lower-case for ga's and upper-case for EPs, and I
generally get the impression that there is a bit of a rift between the
two camps. If this is so, it is unfortunate.

The main distinction between the two approaches nowadays seems to be
whether or not mutation alone is any better than random search.
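[Ed. note: the 1966 scheme described above can be sketched roughly as
follows. This is a guess at the flavor of the algorithm, not the
book's actual procedure: the next-symbol-prediction task, the state
count, and the stopping rule are all invented for illustration.]

```python
import random

N_STATES = 4
SYMBOLS = (0, 1)

def random_fsm(rng):
    """transition[state][input] -> (next_state, predicted_symbol)."""
    return {s: {a: (rng.randrange(N_STATES), rng.choice(SYMBOLS))
                for a in SYMBOLS}
            for s in range(N_STATES)}

def mutate(fsm, rng):
    """Copy the machine and rewrite one entry of its transition table
    - the single genetic operator; there is no crossover."""
    child = {s: dict(row) for s, row in fsm.items()}
    s, a = rng.randrange(N_STATES), rng.choice(SYMBOLS)
    child[s][a] = (rng.randrange(N_STATES), rng.choice(SYMBOLS))
    return child

def fitness(fsm, seq):
    """Number of correct next-symbol predictions along seq."""
    state, correct = 0, 0
    for cur, nxt in zip(seq, seq[1:]):
        state, prediction = fsm[state][cur]
        correct += (prediction == nxt)
    return correct

rng = random.Random(1)
seq = [0, 1] * 20          # a simple alternating sequence to predict
parent = random_fsm(rng)
for _ in range(500):
    child = mutate(parent, rng)
    # selection of the better machine in a population of two
    if fitness(child, seq) >= fitness(parent, seq):
        parent = child
```

The same loop with the comparison restricted to strictly-better
offspring would be closer to what the GA community now calls Naive
Evolution.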
There has been a fair amount of discussion on this list and elsewhere
recently on the surprising power of what the GA community calls Naive
Evolution (mutation only - which sounds similar to Evolutionary
Programming to me).

Inman Harvey

(disclaimer - some of the above is guesswork and I am happy to be
corrected!)

Refs:
1) D.B. Fogel, "An Evolutionary Approach to the Traveling Salesman
   Problem", Biol. Cybern. 60, 139-144 (1988)
2) D.B. Fogel & J.W. Atmar, "Comparing Genetic Operators with Gaussian
   Mutation in Simulated Evolutionary Processes Using Linear Systems",
   Biol. Cybern. 63 (2), 111-114 (1990)
3) D.B. Fogel, L.J. Fogel & V.W. Porto, "Evolving Neural Networks",
   Biol. Cybern. 63 (6), 487-493 (1990)

------------------------------

From: ingber@umiacs.UMD.EDU (Lester Ingber)
Date: Tue, 14 Jan 1992 04:48:55 EST
Subject: GAs and very fast simulated re-annealing: A comparison

Genetic algorithms and very fast simulated re-annealing: A comparison
by Lester Ingber and Bruce Rosen

For a while, the most current drafts of this (p)reprint and related
papers can be downloaded via anonymous ftp from ftp.umiacs.umd.edu
[128.8.120.23] in the pub/ingber directory. If you have problems with
this, let me know and I will be glad to prepare a uuencoded copy to
email to you. This preprint is saga.ps.Z.

  local% ftp ftp.umiacs.umd.edu        [local% ftp 128.8.120.23]
  Name (ftp.umiacs.umd.edu:yourloginname): anonymous
  Password (ftp.umiacs.umd.edu:anonymous): [type in yourloginname]
  ftp> cd pub/ingber
  ftp> binary
  ftp> get README.file
  ftp> get file.ps.Z
  ftp> quit
  local% uncompress file.ps.Z
  local% lpr [-P..] file.ps            [to your PostScript laserprinter]

Prof. Lester Ingber
Science Transfer Corporation
P.O.
Box 857, McLean, VA 22101
703-759-2769
ingber@umiacs.umd.edu

------------------------------

From: Richard Harris
Date: Thu, 9 Jan 92 10:59:48 GMT
Subject: Looking for information on Goldberg paper

Hello all,

Does anybody out there know whether D. Goldberg has completed his
paper on proving performance theorems for real-number representations
for genetic algorithms? (As mentioned in the Handbook of Genetic
Algorithms, Ed. L. Davis, Van Nostrand Reinhold, 1991.) If so, can
anyone give me the reference for this paper so that I may obtain it
for myself?

My e-mail address is richard@cx.psw.ac.uk

Many thanks,
Richard Harris

------------------------------

End of Genetic Algorithms Digest
******************************