Date: Tue, 19 Dec 1995 14:08:02 +0900
To: genetic-programming@cs.Stanford.EDU
From: Usami Yoshiyuki
Subject: Re: GAs and Optimization

In reply to Dr. Stavropoulos's mail, I attach the following information from GA-list.

SUMMARY
Our Indexed GA Bibliographies Collection now includes "An Indexed Bibliography of Genetic Algorithms and Simulated Annealing: Hybrids and Comparisons" containing over 90 references. The file is compressed PostScript and available via anonymous ftp (ftp.uwasa.fi), directory cs/report94-1, file gaSAbib.ps.Z.

> I'm a PhD student and I'm interested in GAs and their
> applications in Optimization.
> Does anybody know where I can find some papers dealing with
> this subject?

Now, I'm looking for a program that solves the Traveling Salesman Problem exactly. Applegate solved a 7397-city problem exactly last year (D. Applegate). Could anybody provide me with a program that solves the TSP exactly? We need such a program for comparison with our work. Our work was published in Physical Review Letters: Usami Yoshiyuki and Kano Yoshiki, "New Method of Solving the Traveling Salesman Problem Based on Real Space Renormalization Theory", Phys. Rev. Lett., 75, 1683 (1995).

Sincerely,
Usami Yoshiyuki, Institute of Physics, Kanagawa University.
e-mail usami@nex.phsc2.kanagawa-u.ac.jp
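For small instances, one way to obtain exact optima for such comparisons is the classical Held-Karp dynamic programme; it is feasible only up to roughly 20 cities and is not the branch-and-cut machinery behind the 7397-city result. A minimal sketch in Python, with a made-up 4-city distance matrix as the example input:

    from itertools import combinations

    def held_karp(dist):
        """Exact TSP tour length via Held-Karp dynamic programming.
        dist[i][j] is the distance from city i to city j; cost grows as
        n^2 * 2^n, so this is only practical for small n."""
        n = len(dist)
        # best[(S, j)]: shortest path starting at city 0, visiting exactly
        # the set S (0 not in S), and ending at city j (j in S).
        best = {(frozenset([j]), j): dist[0][j] for j in range(1, n)}
        for size in range(2, n):
            for S in combinations(range(1, n), size):
                Sf = frozenset(S)
                for j in S:
                    best[(Sf, j)] = min(best[(Sf - {j}, k)] + dist[k][j]
                                        for k in S if k != j)
        full = frozenset(range(1, n))
        return min(best[(full, j)] + dist[j][0] for j in range(1, n))

    # made-up 4-city symmetric instance; the optimal tour length is 18
    dist = [[0, 2, 9, 10],
            [2, 0, 6, 4],
            [9, 6, 0, 3],
            [10, 4, 3, 0]]
    print(held_karp(dist))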
-------------------------------------------------

From: GREENWOOD GARRISON
Date: Fri, 9 Jun 1995 08:58:09 -0400
Subject: GAs vs SA

Colleagues:

I have on several occasions attempted to apply a genetic algorithm, etc., to a problem area only to have someone say "Well, how does the quality of your solution compare to simulated annealing????" I feel it is counterproductive to continually rehash this issue. From every comparison I have seen, the evolutionary techniques produce solutions AT LEAST AS GOOD IF NOT BETTER than simulated annealing while doing so in far less computation time.

Is anyone aware of an article where someone has conducted a comparison between evolutionary techniques and simulated annealing? I'm looking for something that is not tied to a specific application but general enough so that, when such questions come up in the future, I can simply refer them to the appropriate article.

Regards,
Garry
garry.greenwood@wmich.edu

[ WMS: It all depends on what you mean by "general". I think a better question is "On what problem classes are GAs better/worse than SA?" Although I've seen specific comparisons I haven't seen anything that answers this question. But I'm always open to being enlightened...! ]

------------------------------

From: fogel@ece.ucsd.edu (Fogel)
Date: Thu, 15 Jun 95 16:54:36 PDT
Subject: simulated annealing refs.

In response to Garry Greenwood's posting on 6/15/95:

Dear Colleagues:

In the ga-digest posted 6/15/95, Garry Greenwood asked for articles comparing GAs with simulated annealing in general. Two comparison papers that I am aware of are:

WL Goffe, GD Ferrier, and J Rogers (1994) "Global optimization of statistical functions with simulated annealing," J. of Econometrics, 60, 65-99.

L Ingber and B Rosen (1992) "Genetic algorithms and very fast simulated annealing -- a comparison," Math. and Comp. Model., 16, 87-100.

Neither of these comparisons favors genetic algorithms. Garry Greenwood indicated that in all of the comparisons he has observed, evolutionary techniques (presumably speaking about GAs given the preceding paragraph of his text) have always performed at least as well if not better than simulated annealing, and with less computational effort. I think a bibliography of papers documenting such results would be appropriate given the above two papers in the archival literature. I would appreciate receiving the citations for such papers.

Regards,
David

[ WMS: And I'd be happy to post those citations, David. ]

From: Jarmo Alander
Date: Sat, 24 Jun 1995 10:26:31 +0300
Subject: GAs/simulated annealing comparisons (Re: v9n36)

Dear All,

Our Indexed GA Bibliographies Collection now includes "An Indexed Bibliography of Genetic Algorithms and Simulated Annealing: Hybrids and Comparisons" containing over 90 references. The file is compressed PostScript and available via anonymous ftp (ftp.uwasa.fi), directory cs/report94-1, file gaSAbib.ps.Z.

Yours,
Jarmo Alander
University of Vaasa, Finland

PS. The directory also contains a few other such bibs.

[ WMS: Thanks, Jarmo! Jarmo also sent me a list of pubs, but it was quite long - I have placed it on ftp.aic.nrl.navy.mil under pub/galist/info/GA-SA.comparison.refs. Now, who is going to read all these papers and summarize the results? Volunteers? ;-) ]

From: park@cs.bu.edu (Kihong Park)
Date: Sat, 1 Jul 1995 19:32:34 -0400
Subject: GA & SA references

A follow-up to David Fogel's message in GA-List 6/22/95:

> WL Goffe, GD Ferrier, and J Rogers (1994) "Global optimization of
> statistical functions with simulated annealing," J. of Econometrics,
> 60, 65-99.
>
> L Ingber and B Rosen (1992) "Genetic algorithms and very fast
> simulated annealing -- a comparison," Math. and Comp. Model., 16,
> 87-100.
>
> Neither of these comparisons favors genetic algorithms.

Bob Carter and I have also been studying genetic algorithms relative to simulated annealing in the context of MAX-CLIQUE and MAX-SAT over the last couple of years. Our findings also disfavor genetic algorithms when viewed competitively, and we provide a "theory," albeit qualitative, of why this may be so. Two recent references of possible interest are:

K. Park and B. Carter. On the effectiveness of genetic search in combinatorial optimization. In Proc. 10th ACM Symp. on Applied Computing, GA & Optimization Track, pp. 329-336, Feb. 1995.

K. Park. A comparative study of genetic search. To appear in Proc. 6th Int. Conf. on Genetic Algorithms, July 1995.

I'd be happy to provide further information to interested parties.

Best regards,
Kihong Park

From: avkampen@cse.unsw.edu.au (Antoine Van Kampen)
Date: Thu, 19 Oct 95 09:40:44 +1000
Subject: comparison

Hi,

With great interest I followed the discussion about the comparison of GAs to other global optimisation methods like SA and PBIL. I want to put in a few remarks of my own.

1) I think it is correct to say that the representation and the crossover operator must be such that meaningful information is transferred when the operator is applied. Otherwise, the algorithm is indeed just a naive evolution algorithm, i.e., the crossover is without effect. If that is the case, the optimisation proceeds by selection and mutation only. Because the mutation operator is often a simple bit-flip operator, I think it is obvious that the resulting optimisation scheme is far from optimal. Other strategies that proceed via a more advanced scheme of selection and mutation (like SA or evolution strategies) will therefore, in most cases, outperform the 'GA'.

2) Any (application) paper discussing a GA should therefore indeed contain a section describing what information is transferred by the recombination operator. I think it is sufficient to repeat some of the presented experiments without the crossover operator. This will immediately show the effect of this operator.
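One way to run the ablation just described: keep the budget, seed and mutation settings fixed and simply toggle crossover off. A minimal sketch in Python on a stand-in OneMax bit-string problem; the population size, rates and fitness function here are illustrative assumptions, not taken from any of the experiments discussed above.

    import random

    def run_ga(fitness, n_bits=50, pop_size=40, generations=100,
               p_mut=0.01, use_crossover=True, seed=0):
        """Toy generational GA; set use_crossover=False to ablate recombination
        while keeping selection, mutation and the evaluation budget identical."""
        rng = random.Random(seed)
        pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
        best = max(pop, key=fitness)
        for _ in range(generations):
            def tournament():
                a, b = rng.sample(pop, 2)
                return a if fitness(a) >= fitness(b) else b
            nxt = []
            while len(nxt) < pop_size:
                p1, p2 = tournament(), tournament()
                if use_crossover:
                    cut = rng.randrange(1, n_bits)   # one-point crossover
                    child = p1[:cut] + p2[cut:]
                else:
                    child = p1[:]                    # selection + mutation only
                child = [b ^ 1 if rng.random() < p_mut else b for b in child]
                nxt.append(child)
            pop = nxt
            best = max(pop + [best], key=fitness)
        return fitness(best)

    # same problem, same budget, crossover on vs. off (OneMax stand-in fitness)
    print("with crossover:   ", run_ga(sum, use_crossover=True))
    print("without crossover:", run_ga(sum, use_crossover=False))

Comparing the two final scores (or the full fitness traces) under the same evaluation budget shows directly how much recombination contributes on that problem.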
But if the crossover operator does have an effect, how do we know that the information that is processed is meaningful? How can we know that the crossover is not merely a form of mutation? I think that some (toy) problems are so simple that we can understand what is happening when the crossover is applied. But what if the fitness function is too complex to understand the search space (the relations between parameters)? How could we set up a suitable design (and crossover operator) for these problems, such that we know that the transferred information will indeed make sense for the GA?

3) Even if we finally obtained a *real* GA (with a functional crossover operator), we are still left with the question of whether other optimisation methods would be more suitable for solving that specific optimisation problem. Can we define classes of optimisation problems, each of which has to be solved by a specific optimisation strategy? If we cannot do this, why should we use the complex GA, which is often hard to understand, and difficult to set up if the optimisation problem is complex? In my opinion, techniques like SA are easier to apply and should for most problems be able to come up with equally good results. So what is the benefit of GAs (for optimisation problems) in comparison to SA?

I think this will provide some material for further discussion.

Antoine van Kampen

From: wren@scs.leeds.ac.uk (Anthony Wren)
Date: Fri, 27 Oct 1995 09:14:36 GMT
Subject: Comparison

Antoine van Kampen raises a number of interesting points about the nature and function of crossovers etc., and I agree with most of them. I'm dubious, however, about SA coming up with equally good results for most problems.

However, as someone driven by the need to solve problems rather than by a desire to use GAs for their own sake, I recommend the pragmatic approach: If a GA gives results for a particular class of problems which are better than any others obtained, then use the GA until something better is found. If you suspect that it should be possible to obtain even better solutions, seek a better method, which may be a better GA or may be something else. If you actually have better solutions by another method, use that method. BUT KEEP LOOKING for improved evolutionary methods or other methods, if you have time before the problem goes away.

Tony

Anthony Wren                                        email: wren@scs.leeds.ac.uk
Professor of Scheduling and Constraint Management   Phone: +44-113-233-5481
School of Computer Studies                          Fax:   +44-113-233-5468
University of Leeds
Leeds LS2 9JT                                       Home:  +44-1423-870881
UK

------------------------------

From: HUNTER Andrew
Date: Fri, 27 Oct 95 10:27:00 PDT
Subject: comparison of GA and other techniques

Antoine van Kampen provided a sensible critique of the use of GAs, in comparison with other forms of search technique, in the last ga-list. However, I think the following points also need to be kept in mind:

1) GAs are NOT incompatible with other search techniques. Any form of single-point search can be integrated with a GA in the guise of a special mutation operator. The question is not then whether GAs are superior to other techniques, but whether they can add anything to them. The GA may add value to search techniques because: a) it searches multiple points; b) it can use crossover to (sometimes) exploit building blocks (if they exist). Don't forget about hybrids!

2) GAs are a parallel search technique. In principle, you can run each member of the population on a different machine. If you want to compare with (say) SA, then you should compare with parallel SA, where a number of points are searched concurrently.

Finally, a little bit of advertising! Sugal 2.1, coming soon, will have enhanced features to support the integration of additional "mutation" operators. I hope that members of the GA community will chip in with various non-GA search methods, and give us all a platform where we can investigate these issues AND reproduce each other's results!

Andrew Hunter
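A minimal sketch of the kind of hybrid described in point 1 above: a single-point improver (a short greedy bit-flip hill-climb here; an SA acceptance rule could equally be substituted) is packaged as the GA's "mutation" operator. The operator names, rates and OneMax stand-in fitness are illustrative assumptions, not Sugal's actual interface.

    import random

    def hillclimb_mutation(individual, fitness, rng, steps=5):
        """Single-point search in the guise of mutation: a few greedy bit-flip
        steps from the given individual (an SA-style acceptance rule could be
        swapped in here without touching the surrounding GA loop)."""
        cur, cur_fit = individual[:], fitness(individual)
        for _ in range(steps):
            i = rng.randrange(len(cur))
            cur[i] ^= 1
            new_fit = fitness(cur)
            if new_fit >= cur_fit:
                cur_fit = new_fit       # keep an improving (or equal) flip
            else:
                cur[i] ^= 1             # undo a worsening flip
        return cur

    def hybrid_ga(fitness, n_bits=50, pop_size=30, generations=50, seed=0):
        rng = random.Random(seed)
        pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[:pop_size // 2]              # truncation selection
            children = []
            while len(children) < pop_size - len(parents):
                p1, p2 = rng.sample(parents, 2)
                cut = rng.randrange(1, n_bits)
                child = p1[:cut] + p2[cut:]            # crossover still present
                children.append(hillclimb_mutation(child, fitness, rng))
            pop = parents + children
        return max(pop, key=fitness)

    best = hybrid_ga(sum)                              # OneMax stand-in fitness
    print("best fitness:", sum(best))

Because each call to hillclimb_mutation touches a single individual, the inner loop also parallelises naturally across machines, which is the setting point 2 argues should be matched by a parallel SA in any comparison.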
Date: Tue, 27 Jun 1995 11:34:50 +0300
From: Jarmo Alander
To: ga-list@AIC.NRL.Navy.Mil
Subject: GAs and SA comparison (a reference list of 48 items)

Muhammad S.~T. Benten and Sadiq~M. Sait. {GAP}: a genetic algorithm approach to optimize two-bit decoder {PLAs}. {\em International Journal of Electronics}, 76(1):99--106, January 1994.

H.~Ding, A.~A. Elkeib, and R.~Smith. Optimal clustering of power networks using genetic algorithms. {\em Electric Power Systems Research}, 30(3):209--214, 1994. (Proceedings of the 3rd Biennial Symposium on Industrial Electric Power Applications, New Orleans, LA, Nov. 12.-13., 1992)

S.~Finnerty and S.~Sen. Simulated annealing based classification. In ?, editor, {\em Proceedings of the 6th IEEE Conference on Tools with Artificial Intelligence (TAI'94)}, pages 824--827, New Orleans, LA, 6.-9.~November 1994. IEEE Computer Society Press, Los Alamitos, CA.

Wen Fushuan and Han Zhenxiang. Fault section estimation in power systems using genetic algorithm and simulated annealing. {\em Proc. CSEE (China)}, 14(3):29--35, May 1994. (in Chinese)

W.~L. Goffe, G.~D. Ferrier, and J.~Rogers. Global optimization of statistical functions with simulated annealing. {\em Journal of Econometrics}, 60(?):65--99, ? 1994.

John Gunnels, Paul Cull, and J.~L. Holloway. Genetic algorithms and simulated annealing for gene mapping. In {\em Proceedings of the First IEEE Conference on Evolutionary Computation\/} \cite{ga94ICCIEC1}, pages 385--390.

Hasan Pirkul and Erik Rolland. New heuristic solution procedures for the uniform graph partitioning problem: extensions and evaluation. {\em Computers {\&} Operations Research}, 21(8):895--907, October 1994.

R.~Ponnusamy, N.~Mansour, A.~Choudhary, and G.~C. Fox. Graph contraction for mapping data on parallel computers: a quality-cost tradeoff. {\em Sci. Program.}, 3(1):73--82, Spring 1994.

R.~Vemuri and R.~Vemuri. Genetic algorithm for {MCM} partitioning. {\em Electronics Letters}, 30(16):1270--1272, July 1994.

Muhammad S.~T. Benten and Sadiq~M. Sait. Genetic scheduling of task graphs. {\em International Journal of Electronics}, 77(4):401--415, October 1994.

Una-May O'Reilly and Franz Oppacher. Program search with a hierarchical variable length representation: Genetic programming, simulated annealing and hill climbing. Technical Report 94-04-021, Santa Fe Institute, 1994.

R.~Vemuri and R.~Vemuri. {MCM} layer assignment using genetic search. {\em Electronics Letters}, 30(20):1635--1637, 29.~September 1994.

Carlos~B. Lucasius, M.~L.~M. Beckers, and Gerrit Kateman. Genetic algorithms in wavelength selection: a comparative study. {\em Analytica Chimica Acta}, 286(2):135--153, 18.~February 1994.

Takashi Kido, K.~Takagi, and Masakazu Nakanishi. Analysis and comparisons of genetic algorithm, simulated annealing, tabu search, and evolutionary combination algorithm. {\em Informatica}, 18(4):399--410, 1995.

Julian Sheung, Alex Fan, and Anthony Tang.
Time tabling using genetic algorithm and simulated annealing. In {\em Proceedings of the 1993 IEEE Region 10 Conference on Computer, Communication, Control and Power Engineering (TENCON'93)}, volume~1, pages 448--451, Beijing (China), 19.-21.~October 1993. IEEE.

Baumin Lee. {\em Three new algorithms for exact {D}-optimal design problems}. PhD thesis, The Ohio State University, 1993.

Donald~E. Brown, Christopher~L. Huntley, Bernard~P. Markowicz, and David~E. Sappington. Rail network routing and scheduling using simulated annealing. In {\em Proceedings of the 1992 IEEE International Conference on Systems, Man, and Cybernetics}, volume~1, pages 589--592, Chicago, IL, 18.-21.~October 1992. IEEE Computer Society Press, Los Alamitos, CA.

K.~Dockx and J.~F. Lutsko. {SA/GA}: Survival of the fittest in {Alaska}. In ?, editor, {\em Proceedings of the Fourth International Workshop on Artificial Intelligence and Statistics}, pages 219--224, Fort Lauderdale, FL, ? 1993. ?

Guoguang Yang. Genetic algorithm for the optimal design of diffractive optical elements and the comparison with simulated annealing. {\em Guangxue Xuebao}, 13(7):577--584, July 1993. (in Chinese)

Bernd Hartke. Global geometry optimization of clusters using genetic algorithms. {\em The Journal of Physical Chemistry}, 97(39):9973--9976, 1993.

A.~Hill and C.~J. Taylor. Model-based image interpretation using genetic algorithms. {\em Image and Vision Computing}, 10(5):295--300, June 1992.

Lester Ingber and Bruce Rosen. Genetic algorithms and very fast simulated re-annealing. Technical report, Science Transfer Corporation and University of Delaware, McLean, 1991.

Lester Ingber and Bruce Rosen. Genetic algorithms and very fast simulated annealing: A comparison. {\em Mathematical and Computer Modelling}, 16(11):87--100, November 1992.

Richard~S. Judson, M.~E. Colvin, J.~C. Meza, A.~Huffer, and D.~Gutierrez. Do intelligent configuration search techniques outperform random search for large molecules? {\em International Journal of Quantum Chemistry}, 44(2):277--290, 1992.

Mrinal~K. Sen, Akhil~Datta Gupta, and Paul~L. Stoffa. Stochastic reservoir modeling using simulated annealing and genetic algorithm. In ?, editor, {\em Proceedings of the 1992 SPE Annual Technical Conference and Exhibition}, pages 939--950, Washington, DC, 4.-7.~October 1992. Society of Petroleum Engineers of AIME, Richardson, TX.

Alberto Colorni, Marco Dorigo, and Vittorio Maniezzo. {\sc{Algodesk}}: an experimental comparison of eight evolutionary heuristics applied to the {QAP} problem. Technical Report 92-052, Politecnico di Milano, Dipartimento di Elettronica, 1992.

Ron Unger and John Moult. A genetic algorithm for {3D} protein folding simulations. In Forrest \cite{ga:GA5}, pages 581--588.

Marco Muselli and Sandro Ridella. Global optimization of functions with the interval genetic algorithm. {\em Complex Systems}, 6(3):193--212, June 1992.

Koichi Nara, Atsushi Shiose, Minoru Kitagawa, and Toshihisa Ishihara. Implementation of genetic algorithm for distribution systems loss minimum re-configuration. {\em IEEE Transactions on Power Systems}, 7(3):1044--1051, August 1992.

Liana Borup and Alan Parkinson. Comparison of four non-derivative optimization methods on two problems containing heuristics and analytic knowledge. In {\em Proceedings of the 18th Annual ASME Design Automation Conference}, volume~1, pages 137--143, Scottsdale, AZ, 13./16.~September 1992. ASME, New York.

P.~W. Poon and G.~T. Parks. Optimizing {PWR} reload core design.
In M{\"a}nner and Manderick \cite{ga:PPSN2}, pages 371--380.

Lin-Ming Jin and Shu-Park Chan. Analogue placement by formulation of macrocomponents and genetic partitioning. {\em International Journal of Electronics}, 73(1):157--173, July 1992.

Sandip Sen. Minimal cost set covering using probabilistic methods. In ?, editor, {\em Proceedings of the 1993 ACM/SIGAPP Symposium on Applied Computing}, pages 157--164, Indianapolis, IN, 14.-16.~February 1993. ACM, New York.

Youssef~G. Saab and Vasant~B. Rao. Combinatorial optimization by stochastic evolution. {\em IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems}, 10(4):525--535, 1991.

Marius Sinclair. Comparison of the performance of modern heuristics for combinatorial optimization on real data. {\em Computers {\&} Operations Research}, 20(7):687--695, September 1993.

B.~Stuckman, G.~Evans, and M.~Mollaghasemi. Comparison of global search methods for design optimization using simulation. In {\em IEEE Winter Simulation Conference Proceedings}, pages 937--944, Phoenix, AZ, 8.-11.~December 1991. IEEE, New York.

Tai~A. Ly and Jack~T. Mowchenko. Applying simulated evolution to high level synthesis. {\em IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems}, 12(3):389--409, March 1993.

El-Ghazali Talbi and Traian Muntean. Hill-climbing, simulated annealing and genetic algorithms: A comparative study. In Trevor~N. Mudge, Veljko Milutinovic, and Lawrence Hunter, editors, {\em Proceedings of the 26th Hawaii International Conference on Systems Science (HICSS-26)}, volume~2, pages 565--573, Wailea, HI, 5.-8.~January 1993. IEEE Computer Society Press, Los Alamitos, CA.