Path: utzoo!utgpu!news-server.csri.toronto.edu!bonnie.concordia.ca!uunet!mcsun!ukc!mucs!liv-cs!liv!ec13
From: EC13@LIVERPOOL.AC.UK
Newsgroups: comp.theory
Subject: Brand New Optimization Methods
Message-ID: <91166.152446EC13@LIVERPOOL.AC.UK>
Date: 15 Jun 91 14:24:46 GMT
Organization: University of Liverpool
Lines: 178

Entropic Vector Optimization and Simulated Entropy: Theory & Applications
-------------------------------------------------------------------------

I am including here the last chapter of that thesis, which appeared under
'Conclusions and Recommendations for Future Work'.

1-) CONCLUSIONS:
    ~~~~~~~~~~~~

This thesis has examined the use of Shannon's (informational) entropy
measure and Jaynes' maximum entropy principle in connection with the
solution of single-criterion and multi-criteria optimization problems.
At first glance the two concepts, entropy and optimization, seem to have
no direct link: Shannon entropy is essentially related to probabilities,
while optimization is usually viewed in terms of a deterministic
topological domain. To explore possible links between them, an
optimization problem has been simulated as a statistical thermodynamic
system that spontaneously approaches its equilibrium state at a
specified temperature, the equilibrium state being characterized by
maximum entropy. The following line of attack was then postulated:

    Entropy ------> Thermodynamic Equilibrium -------> Optimization

Several questions are then raised about how to carry out this
simulation. They are:

a) What are the micro-states of this statistical thermodynamic system in
   an optimization context?
b) What are the probabilities of the micro-states?
c) What common characteristic is there in these two processes?
d) What common law governs them?
In multi-criteria optimization, these questions are briefly answered as
follows:

a) Each micro-state corresponds to a criterion; the criterion is
   optimized by randomly taking a finite-amplitude step away from it, or
   adding one to it, so that a Pareto set can be generated.
b) The multiplier associated with each criterion is interpreted as the
   probability of the system being in the corresponding micro-state.
c) The Pareto set generation process can be thought of as a sequence of
   feasible transitions of the system to its equilibrium states, so that
   the equilibria become the common characteristic of the two processes.
d) That the entropy of the system attains a maximum value in its
   equilibrium states represents the common law governing the two
   processes.

In single-criterion constrained minimization, these questions are
briefly answered as follows:

a) Each micro-state corresponds to a constraint; the objective function
   is minimized, subject to all constraints, while randomly taking a
   finite-amplitude step from it.
b) The multiplier associated with each constraint is interpreted as the
   probability of the system being in the corresponding micro-state.
c) A minimizing process can be thought of as a sequence of transitions
   of the system to its equilibrium states, so that equilibrium becomes
   the common characteristic of the two processes.
d) That the entropy of the system attains a maximum value at an
   equilibrium state, but decreases monotonically during the
   minimization process, represents the common law governing the two
   processes.

In the course of this study, the concept of entropy was further examined
by presenting a new entropy-based minimax method for generating Pareto
solution sets for multi-criteria optimization problems. The subject of
simulated entropy for SEEKING the global minimum of single-criterion
constrained minimization problems was explored and demonstrated by
developing two simulated entropy techniques.
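The interpretation of multipliers as probabilities (points a and b
above) can be sketched numerically. The sketch below is not the
thesis's own algorithm, which is not reproduced in this chapter; it
only illustrates the standard Jaynes construction it rests on:
maximizing Shannon entropy subject to a fixed expected criterion value
yields probabilities of Boltzmann form p_i proportional to
exp(-beta*f_i). The function names and the parameter beta are my own.

```python
import math

def maxent_weights(criteria_values, beta):
    # Jaynes' least-biased distribution over criteria: maximizing
    # Shannon entropy subject to a fixed mean criterion value gives
    # p_i proportional to exp(-beta * f_i), normalized to sum to 1.
    exps = [math.exp(-beta * f) for f in criteria_values]
    z = sum(exps)
    return [e / z for e in exps]

def shannon_entropy(p):
    # S = -sum p_i ln p_i; terms with p_i = 0 contribute nothing.
    return -sum(pi * math.log(pi) for pi in p if pi > 0)
```

With beta = 0 no criterion is favoured and the weights reduce to the
uniform (maximum entropy) distribution; increasing beta concentrates
probability on the smaller criterion values and lowers the entropy.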
The main developments made in the present study are summarized as
follows:

1) Two entropy-based methods for generating Pareto solution sets were
   developed in terms of Jaynes' maximum entropy formalism. These new
   methods have provided additional insight into entropic optimization,
   as well as affording a simple means of calculating the least-biased
   probabilities.
2) Two new entropy-based stochastic techniques were developed to reach
   the global minimum of constrained single-criterion minimization
   problems; they belong to a new class of algorithms called simulated
   entropy.
3) Numerical examples were presented, tested and compared using all the
   new entropy-based methods described in Chapter 3.

The present work has shown that there are links between entropy and
optimization. There is no doubt that good entropy-based optimization
algorithms can be devised on the basis of the present research. Several
conclusions drawn from the present research are summarised as follows:

1) The development of the new entropy-based methods has provided not
   only a convenient alternative solution strategy but also additional
   insight into entropic processes.
2) The uncertainty contained in the solution of multi-criteria
   optimization problems is similar to that contained in thermodynamic
   systems; it is therefore reasonable to employ a statistical
   thermodynamic approach, i.e. the entropy maximization approach, to
   estimate the multi-criteria multipliers.
3) The uncertainty contained in the solution of constrained minimization
   problems is also similar to that contained in thermodynamic systems,
   so it is reasonable to employ a statistical thermodynamic approach,
   i.e. the entropy minimaximization approach, to estimate the entropy
   multipliers.
4) The exploration of a new class of methods called simulated entropy
   methods has a significance far beyond that subject itself. A fact
   which must be emphasized is that it is the informational entropy
   approach which has made this possible.
   During the simulated entropy process, the simulation of the entropy
   closely paralleled minimization of the maximum entropy at each
   configuration.
5) The entropic optimization methods developed in this thesis have a
   unique property which makes them very easy to program. This unique
   property may be formulated as follows: "No matter how many goals or
   constraints we have, only one parameter controls the process: the
   entropy parameter, P."
6) The development of the entropic optimization methods has enabled the
   mathematical optimization examples considered in Chapter 4 to be
   solved easily. The computer results have shown that the developed
   methods have fast and stable convergence. Through the developments
   made here, it can be seen that the entropic optimization methods
   DESERVE TO BE MORE WIDELY RECOGNIZED THAN HITHERTO.
7) Exponentiation and the use of logarithms within the entropy function
   have very little influence on computer execution time if the code is
   optimized and full efficiency of the computer system is used.

2- Recommendations for Future Work
   ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The present work is exploratory. However, it has opened up new avenues
in the study of some classes of minimization problems, such as general
stochastic problems. Some potential research topics, made possible by
the present work, are summarized as follows:

1) The present work is mainly oriented towards providing a practical
   basis for using Shannon entropy, through the Maximum Entropy
   Principle (MEP) and the Minimax Entropy Principle (MMEP), in an
   optimization context. It has left many aspects of theoretical
   algorithmic development to be explored, for example the Maximin
   Entropy Principle.
2) Further practical refinements are also required for a deeper
   understanding of the minimum entropy principle and the maximin
   entropy principle, and for extending their applications to more
   optimization areas.
3) It is now clear that the relationship explored between simulated
   entropy and simulated annealing bears closely on global minimization.
   The Minimax Entropy Principle (MMEP) can therefore be readily adapted
   to solving large simulation problems of a stochastic nature. This has
   still to be explored.
4) Because of the relationship explored between simulation and entropy,
   and since simulation is an experimental arm of operations research,
   the author strongly believes that ENTROPY RESEARCH (ER) may be
   described as a scientific approach to decision making under
   uncertainty; its concept is so general that it is equally applicable
   to many other fields as well.
5) The roots of Operations Research can be traced back many decades to
   the Second World War. Because of the war effort, there was an urgent
   need to allocate scarce resources to the various military operations,
   and to the activities within each operation, in an effective manner.
   Today, however, strong waves of technological development are shaking
   the world, and a new strategic system of management is urgently
   needed. Entropy may be used efficiently for such a development. In
   other words, operations research deals with what is available today,
   while entropy research (ER) deals with the unpredictable tomorrow
   which we now face.

In conclusion, the main contribution to knowledge contained in this
thesis centers around the various demonstrations that informational
entropy, stochastic simulations and optimization processes are closely
linked. Through this work it is now possible to view the traditional
deterministic, topological interpretation of optimization processes not
as the only interpretation, but as just one possible interpretation. It
is valid to treat optimization processes in a probabilistic,
information-theoretic way and to develop solution methods from this
interpretation.
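The shared structure with simulated annealing noted in point 3 can be
made concrete. The sketch below is not one of the thesis's two
simulated entropy techniques (which are not reproduced in this
chapter); it only illustrates the common stochastic skeleton: random
finite-amplitude steps accepted by a Boltzmann-form rule whose
acceptance distribution becomes ever more concentrated (lower entropy)
as the run proceeds. All names, parameters and schedule choices here
are my own assumptions.

```python
import math
import random

def stochastic_entropy_min(f, x0, step=0.5, beta0=1.0, growth=1.05,
                           iters=2000, seed=0):
    # Random finite-amplitude steps; worsening moves are accepted with
    # probability exp(-beta * increase), as in simulated annealing.
    # beta grows each iteration, so the entropy of the acceptance
    # distribution decreases monotonically during the minimization,
    # mirroring the entropy-decreasing process described above.
    rng = random.Random(seed)
    x, fx, beta = list(x0), f(x0), beta0
    best_x, best_f = list(x), fx
    for _ in range(iters):
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = f(cand)
        if fc < fx or rng.random() < math.exp(-beta * (fc - fx)):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = list(x), fx
        beta *= growth
    return best_x, best_f
```

Treating the search probabilistically in this way, a single schedule on
the inverse-temperature parameter controls the whole run regardless of
the problem's dimension.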
This new insight opens up new avenues for research into optimization
methods.

A. Sultan