5 results on "Lagaris, I.E."
Search Results
2. Newtonian clustering: An approach based on molecular dynamics and global optimization
- Author
- Blekas, K. and Lagaris, I.E.
- Subjects
- EQUATIONS of motion; LAGRANGE equations; ALGORITHMS; REVISED Universal Soil Loss Equation (RUSLE)
- Abstract
Given a data set, a dynamical procedure is applied to the data points in order to shrink and separate possibly overlapping clusters. Namely, Newton's equations of motion are employed to concentrate the data points around their cluster centers, using an attractive potential constructed specially for this purpose. During this process, important information is gathered concerning the spread of each cluster. This information is subsequently used to create an objective function that maps each cluster to a local maximum. Global optimization is then used to retrieve the positions of the maxima, which correspond to the locations of the cluster centers. Further refinement is achieved by applying the EM algorithm to a Gaussian mixture model whose construction and initialization are based on the acquired information. To assess the effectiveness of our method, we have conducted experiments on a plethora of benchmark data sets. In addition, we have compared its performance against four clustering techniques that are well established in the literature. [Copyright © Elsevier]
- Published
- 2007
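The dynamical shrinking step described in this abstract can be illustrated with a short sketch. This is not the paper's construction: the specially designed attractive potential is replaced here by an assumed pairwise Gaussian attraction, and the equations of motion are integrated with a simple damped explicit Euler scheme chosen for brevity.

```python
import numpy as np

def newtonian_shrink(X, sigma=1.0, dt=0.05, damping=0.9, steps=200):
    """Concentrate data points around their cluster centers by integrating
    Newton's equations of motion under a pairwise attractive potential.
    Illustrative only: a Gaussian attraction stands in for the paper's
    specially constructed potential."""
    X = X.astype(float).copy()
    V = np.zeros_like(X)                       # unit masses, zero initial velocity
    for _ in range(steps):
        diff = X[None, :, :] - X[:, None, :]   # diff[i, j] = X[j] - X[i]
        d2 = (diff ** 2).sum(axis=-1)
        w = np.exp(-d2 / (2.0 * sigma ** 2))   # short-range attraction weights
        F = (w[..., None] * diff).sum(axis=1)  # net attractive force on each point
        V = damping * V + dt * F               # damped velocity update
        X = X + dt * V
    return X
```

Because the attraction decays with distance, points of the same cluster collapse towards a common center while well-separated clusters barely interact; the per-cluster spread observed during this contraction is the kind of information the paper then feeds into its cluster-locating objective.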
3. GenMin: An enhanced genetic algorithm for global optimization
- Author
- Tsoulos, Ioannis G. and Lagaris, I.E.
- Subjects
- COMPUTER operating system software; GENETIC programming; GENETIC algorithms; COMBINATORIAL optimization; PROGRAMMING languages; FORTRAN; ESTIMATION theory; MATHEMATICAL statistics
- Abstract
A new method that employs grammatical evolution and a stopping rule for finding the global minimum of a continuous, multidimensional, multimodal function is considered. The genetic algorithm used is a hybrid, combined with a local search procedure. We list results from numerical experiments on a series of test functions and compare with other established global optimization methods. The accompanying software accepts objective functions coded either in Fortran 77 or in C++.
Program summary:
- Program title: GenMin
- Catalogue identifier: AEAR_v1_0
- Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAR_v1_0.html
- Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
- Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
- No. of lines in distributed program, including test data, etc.: 35 810
- No. of bytes in distributed program, including test data, etc.: 436 613
- Distribution format: tar.gz
- Programming language: GNU C++, GNU C, GNU Fortran 77
- Computer: Designed to be portable to any system running the GNU C++ compiler
- Operating system: Any system running the GNU C++ compiler
- RAM: 200 KB
- Word size: 32 bits
- Classification: 4.9
- Nature of problem: A multitude of problems in science and engineering reduce to minimizing a function of many variables. There are instances where a local optimum does not correspond to the desired physical solution, and hence the search for a better solution is required. Local optimization techniques are frequently trapped in local minima, so global optimization is the appropriate tool. For example, when solving a nonlinear system of equations via optimization with a least-squares objective, one may encounter many local minima that do not correspond to solutions (i.e., their values are far from zero).
- Solution method: Grammatical evolution and a stopping rule.
- Running time: Depends on the objective function. The test example given takes only a few seconds to run. [Copyright © Elsevier]
- Published
- 2008
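The hybrid scheme sketched in this abstract, a genetic algorithm coupled with a local search and a stopping rule, can be caricatured as follows. GenMin's grammatical-evolution machinery and its actual stopping rule are not reproduced; the blend crossover, coordinate-wise pattern search, and stagnation-based termination below are illustrative stand-ins.

```python
import numpy as np

def local_search(f, x, step=0.5, iters=40):
    """Coordinate-wise pattern search with step halving (a stand-in for a
    proper local optimization procedure)."""
    fx = f(x)
    for _ in range(iters):
        improved = False
        for i in range(len(x)):
            for s in (step, -step):
                y = x.copy()
                y[i] += s
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            step *= 0.5
    return x, fx

def hybrid_ga(f, lo, hi, pop=30, gens=50, tol=1e-8, patience=10, seed=0):
    """Toy memetic GA: blend crossover + Gaussian mutation, with the elite
    refined by local search; stops when the best value stagnates."""
    rng = np.random.default_rng(seed)
    dim = len(lo)
    P = rng.uniform(lo, hi, (pop, dim))
    best_x, best_f, stall = None, np.inf, 0
    for _ in range(gens):
        fit = np.array([f(x) for x in P])
        order = np.argsort(fit)
        elite, fe = local_search(f, P[order[0]].copy())
        if fe < best_f - tol:
            best_x, best_f, stall = elite, fe, 0
        else:
            stall += 1
            if stall >= patience:          # simple stagnation-based stopping rule
                break
        parents = P[order[:pop // 2]]      # truncation selection
        a = parents[rng.integers(0, len(parents), pop)]
        b = parents[rng.integers(0, len(parents), pop)]
        w = rng.random((pop, dim))
        P = w * a + (1 - w) * b + rng.normal(0.0, 0.1, (pop, dim))
        P[0] = elite                       # elitism
    return best_x, best_f
```

The design point being illustrated is the division of labor: the population explores broadly while the local search supplies the final accuracy, and the stopping rule decides when further generations are unlikely to pay off.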
4. Supporting adaptive and irregular parallelism for non-linear numerical optimization
- Author
- Hadjidoukas, P.E., Voglis, C., Dimakopoulos, V.V., Lagaris, I.E., and Papageorgiou, D.G.
- Subjects
- ADAPTIVE control systems; NONLINEAR systems; GLOBAL optimization; SWITCHING power supplies; HESSIAN matrices; NUMERICAL calculations
- Abstract
Highlights:
- A global optimization framework for SMPs and multicore clusters is presented.
- It exploits the hierarchical and dynamic task parallelism of the Multistart method.
- Gradient/Hessian calculations and Newton's optimization method are parallelized.
- Several task distribution schemes are studied and evaluated.
- Our framework is applied successfully to the protein conformation problem.
[Copyright © Elsevier]
- Published
- 2014
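The task parallelism highlighted above can be mimicked in a few lines: each random starting point becomes an independent local-search task. The thread pool and the backtracking gradient descent here are deliberate simplifications; the paper's framework targets SMPs and multicore clusters with much richer dynamic task distribution and parallelized gradient/Hessian evaluations.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def descend(f, grad, x, lr0=1.0, steps=100):
    """Gradient descent with backtracking line search (a stand-in for the
    Newton-type local solvers parallelized in the paper)."""
    fx = f(x)
    for _ in range(steps):
        g, lr = grad(x), lr0
        while f(x - lr * g) >= fx and lr > 1e-12:
            lr *= 0.5                      # backtrack until the step decreases f
        x = x - lr * g
        fx = f(x)
    return x, fx

def multistart(f, grad, lo, hi, n_starts=32, workers=4, seed=0):
    """Multistart: launch independent local searches from random points,
    one task per start, and keep the best minimizer found."""
    rng = np.random.default_rng(seed)
    starts = rng.uniform(lo, hi, (n_starts, len(lo)))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(lambda x0: descend(f, grad, x0), starts))
    return min(results, key=lambda r: r[1])
```

Because the local searches share no state, they form exactly the kind of irregular, embarrassingly parallel task set that the paper's framework schedules across cores.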
5. MEMPSODE: A global optimization software based on hybridization of population-based algorithms and local searches
- Author
- Voglis, C., Parsopoulos, K.E., Papageorgiou, D.G., Lagaris, I.E., and Vrahatis, M.N.
- Subjects
- PARTICLE swarm optimization; DIFFERENTIAL evolution; STOCHASTIC processes; SYSTEMS software; ALGORITHMS; PROBLEM solving; COMPUTER operating systems; MATHEMATICAL physics
- Abstract
We present MEMPSODE, a global optimization software tool that integrates two prominent population-based stochastic algorithms, namely Particle Swarm Optimization and Differential Evolution, with well-established, efficient local search procedures made available via the Merlin optimization environment. The resulting hybrid algorithms, also referred to as Memetic Algorithms, combine the space-exploration advantage of their global part with the efficiency of the local search and, as expected, have displayed highly efficient behavior in solving diverse optimization problems. The proposed software is carefully parametrized so as to offer complete control and to fully exploit the algorithmic virtues. It is accompanied by comprehensive examples and a large set of widely used test functions, including tough atomic cluster and protein conformation problems.
Program summary:
- Program title: MEMPSODE (MEMetic Particle Swarm Optimization and Differential Evolution)
- Catalogue identifier: AELM_v1_0
- Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELM_v1_0.html
- Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
- Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
- No. of lines in distributed program, including test data, etc.: 14 877
- No. of bytes in distributed program, including test data, etc.: 592 244
- Distribution format: tar.gz
- Programming language: ANSI C, ANSI Fortran 77
- Computer: Workstations
- Operating system: Developed under Linux using the GNU compilers v4.4.3; also tested under Solaris and the Cygwin environment
- RAM: The code uses internal storage that depends on the problem dimension n and the maximum population size N; the required memory is dynamically allocated
- Word size: 64 bits
- Classification: 4.9
- Subprograms used:
- Nature of problem: Optimization is a valuable mathematical tool for solving a plethora of scientific and engineering problems. Usually, the underlying problems are modeled with objective functions whose minimizers (or maximizers) correspond to the desired solutions of the original problem. In many cases there is a multitude of such minimizers, corresponding to solutions either locally, i.e., in their close neighborhood, or globally, i.e., with respect to the whole search space. There is a significant number of efficient algorithms for addressing optimization problems. One can distinguish two main categories, based on whether they perform better at global (exploration) or local (exploitation) search. Standard local optimization algorithms can rapidly converge towards local minimizers, but they are also prone to getting trapped in their vicinity. These algorithms usually exploit local information about the objective function, including first- and second-order derivatives. On the other hand, global optimization algorithms are designed to perform better exploration, although at the cost of weaker convergence properties. Typically, these approaches incorporate stochastic operations. The form of the optimization problem at hand plays a crucial role in the selection of the most appropriate algorithm. Objective functions that lack nice mathematical properties (such as differentiability or continuity) may raise applicability issues for algorithms that require derivatives. On the other hand, applications that require high accuracy may be laborious for stochastic algorithms. The existence of a multitude of local and/or global minimizers can render these problems even harder for any single optimization algorithm.
- Solution method: Evolutionary Algorithms and Swarm Intelligence approaches have been established as effective global optimization algorithms that make minor assumptions about the objective function. Particle Swarm Optimization (PSO) and Differential Evolution (DE) hold a salient position among the most successful algorithms in these categories. Numerous studies indicate that their performance can be radically improved when they are combined with efficient local optimization schemes. The resulting hybrid algorithms offer a more balanced search intensification/diversification than the original ones, thereby increasing both their efficiency and effectiveness. Such hybrid schemes are called Memetic Algorithms, and they have gained rapidly growing interest over the past few years. We present MEMPSODE (MEMetic PSO and DE), a global optimization software that implements memetic PSO and DE within a unified framework. The software utilizes local search procedures from the established Merlin optimization environment. The performance of the implemented approaches is illustrated on several examples, including hard optimization tasks such as atomic cluster and protein conformation problems.
- Restrictions: The current version of the software uses double-precision arithmetic. However, it can easily be adapted by the user to handle integer or mixed-integer problems.
- Unusual features: The software takes into account only bound constraints. General constraints may be tackled by user-defined penalty or barrier functions that can easily be incorporated in the source code of the objective function.
- Additional comments: The use of the Merlin Optimization Environment 3.1.1 (see subprograms above) is optional. A comprehensive user manual is provided that covers the installation procedure in detail and gives detailed examples of operation.
- Running time: Depends solely on the complexity of the objective function (and its derivatives, if used) and on the available computational budget (number of function evaluations). The test run provided (Rastrigin function) requires a number of function evaluations (2.8 seconds on an i7-920 CPU). [ABSTRACT FROM AUTHOR]
- Published
- 2012
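The memetic strategy MEMPSODE implements, population-based global exploration periodically refined by local search, can be sketched as below. This is not MEMPSODE's code: the Merlin local-search procedures are replaced by an assumed pattern search, the PSO parameters are generic textbook values, and the DE variant is omitted.

```python
import numpy as np

def local_refine(f, x, step=0.5, rounds=30):
    """Coordinate-wise pattern search, a crude stand-in for the Merlin
    local-search procedures."""
    fx = f(x)
    for _ in range(rounds):
        improved = False
        for i in range(len(x)):
            for s in (step, -step):
                y = x.copy()
                y[i] += s
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            step *= 0.5
    return x, fx

def memetic_pso(f, lo, hi, n=20, iters=60, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Global-best PSO whose incumbent is refined by local search whenever
    it improves (the 'memetic' ingredient)."""
    rng = np.random.default_rng(seed)
    dim = len(lo)
    X = rng.uniform(lo, hi, (n, dim))
    V = np.zeros((n, dim))
    P, pf = X.copy(), np.array([f(x) for x in X])   # personal bests
    g, gf = local_refine(f, P[pf.argmin()].copy())  # refined global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n, dim))
        V = w * V + c1 * r1 * (P - X) + c2 * r2 * (g - X)
        X = X + V
        fx = np.array([f(x) for x in X])
        better = fx < pf
        P[better], pf[better] = X[better], fx[better]
        if pf.min() < gf:
            g, gf = local_refine(f, P[pf.argmin()].copy())
    return g, gf
```

The swarm supplies the diversification, the refinement step supplies the intensification; this is the intensification/diversification balance the abstract attributes to the hybrid schemes.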