March 17, 2015

Paper: An Effective Ensemble-Based Method for Creating On-the-Fly Surrogate Fitness Functions for Multi-Objective Evolutionary Algorithms - SYNASC 2013

This paper was prepared for the SYNASC 2013 conference and it is related to my current research project, whose general aim is to enhance currently available Evolutionary Computation methods employed for the multi-objective optimization of problems that rely on very time-intensive fitness evaluation functions. Here is the abstract of the article:
The task of designing electrical drives is a multi-objective optimization problem (MOOP) that remains very slow even when using state-of-the-art approaches like particle swarm optimization and evolutionary algorithms because the fitness function used to assess the quality of a proposed design is based on time-intensive finite element (FE) simulations. One straightforward solution is to replace the original FE-based fitness function with a much faster-to-evaluate surrogate. In our particular case, each optimization scenario poses rather unique challenges (i.e., goals and constraints) and the surrogate models need to be constructed on-the-fly, automatically, during the run of the evolutionary algorithm. In the present research, using three industrial MOOPs, we investigated several approaches for creating such surrogate models and discovered that a strategy that uses ensembles of multi-layer perceptron neural networks and Pareto-trimmed training sets is able to produce very high-quality surrogate models in a relatively short time interval.
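To give a rough flavour of the strategy mentioned in the abstract, here is a minimal Python sketch of an MLP-ensemble surrogate trained on a Pareto-trimmed data set. It uses scikit-learn's MLPRegressor for the individual networks; all function names, parameter values, and the trimming heuristic below are illustrative assumptions, not the exact setup from the paper.

# Minimal sketch (assumed setup, not the paper's exact configuration):
# an ensemble of MLP surrogates trained on a Pareto-trimmed archive of
# FE-evaluated designs.
import numpy as np
from sklearn.neural_network import MLPRegressor

def pareto_trim(X, y, keep_ratio=0.5):
    # Keep the candidates dominated by the fewest others, i.e., the
    # designs closest to the current Pareto front (minimization assumed).
    X, y = np.asarray(X), np.asarray(y)
    n = len(y)
    dominated_by = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(n):
            if np.all(y[j] <= y[i]) and np.any(y[j] < y[i]):
                dominated_by[i] += 1
    order = np.argsort(dominated_by)              # least dominated first
    keep = order[: max(1, int(keep_ratio * n))]
    return X[keep], y[keep]

def train_mlp_ensemble(X, y_objective, n_models=10, seed=0):
    # One ensemble per objective; members differ through bootstrap
    # resampling and different random weight initializations.
    rng = np.random.default_rng(seed)
    ensemble = []
    for k in range(n_models):
        idx = rng.integers(0, len(X), size=len(X))   # bootstrap sample
        model = MLPRegressor(hidden_layer_sizes=(20, 20),
                             max_iter=2000, random_state=k)
        model.fit(X[idx], y_objective[idx])
        ensemble.append(model)
    return ensemble

def ensemble_predict(ensemble, X_new):
    # The ensemble mean serves as the surrogate fitness; the spread
    # gives a rough confidence signal for the prediction.
    preds = np.stack([m.predict(X_new) for m in ensemble])
    return preds.mean(axis=0), preds.std(axis=0)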
You can download a preliminary version of the paper by clicking here or from my Downloads box (An Effective Ensemble-Based Method for Creating On-the-Fly Surrogate Fitness Functions for MOEAs - SYNASC 2013.pdf). The same preliminary draft of the document can be previewed at the bottom of this post. The original publication is available at the IEEE Xplore website.

For citations please use the following BibTeX reference:

@INPROCEEDINGS{Zavoianu2013SYNASC,
  author = {Alexandru-Ciprian Z\u{a}voianu and Edwin Lughofer and Gerd Bramerdorfer and Wolfgang Amrhein and Erich Peter Klement},
  title = {An Effective Ensemble-Based Method for Creating On-the-Fly Surrogate Fitness Functions for Multi-Objective Evolutionary Algorithms},
  booktitle = {Proceedings of the 15th International Symposium on Symbolic and Numeric Algorithms for Scientific Computing (SYNASC 2013)},
  year = {2013},
  pages = {235-248},
  publisher = {IEEE Computer Society},
}

February 20, 2014

My favourite quotes - revisited

"Success is not final, failure is not fatal: it is the courage to continue that counts."
Sir Winston Churchill

"I hear and I forget, I see and I remember, I do and I understand"
Confucius

"The two most important days in your life are the day you are born and the day you find out why."
Mark Twain

"Empty pockets never held anyone back. Only empty heads and empty hearts can do that"
Norman Vincent Peale 

"A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects."  
Robert A. Heinlein

 "We are all in the gutter, but some of us are looking at the stars."
Oscar Wilde 

"Computers are useless. They can only give you answers."
Pablo Picasso


 "It is better to keep your mouth closed and let people think you are a fool than to open it and remove all doubt."
Mark Twain

"Illusion is the first of all pleasures."
Oscar Wilde

"When you play, play hard; when you work, don't play at all."
Theodore Roosevelt  


"Courage is what it takes to stand up and speak, Courage is also what it takes to sit down and listen."
Sir Winston Churchill 


"The empires of the future are the empires of the mind."
Sir Winston Churchill  


"Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep."
Scott Adams 


"Insanity: doing the same thing over and over again and expecting different results."
Albert Einstein

October 7, 2013

Paper: Hybridization of Multi-Objective Evolutionary Algorithms and Artificial Neural Networks for Optimizing the Performance of Electrical Drives - EAAI

A big part of my current PhD work at the JKU Department of Knowledge-Based Mathematical Systems is related to the dissemination of our current research (i.e., writing scientific articles). That's why most of my recent and forthcoming posts are, or will be, about "papers".

The work presented in this post is a revised and extended (journal) version of one of the earlier papers written in collaboration with our partners from the Institute for Electrical Drives and Power Electronics of the Johannes Kepler University, Linz. The aim of the article is to describe a surrogate-based enhancement that can significantly speed up a multi-objective evolutionary algorithm that requires an extremely time-intensive fitness evaluation function. Here is the abstract of the article:
Performance optimization of electrical drives implies a lot of degrees of freedom in the variation of design parameters, which in turn makes the process overly complex and sometimes impossible to handle for classical analytical optimization approaches. This, and the fact that multiple non-independent design parameters have to be optimized simultaneously, makes a soft computing approach based on multi-objective evolutionary algorithms (MOEAs) a feasible alternative. In this paper, we describe the application of the well-known Non-dominated Sorting Genetic Algorithm II (NSGA-II) in order to obtain high-quality Pareto-optimal solutions for three optimization scenarios. The nature of these scenarios requires the usage of fitness evaluation functions that rely on very time-intensive finite element (FE) simulations. The key and novel aspect of our optimization procedure is the on-the-fly automated creation of highly accurate and stable surrogate fitness functions based on artificial neural networks (ANNs). We employ these surrogate fitness functions in the middle and end parts of the NSGA-II run (i.e., hybridization) in order to significantly reduce the very high computational effort required by the optimization process. The results show that by using this hybrid optimization procedure, the computation time of a single optimization run can be reduced by 46% to 72% while achieving Pareto-optimal solution sets of similar, or even slightly better, quality than those obtained when conducting NSGA-II runs that use FE simulations over the whole run-time of the optimization process.
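To make the hybridization idea more concrete, below is a skeletal Python sketch of the schedule described in the abstract: early generations use the expensive FE-based fitness and collect training data, while later generations evaluate candidates with the ANN surrogate. It assumes a single switch point, which is a simplification of the paper's schedule, and all of the passed-in operators (fe_evaluate, train_surrogate, make_offspring, nsga2_select) are hypothetical placeholders, not the actual implementation.

# Sketch of the hybrid NSGA-II schedule (simplified, single switch point).
def hybrid_nsga2(pop, n_gen, switch_gen, fe_evaluate,
                 train_surrogate, make_offspring, nsga2_select):
    # Placeholder operators: fe_evaluate runs an FE simulation,
    # train_surrogate builds the ANN models from the archive, and
    # make_offspring / nsga2_select are the usual NSGA-II steps.
    archive_X, archive_y = [], []
    surrogate = None
    for gen in range(n_gen):
        offspring = make_offspring(pop)
        if gen < switch_gen:
            # Early generations: exact but expensive FE evaluations,
            # which double as training data for the surrogate.
            fitness = [fe_evaluate(x) for x in offspring]
            archive_X.extend(offspring)
            archive_y.extend(fitness)
        else:
            if surrogate is None:          # built on-the-fly, once
                surrogate = train_surrogate(archive_X, archive_y)
            # Later generations: cheap surrogate-based evaluations.
            fitness = [surrogate(x) for x in offspring]
        pop = nsga2_select(pop, offspring, fitness)
    return pop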
You can download the preprint version of the paper by clicking here or from my Downloads box (Hybridization of Multi-Objective Evolutionary Algorithms and Artificial Neural Networks for Optimizing the Performance of Electrical Drives - EAAI 2013.pdf). The same preprint version can be previewed at the bottom of this post. The original publication is available at elsevier.com.

For citations please use the following BibTeX reference:


@ARTICLE{Zavoianu2013EAAI,
  author = {Alexandru-Ciprian Z\u{a}voianu and Gerd Bramerdorfer and Edwin Lughofer and Siegfried Silber and Wolfgang Amrhein and Erich Peter Klement},
  title = {Hybridization of Multi-Objective Evolutionary Algorithms and Artificial Neural Networks for Optimizing the Performance of Electrical Drives},
  journal = {Engineering Applications of Artificial Intelligence},
  year = {2013},
  volume = {26},
  pages = {1781-1794},
  number = {8},
  doi = {10.1016/j.engappai.2013.06.002}
}

June 13, 2013

Paper: On the Performance of Master-Slave Parallelization Methods for Multi-Objective Evolutionary Algorithms - ICAISC 2013

This paper was prepared for the ICAISC 2013 conference and it is related to my current research project, whose general aim is to enhance currently available Evolutionary Computation methods employed for the multi-objective optimization of problems that rely on very time-intensive fitness evaluation functions. Here is the abstract of the article:
This paper is focused on a comparative analysis of the performance of two master-slave parallelization methods, the basic generational scheme and the steady-state asynchronous scheme. Both can be used to improve the convergence speed of multi-objective evolutionary algorithms (MOEAs) that rely on time-intensive fitness evaluation functions. The importance of this work stems from the fact that a correct choice for one or the other parallelization method can lead to considerable speed improvements with regard to the overall duration of the optimization. Our main aim is to provide practitioners of MOEAs with a simple but effective method of deciding which master-slave parallelization option is better when dealing with a time-constrained optimization process.
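For readers who want a feel for the difference between the two schemes, here is a hedged Python sketch built on the standard concurrent.futures worker pool; evaluate, vary, select, and insert are hypothetical placeholders standing in for the time-intensive fitness function and the usual MOEA machinery, not code from the paper.

# Sketch contrasting the two master-slave schemes compared in the paper.
from concurrent.futures import ProcessPoolExecutor, wait, FIRST_COMPLETED

def generational_master_slave(pop, evaluate, vary, select, n_workers):
    # Synchronous scheme: the master blocks until the entire offspring
    # generation is evaluated, so fast workers idle while the slowest
    # evaluation of the generation finishes.
    offspring = [vary(pop) for _ in range(len(pop))]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        fitness = list(pool.map(evaluate, offspring))
    return select(pop, offspring, fitness)

def steady_state_async_master_slave(pop, evaluate, vary, insert,
                                    n_evals, n_workers):
    # Asynchronous scheme: whenever any single evaluation finishes, its
    # result is inserted into the population and a new candidate is
    # dispatched immediately, so no worker waits for a full generation.
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        pending = {pool.submit(evaluate, c): c
                   for c in (vary(pop) for _ in range(n_workers))}
        done = 0
        while pending:
            finished, _ = wait(pending, return_when=FIRST_COMPLETED)
            for fut in finished:
                cand = pending.pop(fut)
                pop = insert(pop, cand, fut.result())
                done += 1
                if done + len(pending) < n_evals:
                    new = vary(pop)
                    pending[pool.submit(evaluate, new)] = new
    return pop

Which of the two schemes is preferable depends on the optimization scenario; the paper's aim is precisely to give practitioners a simple method for making that choice under a fixed time budget.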
You can download a preliminary version of the paper by clicking here or from my Downloads box (On the Performance of Master-Slave Parallelization Methods for MOEAs - ICAISC 2013.pdf). The same preliminary draft of the document can be previewed at the bottom of this post. The original publication is available at www.springerlink.com.

For citations please use the following BibTeX reference:

@INCOLLECTION{Zavoianu2013ICAISC,
  author = {Alexandru-Ciprian Z\u{a}voianu and Edwin Lughofer and Werner Koppelst\"{a}tter and G\"{u}nther Weidenholzer and Wolfgang Amrhein and Erich Peter Klement},
  title = {On the Performance of Master-Slave Parallelization Methods for Multi-Objective Evolutionary Algorithms},
  booktitle = {Artificial Intelligence and Soft Computing},
  publisher = {Springer Berlin Heidelberg},
  year = {2013},
  editor = {Leszek Rutkowski et al.},
  volume = {7895},
  series = {Lecture Notes in Artificial Intelligence},
  pages = {122-134},
  doi = {10.1007/978-3-642-38610-7_12}
}

April 16, 2013

Most TED talks are very good. A few are simply astonishing.

I stumbled upon this particular gem via a Business and Culture English course I'm taking at the university (if you find it hard to follow, the talk also has subtitles in around 30 languages).