At the IEEE Congress on Evolutionary Computation (CEC'2019) we organized a tutorial on Evolutionary Large-Scale Global Optimization: http://cec2019.org/programs/tutorials.html#cec-09.
Organized by Mohammad Nabi Omidvar, Xiaodong Li, Daniel Molina, and Antonio LaTorre
Many real-world optimization problems involve a large number of decision variables. The trend in engineering optimization shows that the number of decision variables involved in a typical optimization problem has grown exponentially over the last 50 years, and this trend continues at an ever-increasing rate. The proliferation of big-data analytic applications has also resulted in the emergence of large-scale optimization problems at the heart of many machine learning problems [1, 11]. Recent advances in machine learning have likewise produced very large-scale optimization problems in the training of deep neural network architectures (so-called deep learning), some of which have over a billion decision variables [3, 7]. It is this "curse of dimensionality" that makes large-scale optimization an exceedingly difficult task, and current optimization methods are often ill-equipped to deal with such problems. This research gap, in both theory and practice, has attracted much research interest, making large-scale optimization an active field in recent years.

A wide range of mathematical and metaheuristic optimization algorithms is currently being developed to overcome this scalability issue. Among these, metaheuristics have gained popularity due to their ability to deal with black-box optimization problems. There are currently two main approaches to tackling this complex search. The first is to apply decomposition methods, which divide the decision variables into groups and allow researchers to optimize each group separately, thereby reducing the curse of dimensionality; their main drawback is that choosing a proper decomposition can be difficult and computationally expensive. The second is to design algorithms specifically for large-scale global optimization, whose features are well-suited to that type of search.
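The decomposition idea can be illustrated with a minimal cooperative-coevolution-style sketch. Everything below (the sphere objective, the static group size, and the simple greedy random search used as the subcomponent optimizer) is an illustrative assumption, not one of the specific methods covered in the tutorial: variables are split into fixed groups, and each group is optimized in turn while the rest of the context vector is held fixed.

```python
import random

def sphere(x):
    # Fully separable benchmark objective: sum of squares, minimum at 0.
    return sum(xi * xi for xi in x)

def decompose(dim, group_size):
    # Static decomposition: split variable indices into fixed groups.
    idx = list(range(dim))
    return [idx[i:i + group_size] for i in range(0, dim, group_size)]

def optimize(f, dim, group_size=5, cycles=20, trials=50, seed=0):
    # Cooperative-coevolution-style loop: optimize each group of
    # variables in turn while the rest of the context vector is fixed.
    rng = random.Random(seed)
    context = [rng.uniform(-5, 5) for _ in range(dim)]
    best = f(context)
    for _ in range(cycles):
        for group in decompose(dim, group_size):
            for _ in range(trials):
                # Perturb only this group's variables (greedy accept).
                cand = context[:]
                for i in group:
                    cand[i] = context[i] + rng.gauss(0, 0.5)
                val = f(cand)
                if val < best:
                    best, context = val, cand
    return best, context

best, solution = optimize(sphere, dim=20)
```

On a fully separable objective such as the sphere function, a static decomposition like this works well because the groups do not interact; on non-separable problems, interacting variables that end up in different groups are precisely what makes choosing a good decomposition difficult and computationally expensive.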
The tutorial is divided into two parts, each dedicated to exploring the advances in one of the approaches stated above, presented by experts in the respective field.
This tutorial is suitable for anyone with an interest in evolutionary computation who wishes to learn more about the state of the art in large-scale global optimization. It is specifically targeted at Ph.D. students and early-career researchers who want to gain an overview of the field and wish to identify its most important open questions and challenges in order to bootstrap their research in large-scale optimization. The tutorial can also be of interest to more experienced researchers as well as practitioners who wish to get a glimpse of the latest developments in the field. In addition to our prime goal, which is to inform and educate, we also wish to use this tutorial as a forum for exchanging ideas between researchers. Overall, this tutorial provides a unique opportunity to showcase the latest developments on this hot research topic to the EC research community. The expected duration of each part is approximately 110 minutes.
Mohammad Nabi Omidvar is a research fellow in evolutionary computation and a member of the Centre of Excellence for Research in Computational Intelligence and Applications (CERCIA) at the School of Computer Science, University of Birmingham. Prior to joining the University of Birmingham, Dr. Omidvar completed his Ph.D. in computer science with the Evolutionary Computing and Machine Learning (ECML) group at RMIT University in Melbourne, Australia. He holds a bachelor's degree in applied mathematics and a bachelor's degree in computer science with first-class honours from RMIT University. Dr. Omidvar won the IEEE Transactions on Evolutionary Computation Outstanding Paper Award for his work on large-scale global optimization. He also received an Australian Postgraduate Award in 2010 and the best Computer Science Honours Thesis award from the School of Computer Science and IT, RMIT University. He has been a member of the IEEE Computational Intelligence Society since 2009 and is a member of the IEEE Task Force on Large-Scale Global Optimization. His current research interests are large-scale global optimization, decomposition methods for optimization, and multi-objective optimization.
Xiaodong Li received his B.Sc. degree from Xidian University, Xi'an, China, and his Ph.D. degree in information science from the University of Otago, Dunedin, New Zealand. Currently, he is an Associate Professor at the School of Computer Science and Information Technology, RMIT University, Melbourne, Australia. His research interests include evolutionary computation, neural networks, complex systems, multiobjective optimization, and swarm intelligence. He serves as an Associate Editor of IEEE Transactions on Evolutionary Computation, Swarm Intelligence (Springer), and the International Journal of Swarm Intelligence Research. He is a founding member of the IEEE CIS Task Force on Swarm Intelligence, a Vice-Chair of the IEEE CIS Task Force on Multi-Modal Optimization, and a former Chair of the IEEE CIS Task Force on Large Scale Global Optimization. He was the General Chair of SEAL'08, a Program Co-Chair of AI'09, and a Program Co-Chair of IEEE CEC'2012. He is the recipient of the 2013 ACM SIGEVO Impact Award.
Daniel Molina Cabrera received his B.Sc. and Ph.D. degrees in Computer Science from the University of Granada, Spain. He was an Associate Professor at the School of Engineering at the University of Cadiz for several years, and he is currently an Associate Professor at the School of Computer and Telecommunications Engineering at the University of Granada. His current research interests are large-scale global optimization and other continuous optimization problems using evolutionary algorithms. He twice won the competition on Large-Scale Global Optimization, in 2010 and in 2018. Since 2015 he has been the Chair of the IEEE CIS Task Force on Large Scale Global Optimization. Dr. Molina has organized a special issue on large-scale global optimization in Soft Computing, as well as several special sessions and competitions at the IEEE Congress on Evolutionary Computation.
Antonio LaTorre obtained an M.S. in Computer Science from the Universidad Politécnica de Madrid (UPM) and an M.S. in Distributed Systems from the École Supérieure des Télécommunications de Bretagne (ENST-B), both in 2004, and a Ph.D. in Computer Science from UPM in 2009. He has developed his career in the field of heuristic optimization, high-performance data analysis, and modeling, and in recent years he has been actively researching applied problems in the domains of logistics, neuroscience, and health. He has more than 14 years of research experience, backed by participation in 14 national and international projects with both public and private funding, leading 3 of them. He has published more than 40 peer-reviewed contributions in international journals and conferences and serves as an associate editor of 3 international journals. He is currently serving as Vice-Chair of the IEEE CIS Task Force on Large Scale Global Optimization.