IEEE Symposium on Evolutionary Neural Architecture Search and Applications (IEEE ENASA)

Deep learning based on deep neural networks has shown highly promising performance on real-world problems such as image recognition, natural language processing, and self-driving. The achievements of these algorithms owe largely to their deep architectures. However, designing an optimal deep architecture for a particular problem requires rich domain knowledge of both the investigated data and neural networks, which end-users do not necessarily possess. In addition, the problem of searching for the optimal architecture can be non-convex and non-differentiable, and existing exact methods cannot address it well. Furthermore, an architecture designed for one task is not reusable, i.e., a new one must be designed for a slightly changed scenario and/or unseen data. Evolutionary computation (EC) approaches, particularly genetic algorithms (GAs), particle swarm optimization (PSO), and genetic programming (GP), have shown superiority in addressing real-world problems, owing largely to their powerful abilities to search for global optima, to handle non-convex/non-differentiable problems, and to operate without rich domain knowledge. However, most existing EC methods cannot yet provide satisfactory results when searching for deep architectures. Designing deep neural architectures with EC approaches, i.e., evolutionary neural architecture search, is therefore a promising research topic. This symposium aims to bring together researchers investigating methods and applications of evolutionary neural architecture search, with particular focus on effective and efficient encoding strategies, recombination mechanisms, and fitness evaluation techniques. Authors are invited to submit their original and unpublished work to this symposium.
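The overall workflow described above (encode an architecture, recombine and mutate candidates, evaluate fitness) can be illustrated with a minimal toy sketch. This is not any specific published algorithm: the architecture encoding (a variable-length list of layer widths), the proxy fitness function, and all operator choices below are simplified assumptions for illustration only; in practice, fitness would be the validation performance of a trained network.

```python
import random

random.seed(0)

WIDTHS = [16, 32, 64, 128]  # assumed layer-width alphabet for the toy encoding

def random_arch(max_layers=5):
    # Variable-length encoding: a list of hidden-layer widths.
    return [random.choice(WIDTHS) for _ in range(random.randint(1, max_layers))]

def fitness(arch):
    # Hypothetical cheap proxy fitness (stand-in for expensive training):
    # prefer architectures near 3 layers and ~200 total units.
    depth, width = len(arch), sum(arch)
    return -abs(depth - 3) - abs(width - 200) / 100.0

def mutate(arch):
    # Insert or delete one layer, keeping length in [1, 5].
    child = arch[:]
    if random.random() < 0.5 and len(child) < 5:
        child.insert(random.randrange(len(child) + 1), random.choice(WIDTHS))
    elif len(child) > 1:
        child.pop(random.randrange(len(child)))
    return child

def crossover(a, b):
    # One-point crossover that tolerates variable-length parents.
    ca, cb = random.randint(0, len(a)), random.randint(0, len(b))
    return (a[:ca] + b[cb:]) or [32]  # avoid producing an empty individual

def evolve(pop_size=20, generations=30):
    pop = [random_arch() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

The sketch makes the symposium's three focal points concrete: `random_arch` is the encoding strategy, `crossover`/`mutate` are the recombination mechanisms, and `fitness` stands in for the (normally costly) fitness evaluation that surrogate and fast-evaluation methods aim to accelerate.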

Topics

Topics include but are not limited to:

  • Encoding strategies for deep neural networks
  • Encoding strategies for huge numbers of parameters
  • Encoding strategies for variable-length individuals
  • Global and/or local search operators for variable-length individuals
  • New search operators for evolutionary neural architecture search
  • Evolutionary large-scale optimization algorithms for deep learning
  • Fast fitness evaluation algorithms in evolutionary neural architecture search
  • Surrogate-assisted methods for neural network performance prediction
  • Quantitative methods for analyzing neural network architectures
  • Multi- and many-objective optimization for evolutionary neural architecture search
  • Hybrid methods for evolutionary neural architecture search
  • Real-world applications of evolutionary neural architecture search, e.g., image sequence analysis, image analysis, face recognition, pattern recognition, health and medical data analysis, text mining, network security, engineering problems, and financial and business data analysis

Symposium Chair

  • Yanan Sun
    ysun@scu.edu.cn
    College of Computer Science, Sichuan University, Chengdu, China

Programme Committee

  • Yao Zhou, Sichuan University, China
  • Zhichao Lu, Michigan State University, USA
  • Qing Ye, Sichuan University, China
  • Peng Hu, Agency for Science, Technology and Research, Singapore
  • Liangli Zhen, Agency for Science, Technology and Research, Singapore
  • Yangxu Wang, China National Electronics Imp.&Exp. Corp., China
  • Cheng He, Southern University of Science and Technology, China
  • Bin Wang, Victoria University of Wellington, New Zealand
  • Marley Vellasco, Pontifical Catholic University of Rio de Janeiro, Brazil
  • Edgar Galvan, Maynooth University, Ireland