This special session aims to investigate randomized algorithms for the training and design of neural networks. We welcome submissions that explore the benefits of this class of learning techniques, both analytically and through numerical experiments, and that demonstrate the power and merits of novel families of stochastic models (e.g., stochastic configuration networks) for data modelling and predictive analytics. Theoretical work exploring the utility of classical and novel concentration-of-measure phenomena, such as stochastic separation theorems for efficient learning, and the universal approximation property of recurrent randomized learner models is also solicited. The session will stress the significance and usefulness of randomized algorithms for building neural networks without backpropagation, with a focus on industrial and applied scenarios.
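As a concrete illustration of the kind of backpropagation-free training the session centres on, the sketch below fits a random vector functional-link (RVFL) net: the hidden weights are drawn at random and frozen, and only the output weights are computed in closed form by ridge regression. This is a minimal sketch under stated assumptions, not a prescribed implementation; the function names, the tanh activation, and the regularization parameter `lam` are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_rvfl(X, y, n_hidden=100, lam=1e-3):
    """Fit an RVFL net: random fixed hidden layer, closed-form output weights."""
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights (never trained)
    b = rng.standard_normal(n_hidden)                # random biases (never trained)
    H = np.tanh(X @ W + b)                           # random hidden features
    D = np.hstack([H, X])                            # direct input-to-output links, as in RVFL
    # Ridge regression on the output layer: the only "learning" step, no backpropagation.
    beta = np.linalg.solve(D.T @ D + lam * np.eye(D.shape[1]), D.T @ y)
    return W, b, beta

def predict_rvfl(X, W, b, beta):
    return np.hstack([np.tanh(X @ W + b), X]) @ beta

# Toy usage: regress a noisy sine wave.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
W, b, beta = fit_rvfl(X, y)
print("train MSE:", np.mean((predict_rvfl(X, W, b, beta) - y) ** 2))
```

The single linear solve replaces iterative gradient descent; randomizing and freezing the hidden layer is what makes the output-layer least-squares fit sufficient.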
Topics of interest include but are not limited to:
- Universal approximation property of randomized learner models
- Convolutional randomized algorithms
- Random projection
- Stochastic configuration networks
- Concentration of measure and stochastic separation theory
- Regularization theory, model evaluation and selection criteria
- Robust randomized algorithms for uncertain data analytics
- Randomized fuzzy systems
- Applications and case studies
Keywords: randomized algorithms, stochastic configuration networks, stochastic separation theorems, random vector functional-link nets, echo state networks
Special Session Chairs
- Dianhui Wang (dh.wang@latrobe.edu.au), China University of Mining and Technology, China
- Ivan Tyukin (ivan.tyukin@kcl.ac.uk), King's College London, UK
- Simone Scardapane (simone.scardapane@uniroma1.it), Sapienza University of Rome, Italy