Deep Learning (DL) is growing in popularity because it solves complex problems in machine learning by exploiting multi-scale, multi-layer architectures that make better use of patterns in the data. In multi-scale machine perception tasks such as object and speech recognition, DL-based systems have recently outperformed systems that had been under development for many years. The principles of DL, and its ability to capture multi-scale representations, are very general, and the technology can be applied to many other problem domains, which makes it quite attractive. Many open problems and challenges still exist, e.g., interpretability, computational and time costs, repeatability of results, convergence, the ability to learn from very small amounts of data, and the ability to evolve dynamically and continue learning. The Symposium will provide a forum for discussing new DL advances and challenges and for brainstorming new solutions and directions among top scientists, researchers, professionals, practitioners, and students with an interest in DL and related areas, including applications in autonomous transportation, communications, medicine, and financial services.
Topics
Topics of IEEE DL include, but are not limited to:
- Unsupervised, semi-supervised, and supervised learning
- Deep reinforcement learning (deep value function estimation, policy learning, and stochastic control)
- Memory Networks and differentiable programming
- Implementation issues (software and hardware)
- Dimensionality expansion and sparse modeling
- Learning representations from large-scale data
- Multi-task learning
- Learning from multiple modalities
- Weakly supervised learning
- Metric learning and kernel learning
- Hierarchical models
- Interpretable DL
- Fuzzy rule-based DL
- Non-iterative DL
- Recursive DL
- Repeatability of results in DL
- Convergence in DL
- Incremental DL
- Evolving DL
- Fast DL
- Applications in:
  - Image/video
  - Audio/speech
  - Natural language processing
  - Robotics, navigation, control
  - Games
  - Cognitive architectures
  - AI