Speakers: Decebal Constantin Mocanu, Elena Mocanu, Tiago Pinto, Zita Vale
Tutorial website: https://sites.google.com/view/ecai2020-one-billion-neurons
Title: Scalable Deep Learning: How far is one billion neurons?
Learning is a fundamental task for artificial intelligence. Deep neural networks have proven effective across all major learning paradigms, i.e. supervised, unsupervised, and reinforcement learning. Nevertheless, traditional deep learning approaches rely on cloud computing facilities and do not scale well to autonomous agents with low computational resources. Even in the cloud, they suffer from computational and memory limitations and cannot properly model large physical worlds for agents that would require networks with billions of neurons. In recent years, these issues have been addressed by the emerging topic of scalable deep learning, which makes use of static and adaptive sparse connectivity in neural networks before and throughout training. The tutorial covers these research directions in two parts, focusing on theoretical advancements, practical applications, and hands-on experience.
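To make the idea of adaptive sparse connectivity concrete, below is a minimal NumPy sketch of a prune-and-regrow cycle in the spirit of Sparse Evolutionary Training (Mocanu et al., 2018), one of the methods behind this line of work: the weakest active connections are removed and the same number are regrown at random positions, so the layer stays sparse throughout training. The function names, the density, and the pruning fraction `zeta` are illustrative assumptions, not the tutorial's actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_sparse_weights(n_in, n_out, density=0.1):
    """Initialize a sparse weight matrix: most entries are zero."""
    mask = rng.random((n_in, n_out)) < density
    weights = np.where(mask, rng.standard_normal((n_in, n_out)) * 0.1, 0.0)
    return weights, mask

def prune_and_regrow(weights, mask, zeta=0.3):
    """One adaptive-sparsity step (illustrative): drop the fraction
    `zeta` of active connections with the smallest magnitude, then
    regrow the same number at random currently-zero positions."""
    active = np.flatnonzero(mask)
    n_drop = int(zeta * active.size)
    # Prune: remove the weakest active connections.
    weakest = active[np.argsort(np.abs(weights.flat[active]))[:n_drop]]
    mask.flat[weakest] = False
    weights.flat[weakest] = 0.0
    # Regrow: activate an equal number of random inactive positions.
    inactive = np.flatnonzero(~mask)
    regrown = rng.choice(inactive, size=n_drop, replace=False)
    mask.flat[regrown] = True
    weights.flat[regrown] = rng.standard_normal(n_drop) * 0.1
    return weights, mask

# Example: a 784x300 layer at 10% density keeps its density constant.
w, m = init_sparse_weights(784, 300, density=0.1)
w, m = prune_and_regrow(w, m, zeta=0.3)
print(f"density after one step: {m.mean():.3f}")  # ~0.100
```

Because the number of active connections never grows, memory and compute stay proportional to the chosen density rather than to the dense layer size, which is what makes such networks a candidate for scaling toward very large neuron counts.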
Decebal Constantin Mocanu (University of Twente, Eindhoven University of Technology)
Elena Mocanu (University of Twente)
Tiago Pinto (Polytechnic Institute of Porto)
Zita Vale (Polytechnic Institute of Porto)