Cascade Processes for Sparse Machine Learning
The project aims to democratize deep learning by developing small-scale, resource-efficient models that enhance fairness, robustness, and interpretability through innovative techniques from statistical physics and network science.
Project details
Introduction
Deep learning continues to achieve impressive breakthroughs across disciplines and is a major driving force behind many industry innovations. Most of its successes come from increasingly large neural networks trained on massive data sets. Developing such models incurs costs that only a few labs can afford, which prevents global participation in the creation of related technologies.
Challenges of Large Models
The huge model sizes also pose computational challenges for algorithms that aim to address properties critical in real-world applications, such as:
- Fairness
- Adversarial robustness
- Interpretability
Neural networks' demand for vast amounts of data further limits their utility for solving highly relevant tasks in biomedicine, economics, and the natural sciences.
Goals of the Project
To democratize deep learning and to broaden its applicability, we have to find ways to learn small-scale models. To that end, we will:
- Promote sparsity at multiple stages of the machine learning pipeline.
- Identify models that are scalable, resource- and data-efficient, robust to noise, and provide insights into problems.
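A common baseline for promoting sparsity at the model level is magnitude pruning, which removes the smallest-magnitude weights. The sketch below is purely illustrative of what "promoting sparsity" can mean; it is not the project's method, and all function names are our own:

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the `sparsity` fraction of weights with the smallest magnitude.

    A generic sparsification baseline, shown only to illustrate the idea
    of sparse models; the project's actual techniques are not specified
    in this abstract.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)          # number of weights to remove
    mask = np.ones(weights.shape, dtype=bool)
    if k > 0:
        # k-th smallest absolute value becomes the pruning threshold
        threshold = np.partition(flat, k - 1)[k - 1]
        mask = np.abs(weights) > threshold
    return weights * mask, mask

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64))
pruned, mask = magnitude_prune(w, 0.9)     # keep roughly 10% of the weights
```

In practice such pruning is interleaved with training or followed by fine-tuning; a one-shot prune like this mainly serves to show how a sparsity mask is derived.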
Overcoming Challenges
To achieve this, we need to overcome two challenges:
- The identification of trainable sparse network structures.
- The de novo optimization of small-scale models.
Proposed Solutions
The solutions that we propose combine ideas from statistical physics, complex network science, and machine learning. Our fundamental innovations rely on the insight that neural networks belong to a class of cascade models that we have made analytically tractable on random graphs.
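To make the cascade-model connection concrete, the sketch below simulates one textbook member of that class, a threshold cascade on an Erdős–Rényi random graph, where a node activates once a fraction `theta` of its neighbors is active. This is a standard illustration only; the project's analytically tractable formulation is not public in this abstract:

```python
import numpy as np

def threshold_cascade(n, avg_degree, theta, seed_frac, rng):
    """Simulate a threshold cascade on an Erdős–Rényi random graph G(n, p).

    Starting from a random seed set, a node activates once at least a
    fraction `theta` of its neighbors is active; iteration stops at a
    fixed point. Returns (initial, final) active fractions.
    """
    p = avg_degree / (n - 1)
    upper = np.triu(rng.random((n, n)) < p, k=1)
    adj = (upper | upper.T).astype(float)   # symmetric, no self-loops
    deg = np.maximum(adj.sum(axis=1), 1.0)
    active = rng.random(n) < seed_frac
    initial = active.mean()
    while True:
        frac_active = (adj @ active.astype(float)) / deg
        new = active | (frac_active >= theta)
        if np.array_equal(new, active):     # fixed point reached
            return initial, active.mean()
        active = new

initial, final = threshold_cascade(
    n=500, avg_degree=8.0, theta=0.25, seed_frac=0.1,
    rng=np.random.default_rng(1))
```

Because activation is monotone, the dynamics always reach a fixed point; on sparse random graphs such processes admit the kind of analytical treatment the proposal alludes to.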
Expected Innovations
Advancing our derivations will enable us to develop novel parameter initialization, regularization, and reparameterization methods that will compensate for the missing implicit benefits of overparameterization for learning. The significant reduction in model size achieved by our methods will help unlock the full potential of deep learning to serve society as a whole.
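One standard way initialization can compensate for sparsity, offered here only as a hedged illustration and not as the project's scheme, is to scale each unit's weights by its effective (masked) fan-in rather than the dense fan-in:

```python
import numpy as np

def sparse_he_init(mask, rng):
    """He-style initialization adapted to a fixed sparsity mask.

    Each output unit's weights are drawn with variance 2 / fan_in, where
    fan_in counts only the unmasked incoming connections. This preserves
    activation scale under sparsity; it is a well-known correction shown
    as a sketch, not the project's actual initialization method.
    """
    fan_in = np.maximum(mask.sum(axis=1, keepdims=True), 1)
    weights = rng.normal(size=mask.shape) * np.sqrt(2.0 / fan_in)
    return weights * mask

rng = np.random.default_rng(0)
mask = rng.random((32, 1024)) < 0.5   # ~50% of connections kept
w = sparse_he_init(mask, rng)
```

With the dense fan-in instead, a 50%-sparse layer would produce pre-activations with roughly half the intended variance, which is one of the "missing implicit benefits" a sparse-aware scheme must restore.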
Financial details & Timeline
Financial details
Grant amount | € 1.499.285 |
Total project budget | € 1.499.285 |
Timeline
Start date | 1-12-2023 |
End date | 30-11-2028 |
Subsidy year | 2023 |
Partners & Locations
Project partners
- CISPA - HELMHOLTZ-ZENTRUM FUR INFORMATIONSSICHERHEIT GGMBH (lead partner)
Country(ies)
Similar projects within the European Research Council
Project | Scheme | Amount | Year |
---|---|---|---|
Reconciling Classical and Modern (Deep) Machine Learning for Real-World Applications | ERC Consolid... | € 1.999.375 | 2023 |
Scalable Learning for Reproducibility in High-Dimensional Biomedical Signal Processing: A Robust Data Science Framework | ERC Starting... | € 1.500.000 | 2022 |
Using deep learning to understand computations in neural circuits with Connectome-constrained Mechanistic Models | ERC Consolid... | € 1.997.321 | 2023 |
From reconstructions of neuronal circuits to anatomically realistic artificial neural networks | ERC Proof of... | € 150.000 | 2022 |
Collaborative Machine Intelligence | ERC Consolid... | € 2.000.000 | 2025 |
Reconciling Classical and Modern (Deep) Machine Learning for Real-World Applications
APHELEIA aims to create robust, interpretable, and efficient machine learning models that require less data by integrating classical methods with modern deep learning, fostering interdisciplinary collaboration.
Scalable Learning for Reproducibility in High-Dimensional Biomedical Signal Processing: A Robust Data Science Framework
ScReeningData aims to develop a scalable learning framework to enhance statistical robustness and reproducibility in high-dimensional data analysis, reducing false positives across scientific domains.
Using deep learning to understand computations in neural circuits with Connectome-constrained Mechanistic Models
This project aims to develop a machine learning framework that integrates mechanistic modeling and deep learning to understand neural computations in Drosophila melanogaster's circuits.
From reconstructions of neuronal circuits to anatomically realistic artificial neural networks
This project aims to enhance artificial neural networks by extracting wiring principles from brain connectomics to improve efficiency and reduce training data needs for deep learning applications.
Collaborative Machine Intelligence
CollectiveMinds aims to revolutionize machine learning by enabling decentralized, collaborative model updates to reduce resource consumption and democratize AI across various sectors.