Cascade Processes for Sparse Machine Learning

The project aims to democratize deep learning by developing small-scale, resource-efficient models that enhance fairness, robustness, and interpretability through innovative techniques from statistical physics and network science.

Grant
€ 1.499.285
2023

Project details

Introduction

Deep learning continues to achieve impressive breakthroughs across disciplines and is a major driving force behind a multitude of industry innovations. Most of its successes are achieved by increasingly large neural networks that are trained on massive data sets. Their development incurs costs that only a few labs can afford, which prevents global participation in the creation of related technologies.

Challenges of Large Models

The huge model sizes also pose computational challenges for algorithms that aim to address properties that are critical in real-world applications, such as:

  • Fairness
  • Adversarial robustness
  • Interpretability

Neural networks' high demand for vast amounts of data further limits their utility for highly relevant tasks in biomedicine, economics, and the natural sciences.

Goals of the Project

To democratize deep learning and broaden its applicability, we have to find ways to learn small-scale models. To this end, we will:

  1. Promote sparsity at multiple stages of the machine learning pipeline.
  2. Identify models that are scalable, resource- and data-efficient, robust to noise, and provide insights into problems.
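
As a minimal illustration of the first goal (a sketch of our own with a hypothetical model and hyperparameters, not the project's actual method), sparsity can be promoted at two stages of a standard training pipeline: through an L1 penalty during optimization and through magnitude pruning afterwards.

```python
# Sketch: promote sparsity during and after training (illustrative only).
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
l1_strength = 1e-4  # hypothetical regularization strength

def training_step(x, y):
    """One optimization step with an L1 penalty that encourages prunable weights."""
    optimizer.zero_grad()
    loss = nn.functional.cross_entropy(model(x), y)
    l1_penalty = sum(p.abs().sum() for p in model.parameters())
    (loss + l1_strength * l1_penalty).backward()
    optimizer.step()

def magnitude_prune(sparsity=0.9):
    """After training, zero out the smallest-magnitude weights of each linear layer."""
    with torch.no_grad():
        for module in model:
            if isinstance(module, nn.Linear):
                k = int(sparsity * module.weight.numel())
                threshold = module.weight.abs().flatten().kthvalue(k).values
                module.weight.mul_((module.weight.abs() > threshold).float())
```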

Overcoming Challenges

To achieve this, we need to overcome two challenges:

  1. The identification of trainable sparse network structures.
  2. The de novo optimization of small-scale models.
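
One established recipe for the first challenge, sketched below under our own assumptions (not necessarily the approach the project will take), is iterative magnitude pruning in the spirit of the lottery-ticket hypothesis: train, prune the smallest weights, rewind the survivors to their initial values, and repeat. The `train_fn` interface is hypothetical.

```python
# Sketch: lottery-ticket-style search for a trainable sparse structure (illustrative only).
import copy
import torch

def find_sparse_structure(model, train_fn, rounds=3, keep_per_round=0.5):
    """Iteratively train, prune, and rewind to identify a trainable sparse subnetwork.

    `train_fn(model, masks)` is a caller-supplied training loop that is expected to
    keep masked-out weights at zero. Returns one 0/1 mask per weight matrix.
    """
    initial_state = copy.deepcopy(model.state_dict())
    masks = {name: torch.ones_like(p) for name, p in model.named_parameters() if p.dim() > 1}
    for _ in range(rounds):
        train_fn(model, masks)
        with torch.no_grad():
            # Keep only the largest surviving weights of each layer.
            for name, p in model.named_parameters():
                if name not in masks:
                    continue
                active = p.abs() * masks[name]
                keep = max(1, int(keep_per_round * int(masks[name].sum())))
                threshold = active.flatten().topk(keep).values.min()
                masks[name] = (active >= threshold).float() * masks[name]
            # Rewind the surviving weights to their initial values before the next round.
            model.load_state_dict(initial_state)
            for name, p in model.named_parameters():
                if name in masks:
                    p.mul_(masks[name])
    return masks
```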

Proposed Solutions

The solutions that we propose combine ideas from statistical physics, complex network science, and machine learning. Our fundamental innovations rely on the insight that neural networks belong to a class of cascade models that we have made analytically tractable on random graphs.
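
For readers unfamiliar with this model class, the snippet below simulates a simple threshold cascade on an Erdős–Rényi random graph; it is our own minimal illustration of a cascade process on a random graph, not the analytical treatment developed in the project, and all parameter values are arbitrary.

```python
# Sketch: a threshold cascade on a random graph (illustrative only).
import random
import networkx as nx

def cascade_size(n=1000, avg_degree=4.0, threshold=0.2, seed_fraction=0.01, seed=0):
    """Return the fraction of active nodes once the cascade reaches a fixed point.

    A node activates when at least `threshold` of its neighbours are active.
    """
    rng = random.Random(seed)
    graph = nx.fast_gnp_random_graph(n, avg_degree / n, seed=seed)
    active = {node for node in graph if rng.random() < seed_fraction}
    changed = True
    while changed:
        changed = False
        for node in graph:
            if node in active:
                continue
            neighbours = list(graph.neighbors(node))
            if neighbours and sum(nb in active for nb in neighbours) / len(neighbours) >= threshold:
                active.add(node)
                changed = True
    return len(active) / n  # final cascade size, between 0 and 1

print(cascade_size())
```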

Expected Innovations

Advancing our derivations will enable us to develop novel parameter initialization, regularization, and reparameterization methods that compensate for the implicit benefits of overparameterization that small models otherwise lack. The significant reduction in model size achieved by our methods will help unlock the full potential of deep learning to serve society as a whole.
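
As one hedged example of what such a method could look like (our own illustration under the assumption of a fixed binary sparsity mask, not the initialization the project will derive), the weight variance of each unit can be scaled by its active fan-in instead of the dense fan-in:

```python
# Sketch: He-style initialization rescaled by the active (unpruned) fan-in.
import torch

def sparse_aware_init(weight: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    """Initialize `weight` (out_features x in_features) given a 0/1 sparsity `mask`."""
    active_fan_in = mask.sum(dim=1, keepdim=True).clamp(min=1)  # per-unit fan-in
    std = torch.sqrt(2.0 / active_fan_in)                       # He-style scaling
    return torch.randn_like(weight) * std * mask                # zero where pruned

# Usage with a hypothetical 90%-sparse random mask for a 256 -> 128 layer.
mask = (torch.rand(128, 256) > 0.9).float()
weight = sparse_aware_init(torch.empty(128, 256), mask)
```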

Financial details & Timeline

Financial details

Grant amount: € 1.499.285
Total project budget: € 1.499.285

Timeline

Start date: 1-12-2023
End date: 30-11-2028
Grant year: 2023

Partners & Locations

Project partners

  • CISPA - HELMHOLTZ-ZENTRUM FUR INFORMATIONSSICHERHEIT GGMBH (lead partner)

Country/Countries

Germany

Comparable projects within the European Research Council

ERC Consolidator Grant

Reconciling Classical and Modern (Deep) Machine Learning for Real-World Applications

APHELEIA aims to create robust, interpretable, and efficient machine learning models that require less data by integrating classical methods with modern deep learning, fostering interdisciplinary collaboration.

€ 1.999.375
ERC Starting Grant

Scalable Learning for Reproducibility in High-Dimensional Biomedical Signal Processing: A Robust Data Science Framework

ScReeningData aims to develop a scalable learning framework to enhance statistical robustness and reproducibility in high-dimensional data analysis, reducing false positives across scientific domains.

€ 1.500.000
ERC Consolidator Grant

Using deep learning to understand computations in neural circuits with Connectome-constrained Mechanistic Models

This project aims to develop a machine learning framework that integrates mechanistic modeling and deep learning to understand neural computations in Drosophila melanogaster's circuits.

€ 1.997.321
ERC Proof of Concept

From reconstructions of neuronal circuits to anatomically realistic artificial neural networks

This project aims to enhance artificial neural networks by extracting wiring principles from brain connectomics to improve efficiency and reduce training data needs for deep learning applications.

€ 150.000
ERC Consolidator Grant

Collaborative Machine Intelligence

CollectiveMinds aims to revolutionize machine learning by enabling decentralized, collaborative model updates to reduce resource consumption and democratize AI across various sectors.

€ 2.000.000