Researchers use statistical physics and "toy models" to explain how neural networks avoid overfitting and stabilize learning in high-dimensional spaces.
Catalog description: Presents the underlying theory behind machine learning in a proofs-based format. Answers fundamental questions about what learning means and what can be learned via formal models of ...
This course provides foundational and advanced concepts in statistical learning theory, essential for analyzing complex data and making informed predictions. Students will delve into both asymptotic ...
Artificial intelligence systems based on neural networks—such as ChatGPT, Claude, DeepSeek, or Gemini—are extraordinarily powerful, yet their internal workings remain largely a "black box." To better ...
Physicists at Harvard University have developed a simplified, physics-inspired mathematical model to better understand how neural networks learn, potentially explaining why large AI systems often ...
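The "toy model" approach mentioned above can be illustrated with a minimal sketch. The code below is a hypothetical example, not the Harvard model itself: a standard teacher-student linear setup from the statistical-physics literature, where a "student" learns a random "teacher" rule by gradient descent in high dimension.

```python
# Hypothetical illustration (not the specific model from the article):
# a linear teacher-student toy model, a common statistical-physics
# setup for studying learning dynamics in high-dimensional spaces.
import numpy as np

rng = np.random.default_rng(0)
d, n, lr, steps = 50, 200, 0.1, 500       # dimension, samples, step size, iterations

w_teacher = rng.normal(size=d) / np.sqrt(d)  # ground-truth "teacher" weights
X = rng.normal(size=(n, d))                  # random high-dimensional inputs
y = X @ w_teacher                            # noiseless teacher labels

w = np.zeros(d)                              # student starts from zero
for _ in range(steps):
    grad = X.T @ (X @ w - y) / n             # gradient of the mean-squared error
    w -= lr * grad

train_loss = np.mean((X @ w - y) ** 2)       # fit on the training data
gen_error = np.mean((w - w_teacher) ** 2)    # distance to the teacher rule
print(f"train MSE: {train_loss:.2e}, weight error: {gen_error:.2e}")
```

Because such models admit exact analysis (e.g., via random-matrix tools), one can track quantities like the student-teacher overlap analytically, which is what makes them useful for explaining learning behavior in large networks.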