The BootCamp:

Here’s your roadmap:

👣 Start from the top and move down, one row at a time.
  1. 🚀 Take the Course on Google Colab.
  2. 🎯 Run the PartyTime Project on Google Colab.
  3. 🌟 Add the complexities of a professional setting to the PartyTime projects and prepare them for deployment.
  4. 🎓 Want to learn it all, and even more systematically? Apply to our BootCamp.

Chapters:

| Courses | PartyTime Projects | Advanced Projects |
| --- | --- | --- |
| Ch1. Linear Regression | Kaggle House Prices | Kaggle House Prices: Like a Pro |
| Ch2. Loss Function & Gradient Descent | 🔒 Predict WHO Urban PM2.5 | 🔒 Predict WHO Urban PM2.5: Like a Pro |
| Ch3. Regularization: Lasso, Ridge & Elastic Net Regression Models | 🔒 Movie Recommender Systems | 🔒 Movie Recommender Systems: Like a Pro |
| Ch4. Logistic Regression | coming soon | coming soon |



Physics-Inspired Machine Learning:

Research Notes & their Videos (Optional)

While the following materials are not required to finish the BootCamp, going through them will deepen your understanding of the underlying concepts.

The entire field of machine learning & deep learning can be interpreted through a single equation: \[P=\frac{e^{-F}}{Z}\] And we always need to ask one question: "How do we find \(F\) using data?"
We will answer this question for various machine learning models throughout this program.
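To make the equation concrete, here is a minimal numerical sketch of \(P = e^{-F}/Z\) for a toy discrete system. The per-state values of \(F\) below are hypothetical, chosen only for illustration; the point is that \(Z\) (the partition function) normalizes the exponentials into a probability distribution, and states with lower \(F\) receive higher probability.

```python
import numpy as np

# Toy "free energy" assigned to each of four discrete states (hypothetical values).
F = np.array([0.5, 1.0, 2.0, 4.0])

# Partition function: the normalizer in P = exp(-F) / Z.
Z = np.sum(np.exp(-F))

# Boltzmann/Gibbs-style probabilities: they sum to 1,
# and a lower F means a higher probability.
P = np.exp(-F) / Z

print("P =", P)
print("sum(P) =", P.sum())
```

In machine learning terms, the question "How do we find \(F\) using data?" amounts to choosing a parametric form for \(F\) and fitting its parameters so that \(P\) matches the observed data, which is exactly what the chapters above do model by model.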

| Notes | Subjects |
| --- | --- |
| Ep1. How the Canonical Ensemble Becomes Linear Regression | Linear Regression |
| Ep2. How Entropy Becomes the Loss Function of Linear Regression | Loss Function |
| Ep3. Physics-Inspired Residual Sum of Squares | Residual Sum of Squares |
| Ep4. Gradient Descent | Gradient Descent |
| Ep5. The Heisenberg "Principle" of Machine Learning | Bias-Variance Tradeoff |
| Ep6. The Lagrange Multiplier "Method" of Machine Learning | Regularization |
| Ep7. How Do We Find the (N, V, E) Set in Machine Learning? | Feature Engineering and Selection |
| Ep8. The Physics Dualities of Machine Learning | Probabilistic Basis of ML |
| Ep9. The "Quantum Interactions" of Machine Learning | Polynomial Regression |
| Ep10. The "Energy Levels" of Machine Learning | Logistic Regression |
| Ep11. The "Quantum Statistics" of Machine Learning | Multinomial Logistic Regression |
| Ep12. Is Physics' Approach the Next Machine Learning Breakthrough? | Support Vector Machine (SVM) |
| Ep13. The Physics in a Coffee Cup Can Predict in Machine Learning | Kernels |
| Ep14. Can We Use Physics' "Symmetries" in Machine Learning? | K-Nearest Neighbors (KNN) |
| Ep15. This Physics Game Can Train Machine Learning | Decision Tree |
| Ep16. This Quantum Concept Helped Me Understand Machine Learning | K-Means & Gaussian Mixture |
| Ep17. I Built a Mini "GPT DALL-E" in One Day on a Laptop without a GPU | Kernel Density Estimator |
| Ep18. I Didn’t Tag Anyone—So How Did My iPhone Know? | Principal Component Analysis (PCA) |
| Ep19. This Physics-Inspired Microscope Reveals Neural Network Geometry | High-Dimensional Landscape Visualization |
| Ep20. Most Neural Network Training Is Wasted - Here’s the Physics Reason | Symmetries in the Loss Function of Neural Nets |
| Ep21. This Physics Framework Predicts Neural Network Failure Modes | Free Energy of Neural Networks |

How to cite these notes:

If you use or reference material from this collection, please cite as:

Borzou, Ardavan. 2026. Physics-Inspired Machine Learning. CompuFlair. https://compu-flair.com/physics-inspired-ml

BibTeX:

@misc{borzou2026mlphysics,
  author = {Borzou, Ardavan},
  title  = {Physics-Inspired Machine Learning},
  year   = {2026},
  url    = {https://compu-flair.com/physics-inspired-ml},
  note   = {Accessed: 2026-01-08}
}