- News
- Short info
- Time and place
- Communication with teachers
- Exam/projects results
- Course program
- Projects proposals
- Bibliography
- Useful links
- Projects added to the projects folder
- The second round of the exam will take place on May 17th at 17:00.
- The exam will take place online right after the 11th lecture on May 3rd. Topics to learn: the first 10 lectures. Participants should e-mail [email protected] in advance with the subject [MM MLDL Exam], stating their name, course, and group in the e-mail body.
- Projects should be sent to the same address with the subject [MM MLDL Project], stating your name, course, and group in the e-mail body, plus a link to the code and a link to the presentation with results.
- The first lecture will take place on Wednesday, February 8th, at 16:45 (online)
In the spring semester of 2023, the Faculty of Mechanics and Mathematics of Lomonosov Moscow State University offers a new elective course dedicated to the theory of machine learning and deep learning.
The course is hosted by the Department of Mathematical Theory of Intelligent Systems under the guidance of Ph.D., Senior Researcher Mazurenko I.L., and is taught by Ph.D. Petiushko A.A.
Classes are to be taught on Wednesdays at 16:45, online.
- Telegram channel, where all important news will appear
- Feedback - by email [email protected]
- Well, you can always open an issue :)
- Investigate Neural Collapse on different datasets (MNIST, Omniglot, LFW, ...)
- Make a comparison study of angular-based losses vs metric-based ones on different datasets (MNIST, Omniglot, LFW, ...)
- Think of an evaluation metric for GAN solutions (aside from Inception Score / Fréchet Inception Distance) and make a comparison study of this metric for different GAN solutions: vanilla GAN, WGAN, WGAN-GP
- Implement and analyze the BNN recognition results using different priors for weights (Uniform, Gaussian, Laplace) on different datasets (MNIST, Omniglot, LFW, ...)
- Do it with Variational Inference
- Do it with MCMC
- Explore the Diffusion generation quality vs number of steps on different datasets (MNIST, Omniglot, LFW, ...)
- Do it with unconditional generation
- Do it with classifier(-free) guidance
- Explore different strategies of the $\alpha$ ($\beta$) decrease schedule
- Make a quantitative and qualitative analysis of different $l_0/l_1/l_2/l_{\infty}$-based Adversarial Attacks (success rate, number of iterations, etc.) on different datasets (MNIST, Omniglot, LFW, ...)
- Do it for the Universal Adversarial Attack as well
- Compare the transferability for different NN architectures (LeNet, VGG, ResNet, etc)
- Create a real-world attack demo for any detection/recognition system
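For the diffusion project, a minimal numpy sketch of two $\beta$ schedules one might compare — linear (DDPM-style) and cosine. The endpoints, step count, and clipping value below are common defaults from the literature, not values prescribed by the course:

```python
# Sketch: comparing a linear and a cosine beta (noise) schedule for a
# diffusion model. Endpoints and T are illustrative DDPM-style defaults.
import numpy as np

def linear_betas(T, beta_start=1e-4, beta_end=0.02):
    # Betas grow linearly, so alphas = 1 - betas decrease linearly
    return np.linspace(beta_start, beta_end, T)

def cosine_betas(T, s=0.008):
    # Cumulative alpha_bar(t) follows a squared cosine; betas are derived
    # from consecutive ratios, then clipped for numerical stability
    t = np.arange(T + 1) / T
    alpha_bar = np.cos((t + s) / (1 + s) * np.pi / 2) ** 2
    betas = 1 - alpha_bar[1:] / alpha_bar[:-1]
    return np.clip(betas, 0.0, 0.999)

T = 1000
for name, betas in [("linear", linear_betas(T)), ("cosine", cosine_betas(T))]:
    alpha_bar = np.cumprod(1.0 - betas)
    # alpha_bar near 0 at the last step means the forward process
    # ends close to pure noise, as intended
    print(name, alpha_bar[-1])
```

Plotting `alpha_bar` over `t` for both schedules (and varying `T`) is a natural starting point for the "generation quality vs number of steps" comparison.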
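For the adversarial-attack project, a toy sketch of a one-step $l_{\infty}$ attack (FGSM-style) on a hand-made numpy logistic-regression classifier. The model weights, input, and epsilon are illustrative assumptions, not course-provided values:

```python
# Sketch: one-step l_inf (FGSM-style) attack on a toy numpy
# logistic-regression model. All concrete numbers are illustrative.
import numpy as np

# Toy binary classifier: p(y=1|x) = sigmoid(w.x + b)
w = np.array([2.0, -1.0])
b = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(x):
    return sigmoid(x @ w + b)

def fgsm(x, y, eps):
    """One-step l_inf attack: x' = x + eps * sign(grad_x loss)."""
    p = predict(x)
    # Gradient of binary cross-entropy w.r.t. the input is (p - y) * w
    grad_x = (p - y) * w
    return x + eps * np.sign(grad_x)

x = np.array([1.0, 1.0])  # clean input with true label y = 1
y = 1.0
x_adv = fgsm(x, y, eps=0.5)

print(predict(x))      # confidence on the clean input
print(predict(x_adv))  # confidence drops after the attack
```

The same one-liner update, applied iteratively with projection back into the epsilon-ball, gives a PGD-style baseline; success rate vs epsilon on real datasets is then the quantitative part of the study.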
- Machine Learning Lecture Course by Vorontsov K.V. on http://www.machinelearning.ru
- Hastie, T., Tibshirani, R., and Friedman, J. The Elements of Statistical Learning, 2nd edition, Springer, 2009.
- Bishop, C.M. Pattern Recognition and Machine Learning, Springer, 2006.
- Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Deep Learning. Cambridge: MIT Press, 2016.
- Matus Telgarsky, Deep learning theory lecture notes, 2021
- Sanjeev Arora et al., Theory of Deep learning book draft, 2020
- Homemade Machine Learning: github repo
- Machine Learning: course by Andrew Ng on https://www.coursera.org