Project

Augmenting SGD optimizers with low-dimensional second-order information

Positions: Student Researcher

Created: 2023-10-22 Deadline:

Location: Poland

SGD optimization is currently dominated by first-order methods like Adam. Augmenting them with second-order information would suggest, e.g., the optimal step size. Such an online parabola model can be maintained nearly for free by extracting linear trends from the gradient sequence (arXiv: 1907.07063), and the plan is to include it to improve standard methods like Adam.
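The idea can be illustrated in one dimension: online linear regression of the gradient against the position estimates the curvature of a local parabola, whose vertex suggests the step. The sketch below is only an illustration under simplifying assumptions (exponentially weighted statistics, a noiseless toy quadratic loss, illustrative constants like `beta`); it is not the method from the paper.

```python
# Minimal 1D sketch of an online parabola model for step-size selection.
# Exponentially weighted averages of theta, g, theta*g, theta^2 give an
# online linear fit g(theta) ~ lam * (theta - p): lam estimates the second
# derivative, and p = mean_theta - mean_g / lam is the modeled minimum.

def parabola_step(theta, grad_fn, stats, beta=0.9, lr=0.1):
    g = grad_fn(theta)
    # update exponentially weighted averages (initialized on first call)
    for key, val in (("t", theta), ("g", g), ("tg", theta * g), ("tt", theta * theta)):
        stats[key] = beta * stats.get(key, val) + (1 - beta) * val
    cov = stats["tg"] - stats["t"] * stats["g"]   # cov(theta, g)
    var = stats["tt"] - stats["t"] ** 2           # var(theta)
    if var > 1e-12 and cov / var > 0:             # positive curvature: trust the parabola
        lam = cov / var                           # estimated curvature (2nd derivative)
        return stats["t"] - stats["g"] / lam      # jump toward the modeled vertex
    return theta - lr * g                         # otherwise: plain gradient step

# toy usage: minimize f(x) = (x - 3)^2, whose gradient is 2*(x - 3)
stats = {}
x = 0.0
for _ in range(50):
    x = parabola_step(x, lambda t: 2.0 * (t - 3.0), stats)
# x converges to the minimum at 3.0
```

For an exact quadratic the regression recovers the curvature exactly, so the modeled vertex coincides with the true minimum after a couple of steps; on noisy stochastic gradients the averaging over the gradient sequence is what makes the estimate usable "nearly for free".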

General must-have requirements

The student needs to know the basics of TensorFlow or PyTorch; experience in mathematical analysis is preferred.

Contact: Jarek Duda (jaroslaw.duda [ at ] uj.edu.pl)

Project's lab:

GMUM (Group of Machine Learning Research) is a group of researchers working on various aspects of machine learning, and in particular deep learning - in both fundamental and applied settings. The group is led by prof. Jacek Tabor. We are based in the Jagiellonian University in the beautiful city of Kraków, Poland.

Some of the research directions our group pursues include:

  • Generative models: efficient training and sampling; inpainting; super-resolution,
  • Theoretical understanding of deep learning and optimization,
  • Natural language processing,
  • Drug design and cheminformatics,
  • Unsupervised learning and clustering,
  • Computer vision and medical image analysis.

In 2023, we organized the second edition of the Machine Learning Summer School (MLSS^S) with a focus on Applications in Science. We invite participants to collaborate with us on various ongoing research projects.

See the lab's page.