• Participated in developing a new version of the Natural Language Processing course; taught lectures and seminars.
• Supervised a student project on compressing BERT (a state-of-the-art language model developed by Google). Team result: BERT was compressed 11x relative to the original with almost no loss of quality on downstream tasks.
Authored and supervised student team projects on the following topics:
• Neural machine translation for Latin.
• PlacePulse for Moscow: automatically recognizing the attractiveness of streets from Google Street View photos.
• CoronaSearch: a search engine for COVID-related scientific preprints.
• Managed the Yandex.Keyboard backend team and interns. Improved the keyboard spellchecker and on-device language models.
• Participated in the transition from a phrase-based machine translation system to a hybrid neural system. Experimented with fusing an n-gram language model with neural machine translation, and developed a classifier for choosing between the two systems.
• Developed a word-embedding-based emoji translator as a fun machine translation project.
• Developed a transliteration service for multiple writing systems and a C++ transliteration library. Also ran several experiments with neural networks for abjad-script transliteration; the Hebrew transliteration model was released and used in the Yandex MT service.
Analyzed fMRI data of brain activity during simple language tasks performed by healthy participants and stroke patients.