Yury Zemlyanskiy

ML / NLP PhD student at ShaLab.

I’m a fifth-year Computer Science Ph.D. student at the University of Southern California, advised by Professor Fei Sha. Before starting graduate school, I worked on Neural Machine Translation on the Applied Machine Learning team at Facebook. I received my Specialist degree (a five-year program) in Computer Science and Mathematics from St. Petersburg State University, with Professor Alexander Vakhitov as my thesis advisor.

I’m broadly interested in Machine Learning and Natural Language Processing, with a focus on text and entity representations, memory-augmented models, and reasoning over text.

news

Jan 2022 Posted a short note on the SimSiam self-distillation objective.
Jan 2022 The Mention Memory paper has been accepted to ICLR 2022. OpenReview forum
Jan 2022 Started an internship at Google Research with hosts Joshua Ainslie and Sumit Sanghai.
Dec 2021 Successfully defended my thesis proposal on “Parametric and semi-parametric methods for knowledge acquisition from text”. slides
Nov 2021 Gave a talk (English/Russian) on knowledge acquisition methods from text at the NTR seminar. video · slides

selected publications

  1. Mention Memory: incorporating textual knowledge into Transformers through entity mention attention
Michiel de Jong*, Yury Zemlyanskiy*, Nicholas FitzGerald, Fei Sha, and William Cohen
ICLR 2022
  2. ReadTwice: Reading Very Large Documents with Memories
Yury Zemlyanskiy, Joshua Ainslie, Michiel de Jong, Philip Pham, Ilya Eckstein, and Fei Sha
    NAACL-HLT 2021
  3. DOCENT: Learning Self-Supervised Entity Representations from Large Document Collections
    Yury Zemlyanskiy, Sudeep Gandhe, Ruining He, Bhargav Kanagal, Anirudh Ravula, Juraj Gottweis, Fei Sha, and Ilya Eckstein
    EACL 2021
  4. Aiming to Know You Better Perhaps Makes Me a More Engaging Dialogue Partner
Yury Zemlyanskiy and Fei Sha
    CoNLL 2018