publications

2021

  1. Mention Memory: incorporating textual knowledge into Transformers through entity mention attention
    Michiel de Jong*, Yury Zemlyanskiy*, Nicholas FitzGerald, Fei Sha, and William Cohen
    2021
  2. ReadTwice: Reading Very Large Documents with Memories
    Yury Zemlyanskiy, Joshua Ainslie, Michiel de Jong, Philip Pham, Ilya Eckstein, and Fei Sha
    NAACL-HLT 2021
  3. DOCENT: Learning Self-Supervised Entity Representations from Large Document Collections
    Yury Zemlyanskiy, Sudeep Gandhe, Ruining He, Bhargav Kanagal, Anirudh Ravula, Juraj Gottweis, Fei Sha, and Ilya Eckstein
    EACL 2021

2019

  1. Self-Attentive, Multi-Context One-Class Classification for Unsupervised Anomaly Detection on Text
    Lukas Ruff, Yury Zemlyanskiy, Robert A. Vandermeulen, Thomas Schnake, and Marius Kloft
    ACL 2019

2018

  1. Aiming to Know You Better Perhaps Makes Me a More Engaging Dialogue Partner
    Yury Zemlyanskiy and Fei Sha
    CoNLL 2018

2014

  1. Extracting translation pairs from social network content
    Matthias Eck, Yury Zemlyanskiy, Joy Zhang, and Alex Waibel
    IWSLT 2014