Yash Kumar Atri

University of Virginia
atri [at] virginia.edu
Happenings
Aug 2025 Patent on text summarization published
May 2025 Invited talk at Lamarr Institute NLProc Colloquium, University of Bonn
Apr 2025 Lifelong Model Editing with Graph-Based External Memory accepted to ACL Findings 2025
Oct 2024 Selected as DAAD AInet Fellow 2024
Jun 2024 Joined UVA as a postdoc
About Me

Hi! I am a postdoctoral researcher at the University of Virginia, working with Prof. Tom Hartvigsen. Before joining UVA, I completed my PhD in Computer Science & Engineering at IIIT Delhi, advised by Prof. Vikram Goyal and Prof. Tanmoy Chakraborty. My research aims to build language systems that learn continuously, adapt responsibly, and evolve alongside human knowledge.

Research Interests

I am interested in enabling language models to continuously learn and adapt beyond their initial pretraining. My current research explores model editing — developing methods to update models efficiently with new information and corrections. I build modular systems that apply each change locally, allowing the model to evolve while retaining its existing knowledge. My long-term vision is to create systems that learn like humans: accumulating knowledge over time, refining their understanding, and remaining robust as language, technology, and societal values change.

Research Directions
Model Editing & Continual Learning (current)
  • Lifelong model editing with graph-based external memory (ACL Findings 2025)
  • Evaluating temporal consistency in multi-turn language models (ACL Main 2026)
Text Summarization
  • Promoting topic coherence via simplicial complex & sheaf graph (EMNLP Main 2023)
  • Fusing multimodal signals in hyper-complex space (SIGKDD 2023)
Data Quality & Evaluation
  • Inline citation classification with peripheral context & time-evolving augmentation (PAKDD 2023)
  • Assessing the quality of datasets by identifying mislabeled samples (ASONAM 2021)

Publications

8 papers · 2 preprints · 1 patent
Evaluating Temporal Consistency in Multi-Turn Language Models
Yash Kumar Atri, Steven L. Johnson, Thomas Hartvigsen
Lifelong Model Editing with Graph-Based External Memory
Yash Kumar Atri, Ahmed Alaa, Thomas Hartvigsen
Continually Self-Improving Language Models for Bariatric Surgery Question Answering
Yash Kumar Atri, Thomas H Shin, Thomas Hartvigsen
Promoting Topic Coherence and Inter-Document Consorts in Multi-Document Summarization via Simplicial Complex and Sheaf Graph
Yash Kumar Atri, Arun Iyer, Tanmoy Chakraborty, Vikram Goyal
Fusing Multimodal Signals on Hyper-Complex Space for Extreme Abstractive Text Summarization
Yash Kumar Atri, Tanmoy Chakraborty, Vikram Goyal

Experience & Education

Experience
Postdoctoral Researcher
2024 – Present
University of Virginia, USA
Research Associate
2019
LCS2, IIIT Delhi, India
Software Engineer (Data Science)
2018 – 2019
Lumiq.ai, Noida, India
Awards
DAAD AInet Fellow 2024
Postdoc-NET-AI, Germany
Travel Grants
Microsoft, iHub-Anubhuti, ACM-India — EMNLP 2023, KDD 2023
Education
Ph.D. Computer Science & Engineering
2020 – 2024
IIIT Delhi, India
Advisors: Prof. Vikram Goyal, Prof. Tanmoy Chakraborty
Thesis: Advancing Text Summarization with Conscience, Comprehension, and Multimodality
B.Tech. Computer Science & Engineering
2014 – 2018
Jaypee University, India
Advisor: Dr. Amit Kumar
Thesis: Machine Translation in Indian Languages
Professional Service
Area Chair — ACL Rolling Review (ARR) 2025
Reviewer — ICLR 2024, ICLR 2025, ACL 2023, EMNLP 2023, and 10+ other venues
Organizer — BDA 2023, ICON 2023, ACSS Workshops (2020–2022)
Teaching & Mentoring
Teaching Assistant: CSE557 (W2020, W2021), CSE506 (S2020, S2021) — IIIT Delhi
Guest Lecturer: Tutorial on AI-Driven Mental Health Counseling — ICON 2023
Mentor: Guided undergraduate research interns on summarization & fairness (2021–2023)