I am on the academic job market for positions starting in 2026.
Yash Kumar Atri


University of Virginia
atri [at] virginia.edu

Happenings

Aug 2025 – Patent on text summarization granted (announcement)
May 2025 – Invited talk at the Lamarr Institute NLProc Colloquium, University of Bonn
Apr 2025 – Our paper "Lifelong Model Editing with Graph-Based External Memory" accepted to ACL Findings 2025
Oct 2024 – Selected as a DAAD AInet Fellow 2024
Jun 2024 – Joined UVA as a postdoc

About Me

Hi! I am a postdoc at the University of Virginia, working with Prof. Tom Hartvigsen. Before joining UVA, I completed my PhD in Computer Science & Engineering at IIIT Delhi, where I was advised by Prof. Vikram Goyal and Prof. Tanmoy Chakraborty.

Research Interests

I am interested in enabling language models to continuously learn and adapt beyond their initial pretraining. My current research explores model editing: developing methods to update models efficiently with new information and corrections. I build modular systems that apply each change locally, allowing the model to evolve while retaining its existing knowledge. My long-term vision is to create systems that learn like humans: accumulating knowledge over time, refining their understanding, and remaining robust as language, technology, and societal values change.

Research Directions

Model Editing & Continual Learning (current)

  • Lifelong Model Editing with Graph-Based External Memory (ACL Findings 2025)
  • Continually Self-Improving Language Models for Bariatric Surgery QA (preprint, 2025)

Text Summarization

  • Promoting Topic Coherence via Simplicial Complex & Sheaf Graph (EMNLP Main 2023)
  • Fusing Multimodal Signals in Hyper-Complex Space (SIGKDD 2023)

Data Quality & Evaluation

  • Inline Citation Classification with Peripheral Context & Time-Evolving Augmentation (PAKDD 2023)
  • Assessing the Quality of the Datasets by Identifying Mislabeled Samples (ASONAM 2021)

Publications

Papers: 8
Preprints: 2
Patents: 1

Lifelong Model Editing with Graph-Based External Memory

Yash Kumar Atri, Ahmed Alaa, Thomas Hartvigsen

Paper · ACL (Findings) 2025

Continually Self-Improving Language Models for Bariatric Surgery Question Answering

Yash Kumar Atri, Thomas H. Shin, Thomas Hartvigsen

Paper · Preprint 2025

Promoting Topic Coherence and Inter-Document Consorts in Multi-Document Summarization via Simplicial Complex and Sheaf Graph

Yash Kumar Atri, Arun Iyer, Tanmoy Chakraborty, Vikram Goyal

Paper · EMNLP Main 2023

Fusing Multimodal Signals on Hyper-Complex Space for Extreme Abstractive Text Summarization

Yash Kumar Atri, Tanmoy Chakraborty, Vikram Goyal

Paper · SIGKDD 2023

Invited Talks

May 2025 – Lamarr Institute NLProc Colloquium, University of Bonn – "Waking LLMs from Cryosleep with Continual Learning"
Feb 2024 – ACM-India ARCS Symposium, NISER Bhubaneswar – "Fairness in Abstractive Text Summarization"
Dec 2023 – ICON 2023, University of Goa – Tutorial on Building Blocks of AI-Driven Mental Health Counseling
Mar 2023 – RIISE 2023, IIIT Delhi – Poster presentation

Experience

Postdoctoral Researcher

2024 – present

University of Virginia, USA

Research Associate

2019

LCS2-IIITD, Delhi

Software Engineer (Data Science)

2018 – 2019

Lumiq.ai, Noida

Awards

DAAD AInet Fellow 2024 – Postdoc-NeT-AI, Germany

Microsoft Travel Grant and iHub-Anubhuti Travel Grant – EMNLP 2023

Microsoft, iHub-Anubhuti-IIITD-Foundation, and ACM-India Travel Grants – KDD 2023

Education

Ph.D., Computer Science and Engineering

IIIT Delhi, India · 2020 – 2024

Advisors: Prof. Vikram Goyal, Prof. Tanmoy Chakraborty

Thesis title: Advancing Text Summarization with Conscience, Comprehension, and Multimodality

B.Tech., Computer Science and Engineering

Jaypee University, India · 2014 – 2018

Advisor: Dr. Amit Kumar

Thesis title: Machine Translation in Indian Languages

Services

Reviewer: ARR (2024, 2025); ICLR (2025); TCSS (2024, 2025); TASL (2024); KBS (2024); ASONAM (2024); ICON (2024); EMNLP (2023); ACL (2023); BDA (2023)

Organizer: BDA 2023 (conference); ICON 2023 (conference); ACSS 2020–2022 (workshops); CoFAD 2020 (workshop)

Teaching Assistant: CSE557 (W2020, W2021); CSE506 (S2020, S2021)

Get in touch