Ned Cooper
I'm a postdoctoral researcher in the DesignAI lab at Cornell, working on human-AI interaction design and AI governance. I'm also an affiliate of the MINT and Social-Centered AI labs.
My research asks how those affected by ML systems can influence their development and evaluation. This includes studying how participation works (and fails) in ML pipelines, creating responsible AI design frameworks in areas like mental health and Indigenous languages, and evaluating LLM capabilities like theory of mind and persuasion.
Before research, I spent eight years in strategy consulting, infrastructure delivery, and human rights law -- including managing regulatory policy for Australia's national broadband rollout and criminal defense work with Aboriginal communities.
Email: ned [dot] cooper [at] cornell [dot] edu
SELECTED RECENT PUBLICATIONS:
Designing AI
- Ned Cooper, Jose A. Guridi, Angel Hsing-Chi Hwang, Beth Kolko, Emma Elizabeth McGinty, Qian Yang. 2026. Framing Responsible Design of AI for Mental Well-Being: AI as Primary Care, Nutritional Supplement, or Yoga Instructor? ACM Conference on Human Factors in Computing Systems (CHI).
- Ned Cooper and Alexandra Zafiroglu. 2025. Constraining Participation: Affordances of Feedback Features in Interfaces to Large Language Models. ACM Journal on Responsible Computing.
- Ben Hutchinson, Celeste Rodríguez Louro, Glenys Collard, and Ned Cooper. 2025. Designing Speech Technologies for Australian Aboriginal English: Opportunities, Risks and Participation. ACM Conference on Fairness, Accountability, and Transparency (FAccT).
Evaluating AI
- Jared Moore, Rasmus Overmark*, Ned Cooper*, Beba Cibralic, Nick Haber, Cameron R. Jones. 2026. Large Language Models Persuade Without Planning Theory of Mind. arXiv.
- Jared Moore, Ned Cooper*, Rasmus Overmark*, Beba Cibralic, Nick Haber, Cameron R. Jones. 2025. Do Large Language Models Have a Planning Theory of Mind? Evidence from MINDGAMES: a Multi-Step Persuasion Task. Second Conference on Language Modeling (COLM).
*Equal contribution