Inner Speech as Behavior Guides:
Steerable Imitation of Diverse Behaviors for Human-AI Coordination

1Massachusetts Institute of Technology    2Georgia Institute of Technology    3Harvard University

Corresponding author: triver@mit.edu

Spotlight Paper at NeurIPS 2025
(a) Behaviorist framework: Direct stimulus-response mapping

(b) Cognitive framework: Linguistically-mediated action selection

Figure: Contrasting theoretical frameworks for imitation learning. (a) The behaviorist approach models human behavior as a direct mapping from environmental states to actions (st ↦ at), treating cognitive processes as opaque transformations. (b) The cognitive approach instantiated by MIMIC introduces inner speech as a mediational layer (st → mt → at), where mt represents linguistically structured internal deliberation that enables behavioral diversity and contextual adaptation.
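The two factorizations in the figure can be sketched as minimal policy interfaces. This is an illustrative toy, not the paper's implementation; all function names and the one-dimensional navigation state are hypothetical.

```python
def behaviorist_policy(state):
    # Direct stimulus-response mapping: s_t -> a_t.
    return "move_right" if state["goal_x"] > state["x"] else "move_left"

def generate_inner_speech(state):
    # s_t -> m_t : a linguistic description of intent, standing in for
    # the learned inner-speech generator described in the paper.
    side = "right" if state["goal_x"] > state["x"] else "left"
    return f"the goal is to my {side}, I should head that way"

def cognitive_policy(state, inner_speech):
    # (s_t, m_t) -> a_t : action selection mediated by inner speech.
    return "move_right" if "right" in inner_speech else "move_left"

state = {"x": 0.0, "goal_x": 1.0}
speech = generate_inner_speech(state)
# Both factorizations agree here, but only the cognitive one exposes an
# intermediate linguistic variable that can be inspected or overridden.
assert behaviorist_policy(state) == cognitive_policy(state, speech)
```

The point of the extra layer is not the toy behavior itself but that `inner_speech` is a human-readable handle on the agent's intent, which the MIMIC framework exploits for steering.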

Abstract

Effective human-AI coordination requires artificial agents capable of exhibiting and responding to human-like behaviors while adapting to changing contexts. Imitation learning has emerged as a prominent approach to building such agents by training them to mimic human-demonstrated behaviors. However, current methods struggle to capture the inherent diversity and non-Markovian nature of human behavior, and they lack the ability to steer behavior at inference time. Drawing inspiration from theories of human cognition in which inner speech guides action selection before execution, we propose MIMIC (Modeling Inner Motivations for Imitation and Control), a framework that uses language as an internal representation of behavioral intent.

MIMIC makes novel use of vision-language models as linguistic scaffolding to train a conditional variational autoencoder that generates inner speech from observations. A diffusion-based behavior cloning policy then selects actions conditioned on the current observation and the generated inner speech. MIMIC enables fine-grained steering of behavior at inference time by conditioning the agent on behavior-specific speech. Experiments across robotic manipulation tasks and human-AI collaboration games demonstrate that MIMIC significantly enhances both behavior diversity and fidelity to human demonstrations while enabling nuanced behavioral steering without training on additional demonstrations.
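The inference-time loop implied by the abstract can be sketched as follows: a speech generator (standing in for the trained CVAE) proposes inner speech from the current observation, a conditional policy (standing in for the diffusion behavior cloning policy) acts on both, and steering works by substituting behavior-specific speech for the generated one. All class and function names here are illustrative assumptions, not the MIMIC codebase.

```python
class SpeechGenerator:
    """Placeholder for the CVAE: maps an observation to an intent string."""
    def __call__(self, obs):
        return ("approach the object cautiously"
                if obs["dist"] > 0.5 else "grasp the object")

class ConditionalPolicy:
    """Placeholder for the diffusion policy: conditions on obs and speech."""
    def __call__(self, obs, speech):
        if "grasp" in speech:
            return "close_gripper"
        return "move_forward" if obs["dist"] > 0.0 else "stay"

def act(obs, generator, policy, speech_override=None):
    # Steering: a behavior-specific speech string replaces the generated
    # inner speech, so behavior changes without retraining either module.
    speech = speech_override if speech_override is not None else generator(obs)
    return policy(obs, speech)

gen, pol = SpeechGenerator(), ConditionalPolicy()
obs = {"dist": 1.0}
default_action = act(obs, gen, pol)                      # follows generated speech
steered_action = act(obs, gen, pol, "grasp the object")  # overridden intent
```

The design choice to illustrate is that the policy never sees whether the speech was generated or injected, which is what makes steering a pure inference-time intervention.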

Method

Coming Soon

Results

Ablation Studies

Ablation Studies: Analysis of key components in the MIMIC framework.

Generation and Steering Examples

Overcooked

D3IL Benchmark


BibTeX

@inproceedings{trivedi2025inner,
  title     = {Inner Speech as Behavior Guides: Steerable Imitation of Diverse Behaviors for Human-{AI} Coordination},
  author    = {Rakshit Trivedi and Kartik Sharma and David C. Parkes},
  booktitle = {The Thirty-ninth Annual Conference on Neural Information Processing Systems},
  year      = {2025},
  url       = {https://openreview.net/forum?id=AwLRF1lZvI}
}