LLM Attention Explorer
Visualize per-layer, per-head attention matrices for GPT-style models. Load the sample data or upload your own JSON.
JSON format
{
  "tokens": ["I", "love", "AI", "!"],
  "attentions": [                 // shape: [n_layers][n_heads][n_tokens][n_tokens]
    [                             // layer 0: one matrix per head
      [[...],[...],[...],[...]],  // head 0: n_tokens x n_tokens matrix
      [[...],[...],[...],[...]]   // head 1
    ],
    ...                           // one entry per layer
  ]
}
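A minimal sketch of producing a file in this format. The function name `export_attention` and the use of random scores are illustrative assumptions (a real export would take attention weights from a model, e.g. a Hugging Face model called with `output_attentions=True` returns one `(batch, heads, seq, seq)` tensor per layer); the point is only the nesting `[layers][heads][n_tokens][n_tokens]` and row-normalized weights.

```python
import json
import math
import random

def softmax(row):
    # Normalize one row of raw scores so the attention weights sum to 1,
    # as each row of a real attention matrix does.
    m = max(row)
    exps = [math.exp(x - m) for x in row]
    s = sum(exps)
    return [e / s for e in exps]

def export_attention(tokens, n_layers, n_heads, path=None):
    # Hypothetical helper: build a payload matching the JSON format above,
    # filled with dummy attention matrices.
    n = len(tokens)
    attentions = [
        [
            # One n x n matrix per head; each row softmax-normalized.
            [softmax([random.random() for _ in range(n)]) for _ in range(n)]
            for _ in range(n_heads)
        ]
        for _ in range(n_layers)
    ]
    payload = {"tokens": tokens, "attentions": attentions}
    if path is not None:
        with open(path, "w") as f:
            json.dump(payload, f)
    return payload

data = export_attention(["I", "love", "AI", "!"], n_layers=2, n_heads=2)
```

Note that the schema above uses `//` comments for explanation only; a real upload must be strict JSON, with no comments.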