2 Comments
Margeret Heath:

"To attend is to collapse distance into meaning." On this point (meaning) I might disagree; isn't it not quite "meaning" so much as a tighter coupling of associational likelihoods? We, after all, provide the "meaning" when we read the words. The LLM offers up the most likely weighted arrays of probable couplings (based on training data, not on "contextual" information, which is how we do things), from which WE then infer "meaningfulness", surely? (E.g., "deposit" can as well refer to a river bank in geography as to banking and finance.) Or do I misunderstand?

Unfinished Maps:

That's a great point, and you're right that the model isn't really creating meaning the way we do. What I meant by that line was more about structure than semantics: that attention, whether in us or in a model, folds the field so that things that were far apart suddenly influence each other.

For us, that folding shows up as the feeling of meaning; for the model, it’s just the geometry of associations tightening up. Maybe “collapse distance into coherence” would have been more precise — but “meaning” felt closer to how it lands when we experience it, even though the model itself doesn’t.
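The "weighted couplings" and "folding" under discussion can be made concrete with a toy sketch of scaled dot-product attention. The 2-d vectors below are invented purely for illustration (real models learn high-dimensional embeddings); the point is only that the softmax weights couple an ambiguous token like "deposit" to whichever context token sits nearby:

```python
import numpy as np

def attention(q, K, V):
    """Scaled dot-product attention: weights are a softmax over
    query-key similarity, so relevant tokens get coupled to the query
    regardless of how far apart they sit in the sequence."""
    scores = K @ q / np.sqrt(q.size)   # similarity of the query to each key
    w = np.exp(scores - scores.max())
    w = w / w.sum()                    # softmax -> probability weights
    return w, w @ V                    # weighted blend of value vectors

# Hypothetical 2-d embeddings: axis 0 ~ "finance", axis 1 ~ "geography".
K = V = np.array([
    [1.0, 0.0],   # "withdraw" (finance context token)
    [0.0, 1.0],   # "river"    (geography context token)
])

q_deposit = np.array([0.5, 0.5])  # "deposit" alone is ambiguous

# Shift the query toward whichever context token is present:
w_fin, _ = attention(q_deposit + K[0], K, V)  # "deposit" near "withdraw"
w_geo, _ = attention(q_deposit + K[1], K, V)  # "deposit" near "river"

print(w_fin)  # weights "withdraw" more heavily: the finance coupling tightens
print(w_geo)  # weights "river" more heavily: the geography coupling tightens
```

Nothing here is "meaning" in the human sense; the weights are just the geometry of associations tightening, which is the distinction the two comments are circling.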

Really appreciate you pushing on that distinction though — it’s exactly the kind of nuance that makes this stuff interesting.
