13.02.26, University of Amsterdam, PEPTalks #26. Training to Exhaustion: Linguistic Labor and Model Collapse, online +LINK
In the first talk, Training to Exhaustion: Linguistic Labor and Model Collapse, Paolo Caffoni addresses how LLMs create a phenomenological suspension of scarcity, in which textual outputs proliferate without the immediately visible constraints that ordinarily accompany linguistic production: time, attention, fatigue, and the other limits of embodied linguistic performance. A chatbot can sustain millions of simultaneous exchanges not because it overcomes these limits, but because it circulates sedimented linguistic labor without exposure to exhaustion. Framing language automation through the lens of financialization, Caffoni argues that “model collapse” names not a technical anomaly but a structural limit, one that emerges when linguistic value circulates independently of its conditions of production. Read this way, model collapse marks the point at which speculative abundance encounters its own finitude, expressed as energetic and semiotic exhaustion.