Discussion about this post

Chris Schuck:

This isn't quite the same thing, but your thoughts on meaning and its metaphysical dependence on the flow of time bring to mind some of the trippier episodes of Lost in Seasons 4 and 5, when characters keep jumping back and forward in time to the point that their heads start exploding (both figuratively and a tiny bit literally). Especially "The Constant" (Season 4, Episode 5), where they explain the toll this takes from not having a metaphysical anchor or reference point. I guess that's not collapsing time to zero, but maybe related enough to be sort of fun?

Dominic Fox:

I have a feeling that while training parallelises naturally, token generation is constrained by the linear sequencing of tasks: you must have the first token before its successor, and the successor before the successor’s successor, and so on. I asked ChatGPT and it gave a characteristically plausible response: https://chatgpt.com/s/t_68533d9d792481918e59d3b8dbfad67b
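A minimal PyTorch sketch of the contrast the comment points at (mine, not from the thread; TinyCausalLM, its sizes, and the sample sequence are made up for illustration): in training the target tokens are already known, so the loss at every position falls out of one parallel forward pass, whereas generation has to loop because each token depends on the one before it.

```python
import torch
import torch.nn as nn

class TinyCausalLM(nn.Module):
    """Hypothetical toy next-token model, only to show the two access patterns."""
    def __init__(self, vocab=100, dim=32, heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.proj = nn.Linear(dim, vocab)

    def forward(self, tokens):  # tokens: (batch, seq_len)
        mask = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
        hidden = self.layer(self.embed(tokens), src_mask=mask)
        return self.proj(hidden)  # logits: (batch, seq_len, vocab)

model = TinyCausalLM()
seq = torch.randint(0, 100, (1, 16))

# Training (teacher forcing): the targets are just the input shifted by one,
# so the losses for all positions come out of a single parallel forward pass.
logits = model(seq[:, :-1])
loss = nn.functional.cross_entropy(
    logits.reshape(-1, logits.size(-1)), seq[:, 1:].reshape(-1))

# Generation: token t+1 cannot be produced until token t exists, so this loop
# is irreducibly sequential however much hardware is available.
model.eval()
out = seq[:, :1]
with torch.no_grad():
    for _ in range(15):
        next_token = model(out)[:, -1].argmax(dim=-1, keepdim=True)
        out = torch.cat([out, next_token], dim=1)
```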

