Advanced
Nat Emodi@emodi
7/23/2023

Saw an impressive demo yesterday of a locally run, fully offline LLM: 13B params in 8GB, running in the browser on an M2 MacBook. In the next couple of years we'll see access to human-level intelligence across much of the world's information, no internet required

In reply to @emodi
m_j_r@m-j-r
7/23/2023

think of the Mixture of (locally hosted) Experts we'll have as well. right now the trend is near-instantaneous single-player 0-shot; what happens when we shift to async, git-style, multiplayer-edited output for common ideas?