Kyle Mathews@kam
3/22/2023

LLMs are starting to feel like a new foundational programming tool — like OSs, compilers, or image/video codecs. A basic industrial capability. I imagine soon there'll be Stable Diffusion-like OSS distributions that'll be tiny, highly optimized, and easy for applications to embed and ship fine-tunings with.

In reply to @kam
Boris Mann@boris
3/22/2023

Yeah, I’m looking at Dalai for running LLaMa and Alpaca on your local machine: https://cocktailpeanut.github.io/dalai/#/ I’ve got enough space to run even the largest models.

In reply to @kam
Jackson@jacks0n
3/22/2023

I already feel the pain of not having it. Vanilla search indexing of the local file system 🤮