In reply to @kam
Boris Mann@boris
3/22/2023

Yeah, I’m looking at Dalai for running LLaMA and Alpaca on your local machine: https://cocktailpeanut.github.io/dalai/#/ — I’ve got enough space to run even the largest models

In reply to @boris
Kyle Mathews@kam
3/22/2023

oh hmmm... maybe I'll try using that for my project. I've hooked it up to the OpenAI API, but I don't see why this wouldn't work instead
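
The swap Kyle describes — replacing a hosted OpenAI call with a locally served model — is easiest when the app only depends on a narrow "backend" interface. The sketch below is illustrative only: the endpoint URLs, payload shapes, and factory names are assumptions (Dalai's actual interface is its own web server, not an OpenAI-compatible API — check its docs), but the dispatch pattern is the point.

```python
from typing import Callable

# A completion backend is just: prompt in, generated text out.
Backend = Callable[[str], str]

def make_openai_backend(api_key: str) -> Backend:
    """Hypothetical factory for a hosted backend (network call omitted)."""
    def run(prompt: str) -> str:
        # sketch: a real implementation would POST the prompt to the
        # OpenAI completions endpoint with `api_key` in the headers
        raise NotImplementedError("network call omitted in this sketch")
    return run

def make_local_backend(url: str) -> Backend:
    """Hypothetical factory for a locally served model (e.g. a Dalai
    instance); `url` and the request shape are placeholder assumptions."""
    def run(prompt: str) -> str:
        # sketch: a real implementation would send the prompt to the
        # local server and return its generated text
        raise NotImplementedError("network call omitted in this sketch")
    return run

def answer(question: str, backend: Backend) -> str:
    # The rest of the app depends only on the Backend signature,
    # so swapping OpenAI for a local model is a one-line change
    # at the call site that constructs the backend.
    return backend(f"Q: {question}\nA:")
```

Because `answer` never knows which backend it was handed, testing with a stub (or switching providers later) requires no changes to application logic.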