Yeah, I’m looking at Dalai for running LLaMA and Alpaca on your local machine: https://cocktailpeanut.github.io/dalai/#/ — I’ve got enough disk space to run even the largest models.
oh hmmm... maybe I'll try using that for my project. I've hooked it up to the OpenAI API, but I don't see why this wouldn't work instead
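A rough sketch of what that swap might look like, keeping the request-building logic backend-agnostic. The local URL, endpoint path, and payload fields here are assumptions for illustration, not Dalai's documented API — only the OpenAI payload shape follows the v1 completions API:

```python
import json

# Hypothetical local endpoint for a Dalai-style server -- check the
# actual Dalai docs before relying on this path or port.
LOCAL_URL = "http://localhost:3000/completions"
OPENAI_URL = "https://api.openai.com/v1/completions"

def build_request(prompt: str, use_local: bool) -> tuple[str, dict]:
    """Build a completion request for either backend.

    The local payload is a guess at what a Dalai-style server might
    accept; the OpenAI payload matches the v1 completions API.
    """
    if use_local:
        return LOCAL_URL, {"prompt": prompt, "model": "alpaca.7B"}
    return OPENAI_URL, {
        "prompt": prompt,
        "model": "text-davinci-003",
        "max_tokens": 128,
    }

# You'd then send it with e.g. requests.post(url, json=payload).
url, payload = build_request("Hello", use_local=True)
print(url, json.dumps(payload))
```

Keeping the payload construction in one function like this means the rest of the project only decides `use_local` once, and the OpenAI path still works as a fallback.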