In reply to @jacks0n
Dean Pierce 👨‍💻🌎🌍@deanpierce
7/21/2023

GPT-4 context is like 8k tokens, not enough for much. The cool kids these days dump everything into a vector database, and then use terms from the prompt to max out the context window with relevant snippets of code and text etc, but that approach is still pretty limited.
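The approach described above can be sketched in a few lines: rank stored snippets by similarity to the query, then pack the best matches into the prompt until a token budget is hit. This is a toy illustration only — the bag-of-words "embedding", the word-count token estimate, and the `build_context` helper are all stand-ins; a real setup would use learned embeddings and an actual vector database.

```python
# Toy sketch of retrieval-augmented prompting: rank snippets by similarity
# to the query, pack the top matches into the context up to a token budget.
# Bag-of-words counts stand in for real embeddings; word count stands in
# for a real tokenizer. Illustrative only.
from collections import Counter
from math import sqrt

def embed(text):
    # Stand-in "embedding": bag-of-words term counts.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def build_context(query, snippets, token_budget=50):
    # Rank snippets by relevance, then greedily fill the context window
    # until the (rough, word-count) token budget is exhausted.
    q = embed(query)
    ranked = sorted(snippets, key=lambda s: cosine(q, embed(s)), reverse=True)
    context, used = [], 0
    for s in ranked:
        cost = len(s.split())
        if used + cost > token_budget:
            break
        context.append(s)
        used += cost
    return "\n".join(context)

snippets = [
    "def parse_config(path): loads YAML config from disk",
    "the billing module retries failed charges three times",
    "def load_yaml(path): reads a YAML file and returns a dict",
]
print(build_context("how does YAML config loading work", snippets, token_budget=20))
```

With a 20-word budget, the two config-related snippets fit and the irrelevant billing one is dropped — which is also where the limitation bites: whatever doesn't rank or fit simply never reaches the model.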

In reply to @deanpierce
Dean Pierce 👨‍💻🌎🌍@deanpierce
7/21/2023

Here's a great lesson from Andrew Ng on this topic: https://www.deeplearning.ai/short-courses/langchain-chat-with-your-data/