For AI folks, how big of a deal is this? Scale of 1-10? cc @pushix @theodormarcu @scharf https://twitter.com/ylecun/status/1681336284453781505
It’s yuge! We were literally talking about a better AI at our all-hands today!!
Insanely big deal. Costs are coming down very fast. That being said, it's not as big if you know that many companies were already using Llama 1 despite it being "non-commercial" 🤫
Seems pretty huge with the combined 'commercial use' designation and models at 7B, 13B and 70B parameters. All available via Hugging Face etc. They're providing a meaningful alternative to OpenAI (Microsoft-backed) and Anthropic (Google-backed).
🌶️- GPT-4 is allegedly a MoE that can't run openly on consumer devices; maybe it's possible on an architecture like Petals. Point being, pretty much all embodied-agent research depends on the emergent reasoning of that architecture vs. LLMs like Llama/Orca/etc. Chat apps will get more competitive, though.
for those interested in tinkering: https://huggingface.co/meta-llama/Llama-2-70b-chat-hf
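For anyone picking up that chat checkpoint: the Llama-2 chat models expect their prompts wrapped in the `[INST]` / `<<SYS>>` template described in Meta's model card. A minimal sketch of that wrapping (the helper name and default system message are my own; the tokenizer adds the leading `<s>` BOS token itself, so it's omitted here):

```python
def build_llama2_prompt(user_msg: str,
                        system_msg: str = "You are a helpful assistant.") -> str:
    """Wrap a single-turn exchange in the Llama-2 chat template:
    [INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]
    """
    return f"[INST] <<SYS>>\n{system_msg}\n<</SYS>>\n\n{user_msg} [/INST]"

# Pass the result as the input string when generating with the -chat-hf model.
prompt = build_llama2_prompt("What is the capital of France?")
print(prompt)
```

Skipping the template on the chat-tuned weights tends to produce noticeably worse output, so it's worth getting right before benchmarking.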
Scale of models, performance, cost, & open-source licensing… Yes, this is a big deal.
It's more about MS posturing: OpenAI, Meta, Nvidia, GitHub. Why Azure? (The roots go back to Satya.) https://twitter.com/alex_valaitis/status/1681348531834044426?s=46
Very big deal. My personal bet is that two years from now most people are running fine-tuned Llamas and OpenAI's market share takes a big hit.
It’s a BFD, about a 10. Open sourcing these tools (with commercial use!) lets people tinker without worrying about APIs, tokens, getting their access cut off due to unexpected downtime, etc. Allows fine-tuning for specific use cases, e.g., training it on your own codebase for a better, customized AI assistant.
Time to replace Llama v1 with v2 on my phone (if possible 😅). https://mirror.xyz/xanny.eth/TBgwcBOoP9LZC6Mf570fG8VvZWhEn_uWZPHy3axIpsI
Solid 9. It's not GPT-3.5 tier yet, let alone 4. But it's a solid step up from previous open-source models. The potential here lies in open-source fine-tuning.
This is pretty big news given that OpenAI's edge is now reduced by a fair amount.
I think it means gpt-3.5 won't get used as much; gpt-4 can still do things that Llama 2 can't. It's not *that* big a deal though, because I think everyone expected this to happen soon.
Big deal for commercialization. Lots of teams were previously using Llama for research but not product, and switching to other weaker weight sets with clean rights. This should unleash a bunch of products from limbo.
Stepping back from a high mark: big potential, but the rumored secret architecture changes behind GPT-4 may well deprecate all this early (not obsolete, but…). Meaning that to reach the more appealing GPT-4 “level”, the v2 lineage may need an overhaul, and may not run easily on “reasonable” hardware.
Unfortunately it’s as big as it gets - so 12 🫨 cos Microsoft & Zuck just made themselves more bulletproof ☹️
The catch is the hardware you need to run this fast and properly. While open source is the way to go, that has always been the catch. Google has been doing it for years with their TensorFlow solution.
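Back-of-the-envelope on that hardware catch: rough weight-only memory estimates for the three Llama-2 sizes at different precisions (my own arithmetic; this ignores KV cache and activations, so real usage is higher):

```python
def weight_memory_gib(n_params_billion: float, bytes_per_param: float) -> float:
    """Approximate memory for model weights alone, in GiB."""
    return n_params_billion * 1e9 * bytes_per_param / 2**30

for size in (7, 13, 70):
    fp16 = weight_memory_gib(size, 2.0)  # float16: 2 bytes per parameter
    q4 = weight_memory_gib(size, 0.5)    # 4-bit quantized: 0.5 bytes per parameter
    print(f"Llama-2-{size}B: ~{fp16:.0f} GiB fp16, ~{q4:.1f} GiB 4-bit")
```

So the 7B fits on a single consumer GPU (and, quantized, on some phones), while 70B in fp16 needs multi-GPU server hardware; that's the gap quantization tooling is trying to close.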