The first GPT-4-class AI model anyone can download has arrived: Llama 405B
Wilshire@lemmy.world to Technology@lemmy.world – 193 points – 2 months ago – arstechnica.com
WAKE UP!
It works offline. When you use it with ollama, you don't have to register or agree to anything.
Once you have downloaded it, it will keep on working; Meta can't shut it down.
Well, yes and no. See the other comment: 64 GB of VRAM at the lowest setting.
Oh, sure. For the 405B model it's absolutely infeasible to host it yourself. But for the smaller models (70B and 8B), it can work.
I was mostly replying to the part where they claimed Meta can take it away from you at any point, which is simply not true.
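For a rough sense of why the smaller models are the feasible ones, here is a back-of-envelope sketch of the VRAM needed just to hold the weights at a given quantization level. The helper function is hypothetical, and it ignores KV cache, activations, and runtime overhead, so real requirements are higher:

```python
def weight_vram_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate decimal GB needed to store the weights alone,
    assuming uniform quantization at bits_per_weight."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# Compare the three Llama 3.1 sizes at common quantization levels.
for params in (8, 70, 405):
    for bits in (16, 8, 4):
        print(f"{params}B @ {bits}-bit ~ {weight_vram_gb(params, bits):.1f} GB")
```

By this estimate, an 8B model at 4-bit fits in a few GB, a 70B model needs on the order of 35 GB, and 405B needs hundreds of GB even heavily quantized, which is why self-hosting the 405B model is out of reach for typical consumer hardware.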