It's interesting to see so much investor money chasing AI unicorns at the moment. Something tells me many of them might be making the wrong bets.
I’d like to set up Llama 2 on my system. Is there an LLM community on Lemmy yet, or do I just need to dive in deep and dark?
Reddit's r/LocalLLaMA is the best place to start. The AI communities on Lemmy are too small.
I’m still avoiding Reddit on principle. I might check it eventually, but I’ll make at least a token effort to find the answers elsewhere first.
If you’re comfortable with Python, check Hugging Face.
To see how to set up a GUI, check https://github.com/camenduru/text-generation-webui-colab to test models on Colab.
You can copy the commands there and adapt them to run it locally.
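As a rough sketch of the plain-Python route (not a full guide): this assumes transformers, torch, and accelerate are installed, and that you’ve been granted access to the gated meta-llama/Llama-2-7b-chat-hf repo on Hugging Face.

```python
# Minimal Llama 2 inference sketch via Hugging Face transformers.
# Assumptions: pip install transformers torch accelerate, plus approved access
# to the gated meta-llama/Llama-2-7b-chat-hf repository.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "meta-llama/Llama-2-7b-chat-hf"  # gated repo; request access first

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half-precision roughly halves memory vs. float32
    device_map="auto",          # lets accelerate place layers on GPU/CPU as available
)

prompt = "Explain in one sentence what running an LLM locally means."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

float16 plus device_map="auto" keeps memory use down and falls back to CPU layers when GPU memory runs out, which matters on older hardware.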
!localllama@sh.itjust.works
Perfect! Exactly the resource I was hoping for!
My dinosaur machines might barely be able to cope with the smallest model… maybe… it’ll be fine.
Dive deep and dark with blindfolds.