Only Mac support?
#2
by juniorhero - opened
I am unable to run on Windows. Only MLX support?
You need to install llama.cpp. The installation on Windows can be a huge pain; I've helped a few friends through it, and it's easier on Ubuntu (or, if you want a soft switch, Winux). Either way, if you aren't using LM Studio or an equivalent, you'll need to install llama.cpp yourself. All the instructions are here: https://github.com/ggml-org/llama.cpp.
I'm adding this guide because it's described as easy. I haven't verified it myself, and Reddit is hit or miss, but it can contain useful information. So if it works, cool.
https://www.reddit.com/r/LocalLLaMA/comments/1cy2nf9/easy_guide_to_install_llamacpp_on_windows_all/
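For reference, the basic build on Windows usually boils down to a few commands once you have Git, CMake, and the Visual Studio Build Tools installed (a rough sketch of the standard CMake flow from the llama.cpp repo, not a substitute for the official instructions linked above):

```shell
# Clone the repository
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp

# Configure and build with CMake (Release mode)
cmake -B build
cmake --build build --config Release
```

After that, the built binaries (e.g. the server and CLI tools) end up under the `build` directory. GPU backends (CUDA, Vulkan, etc.) need extra CMake flags, which is where most of the Windows pain comes from, so check the repo's build docs for your hardware.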