I’ll pile on to the lunarvim suggestions for a more out of the box/curated experience.
Did you get charged by the pixel to post this or something?
I mean, the last one was released in 2013, so it’s not exactly super relevant, but if you’re that unaware of it I assume you were playing Habbo Hotel or whatever little kids played 10 years ago.
Pretty much any deterioration of service would do it. I’m not tied to GitHub at all; it works, but so do GitLab and self-hosted solutions.
I will absolutely take that bet. Given how unpopular the decision is, combined with it being even tangentially related to the gaming community, I would be astonished if they didn’t receive death threats.
No, it provides an API as an interface; it is not consuming an API.
I don’t think you understand what this is. It is a local model; it runs locally. It provides an API which you can then use in the same manner as you would the ChatGPT API. I’m not super familiar with GPT4All since llama.cpp/kobold.cpp are pretty much the standard in local inference, but for example llama-cpp-python provides an OpenAI-compatible API.
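Roughly like this (a minimal sketch, assuming llama-cpp-python’s bundled server is running on its default port 8000 and you’ve pointed it at some GGUF model you have locally; the model path and model name below are placeholders):

```python
# In a separate terminal, start the OpenAI-compatible server, e.g.:
#   python -m llama_cpp.server --model ./models/your-model.gguf
# (the .gguf path is a placeholder -- use whatever model file you actually have)

from openai import OpenAI

# Point the standard OpenAI client at the local server instead of api.openai.com.
# An api_key is required by the client but ignored by the local server.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="local-model",  # placeholder; the local server largely ignores this
    messages=[{"role": "user", "content": "Say hello from a local model."}],
)
print(response.choices[0].message.content)
```

Nothing leaves your machine; you just swap the base URL and keep whatever code you already wrote against the ChatGPT API.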
Give me fish or give me vanilla bash.
That’s nothing new; I learned Novell NetWare in college and Pascal in high school.