I am a teacher with a LOT of literature material that I would like to study and play around with.
I want a self-hosted, reasonably capable LLM into which I can feed all the textual material I have generated over the years. I would be interested to see whether such a model can answer some of the subjective course questions I have set on my exams, or write short paragraphs about the topics I teach.
In terms of hardware, I have an old Lenovo laptop with an NVIDIA graphics card.
P.S.: I am not very experienced technically. I run Linux and can do very basic stuff. I have never self-hosted anything other than LibreTranslate and a Pi-hole!
New Lemmy Post: Self hosting an LLM for research (https://lemmyverse.link/lemmy.world/post/15719352)