
i built a pc that has a crop ton of processing power, but i know nothing about the software side of things.

thoughts? prayers? concerns? comments? @$%&'s to give?

top 17 comments
[-] jeena@piefed.jeena.net 19 points 15 hours ago
  1. Install linux on it https://ubuntuhandbook.org/index.php/2024/04/install-ubuntu-24-04-desktop/
  2. Install ollama https://ollama.com/download/linux
  3. Install Open WebUI https://docs.openwebui.com/getting-started/quick-start/
  4. Install stable-diffusion-webui https://github.com/AUTOMATIC1111/stable-diffusion-webui
  5. Spend a couple of weeks learning how to configure it all so you can get a chat and an image generator running
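The steps above roughly condense to the following shell commands (taken from each project's linked docs; double-check them against the current pages before running):

```shell
# 2. Ollama's official install script for Linux
curl -fsSL https://ollama.com/install.sh | sh

# 3. Open WebUI, installable via pip (the docs recommend Python 3.11)
pip install open-webui
open-webui serve

# 4. stable-diffusion-webui: clone the repo and run its bundled launcher
git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui
cd stable-diffusion-webui && ./webui.sh
```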
[-] grue@lemmy.world 2 points 15 hours ago

Do you actually need the webui stuff or can you just use ollama on the command line?

[-] Grenfur@pawb.social 5 points 15 hours ago

Ollama can be run from CLI.

[-] iii@mander.xyz 4 points 15 hours ago

It's just an optional interface. There's the built-in console, and there are third-party TUIs too.
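For the pure-CLI route, a minimal session looks like this (the model name is one example from Ollama's library; swap in whatever fits your VRAM):

```shell
ollama pull llama3.2        # fetch a model from Ollama's library
ollama run llama3.2         # interactive chat right in the terminal

# one-off prompt without the interactive console:
ollama run llama3.2 "Explain VRAM in one sentence."
```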

[-] Sabata11792@ani.social 1 points 13 hours ago

You can run it from the command line but you will not have tools and the formatting will be unpleasant.

[-] 0x01@lemmy.ml 6 points 13 hours ago

Processing power (CPU) doesn't really matter as much as the GPU, and generally the constraint is GPU memory on consumer-grade machines. Processing via Nvidia chips has become the standard, which is a huge part of why they've become the single most valuable company on the planet. You can use the CPU, but you'll find the performance almost unbearably slow.

Ollama is the easiest option, but you can also use other options like PyTorch (ExecuTorch), vLLM, etc.

You can download your model through Hugging Face, or sometimes directly from the lab's website.
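Hugging Face downloads work in the browser, but their CLI is handier for multi-gigabyte models since it resumes interrupted transfers (the repo id below is a placeholder, not a real model):

```shell
pip install -U "huggingface_hub[cli]"

# download a model repo into the local Hugging Face cache
# (replace the illustrative repo id with a real one)
huggingface-cli download some-org/some-model
```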

It's worth learning the technical side but ollama genuinely does an excellent job and takes a ton off your plate

[-] Grenfur@pawb.social 9 points 15 hours ago* (last edited 14 hours ago)

Not entirely sure what you mean by "Limitation Free", but here goes.

First thing you need is a way to actually run an LLM. I've used both Ollama and KoboldCpp.

Ollama is really easy to set up and has its own library of models to pull from. It's a CLI interface, but if all you're wanting is a locally hosted AI to ask silly questions, that's the one. Something of note for any locally hosted LLM: they're all dated, so none of them can tell you about things like local events. Their data is current as of when the model was trained, generally a year or longer ago. If you wanted up-to-date news, you could use something like DDGS and write a Python script that calls Ollama. At any rate.
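That search-then-ask idea can be sketched in a few lines. This assumes Ollama is running on its default port (11434) and that you've installed the `duckduckgo_search` package; the model name is just an example:

```python
# Sketch: feed fresh web snippets to a local model so it can answer
# questions its training data is too old to know about.
import json
import urllib.request

def build_prompt(question, snippets):
    """Fold search-result snippets into a prompt as context."""
    context = "\n".join(f"- {s}" for s in snippets)
    return f"Using this context:\n{context}\n\nAnswer: {question}"

def ask_ollama(prompt, model="llama3.2"):
    """Call Ollama's local REST endpoint and return the reply text."""
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps({"model": model, "prompt": prompt,
                         "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# from duckduckgo_search import DDGS
# snippets = [r["body"] for r in DDGS().text("local news today", max_results=3)]
# print(ask_ollama(build_prompt("What happened today?", snippets)))
```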

KoboldCpp. If your "limitation free" is more spicy roleplay, this is the better option. It's a bit more work to get going, but it has tons of options to let you tweak how your models run. You can find .gguf models on Hugging Face, load 'em up, and off you go. Kobold's UI is kinda mid, though more granular than Ollama's. If you're really looking to dive into some kind of roleplay or fantasy-trope-laden adventure, SillyTavern has a great UI for that and makes managing character cards easier. Note that ST is just a front end and still needs KoboldCpp (or another back end) running for it to work.
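Because SillyTavern is only a front end, it talks to KoboldCpp over HTTP. You can poke that same API yourself to confirm the back end is up (assuming KoboldCpp's default port of 5001):

```shell
# KoboldCpp's generate endpoint, default host/port
curl -s http://localhost:5001/api/v1/generate \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Once upon a time", "max_length": 50}'
```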

Models. Your "processing power" is almost irrelevant for LLMs; it's your GPU's VRAM that matters. A general rule of thumb is to pick a model whose download size is 2-4 GB smaller than your available VRAM. If you've got 24 GB of VRAM, you can probably run a model that's a 22 GB download (roughly a 32B model, depending on the quant).
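The arithmetic behind that rule of thumb is simple: file size is roughly parameters times bits-per-weight divided by 8, and the leftover VRAM covers the KV cache and activations. A quick sanity check (the 5.5 bits/weight figure approximates a Q5-class quant):

```python
# Back-of-envelope check of the VRAM rule of thumb.

def model_file_gb(params_billion, bits_per_weight):
    """Approximate quantized model download size in GB."""
    return params_billion * bits_per_weight / 8

def fits_in_vram(file_gb, vram_gb, headroom_gb=2):
    """Apply the "2-4 GB smaller than VRAM" rule."""
    return file_gb + headroom_gb <= vram_gb

# A 32B model at ~5.5 bits/weight is about a 22 GB download,
# which squeaks into 24 GB of VRAM with 2 GB of headroom:
size = model_file_gb(32, 5.5)
print(size, fits_in_vram(size, 24))
```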

Final notes: I could have misunderstood, and this whole question was about image gen, hah. InvokeAI is good for that. Models can be found on CivitAI (careful, it's... wild). I've also heard good things about ComfyUI but never used it.

GL out there.

[-] bobbyguy@lemmy.world 2 points 14 hours ago

thanks! this helps a lot! ill have to learn what it means first but ill definitely try it!

[-] infinitevalence@discuss.online 3 points 15 hours ago

Install Linux

Install LM Studio

Profit

[-] iconic_admin@lemmy.world 1 points 12 hours ago

I was going to mention this one. LM Studio is much better than Ollama.

[-] infinitevalence@discuss.online 2 points 12 hours ago

LM Studio is local AI on easy mode.

[-] Disregard3145@lemmy.world 3 points 15 hours ago* (last edited 15 hours ago)

What do you mean by "make"? What do you want it to do that you aren't getting?

Maybe some existing model via ollama - llama-uncensored?

Do you need to add context with some specific set of data, should it be retrieval based or tuned or cross trained?

Does it even need to be an llm? What are you trying to actually achieve?

[-] bobbyguy@lemmy.world 1 points 14 hours ago

i want to make my own chatbot that can also act without my input, be able to create emails, and do online jobs, and make its own decisions, things like that

[-] Grenfur@pawb.social 3 points 14 hours ago

Most of the options mentioned in this thread won't act independently of your input. You'd need some kind of automation software. n8n has a community edition that you can host locally in a Docker container, and you can link it to an LLM API, emails, Excel sheets, etc. As for doing "online jobs", I'm not sure what that means, but at the point where you're trying to get a single AI to interact with the web and make choices on its own, you're basically left coding it all yourself in Python.
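The core of that DIY loop is small: ask the model to reply with a JSON action, then have plain Python decide what to do with it. A minimal sketch, where the tool names and the email stub are hypothetical placeholders (a real version would wire `dispatch` to actual services and loop):

```python
# Sketch of a DIY agent step: parse a JSON "action" out of a model
# reply and dispatch it. Tool names here are illustrative only.
import json

def parse_action(model_reply):
    """Extract a {"tool": ..., "args": ...} action from the model's
    reply; fall back to a no-op if the JSON is malformed."""
    try:
        action = json.loads(model_reply)
        if isinstance(action, dict) and "tool" in action:
            return action
    except json.JSONDecodeError:
        pass
    return {"tool": "none", "args": {}}

def dispatch(action):
    """Route the parsed action to a (stubbed) tool."""
    if action["tool"] == "send_email":
        return f"would send email to {action['args'].get('to', '?')}"
    return "no action taken"

reply = '{"tool": "send_email", "args": {"to": "boss@example.com"}}'
print(dispatch(parse_action(reply)))  # would send email to boss@example.com
```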

[-] bobbyguy@lemmy.world 1 points 14 hours ago

i mean like actual jobs a person could do online, like commissions with art programs, or administration jobs for software companies, basically it would mimic a person online

[-] Acamon@lemmy.world 7 points 10 hours ago

If someone with a home computer and very little knowledge of AI could setup an AI that could do admin jobs for software companies ... Why wouldn't the software companies do exactly that themselves rather than outsource work?

I think you're massively overestimating what an LLM is capable of.

[-] bobbyguy@lemmy.world 1 points 7 hours ago

i have no idea what that even is so im putting my effort into whatever you guys tell me to do

(i have no experience in programming at all so im really just winging this)

this post was submitted on 14 Aug 2025
18 points (100.0% liked)

No Stupid Questions
