submitted 1 month ago* (last edited 1 month ago) by Womble@lemmy.world to c/localllama@sh.itjust.works

I've recently been writing fiction and using an AI as a critic/editor to help me tighten things up (as I'm not a particularly skilled prose writer myself). So far I've tried two approaches: writing text in a basic editor and then either uploading files to a hosted LLM or copy-pasting into a local one; or using PyCharm with its AI integration plugins.

Neither is particularly satisfactory, and I'm wondering if anyone knows of a good setup for this (preferably open source, but that's not necessary). Integration with at least one of Ollama or OpenRouter would be needed.
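For context, the kind of integration I'm after is roughly this, a minimal sketch against Ollama's default local endpoint (the model name `llama3` is just a placeholder, swap in whatever you've pulled):

```python
import json
import urllib.request

# Ollama's default local generate endpoint.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_critique_request(draft: str, model: str = "llama3") -> dict:
    """Wrap a prose draft in an editing prompt for a local model."""
    prompt = (
        "You are a fiction editor. Critique the following passage for "
        "pacing, word choice, and clarity, then suggest a tightened rewrite.\n\n"
        + draft
    )
    return {"model": model, "prompt": prompt, "stream": False}

def critique(draft: str, model: str = "llama3") -> str:
    """Send the draft to a running Ollama instance and return its reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_critique_request(draft, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

What I'd really like is an editor that does this wrapping and round-tripping for me instead of me gluing it together by hand.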

Edit: Thanks for the recommendations everyone, lots of things for me to check out when I get the time!

top 7 comments
[-] nibby@sh.itjust.works 5 points 1 month ago

If you're up for learning Emacs, it has several packages for integrating with Ollama, such as ellama. It has worked satisfactorily for me.

[-] Womble@lemmy.world 1 points 1 month ago

I actually already use Emacs; I just find configuring it a complete nightmare. Good to know it's an option, though.

[-] tal@lemmy.today 1 points 1 month ago* (last edited 1 month ago)

I installed the Emacs ellama package, and I don't think it required any configuration to use, though I'm not at my computer to check.

[-] copacetic@discuss.tchncs.de 2 points 1 month ago

I recently started with Zed. It works with ollama.

Too early for me to give more of an assessment than "it works".

[-] Womble@lemmy.world 1 points 1 month ago

I'll give it a try thanks.

[-] hendrik@palaver.p3x.de 2 points 1 month ago* (last edited 1 month ago)

I'm not sure if this is what you're looking for, but for AI-generated novels there's Plot Bunni. It's specifically made for drafting: generating an outline, then chapters, then the story, and organizing ideas. It has a lot of rough edges, though; I had some very limited success with it, and it's not an editor. But it's there, and it caters to story writing.

[-] brucethemoose@lemmy.world 2 points 1 month ago* (last edited 1 month ago)

Mikupad is incredible:

https://github.com/lmg-anon/mikupad

I think my favorite feature is the 'logprobs' mouseover, i.e. showing the probability of each token that's generated. It's like a built-in thesaurus, a great way to dial in sampling, and you can regenerate from that point.
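To illustrate what that mouseover surfaces: the llama.cpp server can return per-token alternatives when you pass `n_probs` to its `/completion` endpoint, and Mikupad renders those. A sketch with mock data shaped like that response (field names here, `tok_str` and `prob`, match older server builds and may differ in newer ones):

```python
def format_alternatives(token_probs: list[dict]) -> str:
    """Render one generated token's alternatives as "'word' (62.0%)" lines."""
    return "\n".join(
        f"{p['tok_str']!r} ({p['prob']:.1%})" for p in token_probs
    )

# Mock data standing in for one entry of the server's
# "completion_probabilities" array:
sample = [
    {"tok_str": " whispered", "prob": 0.62},
    {"tok_str": " muttered", "prob": 0.21},
    {"tok_str": " said", "prob": 0.09},
]
print(format_alternatives(sample))
```

Hovering a token and seeing "whispered 62%, muttered 21%, said 9%" is exactly the built-in-thesaurus effect I mean.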

Once you learn how instruct formatting works (and how it auto inserts tags), it's easy to maintain some basic formatting yourself and question it about the story.

It's also fast. It can handle 128K context without being too laggy.

I'd recommend the llama.cpp server or TabbyAPI as backends (depending on the model and your setup), though you can use whatever you wish.
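As a concrete example of the llama.cpp route (flag names assumed from current llama.cpp builds; check `llama-server --help` on your version, and adjust the model path to your setup):

```shell
# Serve a local GGUF model with a 128K-token context window on an
# HTTP endpoint Mikupad can point at.
llama-server -m ./your-model.gguf -c 131072 --port 8080
```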

I'd recommend exui as well, but seeing how exllamav2 is being deprecated, it's probably not the best idea to use anymore... But another strong recommendation is kobold.cpp (which can use external APIs if you want).

this post was submitted on 23 Jun 2025
16 points (100.0% liked)
