[-] Fingerthief@infosec.pub 3 points 5 months ago

That seems like a pretty naive and biased approach to software to me honestly.

Ease of use, community support, feature set, CI/CD, etc. should all come into play when deciding what to use.

Freedom at all costs is great until you cut community development and your potential user base by 90% by using a completely open repo service that 5% of the population uses, or some small Discord alternative.

So then the option is to host on multiple platforms/communities, and the management and time investment goes up keeping them all in sync and active.

As with most things in life, it's best to look at things with nuance rather than a hard stance imo.

I may stand it up on another service at some point, but also anyone else is totally free to do that as well. There are no restrictions.

[-] Fingerthief@infosec.pub 3 points 5 months ago

Ahh, I see lol

[-] Fingerthief@infosec.pub 4 points 5 months ago* (last edited 5 months ago)

I'm not sure I understand at all?

It's fully open source, can run/connect any number of fully local models as well as the big name models if a user chooses to use them.

Can you expand on what you mean?

submitted 5 months ago* (last edited 5 months ago) by Fingerthief@infosec.pub to c/opensource@lemmy.ml

cross-posted from: https://infosec.pub/post/13676291

I've been building MinimalChat for a while now, and based on the feedback I've received, it's in a pretty decent place for general use. I figured I'd share it here for anyone who might be interested!

Quick Features Overview:

  • Mobile PWA Support: Install the site like a normal app on any device.
  • Any OpenAI formatted API support: Works with LM Studio, OpenRouter, etc.
  • Local Storage: All data is stored locally in the browser with minimal setup. Just enter a port and go in Docker.
  • Experimental Conversational Mode (GPT Models for now)
  • Basic File Upload and Storage Support: Files are stored locally in the browser.
  • Vision Support with Maintained Context
  • Regen/Edit Previous User Messages
  • Swap Models Anytime: Maintain conversational context while switching models.
  • Set/Save System Prompts: Set a system prompt, and save prompts to a list so they can be switched between easily.

The idea is to make it essentially foolproof to deploy or set up while being generally full-featured and aesthetically pleasing. No additional databases or servers are needed; everything is contained and managed locally inside the web app itself.

It's another chat client in a sea of clients but it is unique in its own ways in my opinion. Enjoy! Feedback is always appreciated!

Self Hosting Wiki Section https://github.com/fingerthief/minimal-chat/wiki/Self-Hosting-With-Docker

I thought sharing here might be a good idea as well; some might find it useful!

Since the initial post I've added updates that hugely improved message rendering speed, plus a plethora of new models to choose from and load/run fully locally in your browser (Edge and Chrome) with WebGPU and WebLLM.

[-] Fingerthief@infosec.pub 10 points 5 months ago

I haven't personally tried it with Ollama yet, but it should work, since Ollama can expose an OpenAI-formatted API: https://github.com/ollama/ollama/blob/main/docs/openai.md

I might give it a go here in a bit to test and confirm.
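For anyone who wants to try this before I do: per Ollama's doc linked above, any client that speaks the OpenAI chat-completions format should work, so the payload MinimalChat sends is the same shape regardless of backend. A minimal Python sketch (the endpoint URL and port are Ollama's documented defaults; the "llama3" model name is a placeholder for whatever you've pulled locally):

```python
import json
import urllib.request

# Ollama's OpenAI-compatible endpoint (default port, per its docs);
# LM Studio, OpenRouter, etc. accept the same payload shape at their own URLs.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_chat_request(model, user_message, system_prompt=None):
    """Build an OpenAI-format chat completion payload."""
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": user_message})
    return {"model": model, "messages": messages, "stream": False}

def send(payload):
    """POST the payload and return the assistant's reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# "llama3" is a placeholder model name; with Ollama running you would call:
#   print(send(build_chat_request("llama3", "Hello!", system_prompt="Be concise.")))
payload = build_chat_request("llama3", "Hello!", system_prompt="Be concise.")
print(json.dumps(payload, indent=2))
```

The same `build_chat_request` output works against LM Studio's local server by swapping the URL, which is presumably why these clients are interchangeable behind MinimalChat.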

submitted 5 months ago* (last edited 5 months ago) by Fingerthief@infosec.pub to c/selfhosted@lemmy.world


For anyone who wants to join, I run a casual AC server for hot lapping competitions.

Nothing fancy, new track/cars every week or so.

Leaderboards, the server join link, etc. are at https://gofast.emperorservers.com/live-timing

[-] Fingerthief@infosec.pub 6 points 1 year ago

I used Apple for the last few years until recently, and I can't say I've ever really noticed stuff like apps impersonating other apps. That's not to say it doesn't happen, of course.

I do know the Apple app approval process is definitely more strict than what is required for the Play Store.

I'm not very experienced with Apple or Android development so I'd be curious to hear from devs that use both platforms as well.

submitted 1 year ago* (last edited 1 year ago) by Fingerthief@infosec.pub to c/minimalgpt@infosec.pub

Changes from release notes

  • Adjusted chat message bubble max width to take up nearly the entire width of the chat.

  • Increased the size of message label logos and fonts.

  • Adjusted message font size and line height for a better reading experience.

  • Added a border to one side of message bubbles as part of some UI design changes.

submitted 1 year ago* (last edited 1 year ago) by Fingerthief@infosec.pub to c/minimalgpt@infosec.pub

I've created a fairly thorough overview of MinimalGPT with all the basic info to get started. Please feel free to take a look!

submitted 1 year ago* (last edited 1 year ago) by Fingerthief@infosec.pub to c/minimalgpt@infosec.pub

Link to a live version of MinimalGPT that I host; you can always spin up a local version yourself via the GitHub project.

submitted 1 year ago* (last edited 1 year ago) by Fingerthief@infosec.pub to c/cat@lemmy.world
[-] Fingerthief@infosec.pub 17 points 1 year ago

As a dev it’s nice to check all the official guideline boxes, as a user I’d much rather actually have features.

[-] Fingerthief@infosec.pub 15 points 1 year ago

They’re just going to source the allowed parts from Red Bull basically exactly like they used to do with Toro Rosso.

To think that will equate to an RB19 is a bit insane in my opinion. They will likely improve, but still be a mid-midfield team like they used to be with Toro Rosso.

[-] Fingerthief@infosec.pub 3 points 1 year ago* (last edited 1 year ago)

But it’s able to correct itself, unlike what’s shown in the OP messages.

Extremely sensitive to semantics, it seems, but it clearly listens. It's neat to see how different each person's experience is.

Also, different tuning parameters etc. could make outputs different. That might explain why mine seems a bit better at listening.

[-] Fingerthief@infosec.pub 7 points 1 year ago* (last edited 1 year ago)

Now it’s broken; I guess I don’t use it this way often enough. Interesting nonetheless!

Edit - it’s very sensitive to exact wording; it matters whether I include an uppercase “S” or not. That’s amusing.

I wonder if the temperature settings adjustment would fix that or just make it even weirder.

[-] Fingerthief@infosec.pub 13 points 1 year ago

Idk what I’m doing wrong; thankfully it always seems to listen and work fine for me lmao

[-] Fingerthief@infosec.pub 4 points 1 year ago

You’ve never actually used them properly then.

[-] Fingerthief@infosec.pub 4 points 1 year ago

Memmy is easily my favorite out of all the apps.


Fingerthief

joined 1 year ago