mallocPlusAI (sh.itjust.works)
[-] orca@orcas.enjoying.yachts 53 points 1 month ago

From my experience with ChatGPT:

  1. It will NEVER consistently give you only the value in the response. It will always eventually add in some introductory text like it’s talking to a human. No matter how many times I tried to get it to just give me back the answer alone, it never consistently did.
  2. ChatGPT is terrible with numbers. It can’t count, do math, none of that. So asking it to do byte math is asking for a world of hurt.
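For contrast, the byte math malloc actually needs is just sizeof arithmetic plus an overflow check, computed deterministically; a rough sketch of the conventional way (not anything from the post):

```c
#include <stdint.h>
#include <stdlib.h>

/* Allocate room for `count` elements of `elem_size` bytes each,
 * refusing if count * elem_size would overflow size_t. */
static void *checked_alloc(size_t count, size_t elem_size)
{
    if (elem_size != 0 && count > SIZE_MAX / elem_size)
        return NULL;                      /* multiplication would overflow */
    return malloc(count * elem_size);
}

/* Usage: int *values = checked_alloc(1000, sizeof *values); */
```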

If this isn’t joke code, that is scary.

[-] dev_null@lemmy.ml 58 points 1 month ago

I refuse to believe you are not certain this is a joke

[-] orca@orcas.enjoying.yachts 22 points 1 month ago

I know it is, but I’ve also seen people try to use ChatGPT for similar things as a serious endeavor.

[-] Gutek8134@lemmy.world 25 points 1 month ago
[-] orca@orcas.enjoying.yachts 6 points 1 month ago

Neat! Never seen this one before.

[-] yetAnotherUser@discuss.tchncs.de 4 points 4 weeks ago

Where's my 1 million dollars?

[-] foenkyfjutschah@programming.dev 2 points 1 month ago

how is it different from a calculator or, say, a Python REPL? i'm asking b/c i'm too old to try out young folks' inefficiently engineered "solutions".

[-] Gutek8134@lemmy.world 8 points 4 weeks ago

You input some text, chatGPT guesses the answer using the linear algebra that powers LLMs

The project was made as a satire of companies putting AI into everything

[-] Swedneck@discuss.tchncs.de 5 points 4 weeks ago

have you ever wanted your calculator to be able to be wrong like a human?
Like, not just calculating the wrong answer or returning an error, i mean outright brainfart and just giving a nonsense answer

[-] BluesF@lemmy.world 4 points 4 weeks ago* (last edited 4 weeks ago)

I know a guy who was working on something like this: they just had the call to the model loop until the response met whatever criteria the code needed (e.g. one single number, a specifically formatted table, viable code, etc.) or exit after a number of failed attempts. That seemed to work pretty well; it might mess up from time to time, but with the right prompt it's unlikely to do so repeatedly when asked again.
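Roughly the shape of that loop in C, with a made-up llm_complete() standing in for whatever call actually talks to the model; everything here is illustrative, not the commenter's real code:

```c
#include <stdlib.h>

/* Hypothetical: sends the prompt to some LLM API and returns the reply
 * as a heap-allocated string, or NULL on a transport failure. */
extern char *llm_complete(const char *prompt);

/* Re-ask until the reply parses as one bare non-negative integer,
 * or give up after max_attempts. Returns 0 on success. */
int ask_for_number(const char *prompt, int max_attempts, size_t *out)
{
    for (int attempt = 0; attempt < max_attempts; attempt++) {
        char *reply = llm_complete(prompt);
        if (!reply)
            continue;

        char *end = NULL;
        unsigned long long value = strtoull(reply, &end, 10);

        /* Accept only if the whole reply, bar trailing whitespace, was a
         * number -- anything else means the model got chatty, so retry. */
        int ok = (end != reply);
        while (ok && *end != '\0') {
            if (*end != ' ' && *end != '\t' && *end != '\r' && *end != '\n')
                ok = 0;
            end++;
        }

        free(reply);
        if (ok) {
            *out = (size_t)value;
            return 0;
        }
    }
    return -1; /* the model never produced a clean number */
}
```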

[-] orca@orcas.enjoying.yachts 2 points 4 weeks ago

That’s a good approach. I think for my use case the struggle was trying to not use a ton of tokens (upper management was being stingy on that front). I kept trying to push to make it more robust but you know how those things go. Axed ahead of their time or zombified.

[-] mindaika@lemmy.dbzer0.com 4 points 4 weeks ago

Response:

Observation 1: ChatGPT is designed to provide context for responses to enhance clarity for human users. Requests for answers without accompanying text may result in inconsistent behavior due to its conversational model. It is not optimized for providing pure data outputs without context.

Observation 2: ChatGPT is not inherently equipped to perform complex mathematical operations with high reliability. Numerical inaccuracies or rounding errors may occur due to the model’s structure. While capable of basic arithmetic, it is not a specialized tool for precise calculations, particularly in domains like byte math, where accuracy is critical.

Statement acknowledged.

[-] Sotuanduso@lemm.ee 4 points 1 month ago

For 1, that's why you say "Format your answer in this exact sentence: The number of bytes required (rounded up) is exactly # bytes., where # is the number of bytes." And then regex for that sentence. What could go wrong?

Also, it can do math somewhat consistently if you let it show its work, but I still wouldn't rely on it as a cog in code execution. It's not nearly reliable enough for that.
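For what it's worth, the "regex for that sentence" step could look something like this in C with POSIX regex; the function and its error handling are made up for illustration:

```c
#include <regex.h>
#include <stdlib.h>
#include <string.h>

/* Pull the byte count out of the "exact sentence" format described above.
 * Returns 0 and fills *out on a match, -1 if the model went off-script. */
int extract_byte_count(const char *reply, size_t *out)
{
    /* POSIX extended regex: the literal sentence with a captured run of digits. */
    const char *pattern =
        "The number of bytes required \\(rounded up\\) is exactly "
        "([0-9]+) bytes\\.";

    regex_t re;
    if (regcomp(&re, pattern, REG_EXTENDED) != 0)
        return -1;

    regmatch_t m[2];                 /* m[0] = whole match, m[1] = the digits */
    int rc = regexec(&re, reply, 2, m, 0);
    regfree(&re);
    if (rc != 0)
        return -1;                   /* sentence not found: off-script reply */

    char digits[32];
    size_t len = (size_t)(m[1].rm_eo - m[1].rm_so);
    if (len >= sizeof digits)
        return -1;
    memcpy(digits, reply + m[1].rm_so, len);
    digits[len] = '\0';

    *out = (size_t)strtoull(digits, NULL, 10);
    return 0;
}
```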

[-] Object@sh.itjust.works 41 points 1 month ago

Money generator for bug bounty hunters

[-] _____@lemm.ee 39 points 1 month ago

knowing GPT users, this is probably not satire.

[-] zaphod@sopuli.xyz 14 points 1 month ago

You don't need to cast the return value from malloc.

[-] addie@feddit.uk 11 points 1 month ago

True. Although given how easy it is to cast void pointers to the wrong damn thing, it would be nice if you did; it makes refactoring much easier. Makes me appreciate std::any all the more.

[-] embed_me@programming.dev 3 points 4 weeks ago* (last edited 4 weeks ago)

Void pointers should be avoided anyway. Even I find them rare, and I mostly work in embedded RTOS

[-] Subverb@lemmy.world 5 points 4 weeks ago* (last edited 4 weeks ago)

This isn't malloc though. I have to assume the cast is because the user has experience with the output from an LLM being untrustworthy.

[-] vrighter@discuss.tchncs.de 2 points 4 weeks ago
[-] zaphod@sopuli.xyz 2 points 4 weeks ago

In C++ you should use new.

[-] vrighter@discuss.tchncs.de 1 points 4 weeks ago

that is beside the point. You can still call malloc, it will still return void*, and it would still require casting in C++
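A tiny illustration of the difference being argued about, in plain C with the C++ alternatives left as comments:

```c
#include <stdlib.h>

int main(void)
{
    /* In C, malloc's void * converts implicitly to any object pointer
     * type, so no cast is needed: */
    int *xs = malloc(16 * sizeof *xs);   /* fine in C, a compile error in C++ */

    /* C++ has no such implicit conversion, so the same call needs an
     * explicit cast there -- or, more idiomatically, new/delete:
     *     int *ys = static_cast<int *>(malloc(16 * sizeof(int)));
     *     int *zs = new int[16];
     */

    free(xs);
    return 0;
}
```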

[-] yetAnotherUser@lemmy.ca 6 points 1 month ago

Someone needs to put this in DreamBerd

[-] fubarx@lemmy.ml 5 points 1 month ago
[-] AngryCommieKender@lemmy.world 4 points 4 weeks ago

Malloc doesn't even exist! He's not canon!

/j

[-] ValenThyme@reddthat.com 2 points 4 weeks ago

I saw the best minds of my generation....

this post was submitted on 18 Oct 2024
247 points (100.0% liked)
