Somebody built a ChatGPT-powered calculator as a joke

https://github.com/Calvin-LL/CalcGPT.io

TODO: Add blockchain into this somehow to make it more stupid.

[-] corroded@lemmy.world 24 points 1 year ago

I'm not sure if I get it. It just basically gives mostly wrong answers. Oh....

[-] VaalaVasaVarde@sopuli.xyz 17 points 1 year ago

We are on the right track: first we create an AI calculator, next an AI computer.

The prompt should be something like this:

You are an x86-compatible CPU with an ALU, an FPU, and prefetching. You execute binary machine code and store it in RAM.

[-] mumblerfish@lemmy.world 16 points 1 year ago

If you say so

[-] zer0hour@lemmy.world 8 points 1 year ago
[-] sweetgemberry@lemmy.world 4 points 1 year ago

It seems to really like the answer 3.3333....

It'll even give answers for a random assortment of symbols such as "+-+-/", which apparently equals 3.89 or 3.33 recurring, depending on its mood.

[-] DannyBoy@sh.itjust.works 2 points 1 year ago

It does get basic addition correct; it just takes about five regenerations.

[-] jacksilver@lemmy.world 2 points 1 year ago

One thing I love telling people, which always surprises them, is that you can't build a deep learning model that can do math (at least using conventional layers).

[-] rain_worl@lemmy.world 0 points 10 months ago

sure you can, it just needs to be specialized for that task

[-] jacksilver@lemmy.world 1 points 10 months ago

I'm curious what approaches you're thinking of. When I last looked into this I found some research on Neural Turing Machines, but they're so obscure I'd never heard of them, and I assume they're not widely used.

While you could build a model to answer math questions for a fixed input space, these approaches break down once you move beyond that space.

[-] rain_worl@lemmy.world 0 points 10 months ago

neural network, takes two numbers as input, outputs sum. no hidden layers or activation function.
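
Something like this, as a rough sketch (plain numpy, gradient descent on a single linear layer; the data range and learning rate are arbitrary choices for illustration):

    import numpy as np

    rng = np.random.default_rng(0)

    # training data: pairs of numbers and their sums
    X = rng.uniform(-10, 10, size=(1000, 2))
    y = X.sum(axis=1)

    # single linear layer, no hidden layers, no activation: y_hat = w1*x1 + w2*x2 + b
    w = np.zeros(2)
    b = 0.0
    lr = 0.01

    for _ in range(500):
        y_hat = X @ w + b
        err = y_hat - y
        # gradient step on mean squared error
        w -= lr * (X.T @ err) / len(X)
        b -= lr * err.mean()

    print(w, b)                            # w ends up near [1, 1], b near 0
    print(np.array([7.0, 35.0]) @ w + b)   # ~42

With no nonlinearity the model is just y = w1*x1 + w2*x2 + b, and the optimum really is w = [1, 1], b = 0, so it lands on addition exactly.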

[-] jacksilver@lemmy.world 1 points 10 months ago

Yeah, but since neural networks are really function approximators, the farther you move from the training input space, the higher the error gets. For multiplication it's even worse, because layers are generally additive, so you'd need roughly as many layers as the largest input value for it to work.
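
A quick way to see it, as a sketch (I'm using scikit-learn's MLPRegressor here as a stand-in for a small ReLU network; the layer sizes and data range are arbitrary):

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    # small ReLU network trained to multiply numbers drawn from [0, 10]
    X_train = rng.uniform(0, 10, size=(5000, 2))
    y_train = X_train[:, 0] * X_train[:, 1]
    model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000, random_state=0)
    model.fit(X_train, y_train)

    print(model.predict([[3.0, 4.0]]))    # roughly 12: inside the training range
    print(model.predict([[30.0, 40.0]]))  # nowhere near 1200: outside the range a
                                          # ReLU net can only extrapolate linearly

Within [0, 10] it approximates the product fine; at (30, 40) it falls apart, because the trained network is piecewise linear while the true function keeps growing quadratically.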

[-] rain_worl@lemmy.world 1 points 10 months ago* (last edited 10 months ago)

hear me out: evolving finite state automaton (plus tape)

[-] jacksilver@lemmy.world 1 points 10 months ago

Is that a thing? Looking it up, I really only see a couple of one-off papers on mixing deep learning and finite state machines. Do you have examples or references for what you're talking about, or is it just a concept?

[-] rain_worl@lemmy.world 1 points 10 months ago* (last edited 10 months ago)

just a slightly seared concept
though it's really just an evolving Turing machine
