submitted 1 year ago by ElCanut@jlai.lu to c/technology@beehaw.org
[-] Phroon@beehaw.org 149 points 1 year ago

“You may not instantly see why I bring the subject up, but that is because my mind works so phenomenally fast, and I am at a rough estimate thirty billion times more intelligent than you. Let me give you an example. Think of a number, any number.”

“Er, five,” said the mattress.

“Wrong,” said Marvin. “You see?”

― Douglas Adams, Life, the Universe and Everything

[-] AlexisFR@jlai.lu 9 points 1 year ago

The mattress? Like for sleeping?

[-] Asafum@feddit.nl 41 points 1 year ago* (last edited 1 year ago)

Yep! The hitchhikers books are so much fun lol

I still think one of my favorite lines is "the ships hung in the sky in much the same way that bricks don't."

[-] Bishma@discuss.tchncs.de 125 points 1 year ago

37 is well represented. Proof that we've taught AI some of our own weird biases.

[-] GenderNeutralBro@lemmy.sdf.org 43 points 1 year ago

What's special about 37? Just that it's prime or is there a superstition or pop culture reference I don't know?

[-] Bishma@discuss.tchncs.de 101 points 1 year ago

If you discount the pop-culture numbers (for us: 7, 42, and 69), it's the number most often chosen when you ask people for a random number between 1 and 100. It just seems like the most random one to a lot of people. Veritasium just did a video about it.

[-] metallic_z3r0@infosec.pub 28 points 1 year ago

37 is my favorite, because 3x7x37=777 (three sevens), and I think that's neat.

[-] SubArcticTundra@lemmy.ml 12 points 1 year ago
[-] Bishma@discuss.tchncs.de 20 points 1 year ago

I'm curious about that too. Something is twisting the weights for 57 fairly strongly in the model, but I'm not sure what. Maybe it's been trained on a bunch of old Heinz 57 Varieties marketing.

[-] boredtortoise@lemm.ee 7 points 1 year ago

Wesley Snipes

[-] Karyoplasma@discuss.tchncs.de 18 points 1 year ago* (last edited 1 year ago)

Probably just because it's prime. It's just that humans are terrible at understanding the concept of randomness. A study by Theodore P. Hill showed that when tasked to pick a random number between 1 and 10, almost a third of the subjects (n was over 8500) picked 7. 10 was the least picked number (if you ditch the few idiots that picked 0).

[-] K0W4LSK1@lemmy.dbzer0.com 7 points 1 year ago

Maybe randomness is a label we slapped on shit we don't understand.

[-] driving_crooner@lemmy.eco.br 9 points 1 year ago* (last edited 1 year ago)

I remember watching a lecture about probability where the professor said that only quantum processes are truly random; everything else we call random is just our inability to measure the variables that affect the outcome. I'm an actuary, and it changed how I see and study random processes, and it got me thinking about ways to influence the outcomes of "random" processes.
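
That distinction shows up even in ordinary code: a pseudo-random generator only looks random because we ignore its internal state. A minimal sketch in Python (standard library only, illustrative seed):

import random

# A PRNG is fully deterministic once its internal state is known:
# same seed, same "random" sequence, every time.
def sample(seed, n=5):
    rng = random.Random(seed)
    return [rng.randint(1, 100) for _ in range(n)]

print(sample(42))
print(sample(42))  # identical to the first call; the "randomness" was just hidden state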

[-] jarfil@beehaw.org 8 points 1 year ago* (last edited 1 year ago)

...which is kind of a hilarious tautology, because "quantum processes" are by definition "processes that we are unable to decompose into more basic parts".

The moment we learn of a more fundamental process underlying a given one, the old one stops being "quantum" and the new one takes its place.

[-] olicvb@lemmy.ca 62 points 1 year ago

holy crap, the answer to life the universe and everything XD

[-] HarkMahlberg@kbin.social 29 points 1 year ago* (last edited 1 year ago)

I mean... they didn't specify it had to be random (or even uniform)? But yeah, it's a good showcase of how GPT acquired the same biases as people, from people.

[-] OsrsNeedsF2P@lemmy.ml 22 points 1 year ago

uniform

Reminds me of my previous job, where our LLM was grading things too high. The AI "engineer" adjusted the prompt to tell the LLM that the average output should be 3. I had a hard time explaining that this wouldn't do anything at all, because all the chats were independent events.

Anyways, I quit that place and the project completely derailed.
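
That independence point can be made concrete with a toy simulation. Everything here is hypothetical (a made-up grade distribution, not the actual system): each call draws its grade in isolation, so no per-call instruction can steer the population average.

import random

# Each grading request is a separate, stateless call: it sees only its own
# conversation, never the running average of the other grades.
def grade_one(rng):
    # Hypothetical per-call grade distribution, biased high
    return rng.choices([1, 2, 3, 4, 5], weights=[1, 2, 4, 8, 10])[0]

rng = random.Random(0)
grades = [grade_one(rng) for _ in range(10_000)]
print(sum(grades) / len(grades))  # ~3.96, no matter what any single prompt says the average "should" be

# Actually pulling the mean toward 3 would require shared state across calls
# (e.g. feeding each grader the running average), which independent chats don't have.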

[-] lauha@lemmy.one 27 points 1 year ago

Ask humans the same and the most common number is 37.

[-] Catsrules@lemmy.ml 13 points 1 year ago

I saw that YouTube video as well.

[-] ForestOrca@kbin.social 25 points 1 year ago

WAIT A MINUTE!!! You mean Douglas Adams was actually an LLM?

[-] FlashMobOfOne@beehaw.org 22 points 1 year ago

HA, funny that this comes up. DND Beyond doesn't have a d100, so I opened my ChatGPT sub and had it roll a d100 for me a few times so I could use my magic beans properly.

[-] terminhell@lemmy.dbzer0.com 18 points 1 year ago

I use the percentile die for that.

[-] FlashMobOfOne@beehaw.org 8 points 1 year ago

Also an excellent method.

[-] TauriWarrior@aussie.zone 11 points 1 year ago* (last edited 1 year ago)

Opened up DND Beyond to check, since I remember rolling it before, and it's there: it's between the d8 and d10, and the picture shows two dice.

[-] Urist@lemmy.ml 9 points 1 year ago

Roll two d10, once for each digit, and profit?
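
A quick sketch of that percentile-dice trick in Python, reading a roll of 00 and 0 as 100 (the usual tabletop convention):

import random

# Emulate a d100 with two d10s: one die gives the tens digit, one the units.
def d100(rng=random):
    tens = rng.randrange(10) * 10   # 00, 10, ..., 90
    units = rng.randrange(10)       # 0 .. 9
    roll = tens + units
    return 100 if roll == 0 else roll

print([d100() for _ in range(5)])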

[-] Cube6392@beehaw.org 7 points 1 year ago

But why use ChatGPT for that? Why not a DuckDuckGo action? I just don't understand why we're asking an LLM, whose goal is consistency, not randomness, to do something random.

[-] DarkFox@pawb.social 16 points 1 year ago

Which model?

When I tried on ChatGPT 4, it wrote a short python script and executed it to get a random integer.

import random

# Pick a random number between 1 and 100
random_number = random.randint(1, 100)
random_number  # in a notebook-style sandbox, the value of the last expression is what gets reported back
[-] TonyTonyChopper@mander.xyz 7 points 1 year ago

Does the neural network actually run scripts, or is it pretending?

[-] amju_wolf@pawb.social 10 points 1 year ago

It generates code and then you can use a call to some runtime execution API to run that code, completely separate from the neural network.
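
Roughly this shape, as a hedged sketch; the names and the sandboxing are hypothetical, since the real execution API isn't public:

import subprocess, sys

# The model only *writes* the code; a separate runtime executes it and the
# output is fed back into the conversation as a tool result.
generated_code = 'import random; print(random.randint(1, 100))'  # imagine this string came back from the LLM

result = subprocess.run([sys.executable, "-c", generated_code],
                        capture_output=True, text=True, timeout=10)
print(result.stdout.strip())  # this number comes from the Python interpreter, not from the network's weights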

[-] xyguy@startrek.website 10 points 1 year ago

Only 1000 times? It's interesting that there's such a bias there but it's a computer. Ask it 100,000 times and make sure it's not a fluke.
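
That's easy to script around whatever client you're using. A sketch where ask_llm() is a simulated stand-in for the real API call (the bias weights are made up just so it runs end to end):

import random
from collections import Counter

def ask_llm() -> int:
    # Stand-in for the real model call: simulate a picker biased toward 37 and 42.
    return random.choices(range(1, 101),
                          weights=[8 if n in (37, 42) else 1 for n in range(1, 101)])[0]

N = 100_000
counts = Counter(ask_llm() for _ in range(N))

expected = N / 100  # a fair uniform pick would hit each number ~1,000 times
for value, seen in counts.most_common(5):
    print(f"{value:3d}: {seen:6d}  ({seen / expected:.1f}x uniform)")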

[-] BuboScandiacus@mander.xyz 8 points 1 year ago
[-] thesmokingman@programming.dev 8 points 1 year ago

42, 47, and 50 all make sense to me. What’s the significance of 37, 57, and 73?

[-] Rekhyt@beehaw.org 31 points 1 year ago

There's a great Veritasium video recently about this exact thing: https://youtu.be/d6iQrh2TK98

It's a human thing, though. This is just more evidence of LLMs' garbage-in, garbage-out problem: human biases being present in a system that people want to claim doesn't have them.

People do mention Veritasium, though he doesn't give any significant explanation of the phenomenon.

I still wonder about 47. In the Veritasium plots, all these numbers show a peak, but 47 doesn't. I recall from my childhood that I did indeed notice that number everywhere, but idk why.

[-] Grimpen@lemmy.ca 9 points 1 year ago

Veritasium just released a video about people picking 37 when asked to pick a random number.

[-] lolola 8 points 1 year ago
[-] kciwsnurb@aussie.zone 9 points 1 year ago

The temperature scale, I think. You divide the logit output by the temperature before feeding it to the softmax function. Larger (resp. smaller) temperature results in a higher (resp. lower) entropy distribution.
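
In code it's just a one-line rescaling before the softmax. A small NumPy sketch with made-up logits, purely to show the effect on the distribution's entropy:

import numpy as np

def softmax_with_temperature(logits, temperature):
    z = np.asarray(logits, dtype=float) / temperature  # divide logits by T first
    z -= z.max()                                       # for numerical stability
    p = np.exp(z)
    return p / p.sum()

logits = [2.0, 1.0, 0.5, 0.1]  # made-up logits for four tokens
for t in (0.5, 1.0, 2.0):
    p = softmax_with_temperature(logits, t)
    entropy = -(p * np.log(p)).sum()
    print(f"T={t}: probs={np.round(p, 3)}, entropy={entropy:.2f} nats")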

[-] ArmoredThirteen@lemmy.ml 8 points 1 year ago

I don't understand any of these words, I need to take a math class or something

Higher temperature -> more chaotic output

[-] TheOctonaut@mander.xyz 7 points 1 year ago

Temperature is basically how creative you want the AI to be. The lower the temperature, the more predictable (and repeatable) the response.

[-] PhreakyByNature@feddit.uk 7 points 1 year ago

NEEDS MOAR 69 FELLOW HUMAN

[-] Semi-Hemi-Demigod@kbin.social 7 points 1 year ago

So what? It figured out The Answer, big whoop.

Get back to me when it figures out The Question.

this post was submitted on 10 Apr 2024
420 points (100.0% liked)
