[-] teft@lemmy.world 62 points 6 months ago* (last edited 6 months ago)

Just like the human eye can only register 60fps and no more, your computer can only register 4gb of RAM and no more. Anything more than that is just marketing.

Fucking /S since you clowns can't tell.

[-] Kelo@lemmy.world 17 points 6 months ago

Human eye can't see more than 1080p anyway, so what's the point

[-] starman@programming.dev 18 points 6 months ago

It doesn't matter honestly, everyone knows humans can't see screens at all

[-] SomeBoyo@feddit.de 4 points 6 months ago

It honestly doesn't matter, reality only exists in your imagination anyway.

[-] AtariDump@lemmy.world 2 points 6 months ago

Their vision is based on movement.

[-] rustydrd@sh.itjust.works 5 points 6 months ago

Human eye can't see more than 8-bit colors anyway, so what's the point

[-] MonkderDritte@feddit.de 9 points 6 months ago

Joke's on you, because I looked into this once. I don't remember the exact number of milliseconds the light-sensitive rods in the human eye need to refresh their photopigment, but it worked out to about 70 fps, so roughly 13 ms I guess (the color-sensitive cones are far slower). Psycho-optical effects can push that number up to around 100 fps on LCD displays. It also looks like certain computer tasks can train you to follow movements with your eyes, which makes you far more sensitive to flickering.
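
(A quick back-of-the-envelope check of the conversion above; the ~13 ms rod-refresh figure is the commenter's number, not a verified value.)

```python
# Rough sanity check of the refresh-period-to-fps conversion mentioned above.
# The ~13 ms rod-refresh figure comes from the comment, not a verified source.

def period_ms_to_fps(period_ms: float) -> float:
    """Convert a refresh period in milliseconds to frames per second."""
    return 1000.0 / period_ms

print(round(period_ms_to_fps(14.3)))  # ~70 fps, the quoted rod figure
print(round(period_ms_to_fps(13.0)))  # ~77 fps, close to the "about 13 ms" guess
print(round(period_ms_to_fps(10.0)))  # 100 fps, the psycho-optical upper estimate
```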

[-] SorryQuick@lemmy.ca 5 points 6 months ago

According to this study, the eye can see a difference as high as 500 fps. While this is a specific scenario, it's one that could plausibly happen in a video game, so I guess it means we can go up to around 500 Hz monitors before it becomes overkill or unnecessary.

[-] captain_aggravated@sh.itjust.works 4 points 6 months ago

Does that refresh take place across the entire eye simultaneously or is each rod and/or cone doing its own thing?

[-] teft@lemmy.world 5 points 6 months ago* (last edited 6 months ago)

Are your eyeballs progressive scan or interlaced, son?

[-] MonkderDritte@feddit.de 3 points 6 months ago* (last edited 6 months ago)

There's a neuron layer trimming the data down to squeeze it through the optic nerve, so... no clue.

[-] iopq@lemmy.world 4 points 6 months ago

It's not about training, eye tracking is just that much more sensitive to pixels jumping

You can immediately see choppy movement when you look around in a 1st person view game. Or if it's an RTS you can see the trail behind your mouse anyway

I can see this choppiness at 280 FPS. The only way to get rid of it is to turn on strobing, but that comes with double images at certain parts of the screen

Just give me a 480 FPS OLED with black frame insertion already, FFS

[-] MonkderDritte@feddit.de 3 points 6 months ago

Well, I don't follow movements with my eyes (I jump straight to the target) and see no difference between 30 and 60 FPS; I run Ark Survival comfortably on my iGPU at 20 FPS. And I'm still pretty good at shooters.

Yeah, it's a shame that our current tech stack doesn't allow updating only the parts of the image where something actually changes.

[-] TheRedSpade@lemmy.world 7 points 6 months ago

This is only true if you're still using a 32-bit CPU, which almost nobody is. 64-bit CPUs can address up to 16 million TB of RAM.
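
(For anyone wondering where the "16 million TB" figure comes from, here's a rough sketch of the arithmetic behind a 64-bit address space; these are theoretical address-space limits, not what any real board supports.)

```python
# Where the "16 million TB" figure comes from: a 64-bit address space.
address_bits = 64
addressable_bytes = 2 ** address_bits         # 18,446,744,073,709,551,616 bytes
print(addressable_bytes / 2 ** 40, "TiB")     # 16,777,216 TiB, i.e. ~16 million TB

# And the 32-bit case behind the original "4 GB" claim:
print(2 ** 32 / 2 ** 30, "GiB")               # 4.0 GiB
```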

[-] teft@lemmy.world 19 points 6 months ago

Sorry I forgot to put my giant /s.

[-] Sibbo@sopuli.xyz 5 points 6 months ago

With PAE, a 32-bit CPU can also use more (up to 64 GiB of physical memory), but each process is still limited to 4 GiB.
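
(A quick sketch of where those PAE limits come from: physical addresses widen to 36 bits while each process's virtual address space stays 32-bit; the figures below are the theoretical limits.)

```python
# PAE arithmetic: physical addresses widen to 36 bits, virtual stays at 32.
physical_bits = 36   # with PAE enabled on a 32-bit x86 CPU
virtual_bits = 32    # per-process virtual address space is unchanged

print(2 ** physical_bits / 2 ** 30, "GiB of physical RAM addressable")  # 64.0 GiB
print(2 ** virtual_bits / 2 ** 30, "GiB visible to each process")       # 4.0 GiB
```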

[-] TimeSquirrel@kbin.social 4 points 6 months ago* (last edited 6 months ago)

This is only true if you're still using a 32-bit CPU

Bank switching to "fake" access to more address space was a big thing in the '80s... so it's technically possible to reach memory wider than the address bus by splitting it into banks and mapping them, one at a time, into the range the CPU can see.
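
(A toy sketch of the bank-switching idea under made-up sizes: a CPU with a small visible window reaches a larger physical memory one bank at a time. None of the names or sizes refer to a specific machine.)

```python
# Toy model of bank switching: a CPU with a narrow address bus reaches a
# larger memory by mapping a small "window" onto one bank at a time.
# All sizes and names here are illustrative, not any specific 80s machine.

WINDOW_SIZE = 16 * 1024                    # 16 KiB window visible to the CPU
NUM_BANKS = 8                              # 8 banks -> 128 KiB total memory
physical_memory = bytearray(WINDOW_SIZE * NUM_BANKS)
bank_register = 0                          # selects which bank the window shows

def select_bank(bank: int) -> None:
    global bank_register
    bank_register = bank % NUM_BANKS

def read(window_addr: int) -> int:
    # The CPU only emits addresses inside the window; the bank register
    # supplies the missing high bits of the physical address.
    return physical_memory[bank_register * WINDOW_SIZE + window_addr]

def write(window_addr: int, value: int) -> None:
    physical_memory[bank_register * WINDOW_SIZE + window_addr] = value

select_bank(5)
write(0x0100, 0xAB)
select_bank(0)
print(hex(read(0x0100)))    # same window address, different bank: 0x0
select_bank(5)
print(hex(read(0x0100)))    # back to bank 5: 0xab
```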

[-] pennomi@lemmy.world 4 points 6 months ago

That’s not sarcasm, it’s misinformation. Not surprising that people downvoted you even though it was just a joke.

[-] starman@programming.dev 14 points 6 months ago* (last edited 6 months ago)

I don't think anybody actually read that computers can't register more than 4 GiB of RAM and then thought

That's totally true, because u/teft said it is

[-] pennomi@lemmy.world 4 points 6 months ago

It certainly used to be true, in the era of 32-bit computers.

That's what makes it a joke. Does anyone here unironically think the human eye can only see 60 fps, or that more than 4 gigs of RAM is just marketing?
