111
submitted 10 months ago by ugjka@lemmy.world to c/linux@lemmy.ml
[-] BananaTrifleViolin@lemmy.world 90 points 10 months ago* (last edited 10 months ago)

The actual answer is on Stack Exchange, in the comments:

https://unix.stackexchange.com/questions/740319/why-is-gnome-fractional-scaling-1-7518248558044434-instead-of-1-75

It comes down to a mix of the actual display resolution versus the virtual (scaled) resolution it's converted to, plus the use of single-precision floating point in the calculation.

Essentially, my understanding is that it's storing the value needed to convert your display's actual number of pixels (2160p) to a virtual number of pixels (2160 / 1.75), but that gets you a fraction of a virtual pixel. So instead of 1.75 it scales by 1.75182... to land on a whole number of virtual pixels. On top of that, the stored figure drifts slightly from what we'd expect because of floating point rounding.

If you take the actual resolution, 2160 pixels, and divide it by the 1233 virtual pixels it's trying to use, you get the conversion value 1.75182... that avoids fractions of a pixel. If you used 1.75 you'd get 1234.2857... virtual pixels. So GNOME stores the fraction that gives a clean whole-pixel conversion instead.
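A minimal C sketch of that arithmetic (the 1233-pixel target and the single-precision storage step are taken from the linked answer; the variable names are made up):

```c
#include <stdio.h>

int main(void) {
    /* 2160 physical pixels mapped onto 1233 virtual pixels
       (the whole-number target from the linked answer). */
    int physical = 2160;
    int logical  = 1233;

    /* Store the ratio in single precision, then widen it back
       to double, which reveals the extra digits that end up in
       the config file. */
    float scale = (float)physical / (float)logical;
    printf("%.17g\n", (double)scale);  /* expected: 1.7518248558044434 */

    /* The exact ratio in double precision, for comparison: */
    printf("%.17g\n", (double)physical / (double)logical);  /* ~1.7518248175182482 */
    return 0;
}
```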

Full credit to rakslice on Stack Exchange, who goes into more detail.

[-] Dirk@lemmy.ml 22 points 10 months ago

For the same reason a lot of programming languages can't calculate 0.1+0.2 properly.

There's a website explaining it: https://0.30000000000000004.com/
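The classic demonstration, shown here in C (the same result appears in any language that uses IEEE 754 doubles):

```c
#include <stdio.h>
#include <math.h>

int main(void) {
    double sum = 0.1 + 0.2;
    printf("%.17g\n", sum);                              /* 0.30000000000000004 */
    printf("%s\n", sum == 0.3 ? "equal" : "not equal");  /* not equal */

    /* The usual workaround: compare within a small tolerance. */
    if (fabs(sum - 0.3) < 1e-9)
        printf("close enough\n");
    return 0;
}
```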

[-] MotoAsh@lemmy.world 57 points 10 months ago

Floating point error? Yeaahhh no. No. Just... no. That is NEVER as big as 0.01 unless the number is also insanely massive.

The error is relative in scale. It's not magically significant fractions off.
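A quick C illustration of that relative bound (machine epsilon is the worst-case relative error of a single rounding, give or take a factor of two):

```c
#include <stdio.h>
#include <float.h>

int main(void) {
    /* One rounded operation is off by at most half a unit in the
       last place, i.e. a *relative* error of about epsilon / 2. */
    printf("float  epsilon: %g\n", FLT_EPSILON);  /* ~1.19e-07 */
    printf("double epsilon: %g\n", DBL_EPSILON);  /* ~2.22e-16 */

    /* So near 1.75, a single float rounding moves the value by
       roughly 1e-7 at most, nowhere near 0.01. */
    return 0;
}
```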

[-] Giooschi@lemmy.world 4 points 10 months ago

TBF the error can become that big if you do a bunch of unstable operations (i.e. operations that continue to increase the relative error), though that's probably not what is happening here.
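A classic example of such an unstable operation is catastrophic cancellation; a C sketch:

```c
#include <stdio.h>

int main(void) {
    /* Neither operand is exactly representable; each carries a
       relative error of roughly 1e-16. */
    double a = 1.0000001;
    double b = 1.0000000;

    /* The subtraction itself is exact, but the tiny representation
       errors now dominate the small result: the relative error jumps
       from ~1e-16 to ~1e-9. */
    printf("%.17g\n", a - b);  /* close to, but not exactly, 1e-07 */
    return 0;
}
```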

[-] MotoAsh@lemmy.world 3 points 10 months ago

To get to 0.01 error, you'd need to add up trillions of trillions of floating point errors. It will not happen solely because of floating point unless you're doing such crazy math that you shouldn't be using primitives in the first place.
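For a sense of scale, a C sketch of naive accumulation; even ten million rounded additions leave the double-precision total off by only a tiny fraction of a unit:

```c
#include <stdio.h>

int main(void) {
    /* Add 0.1 ten million times; the exact answer is 1000000. */
    double sum = 0.0;
    for (int i = 0; i < 10000000; i++)
        sum += 0.1;

    printf("%.17g\n", sum);
    printf("error: %g\n", sum - 1000000.0);  /* tiny, despite 1e7 roundings */
    return 0;
}
```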

[-] Giooschi@lemmy.world 2 points 10 months ago

That's why I said unstable operations. Addition is considered a stable operation (for values with the same sign).

[-] dgriffith@aussie.zone 1 points 10 months ago* (last edited 10 months ago)

As the answer in the link explains, it's an adjustment of your scaling factor to the nearest whole pixel, plus a loss of precision from converting between single- and double-precision floating point values.

So I'm not really sure of the point of this post. It's not a question, as the link quite effectively answers it. It's more just "here's why your scaling factor looks weird in your gnome config file", and it's primarily the first reason - rounding to whole pixels.

[-] Neon@lemmy.world 1 points 10 months ago

0.001, but still

[-] ElectroLisa 30 points 10 months ago

If I'm not mistaken, 1.75 can be stored exactly as a float, since it's just 1 + 1/2 + 1/4.
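Easy to verify, e.g. in C:

```c
#include <stdio.h>

int main(void) {
    float f = 1.75f;               /* 1 + 1/2 + 1/4: exact in binary */
    printf("%.17g\n", (double)f);  /* prints 1.75, no extra digits */
    printf("%s\n", f == 1.75 ? "exact" : "rounded");  /* exact */
    return 0;
}
```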

[-] Aradia@lemmy.ml 5 points 10 months ago* (last edited 10 months ago)

Gnome is coded with JavaScript (lmao 🤣) ~~so yeah, I think you are right.~~

EDIT: Actually, even though JavaScript and other languages have this issue, the value 1.7518248558044434 doesn't come from it. There is another reply that explains it and makes total sense. But it's still pretty lame to know the desktop runs on JavaScript. (Yeah, I hate Gnome)

[-] priapus@sh.itjust.works 14 points 10 months ago

GNOME is primarily written in C

[-] yukijoou 6 points 10 months ago

the desktop shell is mostly javascript though

[-] priapus@sh.itjust.works 14 points 10 months ago

Closer to 50/50, and other parts of the GNOME desktop, like mutter, are largely C. Saying the entire GNOME desktop is mostly JS is silly.

[-] kbal@fedia.io 2 points 10 months ago

On the other hand, saying that there's way too much javascript in it is objectively factual.

[-] priapus@sh.itjust.works 10 points 10 months ago

You don't get to decide what too much JS in the project is unless you actually work on it and have in-depth knowledge of the project. I don't like JS, but it has its uses.

Many people are conflating modern electron bloatware with 'JS bad', but things are not that simple.

[-] Aradia@lemmy.ml 1 points 10 months ago

No one here said GNOME desktop is mostly JS.

[-] priapus@sh.itjust.works 1 points 10 months ago

You're right, they said the desktop shell, which is still incorrect, but I guess a little less incorrect. My bad.

[-] Aradia@lemmy.ml 1 points 10 months ago

Well, I started this thread saying it runs on JavaScript, and I mean that they need JS for most of the interactions with the desktop, like gesture or mouse events. 😞 Even if most of the code is C, we all know you have to write many more lines of C than JS to do the same thing, so most of the logic in GNOME is driven by JS. We need some Rust here. 🦀 🦀 🦀 🦀

[-] Aradia@lemmy.ml 3 points 10 months ago

Yeah, their Git repo says 46% of the code is JavaScript: https://gitlab.gnome.org/GNOME/gnome-shell

That's quite a lot, almost half of the code.

[-] priapus@sh.itjust.works 10 points 10 months ago

That page also shows that there is more C. That page is also specifically the shell, not all of the desktop.

[-] Aradia@lemmy.ml 1 points 10 months ago

There is less than 4% more C code than JavaScript. That's quite a lot; many features of the GNOME desktop use JavaScript too, like gestures and mouse events.

[-] possiblylinux127@lemmy.zip 1 points 10 months ago

The JavaScript and TypeScript GTK bindings are nice and make building apps pleasant.

[-] Aradia@lemmy.ml 2 points 10 months ago

Okay, but it still needs JavaScript; they are slowly trying to remove or improve it. But it's a fact that it also runs on JavaScript. 🤣

[-] priapus@sh.itjust.works 11 points 10 months ago

Using JavaScript isn't inherently a bad thing. JavaScript can be very useful when used for scripting. Obviously anything with a need for performance will be done in C.

[-] Aradia@lemmy.ml 3 points 10 months ago

JavaScript isn't the best language for building a desktop interface, in my opinion. It can be efficient, but you could see in bug reports (at least in the past) how badly it performed, and they had to refactor parts into C or improve the JavaScript. I'm just laughing and making fun of it for using JavaScript, not saying it's slow; GNOME is pretty fast nowadays.

[-] TimeSquirrel@kbin.social 2 points 10 months ago

Javascript was a toy created in the mid 90s to make dumb interactive animations and have some sort of dynamic aspect to a web page. The world starting to code entire desktop programs and servers in it was a giant, horrific, societal mistake.

[-] atzanteol@sh.itjust.works 14 points 10 months ago

It's not a "language" issue, it's a "computer" issue. This math is being done on the CPU.

IEEE 754

Some languages do provide "arbitrary-precision math" (Java's BigDecimal, for example), but it's slower. Not what you want if you're multiplying a 4K matrix every millisecond.
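A sketch of the same idea in C, using the GMP library's exact rational type instead of Java (assuming GMP is installed; link with -lgmp):

```c
#include <stdio.h>
#include <gmp.h>

int main(void) {
    mpq_t a, b, sum;
    mpq_inits(a, b, sum, NULL);

    mpq_set_ui(a, 1, 10);  /* exactly 1/10 */
    mpq_canonicalize(a);
    mpq_set_ui(b, 2, 10);  /* exactly 2/10 */
    mpq_canonicalize(b);   /* reduce to lowest terms, as GMP requires */

    mpq_add(sum, a, b);
    gmp_printf("%Qd\n", sum);  /* 3/10, no rounding at all */

    mpq_clears(a, b, sum, NULL);
    return 0;
}
```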

[-] Aradia@lemmy.ml 4 points 10 months ago

I see, thanks for the explanation.

[-] gian@lemmy.grys.it 1 points 10 months ago

True, but it is not that difficult to truncate (or round) the value to the second decimal place.
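For what it's worth, the rounding itself would be a one-liner in C (though the precise value is what keeps the pixel counts whole):

```c
#include <stdio.h>
#include <math.h>

int main(void) {
    double scale = 1.7518248558044434;

    /* Round to two decimal places for display. */
    double rounded = round(scale * 100.0) / 100.0;
    printf("%.2f\n", rounded);  /* 1.75 */
    return 0;
}
```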
