[-] Diplomjodler3@lemmy.world 193 points 5 months ago

Planned obsolescence is one of the major engines that keep our current system of oligarchic hypercapitalism alive. Won't anybody think of the poor oligarchs?!?

[-] huginn@feddit.it 142 points 5 months ago

Resources are just way cheaper than developers.

It's a lot cheaper to have double the ram than it is to pay for someone to optimize your code.

And if you're working with code that requires that level of resource optimization, you'll invariably end up with low-level code libraries that are hard to maintain.

... But fuck the Always on internet connection and DRM for sure.

[-] rbn@sopuli.xyz 111 points 5 months ago

If you consider only the RAM on the developers' PCs, maybe. If you count the thousands of customer PCs, then optimizing the code outperforms hardware upgrades pretty fast. If millions of people have to buy new hardware because of a new Windows feature, that's pretty disastrous from a sustainability point of view.

[-] vithigar@lemmy.ca 45 points 5 months ago

But that's just more business!

[-] huginn@feddit.it 18 points 5 months ago

Last time I checked - your personal computer wasn't a company cost.

Until it is, nothing changes - and to be totally frank, the last thing I want is to be on a corporate machine at home.

[-] CosmicTurtle0@lemmy.dbzer0.com 10 points 5 months ago

When I was last looking for a fully remote job, a lot of companies gave you a "technology allowance" every few years where they give you money to buy a computer/laptop. You could buy whatever you wanted but you had that fixed allowance. The computer belonged to you and you connected to their virtual desktops for work.

Honestly, I see more companies going in this direction. My work laptop has an i7 and 16GB of RAM. All I do is use Chrome.

[-] huginn@feddit.it 10 points 5 months ago

It'd be nice to have that - yeah. My company issued me a laptop with only 16GB of RAM to try and build Android projects.

Idk if you know Gradle builds, but a multi-module project regularly consumes 20+GB of RAM during a build. Despite the cost difference being paid for in productivity gains within a month, it took 6 months and a lot of fighting to get a 32GB laptop.

My builds immediately went from 8-15 minutes down to 1-4.

[-] FlapJackFlapper@lemm.ee 98 points 5 months ago

Reminds me of a funny story I heard Tom Petty once tell. Apparently, he had a buddy with a POS car with a crappy stereo, and Tom insisted that all his records had to be mixed and mastered not so that they sounded great on the studio's million-dollar equipment, but so that they sounded great in his friend's car.

[-] FlyingSquid@lemmy.world 38 points 5 months ago

That's how my professors instructed me to mix: make it sound as good as possible on shitty speakers while also sounding good on expensive systems.

[-] tfw_no_toiletpaper@lemmy.world 33 points 5 months ago* (last edited 5 months ago)

Reminds me of the ass audio mixing in movies, where it's only enjoyable in a 7.1 cinema or your rich friend's home theater, but not on your own setup.

[-] Lorindol@sopuli.xyz 28 points 5 months ago

I had the exact same approach back in the late '90s. My friends had several band projects, and when they were mixing their demos, I insisted that if the mixes sounded good on a standard car stereo, they'd sound good anywhere.

[-] mPony@lemmy.world 10 points 5 months ago

This is still a perfectly sound method.

Getting the music you made in your own DAW to sound good on your home speakers is almost easy. Getting it to not suck on shitty speakers? That's an art.

[-] puchaczyk 80 points 5 months ago

Most of the abstractions, frameworks, "bloats", etc. are there to make development easier and therefore cheaper, but to run such software you need more and more expensive hardware. In a way, it's just pushing some of the development costs onto the consumer.

[-] UnderpantsWeevil@lemmy.world 29 points 5 months ago

Most of the abstractions, frameworks, “bloats”, etc. are there to make development easier and therefore cheaper

That's true to an extent. But I've been on the back side of this kind of development, and the frameworks can quickly become their own arcane, esoteric beasts. One guy implements the "quick and easy" framework (with 16 GB of bloat) and then fucks off to do other things without letting anyone else know how best to use it. Then the half-dozen coders that come in behind have no idea how to do anything and end up making these bizarre hacks and spaghetti code patches to do what the framework was already doing, but slower and worse.

The end result is a program that needs top of the line hardware to execute an oversized pile of javascripts.

[-] jpeps@lemmy.world 62 points 5 months ago

Reminds me of the UK's Government Digital Service, which wants to digitise government processes but also has a responsibility to keep those services as accessible and streamlined as possible, so that even a homeless person using a £10 phone on a 2G data service still has an acceptable experience.

An example: here they painstakingly remove jQuery (most modern frameworks are way too big) from the site and shave 32 KB off the site size.
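
A rough idea of what that kind of migration looks like in practice - a minimal sketch with made-up selectors, not GOV.UK's actual code - since most of what jQuery used to be needed for now has a short native equivalent:

```typescript
// jQuery-era version (hypothetical selector names):
//   $('.js-menu-toggle').on('click', function () {
//     $('#navigation').toggleClass('hidden');
//   });

// Plain DOM equivalent - no extra library required:
document.querySelectorAll<HTMLElement>('.js-menu-toggle').forEach((toggle) => {
  toggle.addEventListener('click', () => {
    document.getElementById('navigation')?.classList.toggle('hidden');
  });
});
```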

[-] roguetrick@lemmy.world 31 points 5 months ago

That's the most professional comment section I've ever fucking seen.

[-] draughtcyclist@lemmy.world 23 points 5 months ago

Website is amazingly responsive as well, seems to be working.

[-] ameancow@lemmy.world 19 points 5 months ago

Hasn't been linked to reddit yet probably.

Getting away from reddit has shown me that there are unspoiled places in the digital world out there, communities of people who actually care about the topic and not about performative posting and internet attention.

[-] Magister@lemmy.world 49 points 5 months ago

When you see what ONE coder was able to do in the 80s, with 64K of RAM, on a 4MHz CPU, and in assembly, it's quite incredible. I miss my Amstrad CPC6128 and all its good games.

[-] prole@sh.itjust.works 40 points 5 months ago* (last edited 5 months ago)

Still happens.

Animal Well was coded by one guy, and it was ~35 MB on release (I think it's above 100 MB at this point after a few updates, but still). The game is massive and pretty complex. And it's the size of an SNES ROM.

Dwarf Fortress has to be one of the most complex simulations ever created, developed by two brothers and given out for free for several decades. Prior to adding actual graphics, DF was ~100 MB, and the Steam version is still remarkably compact.

I am consistently amazed by people's ingenuity with this stuff.

[-] Blackmist@feddit.uk 17 points 5 months ago

SNES ROMs were actually around 4MB. People always spoke about them being 32 Meg or whatever, but they meant megabits - and 32 megabits is only 4 megabytes.

I did like Animal Well, but gave up after looking at one of the bunny solutions and deciding I didn't have the patience for that.

I think most of the size of games is just graphics and audio. The code for most games is pretty small, but for some godforsaken reason it's really important that they include incredibly detailed doorknobs and 50 hours of high-quality speech for a dozen languages in raw format.

[-] linearchaos@lemmy.world 48 points 5 months ago

Doesn't really matter what your developers run on; you need your QA to be running on trash hardware.

We can even cut out the middleman and optimize Unity and Unreal to run on crap.

[-] meliaesc@lemmy.world 19 points 5 months ago

Joke's on you, my corporate job has crippled the Macs they gave us so much that EVERYONE has trash hardware!

[-] manicdave@feddit.uk 45 points 5 months ago

I can think of a few game franchises that wouldn't have trashed their reputations if they'd had an internal rule like "if it doesn't play on 50% of the machines in Steam's hardware survey, it's not going out".

[-] UnderpantsWeevil@lemmy.world 25 points 5 months ago

I think it's given us a big wave of "Return to pixelated tradition" style games. When you see 16-bit sprites in the teaser, you can feel reasonably confident your computer will run it.

[-] manicdave@feddit.uk 29 points 5 months ago

I don't mind if indie devs try something experimental that melts your computer. Like, BeamNG needs a decent computer, but the target audience kinda knows about that sort of stuff.

The problem is with games like Cities: Skylines 2. Most people buying that game probably don't even know how much RAM they have; it shouldn't be unplayable on a mid-range PC.

[-] yamanii@lemmy.world 41 points 5 months ago

I knew someone who refused to upgrade the programmer's workstation precisely because it would have been a big leap in performance compared to what their customers used the software on. Needless to say, the program was very fast even on weaker hardware.

[-] Cowbee@lemmy.ml 36 points 5 months ago

We need more, shorter games, made by happier devs paid more to work fewer hours, with worse graphics.

[-] WolfLink@lemmy.ml 29 points 5 months ago

The ideal is "plays fine at the lowest graphics settings on old hardware" while having high graphics settings that look fantastic but require top-of-the-line hardware to play reasonably.

Generally this is almost impossible to achieve.

[-] DmMacniel@feddit.de 23 points 5 months ago

But... where is the innovation (and also the alt text)?

[-] lord_admiral@lemmy.world 20 points 5 months ago

Probably an innovative revelation of the concept of "bloat".

[-] Gamers_Mate@kbin.run 16 points 5 months ago

Image description.

The image is a screenshot of a tumblr post by user elbiotipo.

My solution for bloatware is this: by law you should hire in every programming team someone who is Like, A Guy who has a crappy laptop with 4GB and an integrated graphics card, no scratch that, 2 GB of RAM, and a rural internet connection. And every time someone in your team proposes to add shit like NPCs with visible pores or ray tracing or all the bloatware that Windows, Adobe, etc. are doing now, they have to come back and try your project in the Guy’s laptop and answer to him. He is allowed to insult you and humiliate you if it doesn’t work in his laptop, and you should by law apologize and optimize it for him. If you try to put any kind of DRM or permanent internet connection, he is legally allowed to shoot you.

With about 5 or 10 years of that, we will fix the world.

[-] IrateAnteater@sh.itjust.works 22 points 5 months ago

I think that every operating system needs to have a "do what the fuck I told you to" mode, especially when it comes to networking. I've come close to going full luddite just trying to get smart home devices to connect to a non-internet-connected network (which of course you can only do through a dogshit app) and having my phone constantly try to drop that network since it has no Internet.

I get the desire to have everything be as hand-holdy as possible, but it's really frustrating when the hand-holding way doesn't work, there is absolutely zero recourse, and even less ability to tell what went wrong.

Then there's my day job, where I get to deal with crappy industrial software, flaky Internet connections, and really annoying things like Hyper-V occupying network ports when it's not even open.

[-] Semi_Hemi_Demigod@lemmy.world 18 points 5 months ago

If I can't type the program into my TRS-80 from a computer magazine I don't trust it.

[-] HStone32@lemmy.world 16 points 5 months ago

I'm training to work in hardware currently. It's my hope that there, at least, people still care about min-maxing power vs. performance.

[-] mariusafa@lemmy.sdf.org 15 points 5 months ago

This is the way. Most games today run like shit because people don't know or care about managing computer resources.

[-] fossphi@lemm.ee 12 points 5 months ago

I 100% agree. But, where Linux?

[-] sasquash@sopuli.xyz 12 points 5 months ago

But how would you implement that new Microsoft screenshot surveillance bullshit feature? Just imagine what a giant waste of resources that is. You have something on your screen which is information, and most likely already in a good form to process, like text. But it takes a screenshot every few seconds and uses some "AI" to make the already existing information searchable again from a fucking screenshot??? Maybe I missed something, but that is how I understood the feature.

[-] outerspace@lemmy.zip 12 points 5 months ago

Better to run a whole generation, so like 30 years, so people would start planning the upgrades ahead for when everyone is ready.

[-] 11111one11111@lemmy.world 11 points 5 months ago

In 1000 years, this meme/tweet/post will be what my entire generation's existence is known for. No one will remember the politics, the disasters, the geopolitical events good or bad; they will remember our entire world and existence as the only time that technological advancement was driven by the big tech mafia trying to see how far it can get its dick into your digital footprint.

It's the new cops vs. robbers or bootleggers vs. Prohibition race. Our tech is getting faster to outrun the corporate fuckin' malware, but the faster we go, the more they stuff in, so the average user ends up paying $6k for a GPU/CPU combo that runs with the same efficiency as my school library's computer did running Oregon Trail on MS-DOS in 1995. You are so confined to functions with massive fuckin' app buttons that even logging in as a guest user requires you to memorize every CLI ever made.

It's become my defining "I don't want to live in this world anymore" moment.

[-] Sleepyforestwizard@slrpnk.net 11 points 5 months ago

antiX made my old Chromebooks usable. Old tech is fun.

[-] Ugurcan@lemmy.world 11 points 5 months ago

Stop using JS/Node for even brewing your coffee and watch this problem resolve itself.

[-] SuperSpruce@lemmy.zip 10 points 5 months ago

I make sure my own web game can run smoothly on crappy hardware. It runs well on my gaming laptop downclocked to 400MHz with a 4x slowdown set by Chrome. It also loads in a couple seconds with a typical crappy Internet connection of 200kbps and >10% packet loss. However, it doesn't run smoothly on my Snapdragon 425 phone or my old Core 2 Duo laptop. Is this my game or just browser overhead?
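
One way to reproduce that kind of test rig automatically is to script the browser's built-in throttling. A minimal sketch using Puppeteer - the URL and the exact numbers here are placeholders, and this emulates low bandwidth and latency but not packet loss:

```typescript
import puppeteer from 'puppeteer';

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  // Roughly a 4x CPU slowdown, like Chrome DevTools' performance throttling.
  await page.emulateCPUThrottling(4);

  // ~200 kbps up/down with 300 ms latency (throughput is in bytes per second).
  await page.emulateNetworkConditions({
    download: (200 * 1000) / 8,
    upload: (200 * 1000) / 8,
    latency: 300,
  });

  const start = Date.now();
  await page.goto('http://localhost:8080', { waitUntil: 'load' });
  console.log(`Loaded in ${Date.now() - start} ms under throttling`);

  await browser.close();
})();
```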

[-] rustydrd@sh.itjust.works 10 points 5 months ago

I think I already posted this at some point, but Software Disenchantment is always worth mentioning in this context.

[-] mryessir@lemmy.sdf.org 10 points 5 months ago

How goes the saying? 32 MB of RAM and always swapping?

[-] Holzkohlen@feddit.de 10 points 5 months ago

You have my vote.
