submitted 2 days ago by 0x01@lemmy.ml to c/asklemmy@lemmy.ml

Artificial Superintelligence, AI systems that are more intelligent than humans across every domain, may or may not be coming soon.

What could we do to prepare now for a future where it has arrived?

I have been considering:

  • starting local community groups
  • updating my investment strategies to be more resilient to market disruption
  • diversifying personal income streams
  • staying up to date with the latest news and learning to better use the latest tools/technology
  • upgrading personal skills toward harder-to-replace industries

It's a bit difficult to imagine a truly "safe" way of life. Barring UBI and more progressive taxes, it seems like it may be quite challenging for the average person to exist comfortably.

Some industries that are already impacted at the current level of technology:

  • programming
  • UI design
  • creative writing
  • technical writing
  • customer support
  • graphic art
  • data analysis

I think almost every other industry is at risk of significant disruption. A capitalism-based society will always stray toward the cheapest option: "if AI can take customer support calls for $1/day and customer satisfaction doesn't dip, why would I pay a person $150/day?"

[-] Vinny_93@lemmy.world 6 points 1 day ago

Until humanity unites there is no 'we' who can do anything. We are currently too busy polarising, fighting and reinforcing poverty. If AGI ever arrives, it'll most likely be weaponised or used to make rich people richer.

If ASI runs out of control, we can only hope it'll be a nice god and won't immediately decide humans are a disease and try to kill us off.

At this point, it's hubris to assume AGI/ASI wants anything to do with us.

[-] 0x01@lemmy.ml 2 points 1 day ago

More than fair, how much effort do we put in to make sure the ants have a comfortable life? Or even further, the tardigrades? I am on the optimistic side, hoping that a superintelligence holds a benevolent nostalgia/amusement for sentient life if it does indeed come to that.

There's a chance that ASI doesn't happen and we stall indefinitely on simple token-prediction systems, in which case the disruption could be limited to what we've seen already?

[-] Vinny_93@lemmy.world 2 points 1 day ago

I am on the hopeful side. Maybe an ASI could quickly analyze our issues and intervene. But an ASI might spit out all kinds of plans to improve everyone's lives, and if the people in power ignore all of the advice because it would mean they're no longer in power, nothing will really change, except now there's an ASI using huge amounts of electricity.

Considering how everything's going, I honestly don't think an ASI could make things any worse than the current state of affairs.
