
Trust in AI technology and the companies that develop it is dropping, in both the U.S. and around the world, according to new data from Edelman shared first with Axios.

Why it matters: The findings come as regulators around the world are deciding what rules should apply to the fast-growing industry. "Trust is the currency of the AI era, yet, as it stands, our innovation account is dangerously overdrawn," Edelman global technology chair Justin Westcott told Axios in an email. "Companies must move beyond the mere mechanics of AI to address its true cost and value — the 'why' and 'for whom.'"

[-] cmnybo@discuss.tchncs.de 27 points 1 year ago

I have never trusted AI. One of the big problems is that the large language models will straight up lie to you. If you have to take the time to double check everything they tell you, then why bother using the AI in the first place?

If you use AI to generate code, it will often be buggy and sometimes not work at all. There is also the issue of whether it just spat out a piece of copyrighted code that could get you in trouble if you use it in something.
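To make the "buggy but plausible-looking" point concrete, here is a hypothetical sketch of the kind of subtle mistake generated code often contains: a mutable default argument in Python, which looks fine on a quick read but silently shares state across calls.

```python
# Hypothetical example of plausible-looking generated code with a subtle bug:
# the default list is created once and shared by every call.
def append_item(item, items=[]):
    items.append(item)
    return items

first = append_item("a")
second = append_item("b")
# second is ["a", "b"], not ["b"]: the list from the first call
# leaked into the second (first and second are the same object)
```

Code like this passes a casual skim, which is exactly why it needs real review.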

[-] abhibeckert@lemmy.world 8 points 1 year ago* (last edited 1 year ago)

One of the big problems is that the large language models will straight up lie to you.

Um... that's a trait AI shares with humans.

If you have to take the time to double check everything they tell you, then why bother using the AI in the first place?

You have to double check human work too. So since you're going to double check everything anyway, does it really matter whether it was a human or an AI that got it wrong?

If you use AI to generate code, often times it will be buggy

... again, exactly the same as a human. Difference is the LLM writes buggy code really fast.

Assuming you have good testing processes in place, and you'd better have those, AI generated code is perfectly safe. In fact it's a lot easier to find bugs in code that you didn't write yourself.
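A minimal sketch of what "good testing processes" means here: a couple of assert-based checks against a hypothetical `paginate` function, the kind of test that catches a common off-by-one regardless of whether a human or a model wrote the code.

```python
# Hypothetical generated function: zero-indexed pagination over a list.
def paginate(items, page, page_size):
    start = page * page_size
    return items[start:start + page_size]

data = list(range(10))
# Edge cases a quick test suite should pin down:
assert paginate(data, 0, 3) == [0, 1, 2]   # first page
assert paginate(data, 3, 3) == [9]          # short last page
assert paginate(data, 4, 3) == []           # past the end
```

If the generated code had indexed pages from 1, or computed `start + page_size - 1` for the slice end, these checks would fail immediately.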

There is also the issue of whether or not it just spat out a piece of copyrighted code that could get you in trouble

Um - no - that's not how copyright works. You're thinking of patents. But human-written code has the same problem.

[-] TimeSquirrel@kbin.social 2 points 1 year ago* (last edited 1 year ago)

I'm using GitHub Copilot every day just fine. It's great for fleshing out boilerplate and other tedious things where I'd rather spend my time working out the logic instead of the syntax. If you actually know how to program and don't treat it as if it can do everything for you, it's a pretty great time saver. Basically an autocomplete on steroids. It integrates right into my IDE and types out code WITH me at the same time, like someone sitting beside you with a second keyboard.
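The kind of boilerplate the comment above describes might look like this (a hypothetical example, not output from any particular tool): repetitive field declarations and serialization glue that an autocomplete-style assistant fills in quickly while you focus on the logic.

```python
# Tedious-but-mechanical boilerplate: a dataclass and its dict form.
from dataclasses import dataclass, asdict

@dataclass
class User:
    name: str
    email: str
    active: bool = True

u = User("ada", "ada@example.com")
print(asdict(u))  # {'name': 'ada', 'email': 'ada@example.com', 'active': True}
```

Typing this out by hand is pure syntax, no thought required, which is exactly where an "autocomplete on steroids" earns its keep.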

this post was submitted on 07 Mar 2024
492 points (100.0% liked)

Technology
