[-] namnnumbr@lemmy.ml 90 points 1 year ago

It's not just every tech company, it's every company. And it's terrifying - it's like giving people who don't know how to ride a bike a 1000hp motorcycle! The industry doesn't have guardrails in place, and the public's "ChatGPT can do it" mindset, with no thought of checking the output, is horrifying.

[-] szczuroarturo@programming.dev 9 points 1 year ago

Basically the internet.

[-] kbity@kbin.social 68 points 1 year ago

There's even rumours that the next version of Windows is going to inject a bunch of AI buzzword stuff into the operating system. Like, how is that going to make the user experience any more intuitive? Sounds like you're just going to have to fight an overconfident ChatGPT wannabe that thinks it knows what you want to do better than you do, every time you try opening a program or saving a document.

[-] Taleya@aussie.zone 58 points 1 year ago

This is what pisses me off about the whole endeavour. We can't even get a fucking search algo right any more, so why the fuck do I want a machine blithely failing to do what it's told as it stumbles off a cliff?

[-] DragonAce@lemmy.world 42 points 1 year ago

It'll be like if they brought Clippy back, but this time he's even more of an asshole, and now he can fuck up your OS too.

[-] sigh@lemmy.world 13 points 1 year ago

There’s even rumours

Like, I know we all love to hate Microsoft here but can we stop with the random nonsense? That's not what's happening, at all.

[-] whosdadog@sh.itjust.works 9 points 1 year ago

Windows Copilot just popped up on my Windows 11 machine. Its disclaimer said it could provide surprising results. I asked it what kind of surprising results I could expect; it responded that it wasn't comfortable talking about that subject and ended the conversation.

[-] Blackmist@feddit.uk 64 points 1 year ago
[-] h_a_r_u_k_i@programming.dev 14 points 1 year ago

It's sad to see it spit out text from the training set without any actual knowledge of the date and time. It would be more awesome if it could call time.Now(), but that'll be a different story.

[-] Blackmist@feddit.uk 37 points 1 year ago

If you ask it today's date, it actually does that.

It just doesn't have any actual knowledge of what it's saying. I asked it a programming question as well, and each time it would make up a class that doesn't exist, I'd tell it it doesn't exist, and it would go "You are correct, that class was deprecated in {old version}". It wasn't. I checked. It knows what the excuses look like in the training data, and just apes them.

It spouts convincing sounding bullshit and hopes you don't call it out. It's actually surprisingly human in that regard.
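For what it's worth, "it actually does that" for the date usually means the serving layer gives the model a tool to call, rather than the model knowing the time. A minimal sketch of that idea (the registry, dispatcher, and tool name are all made up for illustration; real tool-calling loops let the model decide when to invoke a function):

```python
from datetime import datetime, timezone

# Hypothetical tool registry: the model doesn't "know" the time,
# but the serving layer can expose a function it may call.
TOOLS = {
    "get_current_time": lambda: datetime.now(timezone.utc).isoformat(),
}

def answer(question: str) -> str:
    # Toy dispatcher standing in for a real tool-calling loop:
    # route time questions to the tool instead of letting the
    # model guess from training data.
    if "date" in question.lower() or "time" in question.lower():
        return f"The current UTC time is {TOOLS['get_current_time']()}"
    return "I can only answer from my training data here."

print(answer("What's today's date?"))
```

The point being: the correct date comes from `datetime.now()`, not from anything the model learned.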

[-] tjaden@lemmy.sdf.org 19 points 1 year ago

It spouts convincing sounding bullshit and hopes you don’t call it out. It’s actually surprisingly human in that regard.

Oh great, Silicon Valley's AI is just an overconfident intern!

[-] scarabic@lemmy.world 9 points 1 year ago

It’s super weird that it would attempt to give a time duration at all, and then get it wrong.

[-] dan@upvote.au 11 points 1 year ago

It doesn't know what it's doing. It doesn't understand the concept of the passage of time or of time itself. It just knows that that particular sequence of words fits well together.

[-] Poob@lemmy.ca 58 points 1 year ago

None of it is even AI. Predicting desired text output isn't intelligence.

[-] freeman@lemmy.pub 27 points 1 year ago

At this point I just interpret "AI" to mean "we have lots of select statements and inner joins".

[-] drekly@lemmy.world 21 points 1 year ago

I do agree, but on the other hand...

What does your brain do while reading and writing, if not predict patterns in text that seem correct and relevant based on the data you have seen in the past?
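In the literal, mechanical sense, "predicting the next word" can be sketched in a few lines. This is a toy bigram model, nothing like a real LLM, but it shows what the phrase actually means:

```python
from collections import defaultdict

def train_bigrams(text):
    """Count which word follows which: the crudest next-word predictor."""
    words = text.split()
    model = defaultdict(list)
    for prev, nxt in zip(words, words[1:]):
        model[prev].append(nxt)
    return model

def predict_next(model, word):
    """Pick the continuation seen most often in training."""
    options = model.get(word)
    if not options:
        return None
    return max(set(options), key=options.count)

model = train_bigrams("the cat sat on the mat and the cat slept")
print(predict_next(model, "the"))  # 'cat' follows 'the' most often
```

Whether the brain does something analogous at a vastly larger scale is exactly what the comments below argue about.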

[-] fidodo@lemm.ee 16 points 1 year ago

I've seen this argument so many times and it makes zero sense to me. I don't think by predicting the next word, I think by imagining things both physical and metaphysical, basically running a world simulation in my head. I don't think "I just said predicting, what's the next likely word to come after it". That's not even remotely similar to how I think at all.

[-] Noughmad@programming.dev 11 points 1 year ago

AI is whatever machines can't do yet.

Playing chess was the sign of AI, until a computer beat Kasparov; then it suddenly wasn't AI anymore. Then it was Go, then classifying images, then having a conversation, but whenever each of these was achieved, it stopped being AI and became "machine learning" or "a model".

[-] DrownedRats@lemmy.world 51 points 1 year ago

My cousin got a new TV and I was helping him set it up. During setup, there was an option to enable AI-enhanced audio and visuals. Turning the AI audio on turned the decent, if slightly subpar, audio into an absolute garbage shitshow. It sounded like the audio was being passed through an "underwater" filter and then transmitted over a tin-can-and-string telephone. Idk who decided this feature was ready for consumer products, but it was absolutely moronic.

[-] Whitebrow@lemmy.world 50 points 1 year ago

Coupled with laying off a few thousand employees

[-] MrMamiya@feddit.de 27 points 1 year ago

God it’s exhausting. Okay, I’ll buy a 3d television if that’s what I have to do, let’s bring that back instead. Please?

[-] ICastFist@programming.dev 27 points 1 year ago

Unlike the previous bullshit they threw everywhere (3D screens, NFTs, metaverse), AI bullshit seems very likely to stay, as it is actually proving useful, if with questionable results... Or rather, questionable everything.

[-] WheelcharArtist@lemmy.world 14 points 1 year ago

If only it were AI, and not just LLMs, machine learning, or plain algorithms. But yeah, let's call everything AI from here on. NFTs could be useful if used as proof of ownership instead of as expensive pictures, etc.

[-] peopleproblems@lemmy.world 6 points 1 year ago

The NFT as proof of ownership should really become the standard. Instead of having people "authorizing" yada yada, it's done completely by machine and is traceable.

No middlemen needed. Just I own x, this says I own x. I can sell you x, and you get ownership of x immediately. No "waiting 45 days to close" or "2 day transaction close" or even "title search verification." Too many middlemen benefitting from the current system to allow NFT to replace them though. That's the actual challenge.

Okay, someone gains access to your device and sends themselves the NFT that proves ownership of your house.

What do you do? Do you just accept that since they own the NFT, that means they own the house? Probably not. You'll go through the legal system, because that's still what ultimately decides the ownership. I bet you'll be happy about middle men and "waiting 45 days to close" then.
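The mechanism being debated above is, at its core, an append-only registry mapping token IDs to owners, where the registry itself validates transfers. A toy sketch (no signatures, no consensus, no real blockchain; all names are invented for illustration):

```python
class ToyLedger:
    """Toy stand-in for an NFT ownership registry: each token ID maps
    to exactly one owner, and a transfer is only valid when initiated
    by the current owner. A real chain adds cryptographic signatures
    and consensus; this only illustrates the ownership model."""

    def __init__(self):
        self.owners = {}

    def mint(self, token_id, owner):
        assert token_id not in self.owners, "token already exists"
        self.owners[token_id] = owner

    def transfer(self, token_id, from_owner, to_owner):
        # The ledger, not a middleman, decides whether the transfer is valid.
        if self.owners.get(token_id) != from_owner:
            raise ValueError("sender does not own this token")
        self.owners[token_id] = to_owner

ledger = ToyLedger()
ledger.mint("house-deed-1", "alice")
ledger.transfer("house-deed-1", "alice", "bob")
print(ledger.owners["house-deed-1"])  # bob
```

Note that the code makes the objection above concrete: whoever controls the keys (here, whoever can call `transfer` as the current owner) controls the asset, regardless of what a court would say.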

[-] cloudy1999@sh.itjust.works 27 points 1 year ago

This is refreshing to see. I thought I was the only one who felt this way.

[-] denemdenem@lemmy.world 22 points 1 year ago

If you take out the AI part it still holds true. 2023 is full of bullshit.

[-] ExtraMedicated@lemmy.world 18 points 1 year ago* (last edited 1 year ago)

I'm bookmarking this for the next time my supervisor plugs ChatGPT.

[-] ApathyTree@lemmy.dbzer0.com 8 points 1 year ago

I had a manager tell me some stuff was being scanned by AI for one of my projects.

No, you are having it scanned by a regular program to generate keyword clouds that can be used to pull it up when humans type their stupidly-worded questions into our search. It’s not fucking AI. Stop saying everything that happens on a computer that you don’t understand is fucking AI!

I’m just so over it. But at least they aren’t trying to convince us ChatGPT is useful (it definitely wouldn’t be for what they’d try to use it for).
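A keyword-cloud search like the one described above really is just counting, no AI required. A minimal sketch (the stopword list, sample document, and names are all made up for illustration):

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "is", "to", "of", "and", "how", "do", "i"}

def keywords(text, top=5):
    """Plain old keyword extraction: tokenize, drop stopwords, count."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [w for w, _ in counts.most_common(top)]

def build_index(docs):
    """Map each extracted keyword to the set of documents containing it."""
    index = {}
    for doc_id, text in docs.items():
        for kw in keywords(text):
            index.setdefault(kw, set()).add(doc_id)
    return index

docs = {"reset-guide": "how to reset your password and recover the account"}
index = build_index(docs)
print(index["password"])  # {'reset-guide'}
```

A stupidly-worded question then just needs to hit one indexed keyword to pull the document up. Deterministic, debuggable, and decades old.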

[-] taanegl@lemmy.ml 17 points 1 year ago

It raises the question: what's the boardroom random-bullshit timeline?

When was it "random cloud bullshit, go" and when was it "random blockchain bullshit, go"? What other buzzwords all but guaranteed that Silicon Valley tech bros would toss money in your face, and at what point in time was each one applicable?

[-] Protoflare@lemmy.world 16 points 1 year ago

Snapchat AI. My friends don't want it, they can't block it, and it's proven to lie about certain things, like whether it has your location.

[-] KIM_JONG_JUICEBOX@lemmy.ml 15 points 1 year ago

What companies are you people working for?

We are being asked not to use AI.

[-] fluxion@lemmy.world 17 points 1 year ago

Ain't gotta use it to sell it or slap AI stickers on top of whatever products you're selling

[-] RagingRobot@lemmy.world 7 points 1 year ago

Larger companies have been working fast to sandbox the models used by their employees. Once they're safe from spilling data, they go all in. I'm currently on a platform team enabling generative AI capabilities at my company.

[-] jimmydoreisalefty@lemmus.org 14 points 1 year ago

More Ads and tracking systems, Now With AI!

Commercial...

[-] stark@qlemmy.com 6 points 1 year ago

This makes me think I should stay in IT infrastructure and not move to a developer position.

this post was submitted on 06 Aug 2023
1759 points (100.0% liked)

Programmer Humor

32414 readers

Post funny things about programming here! (Or just rant about your favourite programming language.)


founded 5 years ago