[-] DR_Hero@programming.dev 8 points 2 months ago

There's a much more accurate stat... and it's disgusting

[-] DR_Hero@programming.dev 3 points 2 months ago

I think he was just concerned for their immediate safety. As the post suggests, many thought desperate times were coming and any rando in a MAGA hat might retaliate.

[-] DR_Hero@programming.dev 4 points 2 months ago

AI generation has legitimately gotten that good. The models do text very well now, and photorealism too.

[-] DR_Hero@programming.dev 3 points 2 months ago

At least the same company developed both in that case. As soon as a new open-source AI model was released, Elon just slapped it on wholesale and started charging for it.

[-] DR_Hero@programming.dev 6 points 4 months ago

I'm confused as to what your point is

[-] DR_Hero@programming.dev 3 points 7 months ago

Barring Poe's Law and all

Would love to grab a beer with the owner of that car

[-] DR_Hero@programming.dev 5 points 9 months ago

Excuse me, but the fuck is wrong with you?

[-] DR_Hero@programming.dev 2 points 10 months ago

The federation part was appreciated on my end, at least. This instance is a big part of the Lemmy experience, and it would have been odd for it to suddenly disappear.

[-] DR_Hero@programming.dev 20 points 1 year ago* (last edited 1 year ago)

The worst part is that it took them years after it became a known risk to actually send me a replacement machine.

Having to choose between the risk of heart failure and the risk of cancer sure was fun...

[-] DR_Hero@programming.dev 3 points 1 year ago

Now I'm upset this wasn't the original haha

[-] DR_Hero@programming.dev 9 points 1 year ago

I've definitely experienced this.

I've used ChatGPT to write cover letters based on my resume, among other tasks.

I used to give it data and tell it to "do X with this data". It worked great.
In a separate chat, I told it to "do Y with this data", and it also knocked it out of the park.

Weeks later, excited about the tech, I repeated the process. I told it to "do X with this data". It did fine.

In a completely separate chat, I told it to "do Y with this data"... and instead it gave me X. I told it to "do Z with this data", and it once again would really rather just do X.

For a while now, I've had to feed it far more context and tailored prompts than I used to, something like the sketch below.
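To be concrete, here's a minimal sketch of the kind of extra scaffolding I mean, using the official openai Python client; the model name, the system message, and the prompt wording are all illustrative placeholders, not the exact prompts I used.

```python
# Minimal sketch: spelling out the task, the input, and the expected
# output instead of a bare "do Y with this data". Assumes the official
# `openai` Python client (v1+); model and wording are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

data = "...your data here..."

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        # Pin down the role so the model doesn't drift back to the
        # task it apparently prefers (X).
        {
            "role": "system",
            "content": "You transform data exactly as instructed. "
                       "Never substitute a different task.",
        },
        # Restate the task and the expected output explicitly.
        {
            "role": "user",
            "content": f"Do Y with the data below. Return only the "
                       f"result of Y, nothing else.\n\nData:\n{data}",
        },
    ],
)

print(response.choices[0].message.content)
```

Previously, the one-line version of that prompt was enough.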

[-] DR_Hero@programming.dev 5 points 1 year ago

CisHet here, also with a statistically improbable number of close trans friends.

Growing up, I ate eggs so often they said I would turn into one...

I think I'm safe for now...
