[-] darkevilmac@vlemmy.net 7 points 1 year ago

Yes and no. I've worked on the backend for big apps before. You generally try to keep backward compatibility for as long as possible to give clients time to update. You can't just change API routes and expect all the clients to be on the latest version overnight.
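A minimal sketch of what that looks like in practice (hypothetical route and field names, no particular framework assumed): the old route stays registered and adapts the new handler's response, so unupdated clients keep working while they migrate.

```python
def get_user_v2(user_id):
    # New handler: returns the current response shape.
    return {"id": user_id, "name": "example"}

def get_user_v1(user_id):
    # Deprecated handler: wraps the new handler and translates its
    # response back into the old shape that existing clients expect.
    data = get_user_v2(user_id)
    return {"user_id": data["id"], "user_name": data["name"]}

# Both routes stay registered until v1 traffic drops off;
# only then is the old route safe to remove.
ROUTES = {
    "/api/v1/user": get_user_v1,
    "/api/v2/user": get_user_v2,
}
```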

[-] darkevilmac@vlemmy.net 12 points 1 year ago

Excel, is that you?

[-] darkevilmac@vlemmy.net 39 points 1 year ago

Nice try, dentist

[-] darkevilmac@vlemmy.net 2 points 1 year ago

Rendering with JS definitely makes a difference; it's part of the reason SSR is such a big deal for SEO.
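A toy illustration of the difference (hypothetical page content, not any real framework): a crawler that doesn't execute JavaScript sees an empty shell from a client-rendered page, but sees the actual content in a server-rendered one.

```python
def client_rendered_page():
    # Client-side rendering: the initial HTML is an empty shell;
    # the content only appears after the browser runs app.js.
    return '<div id="app"></div><script src="app.js"></script>'

def server_rendered_page(title):
    # Server-side rendering: the content is already in the HTML
    # the server sends, so any crawler can index it.
    return f'<div id="app"><h1>{title}</h1></div>'
```

A crawler that skips JS execution indexes only what's in the initial HTML, which is why the SSR version wins for SEO.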

[-] darkevilmac@vlemmy.net 31 points 1 year ago* (last edited 1 year ago)

It's all well and good to have a revolution, but if nobody knows you're having one then nothing really changes. There are still benefits to centralised services, one of which is scale. Effectively indexing that much data takes serious infrastructure, which is why smaller search engines tend to be just white labels of things like Bing.

[-] darkevilmac@vlemmy.net 3 points 1 year ago

Maybe, though I'm a bit more optimistic. I think even if they just built something like a read-only service that pulls from federated sources, the way their web crawlers already do for regular sites, they'd basically be done.

The only concern there would be people trying to block them like everyone has been doing to Meta.

[-] darkevilmac@vlemmy.net 77 points 1 year ago

I feel like Google is going to have to find a way to effectively index federated content at some point. The only way to really get human-written information is from sites like Reddit and Twitter, and both of those platforms seem dedicated to completely imploding at the moment.

[-] darkevilmac@vlemmy.net 14 points 1 year ago

Don't forget poetry!
