'AI isn't reliable, has a ton of bias, confidently tells lies, can't count or do basic math, just parrots whatever it's fed from the internet, wastes a lot of energy and resources, and is fucking up the planet...' When I see these criticisms of AI, I wonder if it's the critics' first day on the planet and they haven't met humans yet.
... You are deliberately missing the point.
When I ask a question, I don't want to hear what most people think but what people who are knowledgeable about the subject of my question think, and LLMs fail at that by design.
LLMs don't just waste a lot, they waste at a ridiculous scale. According to Statista (2024), training GPT-3 was responsible for 500 tCO2. All for what? An automatic plagiarism-and-bias machine? And before the litany of "it's just the training cost, after that it's ecologically cheap": tell me how your LLM will remain relevant if it's not constantly retrained with new data.
LLMs don't bring any value: if I want information, I already have search engines (even if LLMs have degraded the quality of the results); if I want art, I can pay someone to draw it; etc.
500 tons of CO2 is... surprisingly little? Like, rounding error little.
I mean, one human exhales ~400 kg of CO2 per year (according to this). Training GPT-3 produced as much CO2 as 1250 people breathing for a year.
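For what it's worth, the two figures quoted above do line up: a quick back-of-the-envelope check, using only the thread's own numbers (500 tCO2 for training, ~400 kg CO2 exhaled per person per year), gives exactly the 1250 person-years claimed.

```python
# Sanity check on the comparison made in the thread.
# Both inputs are the figures quoted above, not independent data.
TRAINING_EMISSIONS_KG = 500 * 1000   # 500 tCO2 for GPT-3 training, in kg
BREATHING_KG_PER_YEAR = 400          # ~400 kg CO2 exhaled per person per year

person_years = TRAINING_EMISSIONS_KG / BREATHING_KG_PER_YEAR
print(person_years)  # 1250.0
```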
That only seems so little because it doesn't account for data-center construction costs, hardware production costs, etc. One model costing as much as 1250 people breathing for a year is enormous to me.