

Or when a truck is hauling traffic lights.
Yes, as is already happening with police crime-prediction AI. In goes data saying there is more violence in black areas, so they have a reason to police those areas more; tension rises and more violence happens. In the end it's an advanced excuse to harass the people there.
He’s afraid of losing his little empire.
OpenAI also had no clue how to recreate the happy little accident that gave them ChatGPT. That's mostly because their whole approach was taking a simple model and brute-forcing it with more data, more power, more nodes, and then even more data and power until it produced results.
As expected, this isn't sustainable. It's beyond the point of diminishing returns. But Sam here has no idea how to fix that with better models, so he goes back to the one thing he knows: more data needed, just one more terabyte bro, ignore the copyright!
And now he's blaming the Chinese for forcing him to use even more data.
That's Stellantis. It's the same company that's building cars that now show ads when the car stops at a red light.
We used to call those the AI winters. Barely any progress for years until someone has a great idea, and suddenly there's a new form of AI and a new hype cycle, again ending in an AI winter.
In a few years, somebody will find a way that leaves LLMs in the dust but comes with its own set of limitations.
They are, in the end, BS-generation machines trained so heavily that they accidentally happen to be right often enough.
Fedi could learn a lot from them.
I left Facebook because of the lack of control, so I'd say Fedi learned.
Line in the sand? Going after political opponents. Censoring information. Dismantling media. Abandoning rule of law. Business and government mixing too much.
The USA is speedrunning these.