

It’s rough finding out that you’re not “one of the good ones”.
I don’t read my replies
Have they tried budgeting or making coffee at home?
The contracts that steal music from artists haven’t changed one iota. Unless you’ve got juice like Paul McCartney, Beyoncé, or Taylor Swift, and even then it can be a fight that takes years.
A long time ago, you could go to a special store and trade government paper for music discs and tapes that you got to keep forever.
You don’t really hear about serial killers anymore.
CMV:
99% of self-identifying stoics are just dudes who watched a video on Marcus Aurelius and took away that holding in your feelings is a virtue, actually.
Don’t forget that there is a BIOS setting that lets VMs run much faster: hardware virtualization (Intel calls it VT-x, AMD calls it AMD-V, and the menu label varies across manufacturers). Search for the setting in your BIOS/UEFI setup. It’s usually located in a menu under “advanced CPU settings” or similar.
It’s often switched off by default.
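If you’re on Linux, a quick sketch of how to check whether the CPU actually exposes the feature (this only reports what the processor advertises; the exact BIOS menu path still varies by vendor):

```shell
# Quick Linux check: does the CPU expose hardware virtualization?
# 'vmx' = Intel VT-x, 'svm' = AMD-V. Zero matches usually means the
# feature is unsupported or switched off in the BIOS/UEFI setup.
if grep -Eq 'vmx|svm' /proc/cpuinfo; then
  echo "hardware virtualization: enabled"
else
  echo "hardware virtualization: disabled or unsupported"
fi
```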
I fucking love this because it leaves everybody with one of two conclusions. One: AI isn’t capable of doing the simplest of jobs. Or two: working a drive-thru is actually quite complex and difficult, and the humans who master it are more capable than trillion-dollar software.
Wow, AI researchers are not only adopting philosophy jargon, but they’re starting to cover some familiar territory: namely, the difference between signifier (language) and signified (reality).
The problem is that spoken language is vague, colloquial, and subjective. Therefore, spoken language can never produce something specific, universal, or objective.
“When we detect users who are planning to harm others, we route their conversations to specialized pipelines where they are reviewed by a small team trained on our usage policies and who are authorized to take action, including banning accounts,” the blog post notes. “If human reviewers determine that a case involves an imminent threat of serious physical harm to others, we may refer it to law enforcement.”
See? Even the people who make AI don’t trust it with important decisions. And the “trained” humans don’t even see it if the AI doesn’t flag it first. This is just a microcosm of why AI is always the weakest link in any workflow.
This is exactly the use-case for an LLM and even OpenAI can’t make it work.
Hitting the saloon with Lenny in Red Dead Redemption 2 means consuming a dangerous amount of questionable liquors.
The idea of personal action vs. corporate/government action is a false choice. The government can force the corpos to stop burning the planet, but that will mean significant lifestyle changes for everybody.
It also means getting our shit together about immigration/migration/refugees. And not just in the US, but globally. A humanitarian catastrophe is assured otherwise.
I’m not optimistic.
This is a core component of a corrupt system. Everyone needs dirt on everyone else to feel secure that they can screw you if you ever try to run against the grain.
Just look back at any of the police who get criminally charged. One of the charges against the cop will likely be “theft of state funds” or similar language. This means the cop stole overtime, which is so common as to be universal.
Another common example is tax-evasion under a dictatorship. Everyone in the ruling class does it, but run afoul of the regime, and off to jail you go.
The cheapest method is to abandon the body. People die without family all the time, and the State has a method to dispose of unclaimed corpses. Cost: $0.
My technology skill makes me satisfied that your scale starts at zero, but annoyed that it didn’t end at nine.
The American Psychological Association met with the FTC in February to urge regulators to address the use of AI chatbots as unlicensed therapists.
Protect our revenue, er, patients!
It’s stupid not to be bigoted in a world where other people are.
An argument only someone invested in bigotry would make.
“…found no evidence to date that Microsoft’s Azure and AI technologies have been used to target or harm people in the conflict in Gaza.”
-Microsoft, in May
Dear Microsoft: if you had to look for evidence, that implies your software could totally be used to harm people; it just wasn’t in this case. As far as you know.
It’s interesting to see the site treat its unpaid workers more and more like low-level employees. I guess capitalists just can’t help themselves.