Chthonic

@Chthonic@slrpnk.net
0 Posts – 21 Comments
Joined 12 months ago

It's not actually about listing the fees. They're worried that if they have to list the fees, customers will realize they're paying $19.99 a month to rent a router, or are being charged for a landline they didn't ask for.

When I was at Costco, for Member Service Week they literally gave us a rock, like from the gravel outside the office, with the note: "You rock!"

That's misophobia; misophonia is when you don't like how soy paste sounds.

I understand you guys are frustrated by Republican hypocrisy, but it's literally designed into conservatism. It's a selling point.

Wilhoit's Law:

Conservatism consists of exactly one proposition, to wit: There must be in-groups whom the law protects but does not bind, alongside out-groups whom the law binds but does not protect.

My understanding is that the SEC would have fucked him if he just shut it down, because it would indicate that he never intended to buy it in the first place and instead was just trying to manipulate the stock market (which is definitely what he was doing).

It's not. He never wanted to buy Twitter; he just wanted to pump and dump the stock. But because he's stupid and the plan was obvious, they sued him to make him honor the deal.

So if he just turned around and shut the company down, it would give the SEC legal grounds to argue that his intention all along was market manipulation.

How long will we continue to get news stories whenever a minor entity leaves X (formerly Twitter)?

ill give u a bone squirtle

What's fucked up is that if you die here you die for real

When I used to do copywriting for junk SEO, I began to suspect that my editor didn't actually read anything I wrote and just passed it through a content uniqueness filter, so I started putting random references to H.P. Lovecraft stories in the articles I got assigned.

They all got published, no questions asked. For a while, if you searched "Homeopathy and the Esoteric Cult of Dagon," my content was the only result.

Don't worry I'll have your share, get fucked Mitch.

I know McConnell is just a lightning rod for hate for the GOP and as soon as he's gone some other amoral, sociopathic mercenary will take his place, but damn if he's not just the worst.

I work on chatbots for a big tech company. Every team is trying to use GenAI for everything. 90% of the stuff they try won't work. I have to explain that LLMs can't actually think at least three times a week. The hype train was too strong. Even calling it AI feels misleading.

That said, there are some genuinely great applications for LLMs that I've enjoyed looking into.

Brilliant, very meta, love it

It's not that wild. Is there anything more Republican than voting against your own best interests?

I like the lighting and composition but it looks a little fried, how hard did you sharpen?

Do you honestly believe that if Trump regains power they're going to nail him on state charges? We'll be lucky to ever have elections again, let alone see him face consequences for his crimes. If he wins, it's gonna be full-blown fascism.

That may be true for warehouse employees, but the corporate offices are a toxic mess of shitty culture and dated ideas. I've never seen a tech department bleed so much underpaid talent to Amazon.

When I quit because they tried to force me back into the office mid-pandemic (August 2020) I had multiple offers for fully remote positions with twice the salary within a few weeks.

But yeah, if you are a cashier at a warehouse or whatever I hear it's a solid gig.

If he were smarter and/or not a walking ego then yeah, that would have been the move. Though if he were smart he probably wouldn't be in this mess.

They don't reason; they're stochastic parrots. Their internal mechanisms are well understood, so no idea where you got the notion that the folks building these don't know how they work. It can be hard to predict or explain how an LLM will respond to a given prompt because of the huge training corpus and the statistical nature of neural nets in general.

LLMs work the same as any other net, just with massive sample sets. They have no reasoning capabilities of any kind. We are naturally inclined to ascribe humanlike thought processes to them because they produce human-sounding outputs.
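The "stochastic parrot" point can be illustrated with a toy sketch (my own made-up example, nothing like a production LLM): pick the next token purely from observed frequencies in a tiny corpus, with no model of meaning at all. Scale the corpus and the statistics up by many orders of magnitude and you get human-sounding output, still with zero reasoning underneath.

```python
import random
from collections import defaultdict

# Toy illustration only: count which token follows which in a tiny corpus,
# then generate text by sampling from those frequencies. No reasoning, just stats.
corpus = "the cat sat on the mat the cat ate the fish".split()

counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_token(prev):
    # Sample a successor proportionally to how often it was observed.
    options = counts[prev]
    if not options:
        return None  # dead end in the toy corpus
    tokens = list(options)
    weights = [options[t] for t in tokens]
    return random.choices(tokens, weights=weights)[0]

# Generate a short "plausible" continuation, one token at a time.
token = "the"
output = [token]
for _ in range(5):
    token = next_token(token)
    if token is None:
        break
    output.append(token)
print(" ".join(output))
```

The output reads vaguely like English ("the cat sat on the mat...") for the same reason LLM output reads like thought: it reproduces the statistics of its training data.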

If you would like the perspective of real scientists instead of a "tech-bro" like me, I would recommend Emily Bender and Timnit Gebru: experts without a vested interest in the massively overblown hype about what LLMs are actually capable of.

Hah, this was about 10 years ago - I doubt anything I wrote is still around.

How has Kenshi not been mentioned?
