xor

@xor@lemm.ee
0 Posts – 13 Comments
Joined 11 months ago

a great prank for computer labs... just rotate everything by 0.5 degrees...
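
a minimal sketch of one way that could actually work, assuming an X11 Linux lab machine where `xrandr --transform` is available; the output name "HDMI-1" and the filename are placeholders, not anything from the original comment:

```python
# rotate_prank.py -- a minimal sketch, assuming an X11 desktop where
# `xrandr --transform` is available; the output name "HDMI-1" is a placeholder.
import math
import subprocess

def rotate_display(output: str, degrees: float) -> None:
    """Apply a small rotation to an X11 output via xrandr --transform."""
    theta = math.radians(degrees)
    c, s = math.cos(theta), math.sin(theta)
    # 3x3 homogeneous transform, row-major: rotate by `degrees` around the origin.
    matrix = [c, -s, 0, s, c, 0, 0, 0, 1]
    subprocess.run(
        ["xrandr", "--output", output, "--transform",
         ",".join(f"{v:.6f}" for v in matrix)],
        check=True,
    )

def reset_display(output: str) -> None:
    """Undo the prank."""
    subprocess.run(["xrandr", "--output", output, "--transform", "none"], check=True)

if __name__ == "__main__":
    rotate_display("HDMI-1", 0.5)  # barely noticeable, maximally unsettling
```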

my mnemonics are:
e.g. = egxample, i.e. = in eother words

law of large numbers: it's probably fairly representative
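
a quick sketch of what "probably fairly representative" means in practice: as the sample grows, the sample mean tends to land closer and closer to the true mean (the values and distribution here are just an illustration):

```python
# law_of_large_numbers.py -- the bigger the sample, the closer the
# sample mean tends to sit to the true mean.
import random

random.seed(42)
true_mean = 0.5  # mean of Uniform(0, 1)

for n in (10, 100, 10_000, 1_000_000):
    sample_mean = sum(random.random() for _ in range(n)) / n
    print(f"n={n:>9}  sample mean={sample_mean:.4f}  "
          f"error={abs(sample_mean - true_mean):.4f}")
```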

well the chlorine in tap water is pretty bad for plants...

first they made terrorism the biggest problem in the world, declared war on it, made tons of draconian laws against it... and then they changed what the word means.
btw, check out the Animal Enterprise Terrorism Act, which makes it "terrorism" to work at a job with livestock and then secretly film the abuse other people do to them.
in some states, that is "legally" terrorism.
(it's illegal, but good luck appealing that when you're in a Guantanamo-style black site somewhere, being force-fed rectally)

i thought that too, but it appears to be a real news site...
i still find it hard to believe that reality has come this close to being a made-up shitpost...
but maybe the Fortnite-playing convicted pedophile actually called the police because his wife's boyfriend might beat him up...
i'm pretty sure this means the universe will be collapsing in on itself pretty soon...

you mean !collapse@toomanysymbols.confusingmarkup/killsusability&

"no, you're thicc"... then explain to her it's the most popular body type by today's beauty standards... which fluctuates and health and happiness are all that matters... and youd love her if she was a brain in a jar, but she happens to be really hot... something like that
also, get an Australian Cattle Dog... that'll force ya'll to be more active...

no, it's transparency about moderation... :

under AB 587, a “social media company” that meets the revenue threshold must provide to the California AG: a copy of their current terms of service, and semiannual reports on content moderation.

The semiannual reports must include: (i) how the terms of service define certain categories of content (e.g., hate speech, extremism, disinformation, harassment and foreign political interference); (ii) how automated content moderation is enforced; (iii) how the company responds to reports of violations of the terms of service; and (iv) how the company responds to content or persons violating the terms of service.

The reports must also provide detailed breakdowns of flagged content, including: the number of flagged items; the types of flagged content; the number of times flagged content was shared and viewed; whether action was taken by the social media company (such as removal, demonetization or deprioritization); and how the company responded.

you press the enter key...

you have no idea, the depths of my laziness... the vast wastelands of me not trying hard...

amazing! what other topics do you not want to watch a video about?

tell me more about how you don't want to watch a video
