Even_Adder

@Even_Adder@lemmy.dbzer0.com
9 Posts – 504 Comments
Joined 1 year ago

So Kadokawa had this failing mobile game named Kemono Friends, but they wanted a TV anime for it, so they got Omoto Tatsuki to direct it. He took their product and actually turned it into something people wanted to watch, going as far as releasing supplemental clips from the show on Twitter and even arranging a collaboration with the Tobu Zoo in Saitama, Japan. That collab gave us the tragic saga of Grape-kun the Humboldt penguin, a meme in its own right. Since the show didn't have much of a budget, he hired no-name voice talent, launching their careers thanks to the popularity of the show.

Ultimately, it was the clips on Twitter that did him in, if rumors are to be believed. He was sharing clips as usual, but Kadokawa wanted that content for the home video special features, so they fired him abruptly. He'd already saved Kemono Friends for them, so they cut him loose.

I'm interested to see who else they screwed over, considering they also got caught bribing the Tokyo 2020 Organizing Committee to secure a sponsorship deal.

I'm not a bot.

My first time seeing someone go to bat for Kadokawa, considering how they treat their employees.

Kadokawa group sucks.

Music copyright is such a shitshow. It doesn't surprise me that they would try this.

Edit: I just heard the generated songs that are part of the lawsuit. They're pretty fucked if this is true.

Here's one. Did they overfit their model and think they could block the bad prompts?

You're not going to develop AI for the benefit of humanity at Microsoft. If they go there, we'll know "Open"AI's mission was all a lie.

Reminder that this is made by Ben Zhao, the University of Chicago professor who stole open source code for his last data poisoning scheme.

I always thought it was a man at the gallows, not someone killing themselves.

Tenuous at best.

Yeah, but now we get to be robbed of a feature, like we were when Namco patented playing mini-games on loading screens. We wouldn't have had to suffer as much during the worst years of loading screens.

Is this Claude? What a square.

Copilot chose violence.

It always gets me that they made Obama white in this.

I hope this helps tank that IPO.

You shouldn't put too much stock in these detection tools. Not only do they not work, they flag non-native English speakers for cheating more often than native speakers.

So long as they only hit other Cybertruck owners.

He took GPLv3 code. GPLv3 is a copyleft license that requires you to share your source code and license your project under the same terms as the code you used; you also can't distribute your project as binary-only or proprietary software. When pressed, he only released the code for his front end, remaining in violation of GPLv3.

Can someone explain the kunai intersection to me?

God, I wish AMD and Intel could get their shit together.

Quoting the U.S. Copyright Office's own guidance:

In other cases, however, a work containing AI-generated material will also contain sufficient human authorship to support a copyright claim. For example, a human may select or arrange AI-generated material in a sufficiently creative way that “the resulting work as a whole constitutes an original work of authorship.” Or an artist may modify material originally generated by AI technology to such a degree that the modifications meet the standard for copyright protection.

Don't go nuts.

Reminder that this is made by Ben Zhao, the University of Chicago professor who stole open source code.

Ben Zhao, the University of Chicago professor behind this, stole GPLv3 code for his last data poisoning scheme. GPLv3 is a copyleft license that requires you to share your source code and license your project under the same terms as the code you used; you also can't distribute your project as binary-only or proprietary software. When pressed, he only released the code for his front end, remaining in violation of the terms of the GPLv3 license.

Nightshade also only works against open source models, because the only models with open weights are Stable Diffusion's; companies like Midjourney and OpenAI, with their closed source models, aren't affected by this. Attacking a tool that the public can inspect, collaborate on, and get free of cost isn't something that should be celebrated.

It's true.

There's at least this guy. Check the rest of the videos, this probably wasn't a joke.

Can I have my cum back?

It's not fair that the developers take the heat for this. We should learn to find the right people to complain to.

This is just plain ol' Photoshop. The same slab of meat is duplicated in both hands at different sizes.

Infiltrating databases is too fancy a description for what was actually happening. Someone was just scraping the gallery website a little too fast, effectively causing a mild DoS.
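For anyone curious what "scraping too fast" versus not looks like, here's a minimal sketch of client-side rate limiting; the `fetch` callback, URLs, and one-second interval are all placeholders, not anything from the actual incident:

```python
import time

def rate_limited(urls, fetch, min_interval=1.0, sleep=time.sleep, clock=time.monotonic):
    """Call fetch(url) for each URL, spacing requests at least min_interval seconds apart."""
    results = []
    last = None
    for url in urls:
        now = clock()
        if last is not None and now - last < min_interval:
            sleep(min_interval - (now - last))  # wait out the rest of the interval
        last = clock()
        results.append(fetch(url))
    return results
```

A delay like this (or honoring the site's robots.txt crawl-delay) keeps a scraper from looking like an attack.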

Nvidia has always been a tech company that also happens to make consumer graphics cards.

It likely doesn't break the law. You should check out this article by Kit Walsh, a senior staff attorney at the EFF, and this one by Katherine Klosek, the director of information policy and federal relations at the Association of Research Libraries.

Headlines like these lead people to assume that it's illegal, rather than educate them about their rights.

Reminder that this is made by Ben Zhao, the University of Chicago professor who stole open source code for his last data poisoning scheme.

You're too late.

He would join Microsoft. So much for making AI to benefit humanity.

I've only heard that running images through a VAE just once seems to break the Nightshade effect, but no one's really published anything yet.

You can finetune models on known-bad and incoherent images to help them output better images when the trained embedding is used in the negative prompt. So there's a chance that producing a lot of purposefully bad data could actually make models better, by helping them recognize bad output and avoid it.

That bot isn't what we would call AI these days.

The model should be capable of much better than this, but they spent a long time censoring it before release, and this is what we got. It straight up forgot most human anatomy.

I recommend reading this article by Kit Walsh, a senior staff attorney at the EFF, if you haven't already. The EFF is a digital rights group that most recently won a historic case: border guards now need a warrant to search your phone.

Also remember that AI training isn't only for mega-corporations. We can already train open source models; we shouldn't put up barriers that only benefit the ultra-wealthy. If we weaken fair use, we hand corporations a monopoly on a public technology by making it prohibitively expensive for regular people to keep up. Mega-corporations already own datasets, and have the money to buy more. And that's before they make users sign predatory ToS granting them exclusive access to user data, effectively selling our own data back to us. Regular people, who could have had access to a competitive, corporate-independent tool for creativity, education, entertainment, and social mobility, would instead be left worse off and with fewer rights than where they started.

God I hate SONY.

You are allowed to use copyrighted content for training. I recommend reading this article by Kit Walsh, a senior staff attorney at the EFF, if you haven't already. The EFF is a digital rights group that most recently won a historic case: border guards now need a warrant to search your phone.

This meme has already gotten too esoteric for me. Can I bother you for an explanation or a link to one?
