University of Chicago researchers release Nightshade to the public, a tool intended to "poison" pictures in order to ruin generative models trained on them.
☆ Yσɠƚԋσʂ ☆@lemmy.ml to Technology@lemmy.ml – 37 points – 9 months ago
nightshade.cs.uchicago.edu
Reminder that this is made by Ben Zhao, the University of Chicago professor who stole open source code for his last data poisoning scheme.
Thank you for the background.
If this technology is so great, why does the site not show any before/after examples, let alone demonstrate that it does what he claims?
Because he can't.