Stealing everything you’ve ever typed or viewed on your own Windows PC is now possible with two lines of code — inside the Copilot+ Recall disaster.
doublepulsar.com
Q. Is this really as harmful as you think?
A. Go to your parents' house, your grandparents' house, etc., look at their Windows PC, look at the software installed over the past year, and try to use the device. Run some antivirus scans. There's no way this implementation doesn't end in tears: there's a reason there's a trillion-dollar security industry, and most of its problems revolve around malware and endpoints.
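The headline's "two lines of code" claim rests on Recall storing its data in a plain, per-user SQLite database. As a minimal sketch of why that matters, here's how little code it takes to dump every string out of any readable SQLite file; the function name is mine, and Recall's actual path and schema are deliberately not assumed here:

```python
import sqlite3


def dump_all_text(db_path: str) -> list[str]:
    """Pull every text value out of every table in an SQLite file.

    Illustrative only: this doesn't assume Recall's real schema, it
    just shows that once a readable SQLite file exists on disk, a
    generic dump is trivial.
    """
    rows_out: list[str] = []
    con = sqlite3.connect(db_path)
    try:
        # Enumerate all user tables from the schema catalog.
        tables = [r[0] for r in con.execute(
            "SELECT name FROM sqlite_master WHERE type='table'")]
        for table in tables:
            # Collect every string column from every row.
            for row in con.execute(f'SELECT * FROM "{table}"'):
                rows_out.extend(v for v in row if isinstance(v, str))
    finally:
        con.close()
    return rows_out
```

The point isn't this particular snippet; it's that no exploit is needed when the data is a user-readable file.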
Nah… just… just nah. This will never fly in enterprise environments.
Not just enterprise. Some organizations handle extremely sensitive information about victims of crime, survivors of war, and potential political targets, just to name a few. A feature that screenshots and records all of that data is a nonstarter. MS will have to prove the feature doesn't run for certain government clients; the privacy risk is far too high.
On the other end of the spectrum, the vast majority of home users have no idea how to disable this or that it's even activated. There will be folders of Recall shit filling up everywhere, waiting for someone who knows it's there to access it.
If any of them access their work data on the Microsoft 365 web apps, it's now sitting in that folder, and they will not know.
This is honestly the strongest evidence yet that we need some sort of regulation saying certain privacy-related features must never be enabled by default. They should always be opt-in, period.
Enterprise will love it because it will allow them timestamped access to everything their employees are doing during the day.
They will have it set up to alert on various things...
"So, Bob, you were playing Minesweeper from 9:45 to 9:53, was that a scheduled break for you?"
"Jane, your screen showed no substantive changes from 1:03 to 4:15, you weren't in a meeting, what were you doing?"
The surveillance would be a double-edged sword: if they were hacked, all the sensitive information passing through those PCs could be compromised.
They will convince themselves it can't be compromised. Never underestimate the stupidity of middle management.
And nobody managed to stop the White Star Line executives by saying, "Maybe you shouldn't be 100% sure the Titanic is unsinkable."
It won't.
All the crap from MS only affects ignorant home users. (I say that with no criticism - home users often lack significant expertise in this stuff).
Corporate has an IT team dedicated to image building, based on requirements gathering, all well documented and well tested before it's deployed to even a small test group (usually we fellow IT geeks get to be the guinea pigs first).
Once it's been certified, then they'll deploy to a second, larger group, test and verify.
Wash, rinse, repeat.
Plus they'll probably start with new hires and anyone with a machine that is falling off lease/aging out. This gives them a little room, in that new hires don't have any local data (no one should have much in the first place), and people with aging machines can hold onto the old machine for a couple weeks as a fallback, just in case.
I've seen it several times, been part of deployment and upgrade teams.
Additionally, they deploy policies to redirect MS network services to their own internally hosted services. Windows is designed to do this; there are specific policies for everything, such as the Windows Update services and even the MS App Store, because no company wants machines pulling random crap from outside the company (they probably even block that access at the network level; I would).
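As a concrete example of that kind of policy, the long-standing WSUS settings redirect Windows Update to an internally hosted server. A sketch as a .reg fragment, using the documented policy keys; the server URL is a placeholder:

```reg
Windows Registry Editor Version 5.00

; Point Windows Update at an internal WSUS server (URL is a placeholder)
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate]
"WUServer"="http://wsus.corp.example:8530"
"WUStatusServer"="http://wsus.corp.example:8530"

; Tell the Automatic Updates client to use the server above
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU]
"UseWUServer"=dword:00000001
```

In practice these would be pushed via Group Policy rather than raw registry imports, but the underlying values are the same.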
Everything you're describing is how it should be done. Realistically, it isn't always done properly, and that's why breaches happen.
Just like telemetry, this can be disabled on enterprise versions of the OS.
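For illustration, a .reg sketch of the relevant policy values. The telemetry key is the documented `AllowTelemetry` policy (the 0 "Security" level is only honored on Enterprise/Education SKUs); the Recall key reflects the `DisableAIDataAnalysis` policy described in early documentation, and its name or location may change before release:

```reg
Windows Registry Editor Version 5.00

; Diagnostic data: 0 = "Security", honored only on Enterprise/Education
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\DataCollection]
"AllowTelemetry"=dword:00000000

; Recall: reportedly disables saving of snapshots (subject to change)
[HKEY_CURRENT_USER\Software\Policies\Microsoft\Windows\WindowsAI]
"DisableAIDataAnalysis"=dword:00000001
```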
This will fly with corporations that want to use it against their own employees.