- cross-posted to:
- worldnews@lemmit.online
- technology@lemmit.online
University vending machine error reveals use of secret facial recognition | A malfunctioning vending machine at a Canadian university has inadvertently revealed that a number of them have been usin…::Snack dispenser at University of Waterloo shows facial recognition message on screen despite no prior indication
This is a pretty “generous” take. I ask you then: if the company isn’t communicating any of the scans/recordings anywhere, what is the purpose of installing the technology in the first place?
Cameras are one thing.
But if you can actually process it, that’s a meaningful cost per unit. The only reason you do that is if you’re planning to use it.
This type of analysis is cheap nowadays. You could easily run a model that extracts demographics from an image on a Jetson Nano (basically a Raspberry Pi with a GPU), something like the sketch below. Models have gotten more efficient while hardware has also gotten cheaper.
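To give a rough sense of scale, here is a minimal sketch of on-device demographic estimation with OpenCV. The model file names (`age_deploy.prototxt`, `age_net.caffemodel`) are placeholders for whatever small pre-trained age classifier you download separately, and `AGE_BUCKETS` would have to match that model's output classes; the Haar cascade face detector ships with OpenCV itself. This is purely illustrative, not how the vending-machine vendor actually does it.

```python
# Minimal sketch of on-device demographic estimation.
# Assumptions: OpenCV (cv2) is installed, and a small age classifier exported
# for cv2.dnn has been downloaded separately -- the model file names below are
# placeholders, and AGE_BUCKETS must match that model's output classes.
import cv2

# Haar cascade face detector ships with OpenCV; crude but very cheap to run.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

# Placeholder paths to a pre-trained age model in Caffe format.
age_net = cv2.dnn.readNetFromCaffe("age_deploy.prototxt", "age_net.caffemodel")
AGE_BUCKETS = ["0-2", "4-6", "8-12", "15-20", "25-32", "38-43", "48-53", "60+"]

def estimate_ages(frame):
    """Return a rough age bucket for each face found in one BGR frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    results = []
    for (x, y, w, h) in faces:
        face = frame[y:y + h, x:x + w]
        # Resize/normalize the face crop into the network's expected input blob.
        blob = cv2.dnn.blobFromImage(face, 1.0, (227, 227),
                                     (78.43, 87.77, 114.90), swapRB=False)
        age_net.setInput(blob)
        preds = age_net.forward()          # one probability per age bucket
        results.append(AGE_BUCKETS[preds[0].argmax()])
    return results

if __name__ == "__main__":
    frame = cv2.imread("snapshot.jpg")     # e.g. one frame from the camera
    print(estimate_ages(frame))
```

Nothing in there needs a data center; a Jetson-class board can handle this kind of per-frame inference locally.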
MSRP is $100. Even assuming you can cut that to $50 in bulk, $50 per unit is something that manufacturers are going to take seriously as an added cost. They’re not going to pay it without an intent to use it.
And that’s before software costs. Even leveraging open source, it’s still going to take investment to tailor it to your deployment.
I doubt they would implement this on every vending machine. They can still derive useful analytics from a smaller sample size.
That’s using it.
The only possible reason to have the hardware is because you intend to use it.
Marketing is often targeted, especially online (which is a huge privacy issue). I would guess they are using the data from these vending machines to measure the success of their marketing campaigns.
Like I said: generous. You’re “guessing” that what they’re doing with it is above board. I’m not that trusting of corporations.
People trusted Boeing to put planes together with the utmost concern for safety… Then a fucking door fell off mid-flight.
The FAA failed to regulate Boeing. I’m pro-regulation and pro laws that protect people’s privacy. And if this company and the individuals within it break the law, they should receive appropriate punishments, with fines tied to international revenue.
My point is that the laws should protect privacy independent of the technology. The “ban facial recognition” narrative misses the point and doesn’t address the actual threats. Facial recognition can be used in ways that don’t threaten individuals’ privacy, and non-facial-recognition technologies can still threaten it.
It’s cynical to assume this company is violating privacy with no evidence. But it’s fair to say there need to be stronger punishments and regulations.