It’s already too late to stop the ubiquitous tracking and monitoring of the public through biometrics, says Peter Waggett, Programme Leader at IBM’s Emerging Technology Group. We need to stop worrying about prevention, and start working out how to make the most of data garnered from that kind of surveillance.
“We’re fighting the wrong battle when we ask should we stop people being observed. That is not going to be feasible. We need to understand how to use that data better,” urged Waggett, who was speaking as part of a Nesta panel debate on what biometrics mean for the future of privacy.
“I’ve been working in biometrics for 20 years, and it’s reaching a tipping point where it’s going to be impossible not to understand where people are and what they are doing. Everything will be monitored. It’s part of the reason why when we put together the definition of biometrics it included biological and behavioural characteristics — it can be anything.”
To back up his point, Waggett pointed to a few of the futures once portrayed in science fiction films that are now a reality. Minority Report is generally the go-to film for these kinds of comparisons, but it was the commercial aspects of the film Waggett flagged up, rather than the gesture technology. In the film, the protagonist walks into a shop where an advert immediately pops up and draws on his past preferences to offer suggestions. “The one thing they got wrong is you won’t recognise you’re being scanned — the flashing red light in the film is for effect, but all that’s now feasible.”
It is a perfect example of why we need to be more aware than ever of what data we are giving up, and of how companies can best use that data without infringing on customer privacy and potentially threatening that relationship.
The EyeSee mannequins fitted in stores across Europe, Canada and the US in 2012, which used facial recognition to gather age, sex and racial data on retail customers so that stores could tailor their marketing accordingly, are a great example of how not to engender customer trust. Just this week, meanwhile, Iconeme, a technology and design company, launched its VMBeacon mannequin system, which uses beacon technology to automatically alert customers’ smartphones to product details via an app. When a customer comes within 100m of one of the mannequins, they receive an alert about the available content, including details of the items the mannequin is wearing and links to purchase them straight from the shop’s website. It operates 24/7, so a passerby can buy an item while window-shopping, rather than entering the store.
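To make the mechanics concrete, here is a minimal sketch of how beacon-driven proximity alerting can work, assuming the standard log-distance path-loss model for Bluetooth signal strength. The 100m threshold comes from Iconeme’s description; the constants, class names and alert function are illustrative assumptions, not Iconeme’s actual code.

```python
# Hedged sketch of beacon-style proximity alerting. The path-loss constants
# and the Beacon/alert API are assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class Beacon:
    mannequin_id: str
    tx_power: int  # calibrated RSSI at 1 m, in dBm (e.g. around -59)

def estimate_distance_m(rssi: int, tx_power: int, path_loss_n: float = 2.0) -> float:
    """Estimate distance in metres from a received signal-strength reading."""
    return 10 ** ((tx_power - rssi) / (10 * path_loss_n))

def maybe_alert(beacon: Beacon, rssi: int, threshold_m: float = 100.0) -> None:
    """Fire a product alert when the phone is within range of the mannequin."""
    distance = estimate_distance_m(rssi, beacon.tx_power)
    if distance <= threshold_m:
        print(f"Alert: items worn by mannequin {beacon.mannequin_id} "
              f"(~{distance:.0f} m away), tap for details and purchase links")

maybe_alert(Beacon("window-01", tx_power=-59), rssi=-85)  # ~20 m: alert fires
```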
There will, however, be options to “share with friends and access additional offers and rewards”, which will no doubt come with the proviso that customers share yet more personal data. For now, the system comes with privacy settings, so the user can choose whether or not to share basic data such as age and gender.
It is feasible, however, that if this type of technology gains traction, customers will become willing to open up their data for increasingly attractive “rewards”. Retailers already have the option of asking customers to sign in to the app with Facebook or Google+, as with most apps, which could open up a whole realm of analytics options depending on the user’s privacy settings. Add a camera to that mannequin and pair it with facial recognition tools like Facebook’s, and it could soon be greeting you by name, asking how you feel today, or pointing out that your clothes are looking particularly shabby.
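As a sketch of how a camera-equipped mannequin could greet shoppers by name, the snippet below uses the open-source face_recognition Python library as a stand-in for any commercial face-matching service (the article mentions Facebook’s tools, whose internals are not public). The filenames and the enrolled “customer” are hypothetical.

```python
# Hedged sketch of name-based greeting from a camera frame, using the
# open-source face_recognition library. Filenames and the one-person
# "customer database" are hypothetical placeholders.

import face_recognition

# Enrol one known customer from a reference photo (hypothetical file).
known_image = face_recognition.load_image_file("alice_profile.jpg")
known_encoding = face_recognition.face_encodings(known_image)[0]

# A frame captured by the mannequin's camera (hypothetical file).
frame = face_recognition.load_image_file("camera_frame.jpg")

# Compare every face in the frame against the enrolled encoding.
for encoding in face_recognition.face_encodings(frame):
    match = face_recognition.compare_faces([known_encoding], encoding)[0]
    if match:
        print("Good morning, Alice - how are you feeling today?")
```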
“The pressure to ID people is becoming more and more important with things like the internet of things,” says Waggett. If we are to make the most of those future networks securely, we’re going to have to free up more of our biometrics. “Google Glass wants to block facial recognition to stop people using invasive technology, but I think a lot of these things can be used for good.”
Of course it’s not just facial recognition tools that are on the cusp of becoming widespread. Car seats can read your heartbeat, and you can even drive a vehicle with your brain power, says Waggett.
“We did a challenge with the BBC where we drove a London taxi using a gaming headset picking up brainwaves. You can be identified as an individual from different brainwaves and effectively block out anyone else from driving that car because of those unique signals.”
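The article does not describe how the taxi demo’s identity check worked, so the sketch below only illustrates the gating idea Waggett describes: compare a live brainwave feature vector from the headset against an enrolled template, and unlock the controls only on a close match. The feature representation, similarity metric and threshold are all assumptions, not details of the IBM/BBC demo.

```python
# Illustrative sketch of biometric gating from brainwave features, assuming
# the headset exposes a fixed-length EEG feature vector per reading.

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def unlock_controls(live_features: np.ndarray,
                    enrolled_template: np.ndarray,
                    threshold: float = 0.95) -> bool:
    """Only a signal closely matching the enrolled driver unlocks the car."""
    return cosine_similarity(live_features, enrolled_template) >= threshold

rng = np.random.default_rng(0)
template = rng.normal(size=64)                        # enrolled driver's signature
driver = template + rng.normal(scale=0.05, size=64)   # same driver, noisy reading
stranger = rng.normal(size=64)                        # someone else's brainwaves

print(unlock_controls(driver, template))    # True: controls unlock
print(unlock_controls(stranger, template))  # False: car stays locked
```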
He conceded that yes, people are attempting to work out ways to break these systems, just like any other. But as fellow panellist John Bustard, a lecturer at Queen’s University Belfast’s Centre for Secure Information Technologies, pointed out, many of these issues are already being addressed. Bustard, an expert in gait biometrics, explained how his team has been attempting to hack into academic-level biometric systems. “These attempts sometimes showed how we could reduce the effectiveness of biometrics; sometimes they rendered them completely ineffective.”
“However, countermeasures to reduce effectiveness of these types of attacks addressed the problems in all cases. In some cases, such as with gait, it actually resulted in a system that was superior to the original. We’re at the early stages, but this is an arms race, this kind of research.”
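One simple form such a countermeasure can take, sketched below purely for illustration, is layering a liveness (anti-spoofing) check on top of the identity matcher, so that a strong match alone is not enough to be accepted. The scores and thresholds here are invented; real anti-spoofing systems are far more sophisticated.

```python
# Hedged sketch of a spoof countermeasure layered onto a matcher: acceptance
# requires both an identity match and a liveness check. Scores and thresholds
# are invented for illustration.

def accept(match_score: float, liveness_score: float,
           match_threshold: float = 0.8, liveness_threshold: float = 0.5) -> bool:
    """Reject presentation attacks even when the identity match is strong."""
    return match_score >= match_threshold and liveness_score >= liveness_threshold

print(accept(match_score=0.93, liveness_score=0.9))   # genuine user: accepted
print(accept(match_score=0.97, liveness_score=0.1))   # replayed photo: rejected
```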
This is the process that would be undertaken when testing any kind of security system, and it backs up Waggett’s point that biometrics are really just a new tool in our existing arsenal of defences. How we deploy it, not whether we deploy it at all, is the central question that needs answering.
“Biometric systems are becoming much more accurate and ubiquitous,” said Waggett. “It is impossible not to be identifiable by some kind of signal you’re leaving behind. Accuracy is going up almost exponentially and we are dealing with concerns about privacy and how we map that.
“But trying to stop this would be fighting the wrong battle. The information is out of the bottle already — we have to deal with the issues surrounding it now. Embrace the challenge of what we’ve got, embrace understanding it and focus on what we can do with that new data.”
http://www.wired.co.uk/news/archive/2014-03/26/biometrics-the-good-and-bad
“Surrender” is the gist of the article here, and it’s hysterical how they portray Google as the good guys:
“Google Glass wants to block facial recognition to stop people using invasive technology, but I think a lot of these things can be used for good.”
Consumers can put a stop to this, like anything else, by simply refusing to buy the stuff and by expressing their anger at being watched 24/7. Ending this really comes down to whether people can resist buying the latest gadget and have the cojones to stand up and express their disdain for the technology.
Yes, if you sit there silently like a sheep and buy whatever the TV tells you to, then you’re doomed to having these products spy on you 24/7, and to paying for it too.