
Data, privacy and crime with Google Glass

Two weeks ago, Google co-founder Sergey Brin presented Google Glass at TED. This week, there’s an army of Glass-wearers wandering around the SXSW conference in Austin, Texas. Suffice it to say, Google Glass has generated a lot of discussion lately. There’s been plenty of excitement about the technology, but also plenty of concern.

We linked to a great article last week about the “Google Glass feature no one is talking about.” It said:

“The most important Google Glass experience is not the user experience – it’s the experience of everyone else. The experience of being a citizen, in public, is about to change.”

Concerns about privacy aren’t anything new, although the article does raise important questions about new implications that have perhaps been overlooked in favour of technological achievements. It’s true that wearing a device that can record audio or video on voice command, and interact with other headsets in the area, takes those concerns to a whole level beyond smartphones. Many of the comments on the article, though, point out the positive flipside: the potential to reduce crime.

We’ll be doing a series of posts this week looking at how technology such as Google Glass will affect law and order, and our interactions with both. Today, we look at the potential benefits of this technology.

In his (excellent) book, The Triumph of the City, Ed Glaeser paints this picture of crime:

“In the game of Clue, players solve a murder by progressively eliminating all the possible suspects. Real cops often do the same thing, but this process is much harder in cities because there are a lot more suspects to consider. As a result, the probability of being arrested for any given crime drops by about 8 percent as city population doubles.”

The book asserts that the probability of getting caught is a larger deterrent than harsher sentences. Cities, in other words, give criminals an advantage.

But that could change thanks to products like Google Glass.

Before this starts sounding too much like a science fiction film (Minority Report, maybe?), consider that police in Oakland, California already use software called ShotSpotter. According to this (excellent) article in The Guardian, ShotSpotter taps into hundreds of hidden microphones around the city and “not only alerts the police to the sound of gunshots but also triangulates their location.” The potential of a system like this when everyone is wearing an audio/video-enabled headset is intuitively significant.
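ShotSpotter’s actual algorithms aren’t public, but the core idea the Guardian describes, locating a gunshot from the different times it reaches several microphones, is a standard technique called time-difference-of-arrival (TDOA) multilateration. Here is a minimal, purely illustrative sketch with hypothetical microphone positions: because the moment the shot was fired is unknown, we match arrival-time *differences* rather than absolute times, using a simple grid search for clarity.

```python
import math

SPEED_OF_SOUND = 343.0  # metres per second at roughly 20°C

# Hypothetical microphone positions (metres) on a 400 m city block.
MICS = [(0.0, 0.0), (400.0, 0.0), (0.0, 400.0), (400.0, 400.0)]

def arrival_times(source, t0=0.0):
    """Time at which each mic hears a shot fired at `source` at time t0."""
    return [t0 + math.dist(source, m) / SPEED_OF_SOUND for m in MICS]

def locate(times, step=1.0):
    """Grid-search the point whose predicted arrival-time differences
    best match the observed ones (TDOA). Differencing against the first
    mic cancels the unknown emission time t0."""
    obs = [t - times[0] for t in times]
    best, best_err = None, float("inf")
    x = 0.0
    while x <= 400.0:
        y = 0.0
        while y <= 400.0:
            pred = [math.dist((x, y), m) / SPEED_OF_SOUND for m in MICS]
            pred = [p - pred[0] for p in pred]
            err = sum((p - o) ** 2 for p, o in zip(pred, obs))
            if err < best_err:
                best, best_err = (x, y), err
            y += step
        x += step
    return best

# A simulated shot at (120, 250) should be recovered to within the grid step.
est = locate(arrival_times((120.0, 250.0)))
```

A production system would use least-squares solvers, noisy timestamps, and acoustic classification rather than a grid search, but the geometry is the same: more microphones (or headsets) in the area means more time differences and a tighter fix.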

There is also software such as PredPol (predictive policing), which was reported in that same Guardian piece as being inspired by Amazon’s ability “not only to anticipate but also to promote or otherwise shape future behaviour”. While this software currently incorporates only historically-recorded crime statistics, it could be expanded to include the masses of additional personal data that will also be captured by technologies such as Google Glass.
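PredPol’s model is proprietary, so the following is only a toy stand-in for the general idea of forecasting from historical crime statistics: score each map cell by its past incidents, weighting recent ones more heavily, and direct patrols to the highest-scoring cells. The incident data and cell names here are invented for illustration.

```python
from collections import defaultdict

# Hypothetical historical incidents: (week_number, grid_cell) pairs.
incidents = [
    (1, "A1"), (1, "A1"), (1, "B2"),
    (2, "A1"), (2, "C3"),
    (3, "A1"), (3, "B2"), (3, "B2"),
]

def hotspot_scores(incidents, current_week, decay=0.5):
    """Score each cell by exponentially decayed incident counts, so
    recent crimes weigh more than old ones — a crude stand-in for the
    far richer statistical models used by real predictive-policing
    software."""
    scores = defaultdict(float)
    for week, cell in incidents:
        scores[cell] += decay ** (current_week - week)
    return dict(scores)

def top_cells(scores, k=2):
    """The k cells to prioritise for patrol this period."""
    return sorted(scores, key=scores.get, reverse=True)[:k]

scores = hotspot_scores(incidents, current_week=4)
patrol = top_cells(scores)
```

Even this toy version shows the concern raised later in this post: the output looks authoritative, but it is entirely determined by what went into the historical record and by tuning choices like the decay rate, none of which are visible to the people being policed.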

Even without advanced algorithms, technology like this could empower individual citizens. Just last week, a Mardi Gras attendee being brutally handled by police was filmed on a smartphone and the footage circulated online. Considering that a phone still requires manual action, the compression of time and effort afforded by voice-commanded audio/video recording with cloud storage and seamless sharing will likely increase such occurrences. Police already rely on CCTV footage to help solve crimes. When everyone is a potential walking CCTV camera, criminals as much as innocent citizens will be vulnerable to filming, which is something to consider.

Google Glass sits at the heart of the quantified self movement, which seeks to “incorporate technology into data acquisition on aspects of a person’s daily life.” Whether we’re willing to make that trade-off is another question.

Apart from the impacts on privacy, there’s also the question of oversight of the technology. As that same Guardian article points out, we know nothing of the Amazon algorithms on which PredPol is based. Even if they were open and available, relatively few of us would be able to interpret them.

Algorithms for policing software could be similarly opaque, which might be justified as minimising the chance that criminals could game them. In which case, we need to ask: who’s going to monitor (if not police) the algorithms that may police us? How will they avoid “filter bubble”-style phenomena?

We don’t yet know what shape any change from Google Glass will take, in part because Google like to release products without instructions to see what bottom-up uses emerge. The crime prevention discussed above is one obvious use.

But as we prepare for that change, it’s important to consider both the positives and the negatives. That is, are we happy to give up some privacy if it helps reduce crime, or does this create some sort of uber-Stasi, and we’d rather take our risks with the criminals?