The Atlanta Police Department (APD) is gearing up to place a whopping 12,000 extra cameras around the city and, by extension, our campus grounds, in hopes of lowering crime rates. When it comes to surveillance, the rather thin line between invitation and intrusion can be blurred. With the addition of 12,000 more “eyes,” it will be.
Our city is no stranger to crime. In fact, we have a hefty helping of it, with Atlanta’s violent crime rate at more than three times the U.S. average, according to City-Data.com. So, while this rate has decreased steadily over the past thirteen years, the need for extended safety precautions, like extra surveillance cameras, seems justified.
Recruiting people to our city for business and residence is a priority right now, so safe streets are a priority as well. We’re a ripe city, on the cusp of yet another great economic harvest, with growing businesses in both the entertainment and corporate industries and a brand new, shiny Ferris wheel. The last thing the city wants is a headache brought on by increasing crime and decreasing investment.
But to what extent are we willing to go in order to “polish” the city?
Pretty far.
What’s initially outrageous about the APD’s initiative is the sheer scale of extra surveillance it’s pushing for. The addition of 12,000 extra cameras reads like a push for intimidation and a flexing of power rather than a push for comfort and security, and we’ll feel the effects of this. Many liken the sea of cameras to a “mass surveillance society,” in which most, if not all, of the population is being monitored at any given time.
With this comes the fear of losing civil and political rights and freedoms, and of a drift toward a totalitarian state in which the government controls your every move. While this fear may seem a bit outlandish to some, 12,000 extra cameras are a step, albeit a small one, toward such a state if that power is abused by authorities.
The National Security Agency (NSA) has seen its share of lawsuits for allegedly abusing such power. The most notable example is the NSA warrantless surveillance controversy under the Bush administration, in which the phone calls, text messages, emails and other electronic communications of U.S. citizens were tapped without a warrant under the premise of counterterrorism.
We want to walk the streets and feel like the city is ours. We don’t want to feel like we’re in a remake of “Men in Black,” where our world is actually a bubble being carefully observed by some suits in an anonymous location somewhere. Unless they’ve got a kinky habit, people simply don’t like to be watched by authority figures like the police, and we’d rather not know that we’re being watched.
“There’s no reason for a city to be building a giant video surveillance network like that. It will create some significant chilling effects on society if every minute, people know that they’re being watched,” American Civil Liberties Union Senior Policy Analyst Jay Stanley said.
What’s even more disturbing than the number of added eyes is what they aim to see. In a scenario reminiscent of the blockbuster “Minority Report,” your face and your movements could be used to aid the police in what they call “predictive analysis.”
Facial recognition software is currently being tested, and while this could prove beneficial in catching criminals, it seems to strip away our identity as warm-blooded individuals, leaving a cold robotic corpse. We may as well wear identification signs on our foreheads. After all, what info will the authorities receive upon recognition of your face? Age? Race? Occupation? Criminal history? Whatever the info, it seems questionable that facial recognition would be used in assessing a situation when issues like racial profiling still permeate our atmosphere.
But facial recognition isn’t the biggest splinter in the woods. Also being tested is a crowd behavior modeling and prediction program. This camera surveillance software will not only be able to identify suspicious packages but also to predict fights based on crowd movement, supposedly preventing crimes before they happen. Like facial recognition software, it could be beneficial, but it could be just as damaging.
Movement alone has never been a reliable way to assess a situation. Many people have been harmed or killed by police for reaching for what seemed like a weapon or moving in a manner that seemed threatening. Each of these mistakes is one too many, and this new program could lead to several more.
An animated crowd of students gathering to protest immigration laws could read very differently on camera and draw the presence of authorities. A gathering of student emcees engaging in a cypher in front of Walter’s shoe store could be mistaken on camera for a brewing fight. While this software may be intelligent enough to recognize patterns of crowd activity that lead to criminal activity, it’s still a machine, and machines cannot predict every move a human being will make.
Both of these technologies could lead to discrimination and civil unrest if abused.
Again, the addition of more eyes around a city that still has a pretty hefty amount of crime is welcome. It’s the “eyes behind the eyes” that have to remain objective and reasonable. Technology doesn’t create a totalitarian society, nor does it cause discrimination and wrongful deaths. People who abuse technology cause these things.
Students and residents alike should seek more info on the software being tested and the intentions of its operators. Only with an understanding of both can we work with authorities, creating a cooperative society instead of a merely operative one.