The blood-curdling screams as a gunman stalks his chosen prey from room to room are truly disturbing. Computer algorithms, however, don’t have blood that can be curdled. And that’s the problem facing technology today: we’re experts at building tools that prey on universal insecurities, but terrible at keeping people safe.
Case in point: Facebook and YouTube’s inability to contain the Christchurch massacre livestream and the viral videos that followed. While both platforms have an army of human moderators, the bulk of the content-removal process is algorithm-based, and those algorithms are better suited to copyright protection than to filtering human atrocities.
Facebook is on record as removing one and a half million videos within the first 24 hours after the attack. In the click economy, sharing and posting viral videos is big business, and altering reproductions to stay ahead of the takedown computers is an essential skill. Change the key or pace of the audio, put a watermark on the video, splice it into other footage… These are only a few of the AI-beating measures taken to avoid detection.
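To see why those small edits defeat the takedown systems, here is a minimal sketch, in Python rather than anything the platforms actually run: flip a single bit of a stand-in payload and an exact digest changes completely, while a crude perceptual-style fingerprint barely moves. The payloads, function name and chunking scheme are all illustrative assumptions, not a description of Facebook’s or YouTube’s matching.

```python
import hashlib

# Stand-in byte payloads: "original" plays the role of the first upload,
# "altered" a re-uploaded copy with one tiny change (a watermark, a pitch shift, a splice).
original = bytes(range(256)) * 64
altered = bytearray(original)
altered[0] ^= 0x01  # flip a single bit

# Exact matching (the simplest copyright-style approach): any edit at all
# produces a completely different digest, so the copy sails past the filter.
print("exact  :", hashlib.sha256(original).hexdigest()[:16],
      hashlib.sha256(bytes(altered)).hexdigest()[:16])

# A crude stand-in for a perceptual fingerprint: average the payload in coarse chunks.
# Small edits barely move it, which is why real systems hash what the content
# "looks like" -- and why re-uploaders keep splicing and re-scoring until even that drifts.
def coarse_fingerprint(data: bytes, chunks: int = 8) -> list:
    size = len(data) // chunks
    return [sum(data[i * size:(i + 1) * size]) // size for i in range(chunks)]

print("coarse :", coarse_fingerprint(original))
print("coarse :", coarse_fingerprint(bytes(altered)))
```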
The shooter knew this would happen, which only makes him more dangerous. His forum posts and manifestos are cesspools of memes and inside jokes from the gutters of the internet. What I find chillingly ironic, however, is that brands would pay a lot of money for this kind of weaponising of internet culture.
Fame wasn’t the shooter’s game, though. I believe his unwitting aim was to bare the underbelly of our beloved communications network and let us sniff the unwashed navel.
“All of these nationalist, religious fundamentalist and racist movements have one answer: they say the best time was in the past,” says Thomas Thurner, research chair in innovation in society at CPUT. “They are all counter-movements to this very pro-technology world we live in and are trying to relieve the tension of this very intense and fast reality.” I interviewed Thurner for an article in the May issue of TECH magazine, and that thought stuck with me.
As we lose our humanity to the onward march of technology and innovation, everything we know about the world is questioned. Science forges ahead with new findings that undercut creationist accounts of the planet and our evolution. We find out that we are not cast in the image of a divine power and, even worse, that we are mere animals.
The computers don’t care about the fragility of the human condition; all they can do is execute their programming. Data, while open to interpretation, will reveal the truth, no matter the consequence. Learning that you’re not special, and then being thrown into the same pit as those you considered lesser, is a difficult pill to swallow if you cling to the nostalgia of a time when everything favoured those who look like you. Maintaining your vision of the status quo can easily become a life-or-death situation.
Technology, in many ways, is a faceless threat to humanity. But when it empowers the lesser to rise up and level the playing field, it takes on that community’s face. And because most of our social media platforms have been co-opted by marketers to sell us things by leaning on our insecurities, that face is read as the embodiment of that emotional victimisation.
As much as I love what humans have managed to create, I think this tragedy is a reminder that we should re-evaluate the mental toll of the social internet. The shooter had access to the very same services we all do. He could curate his information sources to support his views. He could surround himself with like-minded people from all over the world, and then he sought to appease their shared hunger for entertainment with a grotesque exhibition.
In the same way technology enabled the relatively free movement of skills and people to create the “problems” he was trying to address, it gave him the platform to promote his vile acts. AI doesn’t hear the terrified screams of the victims. AI can’t interpret his actions as an act of terrorism. Machine learning can neither predict these kinds of impulsive actions nor piece together the online puzzle that led him to act in the manner he did. Only a human with empathy for his humanity can do that.
For better or worse, the Christchurch shooter is one of our own, and it seems unfair to expect technology to solve the problem or limit its reach. We make the technology.