Fear the Robot Revolution: Dallas

The "Robot" used in the Police Killing of an armed and dangerous Dallas man, suspected in killing five DPD officers and wounding seven more was, to be sure, a rolling drone, somewhat modified, rather than some autonomous homunculus on a mission to kill a man. 

Still, it's a disquieting moment when a device like this remote-controlled unit, designed to investigate and possibly remove bombs, is used to deliver one that was meant to detonate.

Top of mind is the detachment and ease with which this state-sanctioned killing took place. Still, one has to wonder whether the use of the robotic device was at least as detached and easy as the act of a man, fuelled by hate, pointing a high-powered rifle at unsuspecting law men and women and pulling the trigger, hitting 12 people. Horrifying.

CNN describes how the situation unfolded here (story and video), and ZDNet discusses the controversy of death by government robot here. It's important to give some thought to these issues as we move toward a world where our technology penetrates every facet of our civil life.

"Facebook Decides Which Killings We’re Allowed to See"

Joseph Cox and Jason Koebler, writing for Motherboard:

A video of the aftermath of a fatal shooting of a black man by a police officer was temporarily removed from Facebook. The company has said the removal was due to a “technical glitch.”

“We're very sorry that the video was inaccessible,” a Facebook spokesperson told The Telegraph. “It was down to a technical glitch and restored as soon as we were able to investigate.”

They go on to say:

The video has since been restored, but with a “Warning—Graphic Video” disclaimer.

“Videos that contain graphic content can shock, offend and upset. Are you sure you want to see this?” the disclaimer continues.

Judging by the timestamps on tweets, the video was restored within around an hour of being removed.

Facebook did not respond to a series of questions about the apparent glitch, or if the video was flagged by a user or by Facebook itself.

This is problematic. Just last week, Facebook announced that it was going to change its news feed algorithms to favour content generated and uploaded by users over that from news sites like Upworthy and CNN. Here we have someone attempting to make sure the world sees the injustice she believes she and her lover are facing, and Facebook essentially silences her for a time.

Now, it could be that of the millions of users who viewed the video, thousands or tens of thousands flagged the content as inappropriate. If that's the case, then Facebook should just say so.

Given the aforementioned algorithm change and the recent kerfuffle over its Trending News section suffering from liberal editorial bias, Facebook is in jeopardy of losing the trust it has regained over the last several years, after multiple privacy issues.

The bottom line is that while the social media giant is clearly a private company with its own processes, Facebook needs to come up with a clear set of guidelines for those within and without the company to work with. That would put an end to this kind of confusion, as well as build and maintain trust.