UPDATED 13:59 EDT / JULY 04 2016


Facebook called “a monster” by Israeli security minister

There have been plenty of positive things to come out of the social media revolution over the last decade, and tools like Facebook and Twitter have been used for everything from tracking family members during a natural disaster to bringing human rights violations to international attention. Unfortunately, just as social media can be used for good, it can also be used to harass, incite violence, and even coordinate terrorist attacks.

Nearly every social media platform, including Facebook, has its hands full trying to prevent bad actors from using its tools for nefarious purposes. But in a recent interview, Israel’s Minister of Internal Security, Gilad Erdan, accused Facebook of not doing enough to stop terrorists from using its platform to incite violence against Israel.

“Facebook today, which brought an amazing, positive revolution to the world, sadly, we see this since the rise of Daesh (Islamic State) and the wave of terror, it has simply become a monster,” Erdan told Channel 2 television (via Reuters).

“Facebook today sabotages, it should be known, sabotages the work of the Israeli police, because when the Israeli police approach them, and it is regarding a resident of Judea and Samaria, Facebook does not cooperate,” he continued. “It also sets a very high bar for removing inciteful content and posts.”

With roughly 1.09 billion daily users, Facebook might seem to face an impossible task in actively monitoring the constant stream of content flowing through its platform, but this is one area where the social network’s investment in artificial intelligence research could pay off in a big way. Even a behemoth corporation like Facebook does not have the resources to employ real people to monitor all of the data coming through its site, but through natural language processing, a smart AI could automatically identify harmful posts or messages and remove them.

Of course, the definition of what is “harmful” could be up for debate, and given that Facebook has been repeatedly accused of censoring speech on behalf of governments in countries like Turkey, the company may find that solving one problem only raises several more.

