
Op-Ed: Controlling extremist content goes beyond major media platforms

By Ken Hanly     Mar 17, 2019 in Technology
The shooter in the terrorist mass shooting at two mosques in Christchurch, New Zealand, live-streamed the attack on Facebook for 17 minutes; the video was quickly reposted to YouTube among other sites.
The issue of smaller sites
A recent Verge article notes: "But Facebook, Google, and Twitter aren’t the only places weighing how to handle violent extremism. And traditional moderation doesn’t affect the smaller sites where people are still either promoting the video or praising the shooter. In some ways, these sites pose a tougher problem — and their fate cuts much closer to fundamental questions about how to police the web. After all, for years, people have lauded the internet’s ability to connect people, share information, and route around censorship. With the Christchurch shooting, we’re seeing that phenomenon at its darkest."
Major platforms such as Facebook and YouTube removed postings of the whole video, but portions of it remain on these platforms as part of news coverage of the event. A recent article notes: "YouTube also has a system for immediately removing child pornography and terrorism-related content, by fingerprinting the footage using a hash system. But that system isn’t applied in cases like this, because of the potential for newsworthiness. YouTube considers the removal of newsworthy videos to be just as harmful. YouTube prohibits footage that’s meant to “shock or disgust viewers,” which can include the aftermath of an attack. If it’s used for news purposes, however, YouTube says the footage is allowed but may be age-restricted to protect younger viewers."
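The hash-based fingerprinting described in the quoted passage can be sketched roughly as follows. This is a minimal illustration, not YouTube's actual implementation: it uses exact SHA-256 digests, whereas production systems rely on perceptual hashes that survive re-encoding and cropping, and the `fingerprint`, `blocklist`, and `should_remove` names here are hypothetical.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a hex digest identifying this exact byte stream.
    (Real systems use perceptual hashes robust to re-encoding.)"""
    return hashlib.sha256(data).hexdigest()

# Hypothetical blocklist of fingerprints of known prohibited footage.
blocklist = {fingerprint(b"known-prohibited-video-bytes")}

def should_remove(upload: bytes, newsworthy: bool) -> bool:
    """Flag an upload for removal if it matches the blocklist.
    Per the policy described above, newsworthy uses are not removed
    (they may instead be age-restricted)."""
    return fingerprint(upload) in blocklist and not newsworthy
```

The newsworthiness exception is exactly why, as the article notes, automatic matching cannot simply be applied here: the same footage is prohibited in one context and permitted (with restrictions) in another, a judgment a hash lookup cannot make.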
However, it seems the central hub the shooter used was 8chan. The image-board community includes members who support far-right extremism. The site is already excluded from Google's search listings and has been dropped by one hosting service over problems with child pornography, though 8chan claims it vigorously deletes any child pornography it receives. Thus there are still sites where the video is promoted and the shooter is even praised. Following the shooting, Internet service providers in New Zealand blocked 8chan and a few other sites.
The problem with this sort of action is that it puts service providers in the position of deciding the boundaries of free speech. Over the last several years, far-right sites have been deplatformed as payment processors, domain registrars, hosting companies, and other infrastructure providers withdrew support. This deplatforming is associated not only with the far right but also with the far left. Groups classified as extreme or hateful are denied use of the Internet for funding, as happened with the crowdfunding sites Hatreon and MakerSupport. The social network Gab was temporarily knocked offline.
Control of all sites difficult if not impossible
Since there are still numerous companies providing Internet services, a site that is taken down can often find another provider willing to host it. The right-wing Daily Stormer, for example, was able to return online after several bans. The social network Gab received public support from a Seattle-based domain registrar. The troll haven Kiwi Farms links to a BitTorrent file of the New Zealand massacre video; BitTorrent distribution requires no hosting on any central platform.
The Cloudflare approach
Cloudflare is a company that helps protect sites against denial-of-service attacks. The company's general counsel Douglas Kramer said: “We view ourselves as an infrastructure company on the internet. We are not a content company. We don’t run a platform or create content or suggest it or moderate it. And so we largely view our point of view as one driven by neutrality.”
Kramer compares the company deciding what content it will protect to a truck driver deciding what content he is allowed to transport in the newspapers he delivers. However, Cloudflare did ban the Daily Stormer, but only after the site claimed that Cloudflare endorsed its white supremacist ideology. Cloudflare has urged governments to develop mechanisms for fighting problematic material online, arguing that they have a legitimacy that web platforms lack.
Government control of Internet content
Personally, I do not think that social media giants should be deciding what content is allowable on the Internet. This article argues that allowing this is not just dangerous but futile. However, it may be equally dangerous, and equally futile, to let service providers and other infrastructure companies control content.
Government control of content, though, is also dangerous and should be kept to a minimum. Social media companies should be warned to remove content that is criminal. If individuals are using the Internet to promote hate speech, then they can be warned to desist, or criminal action can be taken against them or the service provider.

Personally, I am against hate speech laws altogether. I think they drive the haters underground and lead them to violent action, since even their communications are treated as unlawful; the establishment in effect makes their speech criminal. Note that in many countries hate speech laws make speech opposed to or disrespectful of the ruler, the king, or the armed forces criminal. The idea that you can prevent hateful ideas by banning their communication seems to me, on the face of it, to encourage the growth of such ideas and drive them underground, where they become even more dangerous. Many intelligence services want jihadist sites to remain up so they can keep track of what the jihadists are doing and thinking. The repression of thoughts, hateful or not, shows a lack of confidence in the ability of a populace to fight hateful and erroneous ideas.
At present, for many people a responsible Internet is one that does not allow the expression of negative, hateful speech on social media; it will not tolerate such speech and will do its best to eliminate it. On this view, that is what being socially responsible means. To me, a responsible Internet would have social media that tolerated such negative speech. Tolerance is essential to freedom, and tolerance involves allowing the expression of much that one regards as evil. One should speak out against it, not ban it.
This opinion article was written by an independent writer. The opinions and views expressed herein are those of the author and are not necessarily intended to reflect those of this publication.