Op-Ed: Instagram to remove ‘graphic’ images of self-harm – how sweet

By Paul Wallis     Feb 8, 2019 in Internet
Sydney - Serial selfie site Instagram has graciously condescended to remove all images of “graphic self-harm” from the site. The move comes after yet another young British teen, Molly Russell, took her own life following exposure to traumatic content.
The removal of these images looks like part of a superficial damage-control model seen far too many times before online. Any number of kids die, and some irresponsible collection of clowns finally does something about it.
WHEN does social media take responsibility?
This isn’t exactly impressive. Instagram is the new home of Insta-celebrities, a must-see for kids of all ages, and it’s being used as the forum for all the typical online horror stories. This horrible situation may not be Instagram’s idea, but to do nothing is inexcusable. Pinterest, rather annoyingly, is also in the firing line over image issues.
There’s no shortage of stories of teen suicides caused by online interactions with images, people, and a culture which, let’s face it, is nothing less than repulsive. Social media does have practical limitations in how fast it can respond to posts, true. You can only delete posts and block users so efficiently.
That said – chronic abuse of social media sites and users over decades is hardly new. Nobody should be surprised that self-harm imagery in particular, a peer-group standard that includes any sort of “tough” imagery, is rife on social media. The difference in this case is that self-harm is a particularly serious, counterintuitive behaviour, and the risks are immediate and very real.
Meanwhile, let’s clarify – there is absolutely nothing acceptable about the current state of social media and the risks to kids. This must cease, and cease now. Nothing less than a safe environment can be tolerated. Instagram also needs to be aware that not all parents and others involved may be quite so legally tolerant. If everyone affected by online abuse decided to mount a class action, even a single successful lawsuit would be a Roe v. Wade moment for social media.
Can regulation work?
More likely, despite “modern” politics in the United States and elsewhere, is regulation of some kind, probably of the ultra-conspicuous but far-reaching variety. You know the sort of thing – “For our kids”, decades later.
The trouble with regulation, although its potential value is undeniable, is that it could be very broadly defined. Add to this multiple jurisdictions, and the fact that “online” doesn’t mean immunity from any type of law enacted anywhere, and you see the problem.
That’s not necessarily good news for enforcement, social media sites, kids, parents, or anyone else. Whatever’s done MUST work at the time of posting, or we’re going to have busloads more kids taking their own lives because of idiots posting some type of crap online.
Stop it before it starts. Also check incoming accounts for abusers. Fake accounts, dodgy IDs – there’s a chance of spotting these guys before they can do anything.
I hesitate, sadly, to say “parental monitoring”, because the likely upshot is parents becoming the enemy while trying to protect their kids from things kids don’t really understand. “Discreet” parental monitoring might be better – monitoring without obvious clashes. I think that is possible, with decent software and a good hands-on/hands-off balance for parents and kids.
One thing is for sure – anything that works has to be better than a multi-billion-dollar sector that doesn’t. Sufficient comment?
This opinion article was written by an independent writer. The opinions and views expressed herein are those of the author and are not necessarily intended to reflect those of DigitalJournal.com