“This photo, video or account does not violate our Community Guidelines”? Well, change your guidelines or practice what you preach, Instagram

By Penny Hoffmann

What do Instagram employees actually do to investigate content as serious as child pornography? Much of the time, a quick scroll through the account is all it takes to see it.

If people who do not violate Instagram’s community guidelines have their content taken down – sometimes faster than at other times – it makes one question whether the staff – people, or perhaps even bots – should remain employed when they are not fulfilling their job’s responsibilities.

Pedophile gangs and nearly anything you could think of are present on social media.

When these gangs successfully lure victims into danger, the victims are manipulated or blackmailed into posing for and sending nude images.

These images can then be uploaded to Instagram, other social media platforms and elsewhere for the gratification of these criminals and others.

When this content is uploaded, the time it takes to investigate who is responsible and to take it down matters tremendously. The longer the content stays up, the longer the victim – and potential future victims – go without justice, and the longer these criminals go without proper intervention.

Everyday Instagrammers are resorting to putting their own lives on the line to fight this content.

One Instagrammer is @swat.captain__:

“Today i saw a child being raped and it took Instagram five hours to take it down after many reports.”

“Yesterday i saw videos of a child being beat. Instagram did nothing.”

Swat Captain recommends that Instagram employ more people to review pornographic and violent content because, if bots are being used, they are not efficiently upholding Instagram’s community guidelines:

“Instagram uses bots to do their work i believe and i don’t blame them. There’s a ton of reports coming in all the time about this kind of stuff, but i’d say they should check pornography or violence reports because for some reason a machine can’t.”

Due to this Instagram account’s work, one pedophile will be meeting police in California instead of a little girl and her cousin.

It is advised that children and teenagers have parental or adult guidance when using social media. Many child manipulation cases are never reported, so they never receive a proper resolution. Parents can themselves be perpetrators, making it even more difficult for children and teenagers to reach out.