To make it easier to catch and resolve volatile situations early on, Facebook is changing how content is reported and giving users tools to better communicate their feelings and handle conflicts themselves. The first change is aimed specifically at 13- and 14-year-olds. If a teen in that age range wants to report a mean or threatening post or image a schoolmate has put on Facebook, they can click “This post is a problem” and answer a series of casually worded questions to determine what kind of issue they are having and how serious it is.

Kids aren’t the only ones who need a little help communicating their feelings on the Internet. Facebook looked at photos reported for removal by users of all ages, flagged for offenses like pornography, hate speech, or depictions of drug use. When the team started to dig in, it noticed images were frequently being flagged for more personal reasons: someone didn’t like how they looked in the photo, was embarrassed their boss could see them dancing on a table, or simply wanted to wipe away evidence of an old romance.

Usually, when a photo is reported for violating community standards, it goes to a Facebook employee who has to determine what steps to take. That adds up to a lot of requests. By expanding the reporting options and directing people to ask the person who posted a photo to take it down, Facebook is putting its members in charge of their own issues and saving itself some resources as a bonus.

Check out the article:

New Facebook Tools Could Help Battle Cyberbullying