The announcement follows a spike in the number of people using live-streaming services to commit crimes, some of them devastating to view, in real time.
As Facebook Live grows in popularity, social-media sites are struggling to deal with violence being committed live on their platforms.
There was the recent murder of an 11-month-old girl in Thailand by her father and, earlier, a man randomly shot dead in the US city of Cleveland.
At the annual F8 Facebook Developer Conference in the United States last month, Facebook co-founder Mark Zuckerberg conceded the company must take responsibility.
"We have a lot more to do here. And we're reminded of this this week by the tragedy in Cleveland. We have a lot of work, and we will keep doing all we can to prevent tragedies like this from happening."
Facebook has announced plans to bring 3,000 new employees aboard to help the social-networking site better monitor and promptly remove violent content.
The move marks a major expansion of the current global-operations team of 4,500, which reviews the millions of reports from users the site receives each week.
Those reports include non-video requests, too, some related to matters like hate speech and the exploitation of children.
Facebook shares had dropped slightly before the announcement.
But A&G Capital chief investment officer Hilary Kramer says the move is also about improving the company's relationship with its users.
"The first impression when one hears that Facebook is hiring 3,000 employees to monitor posts and to try to find the fake news that's out there amongst the Facebook posters is that it's really about public relations and trying to improve their relationship with their users. This has been a major problem for Facebook."
The issue of how to monitor and better respond to graphic content being live-streamed onto social-media platforms has become a heated topic.
It is a sinister development that Peter Csathy, founder of the US-based digital-consultancy firm Creatv Media, says could not have been predicted.
"It's a typical story when it comes to new technology. It's unintended consequences, and nobody is to blame for any of this. It's not Facebook's fault that this happened. No matter how big they are, they couldn't have anticipated these kinds of terrible things that happened. I absolutely believe that."
The videos of the Thailand and Cleveland murders were both left on the site for 24 hours.
Mr Csathy says the first step now is an awareness campaign informing users how to respond when they encounter such confronting content.
"There's probably a majority of Facebook users who still don't really know how to flag objectionable content. Facebook users aren't necessarily the most sophisticated when it comes to social media or technology. So I think an education campaign can help mitigate some of the risks and some of the terrible impact that comes as a result of this."
Murdoch University communication and media-studies professor Toby Miller says major social-media sites should be treated as mainstream media platforms.
"Ever since Facebook started, ever since Twitter started, they have been peddling a myth, and the myth is this: 'We are not like a TV network. We are not like a phone network. We are truly different. We are the creatures of the public.' This is simply nonsense, and it's irresponsible to continue operating with that governing myth."
Professor Miller says governments around the world need to subject social-media sites to the same strict procedures as other media companies.
"Nowhere in the world, including here in the US where I'm speaking to you from, is free speech absolute. There are always limits put on it, in terms of public safety, for example. And so it seems to me that there should be no difficulty for the Australian government in deciding that it will regulate this media form, social media, in the same sense in which it regulates, you know, Channel 9 or any other media service. It has that power. There is no restriction on it."
Yet, Mr Csathy says, the platforms cannot be shut down completely.
"Nothing is a hundred per cent foolproof. And so, as much as one could throw an endless stream of resources to be able to try to attack the issue, it's impossible to do. Almost like the war on terrorism. So much is being put into and invested, in terms of trying to fight it, but you can't lock down society. Things will happen."
The video-content editor of the technology publication Techly, Riordan Lee, says graphic content will always attract viewers.
"There's definitely a sort of morbid curiosity, I think, that a lot of people have. I think you see that with ISIS, with beheadings and things like that. They're things that tend to get a lot of views, which is quite horrifying. And there's definitely an element of human curiosity that is drawn to this, and that's something that's perhaps unfortunate but probably completely natural and understandable."
Earlier this week, Twitter announced 16 live-streaming video partnerships.
Peter Csathy says he hopes the social-media site will be mindful of the dangers.
"Live-streaming is a wonderful development. It's democratising. All of us can get our voices out around the world, that's very exciting. And these are the corner cases, the tragic corner cases that nobody anticipated. Now we know they exist, and all we can do is try to mitigate them the best we can. But this technology is here to stay. You can't shut it down."
Now that Facebook has acted, Peter Csathy says he hopes other social-media platforms will think about the dangers of the new technology -- and act.