Experts said the companies could set their detection tools and removal processes to be more aggressive, but YouTube and Facebook have said they want to be careful not to remove sensitive videos that either come from news organizations or have news value.
On YouTube, the video lingered for hours after the attack as different individuals republished it.
According to authorities, the shooter appeared to livestream the attack on Facebook from a first-person perspective, with footage showing the drive to the Al Noor Mosque, the gunman walking from his vehicle into the mosque, and the start of the shooting.
Facebook also issued a statement saying it had taken down the suspected shooter's Facebook and Instagram accounts and removed the video he posted of the attack.
Social networks have been caught flat-footed in many cases by videos showing violent acts including suicides and assassinations.
Mia Garlick, Facebook's director of policy for Australia and New Zealand, said: "We continue to work around the clock to remove violating content from our site using a combination of technology and people. We also cooperate with law enforcement to facilitate their investigations as required."
Frustrated with years of similar obscene online crises, politicians around the globe on Friday voiced the same conclusion: social media is failing.
The fallout from the attack featured all the astringent elements that have become hallmarks of such modern acts of nihilistic violence: a discussion of the negative externalities of a globalized world, a left-wing quick to blame firearm proliferation, a right-wing eager to highlight spiritual disrepair, and social media behemoths seeking but struggling to contain internet hysteria.
PewDiePie, whose real name is Felix Kjellberg, said on Twitter he felt "absolutely sickened" that the alleged gunman referred to him during the livestream.
Users intent on sharing the violent video took several approaches, at times with nearly military precision. Reuters was unable to confirm the authenticity of the footage. Others shared shorter clips or screenshots from the gunman's livestream, which would also be harder for a computer program to identify.
She also said none of the people in custody were on security watch lists. Twitter and Google said they were working to stop the footage being reshared.
Politicians in multiple countries said social media companies need to take ownership of the problem.
"Tech companies have a responsibility to do the morally right thing".
The major internet platforms have pledged to crack down on sharing of violent images and other inappropriate content through automated systems and human monitoring, but critics say it isn't working. "We will do whatever is humanly possible for it to never happen again".
Prime Minister Jacinda Ardern said the events in Christchurch represented "an extraordinary and unprecedented act of violence", and that many of the victims may have been migrants or refugees, according to The Associated Press.