Facebook has announced new restrictions on live-streaming.
It follows the Christchurch attacks in March, in which 51 people were killed in shootings at two mosques.
The gunman live-streamed the attacks online.
Facebook will now restrict people who have 'broken certain rules' from using its live video service.
The company said: "From now on, anyone who violates our most serious policies will be restricted from using Live for set periods of time – for example 30 days – starting on their first offense.
"For instance, someone who shares a link to a statement from a terrorist group with no context will now be immediately blocked from using Live for a set period of time."
The company has also pledged to work with academics to research better ways to detect 'manipulated media' such as edited videos.
The social network's announcement comes as Taoiseach Leo Varadkar and other world leaders gather in France in a bid to start tackling online extremism.
New Zealand Prime Minister Jacinda Ardern and French President Emmanuel Macron want countries and tech firms to formally agree to a pledge - dubbed the 'Christchurch Call' - to eliminate terrorist and extremist content online.
Ms Ardern welcomed Facebook's announcement today as a 'tangible first step'.
She said: "There is a lot more work to do, but I am pleased Facebook has taken additional steps today alongside the Call and look forward to a long term collaboration to make social media safer by removing terrorist content from it."