NZFVC webinar on consultation on regulating online services and media platforms
Thu 29 Jun 2023
We are hosting a webinar on 10 July 2023 looking at the consultation by Te Tari Taiwhenua | Department of Internal Affairs, which is seeking feedback on a new system for keeping people safe online. Submissions for the consultation are due by 31 July 2023.
The New Zealand Family Violence Clearinghouse is hosting a webinar on Understanding the Safer Online Services and Media Platforms consultation. The webinar will give an overview of the proposed reforms and why this is important for people working in family violence and sexual violence. The webinar will be on Monday 10 July from 12:00pm to 1:00pm. Panellists for the webinar are Anjum Rahman from Inclusive Aotearoa Collective Tāhono, Kate Hannah from The Disinformation Project and Jo Robertson from The Light Project.
Consultation on regulating online services and media platforms
Te Tari Taiwhenua | Department of Internal Affairs (DIA) has proposed changes to the way media and online platforms are regulated in New Zealand to keep people safe online. DIA is inviting feedback on these reforms.
Submissions to the Safer Online Services and Media Platforms consultation close 31 July 2023.
Suzanne Doig, General Manager Policy at the Department of Internal Affairs, says "Child protection and consumer safety for media and online content is not as strong as it should be in Aotearoa. The existing regulatory system is decades old and predates media platforms such as social media."
Why is reform needed?
The regulatory legislation, including the Films, Videos, and Publications Classification Act 1993 and the Broadcasting Act 1989, is over 30 years old. It does not cover the range of harms people experience across online services and media platforms.
A key focus of the proposed reform is around protecting children and young people from harm caused by online content. The DIA Frequently Asked Questions also highlights that many people are harmed by online content:
"During the targeted engagement phase of the programme, we also repeatedly heard concerns about how social media is used to harass, bully, or otherwise harm people. This concern was consistent across our engagement with Māori, young people, advocates for children, women, older people, disabled people, LGBTQIA+ communities, and communities that have been or are targets of ethnic, racial and/or religious discrimination.
These are real harms that are happening to New Zealanders. That’s why we need to get these settings right."
Misogynistic abuse and violent threats against women in New Zealand, including wāhine Māori, were also identified as a significant issue.
The consultation Discussion Document highlights many gaps and issues in the current regulatory system including challenges for seeking help or reporting harmful content:
"New Zealanders must figure out which of five industry complaint bodies to go to if they feel content is unsafe or breaches the conditions of the platform it is on. On top of that, not all forms of content are covered by those bodies. The system is also very reactive because it relies mainly on complaints about individual pieces of content. For most forms of content, we do not have the tools and powers to ensure that platforms are doing what they should to manage the risks of harmful content."
The Discussion Document also highlights that under our current system, behaviour that is illegal is sometimes tolerated online:
"Our current system has legal powers to deal with the most awful and illegal content like child sexual exploitation and promotion of terrorism, regardless of whether it is delivered online or through traditional forms of media such as printed publications. But sometimes content that includes other illegal actions (such as threatening to injure) can be taken less seriously or even amplified online."
The major change in the proposed reform is the way that social media platforms are regulated. Social media content is not consistently regulated in New Zealand. Many platforms have their own systems for dealing with unsafe content, but these are not overseen by a regulatory authority in New Zealand and are often voluntary. The Discussion Document notes:
"Unlike traditional broadcasters, like television and radio, online platforms do not have a single agreed code of standards, ethics, and rules. While platforms have their own policies to manage these harms, it is now internationally acknowledged that they need to be brought into formal regulatory systems to reduce the risk of harm."
What is proposed?
The proposed reform outlines a new way to regulate providers, including social media platforms (such as Facebook, YouTube and Twitter) and traditional media platforms (like radio and TV), under one framework. The focus is on regulating platforms rather than individual pieces of content. The purpose of the reforms is to enhance protections by reducing exposure to harmful content regardless of the platform. The changes would bring all platforms into one framework with consistent safety standards.
There are 4 key elements to the proposed changes. These elements are based on the principle that platforms are responsible for the safety of the products and services they provide. The 4 elements are:
- an industry regulation model that uses codes of practice
- an independent regulator
- continuing to remove and block access to the most harmful content
- investment in education and awareness
The consultation Factsheet explains the different roles in the proposed changes:
- "Parliament will set New Zealand’s expectations for the safety-based outcomes platforms must achieve.
- Codes of practice will set out more detailed minimum expectations for harm minimisation, user protection and transparency across services.
- An independent regulator would be responsible for approving the codes and making sure platforms comply with those codes, as well as funding and finding opportunities for education."
The Discussion Document provides more detailed information about the key elements, different roles and how the codes of practice would work. It also includes 4 appendices that cover more details about the current situation, the principles guiding the work, the rights framework in New Zealand and a comparison table for frameworks in other countries.
DIA is not proposing changes to what is considered illegal in New Zealand. The Discussion Document says:
"The system would retain powers of censorship for the most extreme types of content (called ‘objectionable’ material). This material is already illegal, and it will remain illegal to produce, publish, possess and share."
And it goes on to say:
"The regulator would also have powers to require illegal material to be removed quickly from public availability in New Zealand. These powers exist already for objectionable material. We are proposing that the regulator should also have powers to deal with material that is illegal for other reasons, such as harassment or threats to kill. We seek your feedback on what other kinds of illegal material the regulator should have powers to deal with."
The proposed changes are focused on regulating platforms based on their role in providing a platform for distributing harmful content. The current proposals would significantly affect the functions of the Classification Office, Film and Video Labelling Body and Broadcasting Standards Authority (see page 65 of the Discussion Document).
The Discussion Document also notes:
"Organisations such as Netsafe will continue to help people navigate the new framework, and generally educate people on how to keep themselves safe online. Netsafe would also be an important partner in helping to identify emerging systemic issues for the regulator’s attention, as well as supporting the regulator’s monitoring and public awareness role."
DIA has prepared several documents to explain the changes. These include:
- a detailed discussion document (90 pages)
- a brief executive summary
- a factsheet (available in multiple languages)
- frequently asked questions
Most of these documents are available in alternative formats including large print, audio recording and video recording with sign language. You can find all information on the DIA Public Consultation: Safer Online Services and Media Platforms webpage.
DIA is hosting information session webinars. These sessions will explain the proposed changes and how to give feedback, and answer questions. There are two sessions remaining, on 8 July 2023 and 20 July 2023. Register for the free sessions.
InternetNZ | Ipurangi Aotearoa is hosting a free webinar on the basics of the Safer Online Services and Media Platforms framework on 4 July 2023. Register for the webinar.
DIA has also provided background information and a report summarising key themes from initial targeted engagement, Content Regulatory Review Summary of initial targeted engagement: September 2021 (published April 2022).
Update: InternetNZ has created short guides about key factors of the Safer Online Services and Media Platforms consultation. This includes information about how the consultation considers Te Tiriti o Waitangi and issues for Māori.
How to give feedback
The Discussion Document outlines 26 questions. You can provide feedback on some or all of these questions. You can also provide feedback that is not addressed in the questions.
You can give feedback by:
- submitting the online form
- downloading a form, filling it in and sending it to firstname.lastname@example.org
- posting to Safer Online Services and Media Platforms Consultation, Department of Internal Affairs, PO Box 805, Wellington 6140
- emailing email@example.com.
Questions for feedback are listed in the Discussion Document on pages 12-13. For each question, it lists the page where you can find the relevant information in the Discussion Document.
Comments from community agencies
Anjum Rahman, Co-Lead of Inclusive Aotearoa Collective Tāhono, “...welcome[d] the move to an independent regulator that will be protected from political interference by the State or any political actors.” However, Rahman highlighted concerns, saying:
“...as the proposal is for the regulator to regulate platforms rather than content, the content regulation is left to industry codes of practice.
"We have been advocating for the past two years to have proper involvement of impacted communities in the development of the Aotearoa New Zealand Code of Practice for Online Safety and Harms administered by NZ Tech. We continue to be of the opinion that this Code will not serve the needs of the communities who need it most.
“To have further codes of practice developed by industry, rather than civil society and communities, will lead to similar results. Any new regulatory structures must include a strong community voice embedded in the model, rather than at the discretion of industry actors."
InternetNZ Chief Executive Vivien Maidaborn welcomed the consultation, saying "It could potentially be one of the most important opportunities in a generation to enable regulation that helps address harmful online content." Maidaborn encouraged communities that disproportionately experience harm to give feedback, saying:
“It is especially important that communities that are at most risk have their say. This proposed regulation, and the codes to be developed under it, will not serve those communities unless they are heard during this process.
“Feedback from diverse voices is critical if the government is to get this right and the systems are going to be effective.”
InternetNZ | Ipurangi Aotearoa published a new report that explores what an ‘Internet for good’ means for people in Aotearoa. The report, An Internet that benefits (July 2023), shares findings from conversations with over 140 people across Aotearoa about their experiences with and aspirations for the internet. InternetNZ commissioned Toi Āria: Design for Public Good and Making Everything Achievable to do this research. The team actively sought to hear from tāngata whenua, Pasifika, people living with disability, LGBTQIA+ communities, migrant and former refugee communities and younger people. The report highlights 7 themes:
- "Our Aotearoa context is unique
- The Internet is here to stay
- The Internet is changing us
- The Internet is not safe for everybody
- A better Internet is a more accessible Internet
- A better Internet is a more diverse Internet
- A better Internet needs better education".
For more information see the InternetNZ media releases New research shows people want an active part in shaping the future of the Internet and Aspirations for a better Internet for all of Aotearoa.