Report March 2025
Your organisation description
Empowering Users
Commitment 17
In light of the European Commission's initiatives in the area of media literacy, including the new Digital Education Action Plan, Relevant Signatories commit to continue and strengthen their efforts in the area of media literacy and critical thinking, also with the aim to include vulnerable groups.
We signed up to the following measures of this commitment
Measure 17.1 Measure 17.2 Measure 17.3
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
During the reporting period, Meta ran a range of media literacy campaigns focusing on areas including Youth, the EU Elections, Generative AI, and EU national elections. These campaigns are outlined in more detail in QRE 17.2.1, with reach metrics in SLI 17.2.1.
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Commitment 18
Relevant Signatories commit to minimise the risks of viral propagation of Disinformation by adopting safe design practices as they develop their systems, policies, and features.
We signed up to the following measures of this commitment
Measure 18.1 Measure 18.2 Measure 18.3
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Commitment 18 covers the current practices for Instagram in the EU. In keeping with Meta’s public announcements on 7 January 2025, we will continue to assess the applicability of this chapter to Instagram and we will keep under review whether it is appropriate to make alterations in light of changes in our practices, such as the deployment of Community Notes.
Commitment 19
Relevant Signatories using recommender systems commit to make them transparent to the recipients regarding the main criteria and parameters used for prioritising or deprioritising information, and provide options to users about recommender systems, and make available information on those options.
We signed up to the following measures of this commitment
Measure 19.1 Measure 19.2
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Commitment 21
Relevant Signatories commit to strengthen their efforts to better equip users to identify Disinformation. In particular, in order to enable users to navigate services in an informed way, Relevant Signatories commit to facilitate, across all Member States languages in which their services are provided, user access to tools for assessing the factual accuracy of sources through fact-checks from fact-checking organisations that have flagged potential Disinformation, as well as warning labels from other authoritative sources.
We signed up to the following measures of this commitment
Measure 21.1 Measure 21.2 Measure 21.3
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Commitment 21 covers the current practices for Instagram in the EU. In keeping with Meta’s public announcements on 7 January 2025, we will continue to assess the applicability of this chapter to Facebook and Instagram and we will keep under review whether it is appropriate to make alterations in light of changes in our practices, such as the deployment of Community Notes.
Commitment 23
Relevant Signatories commit to provide users with the functionality to flag harmful false and/or misleading information that violates Signatories policies or terms of service.
We signed up to the following measures of this commitment
Measure 23.1 Measure 23.2
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Commitment 24
Relevant Signatories commit to inform users whose content or accounts have been subject to enforcement actions (content/accounts labelled, demoted or otherwise enforced on) taken on the basis of violation of policies relevant to this section (as outlined in Measure 18.2), and provide them with the possibility to appeal against the enforcement action at issue and to handle complaints in a timely, diligent, transparent, and objective manner and to reverse the action without undue delay where the complaint is deemed to be founded.
We signed up to the following measures of this commitment
Measure 24.1
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Relevant updates to user notice and appeal processes were also made in 2023, in line with DSA requirements.
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Commitment 25
In order to help users of private messaging services to identify possible disinformation disseminated through such services, Relevant Signatories that provide messaging applications commit to continue to build and implement features or initiatives that empower users to think critically about information they receive and help them to determine whether it is accurate, without any weakening of encryption and with due regard to the protection of privacy.
We signed up to the following measures of this commitment
Measure 25.1 Measure 25.2
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Empowering Researchers
Commitment 26
Relevant Signatories commit to provide access, wherever safe and practicable, to continuous, real-time or near real-time, searchable stable access to non-personal data and anonymised, aggregated, or manifestly-made public data for research purposes on Disinformation through automated means such as APIs or other open and accessible technical solutions allowing the analysis of said data.
We signed up to the following measures of this commitment
Measure 26.1 Measure 26.2 Measure 26.3
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 26.1
Relevant Signatories will provide public access to non-personal data and anonymised, aggregated or manifestly-made public data pertinent to undertaking research on Disinformation on their services, such as engagement and impressions (views) of content hosted by their services, with reasonable safeguards to address risks of abuse (e.g. API policies prohibiting malicious or commercial uses).
QRE 26.1.1
Relevant Signatories will describe the tools and processes in place to provide public access to non-personal data and anonymised, aggregated and manifestly-made public data pertinent to undertaking research on Disinformation, as well as the safeguards in place to address risks of abuse.
QRE 26.1.2
Relevant Signatories will publish information related to data points available via Measure 26.1, as well as details regarding the technical protocols to be used to access these data points, in the relevant help centre. This information should also be reachable from the Transparency Centre. At minimum, this information will include definitions of the data points available, technical and methodological information about how they were created, and information about the representativeness of the data.
- Community Standards Enforcement Report: We publish this report publicly in our Transparency Centre on a quarterly basis to more effectively track our progress and demonstrate our continued commitment to making our services safe and inclusive. The report shares metrics on how we are doing at preventing and taking action on content that goes against our Community Standards (against 12 policies on Instagram).
- Quarterly Adversarial Threat Report: We share publicly our findings about coordinated inauthentic behaviour (CIB) we detect and remove from our platforms. As part of our quarterly adversarial threat reports, we will publish information about the networks we take down to make it easier for people to see progress we’re making in one place.
Measure 26.2
Relevant Signatories will provide real-time or near real-time, machine-readable access to non-personal data and anonymised, aggregated or manifestly-made public data on their service for research purposes, such as accounts belonging to public figures such as elected officials, news outlets and government accounts, subject to an application process which is not overly cumbersome.
QRE 26.2.1
Relevant Signatories will describe the tools and processes in place to provide real-time or near real-time access to non-personal data and anonymised, aggregated and manifestly-made public data for research purposes as described in Measure 26.2.
- Searching and filtering: searching public posts across Facebook and Instagram is easy with comprehensive sorting and filtering options. Post results can be filtered by language, view count, media type, content producer and more.
- Multimedia: Photos, videos and reels are available for dynamic search, exploration and analysis.
- Producer lists: customizable collections of content producers can be used to refine search results. Researchers can apply custom producer lists to a search query to surface public content from specific content owners on Facebook or Instagram.
- Endpoints and data fields: With 8 dedicated endpoints, the Content Library API can search across over 100 data fields from Instagram posts, including a subset of personal Instagram accounts.
- Search indexing and results: Powerful search capabilities can return up to 100,000 results per query.
- Asynchronous search: allows for queries to run in the background while a researcher works on other tasks. Query progress is monitored and tracked by the API.
For more details - see here.
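To illustrate the asynchronous search pattern described above (a query runs in the background while the researcher works on other tasks, and progress is monitored via the API), the sketch below models a submit-then-poll client. All names here - the `StubTransport` class, the payload fields, and the job responses - are hypothetical stand-ins invented for this sketch; they do not reflect the actual Content Library API schema or client library.

```python
import time

# Hypothetical sketch of an asynchronous search client: submit a filtered
# query, receive a job id, then poll until the background job completes.
# The transport, field names, and responses are all invented for illustration.

class StubTransport:
    """Stands in for HTTP calls; returns canned async-job responses."""

    def __init__(self):
        self._polls = 0

    def submit(self, payload):
        # A real client would POST the query and receive a job identifier.
        return {"job_id": "job-123"}

    def poll(self, job_id):
        # A real client would GET the job status; here the job
        # "finishes" on the second poll.
        self._polls += 1
        if self._polls < 2:
            return {"status": "running"}
        return {"status": "done", "results": [{"id": "post-1"}]}


def async_search(transport, query, language=None, media_type=None,
                 interval=0.0, max_polls=10):
    """Submit a filtered search and poll until results are ready."""
    payload = {"q": query}
    if language:
        payload["language"] = language      # e.g. filter by language
    if media_type:
        payload["media_type"] = media_type  # e.g. photos, videos, reels
    job = transport.submit(payload)
    for _ in range(max_polls):
        status = transport.poll(job["job_id"])
        if status["status"] == "done":
            return status["results"]
        time.sleep(interval)  # wait before polling again
    raise TimeoutError("search job did not complete")


results = async_search(StubTransport(), "election", language="de")
print(results)
```

The design point the sketch captures is that long-running queries return immediately with a job handle, so the caller is never blocked on the search itself; only the polling loop waits, and it can be interleaved with other work.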
QRE 26.2.2
Relevant Signatories will describe the scope of manifestly-made public data as applicable to their services.
- Posts shared by, and information about, Instagram business and creator accounts, including a subset of personal accounts.
- Available for most countries and territories, but excluded from countries where Meta is still evaluating legal and compliance requirements.
- The number of times a post or reel was displayed on screen.
For more details - see here.
QRE 26.2.3
Relevant Signatories will describe the application process in place in order to gain access to the non-personal data and anonymised, aggregated and manifestly-made public data described in Measure 26.2.
Measure 26.3
Relevant Signatories will implement procedures for reporting the malfunctioning of access systems and for restoring access and repairing faulty functionalities in a reasonable time.
QRE 26.3.1
Relevant Signatories will describe the reporting procedures in place to comply with Measure 26.3 and provide information about their malfunction response procedure, as well as about malfunctions that would have prevented the use of the systems described above during the reporting period and how long it took to remediate them.
Commitment 27
Relevant Signatories commit to provide vetted researchers with access to data necessary to undertake research on Disinformation by developing, funding, and cooperating with an independent, third-party body that can vet researchers and research proposals.
We signed up to the following measures of this commitment
Measure 27.1 Measure 27.2 Measure 27.3 Measure 27.4
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 27.1
Relevant Signatories commit to work with other relevant organisations (European Commission, Civil Society, DPAs) to develop within a reasonable timeline the independent third-party body referred to in Commitment 27, taking into account, where appropriate, ongoing efforts such as the EDMO proposal for a Code of Conduct on Access to Platform Data.
QRE 27.1.1
Relevant Signatories will describe their engagement with the process outlined in Measure 27.1 with a detailed timeline of the process, the practical outcome and any impacts of this process when it comes to their partnerships, programs, or other forms of engagement with researchers.
Measure 27.2
Relevant Signatories commit to co-fund from 2022 onwards the development of the independent third-party body referred to in Commitment 27.
QRE 27.2.1
Relevant Signatories will disclose their funding for the development of the independent third-party body referred to in Commitment 27.
Measure 27.3
Relevant Signatories commit to cooperate with the independent third-party body referred to in Commitment 27 once it is set up, in accordance with applicable laws, to enable sharing of personal data necessary to undertake research on Disinformation with vetted researchers in accordance with protocols to be defined by the independent third-party body.
QRE 27.3.1
Relevant Signatories will describe how they cooperate with the independent third-party body to enable the sharing of data for purposes of research as outlined in Measure 27.3, once the independent third-party body is set up.
SLI 27.3.1
Relevant Signatories will disclose how many of the research projects vetted by the independent third-party body they have initiated cooperation with or have otherwise provided access to the data they requested.
| Country | Number of research projects for which access to data was provided |
|---|---|
| Austria | 0 |
| Belgium | 0 |
| Bulgaria | 0 |
| Croatia | 0 |
| Cyprus | 0 |
| Czech Republic | 0 |
| Denmark | 0 |
| Estonia | 0 |
| Finland | 0 |
| France | 0 |
| Germany | 0 |
| Greece | 0 |
| Hungary | 0 |
| Ireland | 0 |
| Italy | 0 |
| Latvia | 0 |
| Lithuania | 0 |
| Luxembourg | 0 |
| Malta | 0 |
| Netherlands | 0 |
| Poland | 0 |
| Portugal | 0 |
| Romania | 0 |
| Slovakia | 0 |
| Slovenia | 0 |
| Spain | 0 |
| Sweden | 0 |
| Iceland | 0 |
| Liechtenstein | 0 |
| Norway | 0 |
Measure 27.4
Relevant Signatories commit to engage in pilot programs towards sharing data with vetted researchers for the purpose of investigating Disinformation, without waiting for the independent third-party body to be fully set up. Such pilot programmes will operate in accordance with all applicable laws regarding the sharing/use of data. Pilots could explore facilitating research on content that was removed from the services of Signatories and the data retention period for this content.
QRE 27.4.1
Relevant Signatories will describe the pilot programs they are engaged in to share data with vetted researchers for the purpose of investigating Disinformation. This will include information about the nature of the programs, number of research teams engaged, and where possible, about research topics or findings.
Commitment 28
Cooperation with researchers
Relevant Signatories commit to support good faith research into Disinformation that involves their services.
We signed up to the following measures of this commitment
Measure 28.1 Measure 28.2 Measure 28.3 Measure 28.4
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 28.1
Relevant Signatories will ensure they have the appropriate human resources in place in order to facilitate research, and should set-up and maintain an open dialogue with researchers to keep track of the types of data that are likely to be in demand for research and to help researchers find relevant contact points in their organisations.
QRE 28.1.1
Relevant Signatories will describe the resources and processes they deploy to facilitate research and engage with the research community, including e.g. dedicated teams, tools, help centres, programs, or events.
Relevant details about research tools are available on our Transparency Centre.
Measure 28.2
Relevant Signatories will be transparent on the data types they currently make available to researchers across Europe.
QRE 28.2.1
Relevant Signatories will describe what data types European researchers can currently access via their APIs or via dedicated teams, tools, help centres, programs, or events.
- Meta Content Library and API: for Instagram, this includes public posts and data. Data from the Library can be searched, explored, and filtered through a graphical user interface or a programmatic API. 700+ researchers globally now have access to the Meta Content Library.
- Ad Targeting Data Set, which includes detailed targeting information for social issue, electoral, and political ads that have run globally since August 2020. 150+ researchers globally have accessed the Ads Targeting API since its public launch in September 2022.
- Influence Operations Research Archive for coordinated inauthentic behaviour (CIB) Network Disruptions, as outlined in QRE 27.4.1.
Measure 28.3
Relevant Signatories will not prohibit or discourage genuinely and demonstrably public interest good faith research into Disinformation on their platforms, and will not take adversarial action against researcher users or accounts that undertake or participate in good-faith research into Disinformation.
QRE 28.3.1
Relevant Signatories will collaborate with EDMO to run an annual consultation of European researchers to assess whether they have experienced adversarial actions or have otherwise been prohibited or discouraged from running such research.
Measure 28.4
As part of the cooperation framework between the Signatories and the European research community, relevant Signatories will, with the assistance of the EDMO, make funds available for research on Disinformation, for researchers to independently manage and to define scientific priorities and transparent allocation procedures based on scientific merit.
QRE 28.4.1
Relevant Signatories will disclose the resources made available for the purposes of Measure 28.4 and procedures put in place to ensure the resources are independently managed.
Empowering fact-checkers
Commitment 30
Relevant Signatories commit to establish a framework for transparent, structured, open, financially sustainable, and non-discriminatory cooperation between them and the EU fact-checking community regarding resources and support made available to fact-checkers.
We signed up to the following measures of this commitment
Measure 30.1 Measure 30.2 Measure 30.3 Measure 30.4
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 30.1
Relevant Signatories will set up agreements between them and independent fact-checking organisations (as defined in whereas (e)) to achieve fact-checking coverage in all Member States. These agreements should meet high ethical and professional standards and be based on transparent, open, consistent and non-discriminatory conditions and will ensure the independence of fact-checkers.
QRE 30.1.1
Relevant Signatories will report on and explain the nature of their agreements with fact-checking organisations; their expected results; relevant quantitative information (for instance: contents fact-checked, increased coverage, changes in integration of fact-checking, as depends on the agreements and to be further discussed within the Task-force); as well as relevant common standards and conditions for these agreements.
The detail of our partnership with fact-checkers (i.e., how they rate content and what actions we take as a result) is outlined in QRE 21.1.1 and here.
QRE 30.1.3
Relevant Signatories will report on resources allocated where relevant in each of their services to achieve fact-checking coverage in each Member State and to support fact-checking organisations' work to combat Disinformation online at the Member State level.
SLI 30.1.1
Relevant Signatories will report on Member States and languages covered by agreements with the fact-checking organisations, including the total number of agreements with fact-checking organisations, per language and, where relevant, per service.
| Country | Number of agreements with fact-checking organisations |
|---|---|
| Austria | 0 |
| Belgium | 0 |
| Bulgaria | 0 |
| Croatia | 0 |
| Cyprus | 0 |
| Czech Republic | 0 |
| Denmark | 0 |
| Estonia | 0 |
| Finland | 0 |
| France | 0 |
| Germany | 0 |
| Greece | 0 |
| Hungary | 0 |
| Ireland | 0 |
| Italy | 0 |
| Latvia | 0 |
| Lithuania | 0 |
| Luxembourg | 0 |
| Malta | 0 |
| Netherlands | 0 |
| Poland | 0 |
| Portugal | 0 |
| Romania | 0 |
| Slovakia | 0 |
| Slovenia | 0 |
| Spain | 0 |
| Sweden | 0 |
| Iceland | 0 |
| Liechtenstein | 0 |
| Norway | 0 |
Measure 30.2
Relevant Signatories will provide fair financial contributions to the independent European fact-checking organisations for their work to combat Disinformation on their services. Those financial contributions could be in the form of individual agreements, of agreements with multiple fact-checkers or with an elected body representative of the independent European fact-checking organisations that has the mandate to conclude said agreements.
QRE 30.2.1
Relevant Signatories will report on actions taken and general criteria used to ensure the fair financial contributions to the fact-checkers for the work done, on criteria used in those agreements to guarantee high ethical and professional standards, independence of the fact-checking organisations, as well as conditions of transparency, openness, consistency and non-discrimination.
QRE 30.2.2
Relevant Signatories will engage in, and report on, regular reviews with their fact-checking partner organisations to review the nature and effectiveness of the Signatory's fact-checking programme.
QRE 30.2.3
European fact-checking organisations will, directly (as Signatories to the Code) or indirectly (e.g. via polling by EDMO or an elected body representative of the independent European fact-checking organisations) report on the fairness of the individual compensations provided to them via these agreements.
Measure 30.3
Relevant Signatories will contribute to cross-border cooperation between fact-checkers.
QRE 30.3.1
Relevant Signatories will report on actions taken to facilitate their cross-border collaboration with and between fact-checkers, including examples of fact-checks, languages, or Member States where such cooperation was facilitated.
Measure 30.4
To develop the Measures above, relevant Signatories will consult EDMO and an elected body representative of the independent European fact-checking organisations.
QRE 30.4.1
Relevant Signatories will report, ex ante on plans to involve, and ex post on actions taken to involve, EDMO and the elected body representative of the independent European fact-checking organisations, including on the development of the framework of cooperation described in Measures 30.3 and 30.4.
Commitment 31
Relevant Signatories commit to integrate, showcase, or otherwise consistently use fact-checkers' work in their platforms' services, processes, and contents; with full coverage of all Member States and languages.
We signed up to the following measures of this commitment
Measure 31.1 Measure 31.2 Measure 31.3 Measure 31.4
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 31.1
Relevant Signatories that showcase User Generated Content (UGC) will integrate, showcase, or otherwise consistently use independent fact-checkers' work in their platforms' services, processes, and contents across all Member States and across formats relevant to the service. Relevant Signatories will collaborate with fact-checkers to that end, starting by conducting and documenting research and testing.
Measure 31.2
Relevant Signatories that integrate fact-checks in their products or processes will ensure they employ swift and efficient mechanisms such as labelling, information panels, or policy enforcement to help increase the impact of fact-checks on audiences.
Measure 31.3
Relevant Signatories (including but not necessarily limited to fact-checkers and platforms) will create, in collaboration with EDMO and an elected body representative of the independent European fact-checking organisations, a repository of fact-checking content that will be governed by the representatives of fact-checkers. Relevant Signatories (i.e. platforms) commit to contribute to funding the establishment of the repository, together with other Signatories and/or other relevant interested entities. Funding will be reassessed on an annual basis within the Permanent Task-force after the establishment of the repository, which shall take no longer than 12 months.
QRE 31.3.1
Relevant Signatories will report on their work towards and contribution to the overall repository project, which may include (depending on the Signatories): financial contributions; technical support; resourcing; fact-checks added to the repository. Further relevant metrics should be explored within the Permanent Task-force.
Measure 31.4
Relevant Signatories will explore technological solutions to facilitate the efficient use of this common repository across platforms and languages. They will discuss these solutions with the Permanent Task-force in view of identifying relevant follow up actions.
QRE 31.4.1
Relevant Signatories will report on the technical solutions they explore and, insofar as possible and in light of discussions with the Task-force, on solutions they implemented to facilitate the efficient use of a common repository across platforms.
Commitment 32
Relevant Signatories commit to provide fact-checkers with prompt, and whenever possible automated, access to information that is pertinent to help them to maximise the quality and impact of fact-checking, as defined in a framework to be designed in coordination with EDMO and an elected body representative of the independent European fact-checking organisations.
We signed up to the following measures of this commitment
Measure 32.1 Measure 32.2 Measure 32.3
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 32.1
Relevant Signatories will provide fact-checkers with information to help them quantify the impact of fact-checked content over time, such as (depending on the service) actions taken on the basis of that content, impressions, clicks, or interactions.
Measure 32.2
Relevant Signatories that showcase User Generated Content (UGC) will provide appropriate interfaces, automated wherever possible, for fact-checking organisations to be able to access information on the impact of contents on their platforms and to ensure consistency in the way said Signatories use, credit and provide feedback on the work of fact-checkers.
Measure 32.3
Relevant Signatories will regularly exchange information between themselves and the fact-checking community, to strengthen their cooperation.
QRE 32.3.1
Relevant Signatories will report on the channels of communication and the exchanges conducted to strengthen their cooperation - including success of and satisfaction with the information, interfaces, and other tools referred to in Measures 32.1 and 32.2 - and any conclusions drawn from such exchanges.
Transparency Centre
Commitment 35
Signatories commit to ensure that the Transparency Centre contains all the relevant information related to the implementation of the Code's Commitments and Measures and that this information is presented in an easy-to-understand manner, per service, and is easily searchable.
We signed up to the following measures of this commitment
Measure 35.1 Measure 35.2 Measure 35.3 Measure 35.4 Measure 35.5 Measure 35.6
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 35.1
Signatories will list in the Transparency Centre, per each Commitment and Measure that they subscribe to, the terms of service and policies that their service applies to implement these Commitments and Measures.
Measure 35.2
Signatories provide information on the implementation and enforcement of their policies per service, including geographical and language coverage.
Measure 35.3
Signatories ensure that the Transparency Centre contains a repository of their reports assessing the implementation of the Code's commitments.
Measure 35.4
In crisis situations, Signatories use the Transparency Centre to publish information regarding the specific mitigation actions taken related to the crisis.
Measure 35.5
Signatories ensure that the Transparency Centre is built with state-of-the-art technology, is user-friendly, and that the relevant information is easily searchable (including per Commitment and Measure). Users of the Transparency Centre will be able to easily track changes in Signatories' policies and actions.
Measure 35.6
The Transparency Centre will enable users to easily access and understand the Service Level Indicators and Qualitative Reporting Elements tied to each Commitment and Measure of the Code for each service, including Member State breakdowns, in a standardised and searchable way. The Transparency Centre should also enable users to easily access and understand Structural Indicators for each Signatory.
Commitment 36
Signatories commit to updating the relevant information contained in the Transparency Centre in a timely and complete manner.
We signed up to the following measures of this commitment
Measure 36.1 Measure 36.2 Measure 36.3
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 36.1
Signatories provide updates about relevant changes in policies and implementation actions in a timely manner, and in any event no later than 30 days after changes are announced or implemented.
Measure 36.2
Signatories will regularly update Service Level Indicators, reporting elements, and Structural Indicators, in parallel with the regular reporting foreseen by the monitoring framework. After the first reporting period, Relevant Signatories are encouraged to also update the Transparency Centre more regularly.
Measure 36.3
Signatories will update the Transparency Centre to reflect the latest decisions of the Permanent Task-force, regarding the Code and the monitoring framework.
QRE 36.1.1
With their initial implementation report, Signatories will outline the state of development of the Transparency Centre, its functionalities, the information it contains, and any other relevant information about its functioning or operations. This information can be drafted jointly by Signatories involved in operating or adding content to the Transparency Centre.
QRE 36.1.2
Signatories will outline changes to the Transparency Centre's content, operations, or functioning in their reports over time. Such updates can be drafted jointly by Signatories involved in operating or adding content to the Transparency Centre.
SLI 36.1.1
Signatories will provide meaningful quantitative information on the usage of the Transparency Centre, such as the average monthly visits of the webpage.
| Country | Our company would like to provide the following data: Number of IFCN-certified fact-checkers |
|---|---|
| Austria | 0 |
| Belgium | 0 |
| Bulgaria | 0 |
| Croatia | 0 |
| Cyprus | 0 |
| Czech Republic | 0 |
| Denmark | 0 |
| Estonia | 0 |
| Finland | 0 |
| France | 0 |
| Germany | 0 |
| Greece | 0 |
| Hungary | 0 |
| Ireland | 0 |
| Italy | 0 |
| Latvia | 0 |
| Lithuania | 0 |
| Luxembourg | 0 |
| Malta | 0 |
| Netherlands | 0 |
| Poland | 0 |
| Portugal | 0 |
| Romania | 0 |
| Slovakia | 0 |
| Slovenia | 0 |
| Spain | 0 |
| Sweden | 0 |
| Iceland | 0 |
| Liechtenstein | 0 |
| Norway | 0 |
Permanent Task-Force
Commitment 37
Signatories commit to participate in the permanent Task-force. The Task-force includes the Signatories of the Code and representatives from EDMO and ERGA. It is chaired by the European Commission, and includes representatives of the European External Action Service (EEAS). The Task-force can also invite relevant experts as observers to support its work. Decisions of the Task-force are made by consensus.
We signed up to the following measures of this commitment
Measure 37.1 Measure 37.2 Measure 37.3 Measure 37.4 Measure 37.5 Measure 37.6
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 37.1
Signatories will participate in the Task-force and contribute to its work. Signatories, in particular smaller or emerging services will contribute to the work of the Task-force proportionate to their resources, size and risk profile. Smaller or emerging services can also agree to pool their resources together and represent each other in the Task-force. The Task-force will meet in plenary sessions as necessary and at least every 6 months, and, where relevant, in subgroups dedicated to specific issues or workstreams.
Measure 37.2
Signatories agree to work in the Task-force in particular – but not limited to – on the following tasks:
- Establishing a risk assessment methodology and a rapid response system to be used in special situations like elections or crises;
- Cooperate and coordinate their work in special situations like elections or crisis;
- Agree on the harmonised reporting templates for the implementation of the Code's Commitments and Measures, the refined methodology of the reporting, and the relevant data disclosure for monitoring purposes;
- Review the quality and effectiveness of the harmonised reporting templates, as well as the formats and methods of data disclosure for monitoring purposes, throughout future monitoring cycles and adapt them, as needed;
- Contribute to the assessment of the quality and effectiveness of Service Level and Structural Indicators and the data points provided to measure these indicators, as well as their relevant adaptation;
- Refine, test and adjust Structural Indicators and design mechanisms to measure them at Member State level;
- Agree, publish and update a list of TTPs employed by malicious actors, and set down baseline elements, objectives and benchmarks for Measures to counter them, in line with the Chapter IV of this Code.
Measure 37.3
The Task-force will agree on and define its operating rules, including on the involvement of third-party experts, which will be laid down in a Vademecum drafted by the European Commission in collaboration with the Signatories and agreed on by consensus between the members of the Task-force.
Measure 37.4
Signatories agree to set up subgroups dedicated to the specific issues related to the implementation and revision of the Code with the participation of the relevant Signatories.
Measure 37.5
When needed, and in any event at least once per year, the Task-force organises meetings with relevant stakeholder groups and experts to inform them about the operation of the Code and gather their views related to important developments in the field of Disinformation.
Measure 37.6
Signatories agree to notify the rest of the Task-force when a Commitment or Measure would benefit from changes over time as their practices and approaches evolve, in view of technological, societal, market, and legislative developments. Having discussed the changes required, the Relevant Signatories will update their subscription document accordingly and report on the changes in their next report.
QRE 37.6.1
Signatories will describe how they engage in the work of the Task-force in the reporting period, including the sub-groups they engaged with.
Monitoring of the Code
Commitment 38
The Signatories commit to dedicate adequate financial and human resources and put in place appropriate internal processes to ensure the implementation of their commitments under the Code.
We signed up to the following measures of this commitment
Measure 38.1
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 38.1
Relevant Signatories will outline the teams and internal processes they have in place, per service, to comply with the Code in order to achieve full coverage across the Member States and the languages of the EU.
QRE 38.1.1
Relevant Signatories will outline the teams and internal processes they have in place, per service, to comply with the Code in order to achieve full coverage across the Member States and the languages of the EU.
Commitment 39
Signatories commit to provide to the European Commission, within 1 month after the end of the implementation period (6 months after this Code’s signature) the baseline reports as set out in the Preamble.
We signed up to the following measures of this commitment
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Commitment 40
Signatories commit to provide regular reporting on Service Level Indicators (SLIs) and Qualitative Reporting Elements (QREs). The reports and data provided should allow for a thorough assessment of the extent of the implementation of the Code’s Commitments and Measures by each Signatory, service and at Member State level.
We signed up to the following measures of this commitment
Measure 40.1 Measure 40.2 Measure 40.3 Measure 40.4 Measure 40.5 Measure 40.6
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Commitment 41
Signatories commit to work within the Task-force towards developing Structural Indicators, and publish a first set of them within 9 months from the signature of this Code; and to publish an initial measurement alongside their first full report.
We signed up to the following measures of this commitment
Measure 41.1 Measure 41.2 Measure 41.3
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Crisis and Elections Response
Elections 2024
[Note: Signatories are requested to provide information relevant to their particular response to the threats and challenges they observed on their service(s). They ensure that the information below provides an accurate and complete report of their relevant actions. As operational responses to crisis/election situations can vary from service to service, an absence of information should not be considered a priori a shortfall in the way a particular service has responded. Impact metrics are accurate to the best of signatories’ abilities to measure them].
Threats observed or anticipated
Over many years, Meta has developed a comprehensive approach for elections on its platforms. While each election is unique, we have used our experience working on more than 200 elections around the world to build a robust election program that includes mature processes, tools, and policies to protect speech on our platform and safeguard the integrity of the elections. We continuously improve these measures to make sure they remain responsive to risks as they emerge, and we have reinforced these efforts in light of the regulatory framework set out under the Digital Services Act, the Election Guidelines, and our commitments under this Code.
We outlined our comprehensive approach for elections, and its particular relevance to the 2024 European Parliament (“EP”) elections, in our public post-elections report for the EP elections available on our Transparency Center. This work continued in earnest for the snap legislative elections in France, which were called on 9 June 2024 following the results of the EP elections, and which occurred shortly thereafter. Additionally, similar efforts were made for the Presidential and Parliamentary elections in Romania, held on 24 November 2024, and 1 December 2024, respectively.
Meta’s approach to elections is outlined in full across the following pillars:
- Utilising and deploying our policies, and our overall content moderation efforts, to remove policy-violating content and help keep people safe on our platforms
- Our election risk management processes
- Cooperation with external stakeholders
- Tools to support civic engagement
- Preventing interference and disinformation
- Reducing the spread of misinformation
- Safeguards and transparency efforts related to political advertising
- Responsible approach to Generative AI
This work continued in earnest for the European national elections, including the snap legislative elections in France. Below we provide a summarised overview of our support for the legislative elections in France and the impact of our efforts during this period, focusing on two key aspects:
- Cooperation with external stakeholders in advance of the elections:
- Working Group on Elections & Rapid Response System
- Engagement with national authorities
- Our work in the Generative AI space
Mitigations in place
Cooperation with External Stakeholders
France: Pre-Election Engagements with National Authorities and Civil Society:
As part of the Working Group, Meta participated in the various sessions organised ahead of the legislative elections in France to discuss election readiness with the signatories of the EU CoP on Disinformation, including fact checkers and civil society organisations. In these engagements, along with other signatory platforms, we presented the efforts and tools we were deploying to fight against misinformation and foreign interference, and to provide more transparency on political ads. In addition, we shared information on our civic products aimed at informing users. Meta also responded to questions from the different participants on escalation channels and approaches.
Meta conducted outreach and delivered comprehensive training to the Autorité de régulation de la communication audiovisuelle et numérique (Arcom), France’s appointed Digital Services Coordinator (DSC). Arcom, as well as other onboarded DSCs, has access to Meta’s government reporting channels.
In addition to our engagements with VIGINUM at the roundtables discussed above, we held an engagement with them on 21 May 2024 to discuss our investments to prevent foreign interference, protect the elections, and establish the appropriate communication channels between our teams to ensure we could identify and tackle potential operations efficiently.
Ahead of the EP elections, Meta organised training sessions and office hours on our policies and products with French government organisations, political parties, and civil society organisations. Political parties were provided an email alias to contact for any urgent escalations around the election. We additionally launched an EU Election Center (https://www.facebook.com/government-nonprofits/eu) in all 24 EU official languages, including French, to support our government partners. For the legislative elections in France, these same resources were available and further office hours were offered to ensure provision of best practices and support.
Romania
As part of our election preparation efforts, Meta engaged with a full range of Romanian stakeholders to inform our processes and procedures and to hear their concerns. Engagements with government and non-government partners started ahead of the 2024 EP elections and are ongoing.
- Romanian government stakeholders: We are in regular contact with ANCOM (the Romanian Digital Services Coordinator), the Ministry of Digitalisation, the Electoral Body and the Romanian cybersecurity agency on election-related topics. All of them are onboarded to our direct escalation channels, where they have been reporting content to us.
- Election Engagements with the European Commission, National Authorities and Civil Society: Similar to what we did in France, Meta participated in the various sessions organised ahead of and after the 2024 elections to discuss election readiness with the signatories of the EU CoP on Disinformation, including fact checkers and civil society organisations. In these engagements, along with other signatory platforms, we presented the efforts and tools we were deploying to fight against misinformation and foreign interference, and to provide more transparency on political ads. In addition, we shared information on our civic products aimed at informing users. Meta also responded to questions from the different participants on escalation channels and approaches.
Working Group on Elections & Rapid Response System:
As described above in relation to France, Meta participated, as part of the Working Group, in the various sessions organised ahead of the legislative elections to discuss election readiness with the signatories of the EU CoP on Disinformation, including fact checkers and civil society organisations. Along with other signatory platforms, we presented the efforts and tools we were deploying to fight against misinformation and foreign interference, and to provide more transparency on political ads, and we responded to questions from the different participants on escalation channels and approaches.
Romania
Rapid Alert System: Meta participated in the Rapid Alert System and has been in regular touch with civil society organisations from Romania through various meetings and roundtables organised by the Disinfo working group. Meta created a direct escalation channel for five Romanian partners to report Community Standards violations and unlawful content.
Political parties: Meta began engaging with Romanian political parties in advance of the European Parliamentary elections. Ahead of the 2024 Presidential and Parliamentary elections, Meta organised online training sessions on our policies and products, including guidance on how to contact Meta in case of an escalation.
Responsible Approach to Gen AI
Meta’s approach to responsible AI is another way that we are safeguarding the integrity of elections globally, including for the EU national elections.
Community Standards, Fact-Checking, and AI Labelling:
Meta’s Community Standards and Advertising Standards apply to all content, including content generated by AI. AI-generated content is also eligible to be reviewed and rated by Meta’s third-party fact-checking partners, whose rating options allow them to address various ways in which media content may mislead people, including but not limited to media that is created or edited by AI.
Meta labels photorealistic images created using Meta AI, as well as AI-generated images from Google, OpenAI, Microsoft, Adobe, Midjourney, and Shutterstock that users post to Facebook and Instagram.
Meta has begun labelling a wider range of video, audio, and image content when we detect industry-standard AI image indicators or when users disclose that they’re uploading AI-generated content. Meta requires people to use this disclosure and label tool when they post organic content with a photorealistic video or realistic-sounding audio that was digitally created or altered, and may apply penalties if they fail to do so. If Meta determines that digitally created or altered image, video, or audio content creates a particularly high risk of materially deceiving the public on a matter of importance, we may add a more prominent label, so that people have more information and context.
Political Ads and Meta’s AI Disclosure Policy:
Advertisers must disclose whenever an ad about social issues, elections or politics contains a photorealistic image or video, or realistic-sounding audio, that was digitally created or altered to:
- Depict a real person as saying or doing something they did not say or do; or
- Depict a realistic-looking person that does not exist or a realistic-looking event that did not happen, or alter footage of a real event that happened; or
- Depict a realistic event that allegedly occurred, but that is not a true image, video or audio recording of the event.
If advertisers do not disclose these specified scenarios, the ad may be disapproved. Repeated failure to disclose may result in further penalties to the account.
As a result of our policies and measures relating to AI-generated content, between 1 June and 21 July 2024, over 50 SIEP ads created by users in France across Facebook and Instagram were labelled with the “digitally created” AI disclaimer as a result of self-disclosure, providing enhanced transparency to users.
The below table shows information on the number of ads accepted and run with SIEP disclaimers as well as the number of ads removed for non-compliance with Meta’s SIEP policy between 1 June and 21 July 2024, where the inferred advertiser location at the time of enforcement was France. This reflects application of the above-mentioned policies and measures.
| Metric | Facebook and Instagram combined |
|---|---|
| Number of SIEP ads removed for not complying with our SIEP ads policy | Over 20,000 |
Continuing to Foster AI Transparency through Industry Collaboration:
Policies and Terms and Conditions
Policy
Prohibited Ads Policy
Changes (such as newly introduced policies, edits, adaptation in scope or implementation)
We've established measures under which ads related to voting around elections (including primary, general, special, and run-off elections) are subject to additional prohibitions and will be rejected if in violation of our policies. This policy applies to the Member States of the EU.
Rationale
Ads targeting the EU with the following content aren't allowed:
- Ads that discourage people from voting in an election. This includes ads that portray voting as useless/meaningless and/or advise people not to vote.
- Ads that call into question the legitimacy of an upcoming or ongoing election.
- Ads with premature claims of election victory.
This prohibition includes ads that call into question the legitimacy of the methods and processes of elections, as well as their outcomes.
Scrutiny of Ads Placements
Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.
Political Advertising
Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.
Policy updates regarding digitally altered content
Advertisers must disclose whenever an ad about social issues, elections or politics contains a photorealistic image or video, or realistic-sounding audio, that was digitally created or altered to:
- Depict a real person as saying or doing something they did not say or do; or
- Depict a realistic-looking person that does not exist or a realistic-looking event that did not happen, or alter footage of a real event that happened; or
- Depict a realistic event that allegedly occurred, but that is not a true image, video, or audio recording of the event.
Meta will add information on the ad when an advertiser discloses in the advertising flow that the content is digitally created or altered. This information will also appear in the Ad Library. If it is determined that an advertiser did not disclose as required, Meta will reject the ad. Repeated failure to disclose may result in penalties against the advertiser.
The expected impact of this policy is to increase users' awareness of when they are viewing advertisements related to social issues, elections or politics that are digitally altered. It will also increase the transparency of these ads by requiring that advertisers disclose this information.
Integrity of Services
Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.
All the measures outlined in Chapters 14 to 16 of this report were in place ahead of the European national elections.
Empowering Users
Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.
Reminders
We proactively point users to reliable information on the electoral process through in-app ‘Election Day Information’. These are notices at the top of feed on both Facebook and Instagram, reminding people of the day they can vote and re-directing them to national authoritative sources on how and where to vote.
For the legislative elections in France, the ‘Election Day Information’ feature ran on 29-30 June and 6-7 July 2024 and directed users to a voting information page on the Ministry of the Interior's website. Users in metropolitan France and overseas territories clicked on these in-app notifications more than 599K times on Facebook and more than 496K times on Instagram, as shown in the table below:
| Feature | Facebook clicks | Instagram clicks |
|---|---|---|
| Election Day Information | Over 599,000 | Over 496,000 |
Media Literacy Partnerships
Around the EP elections, Meta engaged in several media literacy efforts. This included two campaigns in France to combat misinformation and prevent electoral interference:
- A collaboration with the local fact-checking partner AFP Fact Check, producing a Reel video featuring popular French astronaut Thomas Pesquet reviewing a series of pictures and videos that had been shared online as hoaxes. He explains best practices and tools people should leverage when faced with a piece of news that seems unlikely. According to AFP, the videos resulted in nearly 2.5 million views on Instagram and Facebook.
- Participation in a multi-platform campaign operated by the French partner NGO Génération Numérique, consisting of a series of educational short videos gathering tips and recommendations on avoiding becoming a victim of misinformation. According to Génération Numérique, the videos reached over 200k users and generated nearly 300k impressions on Instagram and Facebook alone.
Additional efforts included wider campaigns with the European Fact-Checking Standards Network (EFCSN) on how to spot AI-generated and digitally altered media, as well as a campaign with the European Disability Forum (EDF). We refer readers to our EP post-elections report for further detail on these and other initiatives.
Ahead of the French legislative elections, Meta continued this investment in media literacy by launching a campaign on Meta-owned channels (Facebook and Instagram). This campaign aimed to increase awareness of the tools and processes that Meta deploys on its own platforms (Facebook, Instagram, and WhatsApp) in advance of an election, to help inform French users how Meta works to combat misinformation, prevent electoral interference, and protect electoral candidates. The campaign ran from 20 June until 4 July 2024, a few days before the second round of the election. It reached 2.1 million users in France, generating 10.6 million impressions.
Training political candidates
As noted above under Cooperation with External Stakeholders, ahead of the EP elections Meta organised training sessions and office hours on our policies and products with French government organisations, political parties, and civil society organisations; provided political parties an email alias for urgent escalations around the election; and launched an EU Election Center (https://www.facebook.com/government-nonprofits/eu) in all 24 official EU languages, including French, to support our government partners. For the legislative elections in France, these same resources were available and further office hours were offered to ensure provision of best practices and support.
Empowering the Research Community
Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.
Empowering the Fact-Checking Community
Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.
In France, as a result of our misinformation policies and measures, we labelled over 1.8 million pieces of content on Facebook, and over 65K pieces of content on Instagram, with fact checks in the month leading up to and including the electoral period.
Content Treated with Misinformation Labels Around the French Elections
% of reshares attempted that were not completed on treated content | Facebook: 55.9% | Instagram: 46.6%
In addition, we had the measures outlined below.
Crisis 2024
[Note: Signatories are requested to provide information relevant to their particular response to the threats and challenges they observed on their service(s). They ensure that the information below provides an accurate and complete report of their relevant actions. As operational responses to crisis/election situations can vary from service to service, an absence of information should not be considered a priori a shortfall in the way a particular service has responded. Impact metrics are accurate to the best of signatories’ abilities to measure them].
Threats observed or anticipated
[War of aggression by Russia on Ukraine]
As outlined in our benchmark report, we took a variety of actions with the objectives of:
- Helping to keep people in Ukraine and Russia safe: We’ve added several privacy and safety features to help people in Ukraine and Russia protect their accounts from being targeted.
- Enforcing our policies: We are taking additional steps to enforce our Community Standards, not only in Ukraine and Russia but also in other countries globally where content may be shared.
- Reducing the spread of misinformation: We took steps to fight the spread of misinformation on our services and consulted with outside experts.
- Transparency around state-controlled media: We have been working hard to tackle disinformation from Russia coming from state-controlled media. Since March 2022, we have been globally demoting content from Facebook Pages and Instagram accounts of Russian state-controlled media outlets and making them harder to find across our platforms. In addition to demoting, labelling, demonetising and blocking ads from Russian state-controlled media, we are also demoting and labelling any posts from users that contain links to Russian state-controlled media websites.
- In addition to these global actions, in Ukraine, the EU and the UK, we have restricted access to Russia Today, Sputnik, NTV/NTV Mir, Rossiya 1, REN TV, Perviy Kanal and others.
- On 15 June 2024, we added restrictions to further state-controlled media organisations targeted by the EU broadcast ban under Article 2f of Regulation 833/2014. These included Voice of Europe, RIA Novosti, Izvestia and Rossiyskaya Gazeta.
- On 17 September 2024, we expanded our ongoing enforcement against Russian state media outlets. Rossiya Segodnya, RT, and other related entities were banned from our apps globally due to foreign interference activities.
[Israel - Hamas War]
In the spirit of transparency and cooperation, we share below details of some of the specific steps we are taking in response to the Israel - Hamas War.
Mitigations in place
Our main strategies are in line with what we outlined in our benchmark report, with a focus on safety features in Ukraine and Russia, extensive steps to fight the spread of misinformation (including through media literacy campaigns), tools to help our community access crucial resources, transparency around state-controlled media, and monitoring and taking action against any coordinated inauthentic behaviour.
This means (as outlined in previous reports) we will continue to:
- Monitor for coordinated inauthentic behaviour and other adversarial networks (See commitment 16 for more information on behaviour we saw from Doppelganger during the reporting period).
- Enforce our Community Standards
- Work with fact-checkers
- Strengthen our engagement with local experts and governments in the Central and Eastern Europe region
[Israel - Hamas War]
In the wake of the 07/10/2023 terrorist attacks in Israel and Israel’s response in Gaza, expert teams from across Meta took immediate crisis response measures, while protecting people’s ability to use our apps to shed light on important developments happening on the ground. As we did so, we were guided by core human rights principles, including respect for the right to life and security of the person, the protection of the dignity of victims, and the right to non-discrimination - as well as balancing those with the right to freedom of expression. We looked to the UN Guiding Principles on Business and Human Rights to prioritise and mitigate the most salient human rights risks: in this case, that people may use Meta platforms to further inflame an already violent conflict. We also looked to international humanitarian law (IHL) as an important source of reference for assessing online conduct. We have provided a public overview of our efforts related to the war in our Newsroom. The following are some examples of the specific steps we have taken:
- We quickly established a dedicated crisis response operation staffed with experts, including fluent Hebrew and Arabic speakers, to closely monitor and respond to this rapidly evolving situation in real time. This allows us to remove content that violates our Community Standards faster, and serves as another line of defence against misinformation.
- We continue to enforce our policies around Dangerous Organisations and Individuals, Violent and Graphic Content, Hate Speech, Violence and Incitement, Bullying and Harassment, and Coordinating Harm.
- In addition to this, our teams detected and took down a cluster of activity linked to a Coordinated Inauthentic Behaviour (CIB) network that we attributed to Hamas in 2021; these fake accounts attempted to re-establish their presence on our platforms.
- In Q3 2024, we also removed 15 Facebook accounts, 15 Pages, and 6 accounts on Instagram for violating our policy against coordinated inauthentic behavior. This network originated in Lebanon and targeted primarily Israel. This network posted original content in Hebrew about news and geopolitical events in Israel with generic hashtags like #Israel, #Jerusalem, #Netanyahu, among others. It included posts about Israel’s dependence on US support, claims that Israeli people are leaving the country, claims of food shortages in Israel, and criticism of the Israeli government and its military strikes in the Middle East.
- We memorialise accounts when we receive a request from a friend or family member of someone who has passed away, to provide a space for people to pay their respects, share memories and support each other.
- We’re working with third-party fact-checkers in the region to debunk false claims. Meta’s third-party fact-checking network includes coverage in both Arabic and Hebrew, through AFP, Reuters and Fatabyyano. When they rate something as false, we move this content lower in Feed so fewer people see it.
- We recognise the importance of speed in moments like this, so we’ve made it easier for fact-checkers to find and rate content related to the war, using keyword detection to group related content in one place.
- We’re also giving people more information to help them decide what to read, trust, and share, by adding warning labels on content rated false by third-party fact-checkers and applying labels to state-controlled media publishers.
- We also have limits on message forwarding and we label messages that haven’t originated with the sender so people are aware that something is information from a third party.
- Hidden Words: This tool filters offensive terms and phrases from DM requests and comments.
- Limits: When turned on, Limits automatically hide DM requests and comments on Instagram from people who don’t follow you, or who only recently followed you.
- Comment controls: You can control who can comment on your posts on Facebook and Instagram and choose to turn off comments completely on a post by post basis.
- Show More, Show Less: This gives people direct control over the content they see on Facebook.
- Facebook Reduce: Through the Facebook Feed Preferences settings, people can increase the degree to which we demote some content so they see less of it in their Feed.
- Sensitive Content Control: Instagram’s Sensitive Content Control allows people to choose how much sensitive content they see in places where we recommend content, such as Explore, Search, Reels and in-Feed recommendations.
- https://www.oversightboard.com/decision/bun-kobfl44h/
- https://www.oversightboard.com/decision/bun-86tj0rk5/
Policies and Terms and Conditions
No further policy updates since our benchmark report
Rationale
We continue to enforce our Community Standards and prioritise people’s safety and well-being through the application of these policies alongside Meta’s technologies, tools and processes. There are no substantial changes to report on for this period.
Israel - Hamas War
For the duration of the ongoing crisis, Meta has taken various actions to mitigate the possible content risks emerging from the crisis. This includes, inter alia, under the Dangerous Organisations and Individuals policy: removing imagery depicting the moment an identifiable individual is abducted, unless such imagery is shared in the context of condemnation or a call for release, in which case we allow it with a Mark as Disturbing (MAD) interstitial; and removing Hamas-produced imagery of hostages in captivity in all contexts. Meta has some further discretionary policies which may be applied when content is escalated to us.
Scrutiny of Ads Placements
Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.
Measures taken to demonetise disinformation related to the crisis (Commitment 1 and Commitment 2)
As mentioned in our baseline report, our Advertising Standards prohibit ads that include content debunked by third-party fact-checkers, and advertisers that repeatedly attempt to post content rated false by fact-checkers may also face restrictions on advertising across Meta technologies.
As mentioned in our baseline report, we prohibited ads or monetisation from all Russian state-controlled media. Before Russian authorities blocked access to Facebook and Instagram, we paused ads targeting people in Russia, and advertisers in Russia are no longer able to create or run ads anywhere in the world.
[Israel - Hamas War]
As noted in our baseline report, our policies are based on years of experience and expertise in safety combined with external input from experts around the world. We are continuously working to protect the integrity of our platforms and adjusting our policies, tools, and processes.
Political Advertising
Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.
As noted in our baseline report, our policies are based on years of experience and expertise in safety combined with external input from experts around the world. We are continuously working to protect the integrity of our platforms and adjusting our policies, tools, and processes.
Israel - Hamas War
As noted in our baseline report, our policies are based on years of experience and expertise in safety combined with external input from experts around the world. We are continuously working to protect the integrity of our platforms and adjusting our policies, tools, and processes.
AI Generated or altered SIEP ads disclosure (Commitment 3)
Advertisers now have to disclose whenever a social issue, electoral, or political ad contains a photorealistic image or video, or realistic sounding audio, that was digitally created or altered to:
- Depict a real person as saying or doing something they did not say or do; or
- Depict a realistic-looking person that does not exist or a realistic-looking event that did not happen, or alter footage of a real event that happened; or
- Depict a realistic event that allegedly occurred, but that is not a true image, video or audio recording of the event.
Meta will add information on the ad when an advertiser discloses in the advertising flow that the content is digitally created or altered. This information will also appear in the Ad Library. If it is determined that an advertiser did not disclose as required, Meta will reject the ad. Repeated failure to disclose may result in penalties against the advertiser.
Integrity of Services
Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.
Measures taken in the context of the crisis to counter manipulative behaviours/TTCs (Commitment 14)
As mentioned in our baseline report, we have technical teams building scaled solutions to detect and prevent these behaviours, and are partnering with civil society organisations, researchers, and governments to strengthen our defences. We also improved our detection systems to more effectively identify and block fake accounts, which are the source of a lot of the inauthentic activity.
Since the invasion began, we shared what measures we’ve taken to help keep Ukrainians and Russians safe, our approach to misinformation, state-controlled media and ensuring reliable access to trusted information.
As mentioned in the baseline report, throughout the war, we have mobilised our teams, technologies and resources to combat the spread of harmful content, especially disinformation and misinformation as well as adversarial threat activities such as influence operations and cyber-espionage.
We continue to work with a cross-functional team of experts from across the company, including native Ukrainian and Russian speakers, who are monitoring the platform around the clock, allowing us to respond to issues in real time.
Israel - Hamas War
As noted in our baseline report, our policies are based on years of experience and expertise in safety combined with external input from experts around the world. We are continuously working to protect the integrity of our platforms and adjusting our policies, tools and processes.
Removing a Coordinated Inauthentic Behaviour Network (Commitment 14, Commitment 16)
In Q3, 2024, we removed 15 Facebook accounts, 15 Pages, and 6 accounts on Instagram for violating our policy against coordinated inauthentic behavior. This network originated in Lebanon and targeted primarily Israel. This network posted original content in Hebrew about news and geopolitical events in Israel with generic hashtags like #Israel, #Jerusalem, #Netanyahu, among others. It included posts about Israel’s dependence on US support, claims that Israeli people are leaving the country, claims of food shortages in Israel, and criticism of the Israeli government and its military strikes in the Middle East.
We removed this network before it was able to build authentic audiences on our apps.
Empowering Users
Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.
As noted in our baseline report, our policies are based on years of experience and expertise in safety combined with external input from experts around the world. We are continuously working to protect the integrity of our platforms and adjusting our policies, tools and processes.
Actions taken against dis- and misinformation content (for example deamplification, labelling, removal etc.) (Commitment 17)
State-controlled media: We continue to take the actions we outlined in our benchmark report. We have taken further action to limit the impact of state-controlled media, described above.
Escalation channel: This channel continues to operate as outlined in our benchmark report.
Promotion of authoritative information, including via recommender systems and products and features such as banners and panels (Commitment 19)
As mentioned in our baseline report, we provided tools to help our community access crucial resources and take action to support people in need.
We continued supporting the Halo Trust and the State Emergency Service of Ukraine to spread authoritative, factual information about the risks in contaminated areas, risks related to unexploded ordnance, and life-saving information about shelters. Notably, we sponsored targeted ad campaigns for the Halo Trust and improved the WhatsApp chatbot run by the State Emergency Service of Ukraine to ensure a safe and secure infoline.
In addition, we provided an ad credits budget to 'Ty Yak?', a national mental health awareness campaign, to promote mental health resources for people affected by the war.
We continue to see funds raised on Facebook and Instagram for nonprofits in support of humanitarian efforts for Ukraine.
We continue to work through our Data for Good program, which empowers humanitarian organizations, researchers, UN agencies, and European policymakers to make more informed decisions on how to support the people of Ukraine.
Israel - Hamas War
As noted in our baseline report, our policies are based on years of experience and expertise in safety combined with external input from experts around the world. We are continuously working to protect the integrity of our platforms and adjusting our policies, tools, and processes.
Warning Screens on sensitive content, Sensitive Content Control and Facebook Reduce (Commitment 17)
The 07/10/2023 attack by Hamas was designated as a Terrorist Attack under Meta's Dangerous Organisations and Individuals policy. Consistent with that designation, we removed all content showing identifiable victims at the moment of the attack. Subsequently, people began sharing this type of footage in order to raise awareness and condemn the attacks. Meta's goal is to allow people to express themselves while still removing harmful content. In response, we began allowing people to post this type of footage within that context only, with the addition of a warning screen to inform users that it may be disturbing. If the user's intent in sharing the content is unclear, we err on the side of safety and remove it.
Hidden words Filter (Commitment 18, Commitment 19)
Hidden Words lets people choose offensive terms and phrases to hide, so they are protected from seeing them.
Limits (Commitment 18, Commitment 19)
When turned on, Limits automatically hide DM requests and comments on Instagram from people who don’t follow you, or who only recently followed you.
This tool gives people choice about DM and requests they receive, which may be important when engaging online around sensitive topics.
Comment Controls (Commitment 18, Commitment 19)
This tool gives people control over engagement with what they post on Facebook and Instagram.
Show More, Show Less (Commitment 18, Commitment 19)
Show More, Show Less gives people direct control over the content they see on Facebook. Selecting “Show more” will temporarily increase the amount of content that is like the post a user gave feedback on, while selecting “Show Less” means a user will temporarily see fewer posts like the one that feedback was given on.
Empowering the Research Community
Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.
Measures taken to support research into crisis related misinformation and disinformation (Commitment 17-25)
As mentioned in our baseline report, we continued providing baseline population density maps (the high resolution settlement layer) of Ukraine and surrounding countries to humanitarian organisations for supply-chain planning and to aid demining efforts. These maps, built by combining updated census estimates with satellite imagery (i.e., no Facebook user data), are among the most accurate in the world, with 30-metre resolution and demographic breakouts.
Our Social Connectedness Index has been used by leading researchers, including the European Commission - Joint Research Centre unit on Demography, Migration and Governance to quantify the rate at which Ukrainian refugees seek shelter in European regions with existing Ukrainian diaspora.
Israel - Hamas War
As noted in our baseline report, our policies are based on years of experience and expertise in safety combined with external input from experts around the world. We are continuously working to protect the integrity of our platforms and adjusting our policies, tools, and processes.
Content Library and API tools (Commitment 26)
As we previously reported, Meta has opened access to tools such as the Content Library and API tools to provide access to near real-time public content from Pages, Posts, Groups and Events on Facebook and public content on Instagram. Details about the content, such as the number of reactions, shares, comments and, for the first time, post view counts are also available. Researchers can search, explore and filter that content either through a graphical user interface (UI) or through a programmatic API. Together, these tools provide the most comprehensive access to publicly-accessible content across Facebook and Instagram of any research tool built to date.
Qualified individuals pursuing scientific or public interest research, including journalists, can gain access to the tools if they meet all the requirements.
Empowering the Fact-Checking Community
Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.
As noted in our baseline report, our policies are based on years of experience and expertise in safety combined with external input from experts around the world. We are continuously working to protect the integrity of our platforms and adjusting our policies, tools, and processes.
Cooperation with independent fact-checkers in the crisis context, including coverage in the EU (Commitment 30-33)
As mentioned in our baseline report, for misinformation that does not violate our Community Standards, but undermines the authenticity and integrity of our platform, we work with our network of independent third-party fact-checking partners.
Israel - Hamas War
As noted in our baseline report, our policies are based on years of experience and expertise in safety combined with external input from experts around the world. We are continuously working to protect the integrity of our platforms and adjusting our policies, tools, and processes.
Working with fact-checkers in the region and deploying keyword detection (Commitment 30)
When fact-checkers rate something as false, we move this content lower in Feed so fewer people see it.
Content Warning Labels (Commitment 31)
Meta is adding warning labels on content rated false by third-party fact-checkers and applying labels to state-controlled media publishers. We also have limits on message forwarding and label messages that haven’t originated with the sender so people are aware that something is information from a third party.
Meta is supporting people in the region by giving them more information to decide what to read, trust and share by adding warning labels onto relevant content.