Google

Report March 2025

Submitted

Your organisation description

Advertising

Commitment 1

Relevant signatories participating in ad placements commit to defund the dissemination of disinformation, and improve the policies and systems which determine the eligibility of content to be monetised, the controls for monetisation and ad placement, and the data to report on the accuracy and effectiveness of controls and services around ad placements.

We signed up to the following measures of this commitment

Measure 1.1 Measure 1.2 Measure 1.3 Measure 1.5 Measure 1.6

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

If yes, list these implementation measures here



Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

If yes, which further implementation measures do you plan to put in place in the next 6 months?

Transparency Centre

Commitment 34

To ensure transparency and accountability around the implementation of this Code, Relevant Signatories commit to set up and maintain a publicly available common Transparency Centre website.

We signed up to the following measures of this commitment

Measure 34.1 Measure 34.2 Measure 34.3 Measure 34.4 Measure 34.5

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

In line with Code commitments and alongside other Signatories, Google helped maintain the EU Code of Practice on Disinformation Transparency Centre, located at https://disinfocode.eu in H2 2024 (1 July 2024 to 31 December 2024).

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Commitment 35

Signatories commit to ensure that the Transparency Centre contains all the relevant information related to the implementation of the Code's Commitments and Measures and that this information is presented in an easy-to-understand manner, per service, and is easily searchable.

We signed up to the following measures of this commitment

Measure 35.1 Measure 35.2 Measure 35.3 Measure 35.4 Measure 35.5 Measure 35.6

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

In line with Code commitments, and alongside other Signatories, Google populated the EU Code of Practice on Disinformation Transparency Centre with related relevant information in H2 2024 (1 July 2024 to 31 December 2024).

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Commitment 36

Signatories commit to updating the relevant information contained in the Transparency Centre in a timely and complete manner.

We signed up to the following measures of this commitment

Measure 36.1 Measure 36.2 Measure 36.3

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

In line with Code commitments, Google uploaded its report to the newly launched Transparency Centre in March 2025.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

Yes

If yes, which further implementation measures do you plan to put in place in the next 6 months?

In line with Code commitments, Google plans to upload reports and pertinent updates to the Transparency Centre located at https://disinfocode.eu.

Measure 36.3

Signatories will update the Transparency Centre to reflect the latest decisions of the Permanent Task-force, regarding the Code and the monitoring framework.

QRE 36.1.1

With their initial implementation report, Signatories will outline the state of development of the Transparency Centre, its functionalities, the information it contains, and any other relevant information about its functioning or operations. This information can be drafted jointly by Signatories involved in operating or adding content to the Transparency Centre.

Note: The below QRE response has been reproduced (in some instances truncated in order to meet the suggested character limit) from the previous report, as there is no new information to share at this time.

Google continues to upload its report according to the approved deadlines.

QRE 36.1.2

Signatories will outline changes to the Transparency Centre's content, operations, or functioning in their reports over time. Such updates can be drafted jointly by Signatories involved in operating or adding content to the Transparency Centre.

Google is committed to maintaining the website and supporting the development of relevant SLIs.

SLI 36.1.1

Signatories will provide meaningful quantitative information on the usage of the Transparency Centre, such as the average monthly visits of the webpage.

Between 1 July 2024 and 31 December 2024, the common Transparency Centre was visited by approximately 20,255 unique users, and reports were downloaded approximately 5,600 times by about 1,400 unique users. Download metrics specifically for Google’s EU Code of Practice on Disinformation reports between 1 July 2024 and 31 December 2024 are listed below:
  • The baseline report published in January 2023 was downloaded 237 times by 188 unique users.
  • The H1 2023 report published in July 2023 was downloaded 43 times by 34 unique users.
  • The H2 2023 report published in March 2024 was downloaded 210 times by 91 unique users.
  • The H1 2024 report published in September 2024 was downloaded 1,038 times by 172 unique users.

Our company would like to provide the following data:

Country | Nr of IFCN-certified fact-checkers
Austria 0
Belgium 0
Bulgaria 0
Croatia 0
Cyprus 0
Czech Republic 0
Denmark 0
Estonia 0
Finland 0
France 0
Germany 0
Greece 0
Hungary 0
Ireland 0
Italy 0
Latvia 0
Lithuania 0
Luxembourg 0
Malta 0
Netherlands 0
Poland 0
Portugal 0
Romania 0
Slovakia 0
Slovenia 0
Spain 0
Sweden 0
Iceland 0
Liechtenstein 0
Norway 0

Permanent Task-Force

Commitment 37

Signatories commit to participate in the permanent Task-force. The Task-force includes the Signatories of the Code and representatives from EDMO and ERGA. It is chaired by the European Commission, and includes representatives of the European External Action Service (EEAS). The Task-force can also invite relevant experts as observers to support its work. Decisions of the Task-force are made by consensus.

We signed up to the following measures of this commitment

Measure 37.1 Measure 37.2 Measure 37.3 Measure 37.4 Measure 37.5 Measure 37.6

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

N/A

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Measure 37.6

Signatories agree to notify the rest of the Task-force when a Commitment or Measure would benefit from changes over time as their practices and approaches evolve, in view of technological, societal, market, and legislative developments. Having discussed the changes required, the Relevant Signatories will update their subscription document accordingly and report on the changes in their next report.

QRE 37.6.1

Signatories will describe how they engage in the work of the Task-force in the reporting period, including the sub-groups they engaged with.

Google has continued to engage meaningfully in Permanent Task-force Plenary sessions and sub-groups, including but not limited to participating in and/or co-steering meetings, producing documents, and providing feedback.

Monitoring of the Code

Commitment 38

The Signatories commit to dedicate adequate financial and human resources and put in place appropriate internal processes to ensure the implementation of their commitments under the Code.

We signed up to the following measures of this commitment

Measure 38.1

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

N/A

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Measure 38.1

Relevant Signatories will outline the teams and internal processes they have in place, per service, to comply with the Code in order to achieve full coverage across the Member States and the languages of the EU.

QRE 38.1.1

Relevant Signatories will outline the teams and internal processes they have in place, per service, to comply with the Code in order to achieve full coverage across the Member States and the languages of the EU.

Note: The below QRE response has been reproduced (in some instances truncated in order to meet the suggested character limit) from the previous report, as there is no new information to share at this time.

Google has several teams across the company, including teams in Product, Policy, and Trust and Safety, whose work is relevant to but not restricted to Commitments made under this Code. This is core to Google’s mission of connecting people with high-quality information and preventing bad actors from misusing Google services to spread harmful content. To enforce policies fairly, consistently, and at scale, Google relies on both specially trained experts and machine learning technology, and has invested heavily in moderation efforts across its platforms. Google enforces its policies globally, including in all EEA Member States and languages.

Commitment 39

Signatories commit to provide to the European Commission, within 1 month after the end of the implementation period (6 months after this Code’s signature) the baseline reports as set out in the Preamble.

We signed up to the following measures of this commitment

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

In line with Code commitments, in H2 2024 (1 July 2024 to 31 December 2024), Google provided its fifth report to the European Commission.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Commitment 40

Signatories commit to provide regular reporting on Service Level Indicators (SLIs) and Qualitative Reporting Elements (QREs). The reports and data provided should allow for a thorough assessment of the extent of the implementation of the Code’s Commitments and Measures by each Signatory, service and at Member State level.

We signed up to the following measures of this commitment

Measure 40.1 Measure 40.2 Measure 40.3 Measure 40.4 Measure 40.5 Measure 40.6

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

In line with Code commitments, in H2 2024 (1 July 2024 to 31 December 2024), Google provided its fifth report, which included reporting on Service Level Indicators (SLIs) and Qualitative Reporting Elements (QREs), to the European Commission.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Commitment 41

Signatories commit to work within the Task-force towards developing Structural Indicators, and publish a first set of them within 9 months from the signature of this Code; and to publish an initial measurement alongside their first full report.

We signed up to the following measures of this commitment

Measure 41.1 Measure 41.2 Measure 41.3

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

  • Google has been an active participant in the working group dedicated to developing Structural Indicators.
  • Google supported the publication of Structural Indicators by TrustLab, through its collaboration with EDMO, ERGA, Avaaz and the European Commission.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

Yes

If yes, which further implementation measures do you plan to put in place in the next 6 months?

Google will continue to support the publication of Structural Indicators, and work towards further honing their methodology and scope.

Commitment 42

Relevant Signatories commit to provide, in special situations like elections or crisis, upon request of the European Commission, proportionate and appropriate information and data, including ad-hoc specific reports and specific chapters within the regular monitoring, in accordance with the rapid response system established by the Task-force.

We signed up to the following measures of this commitment

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

As requested by the European Commission, Google provides an Annex on Elections to this report. In H2 2024, at the EC’s request, Google activated temporary rapid response systems (RRSs) for the elections in Romania and France, and also participated in the Task-force’s discussions on establishing a permanent RRS.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Commitment 43

Relevant Signatories commit to provide, in special situations like elections or crisis, upon request of the European Commission, proportionate and appropriate information and data, including ad-hoc specific reports and specific chapters within the regular monitoring, in accordance with the rapid response system established by the Taskforce.

We signed up to the following measures of this commitment

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

N/A

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Commitment 44

Relevant Signatories that are providers of Very Large Online Platforms commit, seeking alignment with the DSA, to be audited at their own expense, for their compliance with the commitments undertaken pursuant to this Code. Audits should be performed by organisations, independent from, and without conflict of interest with, the provider of the Very Large Online Platform concerned. Such organisations shall have proven expertise in the area of disinformation, appropriate technical competence and capabilities and have proven objectivity and professional ethics, based in particular on adherence to auditing standards and guidelines.

We signed up to the following measures of this commitment

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

N/A - It was agreed with the European Commission that this commitment is duplicative of DSA requirements, and should therefore be deleted from the EU Code of Practice on Disinformation text. Google is taking steps to be subject to an audit under the DSA, for relevant services.

If yes, list these implementation measures here

N/A

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

N/A

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Crisis and Elections Response

Elections 2024

[Note: Signatories are requested to provide information relevant to their particular response to the threats and challenges they observed on their service(s). They ensure that the information below provides an accurate and complete report of their relevant actions. As operational responses to crisis/election situations can vary from service to service, an absence of information should not be considered a priori a shortfall in the way a particular service has responded. Impact metrics are accurate to the best of signatories’ abilities to measure them].

Threats observed or anticipated


Overview
In elections and other democratic processes, people want access to high-quality information and a broad range of perspectives. High-quality information helps people make informed decisions when voting and counteracts abuse by bad actors. Consistent with its broader approach to elections around the world, during the various elections across the EU in H2 2024, Google was committed to supporting the democratic process by surfacing high-quality information to voters, safeguarding its platforms from abuse, and equipping campaigns with best-in-class security tools and training.

To do so, Google will continue its efforts in 2025 to: 
  • Safeguard its platforms;
  • Inform voters by surfacing high-quality information;
  • Equip campaigns and candidates with best-in-class security features and training; and
  • Help people navigate AI-generated content.

Mitigations in place


Across Google, various teams support democratic processes by connecting people to election information, like practical tips on how to register to vote, or by providing high-quality information about candidates. In 2024, a number of key elections took place around the world. In June 2024, voters across the 27 Member States of the European Union took to the polls to elect Members of European Parliament (MEPs). In H2 2024, voters also cast their ballots in the Romanian presidential election and in the second round of the French legislative election. Google was committed to supporting these democratic processes by surfacing high-quality information to voters, safeguarding its platforms from abuse, and equipping campaigns with best-in-class security tools and training. Across its efforts, Google also has an increased focus on the role of artificial intelligence (AI) and the part it can play in the misinformation landscape — while also leveraging AI models to augment Google’s abuse-fighting efforts.

Safeguarding Google platforms and disrupting the spread of misinformation
To better secure its products and prevent abuse, Google continues to enhance its enforcement systems and to invest in Trust & Safety operations — including at its Google Safety Engineering Centre (GSEC) for Content Responsibility in Dublin, dedicated to online safety in Europe and around the world. Google also continues to partner with the wider ecosystem to combat misinformation. 
  • Enforcing Google policies and using AI models to fight abuse at scale: Google has long-standing policies that inform how it approaches areas like manipulated media, hate and harassment, and incitement to violence — along with policies around demonstrably false claims that could undermine democratic processes, for example in YouTube’s Community Guidelines and its Political Content Policies for advertisers. To help enforce Google policies, Google’s AI models are enhancing its abuse-fighting efforts. With recent advances in Google’s Large Language Models (LLMs), Google is building faster and more adaptable enforcement systems that enable it to remain nimble and take action even more quickly when new threats emerge.
  • Working with the wider ecosystem: Since Google’s inaugural contribution of €25 million to help launch the European Media & Information Fund, an effort designed to strengthen media literacy and information quality across Europe, 70 projects have been funded across 24 countries so far. Google also supports numerous civil society, research and media literacy efforts from partners, including the Civic Resilience Initiative, Baltic Centre for Media Excellence, CEDMO and more.

Helping people navigate AI-generated content
Like any emerging technology, AI presents new opportunities as well as challenges. For example, generative AI makes it easier than ever to create new content, but it can also raise questions about the trustworthiness of information. Google put in place a number of policies and other measures that have helped people navigate AI-generated content. Overall, harmful altered or synthetic political content did not appear to be widespread on Google’s platforms. Measures that helped mitigate that risk include:
  • Ads disclosures: Google expanded its Political Content Policies to require advertisers to disclose when their election ads include synthetic content that inauthentically depicts real or realistic-looking people or events. Google’s ads policies already prohibit the use of manipulated media to mislead people, like deep fakes or doctored content.
  • Content labels on YouTube: YouTube’s Misinformation Policies prohibit technically manipulated content that misleads users and could pose a serious risk of egregious harm. YouTube also requires creators to disclose when they have created realistic altered or synthetic content, and displays a label indicating when the content people are watching is synthetic. For sensitive content, including election-related content, that contains realistic altered or synthetic material, the label appears on the video itself and in the video description.
  • A responsible approach to Generative AI products: In line with its principled and responsible approach to its Generative AI products like Gemini, Google has prioritised testing across safety risks ranging from cybersecurity vulnerabilities to misinformation and fairness. Out of an abundance of caution on such an important topic, Google is restricting the types of election-related queries for which Gemini will return responses.
  • Additional context for users: ‘About This Image’ in Search helps people assess the credibility and context of images found online.
  • Digital watermarking: SynthID, a tool in beta from Google DeepMind, directly embeds a digital watermark into AI-generated images, audio, text, or video. Google recently expanded SynthID’s capabilities to watermark AI-generated text in the Gemini app and web experience, as well as to video in Veo, Google’s recently announced and most capable generative video model.
  • Industry collaboration: Google joined the C2PA coalition and standard, a cross-industry effort to help provide more transparency and context for people on AI-generated content. Alongside other leading tech companies, Google also pledged to help prevent deceptive AI-generated imagery, audio or video content from interfering with this year’s global elections. The ‘Tech Accord to Combat Deceptive Use of AI in 2024 Elections’ is a set of commitments to deploy technology countering harmful AI-generated content meant to deceive voters.

Informing voters by surfacing high-quality information
In the build-up to elections, people need useful, relevant and timely information to help them navigate the electoral process. Here are some of the ways Google makes it easy for people to find what they need, and which were deployed during elections that took place across the EU in 2024: 
  • Voting details and Election Results on Google Search: Google put in place a ‘How to Vote’ and ‘How to Register’ feature for the national parliamentary elections in France, which featured aggregated voting information from the French Electoral Commission on Google Search. 
  • High-quality Information on YouTube: For news and information related to elections, YouTube’s systems prominently surface high-quality content, on the YouTube homepage, in search results and the ‘Up Next’ panel. YouTube also displays information panels at the top of search results and below videos to provide additional context. For example, YouTube may surface various election information panels above search results or on videos related to election candidates, parties or voting.
  • Ongoing transparency on Election Ads: All advertisers who wish to run election ads in the EU on Google’s platforms are required to go through a verification process and have an in-ad disclosure that clearly shows who paid for the ad. These ads are published in Google’s Political Ads Transparency Report, where anyone can look up information such as how much was spent and where it was shown. Google also limits how advertisers can target election ads.

Equipping campaigns and candidates with best-in-class security features and training
As elections come with increased cybersecurity risks, Google works hard to help high-risk users, such as campaigns and election officials, civil society and news sources, improve their security in light of existing and emerging threats, and to educate them on how to use Google’s products and services. 
  • Security tools for campaign and election teams: Google offers free services like its Advanced Protection Program — Google’s strongest set of cyber protections — and Project Shield, which provides unlimited protection against Distributed Denial of Service (DDoS) attacks. Google also partners with Possible, The International Foundation for Electoral Systems (IFES) and Deutschland sicher im Netz (DSIN) to scale account security training and to provide security tools including Titan Security Keys, which defend against phishing attacks and prevent bad actors from accessing users’ Google Accounts.
  • Tackling coordinated influence operations: Google’s Threat Intelligence Group helps identify, monitor and tackle emerging threats, ranging from coordinated influence operations to cyber espionage campaigns against high-risk entities. Google reports on actions taken in its quarterly bulletin, and meets regularly with government officials and others in the industry to share threat information and suspected election interference. Mandiant also helps organisations build holistic election security programs and harden their defences with comprehensive solutions, services and tools, including proactive exposure management, proactive intelligence threat hunts, cyber crisis communication services and threat intelligence tracking of information operations. A recent publication from the team gives an overview of the global election cybersecurity landscape, designed to help election organisations tackle a range of potential threats.
  • Helpful resources at euelections.withgoogle: Google launched an EU-specific hub at euelections.withgoogle with resources and trainings to help campaigns connect with voters and manage their security and digital presence. In advance of the European Parliamentary elections in 2019, Google conducted in-person and online security training for more than 2,500 campaign and election officials, and, for the 2024 EU Parliamentary elections, Google built on these numbers by directly reaching 3,500 campaigners through in-person trainings and briefings on election integrity and tackling misinformation across the region.

Google is committed to working with government, industry and civil society to protect the integrity of elections in the European Union — building on its commitments made in the EU Code of Practice on Disinformation. 

Crisis 2024

[Note: Signatories are requested to provide information relevant to their particular response to the threats and challenges they observed on their service(s). They ensure that the information below provides an accurate and complete report of their relevant actions. As operational responses to crisis/election situations can vary from service to service, an absence of information should not be considered a priori a shortfall in the way a particular service has responded. Impact metrics are accurate to the best of signatories’ abilities to measure them].

Threats observed or anticipated


War in Ukraine

Overview
The war in Ukraine has continued throughout 2024, and Google continues to help by providing cybersecurity and humanitarian assistance and by surfacing high-quality information to people in the region. The following list outlines the main threats observed by Google during this conflict:

  1. Continued online services manipulation and coordinated influence operations;
  2. Advertising and monetisation linked to state-backed disinformation about the war in Ukraine;
  3. Threats to security and protection of digital infrastructure.


Israel-Gaza conflict

Overview
In response to the Israel-Gaza conflict, Google has actively worked to support humanitarian and relief efforts, ensure its platforms and partnerships are responsive to the crisis, and counter the threat of disinformation. Google identified a few areas of focus for addressing the ongoing crisis:

  • Humanitarian and relief efforts;
  • Supporting Israeli tech firms and Palestinian businesses; and
  • Platforms and partnerships to protect Google’s services from coordinated influence operations, hate speech, and graphic and terrorist content.

Mitigations in place


War in Ukraine

The following sections summarise Google’s main strategies and actions taken to mitigate the identified threats and react to the war in Ukraine.

1. Online services manipulation and malign influence operations
Google’s Threat Analysis Group (TAG) is helping Ukraine by monitoring the threat landscape in Eastern Europe and disrupting coordinated influence operations from Russian threat actors. Google has also announced new long-term partnerships across Central and Eastern Europe.

In the Baltics, Google entered into long-term partnerships with the Civic Resilience Initiative and the Baltic Centre for Media Excellence. These two organisations have received €1.3 million in funding from Google to build on their impactful work towards increasing media literacy, building further resilience and actively tackling disinformation in Lithuania, Latvia and Estonia. Furthermore, Google is partnering with Charles University in Prague, the main research centre of the Central European Digital Media Observatory (CEDMO) project, and providing €1 million in funding for CEDMO to further expand its research into information disorders and its work to increase the level of media and digital literacy in Poland, Czechia and Slovakia.

2. Advertising and monetisation linked to disinformation about the war in Ukraine
By H2 2024, Google had paused the majority of commercial activities in Russia – including ads serving in Russia, ads on Google’s properties and networks globally for all Russian-based advertisers, new Cloud sign ups, the payments functionality for most of Google’s services, AdSense ads on state-funded media sites, and monetisation features for YouTube viewers in Russia. Due to the war in Ukraine, Google paused ads containing content that exploits, dismisses, or condones the war. In addition, Google paused the ability of Russia-based publishers to monetise with AdSense, AdMob, and Ad Manager in August 2024. Free Google services such as Search, Gmail and YouTube are still operating in Russia. Google will continue to closely monitor developments.

3. Threats to security and protection of digital infrastructure
Google expanded eligibility for Project Shield, Google’s free protection against Distributed Denial of Service (DDoS) attacks, shortly after the war in Ukraine broke out. The expansion aimed to allow Ukrainian government websites and embassies worldwide to stay online and continue to offer their critical services. Since then, Google has continued to implement protections for users and track and disrupt cyber threats. 

TAG has been tracking threat actors, both before and during the war, and sharing their findings publicly and with law enforcement. TAG’s findings have shown that government-backed actors from Russia, Belarus, China, Iran, and North Korea have been targeting Ukrainian and Eastern European government and defence officials, military organisations, politicians, NGOs, and journalists, while financially motivated bad actors have also used the war as a lure for malicious campaigns. 

Google is continuing to provide critical cybersecurity and technical infrastructure support by donating 50,000 new Google Workspace licences to the Ukrainian government. By providing these licences and a year of free access to Google Workspace solutions, including Google’s cloud-first, zero-trust security model, Google can help provide Ukrainian public institutions with the security and protection they need to deal with constant threats to their digital systems. In February 2023, Google also announced an extension of the free access to premium Google Workspace for Education features for 250 universities and colleges until the end of August 2023.

Google aims to continue to follow the following approach when responding to future crisis situations: 
  • Elevate access to high-quality information across Google services;
  • Protect Google users from harmful disinformation;
  • Continue to monitor and disrupt cyber threats;
  • Explore ways to provide assistance to support the affected areas more broadly.

Future measures
Google is continually making investments in products, programs and partnerships to help fight disinformation, both in Ukraine and globally. Google will continue to monitor the situation and take additional action as needed.


Israel-Gaza conflict

Humanitarian and relief efforts
Google.org provided $6 million in funding: $3 million to Israeli organisations focused on mental health support, and $3 million to Gaza organisations focused on humanitarian aid and relief, including $1 million to Save the Children, $1 million to the Palestinian Red Crescent, and $1 million to International Medical Corps (IMC). Specifically, Google's humanitarian and relief efforts with these organisations include: 
  • Natal - Israel Trauma and Resiliency Centre: In the early days of the war, calls to Natal's support hotline went from around 300 a day to 8,000 a day. With Google's funding, Natal was able to scale its support to patients by 450%, including multidisciplinary treatment and mental and psychosocial support for direct and indirect victims of trauma due to terror and war in Israel. 
  • International Medical Corps (IMC): As of October 2024, Google's support helped fund the delivery of two mobile operating theatres, doubling the surgical capacity of IMC's field hospital and enabling it to provide over 210,000 health consultations and well over 7,000 (often lifesaving) surgeries, as well as other support, such as access to safe drinking water for nearly 200,000 people.

In addition, Google employees directed more than $11 million in funding, including employee donations and matching funds from Google.org, to organisations providing aid and support in Israel and Gaza. 

Supporting Israeli tech firms and Palestinian businesses
Across Europe and Israel, Google is committed to supporting startups as they work at the forefront of innovation, striving to solve some of the most critical issues facing the world. These pioneering startups and businesses often struggle to access the support, expertise and tools they need to scale. In light of the Israel-Gaza conflict, Google is investing $8 million to support Israeli tech firms and Palestinian businesses. Of that investment, Google is providing $4 million to support Israeli AI startups and offer access to Google's knowledge, expertise (e.g. Cloud support) and mentorship opportunities in Israel, and $4 million to support Palestinian startups and businesses. In addition, Google has announced that it will provide loans and grants to 1,000 Palestinian small businesses in partnership with local and global non-profit organisations, and will also provide seed grants to 50 Palestinian tech startups, with the aim of preserving 4,500 jobs and creating additional job opportunities. 

Platforms and partnerships
As the conflict continues, Google is committed to tackling misinformation, hate speech, graphic content and terrorist content by continuing to find ways to provide support through its products. For example, Google has deployed language capabilities to support emergency efforts, including urgent translations and localising Google content to help users, businesses and NGOs. Google has also pledged to help its partners in these extraordinary circumstances. For example, when schools closed in October 2023, the Ministry of Education in Israel used Meet as its core teach-from-home platform, and Google provided support. Google has been in touch with Gaza-based partners and participants in its Palestine Launchpad program, its digital skills and entrepreneurship program for Palestinians, to try to support those who have been significantly impacted by this crisis.