Out-Law Analysis | Reading time: 6 min.

Irish firms must work to understand ESG obligations under AI Act


Businesses across Ireland must strengthen their understanding of artificial intelligence (AI) and the impact the EU AI Act is set to have on the technology, including through the introduction of additional environmental, social and governance (ESG) obligations.

Given its considerable and growing renewables and tech sectors, Ireland has a unique perspective on this new and critically important intersection between AI and ESG.

As AI and ESG continue to impact businesses in Ireland, it is essential that board members and executives understand the implications and business potential of both. Businesses are no longer judged solely on their financial information, as investors increasingly look to invest in companies that align with their values. New legislation is now starting to impose concrete ESG obligations which Irish businesses should be aware of.

EU AI Act

The EU AI Act entered into force on 1 August 2024, but obligations under the act will be phased in over the next three years. The objective of the EU AI Act is to ensure safe AI systems that respect fundamental human rights, while also fostering innovation.

Under the EU AI Act, AI systems will be categorised into four different risk levels: unacceptable risk, high risk, limited risk and minimal risk. Obligations will vary depending on the category of the AI system.

All parties involved in the development, use, import, distribution or manufacture of AI systems will be held accountable under the EU AI Act. The act will also apply to providers and deployers of AI systems located outside the EU where the output produced by the AI system is intended to be used in the EU.

Environmental factors and the EU AI Act

The well-established goal of reaching net zero now needs to account for the rise in AI technologies and their associated energy and water demands. Developments in AI, including machine learning and generative AI, are driving the rapid expansion and evolution of data centres. Both existing and new-generation data centres rely on water resources and consume significant amounts of energy.

Data centres are used to house, connect and operate computer systems, servers and associated equipment for data storage, processing and distribution, as well as related activities. There are currently 82 data centres in Ireland, with a further 14 either in the planning process or under construction.

There are growing concerns that the rise in demand for AI could negatively impact Ireland’s climate targets. Data centres in Ireland are being built, or are progressing through planning, at a rate that surpasses the rate at which the country’s renewable energy sector is being developed.

Data centres are key purchasers, or off-takers, of renewable power, and one challenge is managing the intermittency of generation against their constant demand. An increasing feature of the carbon accounting in such Corporate Power Purchase Agreements (CPPAs) is the perception that they do not enable full system decarbonisation. Specifically, there is a move underway towards matching generation and demand on a far more granular basis, rather than matching supply and demand annually.

According to the National Energy and Climate Plan, Ireland’s data centres could consume up to 31% of the country’s overall electricity within the next three years.

While the recent EU AI Act does not explicitly focus on the environmental impacts of AI systems, it does reference some sustainability principles and standards. The EU AI Act encourages the development and use of AI tools in a “sustainable and environmentally friendly manner”, and in a way that benefits all of humankind. The newly established AI Office and member states must encourage and facilitate the drawing up of codes of conduct concerning the voluntary application of specific requirements to all AI systems. This includes the assessment and minimisation of the impact of AI systems on environmental sustainability, as well as energy-efficient programming and techniques for the efficient design, training and use of AI. These codes of conduct will be periodically reviewed by the European Commission in relation to their impact and effectiveness, including as regards environmental sustainability.

Article 40 of the EU AI Act mandates standardisation bodies to create reporting and documentation procedures for the efficient use of resources by certain AI systems, as well as for the energy-efficient development of AI models. These procedures aim to reduce the energy and resource consumption of high-risk AI systems during their lifecycle and to promote the energy-efficient development of general-purpose AI models. While such standards will enhance transparency about AI’s ecological impact, they may prove time-consuming and inefficient to implement.

Additionally, the EU Energy Efficiency Directive 2023/1791 requires EU countries to collectively ensure an additional 11.7% reduction in energy consumption by 2030. The directive contains transparency requirements regarding the sustainability credentials of the data centre sector. The EU’s Corporate Sustainability Reporting Directive, which entered into force on 5 January 2023, introduces a new level of scrutiny of the materials used to build data centres, along with other reporting obligations. As reporting shifts from voluntary to mandatory, data centres are more likely to be built to environmentally compliant standards.

Social factors and the EU AI Act

In line with the social pillar of ESG, the EU AI Act prohibits AI systems that pose unacceptable risks to human rights, such as those that manipulate behaviour, exploit vulnerabilities or engage in social scoring.

According to a recent survey, one in four companies has introduced AI into its recruitment processes. Businesses deploying such high-risk AI tools for recruitment and employment will need to comply with certain governance requirements under the EU AI Act, including risk management, data quality, transparency, human oversight and accuracy.

AI systems used to evaluate the performance and behaviour of existing employees, or which affect the working relationship, will be considered high-risk. In support of the social pillar of ESG, the EU AI Act demands that human oversight be built into every stage of the AI’s life cycle. Employers will need to ensure that a person with appropriate training and authority is appointed to review the data collected and the decisions reached. AI-based decisions may be tainted by inherent bias or discrimination, so the appointed person will need sufficient authority to monitor the AI and override AI-based decisions, if necessary. In addition, employers must ensure that any employees dealing with and using AI systems have a sufficient level of AI literacy.

While most obligations in relation to the deployment of high-risk AI systems only commence from 2 August 2026, the AI literacy requirement applies to all businesses using AI systems from 2 February 2025.

Governance and the EU AI Act

The EU AI Act establishes a comprehensive regulatory framework for governing AI systems, including requirements for transparency, accountability and human oversight. High-risk AI systems must also undergo rigorous assessments and continuous monitoring to ensure compliance with ethical standards and legal obligations.

Boards and executives have a responsibility to effectively monitor the risks and opportunities posed by AI and to ensure that their business has implemented appropriate risk assessments and safeguarding procedures to support adequate decision-making. Boards will therefore need to keep on top of the AI systems adopted by their business and ensure compliance with the EU AI Act. Irish businesses should also consider appointing directors with experience in managing AI technologies and ESG considerations in a business environment. Boards must ensure that their business is prepared for compliance with the new EU AI Act as well as the various sustainability legislation emerging on the back of the EU Green Deal.

Boards should acknowledge the ways in which AI can be used to recognise and mitigate risks related to ESG factors, as well as to transform ESG reporting. The shift to mandatory sustainability reporting will require businesses to become more transparent about their operations and to manage vast volumes of data which, if not handled correctly, will cause difficulties in data analysis and potential violations of the EU AI Act and the GDPR. Boards should consider how AI can be used to streamline data collection and extract insights from data related to ESG metrics. Data is crucial for ESG reporting and businesses should be aware of the ways in which AI can help to measure, manage and monitor such data.

Amid the quickly evolving AI landscape and increasing regulatory requirements around ESG factors, now is the right time for businesses to adapt in a way that is compliant with the EU AI Act and ESG-related legislation.

Co-written by Deirdre Lynch, Dani Kane and Isabel Humburg of Pinsent Masons.

Pinsent Masons is hosting an event in Dublin on 27 November exploring the impact of AI and technology on health and safety management and enforcement.
