September 2, 2025
EU AI Act – What Corporate Legal Teams Must Do Before 2026
The EU AI Act makes AI compliance mandatory. Those who use 2025 wisely will reduce risks and gain a clear edge over competitors.
The countdown is on: in February 2025, the first provisions of the EU AI Act began to take effect. By 2026, companies must be able to demonstrate that their AI systems are used in compliance with the law.
What does this mean for your organization’s most critical processes? For the first time, the regulation establishes binding rules for artificial intelligence across Europe, pursuing a dual objective: minimizing risks while fostering innovation. For companies, AI compliance thus becomes mandatory. Particularly affected are high-risk AI systems that support sensitive decisions in HR, finance, or supply chain management.
Are you prepared to take the lead and secure compliance before it’s too late? Companies that have not established robust structures by 2026 risk fines, loss of trust, and operational disruptions. Unlike the GDPR, the AI Act is not limited to personal data; it covers the entire functionality and impact of AI systems. Responsibility clearly lies with the legal department, which must drive governance and assert its role ahead of IT and procurement.
Are you ready for the EU AI Act and its deadlines?
The AI Act follows a risk-based approach: the greater the potential impact of an AI system on fundamental rights, security, or society, the stricter the obligations.
- Prohibited AI systems: Practices such as social scoring or manipulative subliminal techniques have been banned since February 2025.
- High-risk AI systems: Subject to strict regulation – e.g., applicant screening, credit scoring, financial risk models, or supplier scoring.
- Limited-risk systems: Such as chatbots, which must be clearly identified as AI.
- Minimal-risk tools: Applications without significant decision-making impact that trigger no special obligations.
Timeline:
- August 2024: Law entered into force; transition periods began.
- February 2025: Bans on unacceptable AI systems take effect.
- August 2025: First obligations for providers of General-Purpose AI (foundation models) apply.
- August 2026: Comprehensive requirements for high-risk systems take effect (including risk management, documentation, and human oversight).
- August 2027: Remaining provisions apply in full, including obligations for high-risk AI embedded in regulated products.
Which AI Systems Count as “High-Risk”?
Companies often underestimate which applications are classified as high-risk:
- HR: applicant screening, aptitude analyses, career predictions
- Finance: credit scoring, fraud detection, risk models
- Supply Chain & Procurement: supplier scoring, ESG ratings, price forecasts
- Compliance & Legal Tech: contract analysis, due diligence tools, eDiscovery
The duty of care also applies to SaaS and external systems. Weaknesses at providers can quickly become liability traps for their customers.
Countdown to 2026: Get Ready for EU AI Act Compliance
Are you on track to meet the EU AI Act’s requirements by 2026? To get there on time, legal departments must establish clear structures. Three pillars are essential:
- AI Risk Register
The risk register is the core of AI governance. It documents all systems in use, their classification, scope, risks, responsible parties, and review cycles. A complete register facilitates internal steering and also serves as evidence for supervisory authorities (a purely illustrative example entry is sketched below, after the three pillars).
- Vendor Due Diligence
Companies must systematically review their AI providers:
- What training data was used?
- Are bias and fairness tests in place?
- How is the decision logic documented?
- What certifications are available?
- How are data protection and security ensured?
In addition, contract clauses on audit rights, liability, and exit options are mandatory. Only in this way can companies demonstrate that they have exercised their duty of care toward AI providers.
- Internal Governance Policies
The legal department must establish binding AI governance policies:
- Evaluation of new AI projects; approval and escalation processes
- Reporting and monitoring obligations; staff training
- Clear definition of roles between Legal, IT, Compliance, and Procurement
Only with binding policies can risks be controlled and liability issues avoided.
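To make the first pillar more tangible, the sketch below shows what a single AI Risk Register entry could capture, expressed as a small Python data structure. It is purely illustrative: the field names, risk categories, and the example vendor are assumptions, not terms prescribed by the AI Act, and in practice the register may just as well live in a spreadsheet or GRC tool.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum


class RiskClass(Enum):
    """Illustrative risk categories mirroring the AI Act's risk-based approach."""
    PROHIBITED = "prohibited"
    HIGH = "high"
    LIMITED = "limited"
    MINIMAL = "minimal"


@dataclass
class RiskRegisterEntry:
    """One AI system as documented in the register; all field names are assumptions."""
    system_name: str                          # e.g. "Applicant screening tool"
    vendor: str                               # external provider or "internal"
    classification: RiskClass                 # risk class under the AI Act
    business_scope: str                       # where and for what the system is used
    identified_risks: list[str] = field(default_factory=list)
    responsible_owner: str = ""               # accountable role or person
    next_review: date | None = None           # checkpoint of the review cycle


# Hypothetical entry for a high-risk HR use case
entry = RiskRegisterEntry(
    system_name="Applicant screening tool",
    vendor="ExampleVendor GmbH",              # hypothetical vendor
    classification=RiskClass.HIGH,
    business_scope="Pre-selection of job applications in HR",
    identified_risks=["bias in training data", "insufficient human oversight"],
    responsible_owner="Head of Legal Operations",
    next_review=date(2026, 2, 1),
)
```

However the register is maintained, the point is the same: every system gets a classification, an owner, documented risks, and a review date that can be shown to a supervisory authority.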
Coordination Model: Legal Before IT and Procurement
In many companies today, IT and procurement drive AI projects – focusing on technology and costs. That is not enough. The EU AI Act requires a coordination model in which the legal department takes the lead:
- Legal: defines regulatory requirements, develops contract standards, steers governance
- IT: ensures technical implementation, monitoring, and documentation
- Compliance: responsible for audits, training, and risk reporting
- Procurement: embeds due diligence requirements into supplier contracts
This model ensures that legal risks are identified, documented, and considered in all projects. Without legal leadership, companies risk deploying systems that are not legally sustainable.
2025 Is the Turning Point – Are You Ready to Build Compliance Structures?
Companies should use 2025 proactively to put these structures in place. Only then can they demonstrate compliance with the EU AI Act on time and minimize risks:
- Identify and classify high-risk systems
- Introduce a pilot AI Risk Register
- Establish standardized vendor due diligence
- Anchor binding AI governance policies
- Launch early training programs for managers and staff
Conclusion: The countdown is on
The EU AI Act is not an IT project – it is a governance issue. Those who act now not only reduce risks but also gain trust with regulators, business partners, and customers. Corporate legal teams must create governance structures, transparently document risks, and enforce the duty of care toward AI providers.
Those prepared by 2026 will not only achieve compliance but also secure competitive advantages in an increasingly regulated market. Leverage the power of legal tech today to streamline compliance, strengthen governance, and stay ahead of the curve.
Further information:
Check the official information on the EU AI Act on the European Commission website.