European AI Act: Opportunities and challenges

May 8, 2024

The effectiveness of the EU’s new AI Act will depend on how businesses, policymakers, and society adapt

Has the European AI economy been left behind? In terms of sheer number of AI startups and levels of investment, it's clear the US leads the field, trailed by China. The new EU AI Act, and its implications for both politics and business in Europe, must be examined within this context. Will the new regulation foster a competitive European AI economy? Or will it hamper both the creation and adoption of cutting-edge AI tools?

For European companies utilizing AI systems, the EU AI Act also presents a spectrum of opportunities and challenges. (AI generated image)

The political objective of the EU AI Act is unmistakable: to promote trustworthy AI across Europe. Approved by the European Parliament in March 2024 and pending endorsement from the Council, it is slated to be fully enforceable 24 months after it comes into effect (with some exceptions regarding this timeline). Representing a global first, the EU AI Act establishes a comprehensive legal framework for AI, aiming to ensure the trustworthiness of AI systems within Europe and beyond. As outlined by the European Commission, the Act mandates that AI systems adhere to fundamental rights, safety standards, and ethical principles while addressing risks associated with highly impactful AI models. To achieve this objective, the EU AI Act categorizes AI systems based on their risk level into four distinct categories: unacceptable, high risk, limited risk, and minimal risk.
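
To make this tiered logic concrete, the short Python sketch below models the four risk categories as a simple data structure. The example systems and one-line obligation summaries are simplified illustrative assumptions, not a legal classification; any real assessment depends on the Act's annexes and the specifics of the use case.

```python
# Illustrative sketch only: a simplified model of the EU AI Act's four risk
# tiers as described above. Example systems and obligation summaries are
# assumptions for illustration, not a legal classification.
from enum import Enum


class RiskTier(Enum):
    UNACCEPTABLE = "unacceptable"  # prohibited practices
    HIGH = "high"                  # strict obligations before and after market entry
    LIMITED = "limited"            # mainly transparency duties
    MINIMAL = "minimal"            # largely outside the Act's specific obligations


# Hypothetical example mapping, for illustration only.
EXAMPLE_SYSTEMS = {
    "social scoring by public authorities": RiskTier.UNACCEPTABLE,
    "AI-based creditworthiness assessment": RiskTier.HIGH,
    "customer-service chatbot": RiskTier.LIMITED,
    "spam filter": RiskTier.MINIMAL,
}


def obligations(tier: RiskTier) -> str:
    """Return a one-line, simplified summary of obligations for a tier."""
    return {
        RiskTier.UNACCEPTABLE: "banned within the EU",
        RiskTier.HIGH: "conformity assessment, risk management, human oversight, incident reporting",
        RiskTier.LIMITED: "transparency obligations, e.g. disclosing that users interact with AI",
        RiskTier.MINIMAL: "no additional obligations beyond existing law",
    }[tier]


if __name__ == "__main__":
    for system, tier in EXAMPLE_SYSTEMS.items():
        print(f"{system}: {tier.value} risk -> {obligations(tier)}")
```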

(Figure: Timeline of the creation and adoption of cutting-edge AI tools)

But what are the consequences of the EU AI Act both for companies developing AI-based systems and for the broader economy adopting AI tools?

European developers of AI systems might be able to benefit from a number of new opportunities:

  • Competitive advantage / brand building: European AI companies can establish a reputation for producing "bulletproof" AI solutions, boosting demand for their products and services.
  • Focus on innovation: With standardized regulations, EU AI companies can direct their efforts towards innovation rather than navigating complex compliance rules.
  • First mover advantage: Being the first global legal framework for AI providers, the EU AI Act positions European companies as first movers, potentially setting a precedent for other countries to follow suit.
  • Level playing field: All AI companies, regardless of origin, must comply with the EU AI Act to operate within the EU, creating a level playing field and ensuring fair competition within the region.

(Figure: AI risk categories according to the EU AI Act)

However, the Act could also pose challenges for European AI developers:

  • Initial costs: While a harmonized framework may reduce the cost of working out which rules apply, adhering to the EU AI Act will still entail expenses, particularly during the initial phase of implementation.
  • Prolonged time to market: Meeting EU AI Act requirements may increase administrative burdens, delaying the launch of new AI products.
  • Competitive disadvantage: Companies in regions with less stringent AI regulations may outpace or undercut European counterparts in developing AI solutions.
  • Innovation restrictions: Overly strict EU AI Act regulations could impede research and development, potentially prompting the development of new AI technologies in other regions.
  • Uncertainty: While the EU AI Act establishes a clear framework, practical applications may lead to ambiguous cases, necessitating court decisions to clarify interpretations.

Three examples of how the EU AI Act may impact different industries

Healthcare: AI medical solutions in the EU are regulated under the Medical Devices Regulation (MDR) and In-Vitro Diagnostic Medical Devices Regulation (IVDR). The EU AI Act will subject certain AI healthcare applications to strict regulations, especially those deemed high-risk. Clear guidelines are needed to help manufacturers navigate compliance with both the existing medical device regulations and the forthcoming EU AI Act.

Financial services: In financial services, AI-powered assessments of creditworthiness, pricing, or risk may face tighter regulations in the future due to their potential high-risk nature. Stricter norms would ensure fairness, transparency, and compliance with laws, aiming to protect consumers and maintain market stability.

Automotive: The EU AI Act could offer clear legal guidelines for autonomous driving, defining roles and responsibilities for manufacturers, developers, and operators. This could simplify regulations for the technology. However, it would also bring new requirements, such as safety standards and ethical AI guidelines, which manufacturers must adhere to.

It is not only companies operating directly in the AI development space that are affected: the new EU AI Act also presents a spectrum of opportunities and challenges for European companies that use AI systems.

The opportunities include:

  • Enhanced trust and safety: Users can place confidence in solutions governed by the EU AI Act, ensuring a high level of safety and transparency in deployment.
  • Reduced search costs: With a focus on solutions from European AI companies, users can streamline their search processes, assured that these solutions meet rigorous standards.
  • Competitive advantage: Companies utilizing AI solutions compliant with the EU AI Act can leverage this adherence as a competitive edge, potentially branding themselves as "Powered by EU AI solutions." If non-EU companies adopt compliant solutions in order to operate within the EU, the influence of the EU AI Act could extend well beyond Europe.
  • Data protection: Aligned with the General Data Protection Regulation (GDPR), the EU AI Act mandates that the processing of personal data within AI contexts must guarantee an adequate level of data protection.

The potential challenges include:

  • Increased costs: Compliance expenses affect both AI providers and users. For instance, deployers of high-risk AI systems are expected to ensure human oversight, monitor system performance, and report serious incidents and malfunctions, adding to operational costs.
  • Higher prices: AI solutions compliant with the EU AI Act may carry a premium compared to solutions from regions with lower or no standards, potentially leading to higher prices for consumers.
  • Innovation uncertainty: Regions with lower or no standards may foster more innovative AI solutions, creating uncertainty for companies adhering to stricter regulations.
  • Limited functionality: Stricter rules may necessitate disabling or restricting certain AI features to ensure compliance, resulting in less capable or personalized products for end users.
  • Talent flight: If the bureaucratic burden becomes too heavy, talent might move to more open markets and faster-developing ecosystems.

Adapting to a new regulatory landscape

It is currently unclear whether the EU AI Act will foster or hamper the EU AI industry and the broader economy. Reactions so far have been mixed: while politicians commend the Act's ethical standards, business leaders worry that it could stifle innovation.

Ultimately, the consequences of the EU AI Act will depend on how effectively it balances these opportunities and challenges, and how businesses, policymakers, and society at large adapt to this new regulatory landscape. Success in navigating these complexities could position Europe as a global leader in ethical and responsible AI development while ensuring its competitiveness in the rapidly evolving AI industry.

Disclaimer

This article is for informational purposes only and is not offered as professional and legal advice for any specific matter. Professional and legal advice should always be sought before taking any action or refraining from taking any action based on this article.

Roland Berger group of companies ("Roland Berger") and the editors and the contributing authors do not assume any responsibility for the completeness and accuracy of the information contained herein and expressly disclaim any and all liability to any person in respect of the consequences of anything done or permitted to be done or omitted to be done wholly or partly in reliance upon the whole or any part of the article.

The article may contain links to external websites and external websites may link to the article. Roland Berger is not responsible for the content or operation of any such external sites and disclaims all liability, howsoever occurring, in respect of the content or operation of any such external websites.
