Introduction: What is the European AI Act?
The European Union (EU) has enacted a landmark law to govern the use of artificial intelligence (AI), the AI Act, which entered into force on 1 August 2024. It is the first law in the world to set uniform rules for the development, market entry and use of AI systems. Its main goal is to ensure that AI systems are safe, transparent and respect fundamental rights. Harmonised standards play a central role in this law, as they provide the technical specifications for high-risk AI systems. This article explains, in simple terms, what harmonised standards are, how they are developed and why they matter for the AI Act.
What are Harmonized Standards?
Harmonised standards are European standards created by the European standardisation organisations (ESOs) CEN, CENELEC and ETSI. They translate the general legal requirements of the AI Act into concrete technical specifications. For example, the AI Act calls for "appropriate risk management measures for high-risk AI systems", and the harmonised standards spell out what those measures should look like in practice. When an AI system complies with standards that have been published in the Official Journal of the European Union (OJEU), it benefits from a presumption of conformity with the law. This makes it easier for companies, especially small and medium-sized enterprises (SMEs), to comply with the rules.
Developing Harmonised Standards
The process of creating harmonised standards is complex and involves many actors. The European Commission sends a standardisation request to the ESOs, spelling out the scope of the standards, the timelines and the legal requirements they must address. The joint technical committee of CEN and CENELEC, JTC 21, leads the drafting of AI standards, which cover areas such as risk management, data governance, transparency, human oversight and cybersecurity. The Commission issued its standardisation request in May 2023; CEN-CENELEC accepted it and drafting work is under way. However, progress has been slow because of the technical complexity involved and the need for consensus among all stakeholders.
Role of Harmonised Standards in the AI Act
Harmonised standards make it easier to comply with the AI Act's rules for high-risk AI systems. These systems, such as AI used in medical devices or law enforcement, can affect people's health, safety and fundamental rights. High-risk systems that meet the Act's requirements carry a CE marking, and conformity with published harmonised standards is the most straightforward way to demonstrate this. That not only simplifies legal compliance but also builds trust among customers and business partners. For example, the standards are meant to ensure that high-risk AI systems meet requirements on data quality and transparency.
Challenges and requirements
Creating harmonised standards for the AI Act is not easy. Traditional standards were written primarily for physical products, whereas standards for AI must also address ethical and social issues, such as bias and the protection of fundamental rights. According to a report by the Commission's Joint Research Centre (JRC), existing international standards, such as ISO/IEC 42001, do not fully cover all the requirements of the AI Act. New standards must therefore prioritise the specific risks of AI, such as impacts on health and safety. In addition, they need to be applicable across sectors and system types, clearly worded and compatible with the state of the art.
Timeline and implementation
Most provisions of the AI Act apply from 2 August 2026, while the rules for certain high-risk AI systems follow transition periods of two to three years from the law's entry into force. The European Commission originally asked the ESOs to have the standards ready by April 2025, but due to the complexity of the work this deadline has been extended to December 2025. CEN-CENELEC must submit progress reports on the standardisation work every three months. The Commission and the other stakeholders are working together to ensure that the standards are ready on time.
Conclusion
Harmonised standards are the backbone of the implementation of the European AI Act. They not only provide technical specifications but also help make AI systems safe, transparent and ethical. For companies, especially SMEs, these standards make it easier to comply with the regulations and strengthen competitiveness in the European market. As the standards are being developed, it is important that all stakeholders, including consumer groups and industry, take an active part in the process so that the future of AI is trustworthy and human-centred.
Frequently asked questions (FAQs)
- What are Harmonised Standards?
- These are European standards that translate the legal requirements of the AI Act into technical guidelines.
- Why are Harmonised Standards important in the AI Act?
- They make it easier to comply with regulations for high-risk AI systems and increase trust.
- Who develops these standards?
- European standardisation organisations such as CEN, CENELEC, and ETSI develop them.
- Are Harmonised Standards mandatory?
- No, they are voluntary, but complying with them gives a presumption of conformity with the AI Act's requirements.
- When will these standards be implemented?
- The goal is to have them ready by December 2025, so that the AI Act can be fully implemented by 2026.