Tough EU rules on the use of AI in high-risk situations will mean Britain's professional and ethical standards must scale up quickly, BCS, the Chartered Institute for IT, has said.
The IT industry professional body said that the “sweeping legislation” will impact British businesses providing AI services to the EU.
It warned that the new rules will require organisations to meet “unprecedented standards of ethics and transparency.”
Last week it was announced that the European Union would ban the use of AI for high-risk applications such as mass surveillance or ranking social behaviour.
Companies could face fines of 4 per cent of global revenue, or €20 million, whichever is greater, if they fail to comply with the new rules or to report correct information, a penalty regime similar to that of GDPR.
“The EU has realised that AI can be of huge benefit or of huge harm to society, and has decided to regulate on standards for the design, development, and adoption of AI systems to ensure we get the very best out of them,” said Dr Bill Mitchell, director of policy at BCS. “There will be a huge amount of work to do to professionalise large sections of the economy ready for this sweeping legislation.”
Mitchell said that these ambitious plans will be impossible to deliver without a fully professionalised AI industry.
“Those with responsibility for adopting and managing AI will need to ensure their systems comply with these new regulations, as well as those designing and developing these systems,” he urged. “The IT profession - and particularly those involved in AI – will in future need to evidence they have behaved ethically, competently and transparently. In principle this is something we should all welcome, and it will help restore public trust in AI systems that are used to make high stakes decisions about people.”
The EU legislation will set Europe on a different path to the US and China by directly prohibiting the use of AI for indiscriminate surveillance and social scoring.
BCS said that the law would designate a number of AI use cases, including in education, financial services and recruitment, as high risk.
“For these high-stakes examples regulation would include mandatory third-party audit, of both actual data and quality management systems,” the organisation said. “The new rules are likely to take some years to become law but need to be prepared for now, especially in the areas of staff training and development.”