The EU is set to unveil a proposal to regulate the sprawling field of artificial intelligence next week, with the aim of reassuring the public against "Big Brother"-like abuses.
The European Commission, the EU's executive arm, has been preparing the proposal for over a year, while big tech companies worry that the bloc's definition of AI is too broad.
The rules are part of the EU's effort to set the terms on AI and catch up with the US and China in a sector that spans from voice recognition to insurance and law enforcement.
The draft regulation will create a ban on a very limited number of uses that threaten the EU's fundamental rights.
This would put "generalised surveillance" of the population off limits, along with any tech "used to manipulate the behaviour, opinions or decisions" of citizens.
Anything resembling a social rating of individuals based on their behaviour or personality would also be prohibited, the draft said.
Military application of artificial intelligence will not be covered by the rules, which will require ratification by EU member states as well as the European Parliament.
Infringements, depending on their seriousness, could expose companies to fines of up to four percent of global turnover.
To promote innovation, Brussels also wants to provide a clear legal framework for companies across the bloc's 27 member states.
To this end, the draft regulation says companies will require a special authorisation for applications deemed "high-risk" before they reach the market.
High-risk systems would include "remote biometric identification of persons in public places" as well as "security elements in critical public infrastructure".
Other uses not classified as "high-risk" will face no regulatory constraints beyond existing rules.
Google and other tech giants are taking the EU's AI strategy very seriously as Europe often sets a standard on how tech is regulated around the world.
Last year, Google warned that the EU's definition of artificial intelligence was too broad and that Brussels must refrain from over-regulating a crucial technology.