The data controller's impact assessment accepted the risks to data subjects but provided no practical means of mitigating them.
The Hungarian data protection authority (SA) has issued its decision on data protection issues arising from the use of AI.
While investigating another issue, the Hungarian SA became aware that the data controller in the case (a financial services provider) performed automated analysis of customer service phone calls. Because information about this data processing was not clearly provided to data subjects, the Hungarian SA opened an ex officio investigation against the data controller in 2021 to review its general data processing practices regarding the automated analysis.
The data controller recorded all customer service phone calls. Each night, a software tool automatically analysed all new audio recordings. The software used AI to find keywords and to estimate the emotional state of the client at the time of the call. The result of the analysis was stored in the data controller's systems for 45 days, along with, and linked to, the voice recording. The output was a list of clients ranked by likelihood of dissatisfaction, based on the audio recording of the customer service phone call. Based on this ranking, designated employees marked clients to be called back by customer service operatives, who would try to establish the reasons for their dissatisfaction. No information on this particular data processing was provided to data subjects, exercising the right to object was technically impossible, and the data processing was planned and carried out despite the data controller knowing this.
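For illustration only: the decision does not disclose the controller's actual implementation, so the sketch below is purely hypothetical. It shows the kind of ranking step described above (sorting analysed calls by an assumed "dissatisfaction" score and flagging clients above a threshold); every name, score, and parameter here is invented for the example.

```python
# Hypothetical sketch of the ranking step described in the decision.
# The real system and its scoring model are not public; this only
# illustrates sorting calls by an assumed dissatisfaction score.

from dataclasses import dataclass, field

@dataclass
class CallAnalysis:
    client_id: str
    dissatisfaction_score: float      # assumed 0.0-1.0 model output
    keywords: list = field(default_factory=list)

def rank_for_callback(analyses, threshold=0.5):
    """Return clients sorted by likelihood of dissatisfaction,
    keeping only those above a hypothetical threshold."""
    flagged = [a for a in analyses if a.dissatisfaction_score >= threshold]
    return sorted(flagged, key=lambda a: a.dissatisfaction_score, reverse=True)

# Example with made-up data:
calls = [
    CallAnalysis("client-A", 0.92, ["cancel", "complaint"]),
    CallAnalysis("client-B", 0.31),
    CallAnalysis("client-C", 0.77, ["refund"]),
]

for a in rank_for_callback(calls):
    print(a.client_id, a.dissatisfaction_score)
# client-A 0.92
# client-C 0.77
```

Even this trivial sketch makes the compliance problem concrete: the pipeline produces decisions about individual clients with no point at which a data subject is informed or can object.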
The data controller's own impact assessment acknowledged that the reviewed data processing used AI and posed a high risk to the fundamental rights of data subjects. Yet neither the impact assessment nor the legitimate interest assessment set out any actual risk-mitigation measures. While in theory clients could have been informed or could have objected, in practice these rights were non-existent.
The Hungarian SA said that AI is by nature difficult to deploy in a transparent and safe manner, so additional safeguards are necessary. Due to the way AI functions, it is difficult to verify the results of personal data processing by AI, and those results may be biased. The SA determined that numerous provisions of the GDPR had been infringed seriously and over a long period. It therefore ordered the data controller to stop processing information about the emotional state of clients, and to resume the data processing only once it was brought into compliance with the GDPR. It also imposed an administrative fine, in Hungarian forints, equal to approximately EUR 650,000.
The EU’s proposed AI Act would also have an impact on this sort of software, with large fines possible.