Artificial intelligence fraud carries a fine of up to one million dirhams
UAE law punishes anyone who copies or imitates voices, images, or video clips with the aim of deceiving and defrauding victims using the latest artificial intelligence technologies.
Legal Advisor Ihab Al-Najjar stated that under Federal Decree-Law No. 34 of 2021 on Combating Rumors and Cybercrimes, the UAE's dedicated cybercrime legislation, the use of artificial intelligence techniques to deceive people is illegal.
The penalty for using artificial intelligence techniques in fraud:
He explained that under Article 40 of the Decree-Law, anyone who unlawfully seizes movable property for himself or for another by any fraudulent method, or by assuming a false name or falsely impersonating another, through an information network, electronic information system, or other information technology means, shall be punished by imprisonment for not less than one year and a fine of not less than two hundred and fifty thousand dirhams and not more than one million dirhams.
How fraud using artificial intelligence works:
This type of fraud occurs when people receive phone calls from what sound like family members, friends, or their bosses, urgently requesting large sums of money. The caller claims to be facing a sudden emergency that is difficult to explain and to be in desperate need of cash. Before realizing what has happened, the recipient has fallen victim to a scam carried out using artificial intelligence programs, techniques, and applications. The recipient does not think to verify the caller's identity, because the voice on the line is exactly like the familiar voice of the person whose number appears.

This is the latest trend in artificial intelligence fraud: voice-cloning tools can quickly and easily copy anyone's voice from clips posted on their social media profiles. Scammers exploit this technology because an audio sample as short as three seconds is enough to create a cloned voice with a match rate of up to 85%, allowing them to extract victims' personal information and steal their money.