23 December 2024

Guidelines and Clues to Catch Deepfakes

#CaixaBank   |   #Cybersecurity   |   #Artificial intelligence   |   #Security

In recent years, Artificial Intelligence (AI) tools have become better known and more accessible, streamlining many work processes. Whether by gathering large volumes of information or automating repetitive tasks, AI presents itself as an ally in many fields. However, alongside its many benefits, the presence of deepfakes has also grown. Deepfakes are images, videos, or audio recordings that have been manipulated with sophisticated AI software to appear real.

From manipulated speeches that appear to be delivered by well-known personalities or politicians to entirely fabricated images, the malicious use of this AI-based software opens the door to new types of fraud. It is therefore essential to stay informed and be critical of the content we consume, so that we can distinguish what is real from what could be a deepfake.

Aware of how difficult it is to detect a deepfake, CaixaBank highlights the signs and guidelines that can help raise suspicion about the origin of a piece of content:

  • Pay attention to the face and body: Deepfakes usually focus on manipulating the face and its features, so poor alignment with the body, or a disproportion between the two, can raise suspicion.
  • Blinking frequency: The blink rate can give a deepfake away, as reproducing natural blinking is still a challenge for the AI software that manipulates images; an unusually low or irregular blink rate is therefore a useful clue for recognizing whether a video is real or manipulated (a simplified sketch of how this could be checked programmatically appears after this list).
  • Facial and skin details: Observing even the smallest facial and skin details can be crucial. Extremely smooth or wrinkled skin, unnatural expressions, or images with blurry edges can be signs of fraud.
  • Source verification: Checking who published the content and how credible they are may take a few minutes, but it is a valuable step in determining whether a video comes from a reliable source.
  • Video length: The length of a video can also be an indicator of its reliability. Deepfakes tend to appear in short clips, since sustaining a convincing manipulation over longer footage is harder, so a long video is somewhat more likely to be genuine.
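The blink-frequency clue can, in principle, also be checked programmatically. The following Python sketch is purely illustrative and is not a CaixaBank tool: it assumes that eye landmarks have already been extracted from each video frame (for example, with a face-landmark library) and simply estimates blinks per minute from the eye aspect ratio. The threshold values and the reference range of roughly 15 to 20 blinks per minute for real footage are assumptions for demonstration only.

import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """Eye aspect ratio (EAR) for six (x, y) eye landmarks.

    The EAR drops sharply when the eye closes, which is the usual
    signal used to detect a blink.
    """
    vertical_1 = np.linalg.norm(eye[1] - eye[5])
    vertical_2 = np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return (vertical_1 + vertical_2) / (2.0 * horizontal)

def blinks_per_minute(ear_series, fps, ear_threshold=0.21, min_frames=2):
    """Count blinks in a sequence of per-frame EAR values.

    A blink is counted when the EAR stays below `ear_threshold` for at
    least `min_frames` consecutive frames. Both values are illustrative,
    not calibrated constants.
    """
    blinks = 0
    closed_frames = 0
    for ear in ear_series:
        if ear < ear_threshold:
            closed_frames += 1
        else:
            if closed_frames >= min_frames:
                blinks += 1
            closed_frames = 0
    if closed_frames >= min_frames:  # blink still in progress at the end
        blinks += 1
    duration_minutes = len(ear_series) / fps / 60.0
    return blinks / duration_minutes if duration_minutes > 0 else 0.0

# Example: a 30-second clip at 25 fps in which the eyes barely ever close
# would yield a blink rate far below the roughly 15-20 blinks per minute
# typical of real people, which is a reason to examine the video more closely.

A result well outside the expected range is not proof of manipulation on its own; it is one more clue to weigh alongside the other signs listed above.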

In the security section of its website, CaixaBank explains this technique and recommends that, if you receive an unusual or suspicious request in audio or video format from someone you know, you contact that person directly through an official channel to confirm that the request is legitimate.
