Marichka Padalko - attack 15.05.2025

City where the online attack was recorded
Kyiv
Date of the online attack
15.05.2025
Recorded types of online attacks
Use of AI
Source of threat
Unknown
Social network, website, or other online space where the attack occurred
TikTok

On May 15, 2025, a video with signs of a deepfake was shared on the TikTok channel @ua_dopomoha8. The video used the likeness of Ukrainian journalist and TSN host on the “1+1” TV channel, Marichka Padalko.

In the video, she allegedly announces that “due to increased tax burdens on utility payments, state banks are providing a one-time payment of UAH 6,000 to bank cards.” Viewers are invited to follow a link and submit their personal data to receive this payout.

In reality, no such support program exists. The state does not collect personal data through third-party links, and all social payments are made exclusively via the official “Diia” application or official government resources. The video is an example of a fraudulent deepfake designed to mislead citizens and likely collect their personal information.

Analysis of the video (about 17 seconds long) shows signs of manipulation with deepfake technology. The central focus is a woman’s face accompanied by an audio track. However, the lip movements are not synchronized with the audio: the correlation coefficient between facial movement intensity and audio energy is nearly zero, which is a typical indicator of synthetic video.

Visual frame analysis detected anomalies in skin texture coloration and blurry transitions in the central part of the face. This suggests an overlay of generated imagery onto original footage. Modern deepfake algorithms can realistically recreate eyes and mouth, but often struggle with movement consistency and color palette uniformity — both of which are observed here.
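One common way to quantify the blurry transitions and texture anomalies described above is the variance of the Laplacian, a standard sharpness measure: a region where a generated face has been blended into original footage tends to score lower than the surrounding frame. The sketch below is only an illustration of the measure on synthetic pixel data, not the actual analysis pipeline used on this video.

```python
import random

def laplacian_variance(img):
    """Variance of the 3x3 Laplacian response over a grayscale patch.
    Low values (relative to the rest of the frame) hint at the soft
    blending seams typical of face-swap overlays."""
    h, w = len(img), len(img[0])
    resp = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (img[y - 1][x] + img[y + 1][x] +
                   img[y][x - 1] + img[y][x + 1] - 4 * img[y][x])
            resp.append(lap)
    mean = sum(resp) / len(resp)
    return sum((v - mean) ** 2 for v in resp) / len(resp)

# Synthetic 32x32 patches: noisy "sharp" texture vs. a box-blurred copy
# (the blur stands in for a smoothly blended, generated face region).
rng = random.Random(42)
sharp = [[rng.randint(0, 255) for _ in range(32)] for _ in range(32)]
blurred = [[sum(sharp[(y + dy) % 32][(x + dx) % 32]
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9
            for x in range(32)] for y in range(32)]

print(laplacian_variance(sharp) > laplacian_variance(blurred))  # True: blurring lowers the score
```

In practice the same measure would be computed per region of a real video frame (e.g. with OpenCV's `cv2.Laplacian`), and a central face region scoring much lower than the background is one of the anomalies described above.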

The correlation coefficient between facial motion in the video and audio energy is –0.08, i.e. nearly zero (slightly negative). The graph shows that moments of intense facial movement do not align with peaks in the voice. This is a strong indicator of a deepfake: in authentic videos, lip movements and voice are synchronized, yielding a significantly higher correlation.
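The lip-sync check described above can be sketched as follows. This is a minimal illustration, not the analysts' actual pipeline: the per-frame motion and audio-energy series are simulated here (in a real analysis they would come from, e.g., frame differencing of the mouth region and per-frame RMS of the audio track), and `pearson` is a plain Pearson correlation coefficient.

```python
import math
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Simulated per-frame signals for a ~17-second clip at 25 fps (425 frames).
frames = 425
energy = [abs(math.sin(0.3 * i)) for i in range(frames)]  # speech envelope (RMS energy)
synced = [0.8 * e + 0.05 for e in energy]                 # lips that follow the audio
rng = random.Random(0)
desynced = [rng.random() for _ in range(frames)]          # motion unrelated to the audio

print(pearson(energy, synced))    # high (close to 1): authentic-like footage
print(pearson(energy, desynced))  # near zero: the deepfake-like pattern
```

A near-zero coefficient, such as the –0.08 reported here, means the frame-level facial motion carries essentially no information about the voice track, which authentic talking-head footage almost never exhibits.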

Based on the analysis and detected inconsistencies, the video has a high likelihood of being a deepfake.

Why This is an Example of TFGBV (Technology-Facilitated Gender-Based Violence)

  • Use of a female journalist’s likeness without consent.
    The video was manipulatively created with AI using Marichka Padalko’s face and voice, a well-known TV presenter. This represents the exploitation of a woman’s body and voice as tools of deception, undermining her professional reputation and creating additional security risks.
  • Gendered vulnerability and discreditation.
    Female journalists are often targeted by such attacks because of their high public visibility and the trust they hold with audiences. Using a woman’s likeness enhances the “trust effect” but simultaneously discredits her as both a person and a professional. This introduces a gender-discriminatory element: the attack targets not just a journalist, but specifically a woman in media.
  • Digital violence and invasion of privacy.
    Disseminating fake content on behalf of the journalist is a form of online violence that may result in loss of audience trust, threats on social media, or secondary victimization. She loses control over her image, which is turned into a tool of fraud.
  • Broader impact on other women.
    Such cases create a chilling effect for other female journalists: seeing that they too could become victims of deepfake manipulation increases fear, self-censorship, and withdrawal from public life.

For this reason, the NGO Women in Media developed the guide: “Steps for Newsroom to Take in the First 24 Hours Following an Online Attack against a Woman Journalist.” The guide is based on practices from UNESCO (The Chilling), the Coalition Against Online Violence, PEN America, the Dart Center for Journalism and Trauma, and IWMF.

The document outlines types of threats, manipulations, and disinformation; provides contacts of support services; and suggests step-by-step response algorithms. An attack on a female journalist is an attack on the entire newsroom — therefore, the media’s main task is to support and protect their colleague.


Report an online attack

If you are a woman journalist who has encountered an online attack, or if you have witnessed such an incident, we would appreciate your contribution. Please report the case to help us identify threats and protect the rights of women in the media. Fill out the form, share the key details, and join us in creating a safer information environment.

Fill Out the Form