
It is easy to steal someone’s face, body, or voice and create an AI-generated copy of a person that speaks and moves however its “creator” wants. With the help of artificial intelligence, malicious actors are turning well-known female journalists into fakes.
Ukrainian female journalists are increasingly becoming “digital avatars” — not of their own will, but because manipulators want to exploit the public’s trust in them as a tool of influence. What only a few years ago seemed like a technological curiosity is now turning into a form of online violence. Artificial intelligence has become a weapon for discrediting and spreading disinformation.
Texty.org.ua, together with the NGO “Women in Media,” set out to investigate how TikTok users employ AI to create or distort videos featuring journalists’ likenesses. The focus was on female journalists, who represent the majority of Ukraine’s media workforce. We analyzed 595 AI-generated videos that used the likenesses of prominent Ukrainian women journalists. Collectively, these videos have garnered more than 24 million views on TikTok. Most of the examples we found bear all the hallmarks of technology-facilitated gender-based violence (TFGBV).

In the course of the analysis, we encountered dozens of videos in which the journalists’ voices, appearance, or speaking style were artificially altered or fully generated. Various AI technologies were used — from voice synthesis layered over real footage to full deepfakes with completely generated visuals and audio.
Most of the content in the sample showed signs of audio manipulation. Real news reports were re-voiced with synthetic audio generated to push a specific message, storyline, or narrative. Sometimes the artificial voice mimicked well-known Ukrainian TV hosts, reinforcing the illusion of authenticity.

Such audio manipulations often give themselves away: unnatural pronunciation, incorrect stress patterns, strange pauses, and broken speech rhythm are all signs of machine generation.
Among the journalists whose likenesses were used most frequently were: Solomiia Vitvitska, Anastasiia Daugule, Alla Mazur, Nataliia Ostrovska, Iryna Prokofieva, Liliia Naliagaka, Marichka Padalko, Anastasiia Mazur, Olena Morozova, and Nataliia Moseichuk.
The Texty.org.ua team even created a fake video of Valeriia Pavlenko, the host of their YouTube channel, to demonstrate how these technologies work.
Female journalists are associated with trustworthiness, and recognizable faces help increase reach. But fake videos damage the real journalist’s reputation: people begin attacking her for things she never said.

Fake videos are usually used to spread anti-Ukrainian and harmful narratives, but sometimes they simply promote subscriptions to certain channels. The most frequent calls to action in these videos are to sign a petition or complete a survey — 268 cases (45%). In practice, this means directing users to external sites, encouraging them to subscribe to fake accounts, or tricking them into providing personal data to malicious actors.
For example, “popular TV presenters” talk about a petition that will:
- return Valerii Zaluzhny to the post of commander-in-chief;
- lead to Zelensky’s resignation or make the pro-Russian politician Yevhen Murayev president;
- send all police officers and military recruitment office employees to the front;
- help confiscate the property of top officials or of Yulia Tymoshenko;
- force Member of the Ukrainian Parliament Mariana Bezuhla to go to the front line.
At first glance, these messages seem random. However, their structure clearly repeats the usual tactics of Russian disinformation: appealing to emotions, stirring up controversy, and undermining trust in the state.
For example, the question of Valerii Zaluzhny’s possible return to the post of commander-in-chief, though it has no basis in reality, provokes controversy and strong emotions. The promotion of Yevhen Murayev is predictable: he is a figure whom Russian media have repeatedly tried to legitimize.
Another claim, that all police officers should be sent to the front, also sounds unrealistic. Yet it is actively promoted by sources that spread Russian narratives in Ukraine and, unfortunately, resonates with part of society.
The idea of confiscating the property of high-ranking officials is another tool for increasing social polarization, designed to separate the state from its citizens.
Nearly every fourth AI-generated video we found exploits female journalists’ images to achieve a similar goal — collecting users’ data and boosting audiences for deceptive platforms through lies about social assistance. We identified 145 such videos (24%).

A fake AI-generated video shows journalists Nataliia Ostrovska and Iryna Prokofieva (presenters on the 1+1 TV channel) claiming that Ukrainians can receive allowances, pensions, or payments if they fill out the appropriate forms. The appeals are often targeted at specific categories: pensioners, parents, teachers, and those who remained in the country after the start of the full-scale invasion. The payments are allegedly made by various organizations, including the UN, the Red Cross, the EU, Canada, the US, state banks, and the president himself.
Almost 20% of the analyzed videos (116) contain harmful or anti-Ukrainian narratives. Most often, this content concerns the horrors of mobilization, illegal actions by law enforcement agencies, and the discrediting of the government and the military-political leadership. Such videos claim that Ukraine’s neighbors are allegedly planning to occupy the western regions, or that peace will come in the near future — all a viewer has to do is subscribe to a social media account to learn the date.
Speculation on current events happens instantly. For example, on July 19, 2024, political and public figure Iryna Farion was killed, and within a few days, AI-generated voices superimposed on real news reports were spreading fake news on TikTok that the killer was a relative of Farion, that the wrong person had been arrested, or that there was real footage of the tragedy.
The structure of nearly all AI-generated videos featuring Ukrainian female journalists clearly mirrors typical Russian disinformation tactics: emotional manipulation, provoking conflict, and undermining trust in the state.
Read the full investigation on Texty.org.ua.
- Women in Media NGO documents online attacks on Ukrainian women journalists on an interactive map. If you have faced online violence because of your professional activity and wish to share your story, please fill out this online form or contact us at ngo.womeninmedia@gmail.com.
This material was prepared by Texty.org.ua and the NGO “Women in Media” in partnership with UNESCO and with support from Japan. The authors are responsible for the choice and presentation of the facts contained in the material, as well as for the views expressed therein, which do not necessarily reflect the position of UNESCO and do not obligate the Organization in any way.