How artistic AI-generated images are being deceptively used to harvest engagements

On April 26, 2024, Goal Africa, a verified account on Facebook, posted a photo of what looked like a beautiful carving of Portuguese football star Cristiano Ronaldo, with a young African boy standing beside it. The social media page of the renowned international sports media outlet Goal.com captioned the photo: “This young lad carved Cristiano Ronaldo out of limestone”.

The photo garnered over 96,000 reactions and an equally high number of comments and shares. Many commenters commended the supposed talent of the young boy, but others figured out that the photo was not real. It was AI-generated. Many fact-checking organisations across Africa debunked the post. Today, Meta has limited the post’s visibility and tagged it with a fact-check report from one of the organisations under its Third-Party Fact-Checking Programme. Users must therefore view the fact-check report before they can access the post.

But the post from Goal Africa is not isolated. It is part of a new trend on social media. Digital media platforms have recently been awash with AI-generated photos being passed off as artistic pieces of work to harvest engagements and reactions. To add to the spectacle, the so-called artistic works are often attributed to young African boys who are positioned beside them. These photos are published without any tag or indication that they are synthetic, misleading many on social media.

Drawing on the love for automobiles, animals, politics and football

A search conducted by Fact-Check Ghana identified several social media pages and groups where such AI-generated images are produced and published. While the exact AI platforms used to generate these images are not known, given the proliferation of such tools, there appear to be consistent themes under which the images are being generated.

Analysing many of such posts, the team identified the following themes used in creating such AI-generated images:

Automobiles – A considerable number of the images display wooden carvings of cars or clay-moulded vehicles. Others display aeroplanes and trains supposedly made from plastic bottles. These purported plastic vehicles appear designed to leverage the global climate change conversation, which encourages reusing plastics rather than dumping them.

A train supposedly built from plastic

Animals – The trending images have also focused on animals. The team spotted several images of monkeys, cats, bears, and badgers supposedly created from grass and clay. One intriguing image showed a cat supposedly made from pieces of the edible parts of an orange.

Politics and prominent personalities – Some of the images are also dedicated to popular local and international political figures and business and entertainment personalities.

Former US President Barack Obama purportedly carved by a boy

Football stars – Feeding into the debates on social media about who is the greatest of all time (GOAT), some of the artistic AI-generated images keep presenting football stars such as Lionel Messi and Cristiano Ronaldo as the GOATs.

The objective of sharing these images, Fact-Check Ghana realised, is to garner reactions and engagements. The purveyors therefore draw on topics of public interest on social media that text-to-image platforms can readily depict.

The pages and groups misleading people with synthetic artistic images

Using keywords and phrases like “Bestboy talents”, “wood carvings” and “talented boy”, Fact-Check Ghana observed random posts from accounts on social media platforms, especially Facebook, that have published such artistic AI-generated images. (The key phrases were derived from some of the posts the team had noted prior to the search.) Some of the posts, on X for instance, were published in the comment sections of other users’ posts. However, a few accounts and groups have regularly published artistic AI-generated photos.

On Facebook, we observed the following accounts:

  • “It’s great we’re all interested” (a page with 141,000 followers)
  • “MRK Decor Designd” (a page with 79,000 followers)
  • “Just Decoration” (a group with 6,600 followers)
  • “Bestboy Talents” (a group with 62 followers)
  • “Bestboy Talents” (a page with 0 followers but many AI-generated posts published)
  • “Bestboy Talents” (a group with 5 followers)

On X, we observed two accounts:

  • @captainshawlar (573 followers)
  • @tahkeek95 (1,006 followers)

The bio pages of all the accounts on both Facebook and X indicate they were established for purposes other than sharing AI-generated photos, yet they consistently post such synthetic images to engage their followers.

The profile of MRK Decor Designd on Facebook says the page deals in home decor

For instance, the profile of “MRK Decor Designd” on Facebook states that it is into “Home decor · Interior Design Studio · Set Decorator”, and the page has indeed published many photos of home decorations. Also, the bio of @tahkeek95 on X indicates the account is into politics, cricket, and motivation, among other things.

The behaviour of these accounts on the digital platforms further supports the observation that they publish AI-generated images to deceive users into engaging with their content so that their pages gain prominence.

Inauthentic behaviour

Meta describes it as “inauthentic behaviour” when a user of its platforms (Facebook and Instagram) misleads others “about the source or origin of content”, among other things. The media technology organisation therefore considers posting artistic AI-generated photos to deceive others into engagement a violation of its community standards, one deserving punitive action.

Racially stereotyping the African boy

A running trend Fact-Check Ghana noted was that the images featured young black or African boys planted within them to convey the idea that they created the artistic work. Also notable was how the boys were framed in relation to the artwork.

Most of the images showed wretched and shabbily dressed black children against a background that depicted a rural, poor and relatively low-quality life. The surrounding environment behind the boys showed makeshift brick buildings, corroded roofing sheets and dirty communities.

By contrast, on the few occasions when the images featured white boys, they were presented in finer clothing against more pleasing backgrounds. The immediate environments of the images with white boys showed clean and developed communities.

“The machine learning AI platforms can be discriminatory. They have a penchant to be racist,” Adwoa Adobea-Owusu, a leading investigative journalist in Ghana, said.

“How the black boys positioned by these AI-generated images are presented further perpetuates the stereotyping of some of the Western media and tech organisations. They seem to suggest this is how creative black children live. And this is to say, their living conditions are poor but their output is great so appreciate them.”

Adwoa’s comment comes on the back of a recent apology rendered by Google for the “inaccuracies” from its Gemini image generation platform. The tech giant apologised for its platform producing inaccurate and offensive images of specific genders, races and historical figures.

How to spot such deceptive artistic AI-generated images

Having reviewed hundreds of such artistic AI-generated images, Fact-Check Ghana outlines the following tips for spotting them:

  • Look out for how surreal or unrealistic the artistic work is. Some of the works are simply bizarre and impossible to create in reality. For instance, one of the images showed a rooster made from grass, and another a deer sculpted from riverside grass.
  • Spot the malformed fingers and toes. While some of the newer text-to-image platforms are generally getting better and no longer producing four or six fingers on human hands, they still struggle to get the shape of fingers and toes right, especially when they rest on surfaces. They also struggle to render correctly how fingernails fit onto the tips of fingers.
  • Zoom into the faces of the humans in the background. When the platforms attempt to generate a crowd of people in the background, they struggle to produce clear, distinct human faces, often ending up with what look like mutilated faces.
