AI Is Causing Cultural Stagnation, Researchers Find

Generative AI relies on a massive body of training material, primarily made up of human-authored content haphazardly scraped from the internet.

Scientists are still trying to better understand what will happen when these AI models run out of that content and have to rely on synthetic, AI-generated data instead, closing a potentially dangerous loop. Studies have found that AI models start cannibalizing this AI-generated data, which can eventually turn their neural networks into mush. As the AI iterates on recycled content, it starts to spit out increasingly bland and often mangled outputs.

There’s also the question of what will happen to human culture as AI systems digest and produce AI content ad infinitum. As AI executives promise that their models are capable enough to replace creative jobs, what will future models be trained on?

In an insightful new study published in the journal Patterns this month, an international team of researchers found that a text-to-image generator, when linked up with an image-to-text system and instructed to iterate over and over again, eventually converges on “very generic-looking images” they dubbed “visual elevator music.”

“This finding reveals that, even without additional training, autonomous AI feedback loops naturally drift toward common attractors,” they wrote. “Human-AI collaboration, rather than fully autonomous creation, may be essential to preserve variety and surprise in the increasingly machine-generated creative landscape.”

As Rutgers University professor of computer science Ahmed Elgammal writes in an essay about the work for The Conversation, it’s yet another piece of evidence that generative AI may already be inducing a state of “cultural stagnation.”

The recent study shows that “generative AI systems themselves tend toward homogenization when used autonomously and repeatedly,” he argued. “They even suggest that AI systems are currently operating in this way by default.”

“The convergence to a set of bland, stock images happened without retraining,” Elgammal added. “No new data was added. Nothing was learned. The collapse emerged purely from repeated use.”

It’s a particularly alarming predicament considering the tidal wave of AI slop drowning out human-made content on the internet. While proponents of AI argue that humans will always be the “final arbiter of creative decisions,” per Elgammal, algorithms are already starting to float AI-generated content to the top, a homogenization that could greatly hamper creativity.

“The risk is not only that future models might train on AI-generated content, but that AI-mediated culture is already being filtered in ways that favor the familiar, the describable and the conventional,” the researcher wrote.

It remains to be seen to what degree existing creative outlets, from photography to theater, will be affected by the advent of generative AI, or whether human creativity and AI can coexist peacefully.

Nonetheless, it’s an alarming trend that needs to be addressed. Elgammal argued that to stop this process of cultural stagnation, AI models need to be encouraged or incentivized to “deviate from the norms.”

“If generative AI is to enrich culture rather than flatten it, I think systems need to be designed in ways that resist convergence toward statistically average outputs,” he concluded. “The study makes one thing clear: Absent these interventions, generative AI will continue to drift toward mediocre and uninspired content.”

More on generative AI: San Diego Comic Con Quietly Bans AI Art

The post AI Is Causing Cultural Stagnation, Researchers Find appeared first on Futurism.