Study finds 75% of Facebook shares are made without reading the content

A new study has found that most social media users share links without clicking on them first, relying only on headlines and short summaries. The analysis, which examined over 35 million public Facebook posts, found that around 75% of shares occurred without users engaging with the full content. Notably, political content—especially from both extremes of the ideological spectrum—was more likely to be shared without being clicked than neutral content. The findings have been published in Nature Human Behaviour.

The researchers aimed to understand how and why people share content on social media without reading it first. Social media platforms thrive on sharing, a behavior that drives engagement and allows content to go viral. However, the ease and speed of sharing mean users often act impulsively, spreading links based on superficial cues like headlines or the number of likes. This behavior can inadvertently contribute to the dissemination of misinformation, particularly in the political sphere. Previous research has suggested that people often form opinions from short snippets, creating an illusion of knowledge without truly understanding the content.

“The inspiration for our research is in understanding the phenomenon of sharing, which in my mind is the single most influential action on social media. Not only does sharing result in the multiplicative effect of information spreading through networks of individuals, it has in recent years fueled the epidemic of online misinformation,” said corresponding author S. Shyam Sundar, Evan Pugh University Professor and the James P. Jimirro Professor of Media Effects at Penn State.

“I have been interested in fellow online users acting as de facto communication sources ever since my dissertation back in 1995. With sharing features of social media, the ability of ordinary people to serve as sources of news and public affairs information has dramatically increased. What most people do not realize is that their friends and family on social media do not have the journalistic training to vet facts and double-check them before disseminating. We tend to be swayed by whatever they share.”

“In my lab group, we have long been interested in studying how deeply online users process information, how much thought they put into what they read and forward on social media and mobile phones,” Sundar told PsyPost. “So, when the opportunity arose to study sharing on a large scale with the URL Shares dataset released by Meta, which is the largest social science dataset ever assembled, we were obviously interested in exploring the sheer volume of the phenomenon of sharing without clicking, which is an indicator of the superficiality of information processing.”

To investigate this phenomenon, the team analyzed a massive dataset provided through Facebook’s collaboration with Social Science One. The dataset included billions of interactions with over 35 million URLs shared on Facebook from 2017 to 2020, drawn from the top 4,617 domains (such as CNN, Fox News, and The New York Times).

The researchers examined two main areas: the frequency of “shares without clicks” and the patterns of political content sharing. The data were separated into political and non-political content using a machine learning classifier trained to identify politically relevant keywords. Political content included URLs tied to elections, candidates, and other partisan topics, while non-political content ranged from entertainment to general news.
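The study used a trained machine learning classifier for this separation; as a rough illustration of the keyword-based idea behind it, a minimal sketch might look like the following (the keyword list and headlines are invented, not from the study):

```python
# Simplified illustration of keyword-based political classification.
# The actual study used a trained machine learning classifier; this
# sketch only flags text containing politically relevant keywords.

POLITICAL_KEYWORDS = {
    "election", "senate", "congress", "candidate",
    "democrat", "republican", "ballot", "campaign",
}

def is_political(headline: str) -> bool:
    """Return True if any political keyword appears in the headline."""
    words = (w.strip(".,!?") for w in headline.lower().split())
    return any(word in POLITICAL_KEYWORDS for word in words)

headlines = [
    "Senate candidate launches new campaign ad",   # political
    "Ten easy weeknight dinner recipes",           # non-political
]
labels = [is_political(h) for h in headlines]
# labels -> [True, False]
```

A real classifier would learn weighted features from labeled examples rather than rely on a fixed keyword list, but the basic input-to-label mapping is the same.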

The team analyzed users’ sharing behaviors across different political leanings—liberal, neutral, and conservative—and examined whether users’ ideological alignment with the content influenced their likelihood of sharing it without clicking. They also looked specifically at fact-checked URLs to identify patterns in the spread of misinformation.

Across all 35 million URLs analyzed, approximately 75% of shares occurred without the users clicking on the link to view its full content. This trend was even stronger for political content, particularly at the ideological extremes. The spread of misinformation was particularly concerning. Fact-checked URLs identified as false were more likely to be shared without being clicked than true content.
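Because the dataset reports aggregated counts per URL rather than individual user actions, a share-without-click rate of this kind can be computed from per-URL totals. The sketch below uses invented numbers and a simplified measure (shares in excess of clicks), not the study's actual counting method:

```python
# Hypothetical aggregated counts per URL (all numbers invented).
urls = {
    "example.com/story-a": {"shares": 1200, "clicks": 300},
    "example.com/story-b": {"shares": 800,  "clicks": 250},
}

def share_without_click_rate(shares: int, clicks: int) -> float:
    """Simplified estimate: treat shares beyond the click count as
    shares made without clicking the link first."""
    if shares == 0:
        return 0.0
    return max(shares - clicks, 0) / shares

for url, counts in urls.items():
    rate = share_without_click_rate(counts["shares"], counts["clicks"])
    print(f"{url}: {rate:.1%} of shares without a click")
# example.com/story-a: 75.0% of shares without a click
# example.com/story-b: 68.8% of shares without a click
```

The study itself had direct counts of shares without clicks per URL; this arithmetic only illustrates how an aggregate rate is derived from totals.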

“A key takeaway is that most of the shared links we encounter on Facebook are shared without first being read by the person sharing them,” Sundar explained. “This tells us that social media users are simply glancing at the headline and the blurb when deciding to blast a news link to their networks. Such dissemination can have a multiplicative effect and result in rapid spread of information to millions of folks online. This can result in virality of misinformation, spreading fake news and conspiracy theories.”

The researchers observed another clear pattern: the more politically extreme the content, the more likely it was shared without being clicked. This trend held true for users across the political spectrum. In other words, whether content was strongly liberal or conservative, it attracted more superficial sharing compared to neutral content.

Users were more likely to share content that aligned with their political beliefs. For example, liberals were more likely to share left-leaning content without clicking, while conservatives were more likely to share right-leaning content. This suggests that users rely on headlines that confirm their existing biases, potentially bypassing the need to engage with the full content.

“The more politically extreme the content is, the more it is shared without being clicked upon first,” Sundar told PsyPost. “This is true for both extreme left and extreme right. As we know, there tends to be a lot of strong opinions and biased commentary on the extremes of the political spectrum. As such, there is more scope for fake news and conspiracy theories masquerading as legitimate news in politically extreme news domains.”

“In the dataset we accessed, there were 2,969 URLs that were fact-checked by a third party and determined to be false. The vast majority of these links were from conservative news domains and so unsurprisingly, we found that conservatives were five times more likely than liberals to share these links, most often without clicking on them and reading the false stories first. This suggests that if politically partisan users see a headline that seems aligned with their political ideology, they will readily share the story without bothering to verify if it is really true.”

The study highlights a concerning trend in how social media users interact with content. But it does have limitations. The analysis relied on aggregated data, meaning the researchers could not observe individual users’ behaviors directly. Some shares without clicks might still reflect deliberate actions—for example, resharing familiar content without revisiting it.

Additionally, the study focused only on Facebook, so it remains unclear whether similar patterns exist on other platforms like Twitter or Instagram. Future research could explore these behaviors on a broader scale and examine how different devices, such as mobile phones versus computers, influence users’ sharing habits.

The researchers suggest that these findings have significant implications for both social media platforms and users. Social media interfaces could be redesigned to encourage more deliberate sharing. For instance, platforms could implement prompts reminding users to read an article before sharing it or provide indicators showing whether a link has been clicked. These interventions could reduce the spread of misinformation and promote more thoughtful engagement with news content.

“If platforms implement a warning that the content might be false and make users acknowledge the danger in doing so, that might help people think before sharing,” Sundar said.

The study, “Sharing without clicking on news in social media,” was authored by S. Shyam Sundar, Eugene Cho Snyder, Mengqi Liao, Junjun Yin, Jinping Wang, and Guangqing Chi.