Health misinformation is spreading across social groups, deepening distrust in doctors, and accelerating the use of artificial intelligence for medical decisions, according to a global survey that points to growing difficulty in identifying reliable sources amid information overload and eroding public trust.
Seven out of ten people worldwide believe at least one of six false or contested statements about health to be true, according to the 2026 Edelman Trust Barometer Special Report on Trust and Health, a survey based on interviews with over 16,000 people across 16 countries.
Contrary to a perception that has taken hold in recent years, the data indicate that medical misinformation does not circulate only among radicalized groups, people with little formal education, or heavy social-media users; it reaches virtually every segment of the population.
Richard Edelman, CEO of the company responsible for the research, told Fortune that the study dismantles the idea that only people more distrustful of traditional science question widely accepted medical consensuses.
“It’s quite an impressive set of data,” said Edelman.
In his assessment, there was an expectation that the highest levels of skepticism would be concentrated in specific niches of society. “And it’s not true. It’s everyone,” he said.
Health misinformation crosses social and political barriers
The numbers gathered by the research show that 69% of respondents with a university degree believe in at least one of these statements, a percentage practically identical to the 70% recorded among people without higher education.
When the data is cross-referenced with ideological positioning, the differences also appear less pronounced than expected by the researchers.
Among respondents identified with the right, 78% said they believe in at least one of the contested claims, while the rate reached 64% among people aligned with the left.
The report also highlights that the pattern repeats across different age groups and regions of the world, with higher rates in developing countries than in economies considered wealthier.
Despite the recurring perception that the United States would be the main epicenter of health misinformation, the country does not appear among the most critical in the survey.
Information overload increases the difficulty of trusting sources
For experts linked to the Edelman Trust Institute, the current crisis is not only associated with the absence of reliable information but mainly with the excess of content available on digital platforms and social networks.
Amid contradictory messages and the difficulty of distinguishing reliable sources from misleading content, people’s confidence in their own ability to make informed health decisions dropped 10 percentage points in just one year, reaching 51%.
Meanwhile, the media’s credibility in addressing health topics remains below the levels recorded before the COVID-19 pandemic, a scenario that reinforces the feeling of insecurity in the face of the growing volume of information.
“People are overwhelmed with information, and I’m not sure if they can differentiate one source from another,” said Edelman.
According to Jennifer Hauser, Global Health Chair at Edelman, the excessive amount of available data may end up complicating medical decisions that previously seemed simpler for a large part of the population.
Artificial intelligence gains ground in medical decisions
In this environment of uncertainty and excess of informational stimuli, artificial intelligence has come to occupy an increasingly larger space in the routine of people seeking quick answers to questions related to their own health.
The report indicates that 35% of respondents already use some AI tool to manage their own health, whether to clarify symptoms, obtain immediate information, or seek a second opinion on received diagnoses.
Among those who already use artificial intelligence resources for medical topics, 84% said they resort to technology to receive quick answers, while 74% stated they use it as complementary support in the face of clinical diagnoses.
Experts express concern because some respondents attribute to AI functions traditionally associated with medical training and qualified professional evaluation.
According to the research, 64% believe that a person fluent in artificial intelligence could perform at least one medical task as well as, or better than, a trained doctor.
In this group, 21% cited prescribing treatments and medications, while 17% mentioned diagnosing diseases.
Relationship between patients and doctors faces strain
Even with the rapid advancement of artificial intelligence, personal doctors continue to be identified as the most reliable sources of health information in the 16 markets analyzed by the global survey.
Even so, the report identifies clear signs of strain in the relationship between patients and professionals, especially when people report feeling that their questions are met with judgment or little openness to dialogue.
Jennifer Hauser stated that some respondents see AI as a less critical alternative and, in some cases, more welcoming during the search for medical guidance.
“AI can be less judgmental than doctors,” she said.
In her assessment, many patients do not expect to find absolute authority figures in consulting rooms, but professionals capable of guiding complex decisions in an accessible and transparent manner.
In the United States, this loss of trust also appears in other recent surveys.
A study published in JAMA Network Open showed that trust in doctors and hospitals fell from 71.5% in April 2020 to 40.1% in January 2024.
Costs and difficulty of access accelerate behavior change
In addition to the trust crisis, specialists associate the population’s behavior change with the increasing difficulties in accessing medical care and the high costs of health systems.
Data released by West Health in partnership with Gallup indicated, in 2025, that 35% of adults in the United States said they could not access quality and affordable healthcare if needed at that moment.
According to researchers, the impact is even stronger among Black and Hispanic adults and low-income individuals, groups that report greater vulnerability in the face of rising medical expenses.
In this context, there is a growing search for answers outside the traditional health system, including research on social networks, content produced by influencers, and automated tools based on artificial intelligence.
A survey by KFF released in January 2026 showed that 66% of American adults were concerned about the ability to pay medical expenses for themselves and their families.
By April, a new round of the survey recorded a rate of 64%, keeping healthcare costs among the main financial concerns of households.
Experts advocate a new form of scientific communication
In the assessment of researchers linked to Edelman, combating health misinformation cannot start from the premise that only certain social groups would be more susceptible to false or misleading content.
Justin Blake, executive director of the Edelman Trust Institute, stated that the main contribution of the survey is precisely in demonstrating that the public affected by divisive beliefs is much broader than previously imagined.
The data suggest that campaigns based solely on factual corrections tend to fall short when factors such as fear, resentment, social isolation, and a sense of institutional abandonment begin to influence individual decisions.
Dave Bersoff, head of research at the institute, related this process to a gradual erosion of the social fabric and increased distrust among groups with different views on public issues.
Richard Edelman also advocated changes in the way science communicates with the population, arguing that isolated technical responses no longer produce the same trust effect observed in previous years.
“For years, science was about the ‘what’,” he stated. “In the next phase, scientists will have to talk about the ‘why’ and the ‘how’.”
According to experts interviewed in the report, rebuilding trust bonds requires acknowledging legitimate doubts, increasing the clarity of medical guidelines, and preventing false information from continuing to interfere with decisions that can directly affect public health.