
What Is Real World Evidence? A Guide to RWE in Clinical Research

Learn what real world evidence is, how it differs from real world data, where RWD comes from, and how regulators and research teams use RWE in practice.
Infographic showing real-world evidence data sources in clinical research

Real world evidence, or RWE, is the clinical insight researchers build from healthcare data collected outside a conventional randomised controlled trial. The underlying data comes from electronic health records, insurance claims, patient registries, wearables, and participant responses captured through mobile apps. Together these sources answer the questions controlled trials alone cannot reach, from how a therapy performs in everyday practice to how rare adverse events surface over time.

Real world evidence sits next to real world data (RWD) in almost every modern research conversation, and the two terms get confused constantly. RWD is the raw source. RWE is the interpreted finding that can support a regulatory, clinical, or payer decision.

If you've ever asked what real world evidence actually is, you've probably hit fuzzy definitions, vendor pitches, and confusion between RWE and RWD. This guide keeps it simple. We'll cover what RWE is, how it differs from RWD, where the data comes from, how regulators and research teams use it, and what a credible RWE programme looks like in 2026.

The stakes have climbed. The US Food and Drug Administration's (FDA) 2018 framework and the European Medicines Agency's (EMA) DARWIN EU network moved RWE from optional to standard practice in drug development, post market surveillance, and payer decisions. Our team at WeGuide has partnered with research groups generating app and wearable RWD for more than a decade, so this guide draws on what actually works in the field.

What is real world evidence?

Real world evidence is clinical evidence about a medical product's safety, effectiveness, or use, drawn from data collected outside a conventional randomised controlled trial. Researchers, regulators, and payers rely on RWE to answer questions that tightly controlled trials cannot.

The FDA defines RWE as "clinical evidence regarding the usage and potential benefits or risks of a medical product derived from analysis of real-world data." The EMA takes a similar position. Both agencies accept RWE as supporting evidence for new indications, post market safety, and pragmatic trial design, provided the data meets fit for purpose standards.

The definition is the easy part. What trips teams up is the line between RWE and RWD.

Real world evidence vs real world data: what's the difference?

Think of it this way. Real world data is the ingredient. Real world evidence is the dish. RWD is raw: records, signals, survey responses, claims. RWE is the analysis, the study, the conclusion you can act on.

| Real world data (RWD) | Real world evidence (RWE) |
| --- | --- |
| Routine healthcare data | Clinical insight or conclusion |
| Examples: EHR, claims, registries, wearables, patient apps | A study or analysis built on RWD |
| Raw source material | Interpreted finding |
| Does not answer a research question on its own | Answers a specific clinical, regulatory, or payer question |

A concrete example. A hospital's electronic health records (EHR) contain millions of rows of RWD. When a research team analyses those records to compare outcomes between two diabetes therapies in patients over 65, the resulting study is RWE. Same with a wearable. A smartwatch generates RWD every minute. A study using that data to track post treatment heart rhythm in paediatric cancer patients is RWE.

Want to see what a modern RWE setup looks like? Take a quick look at WeGuide's real world evidence platform to see how RWD and participant reported data sit in one place.

Where does real world data come from? Five main sources

Most top ranked explainers list three or four RWD sources. In practice, RWD sits in five buckets, and each has a different strength.

  1. Electronic health records (EHR). The richest source, including clinical notes, lab results, diagnoses, prescriptions, and vital signs. Limited by format variation between systems and uneven coverage across populations.
  2. Medical claims and billing data. Captures most insured interactions across multiple providers. Strong on diagnoses and procedures. Weaker on clinical nuance and outcomes.
  3. Patient registries and disease cohorts. Purpose built collections tracking specific conditions or populations, often including patient reported outcomes and long term follow up. Excellent for rare disease and post market safety work.
  4. Wearables and connected devices. Continuous signals like heart rate, step count, sleep, blood glucose, and inhaler use. Rich, granular, and anchored to the participant. Limited by device access and tech literacy, which can skew the sample.
  5. Patient generated data through apps and electronic patient reported outcomes (ePRO). Survey responses, symptom diaries, eConsent records, telehealth interactions, and adherence logs captured directly through a participant app. The closest source to the patient's lived experience, and increasingly central to regulatory submissions.

WeGuide projects most often work with sources three, four, and five, sometimes connected to EHR through our integration engine. A good RWE strategy picks the source that fits the question, not the one that's easiest to access.

How is real world evidence used?

RWE shows up in almost every part of modern drug and device development.

Regulatory decisions

Regulators accept RWE for new indications, expanded labels, and device clearances when the data meets fit for purpose standards. FDA guidance sets out expectations for study design, data quality, and reporting. The EMA's DARWIN EU network provides coordinated data sources across European member states.

Post market safety surveillance

RWE is how we watch drugs and devices behave after launch. It complements spontaneous adverse event reporting with systematic longitudinal data from claims, EHRs, and registries.

Health economics and payer decisions

Payers and health technology assessment bodies like Australia's Pharmaceutical Benefits Advisory Committee (PBAC) and Canada's CDA-AMC use RWE to judge cost effectiveness in routine practice. Reimbursement often turns on it.

Trial design and external control arms

Sponsors use RWE to shape trial inclusion criteria, pick endpoints, and in some cases replace placebo arms with a synthetic control built from RWD. That matters most in rare diseases where recruiting a placebo arm is neither ethical nor practical.

Label expansion

Once a product is approved, teams often need evidence for new patient groups, new doses, or real world effectiveness. RWE builds that evidence without running a fresh interventional trial.

Real world evidence vs clinical trials

RWE is not a replacement for randomised controlled trials (RCTs). It complements them.

| | Randomised controlled trial | Real world evidence study |
| --- | --- | --- |
| Population | Narrow, selected | Broad, routine care |
| Design | Randomised, blinded, controlled | Observational or pragmatic |
| Internal validity | High | Variable |
| External validity | Often limited | Often high |
| Time to answer | Months to years | Often faster once data is flowing |
| Cost | High per participant | Varies by data source |
| Best for | Causal claims, regulatory approval | Generalisability, long term follow up, rare events, post launch questions |

A new cancer therapy still needs an RCT to earn approval. But once it's on the market, RWE answers the questions the RCT couldn't: does the drug work in people over 75 with multiple comorbidities? Does adherence hold up outside a trial setting? Are there rare adverse events that only surface after 50,000 exposures? That's the useful pairing. RCTs answer "can it work." RWE answers "does it work, for whom, and for how long."

Examples of real world evidence in action

Theory is helpful. Real trials make the concept concrete. Three examples from WeGuide projects show how app, wearable, and registry RWD turn into usable evidence.

GenV: a cohort study of over 100,000 Australian families

GenV is one of the largest population cohort studies ever attempted in Australia. Researchers at the Murdoch Children's Research Institute set out to follow more than 100,000 families from newborn to adulthood, tracking health, development, and wellbeing. Every few weeks, parents answer short surveys through the WeGuide participant app. Over time, those responses become RWE on child development, chronic disease risk, and service use at population scale. No randomised trial could match that breadth or duration.

FindAir smart inhaler: automated respiratory RWD

Self reported inhaler use is famously unreliable. When WeGuide partnered with FindAir to connect their smart inhaler directly into the respiratory research platform, a different picture emerged. Every actuation is time stamped and logged automatically, so researchers see when participants actually use their inhaler, not when they remember to note it later. That's a cleaner signal for adherence, exacerbation timing, and medication effectiveness, and it arrives continuously rather than in weekly diary chunks.

Beat2Beat: Apple Watch for paediatric cardiac monitoring

Some cancer therapies can damage a child's heart. The Beat2Beat study at Monash Children's Hospital uses Apple Watch and the WeGuide app to monitor heart rhythm during and after treatment, so clinical teams can see patterns that only appear at home. The result is continuous, participant level RWD that a weekly ECG in clinic would miss. Clinicians get earlier warning signs, and researchers build evidence that could reshape cardiac monitoring guidelines.

Ready to see how app generated RWD flows into analytics? Our analytics dashboard turns continuous participant data into study insights your team can act on.

How regulators use real world evidence: FDA, EMA, TGA

Regulators have moved quickly on RWE in the past decade. Any team running global studies needs to know where three agencies stand.

FDA

The 21st Century Cures Act (2016) required the FDA to create a framework for using RWE in approval and post market decisions. The agency published that framework in 2018 and has issued multiple guidance documents since, including "Considerations for the Use of Real-World Data and Real-World Evidence to Support Regulatory Decision-Making for Drug and Biological Products" (updated 2023). PDUFA VII (2022) committed the FDA to further staff and process investments in RWE review.

EMA

The EMA launched DARWIN EU in 2022, a federated network of data sources across European Union member states. The agency has also published guidance on registry based studies and is steadily clarifying expectations for submission grade RWE.

TGA (Australia)

Australia's Therapeutic Goods Administration leans on RWE for post market monitoring and pharmacovigilance. Local context matters. Australian data sovereignty rules and the Privacy Act shape where and how RWD can be stored and processed. A good platform supports these requirements rather than forcing workarounds.

Regulatory note: RWE acceptance varies by decision type and jurisdiction. Consult your regulatory adviser before relying on RWE for a specific submission.

Common challenges with real world evidence

RWE has limits. Pretending otherwise is the fastest way to lose a regulator or a reviewer.

  • Confounding and selection bias. People who receive a treatment differ from those who don't, and those differences often explain outcomes more than the treatment itself. Careful study design and statistical adjustment help, but they don't erase the problem.
  • Missing and inconsistent data. EHR systems vary between sites. Claims data misses clinical nuance. Registries depend on consistent entry. Patient apps rely on engagement. Every source has gaps, and honest handling of those gaps is part of the work.
  • Regulatory acceptability varies. A given dataset might support a post market safety decision but not a new indication. Fit for purpose is the real question, not "is this RWD good enough in general."
  • Standards and interoperability. The Observational Medical Outcomes Partnership (OMOP) common data model and Fast Healthcare Interoperability Resources (FHIR) help data move between systems, but adoption is uneven and costly at scale.
  • Patient privacy and data governance. Strong consent, clear governance, and secure storage aren't optional. Australian and European regulations are particularly firm about cross border data flows and participant rights.

Naming these constraints openly builds trust with regulators, ethics committees, and participants. Glossing over them costs credibility later.

Building a real world evidence strategy

A credible RWE programme answers five questions in order.

  1. What decision are you trying to support? Start with the regulator, payer, or clinical question. The decision shapes the evidence, not the reverse.
  2. Which data source fits? Match the question to the source. Label expansion in a rare disease probably needs a patient registry and patient reported outcomes, not claims. A safety signal in a common drug probably lives in claims or EHR data.
  3. Is the data fit for purpose? Check relevance, reliability, completeness, and timeliness. ISPOR Good Research Practices and the FDA's fit for purpose criteria are the working standards.
  4. How will you engage participants? App generated RWD only works when participants stay engaged across months or years. Behavioural science, multi language support, and simple workflows matter more than long feature lists.
  5. What platform will carry it? Look for a system that collects, stores, and analyses data across multiple sources, respects consent, and supports the regulatory context you work in. That's where WeGuide's real world evidence platform earns its place.

Teams that work through those five questions produce evidence a regulator will actually accept. Teams that skip them produce RWD without useful RWE.

Conclusion

Real world evidence is clinical insight drawn from the data of routine healthcare and participant driven research. It complements randomised controlled trials and, increasingly, informs regulatory, clinical, and payer decisions across the US, Europe, and Australia.

Four ideas worth keeping:

  • RWD is the ingredient. RWE is the dish you cook from it.
  • The five main RWD sources are EHRs, claims, registries, wearables, and patient generated data from apps and ePRO.
  • The FDA, EMA, and TGA each accept RWE for specific decisions when the data is fit for purpose.
  • Honest acknowledgement of confounding, missingness, and governance builds more credibility than overselling.

Research teams earn the strongest real world evidence when they plan from the decision back to the source, engage participants well, and pick a platform that respects data quality and consent. Book a demo of WeGuide to see how app and wearable generated RWD can power your next study.
