Here is a finding that surprises most people the first time they come across the ePRO vs paper PRO question. A 2022 head to head study published in JMIR Formative Research calculated the per participant cost of an ePRO assessment at around $56, while the same assessment on paper came in at roughly $171. The electronic version cost about a third as much. And yet paper questionnaires still show up in a large share of active trials across Australia, the UK and the US.
Why the gap between what the evidence shows and what teams actually do?
In most cases, it is not because paper is better. It is because switching feels risky, the instrument you need has always been on paper, or the last ePRO rollout you saw was clunky. This article walks through the ePRO vs paper PRO question the way we think about it at WeGuide. We will cover what the peer reviewed evidence actually says on cost, compliance and data quality, where paper still has a legitimate role, and how to decide for your next protocol. You will leave with a practical framework, not a sales pitch.
What are paper PRO and ePRO?
A patient reported outcome, or PRO, is any report about a patient's health that comes directly from the patient, without interpretation by a clinician. A paper based PRO captures that report on a printed questionnaire, typically filled in at a clinic visit or at home and returned by post or at the next appointment. An ePRO, short for electronic patient reported outcome, captures the same instrument electronically, usually through a smartphone app, tablet or browser.
The instrument itself is the same. A validated measure like the EQ-5D, PROMIS or a disease specific tool such as the KOOS does not change when you move it to a phone. What changes is everything around the instrument: how the question is delivered, when it is answered, how the data lands in your dataset, and what happens when a response is missing.
For a broader look at how ePRO fits with clinical outcome assessments more generally, our eCOA vs ePRO guide is a good starting point.
ePRO vs paper PRO at a glance
- Cost per participant: roughly $171 for paper versus $56 for ePRO (Nguyen 2022)
- Data equivalence: electronic versions equivalent or superior in 88.2% of comparisons (ISPOR ePRO Consortium)
- Patient preference: 72% chose electronic after using both modes (SPRUCE trial)
- Compliance: mixed evidence; paper outperformed ePRO on completion in Nguyen 2022
- Data availability: real time with ePRO versus the next monitoring visit with paper
Two notes on this comparison. First, the cost figures from Nguyen and colleagues are specific to their oncology symptom study and will not hold exactly for every trial. Second, compliance is the one dimension where the evidence is genuinely mixed, and we will come back to that below.
What peer reviewed studies show about ePRO vs paper PRO
Vendor blog posts tend to claim ePRO wins on every metric. The published literature is more nuanced, and worth reading carefully before you choose.
Cost and data handling
Nguyen and colleagues (2022, JMIR Formative Research) ran the most widely cited direct cost comparison of ePRO clinical trials data capture. They compared paper and ePRO for symptom monitoring in oncology patients and reported a per participant cost of $171 for paper versus $56 for ePRO. The main drivers of the paper cost were transcription labour, query resolution and the administrative overhead of chasing missing responses.
A separate systematic review summarised by the ISPOR ePRO Consortium found that electronic versions were equivalent or superior to paper in 88.2% of comparisons on measures of data equivalence. That is a strong signal that moving a validated instrument to a phone does not break its psychometric properties, provided the migration is done properly.
Patient preference
The SPRUCE trial reported by Philipps and colleagues (2023) asked participants which mode they preferred after using both. 72% chose electronic. A meaningful minority, however, still preferred paper. That minority tends to be older, less comfortable with smartphones, or living with a condition that makes small screen interaction difficult. A good ePRO programme plans for that minority rather than pretending they do not exist.
Where the evidence is mixed
Here is the nuance that vendor content usually skips. In the same Nguyen 2022 study, paper actually outperformed ePRO on questionnaire completion: participants in the paper arm returned more complete questionnaires at the 6 week mark and over the full study period. That finding does not overturn the cost and data quality picture, but it is an honest reminder that ePRO is not automatically better at keeping people engaged.
Completion rates in any PRO study depend heavily on reminder design, question load, and how the tool fits into the participant's day. A well built ePRO beats paper on compliance in most modern trials. A badly built one can lose to paper.
Five reasons most trials are moving to ePRO
1. Data quality improves through design
Paper questionnaires are easy to fill in wrong. Participants skip items, tick two boxes for one question, or write answers in the margin. A trained coordinator then has to decide what to do with each anomaly during transcription. Every one of those decisions is a potential query, an audit trail entry, and a small erosion of data integrity.
ePRO removes most of that at the source. Required items cannot be skipped without an explicit decline, skip logic hides irrelevant questions, and numeric ranges are enforced. The data that arrives in your dataset is the data the participant actually entered, with timestamps attached. If you want to go deeper on this, our article on the advantages of mobile data collection unpacks the mechanics.
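To make that mechanism concrete, here is a minimal sketch of source level constraints in Python. The item names, rules and structure are illustrative assumptions, not WeGuide's actual configuration; the point is that required items, ranges, skip logic and timestamps are enforced when the answer is entered, not reconstructed during transcription.

```python
from datetime import datetime, timezone

# Illustrative item definitions: required flags, allowed ranges, choices,
# and skip logic enforced at entry time.
ITEMS = {
    "pain_score":       {"required": True, "range": (0, 10)},
    "used_rescue_med":  {"required": True, "choices": {"yes", "no"}},
    # Only shown if the participant used rescue medication.
    "rescue_med_doses": {"required": True, "range": (1, 12),
                         "show_if": ("used_rescue_med", "yes")},
}

def validate(responses: dict) -> dict:
    """Return a cleaned, timestamped record, or raise on invalid input."""
    record = {}
    for name, rule in ITEMS.items():
        cond = rule.get("show_if")
        if cond and responses.get(cond[0]) != cond[1]:
            continue  # skip logic: item not applicable, so never shown
        if name not in responses:
            if rule["required"]:
                raise ValueError(f"{name} is required")
            continue
        value = responses[name]
        if "range" in rule and not rule["range"][0] <= value <= rule["range"][1]:
            raise ValueError(f"{name} out of range")
        if "choices" in rule and value not in rule["choices"]:
            raise ValueError(f"{name} is not a valid choice")
        record[name] = value
    record["submitted_at"] = datetime.now(timezone.utc).isoformat()
    return record
```

In a paper workflow, every one of those checks happens after the fact, by a human, during transcription. Here they happen once, at the source, and the anomaly simply never enters the dataset.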
Our clinical instruments library shows how validated PROs are migrated with these controls built in, rather than bolted on.
2. Cost scales in the right direction
Paper cost per participant stays roughly constant. Every new participant needs another printed pack, another manual entry, another round of chasing. ePRO cost is front loaded in the build and scales efficiently. Once the forms are configured and validated, adding the thousandth participant costs almost nothing.
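A back of the envelope model shows how that front loaded shape plays out. Only the $171 paper figure comes from Nguyen 2022; the ePRO build and marginal costs below are hypothetical assumptions for illustration, and the breakeven point will differ for every study.

```python
PAPER_COST_PER_PARTICIPANT = 171   # Nguyen 2022, oncology symptom study
EPRO_MARGINAL_COST = 10            # hypothetical per participant cost after build
EPRO_BUILD_COST = 20_000           # hypothetical one-off build and validation cost

def total_cost(n_participants: int, mode: str) -> int:
    """Total study cost for n participants under a simple linear model."""
    if mode == "paper":
        return n_participants * PAPER_COST_PER_PARTICIPANT
    return EPRO_BUILD_COST + n_participants * EPRO_MARGINAL_COST

# Find the enrolment size at which ePRO becomes cheaper than paper.
breakeven = next(n for n in range(1, 10_000)
                 if total_cost(n, "epro") < total_cost(n, "paper"))
# With these illustrative numbers, breakeven is 125 participants.
```

Under these assumptions a 30 person study never repays the build, while a 6,000 person study like BRACE repays it many times over, which is exactly the pattern described above.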
Consider the BRACE Trial, which we supported through BCG and the Murdoch Children's Research Institute. The study enrolled more than 6,000 participants across five countries and maintained over 90% adherence to the ePRO schedule. On paper, the data entry labour alone for a study that size would have been substantial. Electronically, it simply flowed into the dataset. You can read the details in our BRACE Trial case study.
3. Real time signals enable earlier action
With paper, you learn what participants reported at the next monitoring visit, or later. With ePRO, you can see a worrying symptom score the day it is entered. For safety critical trials, and increasingly for quality of life monitoring in oncology, that shift from retrospective to prospective visibility changes what the data can do. A clinician can follow up while the episode is still current, not three months after it resolved.
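A same day alert rule can be as simple as a threshold check at submission time. The item names and thresholds in this sketch are hypothetical, chosen purely to illustrate the idea; real alert rules would come from the protocol.

```python
# Hypothetical clinical alert thresholds: flag any score at or above the
# threshold the moment it is submitted, rather than at the next visit.
ALERT_THRESHOLDS = {"pain_score": 8, "nausea_score": 7}

def triage(submission: dict) -> list[str]:
    """Return the items in a submission that warrant same-day follow-up."""
    return [item for item, score in submission.items()
            if score >= ALERT_THRESHOLDS.get(item, float("inf"))]
```

The logic is trivial; what matters is when it runs. On paper, this check happens months later. Electronically, it runs the moment the participant taps submit.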
4. Regulatory alignment is cleaner
The FDA's 2013 Guidance on Electronic Source Data in Clinical Investigations treats electronic capture, or eSource, as the source record, which removes the transcription step and the control point that goes with it. Combined with 21 CFR Part 11 compliant systems, ICH E6(R2) expectations on data integrity, and CDISC aligned data standards, a properly implemented ePRO is usually easier to defend at audit than a paper trail that has been typed into a database by a human.
For multi site and decentralised studies, that advantage compounds. Our decentralised clinical trial overview walks through how electronic source sits inside a broader remote study architecture.
5. Multilingual and accessibility support becomes feasible
A paper study running in English, Mandarin, Arabic, Vietnamese and Greek needs five printed versions, five sets of translation validation, and five separate returns processes. The INHERIT COVID-19 study used WeGuide to deliver questionnaires in multiple languages from a single build, which would have been impractical on paper. See how the multilingual trial platform handled INHERIT.
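Conceptually, a single build with per locale text looks something like the sketch below. The question wording and structure are invented for illustration, not taken from INHERIT: one set of item codes and logic, any number of translations, with a defined fallback.

```python
# Hypothetical single-build questionnaire item: the code and logic are shared,
# only the display text varies by locale.
QUESTION = {
    "code": "fatigue_1",
    "text": {
        "en": "How tired have you felt today?",
        "zh": "您今天感觉有多疲倦？",
        "vi": "Hôm nay bạn cảm thấy mệt mỏi đến mức nào?",
    },
}

def render(question: dict, locale: str) -> str:
    # Fall back to English if the locale has no translation yet.
    return question["text"].get(locale, question["text"]["en"])
```

Because every locale shares one item code, the answers land in one column of one dataset regardless of the language the participant saw, which is where the five-returns-processes problem disappears.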
Accessibility works the same way. Text sizing, screen reader compatibility and audio prompts are straightforward in a digital build. They are very hard on paper.
Where paper PRO still has a place
Being honest about this matters. Paper is not obsolete, and treating it that way costs you trust with the sites and participants who rely on it.
Paper still makes sense when:
- The participant population has low smartphone access or confidence. Older cohorts, some rural communities, and certain rehabilitation settings are examples.
- The instrument is only available on paper and recreating it electronically would require revalidation you cannot schedule. This is less common than it was, but it still happens.
- The study has a very short duration and small sample. The build cost for a 30 person, single visit study can exceed the paper handling cost.
- Site infrastructure genuinely cannot support devices or reliable connectivity. Offline capable ePRO helps, but it does not solve every connectivity problem.
- Regulatory or ethics constraints in a specific jurisdiction require a paper backup. Some committees still ask for this, and a hybrid approach is the cleanest answer.
A hybrid model, where most participants use ePRO and a small subset use paper, is often the right answer for studies that straddle digitally confident and digitally excluded populations.
A decision framework: paper, ePRO, or both?
Rather than choosing dogmatically, work through these questions.
- How long does the study run? Weeks or months of repeated capture favours ePRO; a single visit or very short study favours paper.
- How big is the footprint? Large, multi site or multi country studies favour ePRO; a small single site study may not repay the build cost.
- How digitally confident is the population? Broad smartphone access favours ePRO; low device access or confidence favours paper or a hybrid.
- Are your instruments validated or licensable for electronic administration? If an instrument exists only on paper and revalidation is impractical, that favours paper.
- How quickly do you need the data? Frequent or real time symptom monitoring favours ePRO; occasional visit based capture can live on paper.
- Can your sites and regulators support it? Reliable devices and connectivity favour ePRO; infrastructure gaps or a mandated paper backup favour a hybrid.
If most of your answers point toward paper, a hybrid model or paper first approach is legitimate. If most point toward ePRO, you are probably paying a real tax for every month you stay on paper.
A mid sized rehabilitation clinic we worked with early in 2025 went through this exercise for a 12 week outcomes study. They had been running paper PROMs for a decade and assumed ePRO would alienate their older patients. The decision framework flagged the study length, the multi site footprint and the need for weekly symptom tracking as strong ePRO signals. They piloted a hybrid, with paper available on request. Uptake of the electronic version ran at 94%, and the paper option preserved access for the 6% who needed it. Six months later they had shut down paper entirely because no one was using it.
Making the switch: practical considerations
Use validated instruments in their validated form
Not every paper instrument has been psychometrically validated for electronic administration, and copyright holders are increasingly specific about what modifications are allowed. Before you migrate, check the licence, check the ISPOR ePRO Consortium guidance, and confirm that equivalence testing has been done or is not required. Our PROMs and PREMs platform outlines how we handle licensed instruments inside WeGuide.
Plan for BYOD and provisioned devices
Bring your own device, or BYOD, works for most adult studies and reduces logistics cost substantially. For cohorts with limited device access, a small pool of provisioned devices with data SIMs is usually enough to close the gap. The decision is not all or nothing.
Budget for participant onboarding
The single biggest predictor of ePRO success is how well participants are onboarded in the first week. A three minute walkthrough at consent, a practice questionnaire before the real study starts, and a clear contact route if something goes wrong will move your compliance rate more than any feature will. Good patient engagement software makes each of those steps feel routine rather than optional.
Design reminders like you mean them
Default system notifications are not a reminder strategy. Time of day matters, day of week matters, and for certain populations a text message or an email still outperforms a push notification. WeGuide's digital form builder makes this configurable rather than hard coded.
Keep a paper fallback for the people who need it
Even in trials that are overwhelmingly electronic, keeping a paper option on hand for the minority who need it is usually the right call. It costs very little to maintain and it preserves access.
Frequently asked questions
Is ePRO cheaper than paper PRO?
In most studies, yes. The 2022 Nguyen et al. comparison found ePRO cost about $56 per participant versus $171 for paper, driven mostly by transcription labour and query handling. The gap grows as sample size and study duration increase. Very short, very small studies can be an exception because ePRO build cost is front loaded.
Are ePRO and paper PRO equivalent for validated instruments?
The ISPOR systematic review found equivalence or superiority in 88.2% of comparisons, so the default position is that a validated paper instrument remains valid electronically when migrated properly. That said, you should check the specific instrument's licensing terms and published equivalence evidence before assuming.
What does the FDA say about ePRO?
The FDA's 2013 Guidance on Electronic Source Data in Clinical Investigations treats electronic capture as acceptable source data provided the system meets 21 CFR Part 11 requirements for attributable, legible, contemporaneous, original and accurate data. ePRO that meets Part 11 is usually easier to defend at audit than transcribed paper.
Do older patients prefer paper PRO?
Preference skews younger for electronic and older for paper, but the split is less dramatic than people expect. The SPRUCE trial reported 72% preferring electronic overall, and within older cohorts a majority still prefer electronic once they have been onboarded properly. Offering a paper option for the minority who genuinely need it is the considerate and practical answer.
Key takeaways
The evidence on ePRO vs paper PRO is clearer than the debate often suggests. Electronic capture reduces per participant cost, lifts data quality, gives you real time signals, aligns with current regulatory thinking, and makes multi language and multi site studies practical. Peer reviewed studies including Nguyen 2022 and the SPRUCE trial support that picture, and the ISPOR systematic review puts equivalence or superiority for ePRO at 88.2% of comparisons.
Paper still has a role. For populations with limited device access, for very short studies, for instruments that only exist on paper, a paper or hybrid approach remains legitimate. Being honest about that is part of running good research.
The question for your next protocol is not whether ePRO is better in the abstract. It is whether your study, your population and your instruments fit the pattern where ePRO wins. For most modern trials, they do.