
eConsent Best Practices: Designing for Comprehension and Compliance

A signed consent form isn't the same as informed consent, and most eConsent platforms miss that distinction. Here are 10 best practices for designing eConsent.

A signed consent form isn't the same as informed consent. Every ethics committee knows this. Most eConsent platforms forget it.

eConsent (electronic informed consent) should do a better job than paper at closing that gap, and when it's built well, it does. Interactive content, video, audio, knowledge checks, version control, and live signature capture all help participants understand what they're joining before they sign. But only if the build puts comprehension first. A prettier signature workflow isn't better consent.

This guide walks through 10 eConsent best practices that move comprehension, compliance, and retention. We'll cover the regulatory baseline from the FDA and ICH, multilingual and accessible design, virtual eConsent in decentralised clinical trials, and the questions every sponsor should ask a vendor.

What is eConsent?

eConsent, or electronic informed consent, is the digital process of giving and recording informed consent for a clinical trial or health programme. An eConsent form replaces the paper consent form with interactive content, video, audio, knowledge checks, and electronic signatures. The FDA, EMA, MHRA, and TGA all accept eConsent in clinical trials when the system supports identity verification, an audit trail, version control, and the substantive requirements of GCP.

eConsent often sits inside a wider patient engagement platform. That matters because participants don't experience consent as a standalone event. For them, it's the first task in a programme that might run for months or years. WeGuide's eConsent platform is built this way, with screening, consent, ePRO, and ongoing engagement sharing the same participant experience.

See eConsent in action. Want a walkthrough of interactive consent forms, multilingual builds, and version control? Book a short WeGuide demo and we'll show you how it works in practice.

Why signature rate is the wrong KPI for eConsent

Most eConsent tools report signature rate: how many participants completed the flow and signed. That number tells you about the funnel. It tells you almost nothing about comprehension.

The ethics frame matters here. Informed consent has three parts: information, comprehension, and voluntariness. A signature confirms the third. It doesn't confirm the first two. A participant who scrolls through 20 pages on a phone and taps "I agree" has signed. They may or may not have understood.

Comprehension shows up later. In the questions a participant asks at week 4 that they should have asked at week 0. In the withdrawals that happen once the time commitment becomes concrete. In the adherence data that drifts when participants realise the trial demands more than they signed up for.

A systematic review by Pires and colleagues found that eConsent consistently improves participant comprehension over paper when the design uses interactive content and comprehension checks. When the design doesn't, eConsent and paper come out the same. Design, not format, is what moves the needle.

Every best practice that follows traces back to a single question: did the participant actually understand what they agreed to?

10 eConsent best practices that actually work

1. Write consent at an 8th grade reading level

Most adult readers sit between 7th and 9th grade. Consent forms routinely land at 12th grade or higher. Drop the reading level with short sentences, active voice, and everyday words. Free tools can check Flesch Reading Ease and grade level on every revision. Many IRBs and HRECs now ask for a readability score as part of review.
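The grade-level check mentioned above is easy to automate. The sketch below implements the standard Flesch-Kincaid Grade Level formula with a rough syllable heuristic; the syllable counter is an approximation, so treat scores as a trend across revisions rather than an exact figure.

```python
import re

def count_syllables(word: str) -> int:
    # Rough heuristic: count groups of consecutive vowels,
    # treating a trailing silent "e" as non-syllabic.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1 and not word.endswith(("le", "ee")):
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text: str) -> float:
    # FK Grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)
```

Run it on every consent revision: a plain-language rewrite such as "We will test your blood once a month. You can stop at any time." scores several grades below a dense clause like "Participation necessitates comprehensive longitudinal pharmacokinetic evaluation."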

2. Put a plain language summary at the top

Before the full consent, include a one page summary covering purpose, what the participant will do, the main risks, time commitment, and withdrawal rights. The full consent still appears after. This layered approach respects the participant's time and gives them what they need to make a decision before the legal detail.

3. Use multimedia on purpose, not for decoration

A 90 second explainer video can carry information that three pages of text cannot. Animations help with how a procedure works. Audio helps participants with low literacy or vision impairment. Always caption video, offer a text alternative, and ask whether each asset is doing a job only that format can do. If not, cut it.

4. Add knowledge checks that gate real risks

Short knowledge checks belong on the items that actually matter for informed consent: main risks, withdrawal rights, use of biospecimens, data sharing, and anything unusual for the participant's condition. Skip checks on trivia. Three well placed questions beat ten shallow ones. When a participant answers incorrectly, loop them back to the explanation before they move on.
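The loop-back behaviour described above is simple gating logic. This is an illustrative sketch, not any platform's actual API: the question texts, answer format, and `get_response` callback are all hypothetical.

```python
# Hypothetical knowledge checks gating the items that matter for
# informed consent. Question content is illustrative only.
CHECKS = [
    {
        "topic": "withdrawal rights",
        "question": "Can you leave the study at any time without penalty?",
        "answer": "yes",
        "explanation": "You may withdraw at any point; your care is unaffected.",
    },
    {
        "topic": "data sharing",
        "question": "Will de-identified data be shared with other researchers?",
        "answer": "yes",
        "explanation": "De-identified data may be shared for approved research.",
    },
]

def run_checks(checks, get_response):
    """Ask each check; on a wrong answer, replay the explanation and
    re-ask rather than letting the participant continue."""
    attempts = {}
    for check in checks:
        tries = 0
        while True:
            tries += 1
            response = get_response(check["question"])
            if response.strip().lower() == check["answer"]:
                break
            # Wrong answer: loop back to the explanation before moving on.
            print(check["explanation"])
        attempts[check["topic"]] = tries
    return attempts
```

Recording the attempt count per topic, as `run_checks` does, also gives the study team a comprehension signal that signature rate alone never surfaces.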

5. Offer consent in the participant's preferred language

Multilingual consent is not optional for Australian trials, North American trials, or any study recruiting from culturally and linguistically diverse (CALD) communities. Translation needs to be professional, cognitively tested where possible, and version controlled alongside the English master. A platform with native multi language support makes this a build setting, not a separate project.

6. Design for accessibility from day one

WCAG 2.1 AA is the baseline, not an aspiration. Test with screen readers. Check colour contrast. Offer larger text. Make sure every interactive element works with keyboard navigation. Accessibility isn't only about compliance. It's about whether a visually impaired or neurodivergent participant can genuinely consent without a sighted proxy.

7. Layer the content

Separate "must know" from "should know" from "nice to know." The top layer is the plain language summary. The second is the full consent. The third covers appendices such as biobank release, genetic data sharing, and optional sub studies. Let participants drill down when they want detail without burying them in it.

8. Support in person, remote, and hybrid flows

Some cohorts need a site coordinator sitting beside them. Some need a televisit with a remote witness. Some can consent asynchronously from home. A flexible platform handles all three without forcing a choice at the protocol level.

9. Keep version control and signature binding tight

Every signature must tie to an exact consent version, with a timestamp, a participant identifier, and an audit trail. When the IRB or HREC approves a new version, the system should prevent new participants from signing an outdated version. This is a 21 CFR Part 11 expectation, not a bonus feature.
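One way to make that binding concrete is to hash the exact approved consent text and store the hash alongside the signature. The sketch below is a minimal illustration under assumed names; the record fields, the in-memory registry, and the version labelling are all hypothetical, not a specific product's schema or a complete Part 11 control set.

```python
import hashlib
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class SignatureRecord:
    participant_id: str
    consent_version: str
    consent_sha256: str   # hash of the exact approved consent text
    signed_at: str        # ISO 8601 UTC timestamp for the audit trail

# Hypothetical registry: version label -> sha256 of the approved text.
APPROVED_VERSIONS = {}

def approve_version(label: str, consent_text: str) -> None:
    APPROVED_VERSIONS[label] = hashlib.sha256(consent_text.encode()).hexdigest()

def current_version() -> str:
    # Assumes labels sort chronologically (e.g. "v1", "v2").
    return max(APPROVED_VERSIONS)

def sign(participant_id: str, version: str) -> SignatureRecord:
    # Refuse signatures against anything but the latest approved version.
    if version != current_version():
        raise ValueError(f"{version} is superseded; use {current_version()}")
    return SignatureRecord(
        participant_id=participant_id,
        consent_version=version,
        consent_sha256=APPROVED_VERSIONS[version],
        signed_at=datetime.now(timezone.utc).isoformat(),
    )
```

The point of the hash is auditability: years later, the record proves which exact wording the participant saw, not just which version label was current at the time.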

10. Plan consent renewal before the protocol forces it

Protocols amend. Risks evolve. Sub studies open. Design the consent renewal path during the initial build. Model the scenarios: a risk increases, a new biomarker is added, a sub study recruits from the active cohort. Each needs a clear participant journey. Consent renewal designed under deadline pressure after an amendment lands is always messier than renewal built in from day one.

eConsent regulatory baseline: FDA, ICH, EMA, MHRA, TGA

Electronic informed consent is accepted by every major clinical trial regulator. What differs is the shape of "accepted."

21 CFR Part 11

The FDA's Part 11 governs electronic records and electronic signatures in US trials. A Part 11 aligned eConsent system supports user authentication, validated workflows, audit trails, and controls over who can edit an approved consent.

FDA Use of Electronic Informed Consent Q&A

The FDA's Use of Electronic Informed Consent Q&A covers the practical questions: identity verification, witness signatures, adapting consent for participants with impaired capacity, and how to store records. Cite it directly when designing flows, not the general Part 11 text.

ICH E6(R2) Good Clinical Practice

ICH GCP sets the global baseline for informed consent. Section 4.8 covers informed consent of trial participants. Any eConsent flow has to meet the same substantive requirements as paper: the participant understands, is not coerced, can withdraw, and knows what's being collected and why.

EMA, MHRA, and TGA

The EMA's guideline on computerised systems and electronic data in clinical trials (updated 2023) covers electronic informed consent explicitly. The MHRA accepts eConsent and discusses implementation issues through its community forums. In Australia, HRECs review eConsent under the National Statement on Ethical Conduct in Human Research, and the TGA expects GCP aligned processes whichever format you use.

Regulators don't require a specific eConsent vendor. They require proof that the consent was informed, documented, and auditable. That's a software problem and a process problem at the same time.

eConsent best practices for decentralised and hybrid trials

Decentralised clinical trials changed where consent happens. Virtual eConsent lets a participant in a regional town join a trial hosted in a metropolitan centre without a long drive, a sick day, or a carer's help. It also raises questions that site based consent never had to answer.

Identity verification is the first one. A remote participant has no receptionist checking a driver licence. Options range from government ID upload with a liveness check, to video witness, to site coordinator call. The right answer depends on trial risk, cohort, and jurisdiction.

Witness signatures are the second. Some trials, especially those involving minors or participants with impaired capacity, require a witness under GCP. Televisit technology handles this for many trials. For higher risk consent, a video call with the site coordinator signing concurrently in the platform works better than asynchronous consent.

The third is what happens when something goes wrong mid flow. A participant who reads the risks section and has a real question needs a route to a human. A well designed virtual eConsent has a "talk to the study team" button next to every risk statement, not buried in a footer.

Choosing eConsent software: a short buyer's checklist

When evaluating eConsent software, ask every vendor the same five questions.

  1. Show me an audit trail, version history, and user role controls for a live study, not a sales demo. If sandbox flows are the only ones you see, that's a red flag for real world deployment.
  2. How long does it take to build consent in a new language, and how many languages are supported natively? A platform that needs a four week turnaround per language makes multi country trials harder than they need to be.
  3. What's your accessibility audit history against WCAG 2.1 AA? Ask for the most recent report.
  4. How does eConsent integrate with the rest of the trial, including screening, ePRO, EDC, and CTMS? A standalone eConsent is worse than an integrated one because participants cross a product boundary at the start of every trial.
  5. Who's on the other end when something breaks mid study? Partnership matters. A vendor who hands over a manual and disappears makes amendments and consent renewals harder than they have to be.

WeGuide's Form Builder and eConsent share the same underlying platform, so screening, consent, and ongoing forms all sit inside one participant experience.

Common mistakes to avoid

Five things we see trip up otherwise solid eConsent programmes.

  • Shipping a 15 page PDF inside a digital wrapper and calling it an electronic consent form. A searchable PDF isn't a comprehension gain, and a digital consent form should earn its format.
  • Skipping the plain language summary. Participants start with the longest document and often never reach the short one.
  • Using the same knowledge check for every cohort, regardless of literacy, language, or condition.
  • Leaving consent renewal design until the IRB approves an amendment. Designing renewal under deadline pressure almost always goes badly.
  • Measuring eConsent by signature completion rate only. It's the easiest number to improve and the least useful one by itself.

A recent PMC study on implementing electronic consent across a multi centre trial captured this well: the teams that planned for comprehension, multilingual builds, and consent renewal from day one recruited nearly 12,000 participants successfully. Those that bolted pieces on mid study kept hitting the same avoidable problems.

Conclusion

eConsent works when the participant actually understands what they agreed to. Every practice in this guide points at that single outcome.

Five things to hold onto:

  • eConsent best practices start with comprehension, not signature flow.
  • Plain language, multimedia, and knowledge checks support understanding. Version control, accessibility, and multilingual support keep consent inclusive and compliant.
  • The FDA, ICH, EMA, MHRA, and TGA all accept electronic informed consent when the process is informed, documented, and auditable.
  • Virtual and hybrid consent flows are mature, with design choices that depend on trial risk, cohort, and jurisdiction.
  • Choose a partner who builds eConsent inside the rest of the trial, not a bolt on tool.

If you're designing the next trial and want to see how eConsent, Form Builder, and multi language support fit together in practice, we'd be happy to run you through it. Book a walkthrough of WeGuide and we'll show you how it works on a live setup.
