Publication Link

Why we built SkinTracker

Clinical trials for atopic dermatitis (AD) still rely on in‑person visits, which drive up costs, limit geographic diversity, and create barriers for patients who can’t easily travel. Our goal was simple: test whether a smartphone‑based app could capture disease severity as reliably as a clinician‑examined photograph or a DSLR‑grade image, while keeping participants engaged over an eight‑week hybrid study.

Study design in a nutshell

  • Population: 28 volunteers (18 with physician‑diagnosed AD, 10 healthy controls)
  • Duration: 8 weeks (baseline and week‑8 in‑clinic visits; bi‑weekly remote tasks)
  • Assessments: Eczema Area and Severity Index (EASI), Investigator Global Assessment (IGA), patient‑reported outcomes (POEM, NRS itch)
  • Data capture: mobile‑app photos (34 standardized poses per session), DSLR photos (baseline and week 8), Apple Watch activity and sleep data
  • Compliance checks: bi‑weekly phone check‑ins, push notifications, automatic quality detection in the app

Key quantitative findings

  • Agreement between modalities: Intraclass correlation coefficients (ICCs) for the primary scores were excellent:
    • EASI ICCs ranged from 0.90 to 0.97 (app vs. in‑person, app vs. DSLR).
    • IGA ICCs ranged from 0.88 to 0.94.
  • Component‑level performance: The four EASI sub‑domains (erythema, edema/papulation, excoriation, lichenification) showed moderate‑to‑good agreement (ICCs ≈ 0.50–0.83).
  • Participant compliance: 97.6% of scheduled photo tasks were completed; >95% of POEM and itch surveys were submitted.
  • User satisfaction: On a 1‑7 Likert scale, the convenience of the app versus an all‑in‑person protocol scored a mean of 6.7 (SD = 0.5).
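For readers less familiar with the agreement statistic above, here is a minimal sketch of how an ICC(2,1) (two‑way random effects, absolute agreement, single rater; the Shrout–Fleiss form typically used for this kind of modality comparison) can be computed from a subjects‑by‑raters score matrix. This is an illustrative implementation, not the study's actual analysis code:

```python
import numpy as np

def icc2_1(ratings: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    ratings: (n_subjects, k_raters) array, e.g. EASI scores where each
    column is one modality (in-person, app photo, DSLR photo).
    """
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-rater (modality) means

    ss_rows = k * ((row_means - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((col_means - grand) ** 2).sum()   # between raters
    ss_total = ((ratings - grand) ** 2).sum()
    ss_err = ss_total - ss_rows - ss_cols            # residual

    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))

    # Shrout & Fleiss ICC(2,1)
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

Intuitively, the statistic approaches 1.0 when two modalities assign nearly identical scores to each subject, and drops when one modality is systematically offset or noisy relative to the other.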

What worked (and what didn’t)

What worked

  • Automatic blur/dark detection forced participants to retake unusable images, keeping data quality high.
  • Bi‑weekly phone check‑ins were repeatedly cited as the biggest motivator for adherence—human contact beats push notifications alone.
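The blur/dark check mentioned above can be approximated with a standard variance‑of‑Laplacian sharpness score plus a mean‑brightness gate. The sketch below is a hypothetical reimplementation with illustrative thresholds (the app's real detector and cutoffs are not published here):

```python
import numpy as np

# Hypothetical thresholds; in practice these would be tuned per device.
BLUR_THRESHOLD = 100.0   # minimum variance of the Laplacian
DARK_THRESHOLD = 40.0    # minimum mean intensity on a 0-255 scale

def image_quality_ok(gray: np.ndarray) -> bool:
    """Return False for images that are too blurry or too dark.

    gray: 2-D array of grayscale pixel intensities (0-255).
    Sharp edges make the discrete Laplacian vary strongly across the
    image, so a low Laplacian variance indicates a defocused photo.
    """
    img = gray.astype(float)
    # 5-point discrete Laplacian via array slicing (no SciPy needed).
    lap = (img[:-2, 1:-1] + img[2:, 1:-1] +
           img[1:-1, :-2] + img[1:-1, 2:] -
           4.0 * img[1:-1, 1:-1])
    sharp_enough = lap.var() >= BLUR_THRESHOLD
    bright_enough = img.mean() >= DARK_THRESHOLD
    return sharp_enough and bright_enough
```

A check like this runs in milliseconds on-device, which is what makes an immediate "retake this photo" prompt feasible.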

What fell short

  • The voice‑diary feature lacked prompts, so participants rarely used it.
  • Apple Watch bands caused skin irritation for a subset of AD patients, highlighting the need for hypoallergenic accessories in future protocols.

Limitations

  • Sample size: Only 18 AD patients; statistical power is modest and the confidence intervals around ICCs are relatively wide.
  • Demographic skew: Median age ~34 years; a younger, tech‑savvy cohort may show higher compliance than older or less digitally literate populations would.
  • Device lock‑in: All participants needed an iPhone (the app currently runs only on iOS). This excludes Android‑only users and amplifies the “digital divide” we already warned about.

Implications for remote dermatology research

The data show that a well‑engineered mobile‑photography workflow can deliver clinically acceptable severity scores. For sponsors, this means:

  1. Cost reduction: Fewer in‑clinic visits translate directly into lower travel reimbursements and staff time.
  2. Broader reach: With proper device provisioning (e.g., loaner phones, cross‑platform apps), trials could tap under‑represented communities that are traditionally excluded by geography.
  3. Hybrid designs: Even with perfect remote data, occasional in‑person checkpoints remain valuable for safety monitoring and calibration.

Next steps

  • Expand the app to Android and web interfaces.
  • Integrate a structured voice‑diary prompt library.
  • Conduct a larger, multisite validation that deliberately oversamples older adults and non‑English speakers.