A 172-sample pilot on Olink Explore HT should do one thing well: de-risk scale-up. That means proving your control architecture, quantifying precision, catching preanalytical pitfalls early, and deciding with numbers—before moving to 600+ samples. This guide lays out a plate-level blueprint you can plug into your SOPs, grounded in Olink's control schema and common QC practice.
Your confirmed objective and thresholds will anchor the plan:
- Balanced scoring for go/no-go: 40% platform performance, 40% preanalytics/batch harmonization, 20% biological signal.
- Replicate strategy: Plate Control 5× + Sample Control 3× per plate + ~5% technical duplicates across the cohort.
- QC thresholds: per-sample call rate ≥70% of assays above LOD; intra-assay CV ≤20%; inter-plate CV ≤30%; re-run if internal controls fail.
Key takeaways
- Lock the plate fundamentals: each 96-well plate reserves 10 external control wells (2 negatives, 5 Plate Controls, 3 Sample Controls), leaving 86 study wells; with ~5% technical duplicates planned, revisit plate math early. This control schema is documented in the Olink Explore HT validation data and mirrored in the NCI BRD SOP that notes using the Plate Control median for per-assay normalization.
- Normalize the Olink way: internal-control adjustment → log2 NPX → Plate Control median centering; use bridging when plates/runs/sites differ, and verify harmonization via PCA/UMAP, as outlined in the OlinkAnalyze bridging and NPX documentation.
- Treat LOD and missingness explicitly: compute per-sample call rate (target ≥70% above LOD), avoid computing NC-LOD from too few negatives, and document assay-level completeness per the OlinkAnalyze LOD guidance.
- Preanalytics first: screen out visibly hemolyzed/lipemic/turbid samples; validate spectrophotometric cues locally; re-run when internal controls or plate checks fail, aligning with contamination markers described in plasma proteome quality literature.
- Decide with a rubric: apply the 40/40/20 weighted go/no-go table to quantify platform performance, harmonization, and early biological signal.
Step 1 — Plate math and mapping for two 96-well plates
Olink Explore HT plates include ten external controls per plate: Negative Controls in duplicate, Plate Control in five replicates, and Sample Control in triplicate. This layout is documented in the Olink Explore HT validation data PDF and mirrored in the NCI BRD SOP 'Olink Explore HT Analytical Validation', where the Plate Control median is used for per-assay plate centering and the Sample Control triplicates support precision estimates.
Implication for capacity: 96 − 10 = 86 study wells per plate, so two plates provide exactly 172 study wells. However, ~5% technical duplicates add wells beyond the unique sample count. If you intend 172 unique participants plus ~5% duplicates (9 extra wells), you will need one of the following (a short worksheet sketch at the end of this step illustrates the arithmetic):
- a third plate (to keep all 172 unique samples and the ~5% duplicates), or
- a reduced duplicate rate, or
- fewer unique samples (e.g., 163 unique + 9 duplicates = 172 wells total across two plates).
Make this decision upfront and capture it on the plate map. When duplicates are used, distribute them across plates to support inter-plate precision checks.
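As a quick self-check before locking the map, here is a minimal Python sketch of the plate arithmetic. It assumes only the 10-control-well layout described above, and the duplicate rate is a parameter you can vary:

```python
import math

CONTROL_WELLS = 10                           # 2 negatives + 5 Plate Controls + 3 Sample Controls
STUDY_WELLS_PER_PLATE = 96 - CONTROL_WELLS   # 86 study wells per plate

def plate_plan(unique_samples: int, duplicate_rate: float) -> dict:
    """Study injections and plate count for a given duplicate rate."""
    duplicates = math.ceil(unique_samples * duplicate_rate)
    injections = unique_samples + duplicates
    plates = math.ceil(injections / STUDY_WELLS_PER_PLATE)
    return {"unique": unique_samples, "duplicates": duplicates, "injections": injections,
            "plates": plates, "spare_wells": plates * STUDY_WELLS_PER_PLATE - injections}

# 172 unique participants + ~5% duplicates (9 wells) spills onto a third plate...
print(plate_plan(172, 0.05))   # 181 injections -> 3 plates
# ...whereas 163 unique + 9 duplicates fills two plates exactly.
print(plate_plan(163, 0.05))   # 172 injections -> 2 plates
```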
Step 2 — Replicate design and randomization
Implement the confirmed replicate plan: Plate Control 5× and Sample Control 3× on every plate, plus ~5% technical duplicates drawn from real study samples. The Sample Control triplicates yield intra-plate CVs; comparing the Sample Control across plates yields inter-plate CVs. Cohort duplicates provide a matrix-realistic precision check.
Randomize disease/control status and preanalytical factors (site, collection date, freeze–thaw count) across plates. Balanced block randomization avoids confounding biology with batch. Keep a complete log of duplicate identities and positions.
Precision computations (NPX scale): for an assay a measured r times in the same condition, CV_a = SD_a / Mean_a × 100%. Use Sample Control triplicates for within-plate CV; compare the same control across plates for inter-plate CV. Confirm targets: intra-assay CV ≤20%; inter-plate CV ≤30% as your pilot acceptance criteria (decision thresholds, not manufacturer specifications).
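To make the CV bookkeeping concrete, here is a minimal sketch assuming a long-format NPX export with columns SampleID, Assay, PlateID, and NPX, and Sample Control wells identifiable by an "SC" prefix; the file and column names are placeholders to adapt to your actual export. Note that some workflows compute CV on linearized values (2^NPX) rather than directly on the NPX scale, so confirm which convention your acceptance criteria assume.

```python
import pandas as pd

df = pd.read_csv("npx_long.csv")   # assumed columns: SampleID, Assay, PlateID, NPX

sc = df[df["SampleID"].str.startswith("SC")]   # Sample Control wells; naming convention is an assumption

def cv_percent(x: pd.Series) -> float:
    """CV% = SD / mean x 100, computed here on the NPX scale; guard against near-zero means in practice."""
    return 100 * x.std(ddof=1) / x.mean()

# Intra-plate CV: Sample Control triplicates within each plate, per assay.
intra = sc.groupby(["PlateID", "Assay"])["NPX"].apply(cv_percent)

# Inter-plate CV (one simple convention): per-plate Sample Control means compared across plates, per assay.
inter = (sc.groupby(["PlateID", "Assay"])["NPX"].mean()
           .groupby("Assay").apply(cv_percent))

print("assays with intra-plate CV > 20%:", (intra > 20).sum())
print("assays with inter-plate CV > 30%:", (inter > 30).sum())
```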
For extended context on NPX and normalization steps, see the explainer in the Olink data analysis process article on our knowledge site, which outlines internal vs. external control roles and plate-level centering.
Step 3 — Preanalytical screening SOP (hemolysis, lipemia, turbidity)
Start with a visual triage. Exclude or set aside for investigation any samples that are clearly hemolyzed (pink/red tinge), markedly turbid/cloudy, or milky/opaque (lipemia). This aligns with widely used plasma/serum proteomics quality practices and Olink's emphasis on robust sample integrity.
If your lab supports spectrophotometric indices, adopt locally validated cutoffs—for example, hemolysis proxy A414 > ~0.2–0.3 AU and turbidity/lipemia indices at 600–700 nm > ~0.1–0.2 AU (instrument- and pathlength-dependent). Validate these thresholds on your platform and document your acceptance/rejection criteria; the quality marker approach used in plasma proteomics programs provides practical guidance on contamination signatures and workflow control, as shown by Geyer et al.'s plasma proteome quality marker panel (2019).
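As an illustration of how those cutoffs can be applied consistently, the sketch below flags samples from a hypothetical absorbance table; the cutoff values and column names (A414, A650) are placeholders that must be validated on your own instrument and pathlength:

```python
import pandas as pd

# Illustrative cutoffs only; validate locally (instrument- and pathlength-dependent).
HEMOLYSIS_A414_CUTOFF = 0.25   # AU, hemolysis proxy (within the ~0.2-0.3 AU range discussed above)
TURBIDITY_A650_CUTOFF = 0.15   # AU, turbidity/lipemia proxy (600-700 nm)

readings = pd.read_csv("preanalytics.csv")   # assumed columns: SampleID, A414, A650

readings["flag_hemolysis"] = readings["A414"] > HEMOLYSIS_A414_CUTOFF
readings["flag_turbidity"] = readings["A650"] > TURBIDITY_A650_CUTOFF
readings["preanalytic_pass"] = ~(readings["flag_hemolysis"] | readings["flag_turbidity"])

print(readings["preanalytic_pass"].value_counts())
readings[~readings["preanalytic_pass"]].to_csv("preanalytic_review_list.csv", index=False)
```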
After the run, corroborate sample integrity using QC signals: internal control deviations, unexpected elevation of erythrocyte proteins (e.g., HBA1/HBB) or platelet markers (PF4/PPBP) are hallmarks of contamination or clotting/activation. If internal controls fail or external control checks fall outside expected ranges, re-run the affected samples or the entire plate per your SOP and document the event.
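A simple way to operationalize the post-run check is to score each sample against the erythrocyte and platelet markers named above. The sketch below uses a plain cohort z-score with an illustrative cutoff, assumes a wide samples-by-assays NPX matrix, and is a screening heuristic rather than a validated contamination index:

```python
import pandas as pd

ERYTHROCYTE = ["HBA1", "HBB"]   # hemolysis markers discussed above
PLATELET = ["PF4", "PPBP"]      # platelet activation markers discussed above

npx = pd.read_csv("npx_wide.csv", index_col="SampleID")   # assumed layout: rows = samples, columns = assays

def marker_score(npx: pd.DataFrame, markers: list[str]) -> pd.Series:
    """Per-sample mean z-score across a marker set (simple illustrative rule)."""
    present = [m for m in markers if m in npx.columns]
    z = (npx[present] - npx[present].mean()) / npx[present].std(ddof=1)
    return z.mean(axis=1)

flags = pd.DataFrame({
    "hemolysis_score": marker_score(npx, ERYTHROCYTE),
    "platelet_score": marker_score(npx, PLATELET),
})
Z_CUTOFF = 3.0   # illustrative; set from your own cohort distribution
flags["review"] = (flags[["hemolysis_score", "platelet_score"]] > Z_CUTOFF).any(axis=1)
print(flags[flags["review"]])
```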
For practical handling notes (aliquoting, freeze–thaw tracking, timing), see our concise Olink sample preparation guidelines reference.
Step 4 — From run to NPX: normalization and bridging
The Olink NPX workflow, in brief: (1) adjust using internal controls at the sample level to remove step-specific technical variation; (2) work on the log2 NPX scale; (3) center each assay to the Plate Control median on that plate. This harmonizes plate medians per assay before downstream analysis. When multiple plates or runs are involved—especially across sites or reagent lots—use bridging samples that appear on both plates/runs to compute per-assay offsets and align distributions, following the OlinkAnalyze bridging introduction.
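For orientation, here is a simplified sketch of step (3) and bridging on a long-format table. It assumes columns SampleID, Assay, PlateID, RunID, SampleType, and NPX (already internal-control adjusted and on the log2 scale), and the bridge sample IDs are hypothetical; it mirrors the idea in the OlinkAnalyze vignette rather than reproducing the package's implementation:

```python
import pandas as pd

df = pd.read_csv("npx_long.csv")   # assumed columns: SampleID, Assay, PlateID, RunID, SampleType, NPX

# (3) Center each assay to its Plate Control median on that plate.
pc_median = (df[df["SampleType"] == "PLATE_CONTROL"]
             .groupby(["PlateID", "Assay"])["NPX"].median().rename("pc_median"))
df = df.merge(pc_median.reset_index(), on=["PlateID", "Assay"], how="left")
df["NPX_centered"] = df["NPX"] - df["pc_median"]

# Bridging: per-assay median offset between runs, estimated from samples measured in both runs.
bridges = ["S001", "S017", "S033"]   # hypothetical bridge sample IDs
b = df[df["SampleID"].isin(bridges)]
paired_diff = (b[b["RunID"] == "run2"].set_index(["SampleID", "Assay"])["NPX_centered"]
               - b[b["RunID"] == "run1"].set_index(["SampleID", "Assay"])["NPX_centered"])
per_assay_offset = paired_diff.groupby("Assay").median()

# Subtract the offset from run2 so both runs share a common reference.
mask = df["RunID"] == "run2"
df["NPX_bridged"] = df["NPX_centered"]
df.loc[mask, "NPX_bridged"] = df.loc[mask, "NPX_centered"] - df.loc[mask, "Assay"].map(per_assay_offset)
```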
Confirm harmonization with diagnostics: PCA/UMAP on plate labels and density plots per plate should show no systematic clustering after normalization and bridging. If plates still separate, revisit the centering step, evaluate bridge offsets, and consider intensity-based adjustments recommended in Olink toolchain vignettes. For a quick primer on NPX and control-driven normalization logic, see our Olink Reveal technology overview.
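A quick way to generate that diagnostic is an assay-standardized PCA colored by plate. The sketch below assumes a wide post-bridging NPX matrix and a plate map keyed by SampleID (file names are placeholders), with simple median imputation for values below LOD:

```python
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA

wide = pd.read_csv("npx_bridged_wide.csv", index_col="SampleID")    # assumed: rows = samples, columns = assays
plate = pd.read_csv("plate_map.csv", index_col="SampleID")["PlateID"]

X = wide.fillna(wide.median())            # crude imputation for below-LOD values, acceptable for a QC plot
X = X.loc[:, X.std(ddof=0) > 0]           # drop zero-variance assays before standardizing
X = (X - X.mean()) / X.std(ddof=0)
scores = PCA(n_components=2).fit_transform(X)

labels = plate.loc[X.index]               # assumes the plate map covers every sample
for p in labels.unique():
    idx = (labels == p).values
    plt.scatter(scores[idx, 0], scores[idx, 1], label=str(p), alpha=0.6)
plt.xlabel("PC1"); plt.ylabel("PC2"); plt.legend(title="Plate")
plt.title("Post-normalization check: plates should overlap, not cluster")
plt.savefig("pca_by_plate.png", dpi=150)
```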
According to the OlinkAnalyze LOD guidance, avoid estimating negative-control-based LODs with too few negatives; Explore HT provides 2 negatives per plate. If you aggregate across just two plates, you still have only four negatives—below the suggested minimum (often ≥10 negatives) for stable LOD estimation—so default to the manufacturer-provided assay LODs or compute NC-LOD only when sufficient negatives are available.
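That decision rule can be written down explicitly. In the sketch below, the ≥10-negative minimum, the file names, and the median + 3×SD formula are placeholders; confirm the exact NC-LOD convention against the OlinkAnalyze LOD vignette before relying on it:

```python
import pandas as pd

MIN_NEGATIVES_FOR_NC_LOD = 10   # suggested minimum noted above; adjust per the OlinkAnalyze guidance

df = pd.read_csv("npx_long.csv")                 # assumed columns as in the earlier sketches
neg = df[df["SampleType"] == "NEGATIVE_CONTROL"]
n_neg = neg["SampleID"].nunique()

if n_neg >= MIN_NEGATIVES_FOR_NC_LOD:
    # NC-LOD per assay: median of negatives + 3 * SD (a common convention; verify against the vignette).
    lod = neg.groupby("Assay")["NPX"].agg(lambda x: x.median() + 3 * x.std(ddof=1))
else:
    # Too few negatives (e.g., 2 plates x 2 negatives = 4): fall back to the assay LODs shipped with the run.
    lod = pd.read_csv("assay_lod.csv", index_col="Assay")["LOD"]
```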
Step 5 — QC metrics you must compute
- Per-sample call rate: For each sample, compute the fraction of assays above LOD (a short sketch follows at the end of this step). Use assay-specific LODs supplied with the run, or NC-LOD only if you have enough negatives to estimate it reliably. Adopt the pilot decision threshold: median per-sample call rate ≥70% of assays above LOD. This is a pragmatic acceptance criterion for pilots rather than a manufacturer specification; large cohorts often apply completeness filters at the protein level as well (see principles used in population-scale Explore projects such as the UK Biobank Explore normalization documentation).
- Precision: Compute intra-assay CVs from Sample Control triplicates on each plate and inter-plate CVs from the same Sample Control measured across plates; corroborate with cohort duplicates. Pilot targets are intra-assay CV ≤20% and inter-plate CV ≤30%. Replicate roles and normalization via Plate Control median are detailed in the Olink Explore HT validation data PDF and the NCI BRD SOP.
- Missingness and assay-level filters: Summarize per-assay detection frequency across samples. For downstream analyses, consider filtering assays with very low detection frequency given your intended endpoints, but retain them for reporting QC so you understand platform behavior near LOD.
If any metrics fail: re-extract/re-aliquot problematic samples, re-run plates with control anomalies, or exclude underperforming assays from decision-critical analyses. Always log the deviation and the corrective action.
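Here is the call-rate sketch referenced above. It assumes a wide samples-by-assays NPX matrix and a per-assay LOD series from the previous step (names are placeholders), and it counts missing values as below LOD, which you may want to handle differently:

```python
import pandas as pd

wide = pd.read_csv("npx_wide.csv", index_col="SampleID")         # assumed: rows = samples, columns = assays
lod = pd.read_csv("assay_lod.csv", index_col="Assay")["LOD"]     # per-assay LOD from the run or NC-LOD

above = wide.ge(lod.reindex(wide.columns), axis=1)   # NaN NPX values compare as False (below LOD)

call_rate = above.mean(axis=1) * 100      # per-sample % of assays above LOD
detect_freq = above.mean(axis=0) * 100    # per-assay detection frequency across samples

print(f"median per-sample call rate: {call_rate.median():.1f}% (pilot target: >=70%)")
print("samples below the 70% threshold:", (call_rate < 70).sum())
print("assays detected in <10% of samples:", (detect_freq < 10).sum())
```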
For worked examples of NPX interpretation and QC readouts, see the Interpreting Olink serum proteomics guide, which walks through common thresholds and plots used by analysts.
Step 6 — Cohort split for pathway coverage in 172 samples
A pilot's job is to estimate effect sizes and confirm pathway coverage, not to deliver definitive p-values. Split disease and control subjects evenly across plates and randomize key strata (e.g., sex, age bands, site). Place your ~5% duplicates so that at least some cross plates, enabling inter-plate precision checks in real matrices. After normalization/bridging, inspect sentinel pathways: is directionality plausible for known biomarkers? What fraction of pathway proteins are above LOD in the intended subgroups? If coverage is thin in critical pathways, note it now—this informs whether to proceed, redesign, or supplement with alternative panels.
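One straightforward way to implement the balanced split is stratified round-robin assignment. The sketch below assumes a cohort table with Group, Sex, and Site columns (names are placeholders) and fixes the seed so it can be archived with the plate map, per the close-out checklist later in this guide:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(20240501)    # fixed seed; archive it alongside the plate map

samples = pd.read_csv("cohort.csv")      # assumed columns: SampleID, Group, Sex, Site
samples["stratum"] = samples[["Group", "Sex", "Site"]].astype(str).agg("_".join, axis=1)

# Within each stratum, shuffle and deal samples to plates round-robin so
# disease/control status and key covariates stay balanced across plates.
N_PLATES = 2
blocks = []
for _, block in samples.groupby("stratum"):
    shuffled = block.sample(frac=1, random_state=int(rng.integers(1, 2**31)))
    shuffled = shuffled.assign(PlateID=[f"P{i % N_PLATES + 1}" for i in range(len(shuffled))])
    blocks.append(shuffled)
plate_map = pd.concat(blocks)

print(plate_map.groupby(["PlateID", "Group"]).size().unstack(fill_value=0))
```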
Step 7 — Quantified go/no-go rubric (40% platform, 40% harmonization, 20% biology)
Use the following weighted scoring; a small scoring sketch follows the decision guidance below. Treat thresholds as pilot decision criteria aligned with common practice and documentation, not universal specifications. Document your exact computations and justifications.
| Domain | Weight | Metric/Check | Pass threshold (pilot) | Typical action if fail |
| --- | --- | --- | --- | --- |
| Platform performance | 40% | Intra-assay CV (Sample Control triplicates) | ≤20% | Investigate outlier assays; check near-LOD effects; consider assay-level exclusions or re-run |
| Platform performance | 40% | Inter-plate CV (Sample Control across plates; cohort duplicates) | ≤30% | Revisit normalization; confirm plate-control centering; consider bridging; re-run suspect plate |
| Platform performance | 40% | Per-sample call rate | Median ≥70% of assays above LOD | Review LOD basis; exclude poor-quality samples; assess preanalytics and internal controls |
| Preanalytics & harmonization | 40% | Internal/external control checks | All within expected ranges; no failed plates | Re-run failing plates; audit pipetting/reagents |
| Preanalytics & harmonization | 40% | Post-normalization plate separation | No systematic plate clustering in PCA/UMAP | Re-check centering/bridging; adjust normalization strategy |
| Biological signal | 20% | Sentinel pathway directionality and effect sizes | Plausible directionality; effect sizes consistent with prior | Revisit cohort balance; increase n for weak strata; reassess pathway targets |
Decision guidance:
- Go: All domains pass or only minor shortfalls with clear remediation that won't bias scale-up.
- Conditional Go: One domain marginal; implement remediation and reconfirm metrics before proceeding.
- No-Go: Multiple domain failures (e.g., poor plate harmonization and low call-rate) indicate redesign and repeat pilot.
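To keep the decision auditable, the rubric can be reduced to a small script. In the sketch below each metric is scored pass (1) or fail (0), metrics are averaged within a domain, and domains are combined by the 40/40/20 weights; the example inputs and the numeric cut points for Go, Conditional Go, and No-Go are illustrative placeholders, not thresholds from the documentation:

```python
# Illustrative 40/40/20 scoring; fill in pass/fail flags from your QC tables.
WEIGHTS = {"platform": 0.40, "harmonization": 0.40, "biology": 0.20}

results = {   # 1 = pass, 0 = fail
    "platform":      {"intra_cv_le_20": 1, "inter_cv_le_30": 1, "call_rate_ge_70": 1},
    "harmonization": {"controls_in_range": 1, "no_plate_clustering": 0},
    "biology":       {"sentinel_pathways_plausible": 1},
}

domain_scores = {d: sum(m.values()) / len(m) for d, m in results.items()}
total = sum(WEIGHTS[d] * domain_scores[d] for d in WEIGHTS)
print({d: round(s, 2) for d, s in domain_scores.items()}, "| weighted total:", round(total, 2))

# Placeholder cut points; set these in your SOP and record the rationale.
if total >= 0.9 and min(domain_scores.values()) >= 0.5:
    print("Go (document any minor shortfalls and their remediation)")
elif total >= 0.7:
    print("Conditional Go: remediate the marginal domain and reconfirm before scale-up")
else:
    print("No-Go: redesign and repeat the pilot")
```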
Next steps
Finalize and archive (pilot close-out): Lock and store plate maps, randomization seed + script, preanalytics logs, QC summary tables (CVs, call rates, missingness), and a signed decision memo explicitly referencing the rubric above.
Scale-up readiness signals: Proceed to scale only when CV distributions are stable within target thresholds, PCA/UMAP shows no residual plate-driven clustering post-normalization, call-rate distributions are acceptable, and sentinel pathway signals are directionally credible.
Engage Creative Proteomics for Olink data analysis (recommended): If you want a production-grade, reproducible pipeline for NPX QC, harmonization, and downstream statistics, our team can run the full analysis workflow—delivering standardized QC dashboards, plate/batch diagnostics, and study-ready outputs aligned to your cohort design and endpoints.
Optional expert pass (fast review): For a second set of eyes on plate design, batching strategy, or cutoff sanity checks, request a brief methods/QC review before you commit to the full cohort.
Disclosure / Service note
Creative Proteomics is a biotechnology service provider offering Olink panel data analysis and interpretation services for research use only. Upon request, we can support a neutral pilot review and share practical QC calculator templates used to compute CVs and call-rate metrics.
Selected references
- Control schema and normalization roles: Olink Explore HT validation data PDF; NCI BRD SOP "Olink Explore HT Analytical Validation".
- LOD and bridging methods: OlinkAnalyze LOD guidance; OlinkAnalyze bridging introduction.
- Preanalytics contamination markers: Geyer et al., Plasma proteome quality marker panel (2019).
- Population-scale normalization example: UK Biobank Explore normalization documentation.