The effect of serial dilution error on calibration inference in immunoassay.

Journal Article

A common practice in immunoassay is the use of sequential dilutions of an initial stock solution of the antigen of interest to obtain standard samples in a desired concentration range. Nonlinear, heteroscedastic regression models are a common framework for analysis, and the usual methods for fitting the model assume that measured responses on the standards are independent. However, the dilution procedure introduces a propagation of random measurement error that may invalidate this assumption. We demonstrate that failure to account for serial dilution error in calibration inference on unknown samples leads to serious inaccuracy of assessments of assay precision such as confidence intervals and precision profiles. Techniques for taking serial dilution error into account based on data from multiple assay runs are discussed and are shown to yield valid calibration inferences.
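The core point of the abstract, that serial dilution makes errors in the standards correlated rather than independent, can be illustrated with a small simulation. The sketch below is not the authors' method; the function name `serial_dilution` and the dilution ratio and pipetting-error parameters are illustrative assumptions. Each dilution step perturbs the nominal ratio with lognormal error, so every downstream standard inherits all upstream errors.

```python
import numpy as np

rng = np.random.default_rng(0)

def serial_dilution(stock, n_steps, ratio=0.5, cv=0.02):
    """Simulate one serial dilution series: each step multiplies the
    current concentration by `ratio` perturbed by lognormal pipetting
    error, so later standards inherit all earlier errors."""
    concs = []
    c = stock
    for _ in range(n_steps):
        c *= ratio * rng.lognormal(mean=0.0, sigma=cv)
        concs.append(c)
    return np.array(concs)

# Nominal (error-free) concentrations for a 2-fold dilution series.
nominal = 100.0 * 0.5 ** np.arange(1, 9)

# Actual concentrations over many simulated assay runs.
runs = np.array([serial_dilution(100.0, 8) for _ in range(5000)])

# Log-scale error of each standard relative to its nominal value.
log_err = np.log(runs) - np.log(nominal)

# Adjacent standards are strongly correlated because the error at
# step k+1 is the error at step k plus one new independent term.
corr = np.corrcoef(log_err[:, 3], log_err[:, 4])[0, 1]
```

Under this error model the log errors are cumulative sums of independent normal terms, so the correlation between standards 4 and 5 is about sqrt(4/5) ≈ 0.89, which is why regression methods that assume independent standards understate calibration uncertainty.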

Cited Authors

  • Higgins, KM; Davidian, M; Chew, G; Burge, H

Published Date

  • March 1, 1998

Published In

Volume / Issue

  • 54 / 1

Start / End Page

  • 19 - 32

PubMed ID

  • 9544505

International Standard Serial Number (ISSN)

  • 0006-341X


Language

  • eng

Location

  • United States