You are viewing documentation for the old version 2.30 of Analyse-it. If you are using version 3.00 or later we recommend you go to the Bland-Altman difference plot.
Bland-Altman bias plots
Difference plots (also known as Bias plots) visually compare two methods, a test method against a reference/comparative method, for analytical accuracy. Almost any kind of difference plot can be produced, including the plots recommended by Bland & Altman, Hyltoft Petersen, and the CLSI, to assess agreement and repeatability.
The requirements of the test are:
 Two methods measured on a continuous scale.
 Any number of replicates can be observed for each method, though all cases must have the same number of replicates.
Arranging the dataset
Data in existing Excel worksheets can be used and should be arranged in the List dataset layout. The dataset must contain at least two continuous scale variables containing the observations for each method. If replicates are observed then a List dataset with repeat/replicate measures layout should be used to arrange the replicates for each method.
When entering new data we recommend using New Dataset to create a new method comparison dataset.
Using the test
To start the test:
 Excel 2007:
Select any cell in the range containing the dataset to analyse, then click Comparison on the Analyse-it tab, then click Difference plots.
Excel 97, 2000, 2002 & 2003:
Select any cell in the range containing the dataset to analyse, then click Analyse on the Analyse-it toolbar, click Method comparison, then click Difference plots.
 Click Reference/Comparative method and Test method and select the methods or individual replicates to compare.
 If the methods contain replicates click Use replicates and select:
1st: Uses only the first replicate of each method.
Mean: Uses the mean of the replicates of each method.
1st v Mean of Reference: Uses the first replicate of the test method and the mean of the replicates of the reference method.
 Click OK to run the test.
The report shows the number of cases analysed and, if applicable, how many cases were excluded due to missing values. For each method it shows the name, the number of replicates, and, if measured in duplicate, the repeatability in terms of SD or CV, depending on the Plot option (see below). The range of observations (minimum and maximum) for the reference/comparative method is also shown.
The scatter plot (see below) shows the observations of the reference/comparative method (X) plotted against the test method (Y). The Use replicates option determines how replicates for each method, if available, are plotted.
Understanding and configuring the difference plot
The difference/bias plot can be configured to produce plots recommended by Bland & Altman, Hyltoft Petersen, or the CLSI. A histogram of the differences can also be shown to see the distribution of the differences.
Either absolute differences or the difference as a percentage of the analyte concentration can be plotted. The Pearson r correlation statistic on the report shows the correlation between the differences and the average values. A high degree of correlation suggests the bias is non-constant across the range, and a percent difference plot may be more suitable.
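To illustrate why a high correlation points toward the percent difference plot, here is a minimal sketch of these statistics in Python (not Analyse-it's own code; the sample values are invented):

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented example observations: reference method and test method.
ref = [1.0, 2.0, 4.0, 8.0, 16.0]
test = [1.1, 2.1, 4.4, 8.6, 17.5]

diffs = [t - r for r, t in zip(ref, test)]                  # absolute differences
averages = [(r + t) / 2 for r, t in zip(ref, test)]         # X axis of the plot
pct_diffs = [100 * d / a for d, a in zip(diffs, averages)]  # % of concentration

# A high correlation between differences and averages suggests the bias
# grows with concentration, favouring the percent difference plot.
r = pearson_r(diffs, averages)
```

Here the absolute differences grow with concentration, so r is close to 1 and the percent differences are the more stable quantity to plot.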
To configure the difference plot:
 If the Difference Plot dialog box is not visible click Edit on the Analyse-it tab/toolbar.
 Click Plot and select Difference to plot absolute differences on the Y axis of the plot, or Difference as % to plot the differences as a percentage of the analyte concentration.
 If replicates are measured for the test method, tick All individual observations to show all replicate observations on the Y axis of the plot. Otherwise the Use replicates option determines how replicates are plotted.
 Click Against and select Reference Method to plot the reference method on the X axis of the plot (as recommended by the CLSI), or Average to plot the average of the methods (as recommended by Bland & Altman).
 Tick Histogram of differences to show a histogram of the distribution of the differences.
 Click OK.
The difference plot (see below) is shown beneath the scatter plot.
Beneath the difference plot is the histogram of differences (see below), with a normal curve overlay to assist in judging whether the differences are normally distributed.
Determining and plotting bias
Bias can be calculated and plotted on the difference plot, to show differences between the methods.
To show and calculate bias:
 If the Difference Plot dialog box is not visible click Edit on the Analyse-it tab/toolbar.
 Click Overlay and select with Bias to calculate bias and show it on the difference plot, or Bias + CI to calculate and show bias with a confidence interval.
 Enter the Confidence level for the interval around the bias, as a percentage between 50 and 100, excluding the % sign.
 Click OK.
The difference plot (see below) shows the bias and, if enabled, a confidence interval showing the range likely to contain the true bias. Bias is also shown on the report above the scatter plot.
Bias is shown, which ideally should be zero, with a confidence interval showing the range likely to contain the true bias. The bias is expressed in absolute units or as a percentage, depending on the Plot option (see above).
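The bias statistic itself is simply the mean of the differences. A hedged sketch of the calculation (using a normal quantile for the interval; software may instead use Student's t, which gives a wider interval for small samples):

```python
import math
import statistics
from statistics import NormalDist

def bias_with_ci(diffs, level=95.0):
    """Mean difference (bias) with an approximate confidence interval.

    Uses a normal quantile for simplicity; a t quantile would be more
    conservative for small n.
    """
    n = len(diffs)
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)                  # sample SD of the differences
    z = NormalDist().inv_cdf(0.5 + level / 200)   # e.g. 1.96 for level=95
    half_width = z * sd / math.sqrt(n)
    return bias, (bias - half_width, bias + half_width)
```

If the interval excludes zero, the bias between the methods is statistically significant at the chosen level.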
Determining and plotting Limits of agreement
Limits of agreement can be calculated and plotted on the difference plot to show the likely range of differences between the methods.
To show and calculate limits of agreement:
 If the Difference Plot dialog box is not visible click Edit on the Analyse-it tab/toolbar.
 Click Overlay and select with Bias + Limits of agreement to calculate limits of agreement and show them on the difference plot, or Bias + Limits of agreement + CI to calculate and show limits of agreement with a confidence interval.
 Enter the Limits of agreement to calculate, as a percentage between 50 and 100, excluding the % sign.
 If either method is observed in replicate, click SD differences and select Between mean measurements to calculate the SD from the differences between the means of the replicates, Between single measurements to calculate the SD of differences expected between single measurements of each method, or Between single measurements of Test (method) to calculate the SD of differences expected between a single measurement of the test method and the mean of the reference method replicates.
 Click OK.
The difference plot (see below) shows bias, limits of agreement, and if enabled, confidence intervals showing the range likely to contain the true bias and limits. Large limits of agreement imply poor precision in one or both methods.
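The classic Bland-Altman limits are the bias plus or minus a quantile multiple of the SD of the differences. A minimal sketch, assuming the differences are approximately normally distributed:

```python
import statistics
from statistics import NormalDist

def limits_of_agreement(diffs, level=95.0):
    """Bland-Altman limits of agreement: the interval expected to contain
    the stated percentage of differences between the two methods."""
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    z = NormalDist().inv_cdf(0.5 + level / 200)   # 1.96 for 95% limits
    return bias - z * sd, bias + z * sd
```

Wide limits relative to the clinically acceptable difference indicate poor precision in one or both methods.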
Examining Repeatability
Repeatability plots can be shown to assess the imprecision of the methods when observed in replicate.
To show repeatability plots:
 If the Difference Plot dialog box is not visible click Edit on the Analyse-it tab/toolbar.
 Tick Repeatability plot.
 Click OK.
Repeatability plots (see below) are shown for the methods with replicate measurements, plotting the SD or CV (depending on the Plot option) on the Y axis against the mean of the replicates on the X axis.
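For duplicate measurements the repeatability SD has a simple closed form. A sketch of that standard duplicate calculation (Analyse-it's exact computation isn't shown in this page, so treat this as illustrative):

```python
import math

def duplicate_repeatability(rep1, rep2):
    """Within-method repeatability from duplicate measurements.

    Standard duplicate formula: SD = sqrt(sum(d^2) / (2n)), where d is the
    difference between the two replicates of each case. The CV expresses
    the SD as a percentage of the overall mean.
    """
    n = len(rep1)
    sum_d_sq = sum((a - b) ** 2 for a, b in zip(rep1, rep2))
    sd = math.sqrt(sum_d_sq / (2 * n))
    grand_mean = sum(rep1 + rep2) / (2 * n)
    return sd, 100 * sd / grand_mean
```

Plotting this SD or CV against the replicate means, as the repeatability plot does, shows whether imprecision is constant or grows with concentration.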
Comparing against a bias/imprecision goal specification
Bias and imprecision can be compared against performance goals. The allowable bias and imprecision can be specified in absolute units of the analyte, as a percentage of the analyte concentration, or as a combination of the two, in which case the larger of the absolute value and the percentage of concentration is used.
To compare bias against a goal:
 If the Difference Plot dialog box is not visible click Edit on the Analyse-it tab/toolbar.
 Click Compare against and select:
Bias: Difference plot shows allowable bias, and the observed bias is tested to determine if it is within the goal.
Bias + Imprecision: Difference plot shows allowable bias, and the observed bias and imprecision are tested to determine if they are within the goal.
TEa = Bias + 1.65 * Imprecision: Total allowable error is calculated from the bias and imprecision goals. Same as Bias + Imprecision, except the difference plot also shows total allowable error.
 Enter Allowable bias as an absolute value, as a percentage of analyte concentration, or enter both values for a combination.
 If comparing imprecision enter Allowable imprecision as an absolute value, as a percentage of analyte concentration, or enter both values for a combination.
 Tick with Allowable Error bands to show the bias specification on the difference plot.
 Click OK.
The bias and imprecision (if specified) goals are shown with a hypothesis test to determine if the observed bias/imprecision is within the goal. If the bias/imprecision p-value is statistically significant then the observed bias/imprecision is significantly worse than the goal.
If with Allowable Error bands is checked the difference plot shows the allowable bias (see below). The confidence interval around the fitted line should fall within the allowable bias band if the methods are comparable within allowable bias.
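The goal logic described above can be sketched as follows (the function names are invented for illustration; only the larger-of rule and the TEa = Bias + 1.65 * Imprecision formula come from the text):

```python
def allowable_goal(concentration, absolute=None, percent=None):
    """Allowable bias/imprecision at a given concentration: when both an
    absolute value and a percentage are given, the larger of the two applies."""
    candidates = []
    if absolute is not None:
        candidates.append(absolute)
    if percent is not None:
        candidates.append(percent / 100 * concentration)
    return max(candidates)

def total_error(bias, imprecision_sd):
    """Point estimate of total analytical error: TE = |bias| + 1.65 * SD."""
    return abs(bias) + 1.65 * imprecision_sd
```

For example, at a concentration of 10 with an absolute goal of 0.4 and a percentage goal of 5%, the allowable value is the larger of 0.4 and 0.5.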
Comparing against a TEa and Systematic/Random Error%
Bias can be compared against a systematic error %, and imprecision against a random error %, of a total allowable error goal. The total allowable error can be specified in absolute units of the analyte, as a percentage of the analyte concentration, or as a combination of the two, in which case the larger of the absolute value and the percentage of concentration is used.
To compare bias against a total allowable error goal:
 If the Difference Plot dialog box is not visible click Edit on the Analyse-it tab/toolbar.
 Click Compare against and select:
TEa: Difference plot shows total allowable error for visual assessment.
TEa, %SE: Difference plot shows total allowable error and the allowable bias range. Observed bias is tested to determine if it is within the goal.
TEa, %SE, %RE: Difference plot shows total allowable error and the allowable bias range. Observed bias and imprecision are tested to determine if they are within the goal.
 If required enter TEa (total allowable error) as an absolute value, as a percentage of analyte concentration, or enter both values for a combination.
 If required enter % for Systematic error, that is, the percentage of the TEa within which bias is allowed to vary.
 If required enter % for Random error, that is, the percentage of the TEa within which precision is allowed to vary.
 Tick with Allowable Error bands to show the bias specification on the difference plot.
 Click OK.
The bias and imprecision (if specified) goals are shown, and a hypothesis test determines if the observed bias/imprecision is within the goal. If the bias/imprecision p-value is statistically significant then the observed bias/imprecision is significantly worse than the goal.
If the with Allowable Error bands option is checked the difference plot shows the allowable bias (see above). The confidence interval around the fitted line should fall within the allowable bias band if the methods are comparable within allowable bias.
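Splitting the TEa budget between systematic and random error can be sketched as below. Note the 1.65 divisor used to map the random-error allotment to an allowable SD is an assumption for illustration, not a rule stated in this page:

```python
def error_budget(tea, se_pct, re_pct):
    """Split a total allowable error budget into systematic and random parts.

    se_pct and re_pct are the percentages of TEa allotted to bias and to
    random error. Dividing the random allotment by 1.65 to obtain an
    allowable SD is an assumed convention, matching TEa = Bias + 1.65 * SD.
    """
    allowable_bias = se_pct / 100 * tea
    allowable_sd = (re_pct / 100 * tea) / 1.65
    return allowable_bias, allowable_sd
```

With a 50/50 split of a TEa of 10, bias may vary within 5 units and the allowable SD is 5 / 1.65.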
References to further reading
 Measurement in Medicine: the Analysis of Method Comparison Studies
D.G. Altman, J.M. Bland, The Statistician 1983; 32: 307-317
 Statistical Methods for Assessing Agreement Between Two Methods of Clinical Measurement
J.M. Bland, D.G. Altman, The Lancet 1986; i: 307-310
 Measuring Agreement in Method Comparison Studies
J.M. Bland, D.G. Altman, Statistical Methods in Medical Research 1999; 8: 135-160
 Comparing Methods of Measurement: Why Plotting Difference Against Standard Method is Misleading
J.M. Bland, D.G. Altman, The Lancet 1995; 346: 1085-1087
 Method Comparison - A Different Approach
M.A. Pollock, S.G. Jefferson, J.W. Kane, K. Lomax, G. MacKinnon, C.B. Winnard, Annals of Clinical Biochemistry 1992; 29: 556-560
 Graphical Interpretation of Analytical Data from Comparison of a Field Method with a Reference Method by Use of Difference Plots
P. Hyltoft Petersen, D. Stöckl, et al., Clinical Chemistry 1997; 43(11): 2039-2046
 Agreement Between Methods of Measurement with Multiple Observations Per Individual
J.M. Bland, D.G. Altman, Journal of Biopharmaceutical Statistics 2007; 17: 571-582
 User Verification of Performance for Precision and Trueness (2nd edition)
CLSI EP15-A2; ISBN 1-56238-574-7 (2006)
 Estimation of Total Analytical Error for Clinical Laboratory Methods
CLSI EP21-A; ISBN 1-56238-502-X (2003)