Statistical Reference Guide
Method comparison
Estimating the bias between methods at a decision level
Estimate the average bias (or average difference) at a decision level using the regression fit.
You must have already completed one of the following tasks:
Fitting ordinary linear regression
Fitting Deming regression
Fitting Passing-Bablok regression
1. Activate the analysis report worksheet.
2. On the Analyse-it ribbon tab, in the Method Comparison group, click Predict At. The analysis task pane shows the Comparability task.
3. In the Decision level grid, under the Id column, type an identifier for the decision level, and then under the X column, type the value of the decision level.
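Analyse-it computes the predicted bias from the regression fit internally. As an illustration of the underlying calculation only, the sketch below (assuming an ordinary linear regression fit y = a + b·x on hypothetical paired data; Deming and Passing-Bablok fits use different estimators) predicts the test-method value at a decision level and reports the average bias, predicted y minus x, together with its standard error:

```python
import numpy as np

# Hypothetical paired measurements: reference method (x) vs test method (y).
x = np.array([1.2, 2.3, 3.1, 4.8, 5.5, 6.9, 8.2, 9.4])
y = np.array([1.4, 2.2, 3.4, 5.1, 5.6, 7.3, 8.5, 9.9])

n = len(x)
b, a = np.polyfit(x, y, 1)          # slope, intercept of the ordinary linear regression

# Residual standard error of the fit (n - 2 degrees of freedom)
resid = y - (a + b * x)
s = np.sqrt(np.sum(resid**2) / (n - 2))

def bias_at(level):
    """Average bias (predicted y minus x) at a decision level,
    with the standard error of the fitted mean at that level."""
    y_hat = a + b * level
    se = s * np.sqrt(1 / n + (level - x.mean())**2 / np.sum((x - x.mean())**2))
    return y_hat - level, se

bias, se = bias_at(5.0)
print(f"bias at 5.0 = {bias:.3f} (SE {se:.3f})")
```

Here `bias_at` and the sample data are hypothetical names for illustration; the standard-error formula is the usual one for the fitted mean of an ordinary regression at a given x.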
Optional: To test if the two methods are equal at each decision level:
1. On the Analyse-it ribbon tab, in the Method Comparison group, click Test Equality. The analysis task pane shows the Comparability task.
2. To compare the p-value against a predefined significance level, in the Significance level edit box, type the maximum probability of rejecting the null hypothesis when in fact it is true (typically 5% or 1%).
Optional: To test if the two methods are practically equivalent at each decision level:
1. On the Analyse-it ribbon tab, in the Method Comparison group, click Test Equivalence. The analysis task pane shows the Comparability task.
2. If the allowable difference is a constant or proportional value across the measuring interval, in the Allowable difference group, select Across measuring interval, and then in the Absolute edit box, type the bias in measurement units, and/or in the Relative edit box, type the bias as a percentage (suffix with the % symbol).
3. If the allowable difference varies for each level, in the Allowable difference group, select Each level, and then in the Decision level grid, under the Allowable difference column, alongside each level, type the absolute bias in measurement units or the relative bias as a percentage (suffix with the % symbol).
4. To compare the p-value against a predefined significance level, in the Significance level edit box, type the maximum probability of rejecting the null hypothesis when in fact it is true (typically 5% or 1%).
Click Recalculate.
Related concepts
Average bias
Available in Analyse-it Editions
Method Validation edition
Ultimate edition
Version 5.30
Published 15-Apr-2019