Last year was a productive year for Analyse-it, with the addition of survival analysis functions among other improvements.
There was also lots of work behind the scenes, making small improvements to keep everything running smoothly. These days software updates are released quickly, and often without warning, which can be problematic for products like Analyse-it that rely on another application such as Microsoft Excel. We release updates to Analyse-it to fix problems when they arise, so it’s always good to have active maintenance to ensure you get those updates.
We also recently released a new healthcare-focused edition of Analyse-it. This edition includes statistics for comparing agreement between clinical methods, constructing reference ranges, determining the diagnostic accuracy of a screening technique, and analyzing and comparing survival times for different treatments. It also includes statistics for analyzing, visualizing, and describing data; comparing groups for differences; advanced regression and modelling techniques; and procedures for examining correlation and reducing large amounts of data to the key factors.
We’re happy to announce a couple of major additions to the Analyse-it Ultimate edition: Kaplan-Meier survival curves and Cox’s proportional hazards survival model. There are also improvements for users of other editions.
If you have active maintenance, you can download and install the update now. If maintenance on your license has expired, you can renew it to get this update and forthcoming updates. If you already have another edition of Analyse-it, you can upgrade to the Ultimate edition for just the difference in cost.
Kaplan-Meier survival curves describe and visualize time-to-event data. They are often used in survival and reliability analysis to model time to death or time to failure of a component. You can visualize multiple survival functions, with confidence bands, and test for differences between them. In addition, you can describe key quantities such as the area under the curve or the quantiles.
Example of multiple Kaplan-Meier survival curves:
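For readers who want to reproduce something similar outside Analyse-it, here is a minimal sketch using the Python lifelines package (an assumption for illustration only; the data, labels, and group names are made up). It fits a Kaplan-Meier curve per group, reports the median survival time, plots the curves with confidence bands, and runs a log-rank test for a difference between the groups:

# Sketch only: Kaplan-Meier curves per group, with a log-rank test (hypothetical data).
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "time":  [6, 7, 9, 10, 11, 13, 15, 16, 19, 20, 22, 23, 25, 32, 34, 35],   # months to event
    "event": [1, 1, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 0, 0],                # 1 = event, 0 = censored
    "group": ["A"] * 8 + ["B"] * 8,
})

kmf, ax = KaplanMeierFitter(), None
for name, grp in df.groupby("group"):
    kmf.fit(grp["time"], event_observed=grp["event"], label=name)
    print(name, "median survival:", kmf.median_survival_time_)    # a key quantile
    ax = kmf.plot_survival_function(ax=ax, ci_show=True)          # curve with confidence band

a, b = df[df.group == "A"], df[df.group == "B"]
print("log-rank p-value:", logrank_test(a["time"], b["time"], a["event"], b["event"]).p_value)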
Cox proportional hazards models are used to model time-to-event data (survival/reliability analysis). They do not rely on estimating a survival function, but instead estimate the hazard ratios of covariates. We have added proportional hazards to the Analyse-it Fit Model analysis so you can include multiple categorical/continuous covariates and complex interaction terms. The analysis report includes everything you’d expect: parameter estimates and confidence intervals, covariance matrix, likelihood ratio and Wald tests of the contribution of each term, hazard ratios for each term, and the baseline survival function and plot.
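Again purely as an illustration outside Analyse-it, a Cox proportional hazards fit can be sketched with the lifelines package (an assumption; the columns and values are made up). It reports coefficients, hazard ratios, confidence intervals, Wald tests, and the baseline survival function, mirroring the items listed above:

# Sketch only: Cox proportional hazards model with one categorical and one continuous covariate.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "time":      [5, 8, 12, 14, 20, 21, 26, 30, 31, 33, 40, 45],
    "event":     [1, 1, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0],
    "treatment": [0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1],                   # categorical covariate (0/1)
    "age":       [61, 58, 70, 49, 66, 55, 72, 60, 47, 68, 59, 52],       # continuous covariate
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()             # coefficients, hazard ratios, CIs, likelihood ratio and Wald tests
print(cph.hazard_ratios_)       # exp(coef) for each term
cph.baseline_survival_.plot()   # baseline survival function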
Recently we’ve been busy updating Analyse-it to stay aligned with the latest revisions to the CLSI protocols, and adding a new inverse prediction feature.
If you have active maintenance, you can download and install the update now. If maintenance on your license has expired, you can renew it to get this update and forthcoming updates.
New CLSI EP6-Ed2
The CLSI recently released the EP6-Ed2 guideline, which replaces EP06-A, published in 2003.
EP06-A relied on fitting linear (straight-line), 2nd-order (parabolic), and 3rd-order (sigmoidal) polynomials to the data. A method was determined to be linear or possibly non-linear based on statistical criteria. The degree of nonlinearity was then calculated as the difference between the linear fit and the best-fitting non-linear model (the parabolic or sigmoidal curve), and compared against allowable nonlinearity criteria.
The new CLSI EP6-Ed2 protocol no longer requires fitting polynomial models to determine linearity. Instead, the deviation from linearity is calculated as the difference between the mean of each level and a linear fit through the data, which can then be compared against the allowable nonlinearity criteria. Other changes to the protocol include revisions to the experimental design, and there is now more focus on the structure of the variance across the measuring interval.
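As a rough sketch of the calculation just described (and only that calculation; this is not the full EP6-Ed2 procedure, and the data are made up), the deviation from linearity at each level is simply the level mean minus a straight-line fit through all the data:

# Sketch only: deviation from linearity = level mean - straight-line fit (hypothetical data).
import numpy as np

levels = np.array([1, 1, 2, 2, 3, 3, 4, 4, 5, 5], dtype=float)     # assigned level values, 2 replicates each
result = np.array([10.1, 10.4, 19.8, 20.3, 30.9, 31.2, 39.5, 40.1, 48.6, 49.0])

slope, intercept = np.polyfit(levels, result, 1)                   # 1st-order (straight-line) fit

for lv in np.unique(levels):
    mean_lv = result[levels == lv].mean()
    fitted = slope * lv + intercept
    deviation = mean_lv - fitted                                   # deviation from linearity
    print(f"level {lv:.0f}: mean={mean_lv:.2f} fit={fitted:.2f} deviation={deviation:+.2f} ({100*deviation/fitted:+.1f}%)")
# Each deviation (absolute or %) is then compared against the allowable nonlinearity criterion.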
We've been busy making lots of improvements to Analyse-it recently, so we thought we'd highlight the significant new and improved features.
If you have active maintenance, you can download and install the update now. If maintenance on your license has expired, you can renew it to get this update and forthcoming updates.
Fit Model now supports probit regression. Probit regression is similar to logistic regression, as both use a link function to transform a linear model into a nonlinear relationship. A linear model uses the equation Y = α + βx, whereas both the logit and probit models use the form Y = f(α + βx). They differ only in the definition of the link function f(): the logit model uses the cumulative distribution function of the logistic distribution; the probit model uses the cumulative distribution function of the standard normal distribution. Both functions give a predicted probability, Y.
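A small sketch makes the difference concrete (the coefficients here are hypothetical, chosen only to show the shape of the two functions):

# Sketch only: the same linear predictor passed through the logit and probit inverse links.
import numpy as np
from scipy.stats import norm, logistic

alpha, beta = -2.0, 1.5                 # hypothetical coefficients
x = np.linspace(-2, 4, 7)
lp = alpha + beta * x                   # linear predictor

p_logit = logistic.cdf(lp)              # logit model: CDF of the logistic distribution
p_probit = norm.cdf(lp)                 # probit model: CDF of the standard normal distribution

for xi, pl, pp in zip(x, p_logit, p_probit):
    print(f"x={xi:5.2f}  logit={pl:.3f}  probit={pp:.3f}")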
Health sciences, such as epidemiology, often use the logit model because the predictor coefficients are interpretable in terms of log odds ratios. The probit model coefficients cannot be interpreted as easily, but the probit model may fit better in other scenarios. For example, in method validation, probit regression is used to model the hit rate of a molecular test. You can then use the model to establish a detection limit or determine diagnostic cut-off points from an underlying continuous response. For a guided example, see our blog post.
Recent improvements to Analyse-it, in version 5.50 and later, include the addition of probit regression. Probit regression is useful when establishing the detection limit (LoD) for an RT-qPCR assay.
The CLSI EP17-A2 protocol provides guidance for estimating LoD and is recognized by the FDA. In this blog post, we will look at how to perform the relevant part of the protocol using Analyse-it.
For details on experimental design, see section 5.5 in the CLSI EP17-A2 guideline. In Analyse-it, you should arrange the data in 2 columns: the first should be the concentration, and the second should be the result, positive or negative. You should have a minimum of 20 replicates at each concentration. We have put together a hypothetical example in the workbook, which you can use to follow the steps below:
To perform the analysis in Analyse-it:
The analysis task pane opens.
NOTE: If using Analyse-it pre-version 5.65, on the Fit panel, in the Predict X given Probability edit box, type 0.95.
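For readers who want to see the arithmetic behind the Predict X given Probability step, here is a hedged sketch of the same idea outside Analyse-it (hypothetical data and made-up coefficients; it is not the Analyse-it implementation): fit the hit rate against log10 concentration with a probit model, then invert the fit to find the concentration giving a 95% detection rate.

# Sketch only: probit fit of hit rate vs log10(concentration), inverted to estimate the 95% LoD.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

conc = np.repeat([0.5, 1.0, 2.0, 4.0, 8.0], 20)                 # copies/mL, 20 replicates per level
rng = np.random.default_rng(1)
hit = rng.binomial(1, norm.cdf(-0.2 + 3.0 * np.log10(conc)))    # simulated positive/negative results

res = sm.Probit(hit, sm.add_constant(np.log10(conc))).fit(disp=False)
a, b = res.params                                               # intercept, slope

# P(hit) = Phi(a + b*log10(LoD)) = 0.95  =>  log10(LoD) = (Phi^-1(0.95) - a) / b
log10_lod = (norm.ppf(0.95) - a) / b
print(f"Estimated LoD (95% hit rate): {10 ** log10_lod:.2f} copies/mL")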
Following our last blog post, today we will show how to calculate binary agreement using the CLSI protocol. The protocol is a useful companion resource for laboratories and diagnostic companies developing qualitative diagnostic tests.
In Analyse-it, you should arrange the data in frequency or case form, as discussed in our earlier blog post. You can find an example of both, and follow the steps below, using the example workbook.
NOTE: The Average method is useful when comparing two laboratories or observers where neither is considered a natural comparator. The reference method is asymmetric, and the result will depend on the assignment of the X and Y methods, whereas the average method is symmetric, and the result does not change when swapping the X and Y methods.
INFO: Older versions of Analyse-it do not support the Average method, and the Agreement by category checkbox is called Agreement.
The analysis report shows positive and negative agreement statistics.
Due to COVID-19, there is currently a lot of interest surrounding the sensitivity and specificity of a diagnostic test. These terms relate to the accuracy of a test in diagnosing an illness or condition. To calculate these statistics, the true state of the subject, whether the subject does have the illness or condition, must be known.
In recent FDA guidance for laboratories and manufacturers, the FDA states that users should use a clinical agreement study to establish performance characteristics (sensitivity/PPA, specificity/NPA). While the terms sensitivity/specificity are widely known and used, the terms PPA/NPA are not.
The CLSI protocol describes the terms positive percent agreement (PPA) and negative percent agreement (NPA). When you have two binary diagnostic tests to compare, you can use an agreement study to calculate these statistics.
As you can see, these measures are asymmetric. That is, interchanging the test and comparative methods, and therefore the values of b and c, changes the statistics. They do, however, have a natural, simple interpretation when one method is a reference/comparative method and the other a test method.
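A small sketch shows both the calculation and the asymmetry, using the usual 2x2 layout (an assumption: a = test+/comparative+, b = test+/comparative-, c = test-/comparative+, d = test-/comparative-):

# Sketch only: PPA/NPA from a 2x2 agreement table, with hypothetical counts.
def agreement(a, b, c, d):
    ppa = a / (a + c)    # positive percent agreement: test positive among comparative positives
    npa = d / (b + d)    # negative percent agreement: test negative among comparative negatives
    return ppa, npa

a, b, c, d = 90, 4, 10, 196
ppa, npa = agreement(a, b, c, d)
print(f"PPA = {ppa:.1%}, NPA = {npa:.1%}")

# Interchanging the test and comparative methods swaps b and c, and the statistics change:
ppa2, npa2 = agreement(a, c, b, d)
print(f"PPA = {ppa2:.1%}, NPA = {npa2:.1%}")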
It is important in diagnostic accuracy studies that the true clinical state of the patient is known. For example, in developing a SARS-CoV-2 antibody test, for the positive subgroup you might enlist subjects who had a positive SARS-CoV-2 PCR test and clinically confirmed illness. Then, for the negative subgroup, you might use samples taken from subjects before the illness was in circulation. It is also essential to consider other factors, such as the severity of illness, as they can have a marked effect on the performance characteristics of the test. A test that shows high sensitivity/specificity in a hospital setting with very ill patients can be much less effective in population screening, where the severity of illness is lower.
In cases where the true condition of the subject is not known, and only results from a comparative method and a new test method are available, an agreement measure is more suitable. We will cover that scenario in detail in a future blog post.
In our last post, we mentioned that the 'accuracy' statistic, also known as the probability of a correct result, was a useless measure for diagnostic test performance. Today we'll explain why.
Let's take a hypothetical test with a sensitivity of 86% and specificity of 98%.
As a first scenario we simulated test results on 200 subjects with, and 200 without, the condition. The accuracy statistic (TP+TN)/N is equal to (172+196)/400 = 92%. See below:
In a second scenario we again simulated test results on 400 subjects, but only 50 with, and 350 without, the condition. The accuracy statistic is (43+343)/400 = 96.5%. See below:
The accuracy statistic is effectively a weighted average of sensitivity and specificity, with weights equal to the sample prevalence P(D=1) and the complement of the prevalence (that is, P(D=0) = 1-P(D=1)).
Accuracy = P(TP or TN) = (TP+TN)/N = Sensitivity * P(D=1) + Specificity * P(D=0)
Therefore, as the prevalence in the sample changes, so does the statistic. The prevalence of the condition in the sample may vary due to the availability of subjects, or it may be fixed during the design of the study. It's easy to see how the accuracy statistic could be manipulated to weight it in favor of whichever measure performs best.
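A quick calculation reproduces the two scenarios above and shows the prevalence-weighting at work (same sensitivity and specificity, different mix of subjects):

# Sketch only: accuracy = sensitivity*P(D=1) + specificity*P(D=0).
def accuracy(sens, spec, n_pos, n_neg):
    tp = sens * n_pos
    tn = spec * n_neg
    return (tp + tn) / (n_pos + n_neg)

sens, spec = 0.86, 0.98
print(accuracy(sens, spec, 200, 200))   # scenario 1: 0.92
print(accuracy(sens, spec, 50, 350))    # scenario 2: 0.965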
There’s currently a lot of press attention surrounding the finger-prick antibody IgG/IgM strip test to detect if a person has had COVID-19. Here in the UK companies are buying them to test their staff, and some in the media are asking why the government hasn’t made millions of tests available to find out who has had the illness and could potentially get back to work.
We did a quick Google search, and there are many similar-looking test kits for sale. The performance claims on some were sketchy, with some using as few as 20 samples to determine their performance claim! However, we found a webpage for a COVID-19 IgG/IgM Rapid antibody test that used a total of 525 cases, with 397 positives, 128 negatives, clinically confirmed. We have no insight as to the reliability of the claims made in the product information. The purpose of this blog post is not to promote or denigrate any test but to illustrate how to look further than headline figures.
We ran the data through Analyse-it version 5.51. Here's the workbook containing the analysis:
We’ve been busy over the last few months adding various new features and improvements to Analyse-it.
There are always many changes behind the scenes, which aren’t really newsworthy, but seek to make Analyse-it compatible with the latest changes to Microsoft Excel and Windows. Microsoft have recently embraced a more aggressive continuous-release deployment model with Office and Excel, and these releases have caused quite a few headaches for developers.
For these reasons we advise you have active maintenance so you get these updates as they are released – if not, we recommend you renew maintenance on your license to get the latest updates and ensure compatibility with the often fast-paced releases.
Starting with version 5.20 we introduced a new feature so you can set the scale (minimum, maximum, and units) for a variable, which will then be used, where possible, to set the scaling for chart axes. Normally Analyse-it will choose excellent axis scaling, but sometimes you know better and want a specific minimum, maximum, or units. Previously you would have to set the scale manually on the charts output from the analysis, and any changes would be lost on recalculation. Now you can control the scaling yourself.
Our focus at Analyse-it has always been on the development and improvement of our software. While we provide extensive help, tutorials, and technical support for Analyse-it, one area we do not cover is training and consultancy. As many of you will know we are based in England in the United Kingdom, and providing training and consultancy is often done better locally, in-person.
Instead we partner with experts who can provide training and consultancy in various disciplines, in the local language, and geographically near (or at least nearer) to our customers. You can always find a list of current consultancy and training partners on our website.
One of the experts we have had a long relationship with is Dr. Thomas Keller. Dr. Keller is an independent statistician and has run his own consultancy for 15 years. One of his many areas of expertise is the planning and evaluation of experiments for method validation, and he has been involved in international working groups (IFCC, CLSI) in the fields of clinical chemistry and laboratory medicine. Dr. Keller was actually a customer first and started to provide training in Analyse-it shortly after. His reputation is second to none in the industry, and he has provided consultancy and training to many companies using Analyse-it. Dr. Keller offers training courses, and also provides consultancy for anything from simple questions to full courses for individuals and small groups.
It’s been a long-requested feature, and today we’re happy to announce that Analyse-it version 5.10 now includes the ability to save the dataset filter with an analysis and re-apply it on recalculation.
Analyse-it has always allowed you to use Excel auto-filters to quickly limit analysis to just a subset of the data, but until now that filter wasn’t saved. Each time you recalculated the analysis, it was based on the currently active filter rather than the filter in effect when you created the analysis.
This “active filter” method had its uses in exploratory data analysis: you can easily create an analysis, adjust the filter criteria, click Recalculate to see the changes to the analysis, then repeat as necessary to explore the data. But it also had its limitations. For example, if you created two analyses to look at subjects where Age > 20 and Age <= 20, simply clicking Recalculate on those analyses could get you in a mess if you didn’t reset the filter conditions on each analysis before recalculating.
Update 19-Sep-2019: Unfortunately this continues to be an issue for some users, and there is currently no solution from Microsoft except to suggest the use of compatibility mode as detailed below. We have requested this be fixed, so please up-vote it.
Update 27-Jun-2018: Although we have a fix for this issue in an internal build, it appears that Microsoft Office version 1807 (which is currently only available on the Office Insider track) fixes this issue. The missing user-interface problem was caused by a bug in the Microsoft Office 1805/1806 updates. We will release our fix shortly, but the 1807 version update will also become available to everyone over the next month or so. If you want to get it immediately, switch to the Office Insider track.
Microsoft has recently released updates to provide support for multiple-monitor, high-DPI (dots-per-inch) displays.
In the early days of Microsoft Windows, monitors were assumed to have 96 DPI, and all applications worked on that assumption, with user interfaces fixed around it. In the last 15 years, monitors with higher DPI have started to appear, with the benefit that on-screen text and graphics look much smoother because there are so many more dots per inch. That caused problems for many applications which were fixed to assume 96 DPI, causing their user interfaces to scale improperly on high-DPI monitors. Applications like Analyse-it supported high-DPI monitors and adjusted their user interface appropriately... until now.
Today marks the 20th birthday of Analyse-it.
It was December 1997 when we shipped the first disks containing Analyse-it to paying customers. In some ways it seems just like yesterday, but in other respects software development and Analyse-it have come so far in those 20 years.
As many of you know, Analyse-it wasn’t our first foray into developing statistical software. My co-founder in Analyse-it, Simon Huntington, had previously developed Astute. Astute was the first statistical add-in for Microsoft Excel, initially released in 1992 for Excel 4.0, and released a few months before Microsoft’s Data Analysis Toolpak, which appeared in Excel 5.0.
We started developing Analyse-it in 1996. Astute was no longer available and so we started work on developing a replacement. We were just a start-up business, keen to get our first product to market and so worked 16 hour days, 6 days a week, for 18 months to build that first release of Analyse-it.
Software development back then was pretty brutal. It had improved leaps and bounds since the 1980s, when we started developing software, but it was still time-consuming and sheer hard (mental) work. The software development tools were relatively basic and computing power was a fraction of what it is today. We initially developed Analyse-it in C++, the only commercially-viable programming language back then, and although a very flexible programming language it was very easy to produce bug-ridden and hard-to-maintain software. Each addition or change to the software code during development took 5-10 minutes to compile before the software could be run (or tested) to see the results. Not quite as slow as the mainframe era, but in an iterative process such as software development it was still tediously slow. Compilation often failed, due to typos in the source code, which then needed to be corrected and compilation restarted. Finally when we had an executable we could then start debugging and testing it. Windows 95 was the operating system of choice, but wasn’t the most resilient. It was easy to crash Windows resulting in a reboot, reload of the software development tools, and so on. Another 15 minutes gone.
Prediction intervals on Deming regression are a major new feature in the Analyse-it Method Validation Edition version 4.90, just released.
A prediction interval is an interval that has a given probability of including a future observation (or observations). Prediction intervals are very useful in method validation for testing the commutability of reference materials or processed samples with patient samples. Two CLSI protocols, including EP14-A3, use prediction intervals.
We will illustrate this new feature using an example from CLSI EP14-A3:
1) Open the example workbook.
2) On the Analyse-it ribbon tab, in the Statistical Analysis group, click Method Comparison and then click Ordinary Deming regression.
3) In the X (Reference / Comparative) drop-down list, select Cholesterol: A.
4) In the Y (Test / New) drop-down list, select Cholesterol: B.
5) On the Analyse-it ribbon tab, in the Method Comparison group, click Restrict to Group.
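For anyone curious about the arithmetic underneath, here is a minimal sketch of an ordinary Deming fit outside Analyse-it (slope and intercept only; it does not compute the prediction intervals that are the subject of this release, and the paired results below are made up):

# Sketch only: ordinary Deming regression slope and intercept.
# lam is the ratio of the Y-error variance to the X-error variance (1 = equal error variances).
import numpy as np

def deming(x, y, lam=1.0):
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxx, syy = np.var(x, ddof=1), np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    slope = (syy - lam * sxx + np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
    return slope, y.mean() - slope * x.mean()

x = [3.9, 4.4, 4.9, 5.3, 5.8, 6.2, 6.8, 7.1]    # hypothetical method A (comparative) results
y = [4.0, 4.3, 5.1, 5.2, 6.0, 6.1, 6.9, 7.3]    # hypothetical method B (test) results
slope, intercept = deming(x, y)
print(f"y = {intercept:.3f} + {slope:.3f} x")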
Often we collect a sample of data not to make statements about that particular sample but to generalize our statements to say something about the population. Estimation is the process of making inferences about an unknown population parameter from a random sample drawn from the population of interest. An estimator is a method for arriving at an estimate of the value of an unknown parameter. Often there are many competing estimators for the population parameter that differ based on the underlying statistical theory.
A point estimate is the best estimate, in some sense, of the population parameter. The most well-known estimator is the sample mean, which produces an estimate of the population mean.
It should be obvious that any point estimate is not absolutely accurate. It is an estimate based on only a single random sample. If repeated random samples were taken from the population, the point estimate would be expected to vary from sample to sample. This leads to the definition of an interval estimator, which provides a range of values defined by the limits [L, U].
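A small numerical illustration (with made-up data) of the point estimate and a t-based 95% interval estimate [L, U] for the population mean:

# Sketch only: sample mean as a point estimate, with a 95% confidence interval.
import numpy as np
from scipy import stats

sample = np.array([4.1, 3.8, 5.0, 4.6, 4.9, 4.3, 5.2, 4.4, 4.7, 4.0])
n = len(sample)
mean = sample.mean()                          # point estimate of the population mean
se = sample.std(ddof=1) / np.sqrt(n)          # standard error of the mean
t_crit = stats.t.ppf(0.975, df=n - 1)         # two-sided 95% critical value
L, U = mean - t_crit * se, mean + t_crit * se
print(f"point estimate = {mean:.2f}, 95% CI = [{L:.2f}, {U:.2f}]")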
As we mentioned in last week's release announcement, in this release we took the opportunity to revamp the documentation.
The revamp involved rewriting many topics to make the content clearer, adding new task-oriented topics, including refresher topics on common statistical concepts, and improving the indexing and links between topics so you can more easily navigate the help system.
The new task-oriented topics give you step-by-step instructions on completing common tasks, from running a complete analysis down to simple everyday operations. We have also fully documented the supported dataset layouts for each type of analysis so you can see how to arrange your data for Analyse-it. The links in each topic help you more easily find related topics: for example, links to topics on how to interpret the statistics, links explaining the pros and cons of the available statistical tests, links to topics for common tasks, and a link showing you how to arrange the dataset.
Of all the requests, the most customer-requested improvement is the new PDF user guide. Previously we offered EPUB and Kindle reader editions of the help, but not PDF. To be honest, producing a PDF user guide from the tools we use to write the help was a real technical challenge. The PDF produced just wasn’t good enough for us, and certainly not for our customers – the formatting and layout were poor, indexing was non-existent, and there were so many other niggles. So we took the time to make the user guide both look good and be usable. Take a look and let us know what you think!
Last week we released version 4.80 of Analyse-it.
The new release includes multi-way analyses in the Standard edition, and since every licence includes the Standard edition, these features are available to all users. We also took the opportunity to revamp the documentation and develop a PDF user guide. We’ll go into more detail on the improvements in the next few weeks.
If you have active maintenance, you can download and install the update now. If maintenance on your license has expired, you can renew it to get this update and forthcoming updates.
Today we released version 4.60 of the Analyse-it Method Validation edition.
The new release now includes precision analysis with 3 nested factors, which extends Analyse-it’s support for CLSI EP05-A3 multi-laboratory precision studies.
We are delighted to announce the addition of the Analyse-it Quality Control and Improvement Edition to the range of Analyse-it products.
The new edition includes the most impressive statistical process control (SPC) charts available in any Excel statistical software package, including Shewhart, Levey-Jennings, CUSUM, and EWMA charts. Process capability statistics and plots help you ensure a process is able to meet specifications. And Pareto plots help you identify the quality-related problems that need the most attention and let you monitor efforts to reduce them.
Screenshot: Xbar-R chart of a quality characteristic by phase with stratification.
Screenshot: Process capability plots by phase.
Screenshot: Pareto plot of failures with stratification and sorted by key plot.
The new release is currently available for pre-release beta testing. Anyone can download and test the new release, though you will need an active internet connection to use it as we monitor usage and reliability of the pre-release.
Microsoft officially released Office 2016 a couple of days ago, and Analyse-it version 4.20 now adds support for Excel 2016.
Over the next few weeks we will tweak the Analyse-it user interface so it matches the new Office 2016 user interface styles. Line styles on the plots in Excel 2016 now also appear a lot thicker, due to anti-aliasing (smoothing), so we will decide whether to address that in a future update – let us know what you think.
If you have active maintenance, you can download and install the update now. If maintenance on your licence has expired, you can renew it to get this update and forthcoming updates.
A critical feature of any analytical and statistical software is accuracy. You are making decisions based on the statistics obtained and you need to know you can rely on them.
We have documented our validation process previously, but another good benchmark to test statistical software against is the NIST StRD. The Statistical Engineering and Mathematical and Computational Sciences Divisions of NIST’s Information Technology Laboratory developed datasets with certified values for a variety of statistical methods, against which statistical software packages can be benchmarked. The certified values are computed using ultra-high-precision floating-point arithmetic and are accurate to 15 significant digits.
For more information about the NIST StRD see:
We tested version 4.00 of Analyse-it against the NIST StRD on an Intel Xeon dual processor PC.
No statistical package achieves perfect accuracy for all the tests and no one package performs best for every test. Most statistical packages use IEEE754 double precision (64bit) floating point arithmetic and due to finite precision, round-off, and truncation errors in numerical operations, are unable to obtain the exact certified value.
Today we released version 4.0 of the Analyse-it Method Validation edition. This is a major new release with many new features and improvements.
The latest release of the Analyse-it Method Validation edition now supports 10 of the latest CLSI evaluation protocol (EP) guidelines. CLSI guidelines are world-renowned and are recognized by the College of American Pathologists (CAP), The Joint Commission, and the US Food and Drug Administration (FDA).
Analyse-it has been a driving force in the adoption of statistics in method validation for over 15 years, has influenced many recommendations, and is the only software available with such extensive coverage for the latest CLSI guidelines.
CLSI guidelines supported in version 4.0 include:
Measurement Systems Analysis (MSA) is a new feature in version 4.0. MSA unifies precision and linearity, which were available in earlier versions of Analyse-it, but also includes trueness (bias) and detection capability so you can establish the limit of blank (LoB) and limit of detection (LoD). The unification of these analyses in MSA lets you dig deep to examine and understand the performance characteristics of a measurement procedure.
The recent passing of Professor Rick Jones caused me to reflect on the past.
I was very fortunate to earn a work placement with Dr Rick Jones at The University of Leeds in the summer of 1990. Rick was enthusiastic about the role of IT in medicine, and after securing funding for a full-time position he employed me as a computer programmer. Early projects included software for automating the monitoring of various blood marker tests and software to diagnose Down’s syndrome. At the time many hospitals had in-house solutions for diagnosing Down’s syndrome, and although the project took many years and the help of many other people to complete, it eventually gained widespread adoption.
Around 1992, Rick came up with the idea of a statistics package that integrated into Microsoft Excel. Armed with a ring bound folder containing the Excel SDK and a pile of medical statistics books, I set about the task of writing the software in C++. It wasn’t long before the first version of Astute was ready and commercially released.
Today we released version 3.80 of the Analyse-it Standard edition.
The new release includes Principal Component Analysis (PCA), an extension to the multivariate analysis already available in Analyse-it. It also includes probably the most advanced implementation of biplots available in any commercial package.
New features include:
The tutorial walks you through a guided example looking at how to use correlation and principal component analysis to discover the underlying relationships in data about New York neighbourhoods. It demonstrates the amazing new features and helps you understand how to use them. You can follow the tutorial yourself, at your own pace.
If you follow us on Facebook, you will no doubt already know about the recent improvements in the Analyse-it Method Validation edition and the release of our first video tutorial. If not, now is a good time to follow us, since we post short announcements and feature previews on Facebook and use the blog only for news about major releases.
The latest changes and improvements to the Analyse-it Method Validation edition include:
Finally, we are delighted to release our first video tutorial. The video is the equivalent of the tutorial above: it walks and talks you through using Analyse-it to determine the agreement between methods. Sit back and watch.
We intend to produce more video tutorials in future, so let us know what you think: what you like, dislike, and how we can improve them in future.
Today we released version 3.70 of Analyse-it.
The new version includes many new features, which some of you may have read about on our Facebook page over the last few weeks:
Compare groups
Fit Model
Correlation
If you have active maintenance, you will be notified that an update is available when you next start Analyse-it, or you can download and install the update now. If maintenance on your licence has expired, now is a good time to renew it to get this update and forthcoming updates.
We'll go into more detail on the new features over the next few weeks. In the meantime, follow us to get the latest news, see announcements of forthcoming features, and get handy tips on how to use the myriad features in Analyse-it.
Probably the greatest concern when using statistical software is reliability. Is the software producing accurate, numerically correct results that have been validated?
It’s a very important question. Many of you work in FDA and regulated environments where the penalties for mistakes are very high. And those of you outside such environments are still making important business and research decisions using Analyse-it. It’s therefore imperative that the software you depend upon is developed to a professional standard, thoroughly tested and validated.
Today we are publishing details of the software development and validation process we use at Analyse-it, see:
For those of you working in regulated environments the document is essential to justify the use of off-the-shelf, third-party software.
For everyone else, it explains exactly how Analyse-it is developed, tested, and validated. You can use it as a checklist against which to judge the quality of alternatives. But be wary: professional software development is complex, and developing numerical software more so. We have over 25 years of software development experience (each developer, not aggregate), and over 20 years developing statistical software.
We have just released version 3.60 of the Analyse-it Standard edition. It now includes repeated-measures ANOVA and Friedman tests in the Compare Pairs analysis.
If you have active maintenance, Analyse-it will notify you an update is available in the next few days, or you can download it immediately at:
If you do not have active maintenance, now is a great time to extend maintenance to get this latest update and all updates for the next 1- or 3-years, see:
Today we released the Analyse-it Method Validation edition version 3.5. The software is feature complete, validated, and includes documentation. It supports Excel 2007, Excel 2010 (32- and 64-bit) and Excel 2013 (32- and 64-bit).
We took this opportunity to rename the product from the Analyse-it Method Evaluation edition to the Method Validation edition. The product is the same, but the new name better reflects the intended purpose of the product.
Method Comparison
Precision
Linearity
Reference interval
Diagnostic performance / ROC
Binary diagnostic tests
For more information about the new version, and to download a free 30-day trial, see:
Pricing for the Analyse-it Method Validation edition starts at US$ 699 for a 1-user perpetual licence. If you already have a licence with active maintenance, you qualify for a free upgrade; otherwise, you can extend maintenance to get the upgrade (and all updates for 1 or 3 years) free of charge. To see if you qualify for a free upgrade, or to get a quote to extend maintenance, see:
Today we released the first public beta test version of the Analyse-it Method Evaluation edition, version 3.5. The software is feature complete and is validated – it is now only missing documentation.
We invite everyone to download the beta and try the new version of the software before it is finally released in September. You will need Excel 2007, 2010, or 2013 (32-bit and 64-bit versions are supported) and it can be installed and used alongside older versions of Analyse-it so it won't interrupt your day-to-day work.
To download the beta version:
To activate the software use the product key:
The software will be publicly released at the end of September 2013.
If you purchased a licence in the last 12 months, the 12 months of maintenance included means you will qualify for a free upgrade to the new version.
If you are outside the 12 month free upgrade period you can purchase 12 months of maintenance, to get the upgrade (and all updates in the following year), for 20% of the cost of your licence. For example, if you have a 1-user licence then the upgrade will cost 20% of the cost of a 1-user licence. Similarly if you have a 3-user licence the upgrade cost would be 20% of the cost of a 3-user licence.
Today we released the 3rd alpha release of the Analyse-it Method Evaluation Edition 3.5. Alpha releases are versions of the software that are still in active development, but are released to a small group of customers so we can identify and fix any problems before the public beta release.
This release now completes the package with method comparison, which includes Deming regression, Passing-Bablok regression, and Bland-Altman difference plots. Linearity, precision analysis, diagnostic performance (ROC analysis and binary test performance) and reference intervals were already included in earlier alpha releases.
If you would like to take part in this and subsequent test phases, reply to this post or contact us. The test releases will run alongside any existing version of Analyse-it, so your day-to-day work won't be interrupted or affected. And those who help during testing will receive a discount on the upgrade (a free upgrade for those who contribute the most) when the product is released later this year.
Today we released the 2nd alpha of the Analyse-it Method Evaluation Edition 3.5.
Alpha releases are pre-release versions of the software that are still in active development. We release them to a small group of customers so we can get feedback and quickly identify and fix any problems before the public beta release. If you want to take part in the test phase, reply or comment on this post or contact us. You can use pre-release versions of Analyse-it alongside your existing version of Analyse-it, so it won't disrupt your work. And, if you help during the test phases, you will get a discount on the upgrade (a free upgrade for those who contribute the most) when the product is released later this year.
This latest alpha release includes linearity and precision analysis, plus diagnostic test performance (ROC analysis and binary test performance) and reference intervals from the 1st alpha.
Some of the new features included so far are:
We are now starting to release test previews of a major update to the Analyse-it Method Evaluation edition. The new release will include many new features (we'll reveal more in the coming weeks) and will support 32- and 64-bit versions of Excel 2007, 2010, and 2013.
During the initial test phases we release development versions of the application to a small group of customers to ensure it installs and runs as expected on a wide range of PCs and configurations. The official beta test phase then follows, where more customers are invited to download and use the software while we iron out the final few bugs before the official release. The official release is planned for summer 2013.
If you want to take part in the test phase, reply to this post or contact us and let us know what aspects of Analyse-it you use:
Analytical Linearity, Precision, Accuracy, Diagnostic performance (ROC, binary test performance), Reference ranges, Agreement (Bland-Altman).
What is a sample quantile or percentile? Take the 0.25 quantile (also known as the 25th percentile, or 1st quartile) -- it defines the value (let’s call it x) for a random variable, such that the probability that a random observation of the variable is less than x is 0.25 (25% chance).
A simple question, with a simple definition? The problem is calculating quantiles. The formulas are simple enough, but take a quick look on Wikipedia and you’ll see there are at least 9 alternative methods. Consequently, statistical packages use different formulas to calculate quantiles, and we're sometimes asked why the quantiles calculated by Analyse-it don’t agree with Excel, SAS, or R.
Excel uses formula R-7 (in the Wikipedia article) to calculate the QUARTILE and PERCENTILE functions. Excel 2010 introduced two new functions: PERCENTILE.INC, which is equivalent to PERCENTILE, and PERCENTILE.EXC, which uses a slightly different formula with a different denominator.
SAS, R and some other packages let you choose which formula is used to calculate the quantiles. While this provides some flexibility, as it lets you reproduce statistics calculated using another package, the options can be confusing. Most non-statisticians don’t know when to use one method over another. When would you use the "Linear interpolation of the empirical distribution function" versus the "Linear interpolation of the modes for the order statistics for the uniform distribution on [0,1]" method?
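If you want to see the effect of the choice of formula for yourself, recent versions of NumPy (1.22 and later) expose several of the methods from the Wikipedia article by name; 'linear' corresponds to R-7, the formula used by Excel's QUARTILE/PERCENTILE functions:

# Sketch only: the same 0.25 quantile under different calculation methods.
import numpy as np

data = np.array([7, 15, 36, 39, 40, 41])
for method in ["linear", "weibull", "hazen", "median_unbiased", "inverted_cdf"]:
    print(f"{method:16s} 0.25 quantile = {np.quantile(data, 0.25, method=method):.2f}")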
Yesterday we improved the help in Analyse-it and added a statistical reference guide. The guide tells you about the statistical procedures in Analyse-it, with help on using and understanding the plots and statistics. It’s a work in progress, and we intend to improve it further with your comments and feedback, but it’s important to understand the role of the guide.
Firstly, the guide is not intended to be a statistics textbook. While it covers key concepts in statistical analysis, it is no substitute for learning statistics from a good teacher or textbook.
Secondly, the guide does not include the mathematical formulas behind the statistics. While an understanding of the mathematics is useful, it is better to understand the practical application of statistics: when and where they can be used, and how to interpret the results. Software makes it unnecessary to know the exact formulas, and often the exact mathematics used in software differ from those in textbooks since optimised routines are used to ensure good performance and numerical precision.
Leeds, England (PRWEB) October 03, 2012 -- Analyse-it Software, Ltd. today announced a major new release of their popular statistics add-in for Microsoft Excel, Analyse-it®. With support for Excel 2007, 2010 and the forthcoming Excel 2013, Analyse-it transforms Microsoft Excel into a cost-effective, powerful statistical analysis and data visualization package. Statistics and plots are included for exploring and describing data, estimating parameters, testing hypotheses, uncovering relationships and fitting models.
"We’ve made major improvements in Analyse-it 3.0", said Simon Huntington, Director of Statistical Products at Analyse-it. "The improvements make Analyse-it a serious rival to larger statistical packages costing up to 5 times the price. Unlike the alternatives though, Analyse-it lets you perform all your statistical analysis without having to leave Microsoft Excel or export your data from it."
Amongst the hundreds of improvements, model fitting and regression analysis have been improved to support simple models, such as linear, logarithmic, exponential and power regression, as well as more advanced models. Relationships between variables can be visualized, and partial residual (leverage) plots show the effect of each term when building the model. Residual plots, distribution plots, lag plots, and sequence plots are also included for checking model assumptions. And an influence plot helps to quickly identify outliers and influential points, based on Studentized residuals and Cook’s D.
Today we pushed the release candidate of the Analyse-it Standard Edition v3.0 for Microsoft Excel 2007 & 2010, our statistical analysis software for Microsoft Excel.
The release candidate is feature complete and is intended to be essentially the final public release of the software.
The software is now validated against our library of thousands of tests to ensure the statistics and plots are accurate and correct. You can now use the software in your day-to-day work and, like previous versions, it can be used alongside your current version of Analyse-it until you become more comfortable with it.
The release candidate also includes a user-guide and five tutorials to guide you through using the software by example. We recommend you complete the tutorials first as they help you quickly understand how to use Analyse-it 3.0 and demonstrate many of the types of statistical analysis included. The send feedback feature is available from the help toolbar so you can easily send your suggestions on how we can improve the help.
We’re pleased to release the final beta of the Analyse-it Standard Edition, v3.0. The beta is now publicly open to anyone as we iron out any final issues and conduct final testing before release.
To download the beta, please visit:
Screenshot: Analyse-it fit model analysis includes an influence plot to identify points with a substantial effect on the fitted model.
Many of you have asked what statistical tests and plots are included in the new release. The full specification is shown below. In coming weeks we'll announce pricing, upgrade pricing, and award free licences to the beta testers who contributed most time and effort.
Descriptive statistics
Mean, Median, Variance, SD, Skewness, Kurtosis
Quantiles / Percentiles
Frequency distribution table
Contingency table
Correlation coefficients – r, rs, tau
Plots
Histogram
Frequency plot
Dot plot
Box plot
Mean error bar plot
Today we released the first public beta of Analyse-it Standard Edition, v3.0.
Maybe we're biased, but it is an amazing product! It's been a heck of a lot of work, but the range of statistical tests and plots in Analyse-it v3 rival what the expensive, established statistical packages provide. In fact, in many cases we’ve surpassed what they offer. And the statistical plots go beyond what's available in any other Excel add-in.
The beta-test programme is open to everyone. Even if you don't currently use Analyse-it you are welcome to join. If you do use Analyse-it the beta will run alongside any existing version, so your day-to-day work won't be interrupted or affected.
To join, drop us an e-mail and follow us on Google+. We will then send you a link with instructions on how to download, install and start using the software. As a reward for your time, all Google+ beta-test followers will get a discount on the upgrade when Analyse-it v3.0 is launched.
In the coming weeks we will reveal more about Analyse-it v3.0, including upgrade and new licence pricing. If you are currently considering whether to buy a licence for Analyse-it v2.0, don’t hesitate – you’ll get the upgrade to version 3.0 free of charge and avoid the price increase.
If you have been following us, you will have seen that we released a new version of the Analyse-it Standard Edition, v3.0, to testing this week.
During the initial test phases we release development versions of the application to a small group of customers to ensure it installs and runs as expected on a wide range of PCs and configurations. The official beta test phase then follows, where more customers are invited to download and use the software while we iron out the final few bugs before the official release. The official release is planned for early 2012.
Screenshot: Analyse-it v3 scatter matrix with histograms and density ellipses
In coming months we’ll reveal more about the many new features, statistics & plots in Analyse-it Standard Edition 3 (see image above). Pricing and upgrade costs will also be announced, though many, including anyone buying a licence today, will receive the upgrade free.
To join, get involved, and start testing early releases of Analyse-it 3, e-mail us. The only commitment is that you need to use the application for a few hours over the coming weeks, and you will need either Excel 2007 or Excel 2010 (on any version of Windows). You can use the test version alongside your current Analyse-it, so it will not disrupt your day-to-day work.
It's been a few months since we released Analyse-it 2.22, which added compatibility with, what was then, the Excel 2010 release candidate. Now it seems many are upgrading from Excel 2003 and are contacting us to ask whether Analyse-it is compatible. It is!
Interestingly, starting with Office 2010, Microsoft is providing 32- and 64-bit versions of Microsoft Excel. Until now Excel has been a 32-bit application only (going back to Excel 5, which was a 16-bit application). The 64-bit version natively supports the 64-bit microprocessors becoming more common in desktop PCs and allows you to work with truly enormous quantities of data.
Unfortunately most Excel add-ins, such as Analyse-it, are 32-bit applications and are not compatible with the 64-bit versions. It will take some time until add-ins like Analyse-it natively support 64-bit – it has taken Microsoft many years to take the first step. For this reason, Microsoft recommends you only choose to install 64-bit Excel if you really need it. And 99.99% of users don’t: 32-bit Excel supports over 1 million rows by over 16,000 columns and workbooks up to 2GB – that’s a lot of information!
Yesterday Microsoft launched Windows 7. It’s the next version of Windows, following on from Windows Vista, Windows XP, and Windows 2000.
As a software vendor we had early access to Windows 7 and have been using it daily for approximately 6-8 months. Our impression is that Windows 7 is very reliable, stable, much faster than Vista, and an upgrade we wouldn’t hesitate to recommend.
Under the hood, dramatic improvements have been made. Start-up time is reduced so Windows 7 is ready to use from cold-start or hibernation far more quickly than Vista and even XP. In normal use it also feels much faster and more responsive.
Usability is improved significantly, with many slick user interface improvements especially for managing applications on the task bar. Many users will be relieved to know that the learning curve is shallow though – the user-interface isn’t such a departure from Windows Vista that you’ll be less productive for the days after upgrading.
Today we released the latest update to Analyse-it, version 2.20, which includes some minor fixes and major improvements. The major improvements affect users of Excel 2007, though we recommend all users download this update for the minor fixes included.
Late last week Microsoft started to release Microsoft Office 2007 Service Pack 2, and made it publicly available on Tuesday this week. Service Pack 2 includes many improvements, including some worthy performance improvements for users of Microsoft Outlook.
Unfortunately, the changes made to Excel 2007 caused Analyse-it 2.14 (and earlier) to sometimes crash when you clicked Refresh or Collate on the Report toolbar. Analyse-it 2.20 fixes the problem, working around the bug introduced in Excel 2007 Service Pack 2.
If you want to download Microsoft Office 2007 Service Pack 2, you can download it directly from Microsoft or install it automatically through Microsoft Update.
We've also taken the opportunity in this release to add support for large Excel 2007 worksheets. Before Excel 2007 a worksheet was limited to 65,536 rows and 256 columns. Excel 2007 expands this so you can now have a worksheet as large as 1,048,576 rows and 16,384 columns.
The British government recently announced a 2.5% reduction in VAT (sales tax) on goods purchased from the United Kingdom. UK VAT was previously 17.5%, but from 1st December 2008 until the end of 2009 it has been reduced to 15%.
Like many businesses, last Monday, we implemented the change.
Customers in the United Kingdom who aren’t VAT exempt, and those of you in Europe who are not VAT registered and must pay VAT, will now pay only 15% VAT.
For customers outside Europe, including those of you in the USA, Canada, and Australia, UK VAT still does not apply and you can continue to purchase licences without paying UK sales tax.
In clearly titling this blog post, we’ve probably already revealed the answer, but... Can you spot the difference between the two rows of values in the Excel spreadsheet shown below?
Sorry, it’s a trick question, because (visually) there is no difference. The difference is how the values are stored by Microsoft Excel. The value 57 in the cell on the second row is actually stored as a text string, not a number.
When you type a value into a cell, Excel looks at what you’ve typed and decides whether it’s a valid number. If it is, the value is stored as a number, and if not it’s stored as text (a string of characters).
Considering this, how is it possible for Excel to store a value that looks like a number, as text? There are a few ways. Most common is when you copy-paste data from another application, and the application providing the data fools Excel into believing the values should be stored as text. Similarly, if you import data from a database field that contained numbers stored as text, the numbers will be imported as text. Finally, you can force Excel to store a number as text by prefixing it with an apostrophe (‘).
Today we’re delighted to publish the second case study into the use of Analyse-it.
The case study features a national clinical laboratory in the USA that offers more than 2,000 tests and combinations to major commercial and government laboratories. They use Analyse-it to determine analytical performance of automated immunoassays for some of the industry’s leading in-vitro diagnostic device makers -- including Abbott Diagnostics, Bayer Diagnostics, Beckman Coulter and Roche Diagnostics.
Unfortunately we cannot name the end-user, or the organisation she works for, in the case study. Although she was delighted to feature in the case study, at final approval her organisation's committee preferred the names be withheld. Thankfully they have allowed us to use the case study, albeit anonymously.
You can read the case study online now or download it.
We would love to feature more customer stories in case studies. If you can get approval to participate – which we realise is very difficult in many industries – and have 20 minutes to spare for a telephone interview, please contact us.
In a previous post, we explained the tests provided in Analyse-it to determine if a sample has a normal distribution. In that post, we mentioned that although hypothesis tests are useful, you should not rely on them alone. You should always look at the histogram and, maybe more importantly, the normal plot.
The beauty of the normal plot is that it is designed specifically for judging normality. The plot is very easy to interpret and lets you see where the sample deviates from normality.
As an example, let’s look at the distribution of systolic blood pressure, for a random group of healthy patients. Analyse-it creates the histogram (left) and normal plot (right) below:
Looking at the histogram, you can see the sample is approximately normally distributed. The bar heights for 120-122 and 122-124 make the distribution look slightly skewed, so it’s not perfectly clear.
The normal plot is clearer. It shows the observations on the X axis plotted against the expected normal score (Z-score) on the Y axis. It’s not necessary to understand what an expected normal score is, nor how it’s calculated, to interpret the plot. All you need to do is check that the points roughly follow the red line. The red line shows the ideal normal distribution with the mean and standard deviation of the sample. If the points roughly follow the line – as they do in this case – the sample has a normal distribution.
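Outside Analyse-it, a similar pair of plots can be sketched with SciPy and matplotlib (simulated data; note that scipy's probplot puts the expected normal scores on the X axis and the observations on the Y axis, the opposite arrangement to the Analyse-it plot described above):

# Sketch only: histogram and normal (Q-Q) plot for simulated systolic blood pressure readings.
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

rng = np.random.default_rng(7)
systolic = rng.normal(loc=120, scale=8, size=50)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4))
ax1.hist(systolic, bins=8)
ax1.set_title("Histogram")
stats.probplot(systolic, dist="norm", plot=ax2)   # points plus fitted line for the ideal normal
ax2.set_title("Normal plot")
plt.show()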
A customer contacted us last week to ask how to refer to cells on an Analyse-it report worksheet, from a formula on another worksheet. The customer often used Analyse-it's refresh feature, to repeat the statistical analysis and update the statistics, and direct references to cells on the report were being lost on refresh.
As an example, suppose you have used Analyse-it linear regression to calculate the linear relationship between installation cost and the number of employees required, the distance to the site, and the cost of the machine being installed. Analyse-it would calculate the effect of each variable on the final cost, technically known as regression coefficients, which you can then use to predict installation costs for future jobs.
You might setup a worksheet to predict and quote installation costs for future jobs. You could use an Excel formula to reference the coefficients directly from the Analyse-it report, for example:
= Employees * CostAnalysis!C17 + Distance * CostAnalysis!C18 + MachineCost * CostAnalysis!C19
Today we’re delighted to publish the first case study into the use of Analyse-it.
Marco Balerna Ph.D., a Clinical Chemist at the EOC in Switzerland, used Analyse-it when replacing the clinical chemistry and immunological analysers in EOC’s laboratories.
Since the EOC provides clinical chemistry services to five large hospitals and three small clinics in the region, it was essential the transition to the new analysers went smoothly. Marco used Analyse-it to ensure the analyser’s performance met the manufacturer’s claims, to ensure the reporting of patient results was not affected, and to comply with the regulations of the EOC’s accreditation.
Overall the project involved comparing performance for 110-115 parameters, comprising over 25,600 measurements with control materials and patient samples.
Marco was so impressed with Analyse-it and the time he saved that he was very enthusiastic when we asked if we could feature his story in a case study. We would like to publicly thank Marco for his co-operation in the case study. Grazie Marco! Salute!
Last Friday we released the latest update to Analyse-it, version 2.12 -- a minor update, providing fixes to minor issues recently reported by customers. The update is available free.
If you're using version 2.11 and not experiencing any of the issues fixed, then you can skip the update if you wish. But if you're using an earlier version of Analyse-it, version 2.10 or earlier, we recommend you get the update.
If you’re unsure which version of Analyse-it you’re using, see our FAQ.
Analyse-it automatically checks for updates every 15 days on start-up, and will tell you if an update is available to download. Firewalls can get in the way though, so if you haven’t got a notification yet, or want to download 2.12 right away, you can download at:
When you’ve downloaded the update, simply install it over your existing version of Analyse-it. There’s no need to uninstall the old version, and you won’t need your product key to reactivate Analyse-it -- unless you’re upgrading from Analyse-it version 1.xx, in which case you’ll need a product key to activate Analyse-it 2.12.
Although the charts in Analyse-it are large so they’re easy to read when printed, sometimes you need to print a chart to fill the full page. You can do so easily, without resizing the chart, in just a few steps:
Chart size is only limited by the page size your printer supports.
We’re pleased to announce that from today we can accept payment in EUROs. You can now see prices for Analyse-it in EUROs, as well as British Pounds sterling, and US dollars (for customers in the USA & Canada).
The EURO is now the primary currency in Europe and many of our customers have asked us to accept EUROs. We can accept payment in EUROs by VISA, MasterCard or AMEX credit card, or by cheque or wire-transfer.
When paying by credit card, your card is charged the exact amount in EUROs. There are no currency conversions and no conversion charges: the price you see in EUROs is exactly what we charge to your credit card.
For established organisations we are happy to accept purchase orders and to invoice in EUROs. The invoice can be paid by cheque or wire-transfer, directly in EUROs, avoiding currency conversion costs.
Identifying what was analysed, when, and by whom, is the first step in understanding any Analyse-it report. The top rows of each Analyse-it report provide this information: the statistical test used, the dataset and variables analysed, the user who performed the analysis, and the date and time the report was last analysed (see below). When you print the report, the header is repeated at the top of each printed page.
The date the report was last updated is included so you can see when reports are out of sync with changes made to the dataset. It’s also useful if you archive analysis reports and need to know when the analysis was performed. For brevity Analyse-it shows only the date, but the cell also contains the time of the last update to the report. To see the time, click the cell containing the date to activate it, and then look at the Excel formula bar to see the time (see screenshot above).
To aid traceability, Analyse-it includes the name of the user who last updated the report. Analyse-it gets the name from the Microsoft Office user name, which is shared among all Microsoft Office applications, including Word, Excel, and PowerPoint. Office applications use the name to identify changes in documents, and store it in the document properties to identify who created, edited, or last modified a document. Analyse-it includes the name in the report header so you can quickly see who last analysed the data, should you need to contact them.
In May this year, we surveyed users of the Analyse-it Method Evaluation edition to gain insight into how we can improve Analyse-it in future. Thank you to all those who responded.
In the responses, one issue became clear: the unfiled reports feature causes confusion.
When you run an analysis, Analyse-it creates a new worksheet containing the statistics and charts for that analysis (what we call a report). Analyse-it places the report in a temporary workbook called Unfiled Reports. From there you can decide what you want to do with the analysis: keep it, print it, e-mail it, or discard it. If you want to keep it, you click the (see below), and Analyse-it moves the report into the same workbook as your dataset.
You might wonder where the idea of unfiled reports originated. It was actually a carry-over from Astute, the predecessor to Analyse-it. We implemented the same feature in Analyse-it without really questioning it: we thought it would help you manage reports, and most Astute users upgrading would expect it.
The most widely used distribution in statistical analysis is the normal distribution. Sometimes called the Gaussian distribution, after Carl Friedrich Gauss, the normal distribution is the basis of much parametric statistical analysis.
Parametric statistical tests often assume the sample under test is drawn from a population with a normal distribution. By making this assumption about the data, parametric tests are more powerful than their non-parametric counterparts: they can detect differences with smaller sample sizes, or detect smaller differences with the same sample size.
It’s vital you ensure the assumptions of a parametric test are met before use.
If you’re unsure of the underlying distribution of the sample, you should check it.
Only when you know the sample under test comes from a population with normal distribution – meaning the sample will also have normal distribution – should you consider skipping the normality check.
Many variables in nature follow the normal distribution, for example biological variables such as blood pressure, serum cholesterol, height, and weight. You could choose to skip the normality check in these cases, though it’s always wise to check the distribution of the sample.
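If you want to see what such a check can look like outside Analyse-it, here is a short Python sketch using a Shapiro-Wilk test. The data are simulated, and in practice you would also look at a histogram or normal plot rather than rely on a single p-value and the conventional 0.05 cut-off.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
sample = rng.normal(loc=120, scale=15, size=40)   # simulated blood-pressure-like data

w, p = stats.shapiro(sample)   # Shapiro-Wilk test of normality
print(f"Shapiro-Wilk W = {w:.3f}, p = {p:.3f}")

if p > 0.05:
    print("No evidence against normality; a parametric test (e.g. a t-test) is reasonable")
else:
    print("Evidence of non-normality; consider a non-parametric test (e.g. Mann-Whitney)")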
For new and occasional Analyse-it users, datasets can sometimes seem confusing. Today we’ll explain why we devised the 'dataset' concept, a concept now copied by some other Excel add-ins.
We introduced the dataset concept so Analyse-it could automatically pick up the data and variables from your Excel worksheet. As we found with Astute, the Analysis ToolPak, and other Excel add-ins, forcing you to select the cells containing the data to be analysed can be problematic:
Statistics software is supposed to simplify what’s already a complex and error-prone subject area. Forcing you to select ranges of cells just seemed, to us, to introduce more potential for errors.
In Analyse-it we tried to solve all these problems. We wanted the software to do more of the work, eliminating the need to select or re-organise your data. You should be able to:
These requirements mean Analyse-it has to know exactly how your data is arranged on an Excel worksheet -- which cells contain the data for analysis, and which cells contain the variable names.
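As a rough illustration of the layout we mean (using Python and pandas rather than Analyse-it, with a hypothetical file and sheet name): variable names sit in the header row, and each column below holds the observations for one variable, so software can find everything without you selecting any cell ranges.

import pandas as pd

# Read a worksheet laid out as a dataset: the header row holds the variable
# names, and each column below it holds the observations for that variable
data = pd.read_excel("installation costs.xlsx", sheet_name="Data", header=0)

print(list(data.columns))   # variable names, taken from the header row
print(len(data))            # number of cases (rows of data)
print(data.dtypes)          # numeric vs. text variables, inferred automatically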
A few readers have e-mailed to ask for more information about the book by David J. Sheskin we alluded to in the comment reply re: the , last week.
The book is the Handbook of Parametric & Non-parametric Statistical Procedures, by David J. Sheskin, ISBN: 1584888148.
We have the third edition of the book which runs to over 1,200 pages -- a phenomenal piece of work for a single (obviously very dedicated) author. While it’s not a book you would sit down and read cover-to-cover, it is a very readable reference guide, covering all the parametric and non-parametric statistical procedures included in Analyse-it.
For beginners the book starts at the very beginning, introducing summary statistics such as the mean and median, then moving on to explain concepts such as measurement scales, central tendency, variability, the normal distribution, hypothesis testing, and parametric and non-parametric statistics. The text is concise but clear, easy to read, and easy to understand -- ideal for anyone needing a refresher course in statistics.
Many Analyse-it users readily admit their statistics knowledge is a little rusty, usually because it’s 10 years or more since their last statistics course. Should I use the t-test, the Mann-Whitney test, or the Wilcoxon test? The names of the tests aren’t exactly helpful, nor do they give any clue about the assumptions that must be met to use them.
That’s why we devised the .
It’s a simple interactive wizard that asks what you want to do, what data you’ve observed, checks which pre-test assumptions can be met, then tells you the best statistical test to use. Using the advisor you can be confident you’re using the correct statistical test -- or even use it to check if your statistics knowledge really is as rusty as you think!
Try it for yourself now:
Bear in mind this is only a simple prototype at the moment. Eventually the advisor will be integrated into Analyse-it, leading you to help and tutorials showing how to use the recommended test and interpret the statistics. You’ll also notice the advisor recommends tests that will be new in Analyse-it 3.
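To give a flavour of the kind of decision logic involved, here is a toy sketch in Python (this is not the advisor's actual code) of choosing between the two-sample tests mentioned above, based on whether the samples are paired and whether the normality assumption holds.

def recommend_test(paired: bool, normal: bool) -> str:
    """Suggest a test for comparing the central location of two samples."""
    if paired:
        return "Paired t-test" if normal else "Wilcoxon signed-rank test"
    return "Independent t-test" if normal else "Mann-Whitney test"

print(recommend_test(paired=False, normal=True))    # Independent t-test
print(recommend_test(paired=True,  normal=False))   # Wilcoxon signed-rank test

The real advisor asks more questions -- what you want to do, what data you have observed, and which pre-test assumptions you can reasonably meet -- before making a recommendation.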
Depending on where you’re located, the way we spell “Analyse-it” may intrigue you. We chose the name in 1997 as it sounded active, a direction to analyse it! – similar to many product names of the time.
The name has served us well and hints at what our business and product offer.
At the time we didn’t think such a simple name would cause so many headaches. Before you wonder: Analyse-it doesn’t mean anything offensive in other languages, but it can be spelt in different ways:
If you’re from the US, the obvious spelling is “Analyze-it”. If English isn’t your first language, “Analise-it” or “Analize-it” seems to be the natural spelling. Then there’s the hyphen – with or without? Combined there are 16 variations.
The variations have sometimes caused customers to send e-mail to the wrong company, or made finding our web-site difficult. We’ve also had the occasional e-mail suggesting we can’t spell! One of the not-so-polite e-mails went like this:
Most of you know where to find the help and examples provided with Analyse-it, but if not, today we’d like to explain what’s available. If you're stuck we're always happy to help, and usually respond within a few hours, but it's always faster for you to check if the help answers your question first.
If you’re new to Analyse-it, or want a quick refresher, the best place to start is the Getting Started tutorial. It’s completely automated, no typing is required, so all you have to do is sit back and watch. In just 10 minutes it will demonstrate how to setup a dataset, how to filter the dataset, how to run a statistical test, and how to edit, refresh, and print the reports.
To watch the tutorial:
The application help provided with Analyse-it is a complete reference covering all aspects of Analyse-it: how to install it, start it, lay out datasets, manage reports, and use the statistical tests. You can either browse the contents to learn about Analyse-it, or use the index or search to quickly find the right topic. Index and search were only recently added to the help, in Analyse-it 2.10, so if you’re not using the latest version .
One of the primary reasons we launched the blog is to let you know what we’re currently working on, and to give you the opportunity to give feedback and influence the development.
At the moment we're spending most of our time developing version 3.0 of the Analyse-it Standard edition. The improvements are based on what you've asked us to include, and on insights gleaned from recent customer surveys. Improvements and new features will include:
Version 3.0 will be a major upgrade and will be released later this year. As per our upgrade policy, you'll get the upgrade free if you purchased your licence within 2 years of the release date – so if you're thinking of buying a licence now, you will get the upgrade free. If you don't qualify for the free upgrade, you'll be able to buy it for a small fee to cover development costs. We'll announce final pricing and upgrade pricing nearer release.
In the next few weeks we’ll post more about the new features, including screenshots, so you get a chance to comment on individual features. We'll then incorporate any requests and suggestions into the final version before release.
Last Friday we released the latest update to Analyse-it, version 2.11. It’s a minor update providing fixes for issues recently reported by customers.
The update is available free to everyone, including users still using Analyse-it version 1.
In fact, if you’re still using Analyse-it version 1 you should get the update right away. You’ll be impressed with the improvements to both the application and the help, and if you’re using Excel 2007 you’ll love the slick Analyse-it RibbonBar.
Analyse-it automatically checks for updates every 15 days on start-up, and will tell you if an update is available to download. Firewalls can get in the way though, so if you haven’t got a notification yet, or want to download 2.11 right away, you can download at:
When you’ve downloaded the update simply install it over your existing version of Analyse-it. There’s no need to uninstall the old version and you won’t need your product key to reactivate Analyse-it -- unless you’re upgrading from Analyse-it version 1.xx, in which case you’ll need to to activate Analyse-it 2.11.
Thanks for stopping-by to read the inaugural post to the Analyse-it blog.
We’ve launched the blog to keep you up-to-date on what’s new at Analyse-it, what we’re working on, and to notify you when updates and new products are released. We’ll also blog about using Analyse-it, statistical analysis, and interpretation.
The best part about the blog is it’s a conversation -- a conversation between you, our visitors and customers, and us. After reading a post you can add your thoughts, suggestions, and comments. We’ll reply when necessary or other visitors can post their opinions to ignite the debate. Click the links at the end of each post to read comments or add your own.
To stay up-to-date we recommend you subscribe so you get the latest posts as they’re published, without having to check the web-site. You can subscribe in two ways:
That’s about it for the first blog post. Please remember to subscribe, and please post a comment now to introduce yourself or let us know how you want the blog and Analyse-it to develop in future.