
MEASUREMENT SYSTEMS ANALYSIS FOR THE FUTURE

Updated: Jun 13, 2023




by John Knotts



How important is having valid data to make decisions when it comes to running your business or making improvements to your processes? I bet you are saying, “Pretty important!”


The question is, “Is your data valid enough to allow you to make good decisions regarding your business?” The real question is, “Is your measurement collection system good enough to provide valid data to make decisions with?”


The answer to this question today is the Measurement Systems Analysis (MSA), an approach many use in process improvement. The MSA analyzes the validity of the tools used to collect measurements of the process in question. It also validates the steps operators follow to collect the measurement data.


If something is wrong with the tool you are using or the process being followed to collect the measurement data, you need to fix that first, before fixing the process.


The two typical approaches to validating your measurement systems are the Gage R&R Study and the Attribute Agreement Analysis. Both are what I refer to as "Quantitative MSAs." They use math-based logic to verify that the measurements collected repeatedly by individuals and machines are consistent with each other or match a known truth.


We use powerful statistical software to run these analyses, but in reality they need little more than a good eyeball check of the data. Is one person getting the same measurements every time they measure the same thing? Is everyone getting the same measurements as everyone else on the same thing? That is all these two approaches are really testing.


However, the world of data collection is changing in business today. Data is being entered into management information systems (MIS), sometimes by human data entry and other times by systems and machines themselves.


The challenge with making decisions from data pulled from an MIS is that you cannot run a traditional MSA on it. So, how do you validate that the data you are about to analyze and make decisions from is actually good?


This is why we need a new MSA approach for the current business world. Enter the Qualitative Measurement Systems Analysis, or Qualitative MSA.




You will not find this approach in any Lean Six Sigma certification course or process improvement textbook. Whereas a quantitative approach works with data that can be counted and measured, a qualitative approach looks at the quality of the data.


This is particularly helpful in transactional processes, where you are downloading, reporting on, and analyzing large amounts of data pulled from systems. However, many manufacturing processes today are integrating data entry right at the process with tools like tablets and computers. Thus, more and more of the data being used comes from an MIS.


Conducting a Qualitative MSA is relatively simple. You can follow these five steps.


1. The first step is to pull the data that you plan to review and analyze into a spreadsheet format, using something like Microsoft Excel, Google Sheets, or Apple Numbers.

2. Examine the columns you will use to analyze your data. These are typically numeric columns but could contain text-based data as well.

3. Determine the expected format and range of the data. Do you expect numbers and no text? Text and no numbers? A range of numbers (e.g., between 50 and 200)? A specific format for the field (e.g., first name last name)? Any blank fields?

4. Sort each column using the sort function, first in ascending order and then in descending order. Sorting typically surfaces unexpected values quickly, because they rise to the top or bottom of the column.

5. If you have a text field that you expect in a certain format, do a quick scan of the column for text in the wrong format. For instance, if you expect the name field to be first name then last name without any punctuation, you can search for commas in the data; when people enter the last name first, it is usually entered as "last name, first name."
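The five steps above can be sketched in a short script. This is a minimal illustration, not part of the original method: the column names ("cycle_time," "operator"), the 50-to-200 range, and the sample rows are all made-up assumptions.

```python
import csv
import io

# Hypothetical MIS export: a numeric "cycle_time" column (expected 50-200)
# and an "operator" column (expected "First Last", no punctuation).
raw = io.StringIO(
    "cycle_time,operator\n"
    "75,Jane Smith\n"
    '9999,"Smith, John"\n'   # fat-fingered value AND last-name-first entry
    ",Bob Lee\n"             # blank field
    "120,Ann Park\n"
)
rows = list(csv.DictReader(raw))

issues = []
for i, row in enumerate(rows, start=1):
    value = row["cycle_time"]
    # Step 3: expected format -- numeric, no blanks
    if not value.strip():
        issues.append((i, "blank cycle_time"))
    elif not value.replace(".", "", 1).isdigit():
        issues.append((i, "non-numeric cycle_time"))
    # Step 3: expected range -- between 50 and 200
    elif not 50 <= float(value) <= 200:
        issues.append((i, "cycle_time out of range"))
    # Step 5: a comma usually means the name was entered "Last, First"
    if "," in row["operator"]:
        issues.append((i, "operator name contains a comma"))

# Step 4: sorting pushes extreme values to the ends, where they stand out
numeric = sorted(float(r["cycle_time"]) for r in rows
                 if r["cycle_time"].replace(".", "", 1).isdigit())
print(issues)
print(numeric[:1], numeric[-1:])   # smallest and largest values
```

In a spreadsheet you would do the same checks by eye with the sort function; the script simply makes the expectations from step 3 explicit.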


When you have irregularities with your data – the data does not meet expectations – this tells you that your data entry process is flawed. Thus, if you used this data as is, you could make significant errors in your decisions. You would want to figure out why this is happening and fix it first.


Here are some common data entry issues that can cause bad data:

• The data field does not require an entry to move forward

• The data field does not have any validation to prevent erroneous entries

• The default option is often selected because it is the easy choice

• Entering a simple number, like a "0" or a "1," allows an operator to quickly pass by a data field

• Data is not updated between shift changeovers

• Users were not properly trained on data entry requirements before performing the task
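Several of these causes can be headed off with simple validation at the point of entry. The sketch below is a hypothetical illustration; the field names, allowed reason codes, and the two-minute threshold are my own assumptions, not from any specific system.

```python
# Hypothetical entry-form check for a downtime record with a reason code
# and a duration in minutes; names and thresholds are illustrative only.
ALLOWED_CODES = {"JAM", "SETUP", "MAINT", "OTHER"}

def validate_entry(reason_code: str, minutes: str) -> list:
    """Return a list of problems; an empty list means the entry passes."""
    problems = []
    if not reason_code.strip():
        problems.append("reason code is required")        # no blank entries
    elif reason_code not in ALLOWED_CODES:
        problems.append("unknown reason code")            # reject bad inputs
    if not minutes.strip().isdigit():
        problems.append("minutes must be a whole number")
    elif int(minutes) < 2:
        problems.append("suspiciously short duration")    # catches "0"/"1" pass-bys
    return problems

print(validate_entry("OTHER", "1"))
```

Requiring an entry, restricting it to known values, and flagging throwaway durations address the first, second, and fourth bullets above directly.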


These data errors in large runs of data can go unnoticed for weeks, months, or even years. If a dashboard has been built using the data, then the errors might remain invisible forever. This makes it important to periodically validate with a qualitative approach that your data is still good.


Below are some of the common issues that a qualitative look will catch that might otherwise go unnoticed.


1. Fat-fingered data entry with accidental amounts that are simply too large or too small for the typical data range of the process. When data entry is done manually, it is very easy for someone to accidentally drop a digit or enter an extra one.

2. More and more, tablets and computers are used to record machine downtime and changeover time, and this is a place where several data entry errors can occur. Workers choose the quickest entry methods to speed through the data. I have seen the default reason code of "Other" used just because the workers did not want to search for an appropriate reason code. I have witnessed workers enter a quick time amount, like "1," just to get past the screen. I have also seen workers start the downtime or changeover clock but forget to stop it, or a shift change occurs and the new shift forgets to stop the clock.

3. Sometimes the format of collected data changes, especially if it is stored in a cloud-based SaaS system. If you are making decisions off data whose format has changed, you might not be getting all the data. Also, if you are pulling data through a text-based filter and that filter changes, it can affect the data you are receiving.

4. The last popular aspect I see a lot is data tampering. This is where people are, unfortunately, purposely fudging the numbers to meet specifications or just to make their operation look good. Typically, this creates a pattern of behavior in an MIS that looks different from everything else. A quantitative approach will not find this, whereas a qualitative approach can.
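Patterns like an overused default code, throwaway "1" entries, and a clock left running can also be screened for programmatically. This is a rough sketch with made-up records and illustrative thresholds, not a prescribed method.

```python
from collections import Counter

# Hypothetical downtime records pulled from an MIS; the 50% share, the
# three-entry count, and the 240-minute cap are illustrative thresholds.
records = [
    {"code": "OTHER", "minutes": 1}, {"code": "OTHER", "minutes": 1},
    {"code": "OTHER", "minutes": 1}, {"code": "JAM", "minutes": 14},
    {"code": "SETUP", "minutes": 480},   # a clock left running overnight?
]

codes = Counter(r["code"] for r in records)
flags = []
# Default reason code crowding out real reasons
if codes["OTHER"] / len(records) > 0.5:
    flags.append("default reason code overused")
# Many one-minute entries suggest workers speeding past the screen
if sum(r["minutes"] == 1 for r in records) >= 3:
    flags.append("many 1-minute entries (likely screen pass-bys)")
# Durations far beyond a plausible changeover suggest a forgotten clock
if any(r["minutes"] > 240 for r in records):
    flags.append("implausibly long duration")

print(flags)
```

Data tampering is harder to script; there, the qualitative eyeball check for a pattern that "looks different from everything else" remains the practical test.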


We spend a lot of time and effort teaching and using analysis tools like the Gage R&R and Attribute Agreement Analysis to ensure the validity of collected data. These quantitative approaches to data validation are still effective, but more and more data today is pulled from systems where they do not work. Now is the time to start considering a qualitative approach to the information we pull from our data and information systems.


 

John Knotts is a COO with over 30 years of experience in military, non-profit, and commercial leadership and operations. He has an extensive background in strategy, change, process, leadership, management, human capital, training and education, innovation, design, and communication. John is a 21-year Air Force veteran, a former consultant with Booz Allen Hamilton, and a former strategic business advisor with USAA. He has owned a consulting business, Crosscutter Enterprises, since retiring from the Air Force in 2008. He holds a Lean Six Sigma Master Black Belt from Smarter Solutions, a Master's-level certification as a Change Management Advanced Practitioner from Georgetown University, and a Change Management certification from Prosci.
