
Quality Control: It’s All about the Data

Quality data is the backbone of today’s clinical laboratories—make sure you get it right

Matt Casper

Matt Casper is a biotech consultant for clients such as Labcorp, Codex DNA, Sequenom, and others.

Published: Apr 01, 2021 | 4 min read

We’re all obsessed with quality. We demand it in our food, our clothing, automotive supplies, hotel service, even in our conversations. And there’s no organization out there that hasn’t, at one time or another, had quality baked into its brand values or mission statement: “Quality is Job 1” was Ford’s slogan for more than 17 years.

But for clinical laboratory managers, quality isn’t simply job #1. Quality is often the whole job.

A decrease in the quality of a lab’s data, or in its processes, raw materials, or air cleanliness, can lead to a patient receiving an inaccurate result, the delay of a product launch, or worse. Millions of dollars and decades of work can disappear overnight.

So, where for many businesses “quality” is typically just a benefit or a buzzword, for today’s laboratory, it’s the backbone.

It’s also legally required. The Code of Federal Regulations (42 CFR 493) states laboratories “must establish and follow written policies and procedures for a comprehensive quality assurance program that is designed to monitor and evaluate the ongoing and overall quality of the total testing process.” More than a few product launches have been delayed or derailed by a failure to meet these rigorous standards.

Where to begin?

Determining where your lab may be in its quality assurance (QA) and control processes can be overwhelming.

But in almost every case, it boils down to the data. The data that comes from each facility, from every lab environment and piece of lab equipment, from every system used to track the progress of every product test or research step.

Here’s where looking outside the lab to processes and technologies employed by modern commercial enterprises can help. Because while a managed, optimized, and—ideally—automated QA methodology is new to some labs, most of today’s Fortune 100 companies have had quality control processes around their business data for decades. 

Their secret? Observation and automation—systems that are always “on,” capturing, analyzing, and distributing data to decision-makers, and centralizing both the storage and administration of that data. And while the processes and products within a lab are far different than those of a financial institution, data management practices (like acquisition and security) are similar across vertical markets.

Data logging vs data acquisition

While some lab managers use the terms data logging and data acquisition interchangeably, they are not the same thing: data logging is a subset of data acquisition and should be treated as such.

Many labs have some data logging processes and tools in place, e.g., sensors on equipment that capture and store information. The information is typically extracted from these sensors, analyzed, reported, and then acted on. It can be a cumbersome process as data logging devices are not always connected to each other or to a larger system.

A data acquisition system, however, offers a far more proactive solution as it captures, codifies, and delivers data from data loggers, information systems, and more. In short, data acquisition takes what data loggers do and “de-silos” their work. Perhaps more importantly, data acquisition systems pull data from sources proactively, rather than relying on, for example, technicians manually collecting that information.
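To make the distinction concrete, here is a minimal Python sketch, not any particular vendor’s system: the device names, classes, and polling logic are illustrative assumptions only.

```python
import time
from datetime import datetime, timezone

# --- Data logging: each device captures and stores its own readings ---
class DataLogger:
    """A standalone logger: it records readings but shares nothing."""
    def __init__(self, device_id):
        self.device_id = device_id
        self.records = []  # stays siloed on the device until extracted

    def record(self, value):
        self.records.append((datetime.now(timezone.utc), value))

# --- Data acquisition: a central system proactively pulls from loggers ---
class AcquisitionSystem:
    """Polls every registered logger and centralizes its data."""
    def __init__(self):
        self.loggers = []
        self.central_store = []  # one consolidated, queryable record

    def register(self, logger):
        self.loggers.append(logger)

    def poll(self):
        # Pull from each source, rather than waiting for a technician
        # to extract each device's records manually.
        for logger in self.loggers:
            for timestamp, value in logger.records:
                self.central_store.append((logger.device_id, timestamp, value))
            logger.records.clear()

# Usage: two siloed loggers, one unified view after a poll
freezer = DataLogger("freezer-01")
incubator = DataLogger("incubator-03")
freezer.record(-80.2)
incubator.record(37.1)

system = AcquisitionSystem()
system.register(freezer)
system.register(incubator)
system.poll()
print(system.central_store)  # all devices, one place
```

The point is structural: each logger keeps working exactly as before, but the acquisition layer turns many isolated records into one consolidated, queryable dataset.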

The end result of acquisition versus logging is a 360-degree view of the data from your entire laboratory, rather than piecemeal data from each department or device. And because IT platforms, data logging devices and sensors, and other IoT (Internet of Things) hardware vary from room to room, from station to station, and from one piece of equipment to another, a key driver for a successful data acquisition program is often a unified monitoring system.

Loss prevention is a data issue too

When most labs think of loss prevention, they tend to focus on samples and raw materials. But data loss can be an even more serious issue: it can 1) lead to losses of samples and raw materials, and 2) leave you unprepared for audits by regulatory agencies.

The causes of data loss in the lab are endless: human error, system communication errors or failures, power outages, acts of nature. In most cases, though, it’s human error: a technician misses something, hits the wrong key, records the wrong temperature, etc.

Regardless of what causes data loss, in most cases it’s easily preventable through tried-and-true automation processes and platforms.
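As one illustration, here is a hedged Python sketch of such an automation: validating readings at capture time so a mistyped or out-of-range value is flagged immediately rather than discovered during an audit. The device IDs and acceptable ranges are hypothetical.

```python
# Hypothetical per-device acceptable ranges, in °C
ACCEPTABLE_RANGES = {
    "freezer-01": (-86.0, -70.0),
    "incubator-03": (36.5, 37.5),
}

def capture_reading(device_id: str, value: float) -> dict:
    """Record a reading and flag it the moment it falls out of range."""
    low, high = ACCEPTABLE_RANGES[device_id]
    entry = {"device": device_id, "value": value, "ok": low <= value <= high}
    if not entry["ok"]:
        # Automated capture means the anomaly is recorded and escalated,
        # not silently transcribed wrong or skipped by a busy technician.
        print(f"ALERT: {device_id} reading {value} outside [{low}, {high}]")
    return entry

capture_reading("freezer-01", -80.2)   # normal reading
capture_reading("freezer-01", -8.02)   # the classic missed keystroke
```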

Quality data means quality controls

Fortunately, the same kinds of enterprise software and QA methodologies used by Fortune 100 companies to improve data quality and management may be emulated in the modern lab. An “always on” lab monitoring solution can be engineered to monitor dataflows as well, and to provide multiple layers of redundancy to ensure data is readily available.

When your processes, equipment, materials, and facilities—and all the sensors attached to them, collecting data in real-time—are housed in the same place, data integrity becomes easier to achieve and maintain.

The key consideration in lab monitoring for data quality control is that the system you’re using does more than “watch.” It must also be capable of acting, from simple steps, such as automatically generating and disseminating reports, to more complex functionalities, such as proactively alerting admins in the event of anomalies or breaches of pre-determined thresholds (e.g., air quality).
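A rough Python sketch of that “watch plus act” pattern follows, with hypothetical thresholds and a placeholder notify() hook standing in for whatever email, SMS, or paging channel a real system would use.

```python
import statistics

# Hypothetical limit, e.g., a particle-count ceiling for air quality
THRESHOLDS = {"particle_count": 10_000}

def notify(recipient: str, message: str) -> None:
    print(f"[to {recipient}] {message}")  # stand-in for email/SMS/pager

def monitoring_pass(readings: dict) -> None:
    # Simple step: automatically generate and disseminate a summary report
    for metric, values in readings.items():
        mean = statistics.mean(values)
        notify("qa-team", f"{metric}: mean={mean:.1f} over {len(values)} samples")
    # Complex step: proactively alert admins on any threshold breach
    for metric, limit in THRESHOLDS.items():
        worst = max(readings.get(metric, []), default=0)
        if worst > limit:
            notify("lab-admin", f"ANOMALY: {metric} peaked at {worst}, limit {limit}")

monitoring_pass({"particle_count": [8_200, 9_100, 12_400]})
```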

A second set of eyes

Quality control of data is everybody’s job, but in the end, it’s the lab manager who is held accountable.

While you can offer training sessions on the importance of data quality for new hires, update workflows and security software on servers and systems, and talk until you’re blue in the face about best practices for avoiding errors and keeping data clean, there will almost always be blind spots.

With an automated laboratory monitoring service, you gain a second set of eyes, open 24/7.

Written in consultation with XiltriX North America thought leaders.