How Middleware Is Helping The Medical Sector To Maximize Investment and Increase Quality

The application of middleware can sometimes seem difficult to grasp. Read about how it has become a necessity in the medical sector for crafting connectivity between disparate systems within your IT framework, ultimately leading to increased return on investment and improved quality control.

Over the past few years, middleware has come into its own as a mainstream solution for helping organisations across a wide spectrum of business sectors streamline work processes and improve data sharing. As medical organisations such as hospitals and laboratories are continually challenged by new technologies and regulations, administrators are looking for new ways to increase productivity through applications such as middleware. As a result, laboratories are exploiting middleware in many other key areas of their business and realizing its true value, leveraging their investment with new and innovative applications.


Autoverification represents a solution to a wide range of issues: increased financial constraints, decreased reimbursements, increasing volumes of testing, increased regulatory constraints, demands for decreased response time, and higher expectations of quality. To add to the problem, there is a constantly decreasing pool of qualified personnel to do the work and growing fatigue among this group. With middleware, it is common for autoverification in general/community/university hospital environments to reach levels as high as 90 percent. In reference laboratories it reaches 95 percent or higher. Some of the benefits are:

* Decreased turnaround times for routine and STAT testing, which is typically followed by a significant decrease in the number of STAT orders;

* Higher consistency in the quality of results, which are evaluated uniformly across all shifts and all laboratories in a hospital organization, every day and in the same way;

* Reduced staff fatigue, which improves the work environment and allows medical technologists to focus on the true “problem” samples;

* Easy automation and accommodation of special processing requests from physicians, further differentiating laboratory services; and

* In a growing number of sites, savings in FTEs that allow the laboratory to bring in-house additional test methods that were traditionally sent out, or to expand into the emerging area of molecular testing.

Many labs’ first attempt to build autoverification systems was with their Laboratory Information System (LIS), usually starting in one work area such as Coagulation or Hematology. The first people to seek out middleware were typically trying to fill “gaps” — the absence of functionality in their LIS, instrumentation and/or automation systems — or to improve systems that were not reaching desired autoverification levels. Later adopters came to realize that middleware could be configured more easily or function better than what they already had in place. They were not replacing the LIS, but supplementing its functionality to reach more work areas and achieve higher levels of autoverification.

The rules-based decision processing in middleware allows for an “intelligent” verification process and the ability to go beyond reference ranges. Now laboratorians write physician-specific, ward-specific, practice-specific rules that interact in real time with instrumentation and results production while incorporating quality control functions that far exceed the capabilities of most LIS systems.
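The kind of rules-based logic described above can be illustrated with a minimal sketch. The function, rule names, and thresholds below are hypothetical, not taken from any specific middleware product — real systems layer dozens of physician-, ward-, and practice-specific rules on top of checks like these:

```python
# Illustrative autoverification rules: a result is released only if it
# passes a reference-range check and a delta check against the patient's
# previous value. Thresholds here are invented for the example.

def autoverify(result, reference_range, delta_limit, previous=None):
    """Return True if the result can be released without manual review."""
    low, high = reference_range
    # Rule 1: the result must fall inside the reference range.
    if not (low <= result <= high):
        return False
    # Rule 2: the change from the prior result must not exceed the delta limit.
    if previous is not None and abs(result - previous) > delta_limit:
        return False
    return True

# A potassium of 4.1 mmol/L (range 3.5-5.1, prior result 4.3) autoverifies;
# a result of 6.2 mmol/L is held for a technologist.
print(autoverify(4.1, (3.5, 5.1), delta_limit=1.0, previous=4.3))  # True
print(autoverify(6.2, (3.5, 5.1), delta_limit=1.0))                # False
```

In production, results failing such rules are routed to a review queue rather than simply rejected, and the rule set interacts in real time with instrument flags and QC status.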

Quality control and quality assurance

Quality is the cornerstone of any business or organisation. Quality control is one of the most important and probably the least appreciated activities in the laboratory. Few have quantified in time, effort, and expense what their existing quality systems cost the laboratory. Integrated with autoverification processes, middleware can improve quality and provide consistency across multiple work areas while reducing time and oversight of your quality processes.

Real-time, Interactive Quality Control: It is commonplace for laboratories to waste many hours each week transcribing quality control information into third-party software only to discover days or weeks later that they have a quality issue. Then they spend countless hours trying to determine, “Is it me, or is everyone having this problem?” What these current processes lack is real-time monitoring and direct interaction with results production, expert analysis tools, and information.

This is where middleware has changed the dynamics of current processes and is providing consistent compliance and improving quality without undue oversight or financial burden. Not only does middleware act as a real-time conduit of information between your LIS and instruments, but it also provides information to third-party Quality Control (QC) software, such as Bio-Rad’s Unity Real Time.

Middleware eliminates the manual transcribing or importing of information, allowing the laboratory to participate in web-based, inter-laboratory peer grouping that provides “sanity checking” and troubleshooting resources.

But reducing the data errors and providing QC in real time is only part of the benefit. The other part of the equation is interacting with your result production in real-time.

The middleware’s internal QC package, or conduits to industry-leading third-party packages, allows QC to be integrated with your results production. When a QC violation occurs, rules-based decision logic in middleware acts immediately, flagging and holding affected results, and promptly alerts the appropriate laboratory staff by email, pagers and even light poles and network notifications.
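The hold-and-alert behavior can be sketched with one common QC rule. The example below uses the Westgard 1-3s rule (a single control value beyond ±3 SD fails the run); the function names, accession format, and alert text are invented for illustration:

```python
# Sketch of middleware QC logic: if a control result violates the 1-3s
# rule, every patient result in the run is held and staff are alerted.

def qc_violation_13s(control_value, mean, sd):
    """Westgard 1-3s: a single control beyond +/-3 SD fails the run."""
    return abs(control_value - mean) > 3 * sd

def process_run(control_value, mean, sd, patient_results):
    """patient_results is a list of (accession, value) pairs."""
    if qc_violation_13s(control_value, mean, sd):
        held = [(acc, "HELD: QC 1-3s violation") for acc, _ in patient_results]
        alert = "QC violation on control material; run held for review"
        return held, alert
    return [(acc, "released") for acc, _ in patient_results], None

# Control at 112.0 against mean 100.0 / SD 3.0 exceeds 3 SD: run is held.
statuses, alert = process_run(112.0, 100.0, 3.0, [("A1", 5.2), ("A2", 4.8)])
print(statuses, alert)
```

Real middleware would apply the full multi-rule Westgard scheme and route the alert through the site’s configured notification channels rather than a return value.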

Monitor Moving Averages in Real Time: Middleware has brought moving averages to the forefront as a key component of a comprehensive quality assurance system. The concept of average of normals (AON), or Moving Averages, is not new. It was first proposed for the clinical laboratory by Hoffmann and Waid in 1965 (1), but its adoption has been hindered by the lack of programs capable of implementing Moving Averages, with the exception of Bull’s algorithm in most Hematology instruments.

Moving Averages is a tool to continuously monitor stability by using patient samples to measure instrument and assay performance. It provides early detection and notification of subtle analytical shifts not always detected by event-driven QC methods.

Many middleware systems now employ Moving Averages, allowing this functionality to be applied in a broad spectrum of disciplines, most notably automated chemistry, and doing so in real time. Users can define which assays to monitor, the number of results needed to generate a point, and which of multiple algorithms to use.
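A simple average-of-normals monitor can be sketched as follows. The batch size, target, and alert limit are illustrative values, not vendor defaults:

```python
# Sketch of a real-time moving-average (average-of-normals) monitor:
# patient results accumulate in batches, and each completed batch yields
# a plotted mean plus a flag when the mean drifts beyond the limit.

class MovingAverageMonitor:
    def __init__(self, batch_size, target, limit):
        self.batch_size = batch_size   # patient results per plotted point
        self.target = target           # expected population mean
        self.limit = limit             # allowed deviation before alerting
        self.buffer = []

    def add_result(self, value):
        """Add one patient result; return (mean, shifted) when a batch fills."""
        self.buffer.append(value)
        if len(self.buffer) < self.batch_size:
            return None
        mean = sum(self.buffer) / len(self.buffer)
        self.buffer.clear()
        return mean, abs(mean - self.target) > self.limit

# Five sodium results averaging 140 mmol/L against a 140 target: no shift.
monitor = MovingAverageMonitor(batch_size=5, target=140.0, limit=2.0)
for sodium in [139, 141, 140, 138, 142]:
    point = monitor.add_result(sodium)
print(point)  # (140.0, False)
```

Production implementations typically add exclusion rules (e.g., filtering dialysis patients out of the population) and feed each point to the charting and alerting layers described below.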

Levey-Jennings charts allow laboratorians to monitor the performance of multiple instruments across multiple work areas in multiple facilities. Users can define actions to take when user-defined thresholds are crossed, and more sophisticated implementations allow laboratorians to segment data — for example, monitoring renal dialysis patients differently than outreach patients.

Integration with Quality Assurance Systems: Middleware has changed the method and the frequency with which laboratorians perform correlation testing, verify reference ranges, and establish normal values. Traditionally, labs have performed these checks every six months due to the tedious nature of gathering data. Many Quality Assurance systems, such as EP Evaluator, are able to acquire data directly from information-rich middleware, eliminating hours, if not days, of gathering and retyping data to comply with CAP requirements. Not only does this save effort and avoid mistyping of data, but it encourages the laboratory to shift from the minimum required frequency to a continuous, ongoing basis.

With many middleware systems being ODBC-compliant, laboratories using their own ‘homegrown’ spreadsheets can now acquire data directly from their middleware systems.
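Pulling data for such a homegrown spreadsheet amounts to a plain SQL query against the middleware database. The sketch below uses Python’s standard sqlite3 module as a stand-in for an ODBC connection (with pyodbc and the middleware’s actual connection string, the query pattern is identical); the table and column names are hypothetical:

```python
import sqlite3

# Stand-in for an ODBC connection to an ODBC-compliant middleware
# database; the "results" schema here is invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE results (test_code TEXT, value REAL, resulted_at TEXT)")
conn.executemany(
    "INSERT INTO results VALUES (?, ?, ?)",
    [("NA", 140.0, "2007-01-02"), ("NA", 138.5, "2007-01-03")],
)

# Pull sodium results in date order, e.g. for reference-range verification.
rows = conn.execute(
    "SELECT value FROM results WHERE test_code = ? ORDER BY resulted_at",
    ("NA",),
).fetchall()
values = [v for (v,) in rows]
print(values)  # [140.0, 138.5]
```

The same result set can be written to CSV for the laboratory’s existing spreadsheets, removing the retyping step entirely.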

Redefining instrument interfacing

The demand for highly integrated work areas is growing. Interfacing instruments has typically meant connecting the instrument to the LIS and exchanging test instructions and test results. That demand is being replaced by integration: the interoperability of two or more systems to automate diagnostic testing. The Hematology workcell is a prime example of this growing need, driven by automated digital cell morphology. Most LIS systems are not able to reflex one instrument’s results, comments and/or flags to other instruments. But with the growing demand for automated digital cell morphology instruments, such as the CellaVision DM96, information from automated cell counters must be combined with rules-based decision processing to automate areas of Hematology that have been manually intensive. Middleware has not only facilitated the interfacing but is leading the integration and automation of these workcells.

Hematology is not the only example. Middleware is also leading the way to integrate Molecular workcells by coordinating and managing the preparation, amplification and specimen handling that these emerging work areas require.

Sample Storage and Retrieval

Most labs incorporate a variety of manual methods. Some are store-intensive, meaning they require a significant amount of time to place samples in a particular order to make retrieval faster. Some are the inverse, where storing is fast but retrieval is tedious, involving sorting through hundreds of samples. It is so painful that I have witnessed med techs playing ‘rock-paper-scissors’ to decide who gets to find samples.

Previous market research has shown the average time to retrieve a sample is 12 to 15 minutes and involves at least one FTE. The cost for this non-revenue generating task is enormous. It is one of the easiest problems to solve, but the most overlooked of them all. To help quantify the cost of this process, I will use an “average” laboratory with 1500 samples/slides/cups to store every day; its current method takes two seconds per specimen to store. It retrieves 2 percent of specimens per day for retesting at an average of 15 minutes per sample to retrieve. Using the calculations below, it is spending 8.33 hours per day storing and retrieving specimens.
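The arithmetic behind that 8.33-hour figure can be reproduced directly from the stated assumptions (1,500 specimens stored per day at two seconds each; 2 percent retrieved at 15 minutes each):

```python
# Reproducing the article's storage/retrieval time arithmetic for the
# "average" laboratory described in the text.

specimens_per_day = 1500
store_seconds_each = 2        # time to store one specimen
retrieval_rate = 0.02         # 2 percent of specimens retrieved for retest
retrieve_minutes_each = 15    # time to find one specimen

store_hours = specimens_per_day * store_seconds_each / 3600       # ~0.83 h
retrieve_hours = (specimens_per_day * retrieval_rate
                  * retrieve_minutes_each / 60)                   # 7.5 h
total_hours = store_hours + retrieve_hours
print(round(total_hours, 2))  # 8.33 hours per day
```

Notably, retrieval dominates the total: 30 daily retrievals cost 7.5 hours against roughly 50 minutes of storing, which is why middleware-driven retrieval is where the savings concentrate.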

That equates to a full FTE dedicated to this procedure. Using an average salary of $21.52/hour, that is $65,457 annualized. Using the findings of an AACC middleware presentation in 2001 (2) and applying them to our example laboratory, that same process would cost only $10,473 annually — a savings of $54,984.

Some middleware solutions that address sample storage and retrieval issues can be implemented in less than a day, require no LIS interfacing, and pay for themselves in weeks or months. Also, automated archiving systems can be integrated for high-volume areas and still provide full-sample tracking for all disciplines.

Beyond Traditional Boundaries

Middleware is not just limited to clinical pathology. Other diagnostic areas such as pulmonary diagnostics are leveraging middleware to improve diagnostic care and enable integration with multiple information systems, including EMR/EHR systems, to automate the delivery of timely, quality, actionable healthcare information.

Even traditionally separate areas of the laboratory are being bridged, as middleware is now providing clinical pathology results to anatomical pathology systems, allowing the exchange of critical patient results.

Middleware has become an integral part of helping laboratories decrease costs and increase quality. With the day-to-day challenges these organizations face, it is more important now than ever to evaluate what middleware can do for you.