INTRODUCTION
A significant proportion of errors in the total testing process occurs in the preanalytical phase, accounting for 40%–70% of total errors in clinical laboratories [1–4]. Hemolysis is the most common type of preanalytical interference; it occurs more frequently than icterus (high bilirubin) and lipemia and leads to errors in result interpretation, with potentially damaging consequences for the patient. Adequate hemolysis management therefore has a great impact on patient health [5–7]. Hemolysis is defined as the rupture of red blood cells (RBC) and the release of their contents (as well as white blood cell and platelet contents). It can occur in vivo, e.g., in hemolytic anemias of acquired and/or genetic origin, or in vitro during sample collection, transport, and storage [6, 8]. Cell rupture and the release of cell contents into the plasma interfere with the determination of blood analytes through various mechanisms: an increase in the concentration of the released constituent, chemical interference, spectrophotometric interference, or a dilutional effect [6].
Visual assessment of the degree of hemolysis after sample centrifugation has historically been used to evaluate interference; however, this method is no longer recommended. Instead, the use of automated analyzers to determine the hemolysis index (HI) is now recommended, as it is a standardized and more accurate method than visual assessment [5, 9]. Depending on the manufacturer, the HI may be presented as a unitless scale or in concentration units. Currently, only two manufacturers provide biochemical tests (Roche Diagnostics, Basel, Switzerland and Abbott, Abbott Park, IL, USA), and one manufacturer provides a coagulation test (ACLTop-Werfen, Barcelona, Spain), in which the hemolysis level may be converted to a free Hb concentration (in g/L) [10, 11]. However, results are discordant across laboratories, and unified criteria for hemolysis management are lacking. A questionnaire survey among 846 laboratory professionals in the United States revealed that only 40%–46% had standardized hemolysis reports for lactate dehydrogenase (LDH), potassium, and glucose [8]. To determine the degree of hemolysis, an interference cut-off has been established as a maximum bias of ±10% from a non-hemolyzed baseline pool [10]. In contrast, the CLSI C56-A guidelines report acceptability criteria based on the biological variation (BV) and reference change value (RCV) for each analyte [12]. To standardize this procedure, the Working Group for the Preanalytical Phase (WG-PRE) of the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) has developed a protocol for the determination of cut-offs and their inclusion in laboratory report comments [13].
We report the implementation and analyte-by-analyte evaluation of a protocol for the management of hemolyzed samples based on the protocol presented by the EFLM WG-PRE. Additionally, we describe our experience in assessing the impact of its implementation on interference management.
MATERIALS AND METHODS
Samples
Residual serum samples from routine analyses were collected in tubes without anticoagulant and with separating gel (Vacuette; Greiner Bio-One, Madrid, Spain). The samples were stored at 4°C and analyzed within 24 hours. Forty-two biochemical analytes were measured: 31 in the Cobas c701 module and 11 in the Cobas e602 module (Roche Diagnostics). The study was conducted using equipment under stable conditions (stable internal quality control over six months), and equipment performance was assessed via participation in an external quality assurance program of the Spanish Society of Laboratory Medicine (SEQCML). The internal control materials used were Liquid Assayed Multiqual Levels 1 and 2, and Liquicheck Immunoassay Plus Levels 1 and 2 (Bio-Rad Laboratories, Hercules, CA, USA). Control data were obtained from the Laboratory Information System (LIS), Cobas Infinity IT Solutions (Roche Diagnostics). The study protocol was approved by the ethical review board of Salamanca University Hospital, Salamanca, Spain, and was in agreement with the World Medical Association Declaration of Helsinki (2017).
Preparation of hemolysates and serum pool
Hemolysates were prepared by RBC lysis through osmotic disruption within 12 hours of sample collection, according to the Meites method [14]. Five milliliters of whole blood in Vacuette lithium heparin tubes was centrifuged at 1,500×g for 5 minutes. The supernatant was discarded, and the RBC were washed three times with 0.9% sodium chloride solution. Next, distilled water was added in a 1:1 ratio with the RBC, and the mixture was stored at –20°C overnight. The next day, the tubes were thawed and centrifuged at 1,500×g for 5 minutes. The Hb concentration in the supernatant was measured using a hematology analyzer, Sysmex XN-2000 (Roche Diagnostics). The hemolysates were stored at –20°C in 500-μL aliquots.
On the day of the interference study, a fresh pool was prepared by mixing six serum samples in which the levels of hemolysis, icterus, lipemia, and the analytes evaluated in this study were within the reference intervals established in the laboratory. Appropriate hemolysate volumes were added to obtain Hb concentrations of 0, 0.25, 0.5, 1, 2, 3, 5, and 10 g/L. Aliquots were prepared in duplicate.
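As an illustration of the spiking step, the hemolysate volume needed to reach each target free-Hb level follows from a simple mass-balance calculation. The sketch below is our own, not part of the published protocol; the stock Hb concentration and aliquot volume shown are hypothetical.

```python
# Minimal sketch (not from the paper): hemolysate volume to add to a serum
# aliquot so that the final free-Hb concentration reaches each target level.
# Mass balance: C_stock * v_add = C_target * (V_serum + v_add)
#   => v_add = C_target * V_serum / (C_stock - C_target)

TARGETS_G_PER_L = [0.25, 0.5, 1, 2, 3, 5, 10]  # spiking levels used in the study

def spike_volume_ul(c_stock: float, c_target: float, v_serum_ul: float) -> float:
    """Hemolysate volume (uL) to add to v_serum_ul of serum pool."""
    if c_target >= c_stock:
        raise ValueError("Target Hb must be below the stock Hb concentration.")
    return c_target * v_serum_ul / (c_stock - c_target)

if __name__ == "__main__":
    c_stock = 150.0   # g/L, hypothetical Hb concentration of the hemolysate stock
    v_serum = 1000.0  # uL of serum pool per aliquot (hypothetical)
    for c in TARGETS_G_PER_L:
        print(f"target {c:5.2f} g/L -> add {spike_volume_ul(c_stock, c, v_serum):7.1f} uL")
```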
The degree of hemolysis is indicated by the HI, which gives a semiquantitative estimation of the free Hb concentration in the sample through several dichromatic absorbance measurements at 600/570 nm or 415 nm. The percentage deviation (bias) for each analyte caused by hemolysis was determined using the following formula: bias (%) = [(T1 − T0)/T0] × 100, where T0 is the mean analyte concentration in samples without hemolysis and T1 is the mean analyte concentration measured in the pool with added hemolysate. The experiment was repeated three times in duplicate.
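For clarity, the bias calculation can be written as a few lines of code. This is a minimal sketch assuming duplicate measurements per hemolysis level, mirroring the formula above; the variable names and example values are ours.

```python
from statistics import mean

def bias_percent(baseline: list[float], hemolyzed: list[float]) -> float:
    """Percentage deviation of an analyte caused by hemolysis:
    bias (%) = (T1 - T0) / T0 * 100, where T0 is the mean concentration in the
    non-hemolyzed pool and T1 the mean in the pool with added hemolysate."""
    t0 = mean(baseline)
    t1 = mean(hemolyzed)
    return (t1 - t0) / t0 * 100

# Example (hypothetical duplicate LDH results, U/L) at one hemolysis level:
print(f"{bias_percent([180, 184], [265, 271]):.1f}%")  # -> 47.3%
```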
Interferograms
Interference was assessed according to the method reported by Glick, et al. [15], presenting the results as the relative percentage deviation of each analyte concentration from the initial concentration. The data were plotted with the HI on the X-axis and the percentage deviation of each analyte concentration on the Y-axis. We used the interferogram data to calculate the straight-line trend and the R² regression coefficient. In all cases, the selected regression model showed R²>0.98 [16]. Interferograms for the five selected analytes were generated using these data (Fig. 1).
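A straight-line trend and its R² can be computed as below. This is a sketch of the kind of calculation performed in the spreadsheets (see Data analysis); the data points are made up for illustration.

```python
import numpy as np

# Hypothetical interferogram data: HI as free Hb (g/L) vs. bias (%) for one analyte.
hb = np.array([0.0, 0.25, 0.5, 1.0, 2.0, 3.0, 5.0, 10.0])
bias = np.array([0.0, 12.0, 26.0, 49.0, 102.0, 148.0, 251.0, 498.0])

# Least-squares straight line through the data (trend line).
slope, intercept = np.polyfit(hb, bias, 1)

# Coefficient of determination R^2 for the fitted line.
pred = slope * hb + intercept
r2 = 1 - np.sum((bias - pred) ** 2) / np.sum((bias - bias.mean()) ** 2)

print(f"bias(%) ~= {slope:.1f} * Hb + {intercept:.1f}, R^2 = {r2:.4f}")
```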
RCV determination
The RCV was calculated according to the following formula [16]: RCV = √2 × Z × √(CVA² + CVI²), where Z is the Z-score for the selected probability (1.96 for 95%), CVI represents the within-subject BV, and CVA represents the analytical variation in a given laboratory for a given method. CVA was obtained from internal quality control data collected in the laboratory over the six months preceding the study, using the following formula: CVA (%) = (SD/mean) × 100.
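As a worked example, the RCV for a bidirectional 95% probability (Z = 1.96) can be computed as below; the CVA and CVI values shown are placeholders, not the study's figures.

```python
import math

def rcv_percent(cv_a: float, cv_i: float, z: float = 1.96) -> float:
    """Reference change value (%): RCV = sqrt(2) * Z * sqrt(CVA^2 + CVI^2)."""
    return math.sqrt(2) * z * math.hypot(cv_a, cv_i)

# Hypothetical example: CVA = 1.5% (from internal QC), CVI = 4.6% (BV database).
print(f"RCV = {rcv_percent(1.5, 4.6):.1f}%")  # -> 13.4%
```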
Cut-off selection
Out of all analytes, five were selected: aspartate aminotransferase (AST), direct bilirubin (DB), potassium, LDH, and folic acid. The selection criterion was a hemolysis interference >50% for the analyte at a Hb concentration of 5 g/L (Fig. 1). For the five selected analytes, two cut-offs were established, based on analytical criteria and on the variation in concentration between two consecutive points (RCV) [13]. As the analytical acceptance limit, based on the BV, we established the desirable quality specification (DQS) for systematic error (desirable systematic error [SE]) as a cut-off. This value was obtained from the SEQCML BV database [18]. For each analyte, the data were compared with those published in the EFLM BV database [19]. The second cut-off was the RCV [13]. As it incorporates both analytical variation and BV, an interference level above this value may have a high impact on therapeutic choices (clinical cut-off). The Hb values corresponding to both cut-offs were derived from the regression lines calculated from the interferograms described above (see the sketch below).
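Given a linear interferogram passing through the origin (bias = slope × Hb, as observed for four of the five analytes), the Hb concentration corresponding to each bias cut-off follows by simple inversion. A minimal sketch with hypothetical numbers:

```python
def hb_at_cutoff(slope: float, bias_cutoff: float) -> float:
    """Free Hb (g/L) at which a linear interferogram (bias = slope * Hb,
    passing through the origin) reaches a given bias cut-off (%)."""
    return bias_cutoff / slope

# Hypothetical analyte: slope = 50 %/(g/L); desirable SE = 4.3%; RCV = 13.4%.
slope = 50.0
print(f"analytical cut-off (SE): {hb_at_cutoff(slope, 4.3):.3f} g/L Hb")
print(f"clinical cut-off (RCV):  {hb_at_cutoff(slope, 13.4):.3f} g/L Hb")
```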
Expert rule development
Starting from the established cut-offs, expert rules were formulated to report the presence of hemolysis in samples. In accordance with the harmonization document of the EFLM WG-PRE [13], ranges were established based on the cut-offs. These ranges were then used to determine whether a result should be reported with or without an informative comment, depending on the degree of hemolysis detected. Four intervals were established for each analyte: (1) below the desirable SE: the result is reported without comments; (2) between the desirable SE and the RCV: the result is reported, and a comment is added describing the effect of the interference (underestimation or overestimation); (3) above the RCV: the result is not reported, and an informative comment is added stating the reason for rejection; (4) HI >10 g/L: the results of the entire request are canceled. Once these rules were established, they were implemented and automated in the LIS Cobas Infinity IT Solutions.
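These four intervals translate directly into a decision rule of the kind automated in the LIS. The sketch below is our illustration, not the Cobas Infinity rule syntax; the cut-offs passed in would be the analyte-specific Hb values derived above, and the example values are illustrative only.

```python
def hemolysis_rule(hi_g_per_l: float, se_cutoff: float, rcv_cutoff: float) -> tuple[bool, str]:
    """Map a sample's HI (free Hb, g/L) to (report_result, comment) using the
    four intervals: below SE cut-off, SE..RCV, above RCV, and HI > 10 g/L."""
    if hi_g_per_l > 10.0:
        return False, "Severe hemolysis: all results of the request are canceled."
    if hi_g_per_l > rcv_cutoff:
        return False, "Result not reported: hemolysis interference above the RCV."
    if hi_g_per_l > se_cutoff:
        return True, "Result reported with comment: hemolysis may over-/underestimate it."
    return True, ""  # below the desirable SE: reported without comments

# Illustrative potassium cut-offs (g/L Hb): SE cut-off 0.09, RCV cut-off 0.30.
print(hemolysis_rule(0.05, 0.09, 0.30))  # -> reported without comment
print(hemolysis_rule(0.20, 0.09, 0.30))  # -> reported with interference comment
print(hemolysis_rule(0.50, 0.09, 0.30))  # -> rejected with reason
```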
Pre-implementation retrospective study
To assess the impact of the implementation of a hemolysis interference information system based on the above cut-offs, we conducted a retrospective analysis of hemolyzed samples collected between December 2017 and April 2018 (five months) in both the emergency laboratory and the routine biochemistry laboratory. Folic acid concentrations were analyzed only in routine samples. Frequency distributions (in percentage) of the HI results for AST, DB, potassium, LDH, and folic acid were determined according to the intervals described in the above section. Finally, these intervals for each analyte were implemented in the LIS.
Post-implementation prospective study
One year after implementing the system, between September 2018 and April 2019 (eight months), a study was conducted to determine the number of requests in the emergency and routine laboratories falling in each of the four analyte intervals. The percentages were compared with those obtained in the pre-implementation study for the same intervals. Pre- and post-implementation results were obtained from the Omnium database (Oracle; Roche Diagnostics).
Data analysis
Data analysis, interferogram generation, and calculation of straight-line trend values and R² regression coefficients were carried out in spreadsheets (Excel, Microsoft, Redmond, WA, USA).
RESULTS
According to the representation model proposed by Glick, et al. [15] (Fig. 1), LDH showed the highest degree of positive hemolysis interference, reaching 50% at a Hb concentration of approximately 1 g/L. DB reached 50% (negative) interference at a Hb concentration of 0.5–1 g/L. AST and folic acid reached 50% interference at 3 g/L, but above that concentration, folic acid showed a higher percentage of interference than AST. Potassium was the only selected analyte that did not reach the required interference level until a Hb concentration of 10 g/L; however, it was included in the study for comparison with the other analytes, as it is a classical hemolysis marker [20].
Cut-offs were established using the BV levels published in the SEQCML and EFLM BV databases (CVI) and the laboratory CVA level (Table 1). For two analytes, DB and folic acid, CVI levels are present only in the SEQCML database, whereas for the other three analytes, CVI levels are present in both databases. The cut-offs for potassium were the most restrictive, whereas those for folic acid and DB were the most permissive.
For all analytes except DB, the regression line was linear, with R²>0.98. For DB, the regression curve was exponential, with R²>0.98 (Fig. 2) [16]. In all five interferograms, the regression line passed through the origin (0,0).
Table 2 lists the Hb concentration ranges of the intervals established for hemolysis management. LDH and DB had low values in intervals 1 and 2, whereas AST, potassium, and folic acid tolerated a higher Hb concentration at the same level of interference. Results of analytes with Hb concentrations above interval 2 were not reported to clinicians. This applied to both emergency and routine samples.
In the pre-implementation study using emergency and routine samples (Table 3), a low percentage of LDH results were reported correctly in the routine laboratory (28.16% of total requests); the remaining 71.84% were reported but should not have been, given the intervals defined in Table 2.
For routine LDH, the percentage of samples in intervals 2 (2.52%) and 3 (0.21%) decreased between the pre- and post-implementation stages, and that of samples in interval 1 increased (2.72%) (Table 4). This trend was generally not observed in the emergency samples. The percentages for DB were based on a small number of hemolyzed samples, because DB was analyzed only when the total bilirubin concentration was above the reference interval; therefore, these results could not be assessed.
DISCUSSION
Currently, there is no consensus on the criteria for establishing cut-offs for hemolysis interference. The CLSI EP07-A2 guidelines [21] suggest a variation >10% as a significant cut-off for hemolysis interference, regardless of the analyte being studied. The Working Group on Laboratory Errors and Patient Safety of the International Federation of Clinical Chemistry has developed quality indicators for comparison between clinical laboratories [22]. Among the preanalytical indicators, the hemolyzed-sample indicator was defined as the number of samples with a free Hb concentration >0.5 g/L divided by the total number of samples analyzed in the laboratory [23]. Finally, in vitro diagnostics (IVD) companies generally suggest a cut-off of 10% for hemolysis interference, regardless of the analyte being studied. We established cut-offs for five selected analytes, as well as a working algorithm with acceptance and rejection criteria for results based on the interference level. Following the recommendations of the EFLM WG-PRE [13], four intervals were established based on BV criteria (DQS and RCV); these were implemented in the LIS and standardized with comments on the report.
The intervals were calculated from BV data in the SEQCML database [18] because EFLM BV data [19] were not available at the time of the study (2017–2018). However, we observed no significant differences in the management of hemolyzed samples when comparing the values obtained from the two databases (SEQCML and EFLM BV) in the pre- and post-implementation studies (data not shown). The use of BV data from the EFLM BV database is recommended for subsequent studies because these data are more robust.
Based on our data, the number of analytes with interference >10% at a free Hb concentration of 0.5 g/L was very low (data not shown). Increasing the cut-off to an interference of 50% at a Hb concentration >5 g/L allowed us to select the analytes with the highest interference levels, to which the hemolysis management protocol was applied: LDH, AST, DB, potassium, and folic acid. Given their interference levels, these analytes provided more information for assessing the study objectives than the others.
Analyte-by-analyte hemolysis management yields individual degrees of interference and provides more reliable information to practitioners, helping them interpret results. Based on the interferograms, we observed that for four analytes (all except DB), the analytical results were overestimated. The difference in behavior is associated with the interference mechanisms: LDH, AST, potassium, and folic acid are released from RBCs, whereas the decreased DB concentrations are due to chemical interference [6]. It is therefore essential that clinical laboratories establish their own cut-offs for adequate hemolysis management.
Gils, et al. [11] established an HI reference value of 0.16 g/L in a healthy population using the same analytical platform we used. In our study, the interval 1 limits for potassium and LDH were 0.09 and 0.11 g/L, respectively, below the limit of 0.16 g/L proposed by Gils, et al. [11]. Therefore, our protocol reduces the underestimation of hemolysis compared with the use of a fixed reference limit for all analytes in a sample. Conversely, AST, DB, and folic acid had interval 1 cut-offs >0.16 g/L, indicating that a fixed limit for the entire sample would overestimate hemolysis. This protocol thus allows more individualized interference management and increases patient safety by reducing erroneous interpretation of analytical results. The use of BV-based cut-offs allows us to be more restrictive for analytes with narrower CVI levels, such as potassium, for which concentrations >0.09 g/L are flagged as overestimated under interval 1 of our protocol, versus 2 g/L under a 10% variation (CLSI criterion) [21]. Before this study, our laboratory assessed hemolysis based on the cut-off established by the IVD provider (0.15 g/L for any analyte) and generally discarded the results of analytes such as potassium or LDH without including informative comments.
To prove the efficacy of the algorithm implemented in the LIS, pre- and post-implementation studies were conducted to analyze its impact on the management of hemolyzed samples. The pre-implementation study used the upper limit of interval 2 as a cut-off for each analyte to examine how the samples would have been managed. For potassium, the percentage of samples that would have been adequately reported based on this upper limit was higher in the routine laboratory than in the emergency laboratory (95.11% vs. 92.52%); conversely, based on the new levels in our study, the percentage of incorrectly reported samples was lower (4.89% vs. 7.48%). The percentage of correctly reported samples for LDH was notably low in both routine and emergency laboratories (28.16% vs. 26.38%), resulting in a high percentage of incorrectly reported samples (71.84% vs. 73.62%). LDH is more sensitive to hemolysis than other analytes, whereas potassium is sensitive to inadequate sample transport. LDH is released from RBC only when the membrane breaks (hemolysis), whereas potassium is transported through the intact membrane, which may account for the differences between these two analytes [24].
The comparison of results between the pre- and post-implementation stages showed an improvement in the percentage of hemolyzed samples in the routine laboratory, with a decreased percentage of samples in interval 2 and an increased percentage in interval 1, mainly for LDH and potassium. The protocol thus allowed us to increase the percentage of reported samples without hemolysis interference (interval 1). However, this improvement was not observed in the emergency laboratory, where the opposite trend was seen in the percentages of samples in intervals 1 and 2. For both emergency and routine samples, a slight decrease in hemolysis levels was found in intervals 3 and 4; however, these results are not significant considering the low number of samples in these two groups.
The implementation of a harmonized protocol for hemolysis management makes it possible to achieve homogeneous reporting criteria across clinical laboratories. Moreover, informing the requesting practitioners about the degree and type of interference (under- or overestimation) through pre-established comments leads to better clinical interpretation of the results. We have observed a progressive improvement in the quality of the samples received in the laboratory (lower degree of hemolysis), probably due to a better understanding of the clinical impact of interference by the requesting practitioners.
Individual laboratories should establish the degree of hemolysis interference for all analytes, as it is affected by the analytical methodology and equipment used. The establishment of individualized, BV-based cut-offs for interference management makes it possible to implement a harmonized protocol for hemolysis management in the laboratory. Moreover, the implementation of such a protocol would make it possible to compare results across laboratories that assess hemolysis on an analyte-by-analyte basis. This protocol is endorsed by recommendations such as those of the EFLM WG-PRE [13]. In our laboratory, after the protocol had been implemented and assessed, there was an improvement in the adequate detection of the interference level caused by hemolysis for each analyte and a decrease in the percentage of samples affected by interference. Periodic assessment of results would support an internal monitoring protocol focused on the preanalytical stage, serving as an internal control tool. Further studies using other analytical platforms (with different method-instrument combinations) are needed to assess the protocol's effectiveness. Finally, the implementation of this protocol has a high impact on the quality of results, allowing better clinical decisions and increasing patient safety.