Bacterial contamination of blood and its cellular components remains the most common microbiological cause of transfusion-associated morbidity and mortality, even in developed countries. This as-yet-unresolved complication is seen most often in platelet transfusions, because platelet concentrates are stored at room temperature in gas-permeable containers with constant agitation, conditions that allow bacteria to proliferate from low, undetectable levels at the beginning of storage to high, virulent titers with endotoxin generation by the end of the shelf life. Accordingly, several combined strategies have been introduced and implemented to at least reduce the risk of transfusing bacterially contaminated products. These include: improved cleaning of donors' arms; bacterial avoidance by diversion of the first portion of the collection; reduction of bacterial growth through the development of newer storage media permitting a longer platelet shelf life; bacterial load reduction by leucoreduction/viral inactivation in some countries; and elimination of potentially contaminated units through screening with currently available testing procedures, although none of these is yet fully reliable. We have not seen the same reduction in bacteria-associated transfusion infections as the sharp drop observed in transfusion-associated transmission rates of HIV and hepatitis B and C. This great reduction in viral transmission is due not only to the introduction of newer, more sensitive and specific detection methods for the different viruses, but also to the identification of donor risk groups through questionnaires and personal interviews. While the search for more efficient methods of identifying potential blood donors with asymptomatic bacteremia, as well as for better ways of detecting bacteria in stored blood components, will continue, it is also necessary to establish more standardized guidelines for recognizing adverse reactions in recipients of potentially contaminated units.
Efforts should also be directed, as a high priority, towards identifying blood donors with a significant risk of bacteremia at the time of donation. The goal of this review is to highlight strategies for identifying the sources of bacterial contamination of blood components in Norway and for identifying donors with a higher risk of bacteremia at the time of donation. The key to achieving this goal is the continual revision and upgrading of the Norwegian transfusion guidelines, based on the transfusion legislation, and the introduction of a relevant specialized donor bacterial questionnaire.
The aim of this study was to measure blood concentrations of environmental pollutants in Norwegian donors and evaluate the risk of pollutant exposure through blood transfusions.
Transfused blood may be a potential source of exposure to heavy metals and organic pollutants and presents a risk to vulnerable patient groups such as premature infants.
Donors were randomly recruited from three Norwegian blood banks in Bergen, Tromsø and Kirkenes. Selected heavy metals were measured in whole blood using inductively coupled plasma mass spectrometry (ICP-MS), and perfluoroalkyl substances (PFAS) were measured in serum by ultrahigh-pressure liquid chromatography coupled with a triple-quadrupole mass spectrometer (UHPLC-MS/MS).
Almost 18% of blood donors had lead concentrations over the limit suggested for transfusions in premature infants (0.09 µmol/L). About 11% of all donors had mercury concentrations over the suggested limit of 23.7 nmol/L. Cadmium was higher than the limit, 16 nmol/L, in 4% of donors. Perfluorooctanoate (PFOA) and perfluorooctane sulfonate (PFOS) concentrations were over the suggested limit of 0.91 ng/mL in 68% and 100% of the donors, respectively. PFAS and heavy metal concentrations increased with donor age.
A considerable percentage of donors had lead, PFOS and PFOA concentrations over the suggested limits. In addition, at each study site there were donors with high mercury and cadmium concentrations. Selecting young donors or measuring pollutants in donor blood may be a feasible approach to avoiding exposure of vulnerable patient groups, such as premature infants, through blood transfusions.
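The screening logic implied by these results can be sketched as a simple threshold check. This is a minimal illustrative example, not part of the study: the limit values are those quoted above, while the function and variable names are hypothetical.

```python
# Minimal sketch (hypothetical helper, not from the study): flag donor
# samples whose pollutant concentrations exceed the suggested limits
# for transfusion to premature infants cited in the abstract.

# Suggested limits, in the units used above.
LIMITS = {
    "lead_umol_L": 0.09,      # lead, µmol/L
    "mercury_nmol_L": 23.7,   # mercury, nmol/L
    "cadmium_nmol_L": 16.0,   # cadmium, nmol/L
    "pfoa_ng_mL": 0.91,       # PFOA, ng/mL
    "pfos_ng_mL": 0.91,       # PFOS, ng/mL
}

def flag_sample(measurements: dict) -> list:
    """Return the analytes in `measurements` that exceed their limit."""
    return [analyte for analyte, value in measurements.items()
            if analyte in LIMITS and value > LIMITS[analyte]]

# Example donor sample (illustrative values only).
sample = {"lead_umol_L": 0.12, "mercury_nmol_L": 10.0, "pfos_ng_mL": 2.4}
print(flag_sample(sample))  # lead and PFOS exceed their limits
```

In practice the decision would of course rest on validated laboratory cut-offs, not a lookup table; the sketch only illustrates the comparison against per-analyte limits.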
Hereditary haemochromatosis may result in severe organ damage which can be prevented by therapy. We studied the possible advantages and disadvantages of erythrocytapheresis as compared with phlebotomy in patients with hereditary haemochromatosis.
In a prospective, randomised, open-label study, patients with hereditary haemochromatosis were randomised to bi-weekly apheresis or weekly whole blood phlebotomy. Primary end-points were decrease in ferritin levels and transferrin saturation. Secondary endpoints were decrease in haemoglobin levels, discomfort during the therapeutic procedure, costs and technicians' working time.
Sixty-two patients were included. Thirty patients were randomised to apheresis and 32 to whole blood phlebotomy. Initially, ferritin levels declined more rapidly in the apheresis group, and the difference became statistically highly significant at 11 weeks; however, time to normalisation of ferritin level was equal in the two groups. We observed no significant differences in decline of transferrin saturation, haemoglobin levels or discomfort. The mean cumulative technician time consumption until the ferritin level reached 50 µg/L was longer in the apheresis group, but the difference was not statistically significant. The cumulative costs for materials until achievement of the desired ferritin levels were three-fold higher in the apheresis group.
Treatment of hereditary haemochromatosis with erythrocytapheresis instead of whole blood phlebotomy results in a more rapid initial decline in ferritin levels and a reduced number of procedures per patient, but not in earlier achievement of target ferritin level. The frequency of discomfort was equally low with the two methods. The costs and, probably, technician time consumption were higher in the apheresis group.
The voluntary, non-remunerated blood donation organization is an important part of the International Red Cross movement. Historically, the Red Cross Blood Program was established in Oslo with the main objectives of recruiting new donors, supplying the blood banks with recruitment materials and supporting the recruitment efforts of local Red Cross branches. Currently, the continual education of skilled personnel at all levels is considered an essential part of such a program. The year 2013 coincides with the 12th anniversary of the Norwegian educational program Quality in Transfusion Medicine. This report focuses on the historical background of the Red Cross Quality Course in Transfusion Medicine, the progress made so far and future perspectives.
A low concentration of serum folate is associated with an increased risk of cardiovascular disease. Extracellular cysteine is involved in aging, cancer and cardiovascular disease. The relationship between serum folate and plasma cysteine is poorly understood. Therefore, we investigated this relationship in industry workers, whose health has economic implications.
The concentration of serum folate was determined by the Access ImmunoAssay System Sanofi Pasteur. Plasma cysteine and homocysteine were measured by an ion-pair HPLC method. The concentrations of serum triglycerides were determined by an enzymatic colorimetric method.
We detected a positive correlation between the concentration of serum folate and plasma cysteine, whereas the concentration of serum folate was negatively correlated with plasma homocysteine and serum triglycerides. In a multiple regression analysis with adjustment for age, BMI and smoking, serum folate as the dependent variable exhibited a strong relationship with plasma cysteine, and a negative relationship with plasma homocysteine and serum triglycerides.
We observed significant correlations between serum folate, plasma cysteine and serum triglyceride concentrations in industry workers, implying that folate may modulate key aspects of the body's cysteine and lipid metabolism.
In 2007, previous syphilis infection was diagnosed in a blood donor who had given blood regularly for 15 years. This was discovered when the donor was tested for syphilis as a new donor at another blood bank. The time of infection is unknown. An expert group, set up by The Norwegian Directorate of Health, was commissioned to evaluate the risk of syphilis transmission through blood products in Norway.
The expert group based its evaluation on the epidemiology of syphilis, risk of infection and properties of the syphilis bacterium, especially in relation to blood donation. Specific information about the actual incident, made available by the Norwegian Directorate of Health, was also evaluated.
Of 54 blood recipients, 21 were alive and 18 of these (86 %) were tested for syphilis, all with negative results. For 11 of the deceased, the hospital records were examined without finding signs of syphilis infection.
The risk of transfusion-transmitted syphilis is low for several reasons: the prevalence of syphilis in the population is low; a compulsory interview and completion of a questionnaire before donation in Norway exclude donors who are ill or at risk of being infected; the proportion of fresh blood donations is very low; and syphilis bacteria die quickly under normal storage conditions for blood. An incidental infection is symptomatic and easily treated with antibiotics. The expert group recommends not starting syphilis testing of each blood donor but continuing the present routine testing of new donors.
BACKGROUND: Clinical effect of platelet (PLT) transfusion is monitored by measures of PLT viability (PLT recovery and survival) and functionality. In this study we evaluate and compare transfusion effect measures in patients with chemotherapy-induced thrombocytopenia due to treatment of acute leukemia. STUDY DESIGN AND METHODS: Forty transfusions (28 conventional gamma-irradiated and 12 pathogen-inactivated photochemical-treated PLT concentrates [PCs]) were investigated. PC quality was analyzed immediately before transfusion. Samples were collected from thrombocytopenic patients at 1 and 24 hours for PLT increments and thromboelastography (TEG), with assessments of bleeding score and intertransfusion interval (ITI). Data were analyzed by Spearman's correlation. Patient and PC variables influencing the effect of transfusion were analyzed by use of a mixed-effects model. RESULTS: PLT dose, storage time, and pathogen inactivation correlated with PLT recovery but not with PLT survival (including ITI), TEG, or clinical bleeding. Fever was negatively correlated with PLT survival but did not affect PLT recovery. After 1 and 24 hours, strong correlations were observed within measures of PLT viability and between PLT increment and the TEG value maximal amplitude (MA). Negative correlation was observed between late MA increment and clinical bleeding status after transfusion (r = -0.494, p = 0.008). PLT count increments did not correlate with clinical bleeding status. CONCLUSIONS: PLT dose and quality of PCs are important for optimal immediate transfusion response, whereas duration of transfusion effect is influenced mainly by patient variables. The TEG value MA correlates with PLT count increments and bleeding, thus reflecting both PLT viability and functionality.
The Trauma Hemostasis and Oxygenation Research Network held its third annual Remote Damage Control Resuscitation Symposium in June 2013 in Bergen, Norway. The Trauma Hemostasis and Oxygenation Research Network is a multidisciplinary group of investigators with a common interest in improving outcomes and safety in patients with severe traumatic injury. The network's mission is to reduce the risk of morbidity and mortality from traumatic hemorrhagic shock in the prehospital phase of resuscitation through research, education, and training. The concept of remote damage control resuscitation is in its infancy, and there is a significant amount of work that needs to be done to improve outcomes for patients with life-threatening bleeding secondary to injury. The prehospital phase of resuscitation is critical in these patients. If shock and coagulopathy can be rapidly identified and minimized before hospital admission, this will very likely reduce morbidity and mortality. This position statement begins to standardize the terms used, provides an acceptable range of therapeutic options, and identifies the major knowledge gaps in the field.
Thromboelastography (TEG) has been part of the assessment of patients receiving massive transfusion (MT) at Haukeland University Hospital (HUH) since 2007. However, the test has been used inconsistently, and the value of the test in the evaluation of patients with critical bleeding is still debated, although it has been suggested that TEG-guided treatment decreases blood usage. This single-centre retrospective study examines the use of TEG and discusses its place in the assessment of MT patients. The study focuses on the amount of blood product transfused in TEG-tested and non-TEG-tested patients, and on whether TEG-assisted coagulation therapy has affected mortality compared with conventional coagulation tests (CCTs). The study is based on data from the massive transfusion study (MTS) 2002-15. 241 MT patients were identified, and they were grouped into patients assessed with TEG and patients who did not receive this evaluation. In a sub-analysis, the patients with the 30 best and 30 worst initial (first TEG test) curves were defined based on normal ranges for the parameters R-time, α-angle, Maximal Amplitude (MA) and lysis after 30 min (LY). Survival rates and blood product usage were compared between these groups and between TEG and non-TEG patients. 111 patients were tested with TEG and 130 were not. The patients with highly pathological TEG curves (worst) had significantly higher mortality than the 30 normal-TEG patients (best) after 24 h (p