To explore ambulance nurses' (ANs) experiences of non-conveying patients to alternate levels of care.
Increases in ambulance utilisation and in the number of patients seeking ambulance care who do not require medical supervision or treatment during transport have led to increased nonconveyance (NC) and referral to other levels of care.
A qualitative interview study was conducted using an inductive research approach.
The study was conducted in a region in the middle of Sweden during 2016-2017. Twenty nurses were recruited from the ambulance departments in the region. A conventional content analysis was used to analyse the interviews. The study followed the COREQ checklist.
The ANs experienced NC as a complex and difficult task that carried considerable responsibility. They wanted to act professionally, spend time with the patient and find the best solution for each patient. These needs conflicted with the ANs' desire to remain available for higher-priority assignments. The ANs could feel frustrated when they perceived that ambulance resources were being misused and when the NC guidelines were difficult to follow.
If ANs are expected to nonconvey patients seeking ambulance care, they need a formal mandate, knowledge and access to primary health care.
This study provides new knowledge regarding the work situation of ANs in relation to NC. These findings can guide future research and can be used by policymakers and ambulance organisations to highlight areas that need to evolve to improve patient care.
Improved recycling of end-of-life vehicles (ELVs) may serve as an important strategy to address resource security risks related to increased global demand for scarce metals. However, in-depth knowledge of the magnitude and fate of such metals entering ELV recycling is lacking. This paper quantifies the input of 25 scarce metals to Swedish ELV recycling, and estimates the extent to which they are recycled to material streams where their metal properties are utilised, i.e. are functionally recycled. Methodologically, scarce metals are mapped to main types of applications within newly produced Swedish car models and, subsequently, material flow analysis of ELV waste streams is used as a basis for identifying pathways of these applications and assessing whether the contained metals are functionally recycled. Results indicate that, of the scarce metals, only platinum may be functionally recycled in its main application. Cobalt, gold, manganese, molybdenum, palladium, rhodium and silver may be functionally recycled depending on application and pathways taken. For the remaining 17 metals, functional recycling is absent. Consequently, despite high overall ELV recycling rates of materials in general, there is considerable risk of losing ELV scarce metals to carrier metals, construction materials, backfilling materials and landfills. Given differences in the application of metals and identified pathways, prospects for increasing functional recycling are discussed.
The genetic susceptibility to colorectal cancer (CRC) has been estimated at around 35%, and yet the high-penetrance germline mutations found so far explain less than 5% of all cases. Much of the remaining variation could be due to the co-inheritance of multiple low-penetrance variants. The identification of all the susceptibility alleles could have public health relevance in the near future. To test the hypothesis that what are considered polymorphisms in human CRC genes could constitute low-risk alleles, we selected eight common SNPs for a pilot association study in 1785 cases and 1722 controls. One SNP, rs3219489:G>C (MUTYH Q324H), seemed to confer an increased risk of rectal cancer in homozygous status (OR=1.52; CI=1.06-2.17). When the analysis was restricted to our 'super-controls', healthy individuals with no family history of cancer, rs1799977:A>G (MLH1 I219V) was also associated with an increased risk in both colon and rectal patients, with odds ratios of 1.28 (CI=1.02-1.60) and 1.34 (CI=1.05-1.72), respectively (under the dominant model), while two SNPs, rs1800932:A>G (MSH6 P92P) and rs459552:T>A (APC D1822V), seemed to confer a protective effect. The latter, in particular, showed an odds ratio of 0.76 (CI=0.60-0.97) among colon patients and 0.73 (CI=0.56-0.95) among rectal patients. In conclusion, our study suggests that common variants in human CRC genes could constitute low-risk alleles.
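The genotype-level risk estimates above come from standard 2x2 case-control tables. As a minimal sketch of how such an odds ratio and its 95% confidence interval are derived (the carrier counts below are hypothetical, not the study's actual genotype data):

```python
import math

def odds_ratio_ci(cases_exp, cases_unexp, controls_exp, controls_unexp):
    """Odds ratio with a 95% Wald confidence interval (Woolf's method)."""
    or_ = (cases_exp * controls_unexp) / (cases_unexp * controls_exp)
    # standard error of log(OR) from the four cell counts
    se = math.sqrt(1 / cases_exp + 1 / cases_unexp
                   + 1 / controls_exp + 1 / controls_unexp)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical table: homozygous carriers vs. all others, cases vs. controls
or_, lo, hi = odds_ratio_ci(90, 1695, 60, 1662)
```

A confidence interval whose lower bound exceeds 1 (as for the MUTYH Q324H result above) is what supports calling a variant a risk allele.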
Prehospital emergency medicine is a challenging discipline characterized by a high level of acuity, a lack of clinical information and a wide range of clinical conditions. These factors contribute to the fact that prehospital emergency medicine is a high-risk discipline in terms of medical errors. Prehospital use of a Computerized Decision Support System (CDSS) may be a way to increase patient safety, but few studies have evaluated its effect in prehospital care. The aim of the present study was to evaluate a CDSS.
In this non-blinded, block-randomized controlled trial, 60 ambulance nurses participated, randomized into two groups. To compensate for an expected learning effect, each group was further divided into two subgroups: one started with case A and the other with case B. The intervention group had access to a CDSS and treated the two simulated patient cases with its aid. The control group treated the same cases with the aid of a regional guideline in paper format. Performance was measured as compliance with regional prehospital guidelines and on-scene time (OST).
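The counterbalanced design described above can be sketched in code; the allocation procedure and subgroup labels below are illustrative assumptions, not the trial's actual randomization protocol:

```python
import random

def assign(nurses, seed=1):
    """Sketch of the allocation: randomize participants to CDSS vs. paper
    guideline, then split each arm by starting case (A-first vs. B-first)
    to offset the expected learning effect. Balanced sizes are assumed."""
    rng = random.Random(seed)
    shuffled = nurses[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    arms = {"CDSS": shuffled[:half], "paper": shuffled[half:]}
    allocation = {}
    for arm, members in arms.items():
        q = len(members) // 2
        allocation[arm] = {"A-first": members[:q], "B-first": members[q:]}
    return allocation

alloc = assign([f"nurse{i}" for i in range(60)])
```

Counterbalancing the case order within each arm means any learning effect from the first case is distributed evenly across both arms.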
There was no significant difference in the two groups' characteristics. The intervention group had higher compliance in both cases, 80% vs. 60% (p
In patients who call for the emergency medical service (EMS), there is a knowledge gap regarding the final assessment after arrival at hospital and its association with the assessment in the field.
In a representative population of patients who call for the EMS, to describe a) the final assessment at hospital discharge and b) the association between the assessment in the field and the assessment at hospital discharge.
Thirty randomly selected patients reached by a dispatched ambulance each month between 1 Jan and 31 Dec 2016 in one urban, one rural and one mixed ambulance organisation in Sweden took part in the study. The exclusion criteria were age?
Artificial insemination is widely used in many cattle breeding programs. Semen samples of breeding bulls are collected and closely examined immediately after collection at artificial insemination centers. Only ejaculates without anomalous findings are retained for artificial insemination. Although morphological aberrations of the spermatozoa are a frequent reason for discarding ejaculates, the genetic determinants underlying poor semen quality are scarcely understood.
A tail stump sperm defect was observed in three bulls of the Swedish Red cattle breed. The spermatozoa of affected bulls were immotile because of severely disorganized tails indicating disturbed spermatogenesis. We genotyped three affected bulls and 18 unaffected male half-sibs at 46,035 SNPs and performed homozygosity mapping to map the fertility disorder to an 8.42 Mb interval on bovine chromosome 13. The analysis of whole-genome re-sequencing data of an affected bull and 300 unaffected animals from eleven cattle breeds other than Swedish Red revealed a 1 bp deletion (Chr13: 24,301,425 bp, ss1815612719) in the eleventh exon of the armadillo repeat containing 3-encoding gene (ARMC3) that was compatible with the supposed recessive mode of inheritance. The deletion is expected to alter the reading frame and to induce premature translation termination (p.A451fs26). The mutated protein is shortened by 401 amino acids (46 %) and lacks domains that are likely essential for normal protein function.
We report the phenotypic and genetic characterization of a sterilizing tail stump sperm defect in the Swedish Red cattle breed. Exploiting high-density genotypes and massive re-sequencing data enabled us to identify the most likely causal mutation for the fertility disorder in bovine ARMC3. Our results provide the basis for monitoring the mutated variant in the Swedish Red cattle population and for the early identification of infertile animals.
Respiratory agents may be detected in the oropharynx of healthy individuals. The extent of this condition and the reasons behind it are largely unknown. The objective of this study was to determine the factors associated with the presence of respiratory agents in the oropharynx of adolescents healthy enough to attend school activities.
On a single day in December, samples from the posterior wall of the oropharynx of adolescents aged 10-15 y were obtained using cotton-tipped swabs. The samples were analyzed by real-time polymerase chain reaction (PCR) for the presence of 13 respiratory viruses and 2 bacteria (Mycoplasma pneumoniae and Chlamydophila pneumoniae).
Out of the 232 adolescents sampled, 67 (29%) had at least one respiratory symptom. A positive PCR result was found in 50 individuals (22%). Human rhinovirus was the most commonly found agent. Respiratory agents were significantly more frequent in the younger age group (10-13 y) than in the older age group (14-15 y): 26% (38/148) vs 14% (12/84), respectively; p = 0.04. Cough was the only symptom that was more common among individuals with a positive PCR test than among those with a negative PCR test: 8/50 (16%) vs 11/182 (6%); p = 0.02. Family size and class size were not associated with the likelihood of a positive PCR test.
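The age-group comparison above (26% vs. 14%, p = 0.04) is consistent with a standard two-proportion z-test on the stated counts; the abstract does not name the test used, so the sketch below is only a plausibility check using the numbers given:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test for the difference of two proportions (pooled SE)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)          # pooled proportion under H0
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF via the error function
    pval = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, pval

# 38/148 positive in the 10-13 y group vs. 12/84 in the 14-15 y group
z, pval = two_proportion_z(38, 148, 12, 84)
```

Running this on the abstract's counts reproduces a p-value near the reported 0.04, which is the expected agreement since z² equals the chi-square statistic for a 2x2 table.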
The presence of respiratory agents in the oropharynx is a frequent finding among adolescents healthy enough to attend school activities. The high prevalence was found to be associated with young age, but not with the size of the family or class.
The major factor affecting morbidity and mortality after lung transplantation (LTX) is bronchiolitis obliterans syndrome (BOS). Earlier studies have suggested a connection between the presence of viral agents and morbidity in this patient group, but the data are somewhat conflicting. The objective of this study was to investigate the development of BOS and graft loss after LTX in relation to the presence of respiratory viruses during the first year after LTX.
The study is a retrospective cohort study of 39 LTX recipients 11-13 years after surgery. Patients were operated on between January 1, 1998 and December 31, 2000 at Sahlgrenska University Hospital. The presence of virus in bronchoalveolar lavage (BAL) fluid from patients during the first year after surgery was analyzed retrospectively using a multiplex polymerase chain reaction test capable of detecting 15 respiratory agents. The time to BOS or graft loss was analyzed in relation to positive findings in BAL during the first year after LTX.
Patients with one or more viruses detected in BAL during the first year after transplantation demonstrated a significantly faster development of BOS (P=0.005) compared with patients with no virus detected. No significant difference in graft survival was found.
Our results suggest that the long-term prognosis after LTX may be negatively affected by viral respiratory tract infections during the first year after LTX.
The microbial etiology of community-acquired pneumonia (CAP) is often unclear in clinical practice, and previous studies have produced variable results. Population-based studies examining etiology and incidence are lacking. This study examined the incidence and etiology of CAP requiring hospitalization in a population-based cohort as well as risk factors and outcomes for specific etiologies.
Consecutive admissions due to CAP in Reykjavik, Iceland were studied. Etiologic testing was performed with cultures, urine-antigen detection, and polymerase chain reaction analysis of airway samples. Outcomes were length of stay, intensive care unit admission, assisted ventilation, and mortality.
The inclusion rate was 95%. The incidence of CAP requiring hospitalization was 20.6 cases per 10,000 adults/year. A potential pathogen was detected in 52% (164 of 310) of admissions and in 74% (43 of 58) of those with complete sample sets. Streptococcus pneumoniae was the most common pathogen (61 of 310, 20%; incidence: 4.1/10,000). Viruses were identified in 15% (47 of 310; incidence: 3.1/10,000), Mycoplasma pneumoniae in 12% (36 of 310; incidence: 2.4/10,000), and multiple pathogens in 10% (30 of 310; incidence: 2.0/10,000). Recent antimicrobial therapy was associated with increased detection of M. pneumoniae (P
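The pathogen-specific incidences quoted above follow from scaling the overall incidence by each pathogen's share of the 310 admissions. A quick consistency check, using only the figures stated in the abstract:

```python
# Each pathogen-specific incidence should equal the overall incidence
# (20.6 per 10,000 adults/year) multiplied by that pathogen's share of
# the 310 admissions. All counts and rates are taken from the abstract.
overall_incidence = 20.6   # cases per 10,000 adults/year
total_admissions = 310

counts = {
    "S. pneumoniae": 61,
    "viruses": 47,
    "M. pneumoniae": 36,
    "multiple pathogens": 30,
}
derived = {name: round(n / total_admissions * overall_incidence, 1)
           for name, n in counts.items()}
```

Each derived value matches the reported rate (4.1, 3.1, 2.4 and 2.0 per 10,000 respectively), confirming the internal consistency of the reported figures.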
The objective of this study was to compare the efficacy of ritonavir boosted atazanavir versus ritonavir boosted lopinavir or efavirenz, all in combination with 2 nucleoside analogue reverse transcriptase inhibitors (NRTIs), over 144 weeks in antiretroviral-naïve HIV-1-infected individuals.
A prospective open-label randomized controlled trial was conducted at 29 sites in Sweden and Norway between April 2004 and December 2009. Patients were randomized to receive either efavirenz 600 mg once daily (EFV), atazanavir 300 mg and ritonavir 100 mg once daily (AZV/r), or lopinavir 400 mg and ritonavir 100 mg twice daily (LPV/r). The primary endpoints were the proportions of patients achieving virological suppression of HIV-1 RNA. Patients with HIV-1 RNA above 100,000 copies/ml at baseline had similar response rates in all arms.
EFV was superior to LPV/r at week 48, but there were no significant differences between the 3 arms in the long-term (144 weeks) follow-up.