The Norwegian Action Plan against Campylobacter in broilers was implemented in May 2001 with the objective of reducing human exposure to Campylobacter through Norwegian broilers. From each flock, samples collected at the farm about one week prior to slaughter, and again at the slaughter plant, are examined for the presence of Campylobacter. All farmers with positive flocks receive follow-up biosecurity advice. Sampling of broiler products at retail level is also included in the Action Plan. The aim of this study was to evaluate the existing sampling and culturing methods of the Norwegian Action Plan against Campylobacter in broilers. The material comprised pooled faecal samples, pooled cloacal swabs and caecal samples from individual birds. For all three sample types, the highest number of positives was obtained at an incubation temperature of 41.5 degrees C. When the results at incubation temperatures of 37 and 41.5 degrees C were compared, the faecal samples from the farms showed high concordance, with a kappa value of 0.88. The results from culturing cloacal swabs and caecal samples collected at the slaughter plant agreed poorly (kappa 0.21) and moderately (kappa 0.57), respectively, and both were significantly discordant at the 0.05 level. Modelling of farm-level data indicated that increasing the number of pooled samples per flock from two (the existing regime) to three would raise the flock sensitivity from 89% to 95%. Modelling of slaughter-plant data indicated that three pooled cloacal swabs are needed to identify 90% of the positive flocks. Modelling of the caecal data indicated that samples from seven individual birds are sufficient to identify 90% of the positive flocks; caecal samples could thus be an alternative to cloacal sampling at the slaughter plant.
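Under a simple independence assumption, the flock-sensitivity figures from the modelling above can be sketched as follows. The per-sample sensitivity p = 0.67 is a hypothetical value, chosen only so that two pooled samples give roughly the reported 89%; it is not a parameter from the study, and the study's actual model need not assume independent samples.

```python
# Minimal sketch: flock-level sensitivity when n pooled samples are taken
# from a positive flock, each detecting it independently with probability p.

def flock_sensitivity(p: float, n: int) -> float:
    """Probability that at least one of n pooled samples tests positive."""
    return 1.0 - (1.0 - p) ** n

p = 0.67  # assumed per-sample sensitivity (hypothetical, not from the study)
for n in (1, 2, 3):
    print(n, round(flock_sensitivity(p, n), 3))
```

With this p, two samples give about 89% and three about 96%, in the same range as the 89% and 95% reported for the farm-level data.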
The objective of this study was to evaluate eO, a biological time-temperature integrator (TTI), as a quality and safety indicator for ground beef and spiced cooked chicken slices, both packed under modified atmosphere. Storage trials and challenge tests were thus performed on several batches of the studied foods to monitor and model the behavior of Listeria monocytogenes, Salmonella, Staphylococcus aureus and the indigenous food flora. Two prototypes of the eO TTI were then designed and manufactured to match the shelf lives of the studied products. The evolution of the TTI over time at static and dynamic temperatures was monitored and modeled. Finally, exposure assessment models were built and run under several realistic storage profiles to estimate the distributions of the indigenous flora concentration and of the increase in pathogen populations at the end of the product shelf life or at the end point of the TTI, taking the batch-to-batch variability of the TTIs into account. The results showed that, under poor storage conditions, the TTI can reduce consumer exposure to spoiled or hazardous foods.
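The exposure-assessment step described above can be illustrated with a minimal predictive-microbiology sketch: integrating temperature-dependent growth over a storage temperature profile. The classic Ratkowsky square-root model and the parameter values b and Tmin below are generic illustrations, not the models or values fitted in the study.

```python
# Hedged sketch of an exposure-assessment calculation: growth of a bacterial
# population over a storage profile, using the Ratkowsky square-root model
# sqrt(mu_max) = b * (T - Tmin). Parameter values are illustrative only.

import math

B, T_MIN = 0.023, -2.0  # assumed parameters (1/(sqrt(h) * degC), degC)

def mu_max(temp_c: float) -> float:
    """Maximum specific growth rate (ln units per hour) at temp_c."""
    if temp_c <= T_MIN:
        return 0.0
    return (B * (temp_c - T_MIN)) ** 2

def log10_increase(profile: list[tuple[float, float]]) -> float:
    """Growth (log10 CFU) over a profile of (hours, temperature) steps,
    assuming exponential phase throughout (no lag, no stationary phase)."""
    ln_growth = sum(hours * mu_max(t) for hours, t in profile)
    return ln_growth / math.log(10)

# 48 h of proper chilling at 4 degC, then a 24 h abuse period at 12 degC
profile = [(48.0, 4.0), (24.0, 12.0)]
print(round(log10_increase(profile), 2))
```

A full exposure model of the kind the abstract describes would additionally draw storage profiles and model parameters from distributions, which is what produces distributions of the final concentrations.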
Campylobacter spp. (n = 405), isolated from the feces of apparently healthy grow-finish pigs in 20 herds, were tested for susceptibility to 10 antimicrobials representing seven classes. Twelve percent of the isolates were susceptible to all drugs, while 64% were resistant to two or more antimicrobial classes. Resistance was most common to clindamycin, azithromycin, and erythromycin (71% each), and 10% of the isolates were resistant to ciprofloxacin. An antimicrobial use risk-factor analysis and a variance analysis explored the connection between antimicrobial resistance and the herd. The antimicrobial exposure of each production phase of each herd, through feed and water, was evaluated as a potential risk factor for resistance to macrolides and quinolones. Every 100,000 pig days of macrolide exposure in nursery pigs increased the odds of resistance to macrolides by a factor of 1.3. In contrast, the odds of resistance to a quinolone were nine times higher in Campylobacter from herds without beta-lactam exposure in grow-finish pigs compared with those with exposure. The variance analysis identified remarkably high clustering between isolates within herds; the intraclass correlations for resistances ranged from 0.52 to 0.82. Such extreme clustering demonstrates the potential for herd-level interventions to influence antimicrobial resistance in Campylobacter. The three key findings of this study, i.e., the prevalent resistance to macrolides, the association between macrolide exposure and Campylobacter resistance to macrolides, and the high clustering of resistance within herds, illustrate the need for continued study of antimicrobial-resistant Campylobacter on pig farms and the importance of judicious antimicrobial use in pork production.
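The intraclass correlations reported above measure how strongly resistance status clusters within herds. A minimal sketch under a standard one-way variance-components model, where ICC = between-herd variance divided by total variance (the variance values below are hypothetical, not from the study):

```python
# Hedged sketch: intraclass correlation from variance components.
# ICC near 1 means isolates within a herd are nearly identical in
# resistance status; ICC near 0 means herd membership explains little.

def icc(var_between: float, var_within: float) -> float:
    """One-way variance-components intraclass correlation."""
    return var_between / (var_between + var_within)

# Hypothetical variance components giving the low end of the reported range
print(round(icc(2.6, 2.4), 2))  # 0.52
```

An ICC of 0.52 to 0.82, as reported, means that half or more of the total variation in resistance lies between herds rather than within them, which is why herd-level interventions have such leverage.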
A total of 890 samples of fresh produce obtained from Norwegian markets were examined in order to assess the bacteriological quality of the products and their potential public health risk. The samples comprised lettuce, pre-cut salads, growing herbs, parsley and dill, mushrooms and strawberries. The samples were analysed for the presence of thermotolerant coliform bacteria (TCB), Escherichia coli O157, Salmonella spp., Listeria monocytogenes, Staphylococcus spp., and Yersinia enterocolitica. Neither Salmonella spp. nor E. coli O157 was isolated. TCB were isolated from a small proportion of samples in all product groups. Three samples harboured L. monocytogenes: one isolate belonged to serogroup 1 (champignons) and two belonged to serogroup 4 (Chinese leaves and strawberries). Staphylococci were isolated from a relatively large proportion of the strawberry and mushroom samples; however, only four isolates were identified as S. aureus (non-toxinogenic). PCR indicated the presence of Y. enterocolitica in a few of the lettuce samples, whereas no positive samples were found by culturing. The study shows that the occurrence of pathogenic bacteria and TCB in the products analysed was quite low. Nevertheless, the results indicate that products of this type may contain pathogenic bacteria and thereby pose a food-borne disease risk to consumers.
Portuguese chouriço de vinho is made by drying coarsely minced meat and fat that has been previously marinated with wine (usually red), salt, and garlic for 1 to 2 days at a low temperature (4 to 8 °C). This procedure may improve the microbiological safety of the product. The aim of this study was to evaluate the behavior of three pathogens in this product, Salmonella spp., Listeria monocytogenes, and Staphylococcus aureus, to establish the minimum period of drying and maturation necessary to render safe products. The pathogens were inoculated into the chouriço de vinho batter. A factorial design was used to study the following variables in the fermentation process: (i) the presence or absence of an indigenous Lactobacillus sakei starter culture; (ii) the presence or absence of fermentable carbohydrates; and (iii) the salt level (1.5 or 3%). The samples were analyzed 24 h after the preparation of the batter (at stuffing); after 7, 15, and 30 days of drying; and after 30 days of storage at 4 °C under vacuum. Under all of the conditions studied, the levels of the three pathogens decreased during the drying period. In the early stages of drying, the addition of L. sakei starter culture and/or carbohydrates resulted in lower levels of gram-positive pathogens. After 15 days of drying, populations of all pathogens had decreased by ca. 2 log in all samples. At that sampling time, L. monocytogenes was undetectable in the chouriço de vinho with L. sakei starter culture and carbohydrates. The mean count of S. aureus after 15 days of drying was below 1 log CFU/g. After 30 days of drying, no pathogens were detected. The drying period could be shortened to 15 days if only the gram-positive pathogens studied are considered and a starter culture and carbohydrates are used. However, given the low infective dose of Salmonella spp., the product should only be considered safe after 30 days, when this pathogen became undetectable.
There is currently widespread exposure to the toxic metal cadmium through the diet as well as through smoking, and it has been suggested that cadmium exposure may increase the risk of cardiovascular disease. Here we examined whether cadmium exposure is associated with prevalence and growth of atherosclerotic plaques in the carotid arteries.
The analyses were performed in a screening-based cohort of 64-year-old Caucasian women with stratified, random selection to groups with normal glucose tolerance, impaired glucose tolerance and diabetes (n = 599). We measured cadmium concentrations in blood and urine at baseline. In addition, we performed ultrasound examination to determine the prevalence and area of atherosclerotic plaques in the carotid arteries and assessed smoking history and other cardiovascular risk factors at baseline and at a follow-up examination after a mean of 5.4 years.
At baseline, blood cadmium levels were associated with an increased risk of plaque and of a large plaque area after adjustment for confounders. In women who had never smoked, blood cadmium levels correlated positively with plaque area at baseline. The occurrence of large plaques and the change in plaque area at follow-up were associated with blood and creatinine-corrected urinary cadmium concentrations at baseline after adjustment for confounders. Blood and urine cadmium levels added information to established cardiovascular risk factors in predicting the progression of atherosclerosis.
We have shown that cadmium levels in blood and urine are independent factors associated with the development of atherosclerotic plaques at baseline as well as prospectively. This novel observation emphasizes the need to consider cadmium as a pro-atherogenic pollutant.
Cd translocation along the soil-crop-diet pathway is considered one of the most important routes of human Cd exposure. Among the main food crops, rice takes up and accumulates particularly high levels of Cd in its grains. In this study, a pot experiment was conducted to elucidate the interaction of soil and cultivar on Cd uptake and grain accumulation by hybrid rice, with or without Cd spiking at 2.5 mg x kg(-1), under continuously submerged conditions. Two hybrid rice cultivars (Shanyou 63, a common hybrid rice, and II Youming 86, a super rice) and two paddy soils (a Wushantu, Gleyic Stagnic Anthrosols, and a Hongshanitian, Ultic Stagnic Anthrosols) were used. The results show significant differences in Cd uptake and grain partitioning between soils and between cultivars, as well as significant soil-cultivar interactions. The cultivar effect on uptake of indigenous soil Cd appeared stronger than the soil effect, whereas for spiked Cd the soil effect became dominant. Moreover, intense Cd accumulation in grains was found under the positive interaction of a soil with high Cd availability and a cultivar with high Cd affinity (super rice on acidic paddy soil). This study thus demonstrates intense Cd uptake and grain accumulation by super rice, imposing a very high dietary Cd exposure risk (several times the acceptable daily intake, ADI) on subsistence-diet farmers. The low-Cd cultivar Shanyou 63 tends to retain absorbed Cd in the roots, whereas the super rice II Youming 86 partitions more Cd to the grain. Furthermore, the difference in total biomass between the two cultivars is small compared to the difference in total Cd uptake under Cd spiking. It is suggested that Cd uptake behaviour should be taken into account in super-rice breeding, and that practical measures should accompany the spread of super-rice cultivars in rice areas with acidic soils or Cd pollution, in order to control human dietary Cd exposure.
This study was designed to review all grossly detectable abnormalities and conditions (GDACs) encountered in poultry in Canadian abattoirs to determine which have the potential to cause adverse health effects in consumers. A review of the literature and consultation with scientists in the fields of microbiology, epidemiology, poultry pathology, chemistry, and meat inspection served to generate an inventory of GDACs, and a decision tree containing algorithms was developed to identify GDACs potentially representing a health hazard to consumers. Through the use of the decision tree, GDACs were classified into different categories with regard to the risk they represent to humans. A number of GDACs were identified as being of potential concern from a food safety perspective, namely erysipelas, fowl cholera, campylobacteriosis, clostridial diseases, hepatitis/enteritis associated with Helicobacter, listeriosis, Salmonella infections (non-typhoid infections, Salmonella arizonae, pullorum disease, and fowl typhoid), staphylococcosis, and toxoplasmosis. Further characterization (i.e., hazard characterization, exposure assessment, and risk characterization) is required to quantify or better characterize the probability that products derived from affected carcasses may affect the consumer, as well as the resulting consequences. Risk assessment is a dynamic process: the results presented in this paper are based on available information and expert opinion, and as new information is obtained, the inventory of GDACs and their classification may be modified.