Similar Documents
20 similar documents found (search time: 62 ms)
1.
The possible direct relationship between climate variations and abortion in Neospora caninum‐infected cows has not been studied. The objective of this study was to determine whether climate changes could be a risk factor for abortion in N. caninum‐infected cows. It was based on yearly serological screening for neosporosis and on confirmation of N. caninum infection in aborted fetuses in two high‐producing dairy herds with a mean 27% seroprevalence of N. caninum antibodies. The final study population comprised 357 pregnancies in seropositive animals. Logistic regression analysis indicated no significant effects of herd, N. caninum antibody titre, climate variables during the first and third trimesters of gestation, mean and maximum temperature–humidity index values during the second trimester of gestation, or previous abortion on the abortion rate. Based on the odds ratio, each 1‐unit increase in lactation number was associated with a 0.85‐fold decrease in the odds of abortion. The likelihood of abortion was 1.9 times (1/0.54) lower for pregnant cows inseminated with beef bull semen than with Holstein‐Friesian bull semen. The likelihood of abortion decreased significantly and progressively, by factors of 0.5, 0.41 and 0.3 for the 40–49, 30–39 and <30 mm rainfall classes, respectively, during the second trimester of gestation (using the ≥60 mm rainfall class as reference). As a general conclusion, it seems that increased rainfall in a dry environment can compromise the success of gestation in N. caninum‐infected cows. Attempts should therefore be made to reduce environmental effects during the second trimester of gestation, a period in which the immune response of cows is diminished.
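The odds-ratio arithmetic reported above can be sketched as follows. The numbers (OR 0.85 per lactation, OR 0.54 for beef vs. Holstein-Friesian semen) are taken from the abstract; the conversion between a logistic-regression coefficient and an odds ratio is the standard relationship OR = exp(beta), not anything specific to this paper.

```python
import math

# Back out the logistic-regression coefficient implied by the reported
# odds ratio of 0.85 per extra lactation (illustrative only).
beta_lactation = math.log(0.85)

# OR 0.54 for beef vs. Holstein-Friesian semen; the abstract states this
# inversely: the likelihood of abortion is 1/0.54, about 1.9 times lower.
or_beef = 0.54
times_lower = 1 / or_beef
```

The round-trip `exp(log(OR))` recovers the odds ratio, and `1/0.54` reproduces the "1.9 times lower" phrasing used in the abstract.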

2.
Reasons for performing study: The pattern of long‐term survival and specific factors associated with long‐term survival have not previously been evaluated in horses with a strangulating large colon volvulus (LCV). Objectives: To provide data on the long‐term survival of horses with LCV and to identify pre‐, intra‐ and postoperative variables associated with survival. Methods: Clinical data and long‐term follow‐up information were obtained from 116 horses with a strangulating LCV (≥360°) undergoing general anaesthesia. Two multivariable Cox proportional hazards models for postoperative survival time were developed: Model 1 included all horses and evaluated preoperative variables; Model 2 included horses that survived anaesthesia and evaluated pre‐, intra‐ and postoperative variables. Results: The study population comprised 116 horses. Eighty‐nine (76.7%) survived general anaesthesia. Of these, the percentages that survived until discharge, to 1 year and to 2 years were 70.7%, 48.3% and 33.7%, respectively. Median survival time for horses that survived general anaesthesia was 365 days. In Model 1, increased preoperative packed cell volume (PCV) was significantly associated with reduced postoperative survival (hazard ratio [HR] 1.08, 95% confidence interval [CI] 1.05–1.11); however, this effect changed over time. In Model 2, abnormal serosal colour intraoperatively (HR 3.61, 95% CI 1.55–8.44), increased heart rate at 48 h after surgery (HR 1.04, 95% CI 1.02–1.06) and colic during postoperative hospitalisation (HR 2.63, 95% CI 1.00–6.95) were all significantly associated with reduced postoperative survival. Conclusions: Survival time in horses with an LCV was associated with preoperative PCV, serosal colour, heart rate at 48 h postoperatively and colic during postoperative hospitalisation.
Potential relevance: This study provides evidence‐based information on the long‐term survival of horses with LCV and identifies parameters that may assist decision‐making by clinicians and owners.
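The hazard ratios above come from Cox proportional hazards models. A minimal sketch of the standard relationship between a Cox coefficient and its hazard ratio with a Wald 95% CI, here approximately reconstructing the reported PCV effect (HR 1.08, 95% CI 1.05–1.11); the coefficient and SE are backed out from the published numbers, not taken from the paper itself:

```python
import math

def hazard_ratio_ci(beta, se, z=1.96):
    """Hazard ratio with a Wald 95% CI from a Cox model coefficient."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Reconstruct the reported PCV effect: HR 1.08, 95% CI 1.05-1.11.
beta = math.log(1.08)
se = (math.log(1.11) - math.log(1.05)) / (2 * 1.96)  # SE of log(HR) from CI width
hr, lo, hi = hazard_ratio_ci(beta, se)
```

Because the CI is symmetric on the log scale, recovering the SE from the CI endpoints and re-exponentiating reproduces the published interval.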

3.
A prospective cohort study of lameness in Michigan equids was conducted using the Michigan Equine Monitoring System (MEMS) Phase-II database. MEMS Phase II was an equine health-monitoring study of 138 randomly-selected Michigan equine operations. Management and health-related data were collected for operations in two 12-month periods. The median incidence density of lameness was 2.8 cases per 10000 horse-days at-risk (Minimum = 0; 25th Quartile (Q) = 0; 75th Q = 10.2; Maximum = 48.5). Equine operation-management and environmental risk factors associated with the incidence density of lameness were assessed using multivariable Poisson regression. Management risk factors associated with the incidence density of lameness included the total operation horse-days monitored (3rd Q: Relative Risk (RR) = 0.46; 95% Confidence Interval (CI): 0.29–0.71 and 4th Q: RR = 0.24; 95% CI: 0.16–0.37), the veterinary-related services score (3rd Q: RR = 0.61; 95% CI: 0.39–0.96 and 4th Q: RR = 1.45; 95% CI: 1.01–2.08), the farrier-related services score (4th Q: RR = 1.60; 95% CI: 1.07–2.42) and operations having equids participating in exercise-related activities (RR = 1.71; 95% CI: 1.16–2.50). Environmental risk factors associated with the incidence density of lameness included operations with stalls having medium flooring (RR = 0.48; 95% CI: 0.35–0.65), operations with stalls having loose flooring (RR = 2.78; 95% CI: 1.88–4.10) and operations using straw-like materials for stall bedding (RR = 2.02; 95% CI: 1.53–2.68).
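Incidence density as used here is simply cases divided by animal-time at risk, rescaled to a convenient denominator. A sketch of the calculation (the example counts are hypothetical, chosen only to land on the study's median of 2.8 cases per 10,000 horse-days):

```python
def incidence_density(cases, animal_time, per=10_000):
    """Incidence density: cases per `per` units of animal-time at risk."""
    return cases * per / animal_time

# Hypothetical operation: 7 lameness cases over 25,000 horse-days at risk.
rate = incidence_density(7, 25_000)  # cases per 10,000 horse-days
```

Using animal-time rather than animal counts in the denominator is what lets operations monitored for different durations be compared on one scale.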

4.
The purpose of this study was to investigate the relationship between conception rate (CR) and climate variables. Data consisted of 24,380 inseminations from a Holstein dairy herd in Hidalgo, Mexico. Weather records, including daily temperature (T), relative humidity (RH), rainfall, wind speed, and solar radiation, were obtained from a nearby weather station. Means for each climatic variable from 2 days before artificial insemination (AI) to the AI day were calculated for each conception date represented in the study. A significant negative correlation was observed between the CR and mean and minimum T, mean and minimum RH, mean and minimum temperature–humidity index (THI), and rainfall. The overall mean CR was 34.3%. The CR in lactating dairy cows followed a seasonal pattern: lower CRs were observed in summer months than during winter (32.1% vs. 36.9%; P < 0.01). The variables that had the greatest influence on CR were minimum and maximum T, minimum RH, minimum THI, wind speed, and rainfall.
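The temperature–humidity index (THI) combines air temperature and relative humidity into a single heat-stress measure. The abstract does not state which THI variant was used; the sketch below uses one widely cited NRC-style formulation, so treat it as illustrative rather than as this study's exact method:

```python
def thi(temp_c, rh_pct):
    """Temperature-humidity index, one common NRC-style formulation.
    temp_c: air temperature in deg C; rh_pct: relative humidity in percent.
    (The paper does not specify which THI variant it used.)"""
    return (1.8 * temp_c + 32) - (0.55 - 0.0055 * rh_pct) * (1.8 * temp_c - 26)

hot_humid = thi(35, 80)   # hot, humid day -> high THI (heat stress)
mild_dry = thi(25, 40)    # mild, dry day -> lower THI
```

In dairy heat-stress work, THI values above roughly the low 70s are commonly treated as stressful, which is consistent with the summer depression in conception rate reported here.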

5.
A retrospective study of laminitis was carried out to identify risk factors associated with this disease on an East Anglian farm with approximately 1000 animals living in an area of 1000 acres. Medical records between January 1997 and May 2000 and between April 2005 and March 2008 were reviewed, and the age, sex, weight (kg), height (inches [in] and hands [H]) and weight-to-height ratio (kg/in) were recorded. The prevalence, incidence and seasonality of laminitis were determined and their relationship to monthly temperature, rainfall and hours of sunshine was evaluated. Averaged over the six years, the highest prevalence (2.6 per cent) and incidence (16 cases/1000 animals) of laminitis occurred in May. Multivariate analysis revealed that females (P=0.007, odds ratio [OR] 1.46, 95 per cent confidence interval [CI] 1.1053 to 1.9646) and light animals (P≤0.001, OR 0.995, 95 per cent CI 0.9932 to 0.9963) had the greatest risk of developing laminitis. A positive association was found between hours of sunshine and both the incidence (P=0.007, relative risk [RR] 1.009, 95 per cent CI 1.001 to 1.012) and the prevalence (P=0.002, RR 1.008, 95 per cent CI 1.003 to 1.012) of laminitis. The data suggest a relationship between season, sex of the animal and the development of laminitis.

6.
This review assesses the efficacy of whole cell Tritrichomonas foetus vaccine to prevent and treat trichomoniasis in beef cattle. Three databases were searched in June 2012. Eligible studies compared infection risk, open risk, and abortion risk in heifers, or infection risk in bulls, that received vaccine compared with no vaccine. Study results were extracted, summary effect measures were calculated, and the quality of the evidence was assessed. From 334 citations identified, 10 were relevant to the review. For heifers, there was limited evidence of moderate quality to assess the impact of vaccination on infection risk (RR, 0.89; P = .16; 95% CI, 0.76–1.05; 6 randomized and 4 nonrandomized studies; 251 animals) and open risk (RR, 0.80; P = .06; 95% CI, 0.63–1.01; 6 randomized and 5 nonrandomized studies; 570 animals). The quality of the body of work describing the impact of vaccination on abortion risk was low (summary RR, 0.57; P = .0003; 95% CI, 0.42–0.78; 3 randomized and 2 nonrandomized studies; 176 animals). The quality of evidence was very low for duration of infection (mean difference, −23.42; P = .003; 95% CI, −38.36 to −7.85; 2 randomized and 3 nonrandomized studies; 163 animals). Although the summary effect measures suggest a benefit to vaccination, due to publication bias the effect reported here is likely an overestimate of efficacy. For bull‐associated outcomes, the evidence base was low or very low quality.
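The summary relative risks above pool several studies into one estimate. A minimal sketch of inverse-variance pooling on the log scale, with each study's SE backed out from its confidence interval; note this shows the simpler fixed-effect form, whereas a review like this one may well have used a random-effects model, and the inputs below are made-up study results, not the review's data:

```python
import math

def pooled_rr(rrs, cis, z=1.96):
    """Fixed-effect inverse-variance pooling of relative risks on the
    log scale. Each (lo, hi) 95% CI is used to back out the SE of log(RR)."""
    num = den = 0.0
    for rr, (lo, hi) in zip(rrs, cis):
        se = (math.log(hi) - math.log(lo)) / (2 * z)  # SE of log(RR)
        w = 1.0 / se ** 2                             # inverse-variance weight
        num += w * math.log(rr)
        den += w
    return math.exp(num / den)

# Two hypothetical studies with identical results pool to the same RR.
summary = pooled_rr([0.8, 0.8], [(0.64, 1.0), (0.64, 1.0)])
```

Narrower confidence intervals imply smaller SEs and therefore larger weights, which is why precise studies dominate the summary estimate.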

7.
The objective of this meta‐analysis was to summarize available information on the prevalence of thermotolerant Campylobacter (TC) in different food‐producing animals worldwide. Databases (i.e., PubMed, ScienceDirect, Scopus) were searched from 1980 to 2017, unrestricted by language. The inclusion criteria were as follows: prevalence or incidence studies, published in peer‐reviewed journals, that reported the total number of animal samples studied and the number of samples positive for the presence of TC. When the identification of Campylobacter species was available, this information was included in the analysis. Multilevel random‐effect meta‐analysis models were fitted to estimate the mean occurrence rate of TC and to compare it among different factors potentially associated with the outcome. The mean occurrence rate of TC in food‐producing animals was 0.424 (95% CI: 0.394–0.455), and the mean occurrence rates of Campylobacter jejuni and Campylobacter coli were 0.214 and 0.133, respectively. Pigs and poultry showed the highest prevalence of TC; however, there were differences in the prevalence of each Campylobacter species. Campylobacter jejuni was observed in broilers (0.322; 95% CI: 0.273–0.377) and hens (0.395; 95% CI: 0.265–0.542), while C. coli was essentially restricted to pigs (0.553; 95% CI: 0.541–0.650). The prevalence of C. jejuni in intensively bred cattle was higher (0.302; 95% CI: 0.227–0.389) than in extensively bred cattle (0.172; 95% CI: 0.119–0.242), while the prevalence of C. coli was similar (0.051; 95% CI: 0.028–0.091 vs. 0.050; 95% CI: 0.027–0.091) in both production systems. Agar with or without blood used for the isolation of TC did not affect the prevalence observed. The method of species identification did not seem to generate differences in the prevalence of Campylobacter species. The prevalence of Campylobacter in primary food production has a strong impact on the entire agri‐food chain. National authorities must monitor the situation with the aim of establishing appropriate risk management measures.

8.
A study of mortality, morbidity and productivity of cattle on smallholder dairy farms was conducted in Chikwaka communal land, Zimbabwe. We estimated the frequency and determinants of mortality in DDP cattle and explored demographic trends. Using Cox proportional-hazards modelling (with the farm as a random effect), the animal-level variables associated with mortality were age, sex and breed. Calf mortality was 35% within the first year of life. This was nearly five times higher than adult mortality (relative risk (RR) 4.73, 95% CI 2.12, 10.6). Females had lower mortality than males (RR=0.25, 95% CI 0.11, 0.56). After adjusting for the confounding effects of age, Jersey breeding was associated with higher mortality (RR=2.89, 95% CI 1.16, 7.22) whereas Red Dane breeding was associated with lower mortality (RR=0.27, 95% CI 0.11, 0.69). Farms with a higher ratio of non-DDP:DDP cattle had higher mortality in their DDP cattle. Leslie-matrix models simulated population growth and showed that (at the current levels of mortality and fertility) the population would double in approximately 10 years.
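A Leslie-matrix projection of the kind mentioned in the last sentence steps a population forward by applying age-specific fertility and survival rates each year. The sketch below is a minimal two-stage (calf, adult) version; the vital rates are illustrative stand-ins, not the estimates from this Zimbabwe study:

```python
def project(calves, adults, fertility, calf_surv, adult_surv, years):
    """Project total herd size forward `years` annual steps with a
    two-stage Leslie-style model (calves mature into adults each year)."""
    for _ in range(years):
        calves, adults = (fertility * adults,
                          calf_surv * calves + adult_surv * adults)
    return calves + adults

# Illustrative rates: 45% calving rate, 65% calf survival (35% mortality),
# 92.6% adult survival -- chosen for the example, not taken from the paper.
start = project(20, 80, 0.45, 0.65, 0.926, 0)      # herd of 100 at year 0
after_10 = project(20, 80, 0.45, 0.65, 0.926, 10)  # herd after a decade
```

The dominant eigenvalue of the corresponding 2×2 matrix gives the asymptotic annual growth rate; simulating forward, as here, is the simplest way to see whether and how fast the herd doubles.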

9.
Objective: To evaluate the occurrence of, and variables associated with, incisional complications after right ventral paramedian celiotomy in horses. Study Design: Case series. Animals: Horses (n=159). Methods: Occurrence of incisional complications after right ventral paramedian celiotomy was determined in 159 horses (161 celiotomies) that survived at least 30 days after surgery at a private equine hospital (2003–2007). Follow‐up information for 121 horses was obtained ≥90 days after surgery. Univariate analysis and multivariate logistic regression were performed to evaluate variables associated with incisional complications after celiotomy. Results: Of 161 celiotomies, ≥1 incisional complications occurred in 27 (16.8%) during hospitalization and/or after discharge, including: drainage (15.5%), skin dehiscence (3.7%), noticeable cutaneous scarring (1.9%), and herniation (0.6%). Variables significantly associated with incisional complications after multivariate analysis included: Quarter horse‐type breed (odds ratio [OR]: 3.9, 95% confidence interval [95% CI]: 1.3–11.7); use of an abdominal bandage (OR: 9.5, 95% CI: 2.9–30.8); and >4 postoperative febrile (>38.3°C) days (OR: 12.9, 95% CI: 2.8–58.2). Conclusions: Overall occurrence of incisional complications after right paramedian ventral celiotomy compared favorably to that reported for ventral median celiotomies. Several variables were associated with, but not necessarily predictive of, the occurrence of incisional complications.

10.
Objective To perform electroretinography on normal anesthetized western gray kangaroos (Macropus fuliginosus). Animals studied Six captive western gray kangaroos. Procedures The kangaroos were anesthetized using a combination of ketamine and medetomidine via a remote drug delivery system, then were maintained on isoflurane after endotracheal intubation and reversal of the medetomidine with atipamezole. After a minimum of 20 min of dark adaptation, electroretinograms were obtained using a handheld electroretinography (ERG) machine with a single flash protocol at three light intensities: 10, 3000 and 10 000 mcd.s/m2. Results At 10 mcd.s/m2 the mean b‐wave amplitude and implicit time were 102.0 μV (SD ± 41.3 and 95% CI 68.9–135.1) and 78.4 ms (SD ± 8.3 and 95% CI 71.8–85.0). At 3000 mcd.s/m2 the mean a‐wave amplitude and implicit time were 69.9 μV (SD ± 20.5 and 95% CI 53.5–86.3) and 17.6 ms (SD ± 1.5 and 95% CI 16.4–18.8), and the mean b‐wave amplitude and implicit time were 175.4 μV (SD ± 35.9 and 95% CI 146.7–204.1) and 74.1 ms (SD ± 3.5 and 95% CI 71.2–76.9). At 10 000 mcd.s/m2 the mean a‐wave amplitude and implicit time were 89.1 μV (SD ± 27.1 and 95% CI 67.5–110.8) and 16.8 ms (SD ± 1.0 and 95% CI 16.0–17.0), and the mean b‐wave amplitude and implicit time were 203.7 μV (SD ± 41.4 and 95% CI 170.6–236.8) and 75.4 ms (SD ± 3.3 and 95% CI 72.8–78.1). Conclusion Electroretinography outside of the typical clinical setting is feasible using a portable ERG system and allows for quick analysis of retinal function in exotic species.

11.
Outdoor-reared pigs were used as indicators for investigating the effect of weather conditions on the seroprevalence of Leptospira. Over the period February to March 2008, sera from 386 sows on 11 farms in southern Sweden were tested for antibodies to the following Leptospira serovars: L. interrogans serovar (sv) Bratislava, L. kirschneri sv Grippotyphosa, L. interrogans sv Icterohaemorrhagiae, L. interrogans sv Pomona, L. borgpetersenii sv Tarassovi and one domestic strain (mouse 2A) related to L. borgpetersenii sv Sejroe and L. borgpetersenii sv Istrica. The highest seroprevalence was to this strain (8.0%), followed by sv Bratislava (3.9%). Six of the 11 farms had sows that were seropositive to at least one of the Leptospira serovars. Data on rainfall and temperature were retrieved for the respective farms. For each millimetre of extra rainfall, the odds ratio (OR) for seropositivity increased by 4.3 (95% CI 1.9-10) for sv Bratislava and by 2.5 (95% CI 1.0-6.4) for strain mouse 2A. There was no association between seropositivity and temperature. This study indicates that differences in climate conditions within the northern temperate climate zone may be of importance for the presence of Leptospira seropositivity in mammals.

12.
Background: Scores allowing objective stratification of illness severity are available for dogs and horses, but not cats. Validated illness severity scores facilitate the risk‐adjusted analysis of results in clinical research, and also have applications in triage and therapeutic protocols. Objective: To develop and validate an accurate, user‐friendly score to stratify illness severity in hospitalized cats. Animals: Six hundred cats admitted consecutively to a teaching hospital intensive care unit. Methods: This observational cohort study enrolled all cats admitted over a 32‐month period. Data on interventional, physiological, and biochemical variables were collected over 24 hours after admission. Patient mortality outcome at hospital discharge was recorded. After random division, 450 cats were used for logistic regression model construction, and data from 150 cats for validation. Results: Patient mortality was 25.8%. Five‐ and 8‐variable scores were developed. The 8‐variable score contained mentation score, temperature, mean arterial pressure (MAP), lactate, PCV, urea, chloride, and body cavity fluid score. Area under the receiver operator characteristic curve (AUROC) on the construction cohort was 0.91 (95% CI, 0.87–0.94), and 0.88 (95% CI, 0.84–0.96) on the validation cohort. The 5‐variable score contained mentation score, temperature, MAP, lactate, and PCV. AUROC on the construction cohort was 0.83 (95% CI, 0.79–0.86), and 0.76 (95% CI, 0.72–0.84) on the validation cohort. Conclusions and Clinical Importance: Two scores are presented enabling allocation of an accurate and user‐friendly illness severity measure to hospitalized cats. Scores are calculated from data obtained over the first 24 hours after admission, and are diagnosis‐independent. The 8‐variable score predicts outcome significantly better than does the 5‐variable score.
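The AUROC values used to compare the two scores have a useful rank interpretation: the probability that a randomly chosen non-survivor receives a higher score than a randomly chosen survivor. A minimal sketch of that Mann-Whitney computation (the score values below are invented for illustration, not from the study):

```python
def auroc(scores_pos, scores_neg):
    """AUROC as the probability that a positive case outranks a negative
    one, with ties counting half (the Mann-Whitney U interpretation)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

perfect = auroc([0.9, 0.8], [0.3, 0.4])  # complete separation -> 1.0
chance = auroc([1, 2], [1, 2])           # no separation -> 0.5
```

On this scale, the 8-variable score's validation AUROC of 0.88 versus 0.76 for the 5-variable score means it ranks a non-survivor above a survivor noticeably more often.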

13.
Objective To evaluate the evidence of analgesic efficacy of tramadol for the management of postoperative pain and the presence of associated adverse events in dogs. Databases used A comprehensive search using PubMed/MEDLINE, LILACS, Google Scholar and CAB databases, with no restrictions on language and following a prespecified protocol, was performed from June 2019 to July 2020. Included were randomized controlled trials (RCTs) performed in dogs that had undergone general anesthesia for any type of surgery. Two authors independently classified the studies, extracted data and assessed their risk of bias using Cochrane’s tool. RevMan and GRADE methods were used to rate the certainty of evidence (CoE). Conclusions Overall, 26 RCTs involving 848 dogs were included. Tramadol administration probably results in a lower need for rescue analgesia versus no treatment or placebo [moderate CoE; relative risk (RR): 0.47; 95% confidence interval (CI): 0.26–0.85; I2 = 0%], and may result in a lower need for rescue analgesia versus buprenorphine (low CoE; RR: 0.50; 95% CI: 0.20–1.24), codeine (low CoE; RR: 0.75; 95% CI: 0.16–3.41) and nalbuphine (low CoE; RR: 0.05; 95% CI: 0.00–0.72). However, tramadol administration may result in an increased requirement for rescue analgesia versus methadone (low CoE; RR: 3.45; 95% CI: 0.66–18.08; I2 = 43%) and COX inhibitors (low CoE; RR: 2.27; 95% CI: 0.68–7.60; I2 = 45%). Compared with multimodal therapy, tramadol administration may make minimal to no difference in the requirement for rescue analgesia (low CoE; RR: 1.12; 95% CI: 0.48–2.60; I2 = 0%). Adverse events were inconsistently reported and the CoE was very low. The overall CoE of the analgesic efficacy of tramadol for postoperative pain management in dogs was low or very low, and the main reasons for downgrading the evidence were risk of bias and imprecision.

14.

Background

Downer cow syndrome (DCS) is a challenging health issue in the dairy industry. No cow‐side test is available to provide an accurate prognosis for DCS cases in farm settings.

Hypothesis/Objectives

Local or systemic hypoperfusion and myocardial lesions lead to an increase in blood concentrations of the biomarkers cardiac troponin I (cTnI) and L‐lactate. The objective was to determine the prognostic value of these biomarkers, assessed cow‐side alongside clinical examination, for predicting a negative outcome (NO: death or euthanasia within 7 days).

Animals

218 client‐owned dairy cows affected by DCS.

Methods

In a prospective study, animals were monitored for 60 days after inclusion of each cow. Blood cTnI and L‐lactate concentrations were measured on the day of inclusion. The prognostic accuracy of both biomarkers and physical examination variables was estimated to predict NO. A mixed multivariable logistic regression model was used for data analysis.

Results

Prevalence of NO in this study was 63% on day 7. Troponin concentrations greater than 0.7 ng/mL had a sensitivity and specificity of 54.1% (95% CI: 45.3–62.7%) and 78.4% (95% CI: 67.3–87.1%), respectively, for predicting NO. Blood L‐lactate was not associated with the outcome. The multivariable model revealed that heart rate >100 bpm (OR 3.7; 95% CI 1.3–10.2) and cTnI >0.7 ng/mL (OR 5.5; 95% CI 2.1–14.6) were associated with the risk of NO.

Conclusions and Clinical Importance

Hypertroponinemia and tachycardia were associated with reduced survival in DCS cases. The use of cow‐side blood cTnI concentrations and heart rate could help to rapidly identify cows in farm settings that have poor chances of recovery and would benefit from more aggressive treatment or euthanasia.
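The sensitivity and specificity reported for the cTnI cut-off come straight from a 2×2 table of test result versus outcome. A sketch of the calculation; the counts below are hypothetical, chosen only to land near the reported 54.1% and 78.4%, since the abstract does not give the study's actual table:

```python
def se_sp(tp, fn, tn, fp):
    """Sensitivity and specificity from the four cells of a 2x2 table:
    tp/fn among cows with the negative outcome, tn/fp among those without."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: 100 cows with NO (54 test-positive), 100 without
# NO (78 test-negative) -- illustrative only.
sens, spec = se_sp(54, 46, 78, 22)
```

A cut-off like cTnI > 0.7 ng/mL trades sensitivity against specificity: raising it would catch fewer true NO cases but misclassify fewer recovering cows.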

15.
Before weaning, dairy calves are susceptible to many pathogens which can affect their subsequent performance. The use of lactic acid bacteria (LAB) has been identified as a tool to maintain the intestinal microbial balance and to prevent the establishment of opportunistic pathogenic bacterial populations. However, a consensus has not been reached as to whether probiotics may be effective in reducing the prevalence of gastrointestinal diseases in young calves. The aim of this meta-analysis was to assess the effect of probiotics on diarrhea incidence and the intestinal microbial balance. LAB supplementation was shown to exert a protective effect and to reduce the incidence of diarrhea (relative risk, RR=0.437, 95% confidence interval (CI) 0.251-0.761). In the subanalysis, this protective effect of the probiotics against diarrhea was observed only in trials that used whole milk (RR=0.154, 95% CI 0.079-0.301) and trials that used multistrain inocula (RR=0.415, 95% CI 0.227-0.759). Probiotics did not improve the fecal characteristics (standardized mean difference, SMD = −0.4904, 95% CI −1.011 to 0.035) and were unable to change the LAB:coliforms ratio (SMD = 0.016, 95% CI −0.701 to 0.733). Probiotics showed a beneficial impact on the LAB:coliforms ratio in the subanalyses that included trials using whole milk (SMD = 0.780, 95% CI 0.141 to 1.418) and monostrain inocula (SMD = 0.990, 95% CI 0.340 to 1.641). The probability of significant effects (a positive probiotic effect) in a new study was >0.70 for diarrhea and fecal consistency. Whole milk feeding improved the probiotic effect on the incidence of diarrhea and the LAB:coliforms ratio. The probability of finding significant effects on diarrhea frequency and the LAB:coliforms ratio was higher (P>0.85) if new studies were conducted using whole milk to feed calves. This paper defines guidelines to standardize the experimental designs of future trials. LAB can be used as growth promoters in calves instead of antibiotics, to counteract the negative effects of their widespread use.

16.
Objective To investigate predictors of survival and athletic function in adult horses with infection of a synovial structure. Hypotheses Increasing duration from contamination to referral, bone or tendon involvement and positive microbial culture decrease short‐term survival. Synovitis and/or sepsis at 5 days post‐admission and involvement of Staphylococcus spp. decrease long‐term athletic function. Design Retrospective study. Methods Records over 4 years of adult horses with synovial sepsis were reviewed. A two‐tailed Fisher's exact test, Mann‐Whitney U test or t‐test was used to examine whether variables were predictive of short‐term survival and long‐term athletic function. Results During the study period 75 horses underwent treatment for infection of 93 synovial structures. Short‐term survival was 84% (63/75) and 54% (30/56) of horses returned to athletic function. Of the variables measured at admission, duration from contamination to referral did not affect survival, whereas evidence of bone or tendon involvement decreased survival and athletic function. Of the variables available during treatment, abnormal synovial fluid at 4–6 days post‐admission and positive microbial culture reduced athletic function. Staphylococcal infection was associated with persistent sepsis. Conclusions Of the variables available at presentation, only evidence of bone or tendon involvement negatively affected survival and athletic function. During treatment of synovial sepsis, analysis of synovial fluid at 4–6 days and bacterial culture results have prognostic value.

17.

Background

Colic has been associated with shedding of Salmonella. Horses with salmonellosis typically develop diarrhea, fever, and leukopenia. Overlooking additional predictors may result in failure to detect shedding horses and increase environmental contamination.

Objectives

Evaluate associations between signalment and clinicopathologic data during early hospitalization and Salmonella shedding in horses treated for acute colic.

Animals

Horses with acute colic admitted to a referral hospital. A total of 59 horses shedding Salmonella were compared to 108 Salmonella‐negative horses.

Methods

Retrospective case‐control study evaluating patient and Salmonella culture data. Associations between variables and Salmonella shedding were identified using logistic regression. Two multivariable models were developed pertaining to (1) information available within 24 hours of admission and (2) clinical findings that developed later during hospitalization.

Results

Variables retained for multivariable model 1 indicated that Warmbloods and Arabians had increased odds for shedding Salmonella, as did horses requiring surgery (OR, 2.52; 95% CI, 1.10–5.75) or having more severe gastrointestinal disease (OR, 2.59; 95% CI, 1.08–6.20). Retained variables for model 2 demonstrated that horses that were treated surgically (OR, 1.60; 95% CI, 0.70–3.62), developed fever >103°F (OR, 2.70; 95% CI, 0.92–7.87), had abnormal leukocyte count (OR, 1.38; 95% CI, 0.61–3.09), or became inappetent and lethargic (OR, 16.69; 95% CI, 4.08–68.24) had increased odds for shedding Salmonella.

Conclusions and Clinical Importance

In horses with acute colic that present without signs of diarrhea, fever, or leukopenia, additional predictors associated with shedding Salmonella could be used to more promptly identify horses likely to shed organisms.

18.
An outbreak of chronic wasting disease (CWD) in farmed elk in Saskatchewan from 1996 to 2002 was reviewed to: (1) determine the progression of CWD from infection to death in farmed elk; (2) assess animal risk factors for CWD infection in farmed elk; (3) assess farm management and exposure risk factors for within-herd CWD transmission; and (4) assess the suitability of the Canadian Food Inspection Agency's (CFIA) current disease control policy for CWD in light of the findings. The results from animal movement tracing, animal testing, and a farm management questionnaire were used. The duration of CWD (time from exposure to death of a CWD test-positive animal) was between a mean minimum of 19 months and a mean maximum of 40 months. Age and sex were not associated with CWD infection, except that adult elk (≥2 y) were more likely to be infected than young elk (<18 mo) (RR = 2.3, 95% CI 1.6-3.5). Elk calves born within 18 months before the death or diagnosis of their dam were at higher risk if their dams died of CWD (RR = 4.1, 95% CI 1.5-11.4) or exhibited clinical signs of CWD (RR = 8.3, 95% CI 2.7-25.7). Significant risk factors for transmission of CWD on elk farms were the introduction from an infected farm of trace-in elk that died of CWD (RR = 13.5, 95% CI 2.0-91) or developed clinical signs of CWD (RR = 7.1, 95% CI 0.93-54) and the elapsed time in years since the incursion of CWD (OR = 5.6, 95% CI 1.8-17.4). The assumptions on which CFIA's disease control policies were based were validated, but based on this new information, quarantine in cases where exposure to preclinical elk has occurred could be considered as an alternative to whole-herd eradication.
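The relative risks quoted above are ratios of risks between exposed and unexposed groups, with confidence intervals computed on the log scale. A sketch of the standard 2×2-table calculation (the counts in the example are hypothetical, not the outbreak data):

```python
import math

def relative_risk(a, b, c, d, z=1.96):
    """RR with a 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    rr = (a / (a + b)) / (c / (c + d))
    # SE of log(RR), standard large-sample formula
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical: 10/100 exposed vs. 5/100 unexposed animals infected.
rr, lo, hi = relative_risk(10, 90, 5, 95)
```

Because the CI is built on the log scale, an interval whose lower bound sits below 1 (as for the trace-in clinical-signs factor, 0.93–54) indicates the association did not reach significance despite a large point estimate.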

19.
This observational study aimed to determine MRSA prevalence using strain‐specific real‐time PCR at the pig level, stratified by age groupings, within a pig enterprise. A total of 658 samples were collected from individual pigs (n = 618) and the piggery environment (n = 40), distributed amongst five different pig age groups. Presumptive MRSA isolates were confirmed by the presence of mecA, and MALDI‐TOF was performed for species verification. All isolates were tested against 18 different antimicrobials. MRSA was isolated from 75.2% (95% CI 71.8–78.6) of samples collected from pigs, and 71% of the MRSA isolates from this source were identified as community‐associated (CA)‐MRSA ST93, while the remainder were livestock‐associated (LA)‐MRSA ST398. Amongst environmental isolates, 80% (CI 64.3–95.7) were ST93 and the remainder ST398. All MRSA isolates from pigs and the environment were susceptible to ciprofloxacin, gentamicin, linezolid, mupirocin, rifampicin, sulfamethoxazole–trimethoprim, teicoplanin and vancomycin. Phenotypic rates of resistance were penicillin (100%), clindamycin (97.6%), erythromycin (96.3%), ceftiofur (93.7%), chloramphenicol (81.2%), tetracycline (63.1%) and amoxicillin–clavulanate (63.9%). A low prevalence of resistance (9.2%) was observed against neomycin and quinupristin–dalfopristin. The probability of MRSA carriage in dry sows (42.2%) was significantly lower (p < .001) than in the other age groups: farrowing sows (76.8%, RR 1.82), weaners (97.8%, RR 2.32), growers (94.2%, RR 2.23) and finishers (98.3%, RR 2.33). Amongst different production age groups, a significant difference was also found in antimicrobial resistance for amoxicillin–clavulanate, neomycin, chloramphenicol and tetracycline. Using the RT‐PCR assay adopted in this study, filtering of highly prevalent ST93 and non‐ST93 isolates was performed at high throughput and low cost. In conclusion, this study found that weaner pigs presented a higher risk for CA‐MRSA and antimicrobial resistance compared to other age groups. These findings have major implications for how investigations of MRSA outbreaks should be approached under the One‐Health context.

20.
The Philippines has a long history of rabies control efforts in their dog populations; however, long‐term success of such programmes and the goal of rabies elimination have not yet been realized. The Bohol Rabies Prevention and Elimination Program was developed as an innovative approach to canine rabies control in 2007. The objective of this study was to assess canine rabies vaccination coverage in the owned‐dog population in Bohol and to describe factors associated with rabies vaccination 2 years after implementation of the programme. We utilized a cross‐sectional cluster survey based on the World Health Organization’s Expanded Programme on Immunization coverage survey technique. We sampled 460 households and collected data on 539 dogs residing within these households. Seventy‐seven per cent of surveyed households reported owning at least one dog. The human‐to‐dog ratio was approximately 4:1, and the mean number of dogs owned per household was 1.6. Based on this ratio, we calculated an owned‐dog population of almost 300,000. Overall, 71% of dogs were reported as having been vaccinated for rabies at some time in their lives; however, only 64% of dogs were reported as having been recently vaccinated. Dogs in our study were young (median age = 24 months). The odds of vaccination increased with increasing age. Dogs aged 12–23 months had 4.6 times the odds of vaccination compared to dogs aged 3–11 months (95% CI 1.8–12.0; P = 0.002). Confinement of the dog both day and night was also associated with increased odds of vaccination (OR = 2.1; 95% CI 0.9–4.9; P = 0.07), and this result approached statistical significance. While the programme is on track to meet its goal of 80% vaccination coverage, educational efforts should focus on the need to confine dogs and vaccinate young dogs.
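A coverage estimate like the 64% "recently vaccinated" figure is a binomial proportion, and a simple interval for it can be sketched with the Wilson score method. Note this ignores the design effect of the cluster sampling actually used in the survey, which would widen the interval, so treat it as an illustrative lower bound on the uncertainty:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% interval for a binomial proportion. Ignores the
    cluster-sampling design effect, which would widen the interval."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# ~64% of the 539 surveyed dogs recently vaccinated (345 is an
# illustrative count consistent with the reported percentage).
lo, hi = wilson_ci(345, 539)
```

A proper analysis would inflate the variance by the design effect (roughly, 1 + (cluster size − 1) × intraclass correlation) before forming the interval.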
