Similar Articles
20 similar articles found
1.

Background

Previous studies have identified large breed, male, outdoor dogs of herding or working groups to be at increased risk for Leptospira infection. Exposure risk factors may change over time, altering the signalment of dogs most commonly diagnosed with leptospirosis.

Objectives

The objectives of this study were to evaluate possible signalment changes by decade in canine leptospirosis cases diagnosed at university veterinary hospitals in the United States and Canada using reports to the Veterinary Medical DataBase (VMDB) over a 40‐year period (1970–2009).

Animals

One thousand and ninety‐one dogs with leptospirosis diagnosed among 1,659,146 hospital visits.

Methods

Hospital prevalence of leptospirosis by decade was determined by age, sex, weight, and breed groups. Multivariable logistic regression models were created to evaluate the association between variables and the odds of disease for each decade.

Results

Veterinary Medical DataBase hospital prevalence of leptospirosis in dogs, after a marked decrease in the 1970s and low rates in the 1980s, began increasing in the 1990s. Hospital prevalence significantly increased in dogs between 2 and 9.9 years of age (P < .05) and in male dogs (P < .05) in each decade since the 1980s. Among weight groups in the most recent decade (2000–2009), dogs weighing <15 pounds had the greatest odds of being diagnosed with leptospirosis (P = .003).

Conclusions and Clinical Importance

Hospital prevalence rates by age, weight, sex, and breed groups differed by decade. These changes may reflect changes in exposure risk, Leptospira vaccination practices for dogs, or both.
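Decade-by-decade comparisons of this kind rest on odds ratios, which the multivariable models then adjust for the other covariates. As a hedged illustration of the underlying arithmetic only (the counts below are invented, not taken from the VMDB data), a univariable odds ratio with a Wald confidence interval can be computed from a 2×2 table:

```python
import math

def odds_ratio(cases_a, noncases_a, cases_b, noncases_b):
    """Odds ratio of diagnosis in group A relative to group B, with a Wald 95% CI."""
    or_ = (cases_a / noncases_a) / (cases_b / noncases_b)
    # Standard error of log(OR) from the four cell counts
    se_log = math.sqrt(1/cases_a + 1/noncases_a + 1/cases_b + 1/noncases_b)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

# Hypothetical counts: 30 leptospirosis cases among 1,000 male visits
# versus 15 cases among 1,000 female visits.
or_, lo, hi = odds_ratio(30, 970, 15, 985)
```

A multivariable logistic regression reports the same quantity after conditioning on the remaining variables; this sketch shows only the single-variable case.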

2.
An important aspect of the bovine spongiform encephalopathy (BSE) epidemic has been an apparent age-dependent risk of infection, with younger cattle being more likely to become infected than older cattle. Our objective was to determine the age-dependent risk of infection of dairy cattle. We first reviewed unpublished data on the feeding patterns of proprietary concentrates for dairy-replacement cattle. These data showed that autumn- and spring-born cattle would receive different feeding patterns of proprietary concentrates, and so age-dependent risk of infection profiles were obtained separately for autumn- and spring-born cattle. We used back-calculation methods to analyse BSE-epidemic data collected in Great Britain between 1984 and 1996.

Dairy cattle were most at risk in the first 6 months of life; adult cattle were at relatively low risk of infection. Between 6 and 24 months of age, risk profiles reflected feeding patterns of proprietary concentrates in each of the autumn- and spring-born cohorts.


3.
Although substantial fecal shedding is expected to start years after initial infection with Mycobacterium avium subspecies paratuberculosis (MAP), the potential for shedding by calves and therefore calf-to-calf transmission is underestimated in current Johne’s disease (JD) control programs. Shedding patterns were determined in this study in experimentally infected calves. Fifty calves were challenged at 2 weeks or at 3, 6, 9 or 12 months of age (6 calves served as a control group). In each age group, 5 calves were inoculated with a low and 5 with a high dose of MAP. Fecal culture was performed monthly until necropsy at 17 months of age. Overall, 61% of inoculated calves, representing all age and dose groups, shed MAP in their feces at least once during the follow-up period. Although most calves shed sporadically, 4 calves in the 2-week and 3-month high dose groups shed at every sampling. In general, shedding peaked 2 months after inoculation. Calves inoculated at 2 weeks or 3 months with a high dose of MAP shed more frequently than those inoculated with a low dose. Calves shedding frequently had more culture-positive tissue locations and more severe gross and histological lesions at necropsy. In conclusion, calves inoculated up to 1 year of age shed MAP in their feces shortly after inoculation. Consequently, there is potential for MAP transfer between calves (especially if they are group housed) and therefore, JD control programs should consider young calves as a source of infection.

4.
This fecal prevalence study targeted cattle from 7 large (10 000 to > 40 000 head) commercial feedlots in Alberta as a means of establishing Campylobacter levels in cattle just prior to animals entering the food chain. Overall, 87% [95% confidence interval (CI) = 86–88] of 2776 fresh pen-floor fecal samples were culture positive for Campylobacter species, with prevalences ranging from 76% to 95% among the 7 feedlots. Campylobacter spp. prevalence was 88% (95% CI = 86–90) in the summer (n = 1376) and 86% (95% CI = 85–88) in the winter (n = 1400). In addition, 69% (95% CI = 66–71) of 1486 Campylobacter spp. positive samples were identified as Campylobacter jejuni using hippurate hydrolysis testing. Of those, 64% (95% CI = 58–70) of 277 and 70% (95% CI = 67–72) of 1209 Campylobacter isolates were identified as C. jejuni in winter and summer, respectively. After accounting for clustering within pen and feedlot, feedlot size and the number of days on feed were associated with Campylobacter spp. isolation rates. The high isolation rates of Campylobacter spp. and C. jejuni in feedlot cattle feces in this study suggest a potential role for feedlot cattle in the complex epidemiology of campylobacters in Alberta.
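The prevalence estimates with 95% confidence intervals quoted above follow standard binomial arithmetic. A minimal sketch, using the normal (Wald) approximation and an illustrative positive count of 2,415 (chosen here only because it reproduces the reported 87% [86–88] for n = 2776; the exact count is not stated in the abstract):

```python
import math

def prevalence_ci(positives, n, z=1.96):
    """Point prevalence and normal-approximation (Wald) 95% confidence interval."""
    p = positives / n
    half = z * math.sqrt(p * (1 - p) / n)  # half-width of the interval
    return p, max(0.0, p - half), min(1.0, p + half)

# Illustrative: 2,415 of 2,776 pen-floor samples culture positive -> ~87% (86-88%)
p, lo, hi = prevalence_ci(2415, 2776)
```

For proportions near 0 or 1, or small n, a Wilson or exact (Clopper–Pearson) interval is preferred over this approximation.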

5.
The longstanding assumption that calves of more than 6 months of age are more resistant to Mycobacterium avium subspecies paratuberculosis (MAP) infection has recently been challenged. To elucidate this, a challenge experiment was performed to evaluate age- and dose-dependent susceptibility to MAP infection in dairy calves. Fifty-six calves from MAP-negative dams were randomly allocated to 10 MAP challenge groups (5 animals per group) and a negative control group (6 calves). Calves were inoculated orally on 2 consecutive days at 5 ages: 2 weeks and 3, 6, 9 or 12 months. Within each age group, 5 calves received either a high dose of 5 × 10⁹ CFU or a low dose of 5 × 10⁷ CFU. All calves were euthanized at 17 months of age. Macroscopic and histological lesions were assessed and bacterial culture was done on numerous tissue samples. Within all 5 age groups, calves were successfully infected with either dose of MAP. Calves inoculated at <6 months usually had more culture-positive tissue locations and higher histological lesion scores. Furthermore, those infected with a high dose had more severe scores for histologic and macroscopic lesions as well as more culture-positive tissue locations compared to calves infected with a low dose. In conclusion, calves up to 1 year of age were susceptible to MAP infection, and a high infection dose produced more severe lesions than a low dose.

6.
Historical control data on tumor incidence were collected from the control groups (215 animals of each sex) in four recent carcinogenicity studies started between 2005 and 2009 (terminally sacrificed between 2007 and 2011) at BoZo Research Center Inc. (Gotemba, Shizuoka, Japan) using Fischer 344 rats (F344/DuCrlCrlj). These data were compared with the previous historical control data (1990 to 2004, previously reported) from the same facility. The incidence of C-cell adenoma in the thyroid tended to increase in both sexes in recent years (30.8% for males and 24.4% for females in 2005–2009) compared with the previous data (17.4% and 20.1% for males and 11.5% and 11.8% for females in 1990–1999 and 2000–2004, respectively). In addition, the incidences of pancreatic islet cell adenoma in males and uterine adenocarcinoma tended to increase from around 2000 and remained high in recent years (incidences of islet cell adenoma in males of 10.5%, 17.1% and 20.5% in 1990–1999, 2000–2004 and 2005–2009; incidences of uterine adenocarcinoma of 3.3%, 12.0% and 13.5% in 1990–1999, 2000–2004 and 2005–2009, respectively). There was no apparent difference in the incidence of other tumors.

7.

Background

Serum and urine Blastomyces antigen concentrations can be used to diagnose blastomycosis in dogs.

Objectives

To determine whether Blastomyces antigen concentrations correlate with clinical remission in dogs during antifungal treatment, and whether antigen testing detects disease relapse after treatment discontinuation.

Animals

21 dogs with newly diagnosed blastomycosis monitored until clinical remission (Treatment Phase), and 27 dogs monitored over 1 year from the time of antifungal discontinuation or until clinical relapse (After Treatment Phase).

Methods

Prospective study. Dogs were monitored monthly during treatment and every 3 months after treatment discontinuation, with a complete history, physical exam, chest radiographs, and ocular exam. Urine and serum Blastomyces antigen concentrations were measured at each visit using a quantitative enzyme immunoassay.

Results

At enrollment in the Treatment Phase, Blastomyces antigen was positive in all 21 urine samples (100% sensitivity; 95% CI 85–100%), and in 18 of 20 serum samples (90% sensitivity; 95% CI 70–97%). At 2–4 months of treatment, urine antigen was more sensitive for clinically detectable disease (82%; CI 60–94%) than serum antigen (18%; CI 6–41%). The sensitivity of the urine test for clinical relapse was 71% (CI 36–92%), with close to 100% specificity (CI 84–100%) during after treatment surveillance in this population.

Conclusions

Urine Blastomyces antigen testing has high sensitivity for active disease at the time of diagnosis and during treatment, and moderate sensitivity but high specificity for clinical relapse. Urine testing should be useful at the time of diagnosis, when treatment discontinuation is being considered, and anytime there is poor clinical response or suspicion of relapse.

8.
A specific polarographic method with a sensitivity of ≥2 μg/kg (ppb) was used to determine the plasma and tissue concentrations of nitroxynil (NTX), which is used against Parafilaria bovicola in cattle. After treatment with the therapeutic dose of NTX (2 × 20 mg/kg b.w., s.c.), there was an initial rapid decrease in the plasma concentration followed by a slower elimination phase. The plasma levels of NTX were 8 mg/kg (ppm) and 3 mg/kg after 6 weeks and 2 months, respectively. The muscle and other edible tissues from treated cattle contained 0.1–0.3 mg/kg NTX after 2 months, and μg/kg amounts were still detectable 3 months after the injection. Based on available pharmacological and toxicological data, a 3-month withdrawal time for NTX in cattle is proposed.

9.
Infections with bovine viral diarrhea virus (BVDV) of the genus pestivirus, family Flaviviridae, are not limited to cattle but occur in various artiodactyls. Persistently infected (PI) cattle are the main source of BVDV. Persistent infections also occur in heterologous hosts such as sheep and deer. BVDV infections of goats commonly result in reproductive disease, but viable PI goats are rare. Using 2 BVDV isolates, previously demonstrated to cause PI cattle and white-tailed deer, this study evaluated the outcome of experimental infection of pregnant goats. Pregnant goats (5 goats/group) were intranasally inoculated with BVDV 1b AU526 (group 1) or BVDV 2 PA131 (group 2) at approximately 25–35 days of gestation. The outcome of infection varied considerably between groups. In group 1, only 3 does became viremic, and 1 doe gave birth to a stillborn fetus and a viable PI kid, which appeared healthy and shed BVDV continuously. In group 2, all does became viremic, 4/5 does aborted, and 1 doe gave birth to a non-viable PI kid. Immunohistochemistry demonstrated BVDV antigen in tissues of evaluated fetuses, with similar distribution but reduced intensity as compared to cattle. The genetic sequence of inoculated viruses was compared to those from PI kids and their dam. Most nucleotide changes in group 1 were present during the dam’s acute infection. In group 2, a similar number of mutations resulted from fetal infection as from maternal acute infection. Results demonstrated that BVDV may cause reproductive disease but may also be maintained in goats.

10.

Background

Brachycephalic dogs are at risk for arterial hypertension and obstructive sleep apnea, which are both associated with chronic magnesium (Mg) depletion.

Hypothesis/Objectives

To compare the period prevalence of hypomagnesemia between Boxers and Bulldogs presented to a referral teaching hospital. To screen a group of Bulldogs for evidence of hypomagnesemia, and to obtain pilot data regarding the utility of parenteral Mg tolerance testing (PMgTT) in the diagnosis of whole‐body Mg deficiency.

Animals

Chemistry laboratory submissions were retrospectively analyzed for serum total Mg (tMg) in Boxers and Bulldogs. Prospectively, 16 healthy client‐owned Bulldogs were enrolled.

Methods

Retrospective case study. tMg concentrations were compared between Boxers and Bulldogs. Dogs with low serum albumin or high serum creatinine concentrations were excluded. Prospectively, ionized Mg (iMg), tMg, and arterial blood pressure were measured and iMg‐to‐tMg ratio (iMg : tMg) was calculated. Parenteral Mg tolerance testing (PMgTT) was performed in 3/16 dogs.

Results

In the retrospective study, period prevalence of hypomagnesemia was 4.7% in Boxers and 15% in Bulldogs (P = .02). The risk ratio for hypomagnesemia in Bulldogs was 1.8 when compared to Boxers (CI: 1.3–2.7). In the prospective study, iMg was [median (interquartile range)] 0.43 (0.42–0.46) mmol/L (reference range 0.4–0.52) and tMg was 1.9 (1.8–1.9) mg/dL (reference range 1.9–2.5). iMg : tMg was [mean (±SD)] 0.59 ± 0.04. Percentage retention after PMgTT was 55%, 95%, and 67% in the 3 dogs tested.

Conclusions and Clinical Importance

Mg deficiency is common in Bulldogs and could contribute to comorbidities often observed in this breed. iMg : tMg and PMgTT might prove helpful in detecting chronic subclinical Mg deficiency.

11.
Understanding the utilization of feed energy is essential for precision feeding in beef cattle production. We aimed to assess whether predicting the metabolizable energy (ME) to digestible energy (DE) ratio (MDR), rather than predicting ME directly from DE, is feasible, and to develop a model equation to predict MDR in beef cattle. We constructed a literature database based on published data. A meta-analysis was conducted with 306 means from 69 studies containing both dietary DE and ME concentrations measured by calorimetry to test whether exclusion of the y-intercept is adequate in the linear relationship between DE and ME. A random coefficient model with study as the random variable was used to develop equations to predict MDR in growing and finishing beef cattle. Routinely measured or calculated variables in the field (body weight, age, daily gain, intake, and dietary nutrient components) were chosen as explanatory variables. The developed equations were evaluated against other published equations. The no-intercept linear equation represented the relationship between DE and ME more appropriately than the equation with a y-intercept: the y-intercept (−0.025 ± 0.0525) was not different from 0 (P = 0.638), and the Akaike and Bayesian information criteria of the no-intercept model were smaller than those of the model with the y-intercept. Within our growing and finishing cattle data, the animal’s physiological stage was not a significant variable affecting MDR after accounting for the study effect (P = 0.213). The mean (±SE) of MDR was 0.849 (±0.0063). The best equation for predicting MDR (n = 106 from 28 studies) was MDR = 0.9410 (±0.02160) + 0.0042 (±0.00186) × DMI (kg) − 0.0017 (±0.00024) × NDF (% DM) − 0.0022 (±0.00084) × CP (% DM). We also presented a model with a positive coefficient for ether extract (n = 80 from 22 studies). When using these equations, the observed ME was predicted with high precision (R² = 0.92). The model accuracy was also high, as shown by the high concordance correlation coefficient (>0.95) and small root mean square error of prediction (RMSEP), <5% of the observed mean. Moreover, a significant portion of the RMSEP was due to random bias (>93%), without mean or slope bias (P > 0.05). We concluded that dietary ME in beef cattle can be accurately estimated from dietary DE and its conversion factor, MDR, predicted from dry matter intake and the concentrations of several dietary nutrients, using the 2 equations developed in this study.
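The reported MDR equation can be applied directly to diet descriptors. A small sketch using the coefficients from the abstract (the diet values in the example are illustrative, not from the paper's dataset):

```python
def predict_mdr(dmi_kg, ndf_pct_dm, cp_pct_dm):
    """ME/DE ratio (MDR) from dry matter intake and diet composition,
    using the coefficients reported for growing/finishing beef cattle."""
    return 0.9410 + 0.0042 * dmi_kg - 0.0017 * ndf_pct_dm - 0.0022 * cp_pct_dm

def predict_me(de, dmi_kg, ndf_pct_dm, cp_pct_dm):
    """Dietary ME in the same energy units as DE (e.g., Mcal/kg DM)."""
    return de * predict_mdr(dmi_kg, ndf_pct_dm, cp_pct_dm)

# Illustrative diet: 9 kg/d DMI, 30% NDF, 14% CP, DE = 3.0 Mcal/kg DM
mdr = predict_mdr(9.0, 30.0, 14.0)   # ~0.897
me = predict_me(3.0, 9.0, 30.0, 14.0)
```

The resulting MDR values should sit near the reported mean of 0.849 for typical diets; values far outside the data range of the meta-analysis would be extrapolation.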

12.

Background

Remission occurs in 10–50% of cats with diabetes mellitus (DM). It is assumed that intensive treatment improves β‐cell function and increases remission rates.

Hypothesis

Initial intravenous infusion of insulin that achieves tight glycemic control decreases subsequent insulin requirements and increases remission rate in diabetic cats.

Animals

Thirty cats with newly diagnosed DM.

Methods

Prospective study. Cats were randomly assigned to one of 2 groups. Cats in group 1 (n = 15) received intravenous infusion of insulin with the goal of maintaining blood glucose concentrations at 90–180 mg/dL for 6 days. Cats in group 2 (n = 15) received subcutaneous injections of insulin glargine (cats ≤4 kg: 0.5–1.0 IU q12h; cats >4 kg: 1.5–2.0 IU q12h) for 6 days. Thereafter, all cats were treated with subcutaneous injections of insulin glargine and followed up for 6 months. Cats were considered in remission when euglycemia occurred for ≥4 weeks without the administration of insulin. Nonparametric tests were used for statistical analysis.

Results

In groups 1 and 2, remission was achieved in 10/15 and in 7/14 cats (P = .46), and good metabolic control was achieved in 3/5 and in 1/7 cats (P = .22), respectively. Overall, good metabolic control or remission occurred in 13/15 cats of group 1 and in 8/14 cats of group 2. In group 1, the median insulin dosage given during the 6‐month follow‐up was significantly lower than in group 2 (group 1: 0.32 IU/kg/day, group 2: 0.51 IU/kg/day; P = .013).

Conclusions and Clinical Importance

Initial intravenous infusion of insulin for tight glycemic control in cats with DM decreases insulin requirements during the subsequent 6 months.

13.

Background

Neutrophil gelatinase–associated lipocalin (NGAL) is a protein that is used in human medicine as a real‐time indicator of acute kidney injury (AKI).

Hypothesis

Dogs with AKI have significantly higher plasma NGAL concentration and urine NGAL‐to‐creatinine ratio (UNCR) compared with healthy dogs and dogs with chronic kidney disease (CKD).

Animals

18 healthy control dogs, 17 dogs with CKD, and 48 dogs with AKI.

Methods

Over a period of 1 year, all dogs with renal azotemia were prospectively included. Urine and plasma samples were collected during the first 24 hours after presentation or after development of renal azotemia. Plasma and urine NGAL concentrations were measured with a commercially available canine NGAL ELISA kit (Bioporto® Diagnostic) and UNCR was calculated. A single‐injection plasma inulin clearance was performed in the healthy dogs.

Results

Median (range) NGAL plasma concentrations in healthy dogs, dogs with CKD, and dogs with AKI were 10.7 ng/mL (2.5–21.2), 22.0 ng/mL (7.7–62.3), and 48.3 ng/mL (5.7–469.0), respectively. UNCR was 2 × 10⁻⁸ (0–46), 1,424 × 10⁻⁸ (385–18,347), and 2,366 × 10⁻⁸ (36–994,669), respectively. Dogs with renal azotemia had significantly higher NGAL concentrations and UNCR than did healthy dogs (P < .0001 for both). Plasma NGAL concentration was significantly higher in dogs with AKI compared with dogs with CKD (P = .027).

Conclusions and Clinical Importance

Plasma NGAL could be helpful to differentiate AKI from CKD in dogs with renal azotemia.  相似文献   

14.

Background

Pioglitazone is a thiazolidinedione (TZD) insulin sensitizer approved for use in human type 2 diabetes mellitus. Therapeutic options for diabetes in cats are limited.

Objective

To evaluate the effects of pioglitazone in obese cats, which are predisposed to insulin resistance, to assess its potential for future use in feline diabetes mellitus.

Animals

A total of 12 obese purpose‐bred research cats (6 neutered males and 6 spayed females, 5–7 years of age, weighing 5.4–9.8 kg).

Methods

Randomized, placebo‐controlled 3‐way crossover study. Oral placebo or pioglitazone (Actos™; 1 or 3 mg/kg) was administered daily for 7‐week periods, with IV glucose tolerance testing before and after each period.

Results

Three mg/kg pioglitazone significantly improved insulin sensitivity (geometric mean [95% CI] 0.90 [0.64–1.28] to 2.03 [1.49–2.78] min⁻¹·pmol⁻¹·L; P = .0014 versus change with placebo), reduced insulin area under the curve during IVGTT (geometric mean [range] 27 [9–64] to 18 [6–54] min·nmol/L; P = .0031 versus change with placebo), and lowered serum triglyceride (geometric mean [range] 71 [29–271] to 48 [27–75] mg/dL; P = .047 versus change with placebo) and cholesterol (geometric mean [range] 187 [133–294] to 162 [107–249] mg/dL; P = .0042 versus change with placebo) concentrations in the obese cats. No adverse effects attributable to pioglitazone were evident in the otherwise healthy obese cats at this dosage and duration.

Conclusions and Clinical Importance

Results of this study support a positive effect of pioglitazone on insulin sensitivity and lipid metabolism in obese cats, and suggest that further evaluation of the drug in cats with diabetes mellitus or other metabolic disorders might be warranted.

15.

Background

Bovine viral diarrhoea (BVD) is considered eradicated from Denmark. Currently, very few (if any) Danish cattle herds could be infected with BVD virus (BVDV). The Danish antibody blocking enzyme-linked immunosorbent assay (ELISA) has been used successfully during the Danish BVD eradication program, initiated in 1994. During the last decade, the cattle herd size has increased while the prevalence of BVDV has decreased. In this study, we investigated how these changes could affect the performance of the Danish blocking ELISA and of the SVANOVIR® BVDV-Ab indirect ELISA; the latter has been used successfully to eradicate BVD in Sweden. Data (2003–2010) on changes in median herd size and milk production levels, occurrence of viremic animals and bulk milk surveillance were analysed. Additionally, the Danish blocking ELISA and the SVANOVIR ELISA were compared by analyzing milk and serum samples. The prevalence of antibody-positive milking cows that could be detected by each test was estimated by diluting positive individual milk samples and making artificial milk pools.

Results

During the study period, the median herd size increased from 74 (2003) to 127 cows (2010), while the prevalence of BVDV-infected herds decreased from 0.51 to 0.02 %. The daily milk yield contribution of a single seropositive cow to the entire daily bulk milk was reduced from 1.61 % in 2003 to 0.95 % in 2010 due to the increased herd size. Antibody levels in bulk milk decreased at the national level. Moreover, we found that when testing bulk milk, the SVANOVIR® BVDV-Ab can detect a lower prevalence of seropositive lactating cows than the Danish blocking ELISA (0.78 % vs. 50 %). Values in the SVANOVIR® BVDV-Ab relate better to low concentrations of antibody-positive milk (R² = 94–98 %) than values in the blocking ELISA (R² = 23–75 %). For sera, the two ELISAs performed equally well.

Conclusions

The SVANOVIR ELISA is recommended for analysis of bulk milk samples in the current Danish situation, since infected dairy herds (e.g., due to import of infected cattle) can be detected shortly after BVDV introduction, when only a few lactating cows have seroconverted. In sera, the two ELISAs can be used interchangeably.

16.

Background

Population characteristics and outcome of cats with arterial thromboembolism (ATE) managed in general practice (GP) have been poorly described.

Hypothesis

Cats with ATE presenting to GP are usually euthanized at presentation, but survival times >1 year are possible.

Animals

Cats with ATE managed by 3 GP clinics in the United Kingdom.

Methods

Records of cases presenting to GP over a 98‐month period (2004–2012) were reviewed. Cats with an antemortem diagnosis of limb ATE were included. Outcome information was obtained.

Results

Over 98 months, 250 cats were identified with ATE. Prevalence was approximately 0.3%. At presentation, 153 cats (61.2%) were euthanized, with 68/97 (70.1%) of the remaining cats (27.2% of the total population) surviving >24 hours after presentation. Of these, 30/68 (44.1%) survived for at least 7 days. Hypothermia (HR, 1.44; 95% CI, 1.002–2.07; P = .049) and management by Clinic 2 (HR, 5.53; 95% CI, 1.23–24.8; P = .026) were independent predictors of 24‐hour euthanasia or death. For cats surviving >24 hours, hypothermia (HR, 2.25; 95% CI, 1.12–4.48; P = .021) and failure to receive aspirin, clopidogrel, or both (HR, 8.26; 95% CI, 1.39–50; P = .001) were independent predictors of euthanasia or death within 7 days. For cats that survived ≥7 days, median survival time was 94 (95% CI, 42–164) days, with 6 cats alive 1 year after presentation.

Conclusions

Although 153/250 cats were euthanized at presentation, 6 cats survived >12 months. No factors were identified that predicted euthanasia on presentation.

17.

Background

Measurement of salivary cortisol has been used extensively as a non-invasive alternative to blood sampling to assess adrenal activity in ruminants. However, there is evidence suggesting a considerable delay in the transfer of cortisol from plasma into saliva. Previous studies in cattle have used long sampling intervals, making it difficult to characterise the relationship between plasma and salivary cortisol (PLCort and SACort, respectively) concentrations at different time points and to determine whether such a time lag exists in large ruminants. Therefore, the objective of this study was to characterise the relationship between plasma and salivary cortisol and to determine whether there is a significant time lag between reaching peak cortisol concentrations in plasma and saliva across a 4.25 h period, using short sampling intervals of 10–15 min, following social separation in dairy cattle. Five cows were separated from their calves 4 days after calving, and six calves were separated from a group of four peers at 8 weeks of age. Following separation, the animals were moved to an unfamiliar surrounding where they could not see their calves or pen mates. The animals were catheterised with indwelling jugular catheters 1 day before sampling. Blood and saliva samples were obtained simultaneously before and after separation.

Results

In response to the stressors, PLCort and SACort increased reaching peak concentrations 10 and 20 min after separation, respectively. This suggested a 10 min time lag between peak cortisol concentrations in plasma and saliva, which was further confirmed with a time-series analysis. Considering the 10 min time lag, SACort was strongly correlated with PLCort (P < 0.0001).

Conclusions

Salivary cortisol correlates well with plasma cortisol and is a good indicator of the time-dependent variations in cortisol concentrations in plasma following acute stress. However, there is a time lag to reach peak cortisol concentrations in saliva compared to those in plasma, which should be considered when saliva samples are used as the only measure of hypothalamic-pituitary-adrenal axis response to stress in cattle.  
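The time-series confirmation of the lag amounts to finding the shift that maximises the cross-correlation between the two sampled series. A hedged sketch with NumPy (the data below are synthetic, and a uniform 10-min sampling interval is assumed; the study itself used 10–15 min intervals):

```python
import numpy as np

def peak_lag_minutes(plasma, saliva, interval_min=10):
    """Shift (in minutes) at which the saliva series best matches the plasma
    series, estimated from the peak of the mean-removed cross-correlation."""
    p = np.asarray(plasma, dtype=float) - np.mean(plasma)
    s = np.asarray(saliva, dtype=float) - np.mean(saliva)
    xcorr = np.correlate(s, p, mode="full")          # all relative shifts
    lag_samples = int(np.argmax(xcorr)) - (len(p) - 1)  # centre = zero lag
    return lag_samples * interval_min

# Synthetic example: the saliva curve is the plasma curve delayed by one sample.
plasma = [1, 2, 9, 4, 2, 1, 1]
saliva = [1, 1, 2, 9, 4, 2, 1]
```

A positive result indicates saliva lagging plasma, consistent with the 10-min delay reported above.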

18.
Necrotic enteritis (NE) is an important enteric disease in poultry and has become a major concern in poultry production in the post-antibiotic era. NE infection can damage the intestinal mucosa of the birds, leading to impaired health and, thus, productivity. To gain a better understanding of how NE impacts the gut function of infected broilers, global mRNA sequencing (RNA-seq) was performed on jejunum tissue of NE-challenged and non-challenged broilers to identify the pathways and genes affected by this disease. Briefly, to induce NE, birds in the challenge group were inoculated with 1 mL of Eimeria species on day 9, followed by 1 mL of approximately 10⁸ CFU/mL of a NetB-producing Clostridium perfringens on days 14 and 15. On day 16, 2 birds in each treatment were randomly selected and euthanized, and the whole intestinal tract was evaluated for lesion scores. Duodenum tissue samples from one of the euthanized birds of each replicate (n = 4) were used for histology, and the jejunum tissue for RNA extraction. RNA-seq analysis was performed with an Illumina HiSeq 2000 sequencer. The differentially expressed genes (DEG) were identified, and functional analysis was performed in DAVID to find protein–protein interactions (PPI). At a false discovery rate threshold <0.05, a total of 377 DEG (207 upregulated and 170 downregulated) were identified. Pathway enrichment analysis revealed that DEG were considerably enriched in peroxisome proliferator-activated receptor (PPAR) signaling (P < 0.01) and β-oxidation pathways (P < 0.05). The DEG were mostly related to fatty acid metabolism and degradation (cluster of differentiation 36 [CD36], acyl-CoA synthetase bubblegum family member-1 [ACSBG1], fatty acid-binding protein-1 and -2 [FABP1 and FABP2], and acyl-coenzyme A synthetase-1 [ACSL1]), bile acid production and transportation (acyl-CoA oxidase-2 [ACOX2], apical sodium–bile acid transporter [ASBT]) and essential genes in the immune system (interferon-γ [IFN-γ], LCK proto-oncogene, Src family tyrosine kinase [LCK], zeta chain of T cell receptor associated protein kinase 70 kDa [ZAP70], and aconitate decarboxylase 1 [ACOD1]). Our data revealed that pathways related to fatty acid digestion were significantly compromised, which thereby could have affected metabolic and immune responses in NE-infected birds.
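A false discovery rate threshold of the kind applied to the DEG list is conventionally obtained with the Benjamini–Hochberg step-up procedure. A minimal sketch (the p-values below are invented for illustration, not taken from the RNA-seq data):

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Indices of hypotheses rejected at FDR < alpha (Benjamini-Hochberg step-up)."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices by ascending p
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * alpha / m:
            k_max = rank  # largest rank whose p-value clears its threshold
    # Reject everything up to and including that rank
    return sorted(order[:k_max])

# Illustrative p-values for 8 genes; only the two smallest survive FDR < 0.05.
deg = benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205])
```

Note that several raw p-values below 0.05 are not rejected: the step-up thresholds grow with rank, which is what controls the expected proportion of false discoveries.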

19.

Background

Despite the increasing popularity of Icelandic horses, published reference intervals (RIs) for this breed are rare. Due to their isolation and small gene pool, alterations in some variables are likely, and some possible breed-specific peculiarities have been described. The purpose of the present study was to establish comprehensive RIs in Icelandic horses according to recently published guidelines. In a prospective observational study, blood samples were collected from the jugular vein of 142 Icelandic horses into EDTA and serum tubes. Reference intervals were established for haematologic and biochemical analytes on the Advia 2120i™ and the Dimension ExL™ by established methods. RIs were defined as central 95 % intervals bounded by the 2.5th and 97.5th percentiles with their 90 % confidence intervals, calculated according to recently published ASVCP guidelines. An in-house quality control system using observed total allowable error was used for surveillance of the internal quality control preceding the measurements.

Results

The RIs were as follows: haematocrit: 0.29–0.39, RBC: 5.79–8.63 T/l, haemoglobin: 102.0–142.3 g/l, MCV: 42–51 fl, platelets: 146–263 G/l, WBC: 4.13–8.57 G/l, segs: 1.98–4.73 G/l, lymphocytes: 1.25–3.49 G/l, monocytes: 0.06–0.31 G/l, eosinophils: 0.04–0.50 G/l, glucose: 4.0–5.7 mmol/l, urea: 3.2–6.4 mmol/l, creatinine: 79.6–141.4 μmol/l, total protein: 54.4–72.9 g/l, albumin: 27.7–36.8 g/l, total bilirubin: 8.1–21.1 μmol/l, triglycerides: 0.03–0.44 mmol/l, cholesterol: 1.75–2.90 mmol/l, ALP: 1.35–3.55 μkat/l, AST: 4.52–8.80 μkat/l, GLDH: 0.0–0.18 μkat/l, GGT: 0.11–0.39 μkat/l, CK: 2.53–6.52 μkat/l, LDH: 3.32–7.95 μkat/l, iron: 16.4–39.9 μmol/l, calcium: 2.69–3.19 mmol/l, phosphate: 0.5–1.3 mmol/l, magnesium: 0.6–0.9 mmol/l, sodium: 134–141 mmol/l, potassium: 3.6–4.7 mmol/l, chloride: 100–105 mmol/l.

Conclusions

Reference intervals of several haematologic and biochemical analytes differed from the historical reference intervals previously applied to equine samples in the authors’ laboratory. These differences might be of clinical importance for some analytes, such as creatine kinase.

Electronic supplementary material

The online version of this article (doi:10.1186/s13028-015-0120-4) contains supplementary material, which is available to authorized users.
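The RI definition used above (central 95% interval bounded by the 2.5th and 97.5th percentiles) can be computed nonparametrically. A sketch using linear interpolation between order statistics (the 90% confidence intervals around each limit that the ASVCP guidelines also require, typically obtained by bootstrap, are omitted here):

```python
import math

def reference_interval(values):
    """Central 95% reference interval: the 2.5th and 97.5th percentiles,
    with linear interpolation between sorted observations."""
    xs = sorted(values)
    n = len(xs)

    def percentile(p):
        h = (n - 1) * p          # fractional position in the sorted sample
        lo = math.floor(h)
        hi = min(lo + 1, n - 1)
        return xs[lo] + (h - lo) * (xs[hi] - xs[lo])

    return percentile(0.025), percentile(0.975)

# Illustrative: 100 evenly spaced measurements 1..100
ri_lo, ri_hi = reference_interval(range(1, 101))
```

With the study's 142 horses, roughly the 3rd-lowest and 3rd-highest observations anchor the limits, which is why guidelines recommend at least 120 reference individuals for nonparametric RIs.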

20.
Pre-movement testing for bovine tuberculosis (BTB) was compulsory in Ireland until 1996. We determined the proportion of herd restrictions (losing BTB-free status) attributable to the recent introduction of an infected bovid; described events between restoration of BTB-free status (de-restriction) and the next herd-level test for BTB; estimated the proportion of undetected infected cattle present at de-restriction; identified high-risk movements between herds (movements most likely to involve infected cattle); and determined the potential yield of infected cattle discovered (or herds that would not lose their BTB-free status) by pre-movement testing, relative to the numbers of cattle and herds tested. We used national data for all 6252 herds with a new BTB restriction in the 12 months from 1 April 2003 and 3947 herds declared BTB-free in the 12 months from 1 October 2001. We identified higher-risk animals from our logistic generalized estimating-equation models. We attributed 6–7% of current herd restrictions to the recent introduction of an infected animal. There were considerable changes to herd structure between de-restriction and the next full-herd test, and infection was detected in 10% of herds at the first assessment (full-herd test or abattoir surveillance) following de-restriction. Following movement from a de-restricted herd, the odds of an animal being positive at the next test increased with increasing time in the source herd prior to movement, increasing time between de-restriction and the next full-herd test, and increasing severity of the source herd restriction. The odds decreased with increasing size of the source herd. We estimated that 15.9 destination-herd restrictions per year could be prevented for every 10,000 cattle tested pre-movement and that 3.3 destination-herd restrictions per year could be prevented for every 100 source herds tested pre-movement. The yield per pre-movement test can be increased by focusing on high-risk movements; however, this would result in a substantial decrease in the total number of potential restrictions identified.
