Similar Documents
20 similar documents found.
1.
OBJECTIVE: To characterize serum copper status of cows and heifers in beef cow-calf herds throughout the United States and to evaluate use of copper supplements in those herds. DESIGN: Cross-sectional survey. ANIMALS: 2,007 cows and heifers from 256 herds in 18 states. PROCEDURES: Producers participating in a health and management survey conducted as part of the National Animal Health Monitoring System voluntarily allowed serum samples to be obtained from cows and heifers for determination of copper concentration. Results were categorized as deficient, marginally deficient, or adequate. The proportion of cattle and herds (on the basis of the mean value of the tested cattle) in each category was determined. Copper concentrations were compared between herds that reportedly used copper supplements and those that did not. RESULTS: Overall, 34 of 2,007 (1.7%) cows and heifers were deficient in copper, and 781 (38.9%) were marginally deficient. In each region, at least a third of the cattle were deficient or marginally deficient. For herds, 92 of 256 (35.9%) were marginally deficient, and 2 (0.8%) were deficient. Approximately half of the producers reported use of copper supplements, but a sizeable proportion of those producers' cattle and herds were classified as marginally deficient or deficient. CONCLUSIONS AND CLINICAL RELEVANCE: Copper deficiency is not restricted to a single geographic region of the United States, and it can persist despite reported use of supplements by producers. Veterinarians dealing with beef cow-calf herds that have problems consistent with copper deficiency should not rule out copper deficiency solely on the basis of geographic region or reported use of copper supplements for the herd.

2.
To determine the prevalence of Giardia duodenalis in weaned beef calves on cow-calf operations in the United States, fecal specimens were collected from 819 calves (6-18 months of age) from 49 operations. After cleaning and concentration procedures to maximize recovery of cysts from feces, DNA was extracted from each of the 819 specimens. The presence of G. duodenalis was determined by nested PCR of a fragment of the SSU rRNA gene. All positive PCR products were subjected to sequence analysis. The overall sample-level prevalence of Giardia was 33.5%, with prevalence ranging from 0 to 100% among operations. The highest within-herd prevalence of infected beef calves was found in one cow-calf operation from the South region (100%), followed by a cow-calf operation from the West region (90%) and three cow-calf operations from the Midwest region (87.5, 85, and 85%). Giardia was not detected in samples from 7 operations, including 5 cow-calf operations from the South region and 1 cow-calf operation each from the Midwest and West regions. Molecular analysis of the Giardia-positive samples identified assemblage E (or E-like) in 31.7% of all samples (260/819) and assemblage A in 1.2% (10/819). A mixed infection with assemblages A and E was observed in four calves from an operation in the Midwest region. The potentially zoonotic assemblage A was detected in specimens from four operations in the Midwest region. These findings indicate that most G. duodenalis found in weaned beef calves was assemblage E, which represents no known zoonotic threat. However, the presence of assemblage A in a small number of animals poses a potential risk of infection to humans.
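The assemblage prevalences quoted above are simple count-over-total percentages. A minimal sketch (the helper name is ours; the counts are those reported in the abstract):

```python
# Hypothetical helper (name is ours, not from the study): sample-level
# prevalence as a percentage, rounded to one decimal as in the abstract.
def prevalence_pct(positives: int, total: int) -> float:
    return round(100.0 * positives / total, 1)

TOTAL_SPECIMENS = 819  # fecal specimens from 49 operations
print(prevalence_pct(260, TOTAL_SPECIMENS))  # assemblage E (or E-like): 31.7
print(prevalence_pct(10, TOTAL_SPECIMENS))   # assemblage A: 1.2
```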

3.
Exposure of grazing livestock to variation in forage productivity and to market fluctuations affects the risk of investment and the returns from cow-calf operations, but little work has been done to empirically compare these returns with the returns that financial markets would demand from assets with similar risk and return characteristics. This study uses historical forage production data from three rangeland locations in California, together with cattle and hay prices, to simulate financial statements for three hypothetical cow-calf producers over the period 1988–2007. Return on investment from year to year incorporates the variability and risk associated with dependence on natural forage production. Performance is then compared with the actual performance of a diversified portfolio of assets using the Capital Asset Pricing Model, from which the theoretical cost of capital for these hypothetical grazing enterprises is derived. Much like other agricultural enterprises, cow-calf production in California has low market risk and a low theoretical cost of capital. This theoretical cost of capital is still greater than the historical return from livestock production (excluding land appreciation) in the western United States, lending further support to the point often made in the literature that ranchers who engage in cow-calf production receive benefits beyond the commercial returns from livestock production alone.
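The Capital Asset Pricing Model comparison described above reduces to the standard expected-return formula, E[R] = r_f + beta * (E[R_m] - r_f). A hedged sketch with purely illustrative inputs (the study's actual risk-free rate, beta, and market return are not given in the abstract):

```python
def capm_cost_of_capital(risk_free: float, beta: float, market_return: float) -> float:
    """Theoretical required return: risk-free rate plus beta times the market risk premium."""
    return risk_free + beta * (market_return - risk_free)

# A small beta (low market risk) yields a low theoretical cost of capital,
# consistent with the abstract's conclusion. Numbers below are illustrative only.
required = capm_cost_of_capital(risk_free=0.04, beta=0.2, market_return=0.09)
print(f"{required:.3f}")  # 0.050
```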

4.
Recent studies have identified the novel, host-adapted Cryptosporidium bovis and the deer-like genotype in dairy cattle from farms in the United States, China, India, and Europe. This novel species and genotype appear to be more prevalent in older, post-weaned dairy cattle than previously thought. However, little information is available on their prevalence in beef cow-calf operations. In the present study, we determined the prevalence of Cryptosporidium species in 98 calves (6-8 months old) and 114 cows (>2 years old) in seven beef cow-calf herds in western North Dakota. DNA was extracted from fecal samples, and Cryptosporidium spp. were identified by amplification of the 18S rRNA gene followed by sequencing or RFLP analysis. All seven herds tested positive for Cryptosporidium. Overall, 43/212 (20.3%) animals were positive; only five of these positives were from cows. C. bovis, the deer-like genotype, and C. andersoni were identified in 9.4, 6.6, and 1.4% of animals sampled, respectively. C. parvum was not identified in any of the positive samples. C. bovis, the deer-like genotype, and C. andersoni were detected in 6/7, 5/7, and 2/7 herds, respectively. C. bovis and the deer-like genotype were primarily detected in calves, whereas C. andersoni was detected only in cows. Six isolates could not be typed. These results show a relatively high prevalence of C. bovis and the deer-like genotype in 6-8-month-old beef calves compared with cows older than 2 years in the seven herds studied.

5.
Although feed intake and efficiency differences in growing cattle of low and high residual feed intake (RFI) classification have been established, little is known about the difference in grazed forage intake between beef cows of known RFI classification. Two experiments were conducted using Hereford cows for which RFI had been determined as heifers using the GrowSafe 4000E feed intake system, after which heifers had been divided into thirds as low RFI, mid RFI, and high RFI. During Exp. 1, 2 replicates of low and high RFI cows (n = 7/replicate) in mid- to late-gestation were blocked to 1 of 4 non-endophyte-infected tall fescue paddocks (1.8 to 2.4 ha), which they grazed continuously for 84 d during summer. Using grazing exclosures, weekly rising plate meter readings, and forage harvests every 21 d, average forage DMI was calculated. Low and high RFI groups did not differ (P > 0.05) in BW change or BCS change over the trial (19.5 vs. 22.1 kg of BW gain and 0.11 vs. 0.10 BCS gain), but low RFI cows had a 21% numerically lower DMI than high RFI cows (12.4 vs. 15.6 kg/d; P = 0.23). The average area needed per paddock over the trial was similar for low and high RFI cows (1.71 vs. 1.82 ha; P = 0.35), and the average DM on offer over the trial was less for low RFI than for high RFI cows (4,215 vs. 4,376 kg; P = 0.06). During Exp. 2, 3 replicates of low and high RFI cows with their calves (n = 4 pairs/replicate) strip-grazed stockpiled and early spring growth tall fescue paddocks (0.7 to 0.9 ha) for 60 d in late winter and early spring. Because of limited forage availability and quality at trial initiation, cow-calf pairs were also fed 3.31 kg/pair of pelleted soyhulls daily. Pre- and post-grazed forage samples were harvested for 4 grazing periods, and forage growth was estimated using a growing degree days calculation and on-site weather station data. Performance did not differ (P > 0.05) between low and high RFI cows throughout the experiment (18.4 vs. 26.6 kg of BW gain and -0.04 vs. 0.15 BCS gain). Despite the utilization of forage offered being similar for low and high RFI cow-calf pairs (P > 0.05), low RFI cows and their calves had an 11% numerically lower DMI than high RFI pairs (12.5 vs. 14.1 kg/d; P = 0.12). We concluded that either no intake differences existed between low and high RFI cows or that current methodology and small animal numbers limited our ability to detect differences.

6.
OBJECTIVE: To evaluate biosecurity practices of cow-calf producers. DESIGN: Cross-sectional survey. SAMPLE POPULATION: 2,713 cow-calf operations were used in phase 1 of the study, and 1,190 cow-calf operations were used in phase 2. PROCEDURE: Producers were contacted for a personal interview between Dec 30, 1996 and Feb 3, 1997 regarding their management practices. Noninstitutional operations with 1 or more beef cows were eligible to participate in the study. Producers who participated in the first phase of the study and who had ≥ 5 beef cows were requested to continue in the study and were contacted by a veterinarian or animal health technician who administered further questionnaires. All contacts for the second phase of the study were made between Mar 3, 1997 and Apr 30, 1997. Additional data on use of various vaccines; testing of imported cattle for brucellosis, Mycobacterium paratuberculosis, bovine viral diarrhea, and tuberculosis; and potential for feed contamination were collected during the second phase of the study. RESULTS: Producers commonly engaged in management practices that increased the risk of introducing disease to their cattle, such as importing cattle, failing to quarantine imported cattle, and communal grazing. Producers inconsistently adjusted for the increased risk of their management practices by increasing the types of vaccines given, increasing the quarantine time or proportion of imported animals quarantined, or increasing testing for various diseases in imported animals. CONCLUSIONS AND CLINICAL RELEVANCE: Cow-calf herds are at risk for disease exposure from outside sources when cattle are introduced to the herd, and producers do not always adjust management practices such as vaccination schedules and quarantine procedures appropriately to minimize this risk. Veterinary involvement in educating producers about biosecurity risks and in developing rational and economical biosecurity plans is needed.

7.
Twelve Angus crossbred cattle (eight heifers and four steers; average initial BW = 594 +/- 44.4 kg) fitted with ruminal and duodenal cannulas and fed restricted amounts of forage plus a ruminally undegradable protein (RUP) supplement were used in a triplicated 4 x 4 Latin square design experiment to determine intestinal supply of essential AA. Cattle were fed four different levels of chopped (2.54 cm) bromegrass hay (11.4% CP, 57% NDF; OM basis): 30, 55, 80, or 105% of the forage intake required for maintenance. Cattle fed below maintenance were given specified quantities of a RUP supplement (6.8% porcine blood meal, 24.5% hydrolyzed feather meal, and 68.7% menhaden fish meal; DM basis) designed to provide duodenal essential AA flow equal to that of cattle fed forage at 105% of maintenance. Experimental periods lasted 21 d (17 d of adaptation and 4 d of sampling). Total OM intake and duodenal OM flow increased linearly (P < 0.001) as cattle consumed more forage; however, OM truly digested in the rumen (% of intake) did not change (P = 0.43) as intake increased. True ruminal N degradation (% of intake) tended (P = 0.07) to increase linearly, and true ruminal N degradation (g/d) decreased quadratically (P = 0.02) as intake increased from 30 to 105%. Duodenal N flow was equal (P = 0.33) across intake levels, even though microbial N flow increased linearly (P < 0.001) as forage OM intake increased. Total and individual essential AA intake decreased (cubic; P < 0.001) as forage intake increased because the supply of nonammonia, nonmicrobial N flow from RUP was decreased (linear; P < 0.001) by design. Total duodenal flow of essential AA did not differ (P = 0.39) across these levels of forage intake. Although the profile of essential AA reaching the duodenum differed (P ≤ 0.02) for all 10 essential AA, the range of each essential AA as a proportion of total essential AA was low (11.1 to 11.2% of total essential AA for phenylalanine to 12.3 to 14.3% of total essential AA for lysine). Duodenal essential AA flow did not differ (P = 0.10 to 0.65) with forage intake level for eight of the 10 essential AA. Duodenal flow of arginine decreased linearly (P = 0.01), whereas duodenal flow of tryptophan increased linearly (P = 0.002) as forage intake increased from 30 to 105% of maintenance. Balancing intestinal essential AA supply in beef cattle can be accomplished by varying intake of a RUP supplement.

8.
OBJECTIVE: To estimate potential revenue impacts of an outbreak of foot-and-mouth disease (FMD) in the United States similar to the outbreak in the United Kingdom during 2001. DESIGN: An economic analysis successively incorporating quarantine and slaughter of animals, an export ban, and consumer fears about the disease was used to determine the combined impact. SAMPLE POPULATION: Secondary data for cattle, swine, lambs, poultry, and products of these animals. PROCEDURE: Data for 1999 were used to calibrate a model for the US agricultural sector. Removal of animals, similar to that observed in the United Kingdom, was introduced, along with a ban on exportation of livestock, red meat, and dairy products and a reduction and shift in consumption of red meat in the United States. RESULTS: The largest impacts on farm income of an FMD outbreak were from the loss of export markets and reductions in domestic demand arising from consumer fears, not from removal of infected animals. These elements could cause an estimated decrease of $14 billion (9.5%) in US farm income. Losses in gross revenue for each sector were estimated to be the following: live swine, -34%; pork, -24%; live cattle, -17%; beef, -20%; milk, -16%; live lambs and sheep, -14%; lamb and sheep meat, -10%; forage, -15%; and soybean meal, -7%. CONCLUSIONS AND CLINICAL RELEVANCE: Procedures to contain an outbreak of FMD to specific regions and allow maintenance of FMD-free exports, together with efforts to educate consumers about health risks, are critical to mitigating the adverse economic impacts of an FMD outbreak.
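As a back-of-envelope check on the figures above (our arithmetic, not a number reported by the study), a $14 billion loss that represents 9.5% of US farm income implies a baseline of roughly $147 billion:

```python
# Implied baseline = absolute loss / fractional loss. The ~$147.4 billion
# figure is our inference from the abstract's two numbers, not a value
# reported by the study itself.
loss_billion = 14.0
loss_fraction = 0.095  # 9.5%
baseline_billion = loss_billion / loss_fraction
print(round(baseline_billion, 1))  # 147.4
```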

9.
Two experiments were conducted to evaluate the impacts of increasing levels of supplemental soybean meal (SBM) on intake, digestion, and performance of beef cattle consuming low-quality prairie forage. In Exp. 1, ruminally fistulated beef steers (n = 20; 369 kg) were assigned to one of five treatments: control (forage only) and .08, .16, .33, and .50% BW/d of supplemental SBM (DM basis). Prairie hay (5.3% CP; 49% DIP) was offered for ad libitum consumption. Forage OM intake (FOMI) and total OM intake (TOMI) were increased (cubic, P = .01) by level of supplemental SBM, but FOMI reached a plateau when the daily level of SBM supplementation reached .16% BW. The concomitant rises in TOMI and OM digestibility (quadratic, P = .02) resulted in an increase (cubic, P = .03) in total digestible OM intake (TDOMI). In Exp. 2, spring-calving Hereford x Angus cows (n = 120; BW = 518 kg; body condition [BC] = 5.3) grazing low-quality, tallgrass-prairie forage were assigned to one of three pastures and one of eight treatments. The supplemental SBM (DM basis) was fed at .08, .12, .16, .20, .24, .32, .40, and .48% BW/d from December 2, 1996, until February 10, 1997 (beginning of the calving season). Performance seemed to reach a plateau when cows received SBM at approximately .30% BW/d. Below this level, cows lost approximately .5 unit of BC for every .1% BW decrease in the amount of supplemental SBM fed. Providing supplemental SBM is an effective means of improving forage intake, digestion, and performance of beef cattle consuming low-quality forages.
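The plateau-and-decline response described for Exp. 2 can be sketched as a piecewise-linear function. The function, its name, and its linear form below the plateau are our reading of the abstract, not a model fitted by the authors:

```python
# Hedged sketch of the dose-response suggested by Exp. 2: performance
# plateaus near .30% BW/d of SBM, and below the plateau cows lose about
# .5 unit of body condition (BC) per .1% BW reduction in supplement.
PLATEAU = 0.30               # % BW/d of supplemental SBM at which response plateaus
LOSS_PER_PCT_BW = 0.5 / 0.1  # BC units lost per 1% BW shortfall in SBM

def bc_loss(sbm_pct_bw: float) -> float:
    """Approximate BC loss relative to cows supplemented at the plateau level."""
    shortfall = max(0.0, PLATEAU - sbm_pct_bw)
    return LOSS_PER_PCT_BW * shortfall

print(round(bc_loss(0.20), 2))  # 0.5  (one .1% BW step below the plateau)
print(bc_loss(0.40))            # 0.0  (at or above the plateau)
```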

10.
Grazing experiments may use steers or cow-calf pairs for measuring animal performance on pasture treatments, but the validity of extrapolating between these classes of cattle has not been verified. A grazing study was conducted in the spring and summer of both 1988 and 1989 to determine stocking equivalents and stocking rate-weight gain relationships for steers and cow-calf pairs grazing Coastal bermudagrass (Cynodon dactylon [L.] Pers.) oversown with rye (Secale cereale L.) and ryegrass (Lolium multiflorum Lam.). Average daily gain and stocking rate (SR; 3.2, 4.2, 6.2, and 7.4 animals per hectare for steers and 1.7, 2.5, 3.7, and 4.9 pairs per hectare for cow-calf pairs) were both adjusted so that comparisons could be made on an equal BW basis. Disk meter height readings were used as measurements of forage accessibility. Disk meter height responses to SR did not differ (P > .10) between steer and cow-calf paddocks. There was a linear (P < .001) decrease in ADG as SR increased, but this decline was steeper (P < .001) for steers than for cows or suckling calves. Steers tended to be more productive than calves at low SR but less productive at high SR. Disk meter heights for the range of SR used in the study did not differ (P > .10) for steers and cow-calf pairs at equivalent BW per hectare. Our study suggests that live BW is a reasonable basis for determining forage requirements of steers and cow-calf pairs under grazing conditions, but extrapolation of production between classes of livestock will not be reliable.

11.
Adequate drinking water is essential to maintain acceptable production levels in beef cattle operations. In the context of global climate change, the water scarcity forecast for the future is a growing concern and is expected to increase the use of poorer-quality water by the agricultural sector in many parts of the world. However, consumption of high-salt water by cattle has consequences that are often overlooked. A meta-analysis was carried out to assess the impact of high-salt water on dry matter intake (DMI), water intake (WI), and performance in beef cattle. The dataset was collected from 25 studies conducted between 1960 and 2020. Within the dataset, water quality was divided into three categories according to the ratio of sulfates (SO4) or sodium chloride (NaCl) to total dissolved solids (TDS): 1) TDS = all studies included (average SO4:TDS = 0.4); 2) NaCl = studies in which water salinity was dominated by NaCl (average SO4:TDS = 0.1); and 3) SO4 = studies in which water salinity was dominated by SO4 (average SO4:TDS = 0.8). Results showed that DMI and WI were negatively affected by high-salt water consumption, although the magnitude of the effect depended on the type of salt dissolved in the water. There was a quadratic effect (P < 0.01) for WI vs. TDS, WI vs. NaCl, DMI vs. TDS, and DMI vs. NaCl, and a linear effect (P < 0.01) for WI vs. SO4 and DMI vs. SO4. Average daily gain (ADG) and feed efficiency (FE) were quadratically (P < 0.01) affected by high-salt water. This study revealed significant negative effects of high-salt water consumption on beef cattle WI, DMI, and performance; these negative effects were exacerbated when cattle drank high-sulfate water compared with high-chloride water. To the best of our knowledge, this is the first approach to evaluating animal response to high-salt water consumption, and it could be incorporated into the development of future beef cattle models to account for the impact of water quality on intake and performance. In addition, this meta-analysis highlights the need for research on management strategies to mitigate the negative effects of high-salt water in cattle.
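The quadratic intake responses reported in the meta-analysis can be fit with an ordinary second-degree polynomial regression. The sketch below uses synthetic data (not values from the 25 studies) purely to illustrate the mechanics:

```python
import numpy as np

# Synthetic, illustrative data: DMI declining non-linearly as TDS rises.
tds = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])  # total dissolved solids, g/L
dmi = np.array([9.0, 9.2, 9.0, 8.4, 7.4, 6.0])   # dry matter intake, kg/d

# Fit DMI = a*TDS^2 + b*TDS + c. A negative leading coefficient (a < 0)
# gives the concave, initially flat then declining response reported for
# the TDS and NaCl categories.
a, b, c = np.polyfit(tds, dmi, deg=2)
print(a < 0)  # True: concave (quadratic) decline
```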

12.
OBJECTIVE: To describe the frequency and distribution of Escherichia coli O157:H7 in the feces and environment of cow-calf herds housed on pasture. SAMPLE POPULATION: Fecal and water samples from 10 cow-calf farms in Kansas. PROCEDURE: Fecal and water samples were obtained monthly throughout a 1-year period (3,152 fecal samples from 2,058 cattle; 199 water samples). Escherichia coli O157:H7 in fecal and water samples was detected using microbial culture. RESULTS: Escherichia coli O157:H7 was detected in 40 of 3,152 (1.3%) fecal samples, and 40 of 2,058 (1.9%) cattle had ≥ 1 sample positive for E coli. Fecal shedding by specific cattle was transient; none of the cattle had E coli in more than 1 sample. Significant differences were not detected in overall prevalence among farms; however, significant differences were detected in prevalence among sample collection dates. Escherichia coli O157:H7 was detected in 3 of 199 (1.5%) water samples. CONCLUSIONS AND CLINICAL RELEVANCE: Implementing control strategies for E coli O157:H7 at all levels of the cattle industry will decrease the risk of this organism entering the human food chain. Devising effective on-farm strategies to control E coli O157:H7 in cow-calf herds will require an understanding of the epidemiologic characteristics of this pathogen.

13.
OBJECTIVE: To estimate the prevalence of Mycobacterium avium subsp paratuberculosis infection among cows on beef operations in the United States. DESIGN: Cross-sectional seroprevalence study. SAMPLE POPULATION: A convenience sample of 380 herds in 21 states. PROCEDURES: Serum samples were obtained from 10,371 cows and tested for antibodies to M avium subsp paratuberculosis with a commercial ELISA. Producers were interviewed to collect data on herd management practices. RESULTS: 30 (7.9%) herds had 1 or more animals for which results of the ELISA were positive; 40 (0.4%) of the individual cow samples yielded positive results. None of the herd management practices studied were found to be associated with whether any animals in the herd would be positive for antibodies to M avium subsp paratuberculosis. CONCLUSIONS AND CLINICAL RELEVANCE: Results suggest that the prevalence of antibodies to M avium subsp paratuberculosis among beef cows in the United States is low. Herds with seropositive animals were widely distributed geographically.

14.
We examined the effect of endophyte infection level of tall fescue (Festuca arundinacea Schreb.) used for stockpiled forage on the performance of lactating, fall-calving beef cows and their calves. Treatments were endophyte infection levels of 20% (low; SEM = 3.5), 51% (medium; SEM = 1.25), and 89% (high; SEM = 2.4; 4 replications/treatment). Five cow-calf pairs grazed in each replicate (n = 60 cow-calf pairs/yr) for 84 d (phase 1) starting on December 2, 2004 (yr 1), and December 1, 2005 (yr 2). After 84 d of grazing each treatment, the cattle were commingled and fed as a single group (phase 2) until weaning in April of each year. Phase 2 allowed measurement of residual effects from grazing stockpiled tall fescue with varying levels of endophyte infection. Pregrazing and postgrazing forage DM yield, forage nutritive value, and total ergot alkaloid concentrations of forage were measured every 21 d during phase 1. Animal performance data included cow BW, ADG, and BCS, as well as calf BW and ADG. Animal performance was monitored during both phases. Endophyte infection did not affect (P = 0.52) apparent intake (pregrazing minus postgrazing forage DM yield) of stockpiled tall fescue, because each cow-calf pair consumed 16 +/- 1.7 kg/d regardless of treatment. Cow ADG during phase 1 was -0.47 +/- 0.43 kg for the low treatment, which was greater (P < 0.01) than either the medium (-0.64 +/- 0.43 kg) or high (-0.74 +/- 0.43 kg) treatments. However, cows that had grazed the high or medium treatments in phase 1 lost 0.43 and 0.57 (+/- 0.24) kg/d, respectively, during phase 2, which was less (P < 0.01) BW loss than the 0.78 +/- 0.24 kg/d lost by cows in the low treatment. By the end of phase 2, cow BW did not differ (528 +/- 27 kg; P = 0.15). Body condition score for cows in the low treatment was greater (P = 0.02) than that of the medium and high treatments at the end of phase 1. Body condition scores did not change appreciably by the end of phase 2, and differences among treatments remained the same as at the end of phase 1 (P = 0.02). In contrast to cow performance, calf ADG was unaffected (P = 0.10) by endophyte level and averaged 0.73 +/- 0.07 kg during phase 1 and 0.44 +/- 0.04 kg during phase 2. Our data suggest that fall-calving herds can utilize highly infected tall fescue when stockpiled for winter grazing, with little impact on cow performance and no impact on calf gain.

15.
The stocker industry is one of many diverse production and marketing activities that make up the United States beef industry. It is probably the least understood sector, yet it plays a vital role in helping the industry exploit its competitive advantage of using forage resources and provides an economical means of adjusting the timing and volume of cattle and meat in a complex market environment.

16.
Cryptosporidium spp. and Giardia duodenalis are common protozoal parasites in livestock, including beef cattle on rangeland and irrigated pasture. A statewide cross-sectional study was conducted to determine the prevalence, species or genotype, and risk factors for fecal shedding of Cryptosporidium and Giardia by cattle from California cow-calf operations. Species and genotypes of Cryptosporidium and Giardia were determined by molecular fingerprinting. Prevalence of Cryptosporidium (19.8%) and Giardia (41.7%) in fecal samples from calves was approximately twice as high as in fecal samples from cows (9.2% and 23.1%, respectively). In addition to age, multivariable logistic regression showed that higher stocking density and a higher number of replacement heifers were positively associated with fecal shedding of Cryptosporidium, while a longer calving interval, a winter/spring calving season, and higher numbers of replacement heifers were positively associated with shedding of Giardia. The dominant species and genotypes of Cryptosporidium and Giardia in feces from these cow-calf herds were Cryptosporidium ryanae (75%) and Giardia duodenalis assemblage E (90%), which have low impact on public health compared with other, zoonotic species/genotypes of these two parasites. We identified host factors and management practices that can be used to protect cattle health and reduce the risk of surface water contamination with protozoal parasites from cow-calf operations. In addition, this work updates the scientific data regarding the predominance of low-zoonotic-risk genotypes of Cryptosporidium and Giardia shed in the feces of commercial cow-calf herds on California rangeland and irrigated pasture.

17.
Feedlot and carcass data were collected on an average of 20 steer offspring from each of 10 cooperating southwestern Colorado cow-calf ranches that had previously been analyzed using the Integrated Resource Management-Standardized Performance Analysis (IRM-SPA) program. Averages and variation within and across ranches are presented. In general, the variation observed was greater than desired by the beef cattle industry's Long Range Plan carcass specification targets. Over all ranches, 59% of the cattle were outside the desired specification ranges for one or more traits, although none of the ranches had insurmountable carcass challenges. The factors that affect low-cost production of a consistent beef product that meets the necessary standards for quality were also examined. Feed and labor costs were under control in most low-cost operations, and managers had communicated goals to other people involved in the operation. There was no difference in feedlot performance between high- and low-cost cow-calf enterprises, and high- and low-cost operators were equal in their ability to hit carcass targets.

18.
Six year-round, all-forage, three-paddock systems for beef cow-calf production were used to produce five calf crops during a 6-yr period. Forages grazed by cows during spring, summer, and early fall consisted of one paddock of 1) tall fescue (Festuca arundinacea Schreb.)-ladino clover (Trifolium repens L.) or 2) Kentucky bluegrass (Poa pratensis L.)-white clover (Trifolium repens L.). Each of these forage mixtures was combined in a factorial arrangement with two paddocks of either 1) fescue-red clover (Trifolium pratense L.), 2) orchardgrass (Dactylis glomerata L.)-red clover, or 3) orchardgrass-alfalfa (Medicago sativa L.), which were used for hay, creep grazing by calves, and stockpiling for grazing by cows in late fall and winter. Each of the six systems included two replications; each replicate contained 5.8 ha and was grazed by eight Angus cow-calf pairs, for a total of 480 cow-calf pairs. Fescue was less than 5% infected with Acremonium coenophialum. Pregnancy rate was 94%. Cows grazing fescue-ladino clover maintained greater (P < .05) BW than those grazing bluegrass-white clover, and their calves tended (P < .09) to have slightly greater weaning weights (250 vs 243 kg, respectively). Stockpiled fescue-red clover provided more (P < .05) grazing days and required less (P < .05) hay fed to cows than stockpiled orchardgrass plus either red clover or alfalfa. Digestibilities of DM, CP, and ADF, determined with steers, were greater (P < .05) for the orchardgrass-legume hays than for the fescue-red clover hay. All systems produced satisfactory cattle performance, but fescue-ladino clover combined with fescue-red clover required minimum inputs of harvested feed and maintained excellent stands during the 6 yr.

19.
OBJECTIVE: To evaluate the association of herd demographics, parturition variables, stocking rate, and rotational grazing practices with the probability of fecal shedding of Cryptosporidium parvum in beef cow-calf herds in California. DESIGN: Cross-sectional study. SAMPLE POPULATION: 38 beef cow-calf operations. PROCEDURE: Fecal specimens were collected and examined for C parvum oocysts using immunofluorescent microscopy. Associations between various demographic and management factors and the probability of shedding C parvum were statistically evaluated. RESULTS: Adjusted for age and month of collection of a fecal sample, cattle from herds with a high number of young calves (≤ 2 months old) on the day of sample collection, a high stocking rate (No. of cattle/acre/mo), or a longer calving season were more likely to shed C parvum oocysts, compared with cattle from herds with fewer young calves, a lower stocking rate, or a shorter calving season. Cattle from herds with a higher number of older calves (> 2 months old) on the day of sample collection were less likely to shed C parvum oocysts, compared with cattle from herds with fewer older calves. In the multivariate model, neither rotational grazing system nor season of onset of calving was associated with shedding status for C parvum oocysts. CONCLUSIONS AND CLINICAL RELEVANCE: Reproductive management that results in a shorter calving season, together with a lower stocking rate, may be associated with reduced risk of C parvum shedding. Intensive rotational grazing systems and time of year for onset of the calving season apparently have little effect on reducing prevalence of oocyst shedding.

20.
Cow-calf production occurs in all 50 states over varied resource bases and under vastly different environmental conditions. Multiple breeds exist and management styles and objectives are as numerous as the number of cow-calf producers. There is not one area of the country, one breed of cattle, or one management style that is most profitable for producing cows and calves. There are, however, some common strategies that can be employed by cow-calf producers to enhance profitability. Costs need to be controlled without jeopardizing cow herd productivity or net returns. It appears that the cost associated with purchased and harvested feeds varies considerably across operations. Understanding cyclic and seasonal price patterns, weight-price slides, cattle shrink, and other marketing costs can help producers enhance their profit by marketing (and not by just selling) their cattle. Producers with superior cattle genetics can become part of a specific alliance or, at a minimum, document the performance of their cattle so that they can get paid for the superior genetics. The beef industry is changing and will likely continue to change. Cow-calf producers will need to examine their own management practices to determine whether they are optimal for the current industry. Those producers who are most adept at matching their management abilities to their cattle type, their resource base, and the appropriate market outlet will be the most successful in the future.
