Similar Articles
20 similar articles found.
1.
Wheat was assessed at four crop growth stages for eyespot (anamorph Pseudocercosporella herpotrichoides, teleomorph Tapesia yallundae) in a series of field trials that studied the effects on disease frequency of five wheat management techniques (sowing date and density, nitrogen fertiliser dose and form, removal/burial of cereal straw). An equation expressing disease level as a function of degree-days was fitted to the observed disease levels. This equation was based on eyespot epidemiology and depended on two parameters illustrating the importance of the primary and the secondary infection cycles respectively. Cultural practices were classified according to the importance of their effects on disease, and these effects could be related to infection cycles and host plant architecture. Sowing date had the earliest and strongest effect; early sowing always increased disease frequency through the primary infection cycle, and its influence on the secondary cycle was variable. Disease frequency was increased by high plant density and/or a low shoot number per plant through primary infection; the secondary cycle was, however, decreased by a low shoot number per plant, which reduced late disease development at high plant density. High nitrogen doses increased disease levels and the severity of both infection cycles, but this effect was partly hidden by a simultaneous stimulation of tillering and thus an indirect decrease of disease incidence. Where its effect was significant, ammonium fertiliser (vs ammonium nitrate) decreased eyespot levels and infection cycles, whereas straw treatment (burial vs removal of straw from the previous cereal crop) had no effect.
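The abstract does not reproduce the paper's incidence equation. As an illustrative sketch only, a two-parameter model in which incidence y grows with thermal time t as dy/dt = (p + s*y)(1 - y), with p standing for the primary infection cycle and s for the secondary cycle, has a closed form that can be fitted with `scipy.optimize.curve_fit`. The functional form, scale, and all numbers below are assumptions, not values from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def incidence(ddays, p, s):
    """Closed-form solution of dy/dt = (p + s*y)*(1 - y), y(0) = 0:
    p drives the primary infection cycle, s the secondary cycle."""
    k = p + s
    e = np.exp(k * ddays)
    return p * (e - 1.0) / (s + p * e)

# Synthetic observations on an arbitrary thermal-time scale (degree-days / 100)
dd = np.linspace(0.0, 20.0, 15)
rng = np.random.default_rng(0)
obs = incidence(dd, 0.02, 0.25) + rng.normal(0.0, 0.01, dd.size)

(p_hat, s_hat), _ = curve_fit(incidence, dd, obs, p0=[0.05, 0.1],
                              bounds=(1e-6, 5.0))
print(f"estimated p = {p_hat:.3f}, s = {s_hat:.3f}")
```

The ranking of cultural practices in the abstract corresponds, in such a model, to comparing fitted p and s between treatments.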

2.
Schoeny A, Lucas P. Phytopathology, 1999, 89(10):954-961.
ABSTRACT Take-all, caused by Gaeumannomyces graminis var. tritici, is a damaging disease of wheat that remains difficult to control. The efficacy of an experimental fungicide, applied as a seed treatment, was evaluated in five naturally infested field experiments conducted during three cropping seasons. Plants were sampled and assessed for take-all incidence and severity at different growth stages. Nonlinear models expressing disease variables as a function of degree-days were fitted to the observed data. The incidence equation involved two parameters reflecting the importance of primary and secondary infection cycles. The earliness of infection was identified as an important variable to interpret the effects of the fungicide. In an early epidemic, the fungicide significantly reduced take-all incidence during all or most of the cropping season, whereas in late epidemics, it provided only moderate reductions of incidence. The seed treatment reduced incidence by delaying the primary infection cycle. The fungicide significantly reduced severity during the whole epidemic. It appeared more efficient in limiting root-to-root spread than in slowing down the extension of necrosis on diseased roots.

3.
Two deep-working soil tillage tools, one which inverts soil (plough) and one which does not (chisel), were used before sowing wheat after various crop successions combining eyespot host and non-host crops. Soil structure was nearly the same and crop residues were located in the different soil layers. Eyespot sporulation was estimated by visually assessing pot plants which had been on the trial plots for a fixed length of time. Field plants were also assessed for disease at several wheat growth stages. A kinetic equation expressing disease level as a function of degree-days was fitted to the disease levels observed on the field plants. This equation was based on eyespot epidemiology and depended on two parameters reflecting the importance of the primary and the secondary infection cycles respectively. Pot plant and early field plant disease levels and primary infection were closely correlated with the presence of crop residues in the top layer. The amount of residues depended on both crop succession and soil tillage. Where the previous crop was a host crop preceded by a non-host crop, soil inversion buried host residues, thus decreasing the primary infection risk. Where, however, the previous crop was a non-host crop preceded by a host crop, soil inversion carried the host residues back to the soil surface, thus increasing the primary infection risk. Secondary infection was not correlated with either crop succession or soil tillage.

4.
ABSTRACT Epidemiological modeling combined with parameter estimation of experimental data was used to examine differences in the contribution of disease-induced root production to the spread of take-all on plants of two representative yet contrasting cultivars of winter wheat, Genghis and Savannah. A mechanistic model, including terms for primary infection, secondary infection, inoculum decay, and intrinsic and disease-induced root growth, was fitted to data describing changes in the numbers of infected and susceptible roots over time at a low or high density of inoculum. Disease progress curves were characterized by consecutive phases of primary and secondary infection. No differences in root growth were detected between cultivars in the absence of disease and root production continued for the duration of the experiment. However, significant differences in disease-induced root production were detected between Savannah and Genghis. In the presence of disease, root production for both cultivars was characterized by stimulation when few roots were infected and inhibition when many roots were infected. At low inoculum density, the transition from stimulation to inhibition occurred when an average of 5.0 and 9.0 roots were infected for Genghis and Savannah, respectively. At high inoculum density, the transition from stimulation to inhibition occurred when an average of 4.5 and 6.7 roots were infected for Genghis and Savannah, respectively. Differences in the rates of primary and secondary infection between Savannah and Genghis also were detected. At a low inoculum density, Genghis was marginally more resistant to secondary infection whereas, at a high density of inoculum, Savannah was marginally more resistant to primary infection.
The combined effects of differences in disease-induced root growth and differences in the rates of primary and secondary infection meant that the period of stimulated root production was extended by 7 and 15 days for Savannah at a low and high inoculum density, respectively. The contribution of this form of epidemiological modeling to the better management of take-all is discussed.
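A mechanistic model of the kind described, with primary infection from decaying inoculum, secondary root-to-root infection, and root production that switches from stimulation to inhibition as infected roots accumulate, can be sketched as a small ODE system. The functional forms and parameter values below are illustrative assumptions, not the estimates from the study; `scipy.integrate.solve_ivp` integrates the system.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameter values only (not estimates from the study)
r0, alpha, beta_mod = 0.05, 0.10, 0.02  # baseline root production and its
                                        # disease-induced modulation
bp, bs, d = 0.002, 0.004, 0.01          # primary rate, secondary rate, inoculum decay

def rhs(t, y):
    S, I, X = y  # susceptible roots, infected roots, particulate inoculum
    # Stimulation when few roots are infected, inhibition when many; the
    # switch from net stimulation to inhibition falls at I = alpha/beta_mod = 5
    prod = r0 * max(0.0, 1.0 + alpha * I - beta_mod * I ** 2)
    prim = bp * S * X   # primary infection from decaying inoculum
    sec = bs * S * I    # secondary, root-to-root infection
    return [prod - prim - sec, prim + sec, -d * X]

sol = solve_ivp(rhs, (0.0, 600.0), [5.0, 0.0, 50.0], max_step=5.0)
S, I, X = sol.y
print(f"infected roots at the end of the run: {I[-1]:.2f}")
```

Comparing the fitted switch point (here hard-coded at 5 roots) between cultivars is the kind of contrast the abstract reports for Genghis and Savannah.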

5.
ABSTRACT Using a combination of experimentation and mathematical modeling, the effects of initial (particulate) inoculum density on the dynamics of disease resulting from primary and secondary infection of wheat by the take-all fungus, Gaeumannomyces graminis var. tritici, were tested. A relatively high inoculum density generated a disease progress curve that rose monotonically toward an asymptote. Reducing the initial inoculum density resulted in a curve that initially was monotonic, rising to a plateau, but which increased sigmoidally to an asymptotic level of disease thereafter. Changes in the infectivity of particulate inoculum over time were examined in a separate experiment. Using a model that incorporated terms for primary and secondary infection, inoculum decay, and host growth, we showed that both disease progress curves were consistent with consecutive phases dominated, respectively, by primary and secondary infection. We examined the spread of disease from a low particulate inoculum density on seminal and adventitious root systems separately. Although seminal roots were affected by consecutive phases of primary and secondary infection, adventitious roots were affected only by secondary infection. We showed that the characteristic features of disease progress in controlled experiments were consistent with field data from crops of winter wheat. We concluded that there is an initial phase of primary infection by G. graminis var. tritici on winter wheat as seminal roots grow through the soil and encounter inoculum, but the rate of primary infection slows progressively as inoculum decays. After the initial phase, there is an acceleration in the rate of secondary infection on both seminal and adventitious roots that is stimulated by the increase in the availability of infected tissue as a source of inoculum and the availability of susceptible tissue for infection.

6.
ABSTRACT Epidemiological modeling is used to examine the effect of silthiofam seed treatment on field epidemics of take-all in winter wheat. A simple compartmental model, including terms for primary infection, secondary infection, root production, and decay of inoculum, was fitted to data describing change in the number of diseased and susceptible roots per plant over thermal time obtained from replicated field trials. This produced a composite curve describing change in the proportion of diseased roots over time that increased monotonically to an initial plateau and then increased exponentially thereafter. The shape of this curve was consistent with consecutive phases of primary and secondary infection. The seed treatment reduced the proportion of diseased roots throughout both phases of the epidemic. However, analysis with the model detected a significant reduction in the rate of primary, but not secondary, infection. The potential for silthiofam to affect secondary infection from diseased seminal or adventitious roots was examined in further detail by extending the compartmental model and fitting to change in the number of diseased and susceptible seminal or adventitious roots. Rates of secondary infection from either source of infected roots were not affected. Seed treatment controlled primary infection of seminal roots from particulate inoculum but not secondary infection from either seminal or adventitious roots. The reduction in disease for silthiofam-treated plants observed following the secondary infection phase of the epidemic was not due to long-term activity of the chemical but to the manifestation of disease control early in the epidemic.

7.
The dynamics of wheat spindle streak mosaic bymovirus in winter wheat were studied during two crop cycles in a field site with a history of high virus incidence. Individual plants of two susceptible cultivars were sampled from autumn to spring and the presence of virus antigen in roots and leaves was determined by ELISA. Virus incidence was higher in cv. Frankenmuth than in cv. Augusta. During year one, incidence of viral antigen in roots remained very low for four months after sowing, and did not reach maximum levels until the following spring. During year two, incidence of viral antigen in roots rose to maximum levels in autumn, only three months after sowing. These results strongly suggested that root infection occurred in spring as well as in autumn. In both cultivars and in both years, we detected the virus in roots one month prior to its detection in leaves, suggesting that virus moves slowly from roots into leaves. Maximum incidence of virus in leaves occurred in spring of both years, coinciding with the period of symptom development. Typical symptoms (yellow streaks, spindles, and mosaic) were observed in year two, whereas only mild mosaic was observed in year one. Virus antigen was detected in nonsymptomatic leaves from two months after sowing through crop senescence. Because antigen could be detected in roots throughout the crop cycle, and zoosporangia and cystosori of the fungal vector could be detected one and two months, respectively, after sowing, it is possible that wheat spindle streak mosaic bymovirus is acquired and/or spread by the vector during the majority of the crop cycle.

8.
Two field experiments, in 1999 and 2000, were used to test whether reductions in root growth and function explained the effects of take-all on crop water and nitrogen uptake. The fungicide silthiofam was used to manipulate take-all independently of other factors. Soil water content was manipulated from heading to determine effects on disease progress and resource capture. Epidemic progress was significantly delayed in the presence of silthiofam, leading to reductions in disease in both experiments. Effects of silthiofam were reduced by increasing soil water late in the season, although only in 2000 did increased soil water content have a direct effect, leading to a higher rate of disease increase. Higher levels of disease in the absence of silthiofam did not affect root growth as measured by total root length density (TRLD), but did lead to significantly reduced healthy root length density (HRLD, a measure of functional roots) in both experiments. Only in 2000 were there any significant effects of increased take-all on water and nitrogen uptake. This was attributed to the higher TRLD in the 1999 crop, which allowed HRLDs to be maintained above a critical threshold (where water and nitrogen uptake start to be severely affected) despite loss of functional root to disease. The effects of take-all on nitrogen uptake were more likely to affect crop canopy size and duration than the relatively small effects observed on water uptake. Increasing soil water content allowed the crop to take up more water in absolute terms despite, in 2000, increasing levels of disease and reducing HRLD.

9.
Point pattern analysis (fitting of the beta-binomial distribution and binary form of the power law) was used to describe the spatial pattern of natural take-all epidemics (caused by Gaeumannomyces graminis var. tritici) on a second consecutive crop of winter wheat in plots under different cropping practices that could have an impact on the quantity and spatial distribution of primary inoculum, and on the spread of the disease. The spatial pattern of take-all was aggregated in 48% of the datasets when disease incidence was assessed at the plant level and in 83% when it was assessed at the root level. Clusters of diseased roots were in general less than 1 m in diameter for crown roots and 1–1.5 m for seminal roots; when present, clusters of diseased plants were 2–2.5 m in diameter. Anisotropy of the spatial pattern was detected and could be linked to soil cultivation. Clusters did not increase in size over the cropping season, but increased spatial heterogeneity of the disease level was observed, corresponding to local disease amplification within clusters. The relative influences of autonomous spread and inoculum dispersal on the size and shape of clusters are discussed.
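The binary power law mentioned here relates the observed variance of disease incidence among sampling units to the variance expected under a random (binomial) pattern; aggregation shows up as observed variance exceeding the binomial variance. A minimal sketch, using simulated beta-binomial counts rather than the study's field data (sample sizes and parameters are invented):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20            # roots assessed per sampling unit (hypothetical)
n_units = 100     # sampling units per dataset

def beta_binomial(p, theta, size):
    """Overdispersed counts: binomial with a beta-distributed probability."""
    probs = rng.beta(p / theta, (1.0 - p) / theta, size)
    return rng.binomial(n, probs)

datasets = [beta_binomial(p, 0.2, n_units) for p in (0.1, 0.2, 0.4, 0.6)]

# Binary power law: log(observed variance) against log(binomial variance)
vobs, vbin = [], []
for counts in datasets:
    p_hat = counts.mean() / n
    vobs.append(np.var(counts / n, ddof=1))
    vbin.append(p_hat * (1.0 - p_hat) / n)

slope, log_a = np.polyfit(np.log(vbin), np.log(vobs), 1)
print(f"slope = {slope:.2f}, intercept log(A) = {log_a:.2f} "
      "(points above the binomial line indicate aggregation)")
```

In the study's terms, aggregated root-level incidence corresponds to observed variances lying above the binomial line across incidence levels.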

10.
Various grass species susceptible to infection by Gaeumannomyces graminis var. tritici were mixed-sown into a legume crop in order to assess their influence on density of inoculum and take-all disease in a subsequent crop of wheat.
In a pot experiment take-all inoculum increased (P < 0.001) in all treatments containing a proportion (from 20 to 100%, in increments of 20%) of grass in subterranean clover. In a plot trial, most severe take-all occurred in the 20% legume/80% grass stands and least in the 100% legume and 80% legume/20% grass stands. Total grain weight was highest (P < 0.1) after the 100% legume stands. There was no difference in severity of take-all after pure stands of medic, subterranean clover and lupin, but there was more severe take-all after the grass-infested medic stands than after those of subterranean clover (P < 0.1) or lupin (P < 0.05). No significant differences (P > 0.1) in yield occurred in wheat following any of the legumes or mixed stands.

11.
The importance of the spatial aspect of epidemics has been recognized from the outset of plant disease epidemiology. The objective of this study was to determine if the host spatial structure influenced the spatio-temporal development of take-all disease of wheat, depending on the inoculum spatial structure. Three sowing patterns of wheat (broadcast sowing, line sowing and sowing in hills) and three patterns of inoculum (uniform, aggregated and natural infestation) were tested in a field experiment, repeated over 2 years. Disease (severity, root disease incidence, plant disease incidence and, when applicable, line and hill incidences) was assessed seven times during the course of each season and the spatial pattern was characterized with incidence-incidence relationships. In the naturally infested plots, disease levels at all measurement scales were significantly higher in plots sown in hills, compared to plots sown in lines, which were in turn significantly more diseased than plots with broadcast sowing. Disease aggregation within roots and plants was stronger in line and hill sowing than in broadcast sowing. Analysis of the disease gradient in the artificially infested plots showed that the disease intensified (local increase of disease level) more than it extensified (spatial spread of the disease); the effect of the introduced inoculum was reduced by 95% at a distance of 15 cm from the point of infestation. Yield was not significantly affected by sowing pattern or artificial infestation.

12.
Relationships between take-all intensity and grain yield and quality were determined in field experiments on cereal crops using regression analyses, usually based on single-point disease assessments made during anthesis or grain-filling. Different amounts of take-all were achieved by different methods of applying inoculum artificially (to wheat only) or by using different cropping sequences (in wheat, triticale or barley) or sowing dates (wheat only) in crops with natural inoculum. Regressions of yield or thousand-grain weight on take-all intensity during grain filling were similar to those on accumulated disease (area under the disease progress curve) when these were compared in one of the wheat experiments. Regressions of yield on take-all intensity were more often significant in wheat than in the less susceptible crops, triticale and barley, even when a wide range of disease intensities was present in the latter crops. The regressions usually had most significance when there were plots in the severe disease category. Thousand-grain weight and hectolitre weight usually responded similarly to total grain yield. Decreased yield was often accompanied by a significant increase in the percentage of small grains. When severe take-all was present in wheat, regressions showed that nitrogen uptake was usually impaired. This was sometimes accompanied, however, by increased percentage nitrogen in the grain as a consequence of smaller grain size with decreased endosperm. Significant effects of take-all, both positive and negative, on Hagberg falling number in wheat sometimes occurred. Significant regressions of yield on take-all assessed earlier than usual, i.e. during booting rather than grain-filling in wheat and triticale and during anthesis/grain-filling rather than ripening in barley, had steeper slopes.
This is consistent with observations that severe disease that develops early can be particularly damaging, whilst the crops, especially barley, can later express tolerance by producing additional, healthy roots. The regression parameters, including maximum potential yield (y-axis intercept) and the extrapolated maximum yield loss, also varied according to the different growing conditions, including experimental treatments and other husbandry operations. These differences must be considered when assessing the economic potential of a control measure such as fungicidal seed treatment.
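The accumulated-disease predictor used in this kind of analysis, the area under the disease progress curve (AUDPC), is conventionally computed by the trapezoidal rule, and yield can then be regressed on it. The assessment dates, severities and yields below are invented for illustration, not data from the experiments:

```python
import numpy as np

def audpc(times, severities):
    """Area under the disease progress curve (trapezoidal rule)."""
    t = np.asarray(times, dtype=float)
    y = np.asarray(severities, dtype=float)
    return float(np.sum((y[1:] + y[:-1]) * np.diff(t)) / 2.0)

# Invented single-season assessments (severity as a proportion) and yields
assess_days = [30, 60, 90, 120]
plots = {
    "light":  ([0.02, 0.05, 0.10, 0.15], 8.9),  # severities, yield (t/ha)
    "medium": ([0.05, 0.15, 0.30, 0.45], 7.8),
    "severe": ([0.10, 0.35, 0.60, 0.80], 6.1),
}

x = [audpc(assess_days, sev) for sev, _ in plots.values()]
yld = [y for _, y in plots.values()]
slope, intercept = np.polyfit(x, yld, 1)
print(f"fitted relationship: yield = {intercept:.2f} {slope:+.3f} * AUDPC")
```

The y-axis intercept of such a regression plays the role of the "maximum potential yield" parameter discussed in the abstract.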

13.
ABSTRACT Epidemiological modeling, together with parameter estimation from experimental data, was used to examine the contribution of disease-induced root growth to the spread of take-all in wheat. Production of roots from plants grown in the absence of disease was compared with production of those grown in the presence of disease, and the precise form of disease-induced growth was examined by fitting a mechanistic model to data describing change in the number of infected and susceptible roots over time from a low and a high density of inoculum. During the early phase of the epidemic, diseased plants produced more roots than their noninfected counterparts. However, as the epidemic progressed, the rate of root production for infected plants slowed so that by the end of the epidemic, and depending on inoculum density, infected plants had fewer roots than uninfected plants. The dynamical change in the numbers of infected and susceptible roots over time could only be explained by the mechanistic model when allowance was made for disease-induced root growth. Analysis of the effect of disease-induced root production on the spread of disease using the model suggests that additional roots produced early in the epidemic serve only to reduce the proportion of diseased roots. However, as the epidemic switches from primary to secondary infection, these roots perform an active role in the transmission of disease. Some consequences of disease-induced root growth for field epidemics are discussed.

14.
Indoor seedling-stage inoculation methods and evaluation indices for assessing resistance of wheat crops to take-all
Biostatistical methods were used to develop indoor seedling-stage methods and indices for evaluating the resistance of wheat crops to take-all. Both the mycelial-disc and the maize-sand inoculation methods proved usable for resistance assessment, but the mycelial-disc method was preferable: it is simple to perform, delivers a uniform amount of inoculum, and facilitates comparisons among cultivars. Path analysis, correlation analysis and principal component analysis were applied to 12 disease indices recorded 30 days after inoculation by the two methods. Of the 12 indices, only one, the length of cortical browning, entered the principal components, indicating that cortical browning length reflects differences in cultivar resistance; this index can be used for large-scale indoor screening of seedling-stage germplasm for take-all resistance.

15.
The impact of cultivar resistance and inoculum density on the incidence of primary infection of canola root hairs by Plasmodiophora brassicae, the causal agent of clubroot, was assessed by microscopy. The incidence of root hair infection in both a resistant and a susceptible cultivar increased with increasing inoculum density, but was two- to threefold higher in the susceptible cultivar; the relationship between root hair infection and inoculum density was also substantially stronger and more consistent in the susceptible cultivar. In the susceptible cultivar, the root hair infection rate peaked between 6 and 8 days after sowing and then declined. In the resistant cultivar, it increased over the 14-day duration of each study. It appears that examination of root hair infection by microscopy in a bait crop of susceptible canola could serve as a useful tool for estimating P. brassicae inoculum levels in soil. In a separate trial, the relationship between inoculum density and clubroot severity, plant growth parameters, and seed yield was assessed under greenhouse conditions. Inoculum density in the susceptible genotype was strongly and positively correlated with clubroot severity and negatively correlated with plant height and seed yield. In addition, a single cropping cycle of the susceptible cultivar contributed significantly higher levels of resting spores to the soil in a greenhouse test than did a cycle of the resistant cultivar, as assessed by quantitative PCR and microscope analysis.
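The reported two- to threefold difference in root-hair infection between cultivars can be pictured as a difference in per-spore infection efficiency under a simple monomolecular dose-response model. This model form, the density levels and the incidence values below are all assumptions for illustration; the study itself reports incidence by microscopy, not a fitted model.

```python
import numpy as np
from scipy.optimize import curve_fit

def root_hair_incidence(density, k):
    """Monomolecular dose response: spores assumed to act independently,
    with per-spore infection efficiency k (an assumed model form)."""
    return 1.0 - np.exp(-k * density)

densities = np.array([1e3, 1e4, 1e5, 1e6, 1e7])  # resting spores/g soil (invented)
obs_susceptible = np.array([0.02, 0.15, 0.55, 0.92, 1.00])
obs_resistant = np.array([0.01, 0.05, 0.22, 0.60, 0.88])

(k_s,), _ = curve_fit(root_hair_incidence, densities, obs_susceptible,
                      p0=[1e-5], bounds=(0, np.inf))
(k_r,), _ = curve_fit(root_hair_incidence, densities, obs_resistant,
                      p0=[1e-5], bounds=(0, np.inf))
print(f"infection efficiency, susceptible vs resistant: {k_s:.2e} vs {k_r:.2e}")
```

Under this sketch, the bait-crop idea in the abstract amounts to inverting the susceptible-cultivar curve to estimate soil inoculum density from observed incidence.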

16.
ABSTRACT The effects of take-all epidemics on winter wheat yield formation were determined, and disease-yield relationships were established to assess the agronomic efficacy and economic benefits of control methods. Epidemics were generated in naturally infested fields by varying cropping season, crop order in the rotation, and experimental fungicide seed treatment. Disease incidence and severity were assessed from tillering to flowering. Yield components were measured at harvest. Models simulating the formation of the yield components in the absence of limiting factors were used to estimate the losses caused by take-all. Losses were predicted by the disease level at a specific time or the area under the disease progress curve, reflecting accumulation during a specific period. Losses of grain number per square meter and 1,000-grain weight were linked to cumulative disease incidence between the beginning of stem elongation and flowering, and disease incidence at midstem elongation, respectively. Yield losses were accounted for by both cumulative disease incidence between sowing and flowering, and disease incidence at midstem elongation. Results confirm the importance of nitrogen fertilization in reducing the impact of take-all on wheat.

17.
This study reviews 52 field experiments, mostly from the UK, studying the effects of cultivation techniques, sowing date, crop density and cultivar choice on Alopecurus myosuroides infestations in cereal crops. Where possible, a statistical meta-analysis has been used to calculate average responses to the various cultural practices and to estimate their variability. In 25 experiments, mouldboard ploughing prior to sowing winter cereals reduced A. myosuroides populations by an average of 69%, compared with non-inversion tillage. Delaying drilling from September to the end of October decreased weed plant densities by approximately 50%. Sowing wheat in spring achieved an 88% reduction in A. myosuroides plant densities compared with autumn sowing. Increasing winter wheat crop density above 100 plants m⁻² had no effect on weed plant numbers, but reduced the number of heads m⁻² by 15% for every additional 100 crop plants, up to the highest density tested (350 wheat plants m⁻²). Choosing more competitive cultivars could decrease A. myosuroides heads m⁻² by 22%. With all cultural practices, outcomes were highly variable and effects inconsistent. Farmers would be more likely to adopt cultural measures, and so reduce their reliance on herbicides, if there were better predictions of likely outcomes at the individual field level.

18.
Effects of crop management patterns on coffee rust epidemics
The effects of crop management patterns on coffee rust epidemics, caused by Hemileia vastatrix, are not well documented despite large amounts of data acquired in the field on epidemics, and much modelling work done on this disease. One main reason for this gap between epidemiological knowledge and understanding for management resides in the lack of links between many studies and actual production situations in the field. Coffee rust epidemics are based on a seemingly simple infection cycle, but develop polycyclic epidemics in a season and polyetic epidemics over successive seasons. These higher-level processes involve a very large number of environmental variables and, as in any system involving a perennial crop, the physiology of the coffee crop and how it affects crop yield. Crop management is therefore expected to have large effects on coffee rust epidemics because of its immediate effect on the infection cycle, but also because of its cumulative effect on ongoing and successive epidemics. Quantitative examples taken from a survey conducted in Honduras illustrate how crop management (different combinations of shade, coffee tree density, fertilization and pruning) may strongly influence coffee rust epidemics through effects on microclimate and plant physiology which, in turn, influence the life cycle of the fungus. We suggest there is a need for novel coffee rust management systems which fully integrate crop management patterns in order to manage the disease in a sustainable way.

19.
Effects of ploughing or direct drilling with three methods of straw disposal on amounts of inoculum of Pyrenophora teres, and on frequency of infection and severity of net blotch in the autumn, were studied in winter barley. Prior to ploughing, many conidia of P. teres were caught above areas where infected straw from a previous crop of winter barley had been baled and removed leaving culm bases, or where barley straw had been chopped and left in situ, but relatively few were caught above areas where straw had been burnt. Thereafter, where ploughing had buried surface residues, irrespective of the method of straw disposal, conidia were not caught for at least 3 weeks, and subsequently were substantially fewer than in direct-drilled areas where many spores were caught. Production of conidia (measured as numbers per unit length of straw) was greatest on chopped straw, less on culm bases and least on burnt straw residues. Sporulation on volunteer barley plants was much reduced by application of paraquat + diquat, but some still occurred on visually 'dead' volunteer barley.
All direct-drilled barley plants were diseased within 27 days of sowing, whereas 42 days elapsed before all plants sown in ploughed areas were diseased. Disease on individual plants was also more severe in direct-drilled areas: 20% of the area of the first leaf to emerge was diseased 19 days after crop emergence in direct-drilled plots, whereas less than 9% was diseased in ploughed areas 50 days after emergence.
There was an additive effect of straw disposal methods and direct-drilling on disease, which in turn affected plant vigour. The adverse effect of direct-drilling on the incidence and severity of net blotch appeared to be far greater than that of the straw disposal methods.

20.
A 2-year survey of soils from a total of 208 fields, using sugar-beet seedlings as bait plants, showed Polymyxa betae to be ubiquitous in the sugar-beet-growing areas of Britain. It was detected in some soils which had not grown a host crop for up to 17 years. However, infection of roots was relatively infrequent in plant samples taken from 134 survey fields in early summer and the density of colonization always low. Three other non-mycelial fungi, Olpidium brassicae, Lagena radicicola and Rhizophydium graminis, were also common parasites of sugar-beet roots detected in soil bioassays. Infection of plant samples by O. brassicae was particularly severe.


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号