Similar Articles
20 similar articles were retrieved.
1.
The Sorraia Watershed has a long history of continuous irrigated maize. Imprecise water and fertiliser management has contributed to increased nitrate concentrations in the groundwater. Solving this problem requires the identification of problem sources and the definition of alternative management practices. This can be done through an interactive use of selective experimentation and modelling. This paper presents the experimentation phase, in which field experiments were conducted under the irrigation and fertilisation management commonly found in the watershed. Two soils representative of the watershed, with different water and solute transport properties, were selected. One is a silty loam alluvial soil with a shallow water table, and the other is a sandy soil with a very low water retention capacity. The various terms of the water balance (consumption, drainage, soil storage) and nitrogen balance (plant uptake, mineralisation and leaching) were obtained from intensive monitoring of the soil profile down to 80 cm, corresponding to the crop root zone. The results showed that in the alluvial soil, up to 70 kg N ha−1 was produced by mineralisation. Current fertiliser management fails in that it does not consider the soil's capability to supply mineral nitrogen from the organic nitrogen stored in the profile at planting. This leaves a considerable amount of NO3-N stored in the soil at harvest, which is leached during the winter rainy season. In the sandy soil, poor irrigation management (45% losses by deep percolation) leads to NO3-N leaching during the crop season and to inefficient nitrogen use by the crop.
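Illustrative note: the root-zone nitrogen balance used in studies of this kind can be read as residual N = inputs − outputs. The sketch below is a hypothetical Python illustration of that bookkeeping; all variable names and the numbers in the example are placeholders, not the paper's data.

    # Hypothetical root-zone N balance (kg N/ha); values are illustrative only.
    def residual_soil_n(fertiliser_n, mineralised_n, plant_uptake_n, leached_n, initial_soil_n=0.0):
        """Mineral N left in the root zone at harvest, from a simple mass balance."""
        inputs = fertiliser_n + mineralised_n + initial_soil_n
        outputs = plant_uptake_n + leached_n
        return inputs - outputs

    # Example: heavy fertilisation plus ~70 kg N/ha mineralisation (as reported for the alluvial soil),
    # with placeholder uptake and in-season leaching figures.
    print(residual_soil_n(fertiliser_n=250, mineralised_n=70, plant_uptake_n=220, leached_n=30))
    # -> 70 kg N/ha left in the profile and exposed to winter leaching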

2.
The irrigated dairy industry in Australia depends on pasture as a low-cost source of fodder for milk production. The industry is under increasing pressure to use limited water resources more efficiently. Pasture is commonly irrigated using border-check, but there is growing interest amongst dairy irrigators in exploring the potential for overhead sprinklers to save water and/or increase productivity. This paper reports on a detailed water balance study that evaluated the effectiveness of centre pivot irrigation for pasture production. The study was conducted between 2004/2005 and 2005/2006 on a commercial dairy farm in the Shepparton Irrigation Region in northern Victoria. More than 90% of supplied water (irrigation plus rainfall) was utilized for pasture growth. Deep drainage of 90 and 93 mm was recorded for the two observation seasons, respectively. During the 2004/2005 season, deep drainage resulted from large unseasonal summer rainfall events. Over the 2005/2006 season, deep drainage resulted from excess irrigation. The cumulative pasture dry matter (DM) production was 15.5 and 11.3 tonnes DM ha−1 for the two irrigation seasons, with agronomic water use efficiencies (WUE) of 16 and 12 kg DM ha−1 mm−1, respectively. The farmer's intuitive irrigation scheduling was found to be very effective; the pattern of irrigation application closely matched measured pasture water use, prevented water stress and resulted in high irrigation efficiency.
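Illustrative note: agronomic water use efficiency as used here is dry matter produced per unit of supplied water (irrigation plus rainfall). A minimal sketch of the calculation; the water depths below are back-calculated from the reported yields and WUE values and are therefore only approximate.

    def agronomic_wue(dry_matter_t_ha, supplied_water_mm):
        """kg DM per ha per mm of supplied water (irrigation + rainfall)."""
        return dry_matter_t_ha * 1000.0 / supplied_water_mm

    # 15.5 t DM/ha at ~16 kg DM/ha/mm implies roughly 970 mm of supplied water.
    print(round(agronomic_wue(15.5, 969)))   # ≈ 16
    print(round(agronomic_wue(11.3, 942)))   # ≈ 12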

3.
Groundwater is being mined in much of the irrigated area of the central and southern High Plains of the USA. Profits and risks inherent in irrigation management depend on the association between crop yield and level of water application. Research was conducted over a 14-year period (1974–1987) to establish the yield vs. water application relationships of corn, grain sorghum, and sunflower. The research was located near Tribune, Kansas, USA on a Ulysses silt loam soil. Plots were level basins to which water was added individually through gated pipe. Irrigation studies of the three crops were located adjacent to each other. Irrigation treatments were arranged in completely randomized blocks with three replications. As the total irrigation amount increased from 100 to 200, 200 to 300, and 300 to 400 mm, sunflower yield increased by 0.53 Mg ha−1, 0.43 Mg ha−1, and 0.37 Mg ha−1, respectively. Corn outyielded grain sorghum at total irrigation amounts of 345 mm and above. The yield increase over continuous dryland was greater in corn than in grain sorghum at total irrigation amounts above 206 mm. Therefore, if grain mass is the consideration, grain sorghum is a better choice than corn at less than 206 mm of irrigation, whereas corn is a better choice than grain sorghum at more than 206 mm of irrigation.
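Illustrative note: the reported sunflower yield gains can be read as a marginal response per mm of additional irrigation, which declines as water application increases. A small sketch using the figures quoted above (the calculation itself is generic):

    # Sunflower yield increase (Mg/ha) per extra 100 mm of irrigation, from the values above.
    increments_mm = [(100, 200), (200, 300), (300, 400)]
    yield_gain_mg_ha = [0.53, 0.43, 0.37]

    for (lo, hi), gain in zip(increments_mm, yield_gain_mg_ha):
        marginal = gain / (hi - lo) * 1000  # kg of grain per ha per mm within this interval
        print(f"{lo}-{hi} mm: {marginal:.1f} kg ha-1 mm-1")
    # Output: 5.3, 4.3 and 3.7 kg/ha/mm, i.e. a diminishing marginal return to water.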

4.
Tolerating deficit irrigation while maintaining acceptable yield is a useful trait for sunflower production wherever irrigation water is limited. A 2-year experiment (2003–2004) was conducted at Tal Amara Research Station in the Bekaa Valley of Lebanon to investigate the sunflower response to deficit irrigation. Irrigation was withheld from early flowering (stage F1), from mid flowering (stage F3.2) or from early seed formation (stage M0) until physiological maturity; these deficit-irrigated treatments were referred to as WS1, WS2 and WS3, respectively, and were compared to a well-irrigated control (C). Reference evapotranspiration (ETrye-grass) and crop evapotranspiration (ETcrop) were each measured in a set of two drainage lysimeters of 2 m × 2 m × 1 m size cultivated with rye grass (Lolium perenne) and sunflower (Helianthus annuus L., cv. Arena), respectively. Crop coefficients (Kc) for the different crop growth stages were derived as the ratio ETcrop/ETrye-grass.

Lysimeter-measured crop evapotranspiration (ETcrop) totaled 765 mm in 2003 and 882 mm in 2004 for total irrigation periods of 139 and 131 days, respectively. Daily ETcrop reached a peak value of 13.0 mm day−1 at flowering (stage F3.2; 80–90 days after sowing), when LAI was >6.0 m2 m−2. ETcrop then declined to 6.0 mm day−1 during the seed maturity phase. Average Kc values varied from 0.3 at crop establishment (sowing to four-leaf stage), to 0.9 at late crop development (four-leaf stage to terminal bud), to >1.0 at the flowering stage (terminal bud to inflorescence visible), and then to values <1.0 during the seed maturity phase (head pale to physiological maturity). Measured Kc values were close to those reported by the FAO.

Averaged across years, seed yield (dry basis) in the well-irrigated treatment was 5.36 t ha−1. Deficit irrigation at early (WS1) and mid (WS2) flowering reduced seed yield by 25% and 14% (P < 0.05), respectively, in comparison with the control. However, deficit irrigation at early seed formation slightly increased seed yield in the WS3 treatment (5.50 t ha−1). We concluded that deficit irrigation at early seed formation (stage M0) increased the fraction of assimilate allocated to the head, thus compensating for the lower number of seeds per m2 through increased seed weight. In this experiment, while deficit irrigation did not result in any remarkable increase in harvest index (HI), water use efficiency (WUE) varied significantly (P < 0.05) among treatments; the highest (0.83 kg m−3) and lowest (0.71 kg m−3) values were obtained from the WS3 and WS1 treatments, respectively. Finally, the results indicate that limiting irrigation at early flowering (stage F1) and mid flowering (stage F3.2) should be avoided, whereas it can be acceptable at early seed formation (stage M0).
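Illustrative note: the two quantities reported here are simple ratios, Kc = ETcrop/ETrye-grass and WUE = seed yield per unit of water used. A minimal sketch; the daily ET values are placeholders and the seasonal water use in the WUE example is back-calculated from the reported 0.83 kg m−3, so it is only approximate.

    def crop_coefficient(et_crop_mm, et_ref_mm):
        """Kc as the ratio of lysimeter crop ET to reference (rye-grass) ET."""
        return et_crop_mm / et_ref_mm

    def wue_kg_m3(seed_yield_t_ha, water_used_mm):
        """Seed yield per unit of water: t/ha over mm of water equals yield * 100 / mm in kg/m3."""
        return seed_yield_t_ha * 100.0 / water_used_mm

    print(round(crop_coefficient(6.5, 6.0), 2))   # placeholder daily values -> Kc ≈ 1.08
    print(round(wue_kg_m3(5.50, 660), 2))         # WS3: ≈ 0.83 kg/m3 for ~660 mm of water (back-calculated)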


5.
A variety of technologies for reducing residential irrigation water use are available to homeowners. These “Smart Irrigation” technologies include evapotranspiration (ET)-based controllers and soil moisture sensor (SMS) controllers. The purpose of this research was to evaluate the effectiveness of these technologies, along with rain sensors, based on irrigation applied and turfgrass quality measurements on St. Augustinegrass (Stenotaphrum secundatum (Walter) Kuntze). Testing was performed on two types of SMS controllers (LawnLogic LL1004 and Acclima Digital TDT RS500) at three soil moisture threshold settings. Mini-Clik rain sensors (RS) comprised six treatments at two rainfall thresholds (3 mm and 6 mm) and three irrigation frequencies (1, 2, and 7 d/wk). Two ET controllers were also tested, the Toro Intelli-Sense controller and the Rain Bird ET Manager. A time-based treatment irrigating 2 days per week without any sensor to bypass irrigation (WOS) was established as a comparison. All irrigation controller programming represented settings that might be used in residential/commercial landscapes. Even though three of the four testing periods were relatively dry, all of the technologies tested reduced water application compared to the WOS treatment, with most treatments also producing acceptable turf quality. Reductions in irrigation applied were as follows: 7–30% for RS-based treatments, 0–74% for SMS-based treatments, and 25–62% for ET-based treatments. The SMS treatments at low threshold settings resulted in high water savings, but reduced turf quality to unacceptable levels. The medium threshold setting (approximately field capacity) SMS-based treatment produced good turfgrass quality while reducing irrigation water use compared to WOS by 11–53%. ET controllers with comparable settings and good turf quality had −20% to 59% savings. Reducing the irrigation schedule by 40% and using a rain sensor (treatment DWRS) produced water savings between 36% and 53%, similar to the smart controllers. Proper installation and programming of each technology was an essential element in balancing water conservation and acceptable turf quality. Water savings with the SMS controllers could have been increased with a reduced time-based irrigation schedule. Efficiency settings of 100% (DWRS) and 95% (TORO) did not reduce turf quality below acceptable limits and resulted in substantial irrigation savings, indicating that efficiency values need not be low in well designed and maintained irrigation systems. For most conditions in Florida, the DWRS schedule (60% of the schedule used for the SMS treatments) can be used with either rain sensors or soil moisture sensors in bypass control mode as long as the irrigation system has good coverage and is in good repair.
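Illustrative note: the savings percentages quoted are relative to the time-based WOS comparison, i.e. savings = (WOS application − treatment application) / WOS application. A minimal sketch of that comparison; the depths used are illustrative placeholders, not measured values.

    def water_savings_pct(treatment_mm, reference_mm):
        """Percent reduction in irrigation applied relative to the WOS reference."""
        return 100.0 * (reference_mm - treatment_mm) / reference_mm

    # Illustrative only: a treatment applying 400 mm against a WOS reference of 800 mm saves 50%.
    print(water_savings_pct(400, 800))   # 50.0
    print(water_savings_pct(960, 800))   # -20.0: negative "savings" means more water applied than WOS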

6.
Field experiments were carried out over a 2-year period on a loamy soil plot under corn in Montpellier (south-east France). The effectiveness of improved irrigation practices in reducing the adverse impact of irrigation on the environment was assessed. Different irrigation and fertiliser treatments were applied to identify the best irrigation and fertilisation strategy for each technique (furrow and sprinkler) to ensure both good yields and lower NO3− leaching. No significant differences in corn yield and NO3− leaching were found for the climatic scenario of 1999 between sprinkler and furrow irrigation during the irrigation season. Following the rainy events occurring after plant maturity (and the irrigation season), differences in N leaching were observed between the treatments. The study shows that both the fertilisation method, which consists of applying fertiliser just before ridging the furrows, and the two-dimensional (2D) infiltration process greatly influence the N distribution in the soil. This N distribution seems to have a beneficial impact on both yield and N leaching under heavy irrigation rates during the cropping season. However, under rainy events (particularly those occurring after harvest), the N stored in the upper part of the ridge and not previously taken up by plants can be released into the deeper soil layers of a furrow-irrigated plot. In contrast, the 1D infiltration process occurring during sprinkler irrigation events affects the entire soil surface in the same way. As a result, the same irrigation rate would probably increase N leaching under sprinkler irrigation to a greater extent than under furrow irrigation during an irrigation period. To assess the robustness of these interpretations derived from soil N-profile analysis, a modelling approach was used to test the irrigation and fertilisation strategies under heavy irrigation rates such as those occurring at the downstream end of closed-end furrows. The RAIEOPT and STICS models were used to simulate water application depths, crop yield and NO3− leaching at three measurement sites located along the central furrow of each treatment. The use of a 2D water- and solute-transport model such as HYDRUS-2D enabled us to strengthen the conclusions derived from the observations of the N distribution across a furrow cross-section. This model helped to illustrate the risk of over-estimating N leaching when using a simplified 1D solute-transport model such as STICS.

7.
Agricultural Systems, 2003, 76(1): 159–180
The long-term effects of nitrogen (N) fertiliser and slurry management practices in agricultural systems have been simulated using event-driven, physically based models. The Swedish soil water model SOIL and its associated nitrogen cycle model SOILN were used to simulate the long-term impacts (over 12 years) of 360 management scenarios: three slurry applications with 10 spreading dates (involving single and split applications) for surface spreading and injection of slurry, and three fertiliser applications with two spreading dates. The effects of the N management scenarios on NO3–N drainage flows, total gaseous N losses and crop yields for grass, winter and spring cereals are investigated. Furthermore, seven soils with varying degrees of drainage efficiency and three climatic conditions (East and West coast Scotland and Southern Ireland) are studied. The aim of this work is to produce N-budget tables for an expert agricultural decision system (ADS) which deals specifically with N best management practices for fertiliser and slurry applications. Simulations conducted in this study were based on input parameters calibrated for specific sites in previous studies of hydrology and NO3–N transport to subsurface drains with associated crop growth. The results of this study show that increasing rates of N application (in the form of slurry and fertiliser) resulted in a non-linear increase in both the N leached through subsurface drains and the N harvest yield. Surface spreading and injection of slurry gave similar trends. The most important decision about slurry spreading concerns the selection of the spreading date and of fields which are likely to produce only moderate leaching effects. Application of slurry in autumn (as a single or split loading) invariably leads to large losses through N leaching, with a single application always resulting in the highest loss. Significant differences in N leaching are evident among the seven soil types. Climatic variation, as exemplified in the three meteorological data sets, produces noticeable and significant differences in both N leached and harvest crop totals. This study also aims to show that a field environmental risk assessment (ERA) can be carried out using a physically based model such as SOILN, so that strategic agronomic decisions involving N management can be made. In practice this is so provided that a farm manager can recognise and match the actual soil type and drainage condition of the fields on which spreading is to occur with the simulated field types within a similar climate region.
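Illustrative note: scenario studies of this kind enumerate every combination of management, soil and climate before running the soil–N model. A small sketch of that enumeration; the factor levels are taken from the abstract, but the labels are illustrative and are not SOIL/SOILN inputs.

    from itertools import product

    slurry_rates   = ["low", "medium", "high"]      # three slurry applications
    spread_dates   = list(range(1, 11))             # ten spreading dates (single or split)
    spread_methods = ["surface", "injection"]

    slurry_scenarios = list(product(slurry_rates, spread_dates, spread_methods))
    print(len(slurry_scenarios))   # 3 * 10 * 2 = 60 slurry-management combinations;
    # crossing these with the fertiliser options, the seven soils and the three climates
    # multiplies the number of model runs further.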

8.
Agricultural systems with grazing animals are increasingly under scrutiny for their contribution to the quality degradation of waterways and water bodies. Soil type, climate, animal type and nitrogen (N) fertilisation all contribute to the variation in N that is leached through the soil profile into ground and surface water. It is difficult to explore the effect of these factors using experimentation alone, and modelling is proposed as an alternative. An agro-ecosystem model, EcoMod, was used to quantify pastoral ecosystem responses to situational variability in climate and soil, choice of animal type and N fertilisation level within the Lake Taupo region of New Zealand. Factorial combinations of soil type (Oruanui and Waipahihi), climate (low, moderate and high rainfall), animal type (sheep, beef and dairy) and N fertilisation level (0 or 60 kg N/ha/yr) were simulated. High rainfall climates also had colder temperatures, grew less pasture and carried fewer animals overall, which led to less dung and urinary N being returned. Therefore, even though a higher proportion of the returned N ultimately leached at the higher rainfall sites, the total N leached did not differ greatly between sites. Weather variation between years had a marked influence on N leaching within a site, due to the timing and magnitude of rainfall events. In this region, for these two highly permeable soil types, N applied as fertiliser had a high propensity to leach, either after being taken up by plants, grazed and returned to the soil as dung and urine, or through direct flow through the soil profile. Soil type had a considerable effect on N leaching risk, the timing of N leaching and mean pasture production. Nitrogen leaching was greatest from beef cattle, followed by dairy cattle and sheep, with the level of leaching related to the urine deposition patterns of each animal type and to the amount of N returned to the soil as excreta. Simulation results indicate that sheep farming systems with limited fertiliser N inputs will reduce N leaching from farms in the Lake Taupo catchment.

9.
Irrigation scheduling performance by evapotranspiration-based controllers
Evapotranspiration-based irrigation controllers, also known as ET controllers, use ET information or estimation to schedule irrigation. Previous research has shown that ET controllers could reduce irrigation as much as 42% when compared to a time-based irrigation schedule. The objective of this study was to determine the capability of three brands of ET-based irrigation controllers to schedule irrigation compared to a theoretically derived soil water balance model based on the Irrigation Association Smart Water Application Technologies (SWAT) protocol to determine the effectiveness of irrigation scheduling. Five treatments were established, T1-T5, replicated four times for a total of twenty field plots in a completely randomized block design. The irrigation treatments were as follows: T1, Weathermatic SL1600 with SLW15 weather monitor; T2, Toro Intelli-sense; T3, ETwater Smart Controller 100; T4, a time-based treatment determined by local recommendations; and T5, a reduced time-based treatment 60% of T4. All treatments utilized rain sensors set at a 6 mm threshold. A daily soil water balance model was used to calculate the theoretical irrigation requirements for comparison with actual irrigation water applied. Calculated in 30-day running totals, irrigation adequacy and scheduling efficiency were used to quantify under- and over-irrigation, respectively. The study period, 25 May 2006 through 27 November 2007, was drier than the historical average with a total of 1326 mm of rainfall compared to 1979 mm for the same historical period. It was found that all treatments applied less irrigation than required for all seasons. Additionally, the ET controllers applied only half of the irrigation calculated for the theoretical requirement for each irrigation event, on average. Irrigation adequacy decreased when the ET controllers were allowed to irrigate any day of the week. All treatments had decreased scheduling efficiency averages in the rainy season with the largest decrease of 29 percentile points with a timer and rain sensor (T4) and an average decrease of 20 percentile points for the ET controllers, indicating that site specific rainfall has a significant effect on scheduling efficiency results. Rainfall did not drastically impact the average irrigation adequacy results. For this study, there were two controller program settings that impacted the results. The first setting was the crop coefficients where specific values were chosen for the location of the study when calculating the theoretical requirement whereas the controllers used default values. The second setting was the soil type that defines the soil water holding capacity of the soil. The ET controllers were able to regularly adjust to real-time weather, unlike the conventional irrigation timers. However, the incorporation of site specific rainfall measurements is extremely important to their success at managing landscape water needs and at a minimum a rain sensor should be used.  相似文献   
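Illustrative note: the comparison rests on a daily soil water balance from which 30-day running totals of adequacy (under-irrigation) and scheduling efficiency (over-irrigation) are computed. The sketch below is a generic, hypothetical reading of that idea, not the Irrigation Association SWAT protocol itself; the simple adequacy definition and the daily series are assumptions.

    # Hypothetical 30-day adequacy calculation; NOT the IA SWAT protocol implementation.
    def running_totals(daily_mm, window=30):
        """Running-window sums, as used for the 30-day adequacy/efficiency ratings."""
        return [sum(daily_mm[max(0, i - window + 1): i + 1]) for i in range(len(daily_mm))]

    def adequacy(applied_mm, required_mm):
        """Fraction of the theoretical requirement actually supplied (capped at 1.0)."""
        return min(applied_mm / required_mm, 1.0) if required_mm > 0 else 1.0

    # Illustrative daily series (mm); net requirement approximated as max(ETc - rain, 0).
    etc  = [5, 5, 4, 6, 5]
    rain = [0, 2, 0, 0, 10]
    need = [max(e - r, 0) for e, r in zip(etc, rain)]
    irr  = [4, 0, 4, 4, 0]
    print(round(adequacy(running_totals(irr, 5)[-1], running_totals(need, 5)[-1]), 2))  # 12/18 ≈ 0.67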

10.
Considerable NO3 contamination of underlying aquifers is associated with greenhouse-based vegetable production in south-eastern Spain, where 80% of cropping occurs in soil. To identify management factors likely to contribute to NO3 leaching from soil-based cropping, a survey of irrigation and N management practices was conducted in 53 commercial greenhouses. For each greenhouse: (i) a questionnaire on general irrigation and N management practices was completed, and (ii) amounts of N applied in manure were estimated; and for one crop in each greenhouse: (a) irrigation volume was compared with ETc calculated using a mathematical model, and (b) the total amount of applied fertiliser N was compared with crop N uptake. Total irrigation during the first 6 weeks after transplanting/sowing was generally excessive, being >150 and >200% of modelled ETc in, respectively, 68 and 60% of greenhouses. During the subsequent period, applied irrigation was generally similar to modelled ETc, with only 12% of greenhouses applying >150% of modelled ETc. Large irrigations prior to transplanting/sowing were applied in 92% of greenhouses to leach salts and moisten soil. Volumes applied were >20 and >40 mm in, respectively, 69 and 42% of greenhouses. Chemical soil disinfectants had been recently applied in 43% of greenhouses; associated irrigation volumes were >20 and >40 mm in, respectively, 78 and 48% of the greenhouses conducting disinfection. Nitrogen and irrigation management were generally based on experience, with very little use of soil or plant analysis. Large manure applications were made at greenhouse construction in 98% of greenhouses; average manure and N application rates were, respectively, 432 m3 ha−1 and 3046 kg N ha−1. Periodic manure applications were made in 68% of greenhouses; average application rates for farmyard and pelleted manures were, respectively, 157 and 13 m3 ha−1 (in 55 and 13% of greenhouses), and the average N rate was 947 kg N ha−1. Manure N was not considered in N fertiliser programs in 74% of greenhouses. On average, 75% of fertiliser N was applied as NO3. Applied fertiliser N was >1.5 and >2 times crop N uptake in, respectively, 42 and 21% of the crops surveyed. The survey identified various management practices likely to contribute to NO3 leaching loss. Large manure applications and experience-based mineral N management practices, centred on NO3 application, are likely to cause accumulation of soil NO3. Drainage associated with (i) the combined effect of large irrigations immediately prior to, and excessive irrigations for several weeks following, transplanting/sowing, and (ii) large irrigations for salt leaching and soil disinfection, is likely to leach accumulated NO3 from the root zone. This study demonstrated that surveys can be very useful diagnostic tools for identifying crop management practices on commercial farms that are likely to contribute to appreciable NO3 leaching.

11.
Increasing crop yield per unit of scarce water requires both better cultivars and better agronomy. The challenge is to manage the crop, or improve its genetic makeup, so as to: capture more of the water supply for use in transpiration; exchange transpired water for CO2 more effectively in producing biomass; and convert more of the biomass into grain or other harvestable product. In the field, the upper limit of water productivity of well-managed, disease-free, water-limited cereal crops is typically 20 kg ha−1 mm−1 (grain yield per unit of water used). If the productivity is markedly less than this, it is likely that major stresses other than water are at work, such as weeds, diseases, poor nutrition, or inhospitable soil. If so, the greatest advances will come from dealing with these first. When water is the predominant limitation, there is scope for improving overall water productivity by better matching the development of the crop to the pattern of water supply, thereby reducing evaporative and other losses and fostering a good balance of water use before and after flowering, which is needed to give a large harvest index. There is also scope for developing genotypes that are able to maintain adequate floret fertility despite transient severe water deficits during floral development. Marker-assisted selection has helped in controlling some root diseases that limit water uptake, and in maintaining fertility in water-stressed maize. Apart from herbicide resistance in crops, which helps reduce competition for water by weeds, there are no genetic transformations in the immediate offing that are likely to improve water productivity greatly.

12.
Automated residential irrigation systems tend to result in higher water use than non-automated systems. Increasing the scheduling efficiency of an automated irrigation system provides an opportunity to conserve water resources while maintaining good landscape quality. Control technologies available for reducing over-irrigation include evapotranspiration (ET)-based controllers, soil moisture sensor (SMS) controllers, and rain sensors (RS). The purpose of this research was to evaluate the capability of these control technologies to schedule irrigation compared to a soil water balance model based on the Irrigation Association (IA) Smart Water Application Technologies (SWAT) testing protocol. Irrigation adequacy and scheduling efficiency were calculated as 30-day running totals to determine the amount of over- or under-irrigation for each control technology according to the IA SWAT testing protocol. A time-based treatment with irrigation 2 days/week and no rain sensor (NRS) was established as a comparison. In general, the irrigation adequacy ratings (a measure of under-irrigation) for the treatments were higher during the fall months of testing than the spring months because lower ET resulted in lower irrigation demand. Scheduling efficiency values (a measure of over-irrigation) decreased for all treatments when rainfall increased. During the rainy period of this testing, total rainfall was almost double reference evapotranspiration (ETo), while in the remaining three testing periods the opposite was true. The 30-day irrigation adequacy values, considering all treatments, varied during the testing periods by 0-68 percentage points. Looking at only one 30-day testing period, as is done in the IA SWAT testing protocol, will not fully capture the performance of an irrigation controller. Scheduling efficiency alone was not a good indicator of controller performance. The amount of water applied and the timing of application were both important to maintaining acceptable turfgrass quality and obtaining good irrigation adequacy and scheduling efficiency scores.

13.
Adequate knowledge of the movement of nutrients under various agricultural practices is essential for developing remedial measures to reduce nonpoint source pollution. Mathematical models, after extensive calibration and validation, are useful for deriving such knowledge and for identifying site-specific alternative agricultural management practices. A spatial-process model that uses GIS and ADAPT, a field-scale, daily time-step, continuous water table management model, was calibrated and validated for flow and nitrate-N discharges from a 365 ha agricultural watershed in central Iowa, in the Midwestern United States. This watershed was monitored for nitrate-N losses from 1991 to 1997. Spatial patterns in crops, topography, fertilizer applications and climate were used as input to drive the model. The first half of the monitored data was used for calibration and the other half for validation of the model. For the calibration period, the observed and predicted flow and nitrate-N discharges were in excellent agreement, with r2 values of 0.88 and 0.74, respectively. During the validation period, the observed and predicted flow and nitrate-N discharges were in good agreement, with r2 values of 0.71 and 0.50, respectively. For all 6 years of data, the observed annual nitrate-N losses of 26 kg ha−1 over the entire simulation were in excellent agreement with the predicted nitrate-N losses of 24.2 kg ha−1. The calibrated model was used to investigate the long-term response of nitrate-N losses to changes in the rate and timing of fertilizer application. Results indicate that nitrate-N losses were sensitive to the rate and timing of fertilizer application. Modeled annual nitrate-N losses showed a 17% reduction when the fertilizer application rate was reduced by 20% and the application timing was switched from fall to spring. Further reductions in nitrate-N losses would require conversion of row cropland to pasture and/or replacement of continuous corn or corn–soybean rotation systems with alternative crops.
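Illustrative note: model performance here is judged by the coefficient of determination (r2) between observed and predicted discharges. A minimal sketch using one common definition, 1 − SSres/SStot (the paper may instead have used the squared correlation coefficient); the example series are placeholders, not the watershed data.

    def r_squared(observed, predicted):
        """Coefficient of determination between paired observed and predicted values."""
        n = len(observed)
        mean_obs = sum(observed) / n
        ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
        ss_tot = sum((o - mean_obs) ** 2 for o in observed)
        return 1.0 - ss_res / ss_tot

    # Placeholder annual nitrate-N losses (kg/ha), observed vs model-predicted.
    print(round(r_squared([30, 22, 35, 18, 25, 26], [27, 24, 33, 20, 22, 25]), 2))   # ≈ 0.83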

14.
The amount of water used by any crop largely depends on the extent to which soil water depletion from the root zone is recharged by an appropriate depth of irrigation. To test this hypothesis, a field study was carried out in November–March of 2002–2003 and 2003–2004 on a sandy loam (Aeric Haplaquept) to quantify the effect of the depth of irrigation applied through micro-sprinklers on onion (Allium cepa L.) bulb yield (BY) and water use patterns. The seven irrigation treatments consisted of six amounts of sprinkler-applied water intended to compensate for the crop (Kc) and pan (Kp) coefficient-based predicted evapotranspiration loss from the crop field (ETp): (i) 160% of ETp (1.6ETp); (ii) 1.4ETp; (iii) 1.2ETp; (iv) 1.0ETp; (v) 0.8ETp; (vi) 0.6ETp; and (vii) 40 mm of surface-applied water whenever cumulative pan evaporation equalled 33 mm. Water use efficiency (WUE), net evapotranspiration efficiency (WUEET) and irrigation water use efficiency (WUEI) were computed. Marginal water use efficiency (MWUE) and elasticity of water productivity (EWP) of onion were calculated using the relationship between BY and measured actual evapotranspiration (ETc). Yield increased with increasing sprinkler-applied water from 0.6 to 1.4ETp. Relative to the yield obtained at 0.6ETp, yield at 1.0ETp increased by 23–25%, while at 1.4ETp it was only 3–9% greater than at 1.0ETp. In contrast, yield at 1.6ETp was 9–12% less than at 1.4ETp. Maximum WUE (7.21 kg m−3) and WUEET (13.87 kg m−3) were obtained under 1.0ETp. However, the highest WUEI (3.83 kg m−3) was obtained with 1.2ETp. The ETc associated with the highest WUE was 20% less than that required to obtain the highest yields. This study confirmed that the critical levels of ETc needed to obtain maximum BY, or WUE, can be identified more precisely from knowledge of MWUE and EWP.
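Illustrative note: marginal water use efficiency and elasticity of water productivity are derived from the slope of the yield versus actual evapotranspiration (ETc) relationship, roughly MWUE ≈ ΔBY/ΔETc and EWP ≈ (ΔBY/BY)/(ΔETc/ETc). A minimal finite-difference sketch with placeholder values, not the onion data set.

    def mwue(y1, y2, et1, et2):
        """Marginal water use efficiency: extra yield per extra unit of ETc."""
        return (y2 - y1) / (et2 - et1)

    def ewp(y1, y2, et1, et2):
        """Elasticity of water productivity: relative yield change per relative ETc change."""
        return ((y2 - y1) / y1) / ((et2 - et1) / et1)

    # Placeholder points on a yield (t/ha) vs ETc (mm) response curve.
    print(mwue(20.0, 24.0, 250, 300))   # 0.08 t/ha per mm of extra ETc
    print(ewp(20.0, 24.0, 250, 300))    # 1.0: yield still responds proportionally to extra ETc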

15.
The aim of this work was to evaluate the long-term effects of different irrigation regimes on mature olive trees growing under field conditions. A 9-year experiment was carried out with three irrigation treatments: no irrigation; water application based on soil water content (short irrigation); and irrigation without considering soil water reserves, applying 20% extra water as a leaching fraction (long irrigation). Leaf water content, leaf area, vegetative growth, yield and fruit characteristics (fruit size, pulp:stone ratio and oil content) were determined yearly. Results showed that growth parameters did not differ significantly as a consequence of the applied water. Yield was increased in irrigated trees compared to non-irrigated ones, but differences between short and long irrigation were small and were observed only when accumulated yield from 1998 to 2006 was considered. Irrigation did not cause significant differences in fruit size or pulp:stone ratio either. Irrigation regimes similar to those applied in this experiment, under environmental conditions with relatively high mean annual precipitation, do not increase growth, yield or fruit characteristics compared with the rain-fed treatment, and consequently the installation of an irrigation system may not be financially profitable.

16.
Heavy rainfall and irrigation during the summer months in the North China Plain may cause losses of nitrogen through nitrate leaching. The objectives of this study were to characterize the leaching of accumulated N in soil profiles and to determine the usefulness of Br as a tracer of surface-applied N fertilizer under heavy rainfall and high irrigation rates. A field experiment with bare plots was conducted near Beijing from 5 July to 6 September 2006. The experiment included three treatments, each with three replicates: no irrigation (rainfall only, I0), farmers’ practice irrigation (rainfall plus 100 mm irrigation, I100) and high-intensity irrigation (rainfall plus 500 mm irrigation, I500). Transport of surface-applied Br and NO3 (assuming no initial NO3 in the soil profile) and of the NO3 accumulated in the soil profiles was simulated with the HYDRUS-1D model. The simulation results showed that Br leached through the soil profile faster than NO3. When Br is used as a tracer of surface-applied N fertilizer to estimate nitrate leaching losses, the amount of N leaching may therefore be overestimated by about 10%. Water drainage and nitrate leaching increased dramatically as the irrigation rate increased. The amounts of N leached out of the 2.1-m soil profile under the I0, I100 and I500 treatments were 195 ± 84, 392 ± 136 and 612 ± 211 kg N ha−1, equivalent to about 20 ± 5%, 40 ± 6% and 62 ± 7% of the N accumulated in the soil profile, respectively. N was leached more deeply as the irrigation rate increased, and the larger the amount of N initially accumulated in the soil profile, the higher the percentage of N leached. N leaching in summer was also simulated under the different weather conditions of 1986 to 2006. The results indicated that nitrate leaching in rainy years was significantly higher than in dry and normal years. Increasing the number of irrigations and decreasing the amount applied per irrigation after fertilizer application is therefore recommended.

17.
Tomato production systems in Florida are typically intensively managed, with high inputs of fertilizer and irrigation, on sandy soils with low inherent water and nutrient retention capacities; potential nutrient leaching losses undermine the sustainability of such systems. The objectives of this 3-year field study were to evaluate the interaction between N-fertilizer rates and irrigation scheduling on crop N and P accumulation, N-fertilizer use efficiency (NUE) and NO3-N leaching of tomato cultivated in a plastic-mulched/drip-irrigated production system on sandy soils. Experimental treatments were a factorial combination of three irrigation scheduling regimes and three N-rates (176, 220, and 330 kg ha−1). The irrigation treatments were: (1) surface drip irrigation (SUR), with both the irrigation and fertigation lines placed underneath the plastic mulch; (2) subsurface drip irrigation (SDI), where the irrigation drip line was placed 0.15 m below the fertigation line, which was located on top of the bed; and (3) TIME (conventional control), with the irrigation and fertigation lines placed as in SUR and irrigation applied once a day. Except for the TIME treatment, all irrigation treatments were soil moisture sensor (SMS)-based, with irrigation occurring at 10% volumetric water content. Five irrigation windows were scheduled daily, and events were bypassed if the soil water content exceeded the established threshold. The use of SMS-based irrigation systems significantly reduced irrigation water use, volume percolated, and nitrate leaching. Based on soil electrical conductivity (EC) readings, there was no interaction between irrigation and N-rate treatments on the movement of fertilizer solutes. Total plant N accumulation for SUR and SDI was 12-37% higher than for TIME. Plant P accumulation was not affected by either irrigation or N-rate treatments. The nitrogen use efficiency for SUR and SDI was on the order of 37-45%, 56-61%, and 61-68% for 2005, 2006 and 2007, respectively, and significantly higher than for the conventional control system (TIME). Moreover, at the intermediate N-rate the SUR and SDI systems reduced NO3-N leaching to 5 and 35 kg ha−1, while at the highest N-rate the corresponding values were 7 and 56 kg N ha−1. Use of N application rates above 220 kg ha−1 did not result in fruit and/or shoot biomass or N accumulation benefits, but substantially increased NO3-N leaching for the control treatment, as detected by EC monitoring and by the lysimeters. It is concluded that appropriate use of SDI and/or sensor-based irrigation systems can sustain high yields while reducing irrigation application as well as NO3-N leaching in soils with low water holding capacity.
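Illustrative note: nitrogen use efficiency as quoted here can be read as the fraction of applied fertilizer N recovered by the crop; the study's exact formula is not given in the abstract, so the definition and the uptake figure below are assumptions consistent with the reported ranges.

    def nue_pct(crop_n_uptake_kg_ha, fertilizer_n_kg_ha):
        """Nitrogen use efficiency: crop N uptake as a percentage of fertilizer N applied (one common definition)."""
        return 100.0 * crop_n_uptake_kg_ha / fertilizer_n_kg_ha

    # e.g. a hypothetical uptake of ~135 kg N/ha at the intermediate 220 kg N/ha rate -> ~61%
    print(round(nue_pct(135, 220)))   # 61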

18.
A generic approach is proposed for the development and testing of crop management systems in contrasting situations of water availability. Ecophysiological knowledge, expertise, regional references and simulation models are combined to devise management strategies adapted to production targets and constraints. The next stage consists of converting these crop management strategies into logical and consistent sets of decision rules. Each rule describes the reasoning used to make a technical decision, taking into account observed or simulated environmental conditions or predicted agronomic risks.

This approach was applied to design crop management systems for grain sorghum (Sorghum bicolor L. Moench.) in south-western France. For spring-sown crops, management (sowing date, plant density, varietal choice, N fertilizer rate and timing) was based on water availability, both for economic and environmental reasons. Specific sets of decision rules were written for irrigated and rainfed conditions. The establishment of rules was based on agronomic principles (e.g. for plant density) or on the application of a simulation model (e.g. for sowing date, variety). N fertilization and irrigation were applied using combined N and water dynamic models.

A novel methodology combining crop diagnosis, analytical trials and crop simulation was developed to evaluate the management systems. An irrigated and a rainfed rule-based management system were compared near Toulouse (S.W. France) from 1995 to 2002. The profitability of rainfed low-input management was confirmed for sorghum in spite of the high yields obtained under irrigation (up to 10 t ha−1). The adaptation of sorghum management to rainfed conditions was mainly achieved through early-maturing cultivars and by reducing N applications by 65%.
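Illustrative note: a decision rule of the kind described converts an observed or simulated state (soil water, forecast rain, growth stage) into a technical action. The sketch below is a generic, invented illustration of that structure, not the actual rule set, thresholds or variables used in the study.

    def irrigation_rule(soil_water_fraction, forecast_rain_mm, growth_stage):
        """Return an action from simple, hypothetical decision rules."""
        if growth_stage in ("flowering", "grain filling") and soil_water_fraction < 0.45:
            return "irrigate 30 mm" if forecast_rain_mm < 10 else "wait for rain"
        if soil_water_fraction < 0.30:
            return "irrigate 30 mm"
        return "no irrigation"

    print(irrigation_rule(0.40, 2, "flowering"))   # 'irrigate 30 mm'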


19.
Science-based, holistic, site-specific water conservation practices can reduce water use on turfgrass sites without adversely affecting turfgrass performance. However, when water use is decreased below a certain threshold, performance declines. Water conservation measures that reduce turfgrass performance essentially decrease its economic, environmental, recreational, and aesthetic values, which can in turn adversely impact many ‘stakeholders’, including the local economy and those affected by increased wind erosion, water erosion, or fire hazard. On larger turfgrass sites, considerable costs are associated with some water conservation strategies, especially when the quality of an alternative irrigation water source is poor or redesign of the landscape and/or irrigation system is involved.

20.
Irrigation return flows may induce salt and nitrate pollution of receiving water bodies. The objectives of this study were to perform a salt and nitrogen mass balance at the hydrological basin level and to quantify the salt and nitrate loads exported in the drainage waters of three basins located in a 15,500 ha irrigation district of the Ebro River Basin (Spain). The main salt and nitrogen inputs and outputs were measured or estimated in these basins over the 2001 hydrological year. Groundwater inflows in the three basins and groundwater outflow in one basin were significant components of the measured mass balances; thus, the off-site impact ascribed solely to irrigation in these basins was estimated from the soil drainage water. Salt concentrations in soil drainage were low (TDS of around 400–700 mg/l, depending on the basin), owing to the low TDS of the irrigation water and the scarcity of salts in the geologic materials, and were inversely related to the drainage fractions (DF = 37–57%). However, because of these high DF, salt loads in soil drainage were relatively high (between 3.4 and 4.7 Mg/ha), although moderate compared to other areas with more saline geological materials. Nitrate concentrations and nitrogen loads in soil drainage were highest (77 mg NO3/l and 195 kg N/ha) in basin III, which was heavily fertilized (357 kg N/ha), had the highest percentage of corn, and had shallow, low water retention, flood-irrigated soils. In contrast, the lowest nitrate concentrations and nitrogen loads (21 mg NO3/l and 23 kg N/ha) were found in basin II, which was fertilized with 203 kg N/ha and dominated by deep alluvial valley soils, crops with low N requirements (alfalfa and pasture), the highest non-cropped area (26% of the total) and fertigation practices in the sprinkler-irrigated fields (36% of the irrigated area). Thus, 56% of the N applied by fertilization was lost in soil drainage in basin III, compared to only 16% in basin II. In summary, low irrigation efficiency coupled with inadequate management of nitrogen fertilization is responsible for the low-salt, high-nitrate concentrations in soil and surface drainage outflows from the studied basins. Consequently, higher irrigation efficiencies, optimized nitrogen fertilization and the reuse for irrigation of the low-salt, high-nitrate drainage waters are key management strategies for better control of the off-site pollution from the studied irrigation district.
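Illustrative note: the exported loads follow from drainage depth times concentration, with a unit conversion (1 mm of drainage over 1 ha is 10 m3, so load in kg/ha = concentration in mg/L × drainage in mm / 100). A minimal sketch; the drainage depth used below is back-calculated from the basin III figures and is therefore only approximate.

    def drainage_load_kg_ha(concentration_mg_l, drainage_mm):
        """Mass exported in drainage: mg/L (= g/m3) times drainage depth (10 m3/ha per mm), in kg/ha."""
        return concentration_mg_l * drainage_mm * 10.0 / 1000.0

    # Basin III: 77 mg NO3/l; a load of ~195 kg N/ha implies roughly 1120 mm of drainage.
    # NO3 to N conversion: N = NO3 * 14/62 ≈ 0.226.
    nitrate_n_mg_l = 77 * 14.0 / 62.0
    print(round(drainage_load_kg_ha(nitrate_n_mg_l, 1120)))   # ≈ 195 kg N/ha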
