Iranian Journal of Remote Sensing and GIS (20085966) (Articles in Press)
Background and Objectives: Air pollution is one of the major environmental and health challenges and has been exacerbated by industrial growth and intensified human activity, particularly in large industrial cities. Methane, one of the most potent greenhouse gases, plays a significant role in global warming, climate change, and the deterioration of air quality. Sources of methane emissions include wetlands, livestock farming, agriculture, and landfill sites, with human activities contributing substantially to its increase. Ground-based monitoring stations impose spatial and temporal limitations on measuring and monitoring air pollution. In this context, satellite data, with their wide coverage, cost-effectiveness, and high spatial and temporal resolution, have become one of the most important sources of information for studying air pollution. This research uses data from the TROPOMI sensor aboard the Sentinel-5P satellite, which measures methane concentrations in the atmosphere, as its primary data source. These data serve as the basis for spatial and temporal analysis of methane distribution in the Isfahan metropolitan area, providing an opportunity to examine the spatial and temporal patterns of this pollutant at a large scale. Despite the importance of methane pollution, no comprehensive study of its spatial and temporal distribution in Isfahan has been conducted. The aim of this research is to carry out a comprehensive and systematic analysis of methane distribution in the city of Isfahan using satellite data and to identify the relationship between atmospheric changes and methane variations, in order to offer effective solutions for air pollution management and environmental quality improvement.
Materials and Methods: This study analyzes the spatial and temporal distribution of methane concentration in the Isfahan metropolitan area using TROPOMI sensor data from the Sentinel-5P satellite over the period 2019 to 2023. Satellite data were retrieved, processed, and analyzed on the Google Earth Engine platform. To examine the spatial distribution pattern of methane concentration, the Global Moran's I index and the General G statistic were applied to detect clustering and characterize the dispersion of the data. The local Gi statistic was then used to identify areas with the highest (hot spots) and lowest (cold spots) methane concentrations. The relationship between methane concentration and climatic factors, including temperature, air pressure, precipitation, and wind speed, was evaluated using Pearson's correlation coefficient. Finally, temporal trends in methane concentration were analyzed at monthly, seasonal, and annual scales.
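As a rough illustration of the clustering statistic named above, Global Moran's I can be computed directly from a vector of observations and a spatial weights matrix. The toy values and rook-contiguity weights below are hypothetical, not the study's methane grid:

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I for a 1-D array of observations and an
    n-by-n spatial weights matrix (row order matches `values`)."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    n = x.size
    z = x - x.mean()                       # deviations from the mean
    num = n * (w * np.outer(z, z)).sum()   # weighted cross-products
    den = w.sum() * (z * z).sum()
    return num / den

# Toy example: 4 locations on a line, rook-contiguity weights.
vals = [10.0, 12.0, 30.0, 32.0]
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(round(morans_i(vals, w), 3))  # positive value -> spatial clustering
```

Values near +1 indicate clustering (as reported for the methane hot spots), values near -1 indicate dispersion, and values near 0 indicate spatial randomness.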
Results and Discussion: The analyses indicate an increasing trend in methane concentration in the Isfahan metropolitan area during the study period. Concentrations peaked in the colder seasons, especially in industrial and agricultural areas. Spatial analyses revealed significant high-concentration clusters in the northern districts, particularly districts 4 and 7, and in the eastern districts, particularly districts 12 and 15. These elevated methane concentrations were linked to activities such as livestock farming, agriculture, and landfill operations. In contrast, the southern districts, particularly districts 2 and 6, along with some western areas, were identified as cold spots with lower concentrations. Assessment of the climatic parameters showed an inverse correlation of temperature and wind speed with changes in methane concentration, while air pressure exhibited a positive and significant relationship with changes in the gas concentration.
Conclusion: The results of this study, based on high-precision satellite data analysis and advanced spatial measurement techniques, provide valuable information for air pollution management and urban planning. Accordingly, it is recommended that methane emission monitoring and control be prioritized during the colder seasons, with a focus on the identified hot spots. In this regard, optimizing industrial processes, efficiently managing waste in the eastern parts of Isfahan, and controlling methane emissions from northern livestock farms using modern technologies, including bioremediation methods, can play an effective role in reducing this pollutant. Additionally, the use of remote sensing data and advanced predictive models for continuous methane concentration monitoring and targeted pollution control strategies is recommended as an effective approach.
Lee, S., McCarty, G.W., Moglen, G.E., Lang, M.W., Jones, C.N., Palmer, M., Yeo, I.-Y., Anderson, M., Sadeghi, A., Rabenhorst, M.C.
Lee, S., Sadeghi, A., Yeo, I.-Y., McCarty, G.W., Hively, W.D.
Transactions of the ASABE (21510032) 60(6), pp. 1939-1955
Winter cover crops (WCCs) have been widely implemented in the Coastal Plain of the Chesapeake Bay Watershed (CBW) due to their high effectiveness in reducing nitrate loads. However, future climate conditions (FCCs) are expected to exacerbate water quality degradation in the CBW by increasing nitrate loads from agriculture. Accordingly, the question remains whether WCCs are sufficient to mitigate increased nutrient loads caused by FCCs. In this study, we assessed the impacts of FCCs on WCC nitrate reduction efficiency in the Coastal Plain of the CBW using the Soil and Water Assessment Tool (SWAT). Three FCC scenarios (2085-2098) were prepared using general circulation models (GCMs), considering three Intergovernmental Panel on Climate Change (IPCC) Special Report on Emissions Scenarios (SRES) greenhouse gas emission scenarios. We also developed six representative WCC implementation scenarios based on the most commonly used planting dates and species of WCCs in this region. Simulation results showed that WCC biomass increased by ~58% under FCC scenarios due to climate conditions conducive to WCC growth. Prior to implementing WCCs, annual nitrate loads increased by ~43% under FCC scenarios compared to the baseline scenario (2001-2014). When WCCs were planted, annual nitrate loads were substantially reduced by ~48%, and WCC nitrate reduction efficiency was ~5% higher under FCC scenarios relative to the baseline scenario. The increase in WCC nitrate reduction efficiency varied with FCC scenario and WCC planting method. As CO2 concentrations were higher and winters were warmer under FCC scenarios, WCCs had greater biomass and thus demonstrated higher nitrate reduction efficiency. 
In response to FCC scenarios, the performance of less effective WCC practices (i.e., barley, wheat, and late planting) under the baseline scenario indicated a ~14% higher increase in nitrate reduction efficiency compared to WCC practices with greater effectiveness under the baseline scenario (i.e., rye and early planting) due to warmer temperatures. The SWAT simulation results indicated that WCCs were effective in mitigating nitrate loads accelerated by FCCs, suggesting the role of WCCs in mitigating nitrate loads will likely be even more important under FCCs.
Renkenberger, J., Montas, H., Leisnham, P.T., Chanse, V., Shirmohammadi, A., Sadeghi, A., Brubaker, K., Rockler, A., Hutson, T., Lansing, D.
Transactions of the ASABE (21510040) 60(3), pp. 769-782
Renkenberger, J., Montas, H., Leisnham, P.T., Chanse, V., Shirmohammadi, A., Sadeghi, A., Brubaker, K., Rockler, A., Hutson, T., Lansing, D.
Transactions of the ASABE (21510040) 59(6), pp. 1803-1819
Sexton, A.M., Shirmohammadi, A., Sadeghi, A., Montas, H.
pp. 279-287
The U.S. EPA's Total Maximum Daily Load (TMDL) program has encountered hindrances in its implementation, partly because of its strong dependence on mathematical models to set limitations on the release of impairing substances. The uncertainty associated with the predictions of such models is often not scientifically quantified and is typically assigned as an arbitrary margin of safety (MOS) in the TMDL allocation. The Soil and Water Assessment Tool (SWAT) model was evaluated to determine its applicability for identifying the impairment status and tabulating a nutrient TMDL for a waterbody located in the Piedmont physiographic region of Maryland. The methodology for tabulating the nutrient TMDL is an enhancement over current methods used in Maryland. The mean-value first-order reliability method (MFORM) was used to calculate the variance in output variables with respect to input parameter variance, and the MOS value was derived based on the level of confidence in meeting the water quality standard. The largest amount of variance in output variables occurred during wet periods. Predicted sediment output had the largest variability around its mean, followed by nitrate, phosphate, and streamflow, as indicated by average annual coefficients of variation of 28%, 19%, 17%, and 15%, respectively. The methodology used in this study to quantify a nitrate TMDL and its associated MOS was a useful tool and an improvement over current methods of nutrient TMDL analysis.
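The MFORM calculation referenced above propagates input-parameter variance through a model via first-order sensitivities evaluated at the parameter means. A minimal sketch, using a hypothetical two-parameter toy model rather than the SWAT configuration of the study:

```python
import numpy as np

def mform_variance(model, means, stds, rel_step=0.01):
    """First-order (mean-value) output variance: evaluate the model at
    the parameter means, perturb one parameter at a time to estimate
    partial derivatives, then sum (dY/dx_i)^2 * var(x_i)."""
    means = np.asarray(means, dtype=float)
    stds = np.asarray(stds, dtype=float)
    y0 = model(means)
    var = 0.0
    for i, (m, s) in enumerate(zip(means, stds)):
        h = rel_step * m if m != 0 else rel_step  # finite-difference step
        p = means.copy()
        p[i] = m + h
        dy_dx = (model(p) - y0) / h               # sensitivity of output
        var += (dy_dx * s) ** 2                   # first-order contribution
    return var

# Hypothetical bilinear "model": Y = a * b
f = lambda p: p[0] * p[1]
v = mform_variance(f, means=[2.0, 3.0], stds=[0.1, 0.2])
# analytic first-order variance: (3*0.1)^2 + (2*0.2)^2 = 0.25
```

The per-parameter terms also rank input sensitivities, which is how MFORM identifies the parameters that dominate output uncertainty.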
Sexton, A.M., Sadeghi, A., Shirmohammadi, A., McCarty, G.W., Hively, W.D.
pp. 396-402
Cover cropping has become a widely used conservation practice on Maryland's Eastern shore. It is one of the main practices funded by the Maryland Department of Agriculture's (MDA) Maryland Agricultural Water Quality Cost Share (MACS) program. The major benefits of this practice include reduction of nutrient runoff and leaching to surface and ground waters, and control of soil erosion. Although cover crops are increasingly being implemented, the long term effectiveness of this practice is not well known, especially on a watershed-scale basis. Since many watershed/water quality models are designed to measure long-term, large-scale effects of management practices, the Soil Water Assessment Tool (SWAT) model was employed to evaluate the environmental impact of cover crop implementations. This project is part of the U.S. Department of Agriculture's (USDA) Conservation Effects Assessment Project (CEAP) program which was established to specifically quantify the environmental benefits from conservation practices. The study is being carried out on the Choptank, one of the nine major Maryland river basins within the Chesapeake Bay watershed. Several tributaries in the Choptank river basin have been identified as "impaired waters" under Section 303(d) of the Federal Clean Water Act due to high levels of nutrients and sediments. SWAT was first utilized to build a model for the German Branch (GB) subbasin (∼50 km2), a non-tidal tributary basin of the larger Choptank River watershed. The study period was 18 years (1990-2007). The streamflow component of SWAT was calibrated on a daily basis using years 2005 and 2006 with one year of spin-up (2004). Validation was conducted using the 1/1/07-4/15/07 time period. Cover cropping was first implemented in the GB watershed in 2002. Changes in nitrate loading were examined to measure improvements in the reduction of nitrate loads by increasing cover crop implementation. 
Model simulations were run to estimate nitrate loads for two scenarios: (1) no cover crop implementation during the entire study period, and (2) increasing cover crop implementation starting from 2002 through 2007.
Science of the Total Environment (00489697) 408(9), pp. 2096-2108
Restoration of the Chesapeake Bay, the largest estuary in the United States, is a national priority. Documentation of progress of this restoration effort is needed. A study was conducted to examine water quality in the Choptank River estuary, a tributary of the Chesapeake Bay that since 1998 has been classified as impaired waters under the Federal Clean Water Act. Multiple water quality parameters (salinity, temperature, dissolved oxygen, chlorophyll a) and analyte concentrations (nutrients, herbicide and herbicide degradation products, arsenic, and copper) were measured at seven sampling stations in the Choptank River estuary. Samples were collected under base flow conditions in the basin on thirteen dates between March 2005 and April 2008. As commonly observed, results indicate that agriculture is a primary source of nitrate in the estuary and that both agriculture and wastewater treatment plants are important sources of phosphorus. Concentrations of copper in the lower estuary consistently exceeded both chronic and acute water quality criteria, possibly due to use of copper in antifouling boat paint. Concentrations of copper in the upstream watersheds were low, indicating that agriculture is not a significant source of copper loading to the estuary. Concentrations of herbicides (atrazine, simazine, and metolachlor) peaked during early-summer, indicating a rapid surface-transport delivery pathway from agricultural areas, while their degradation products (CIAT, CEAT, MESA, and MOA) appeared to be delivered via groundwater transport. Some in-river processing of CEAT occurred, whereas MESA was conservative. Observed concentrations of herbicide residues did not approach established levels of concern for aquatic organisms. Results of this study highlight the importance of continued implementation of best management practices to improve water quality in the estuary. 
This work provides a baseline against which to compare future changes in water quality and may be used to design future monitoring programs needed to assess restoration strategy efficacy.
Stevens, M.D., Black, B.L., Lea-Cox, J.D., Sadeghi, A., Harman-Fetcho, J., Pfeil, E., Downey, P., Rowland, R., Hapeman, C.J.
HortScience (00185345) 44(2), pp. 298-305
The environmental effects of three strawberry (Fragaria ×ananassa) cold-climate production systems were compared: the traditional method of conventional matted row (CMR) and the two more recently developed practices of advanced matted row (AMR) and cold-climate plasticulture (CCP). Side-by-side field plots were instrumented with automated flow meters and samplers to measure and collect runoff, which was filtered and analyzed to determine soil, pesticide, and nitrogen losses. Although annual mean runoff volumes were similar for all three production systems, soil losses from CMR plots were two to three times greater than from the CCP plots throughout the study, and two to three times greater than from the AMR plots only in the first year of the 3-year study. In general, decreases in erosion and runoff volumes were observed in plots that were disturbed less by machine operations and had less foot traffic, owing to a decreased need for hand weeding, and in plots that used straw mulch in the furrows between the beds. Timing and intensity of precipitation events also influenced the amount of soil erosion. Pesticide residues and nitrogen losses were also greatest in runoff from the CMR plots. The two systems that used drip fertigation, AMR and CCP, also had higher nitrogen uptake efficiencies. Overall, the CCP and AMR systems performed similarly for most criteria; however, considering the nonrenewable nature of plastic mulch and the need to dispose of it in a landfill, the AMR system was more environmentally sustainable than the CCP system.
Journal of Soil and Water Conservation (00224561) 64(5), pp. 303-313
Winter cover crops are recognized as an important agricultural conservation practice for reducing nitrogen (N) losses to groundwater following the summer growing season. Accordingly, cost-share programs have been established to promote winter cover crops for water quality on farms throughout the Chesapeake Bay watershed. However, current estimates of cover crop nutrient uptake are largely calculated from plot-scale studies extrapolated to the watershed scale based solely on enrollment acreage. Remote sensing provides a tool for rapid estimation of cover crop biomass production on working farms throughout the landscape. This project combined cost-share program enrollment data with satellite imagery and on-farm sampling to evaluate cover crop N uptake on 136 fields within the Choptank River watershed, on Maryland's Eastern Shore. The Normalized Difference Vegetation Index was a successful predictor of aboveground biomass for fields with >210 kg ha-1 (>187 lb ac-1) of vegetation (corresponding to 4.2 kg ha-1 [3.7 lb ac-1] of plant N), below which the background reflectance of soils and crop residues obstructed the cover crop signal. Cover crops planted in the two weeks prior to the regional average first frost date (October 15) exhibited average fall aboveground N uptake rates of 18, 13, and 5 kg ha-1 (16, 12, and 4 lb ac-1) for rye, barley, and wheat, respectively, corresponding to 1,260, 725, and 311 kg ha-1 (1,124, 647, and 277 lb ac-1) of aboveground biomass, with associated cost-share implementation costs of $5.49, $7.60, and $19.77 kg-1 N ($2.50, $3.46, and $8.99 lb-1 N). Cover crops planted after October 15 exhibited significantly reduced biomass and nutrient uptake, with associated program costs of $15.44 to $20.59 kg-1 N ($7.02 to $9.36 lb-1 N). Agronomic factors influencing cover crop performance included species, planting date, planting method, and previous crop.
Field sampling locations with >1,000 kg ha-1 (>890 lb ac-1) of springtime cover crop biomass exhibited greatly reduced soil nitrate (<3 mg kg-1 [<3 ppm]) in comparison to fields with low cover crop biomass (up to 14 mg kg-1 soil nitrate), indicating a target biomass threshold for maximum water quality impact. Additional sampling years will be necessary to account for cover crop response to climate variability. Combining remote sensing with farm program data can provide important information to scientists and regulators working to improve conservation programs. Results can be used to more effectively utilize scarce conservation resources and increase water quality protection.
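The Normalized Difference Vegetation Index used above as the biomass predictor is a simple band ratio of near-infrared and red reflectance. A minimal sketch; the reflectance values are hypothetical, not taken from the study's imagery:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Accepts scalars or arrays of surface reflectance; returns values
    in [-1, 1], with dense green vegetation near the high end."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

# Hypothetical per-field reflectances: dense cover crop vs. mostly bare soil.
dense = float(ndvi(0.45, 0.08))  # vigorous vegetation -> high NDVI
bare = float(ndvi(0.20, 0.15))   # soil-dominated signal -> low NDVI
```

The low-biomass cutoff described in the abstract corresponds to the regime where the soil-dominated signal (like `bare` here) swamps the vegetation contribution.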
The U.S. Environmental Protection Agency (EPA) and European Union (EU) are engaged in an extensive effort to assess and improve surface water quality, including decreasing risks to public health from water-borne pathogens. In the absence of data for specific pathogens, indicators of fecal contamination such as Escherichia coli are utilized to assess water quality. However, the relationship(s) between indicators and pathogens, and their population dynamics in watersheds are poorly understood. We undertook this monitoring study in a small rural watershed with inputs from wildlife and grazing cattle to (i) evaluate fluctuations in E. coli populations and (ii) assess the use of virulence factors typically associated with pathogenic E. coli as indicators of water quality. Generic E. coli concentrations were substantially higher in agricultural than in forested sites indicative of the much higher fecal inputs from grazing cattle vs. wildlife. However, high E. coli concentrations found in stream sediments suggest that these may be relatively stable habitats for E. coli growth and survival and be responsible for some portion of the downstream contamination. A general decrease was observed in E. coli concentrations from summer through fall and winter. This decrease was partially due to decreased wildlife activity and cattle densities. However, an additional factor was likely "flushing" of sediment-borne E. coli caused by high discharge levels (due to high rainfall) beginning in late fall. Virulence factors associated with pathogenic E. coli (O157 serogroup, eae gene, and stx1/2 genes) were prevalent throughout the watershed; population dynamics were similar to generic E. coli. However, no definitive conclusions could be drawn regarding the presence or absence of specific pathogenic E. coli strains. Also, no correlation was observed between concentrations of generic E. coli and the eae gene at agricultural sites, suggesting that generic E. 
coli data cannot be used to predict the risk of pathogen exposure. Although our results are consistent with the well established principle that fecal runoff and deposition are the predominant source of water-borne E. coli contamination, they also illustrate the difficulty associated with the interpretation of water-borne E. coli data. Watershed water quality models should account for E. coli growth and survival in indigenous habitats and "flushing" of sediment-borne E. coli from watersheds, as well as for fecal runoff.
Faucette, L.B., Sefton, K.A., Sadeghi, A., Rowland, R.
Journal of Soil and Water Conservation (00224561) 63(4), pp. 257-264
In 2005, the US Environmental Protection Agency National Menu of Stormwater Best Management Practices, National Pollutant Discharge Elimination System Phase II for Construction Sites, listed compost filter socks as an approved best management practice for controlling storm runoff and sediment on construction sites. As with most new technologies used to control sediment on construction sites, little has been done to evaluate their performance relative to conventional sediment control barriers, such as silt fences. The objectives of this study were (1) to determine and compare the sediment removal efficiency of silt fences and compost filter socks, (2) to determine whether the addition of polymers to compost filter socks could reduce sediment and phosphorus loads, and (3) to determine relationships between compost filter media particle size distribution and pollutant removal efficiency and hydraulic flow rate. Simulated rainfall was applied to soil chambers packed with Hatboro silt loam on a 10% slope. All runoff was collected and analyzed for hydraulic flow rate, volume, total suspended solids (TSS) concentration and load, turbidity, and total and soluble P concentration and load. Based on 7.45 cm h-1 (2.9 in hr-1) of simulated rainfall-runoff for a 30-minute duration, bare soil (control) runoff TSS concentrations were between 48,820 and 70,400 mg L-1 (6.5 oz gal-1 and 9.4 oz gal-1), and turbidity was between 19,343 and 36,688 Nephelometric Turbidity Units. Compost filter sock and silt fence removal efficiencies for TSS concentration (62% to 87% and 71% to 87%), TSS load (68% to 90% and 72% to 89%), and turbidity (53% to 78% and 54% to 76%) were nearly identical; however, with the addition of polymers to the compost filter socks, sediment removal efficiencies ranged from 91% to 99%. Single-event support practice factors (P factor) were between 0.11 and 0.29 for silt fence, between 0.10 and 0.32 for compost filter socks, and between 0.02 and 0.06 for compost filter socks + polymer.
Total and soluble P concentration and load removal efficiencies were similar for compost filter socks (59% to 65% and 14% to 27%) and silt fence (63% and 23%). However, when polymers were added to the filter socks installed on phosphorus-fertilized soils, removal efficiencies increased to 92% to 99%. Compost filter socks restricted hydraulic flow rate by between 2% and 22%, while the silt fence restricted it by between 5% and 29%. Significant correlations (p < 0.05) were found between middle-range particle sizes of the compost filter media used in the filter socks and the reduction of turbidity in runoff; however, hydraulic flow rate was a better indicator (stronger correlation) of total pollutant removal efficiency for compost filter socks and should be considered as a new parameter in federal and state standard specifications for this pollution prevention technology.
Shirmohammadi, A., Sexton, A.M., Montas, H., Sadeghi, A.
pp. 263-278
Watershed-scale hydrologic and water quality models have been used with increasing frequency to devise alternative pollution control strategies. With the recent reenactment of the 1972 Clean Water Act's TMDL (total maximum daily load) component, some of these watershed-scale models are being recommended for TMDL assessments at the watershed scale. However, it has been recognized that such models may have a large degree of uncertainty associated with their simulations, and that this uncertainty can significantly limit the utility of their output. This study uses two uncertainty methods to assess the uncertainty in the SWAT model's output due to variability in input parameter values for a small watershed (Warner Creek Watershed) located in northern Maryland. Both Latin Hypercube Sampling (LHS) with a constrained Monte Carlo Simulation (MCS) technique and the Mean Value First-Order Reliability Method (MFORM) were utilized. Additionally, results obtained with MFORM were used to evaluate the margin of safety (MOS) in the TMDL assessment for the selected watershed. Results showed that using average parameter values for the watershed, without considering their variability, may result in significant uncertainty in SWAT's simulated streamflow, sediment, and nitrate-nitrogen. Results also indicated the capability of MFORM to capture the uncertainty in SWAT's simulations and to identify the most sensitive parameters. In addition, MFORM results were successfully used to identify the nutrient reduction rates necessary to meet watershed TMDL criteria with an acceptable level of confidence. This study concluded that using the best possible distribution for the input parameters is much preferred over using an average value.
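The Latin Hypercube Sampling step mentioned above stratifies each input parameter's range so that far fewer model runs cover the parameter space than plain Monte Carlo would need. A minimal sketch on the unit hypercube (mapping strata to actual parameter distributions is omitted; the sample sizes are arbitrary):

```python
import numpy as np

def latin_hypercube(n_samples, n_params, seed=None):
    """Latin hypercube sample on [0, 1): each parameter's range is split
    into n_samples equal strata, with one draw per stratum, and the
    stratum order is shuffled independently for each parameter."""
    rng = np.random.default_rng(seed)
    offsets = rng.random((n_samples, n_params))
    strata = np.arange(n_samples)[:, None]  # stratum index per row
    u = (strata + offsets) / n_samples      # one point per stratum, per column
    for j in range(n_params):
        u[:, j] = rng.permutation(u[:, j])  # decouple the columns
    return u

pts = latin_hypercube(10, 3, seed=0)  # 10 samples of 3 parameters
# each column contains exactly one point in each of the 10 strata
```

Each row of `pts` would then be transformed through the inverse CDF of each parameter's assumed distribution to produce one SWAT input set per model run.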
Journal of Soil and Water Conservation (00224561) 63(6), pp. 461-474
The Choptank River is an estuary, tributary of the Chesapeake Bay, and an ecosystem in decline due partly to excessive nutrient and sediment loads from agriculture. The Conservation Effects Assessment Project for the Choptank River watershed was established to evaluate the effectiveness of conservation practices on water quality within this watershed. Several measurement frameworks are being used to assess conservation practices. Nutrients (nitrogen and phosphorus) and herbicides (atrazine and metolachlor) are monitored within 15 small, agricultural subwatersheds and periodically in the lower portions of the river estuary. Initial results indicate that land use within these subwatersheds is a major determinant of nutrient concentration in streams. In addition, the 18O isotope signature of nitrate was used to provide a landscape assessment of denitrification processes in the presence of the variable land use. Herbicide concentrations were not correlated to land use, suggesting that herbicide delivery to the streams is influenced by other factors and/or processes. Remote sensing technologies have been used to scale point measurements of best management practice effectiveness from field to subwatershed and watershed scales. Optical satellite (SPOT-5) data and ground-level measurements have been shown to be effective for monitoring nutrient uptake by winter cover crops in fields with a wide range of management practices. Synthetic Aperture Radar (RADARSAT-1) data have been shown to detect and to characterize accurately the hydrology (hydroperiod) of forested wetlands at landscape and watershed scales. These multiple approaches are providing actual data for assessment of conservation practices and to help producers, natural resource managers, and policy makers maintain agricultural production while protecting this unique estuary.
Sadeghi, A., McCarty, G.W., Hively, D., Moriasi, D.N., Shirmohammadi, A.
pp. 424-434
Open-ditch drainage water management (also referred to as controlled drainage) is an old management strategy in agriculture, but it has recently gained widespread use because of its potential to reduce nitrate through enhanced denitrification. It is a particularly useful management strategy for the Chesapeake Bay region in Maryland, where nitrogen loads from agriculture have been cited as a major component of overall nitrate loads into the Bay. Excess nutrients (especially N and P) entering surface water have been shown to increase algal production, causing eutrophication of coastal water ecosystems. Controlled drainage restricts outflow during periods of the year when equipment operations are not required in the field (i.e., winter and midsummer) and allows natural drainage to occur during the rest of the year, maintaining the water table below the crop root zone. This practice not only restricts water flows into the Bay, but also allows more denitrification to occur, reducing the level of nitrogen in the waters ultimately flowing into the Bay. A study was undertaken on the Choptank watershed in the Eastern Shore region of Maryland to assess the quantitative role of these control structures in reducing nitrogen loads into surface waters and their overall impact on watershed water quality.
Sadeghi, A., Yoon, K., Graff, C., McCarty, G.W., McConnell, L.L., Shirmohammadi, A., Hively, D., Sefton, K.A.
This study was conducted under the USDA Conservation Effects Assessment Project (CEAP) on the Choptank River watershed, which is located within the Chesapeake Bay watershed on the Delmarva Peninsula in Maryland, U.S.A. The watershed is nearly 1036 km2 and is dominated primarily by corn and soybean production with extensive poultry production operations. Portions of the watershed have been identified as "impaired waters" under Section 303(d) of the Federal Clean Water Act due to high levels of nutrients and sediment. In recent years, a significant number of state and federal incentive programs have been implemented for water quality improvement in this watershed, but the environmental benefits from these programs have never been quantified. Two of the most widely used USDA watershed-scale models, the Soil and Water Assessment Tool (SWAT) and the Annualized Agricultural Non-Point Source (AnnAGNPS) model, were applied to quantify the environmental benefits of widely used practices such as winter cover crops. Five years (1991-1995) of detailed observed flow and water quality data were used to provide baseline calibration and validation for the two models. Simulation results showed significant differences in base-flow estimations between the two models. This difference may be a significant factor in model selection for estimating nutrient and sediment loads in regions with fairly flat landscapes, such as the Coastal Plain physiographic region. This study concluded that both the SWAT and AnnAGNPS models performed well in simulating hydrologic conditions; however, for nitrate loads, the AnnAGNPS Nash-Sutcliffe coefficients were relatively lower, though slightly above the acceptable 0.5 value.
Vegetable production practices combining copper-based pesticides with polyethylene mulch create conditions for highly toxic runoff emissions to surface waters. Copper hydroxide is a widely used fungicide-bactericide approved for both organic and conventional agricultural production of vegetable crops for control of diseases. Copper-based pesticides are often viewed as more "natural" than synthetic organic pesticides, but aquatic biota, such as the saltwater bivalve Mercenaria mercenaria, are extremely sensitive to low concentrations of copper. The use of polyethylene mulch in organic and traditional vegetable production is gaining popularity because it decreases pesticide use and warms the soil allowing for earlier crop planting, but its use also increases runoff volume and soil erosion. Two field studies were conducted to evaluate the effectiveness of management practices to reduce loads of copper in runoff from tomato production. Seasonal runoff losses of 20 to 36% of applied copper hydroxide were observed in tomato plots using plastic mulch with bare soil furrows. The addition of vegetative furrows between the raised, polyethylene-covered beds or the replacement of polyethylene mulch with vegetative residue mulch reduced copper loads in runoff by an average of 72 and 88%, respectively, while maintaining harvest yields. Use of these alternative management practices could reduce surface water concentrations in nearby streams from the observed 22 μg/L to approximately 6 and 3 μg/L, respectively, which would be below the median lethal concentration for larval clams (M. mercenaria 96-h LC50 = 21 μg/L) and close to or below the EPA guidelines to protect aquatic life (24-h average = 5.4 μg/L for fresh water and 4.0 μg/L for salt water).
Kouznetsov, M.Y., Roodsari, R., Pachepsky, Y.A., Shelton, D.R., Sadeghi, A., Shirmohammadi, A., Starr, J.L.
Journal of Environmental Management (03014797) 84(3), pp. 336-346
Hillslope vegetated buffers are recommended to prevent water pollution from agricultural runoff. However, models to predict the efficacy of different grass buffer designs are lacking. The objective of this work was to develop and test a mechanistic model of coupled surface and subsurface flow and transport of bacteria and a conservative tracer on hillslopes. The testing should indicate what level of complexity and observation density might be needed to capture essential processes in the model. We combined the three-dimensional FEMWATER model of saturated-unsaturated subsurface flow with the Saint-Venant model for runoff. The model was tested with data on rainfall-induced fecal coliforms (FC) and bromide (Br) transport from manure applied at vegetated and bare 6-m long plots. The calibration of water retention parameters was unnecessary, and the same manure release parameters could be used both for simulations of Br and FC. Surface straining rates were similar for Br and bacteria. Simulations of Br and FC concentrations were least successful for the funnels closest to the source. This could be related to the finger-like flow of the manure from the strip along the bare slopes, to the transport of Br and FC with manure colloids that became strained at the grass slope, and to the presence of micro-ponds at the grassed slope. The two-dimensional model abstraction of the actual 3D transport worked well for flux-averaged concentrations. The model developed in this work is suitable to simulate surface and subsurface transport of agricultural contaminants on hillslopes and to evaluate efficiency of grass strip buffers, especially when lateral subsurface flow is important.
Transactions of the ASABE (21510040) 49(4), pp. 987-1002
Fecal contamination of surface waters is a critical water-quality issue, leading to human illnesses and deaths. Total Maximum Daily Loads (TMDLs), which set pollutant limits, are being developed to address fecal bacteria impairments. Watershed models are widely used to support TMDLs, although their use for simulating in-stream fecal bacteria concentrations is somewhat rudimentary. This article provides an overview of fecal microorganism fate and transport within watersheds, describes current watershed models used to simulate microbial transport, and presents case studies demonstrating model use. Bacterial modeling capabilities and limitations for setting TMDL limits are described for two widely used watershed models (HSPF and SWAT) and for the load-duration method. Both HSPF and SWAT permit the user to discretize a watershed spatially and bacteria loads temporally. However, the options and flexibilities are limited. The models are also limited in their ability to describe bacterial life cycles and in their ability to adequately simulate bacteria concentrations during extreme climatic conditions. The load-duration method for developing TMDLs provides a good representation of overall water quality and needed water quality improvement, but intra-watershed contributions must be determined through supplemental sampling or through subsequent modeling that relates land use and hydrologic response to bacterial concentrations. Identified research needs include improved bacteria source characterization procedures, data to support such procedures, and modeling advances including better representation of bacteria life cycles, inclusion of more appropriate fate and transport processes, improved simulation of catastrophic conditions, and creation of a decision support tool to aid users in selecting an appropriate model or method for TMDL development.
Proceedings of the National Academy of Sciences of the United States of America (10916490) 102(45), pp. 16152-16157
Perchlorate is a goitrogenic anion that competitively inhibits the sodium iodide transporter and has been detected in forages and in commercial milk throughout the U.S. The fate of perchlorate and its effect on animal health were studied in lactating cows, ruminally infused with perchlorate for 5 weeks. Milk perchlorate levels were highly correlated with perchlorate intake, but milk iodine was unaffected, and there were no demonstrable health effects. We provide evidence that up to 80% of dietary perchlorate was metabolized, most likely in the rumen, which would provide cattle with a degree of refractoriness to perchlorate. Data presented are important for assessing the environmental impact on perchlorate concentrations in milk and potential for relevance to human health.
Roodsari, R., Shelton, D.R., Shirmohammadi, A., Pachepsky, Y.A., Sadeghi, A., Starr, J.L.
Transactions of the American Society of Agricultural Engineers (00012351) 48(3), pp. 1055-1061
Land application of manure is recommended to recycle organic matter and nutrients, thus enhancing the soil quality and crop productivity. However, pathogens in manure may pose a human health risk if they reach potable or recreational water resources. The objective of this study was to observe and quantify the effects of vegetated filter strips (VFS) on surface and vertical transport of fecal coliform (FC) bacteria, surrogates for bacterial pathogens, released from surface-applied bovine manure. A two-sided lysimeter with 20% slope on both sides was constructed with a sandy loam soil on one side and a clay loam soil on the other. Each side of the lysimeter was divided into two subplots (6.0 x 6.4 m), one with grass and the other with bare soil. Plots were instrumented to collect runoff samples along a 6.0 m slope at three equidistant transects. Samples of runoff were also collected in a gutter at the edge of each plot. All plots were equipped with multi-sensor capacitance moisture probes to monitor water content through the soil profile. Bovine manure was applied at the top of each plot in a 30 cm strip. Rainfall was simulated at a 61 mm h-1 intensity using a portable rainfall simulator. Surface runoff rate was measured and water quality sampled periodically throughout the simulation. Soil samples were taken at incremental depths (0-60 cm) after each simulation. Runoff (as % of total rainfall) decreased from 93% to 12% in the bare vs. vegetated clay loam plots and from 61% to 2% in the bare vs. vegetated sandy loam plots. The reduced runoff from vegetated plots decreased the surface transport of FC while increasing its vertical transport. The amount of FC in runoff (as % of applied) decreased from 68% to 1% in the bare vs. vegetated clay loam plots and from 23% to non-detectable levels in the bare vs. vegetated sandy loam plots. These data indicate that VFS can reduce surface transport of FC, even for slopes as high as 20%, especially in soils with high infiltration (e.g., sandy loam).
Graff, C., Sadeghi, A., Lowrance, R.R., Williams, R.G.
Transactions of the American Society of Agricultural Engineers (00012351) 48(4), pp. 1377-1387
Conservation practices, such as buffers, are often installed to mitigate the effects of nutrients and sediment runoff from agricultural practices. The Riparian Ecosystem Management Model (REMM) was developed as a process-based model to evaluate the fate of nutrients and sediment through a riparian buffer up to the edge of a stream. A one-at-a-time sensitivity analysis was performed on REMM to evaluate the effects that changing herbaceous buffer scenarios have on N, P, and sediment in surface and ground water. Vegetation characteristics such as rooting depth, LAI, and plant height, along with some physical buffer characteristics, were varied within their "typical" range and compared to a "base case" scenario. Model outputs were not sensitive to plant height or LAI, but moderately sensitive to changes in SLA (specific leaf area). Model outputs were only sensitive to rooting depth as roots became shallower in the soil profile. Sediment yield and dissolved nitrate in surface water were the most sensitive to changes in Manning's n, while other soil physical characteristics such as surface roughness, surface condition, and % bare soil had little to no effect on model outcomes. Dissolved surface nitrate, organic P, and dissolved subsurface nitrate were all moderately sensitive to changes in saturated hydraulic conductivity and the slope of the buffer. Results indicate that within the model, many vegetation characteristics do not directly play a role in the physical transport of nutrients and sediment in surface and subsurface water; therefore, utilizing REMM to evaluate effects of specific herbaceous plant types may have limited value unless specific leaf area or rooting depth are considered. It would be possible to model and perhaps achieve specific load reductions by modifying slope and other physical characteristics or by considering forest versus grass buffers.
Neurath, S.K., Sadeghi, A., Shirmohammadi, A., Isensee, A.R., Torrents, A.
Chemosphere (00456535) 54(4), pp. 489-496
Atrazine transport through packed 10 cm soil columns representative of the 0-10 cm soil horizon was observed by measuring the atrazine recovery in the total leachate volume and in the upper and lower soil layers following infiltration of 7.5 cm of water using a mechanical vacuum extractor (MVE). Measured recoveries were analyzed to understand the influence of infiltration rate and delay time on atrazine transport and distribution in the column. Four time periods (0.28, 0.8, 1.8, and 5.5 h) representing very high to moderate infiltration rates (26.8, 9.4, 4.2, and 1.4 cm/h) were used. Replicate soil columns were tested immediately and following a 2-d delay after atrazine application. Results indicate atrazine recovery in leachate was independent of infiltration rate, but significantly lower for infiltration following a 2-d delay. Atrazine distribution in the 0-1 and 9-10 cm soil layers was affected by both infiltration rate and delay. These results are in contrast with previous field and laboratory studies that suggest that atrazine recovery in the leachate increases with increasing infiltration rate. The difference between atrazine recovery measured using the MVE and that measured in other leaching experiments at this field site, which used intact soil cores and rain-simulation equipment, probably illustrates the effect of infiltrating water interacting with the atrazine present on the soil surface. This work suggests that atrazine mobilization from the soil surface is also dependent on interactions of the infiltrating water with the soil surface, in addition to the rate of infiltration through the surface soil.
Chu, T.W., Shirmohammadi, A., Montas, H., Sadeghi, A.
Transactions of the American Society of Agricultural Engineers (00012351) 47(5), pp. 1523-1538
Mathematical watershed-scale models are among the best tools available for analyzing water resources (quantity and quality) issues in spatially diverse watersheds, since continuous water quality monitoring is expensive and spatially impractical in mixed land use watersheds. However, models without appropriate validation may lead to misconceptions and erroneous predictions. This study used six years of hydrologic and water quality data to calibrate and validate the capability of the SWAT (Soil and Water Assessment Tool) model in assessing nonpoint source pollution for a 346 ha watershed in the Piedmont physiographic region. The evaluation of the hydrology component of SWAT completed in a previous study pointed out that SWAT has no mechanism to account for subsurface flow contributions from outside the watershed. For this evaluation, all nutrient loadings leaving the watershed were adjusted to subtract the chemical transport via subsurface flow contributions from outside the watershed. Evaluation results indicated a strong agreement between yearly measured and simulated data for sediment, nitrate, and soluble phosphorus loadings. However, simulations of monthly sediment and nutrient loadings were poor. Overall, it was concluded that SWAT is a reasonable watershed-scale model for long-term simulation of different management scenarios. However, its use on a storm-by-storm or even a monthly basis may not be appropriate for watersheds with similar physiography and size. Additionally, ignoring the subsurface contribution of water and chemicals from outside the watershed into the watershed aquifer could cause significant errors in model prediction.
Environmental Toxicology and Chemistry (07307268) 23(3), pp. 719-725
The transport of runoff with high copper concentrations and sediment loads into adjacent surface waters can have adverse effects on nontarget organisms as a result of increased turbidity and degraded water quality. Runoff from vegetable production utilizing polyethylene mulch can contain up to 35% of applied copper, a widely used fungicide/bactericide that has adverse effects on aquatic organisms. Copper is primarily transported in runoff with suspended particulates; therefore, implementation of management practices that minimize soil erosion will reduce copper loads. Replacing bare-soil furrows with furrows planted in rye (Secale cereale) significantly improved the sustainability of vegetable production with polyethylene mulch and reduced the potential environmental impact of this management practice. Vegetative furrows decreased runoff volume by >40% and soil erosion by >80%. Copper loads with runoff were reduced by 72% in 2001, primarily as a result of reduced soil erosion, since more than 88% of the total copper loads were transported in runoff with suspended soil particulates. Tomato yields in both years were similar between the polyethylene mulch plots containing either bare-soil or vegetative furrows. Replacing bare-soil furrows with vegetative furrows greatly reduces the effects of sediments and agrochemicals on sensitive ecosystems while maintaining crop yields.
Communications in Soil Science and Plant Analysis (00103624) 34(3-4), pp. 457-480
A field study was conducted from 1997 to 2000 to determine the biomass production of 'Pete' eastern gamagrass [Tripsacum dactyloides (L.) L.] grown on a restrictive (acid compact) soil on six unlimed sites located on the North Farm at the Beltsville Agricultural Research Center in Beltsville, MD. Total biomass from two cuttings in 1997, 1998, and 2000 averaged 4261, 4995, and 6611 kg ha-1, respectively, despite deficits in moisture during those years. A single cutting in July 1999 averaged 2288 kg ha-1. Overall biomass varied more than two-fold among the six sites. Significant differences in biomass were found among sites, years, and harvests. In general, biomass varied with position on the slope, bulk density, and depth of the topsoil, but not with pH. The average biomass was generally lowest at the top of the slope where the Ap horizon was relatively thin and the soil was extremely acidic (pH 4.3-4.4) (1:1 soil-water). High silt content and low bulk density of the soil were associated with the highest biomass; rainfall distribution also appeared to be important. Eastern gamagrass at Sites 4 to 6 at the top of the slope generally had a low pH (4.3-4.4), high penetrometer resistance, and high bulk density. Penetrometer readings were lower between rows than within rows at the 5 cm depth. There was no significant relationship between bulk density and penetrometer readings, but biomass appeared to be related to bulk density. Dry weight of roots was reduced by increasing bulk density (r2 = 0.57) and was also reduced at depths below 15 cm. Despite adverse stress imposed by shallow topsoil, low pH, high bulk density, and moisture deficits, eastern gamagrass produced relatively high biomass. These results demonstrate the resilience of eastern gamagrass to an acid compact soil and indicate that this species is suited for reclamation of acid compact soils and for producing high quality forage on marginal lands when supplied with adequate NPK.
Pest Management Science (1526498X) 59(6-7), pp. 681-690
Environmentally and economically viable agriculture requires a variety of cultivation practices and pest management options, as no one system will be appropriate for every situation. Agrochemicals are some of the many pest control tools used in an integrated approach to pest management. They are applied with the intent of maximizing efficacy while minimizing off-site movement; however, their judicious use demands a practical knowledge of their fate and effects in agricultural and natural ecosystems. Agrochemical distribution into environmental compartments is influenced by the physical and chemical properties of the agrochemical and by environmental conditions, i.e., soil type and structure and meteorological conditions. Agricultural Research Service (ARS) researchers working in the area of agrochemical fate have focused on accurately describing those processes that govern the transport, degradation, and bioavailability of these chemicals under conditions reflecting actual agronomic practices. Results from ARS research concerning the environmental fate and effects of agrochemicals have led to the development of science-based management practices that will protect vulnerable areas of the ecosystem. The new challenge is to identify these vulnerable areas and their temporal and spatial variations prior to use of the chemical by predicting how it will behave in environmental matrices and, using that information, predict its transport and transformation within an air- or watershed. With the development of better predictive tools and GIS (Geographic Information System)-based modeling, the risks of agricultural management systems can be assessed at the watershed and basin levels, and management strategies can be identified that minimize negative environmental impacts.
Teasdale, J.R., Shelton, D.R., Sadeghi, A., Isensee, A.R.
Weed Science (00431745) 51(4), pp. 628-634
High levels of cover-crop residue can suppress weed emergence and also can intercept preemergence herbicides and potentially reduce their effectiveness. This research was conducted in continuous no-tillage corn to compare the effect of residue from a hairy vetch cover crop with that of background crop residue on the soil solution concentration of atrazine and metolachlor and on the emergence of weeds with and without herbicide treatment. In a 3-yr field experiment, 5-cm-deep soil samples were taken and the weed density measured in paired microplots with and without herbicide at approximately weekly intervals after application of atrazine and metolachlor. High levels of residue were present in both treatments; the percentage of soil covered by residue ranged from 91 to 99 in the no-cover-crop treatment and from 99 to 100 in the hairy vetch treatment. Initial metolachlor concentration was lower and degradation rate higher in two of the 3 yr with a hairy vetch cover crop than without a cover crop. Cover-crop treatment had little effect on atrazine concentration or degradation. Annual grass weeds (predominantly fall panicum) were the major species in this field. Hairy vetch alone reduced grass emergence by 50 to 90%, and preemergence herbicides alone reduced emergence by 72 to 93% compared with the treatment without cover crop and herbicide. The combination of preemergence herbicides with hairy vetch provided only 24 to 61% control of grass weeds compared with control by hairy vetch alone and 23 to 52% compared with control by herbicide alone, suggesting an antagonism probably resulting from reduced metolachlor concentration by hairy vetch residue. Metolachlor with hairy vetch delayed emergence of weeds and reduced the concentration of metolachlor required to prevent emergence initiation compared with metolachlor without a cover crop.
Manures are sources of several human pathogens that can potentially contribute to surface and groundwater contamination. Microorganisms must first be released from the manure matrix before they can infiltrate into and leach through the vadose zone. The objective of this study was to estimate rates of rainfall-induced release of fecal coliform (FC) from surface-applied bovine manure. Simulated rainfall of 7.1 cm h-1 was applied to the surface of 90-cm-long lysimeters filled with the undisturbed stony soil. When the steady state was reached, clumps of manure were placed on the surface. Rainfall was continued for about 5 h after application of manure, and 10-min leachate portions were analyzed for turbidity and FC. The convective-dispersive equation with linear adsorption-exclusion and the first-order removal-regrowth terms was used as a model of the coliform transport in soil. Asymptotic properties of the solution of this equation with the exponentially decreasing boundary concentration were used to infer the release rate constant from the FC breakthrough curves. A value of 0.0054 ± 0.0015 min-1 was found for the FC release rate constant. The regression line of reduced coliform concentrations on reduced turbidity values was not significantly different from the one-to-one line; R2 was 0.807. Assuming that turbidity can be used as a measure of concentration of manure particulates in leachates, we found that average values for the release rate constants were not significantly different for FC and manure particulates. The average velocity of bacteria and manure particulates transport was about seven times larger than the average pore velocity. The proposed technique of estimating FC and manure release rates shows promise for use in further studies needed to elucidate and assess factors affecting release rate.
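A release rate constant of this kind implies simple first-order (exponential) depletion of the coliform pool in the manure. A sketch of that relationship using the reported constant; the elapsed times below are illustrative choices, not measurements from the study:

```python
import math

K_RELEASE = 0.0054  # min^-1, the FC release rate constant from the abstract

def fraction_remaining(t_min, k=K_RELEASE):
    """Fraction of the initial coliform pool still in the manure
    matrix after t minutes of rainfall, assuming first-order release."""
    return math.exp(-k * t_min)

def fraction_released(t_min, k=K_RELEASE):
    """Complementary fraction already washed out of the manure."""
    return 1.0 - fraction_remaining(t_min, k)

# After ~5 h (300 min) of simulated rain, ~80% of the pool is released.
print(round(fraction_released(300), 2))
```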
Rice, P.J., McConnell, L.L., Heighton, L.P., Sadeghi, A., Isensee, A.R., Teasdale, J.R., Abdul-Baki, A., Harman-Fetcho, J., Hapeman, C.J.
Environmental Toxicology and Chemistry (07307268) 21(1), pp. 24-30
Runoff from tomato (Lycopersicon esculentum Mill.) production with polyethylene mulch has been implicated in the failure of commercial shellfish farms in the Mid-Atlantic Region of the United States. Copper, applied in the form of copper hydroxide, is the most widely used fungicide-bactericide for control of tomato diseases and recently has been detected in the Chesapeake Bay (USA) watershed. Elevated levels of copper have been shown to have adverse effects on shellfish, finfish, and other aquatic organisms. This research evaluates the off-site movement of copper with the dissolved phase and the particulate phase of runoff from controlled field plots containing tomato plants grown in either polyethylene mulch or a vegetative mulch, hairy vetch (Vicia villosa Roth.). Overall, runoff collected from polyethylene mulch plots contained significantly (p ≤ 0.05) greater loads of dissolved- and particulate-phase copper than runoff from hairy vetch mulch plots. However, the loss of copper associated with the particulate phase was significantly greater (p ≤ 0.05) than that associated with the dissolved phase of runoff from both mulch treatments, with the particulate phase accounting for more than 80% of the copper loads. The reported toxicity of copper to aquatic organisms and the greater runoff volume, soil loss, and off-site loading of copper measured in runoff from the polyethylene mulch suggests that this management practice is less sustainable and may have a more harmful impact on aquatic ecosystems.
Journal of Environmental Quality (00472425) 30(5), pp. 1808-1821
Current vegetable production systems use polyethylene (plastic) mulch and require multiple applications of agrochemicals. During rain events, runoff from vegetable production is enhanced because 50 to 75% of the field is covered with an impervious surface. This study was conducted to quantify off-site movement of soil and pesticides with runoff from tomato (Lycopersicon esculentum Mill.) plots containing polyethylene mulch and a vegetative mulch, hairy vetch (Vicia villosa Roth). Side-by-side field plots were instrumented with automated flow meters and samplers to measure and collect runoff, which was filtered, extracted, and analyzed to determine soil and pesticide loss. Seasonal losses of two to four times more water and at least three times as much sediment were observed from plots with polyethylene mulch (55.4 to 146 L m-2 and 247 to 535 g m-2, respectively) versus plots with hairy vetch residue (13.7 to 75.7 L m-2 and 32.8 to 118 g m-2, respectively). Geometric means (±standard deviation) of total pesticide loads for chlorothalonil (tetrachloroisophthalonitrile) and α- and β-endosulfan (6,7,8,9,10,10-hexachloro-1,5,5a,6,9,9a-hexahydro-6,9-methano-2,4,3- benzodioxathiepin 3-oxide) for a runoff event were 19, 6, and 9 times greater from polyethylene (800 ± 4.6, 17.6 ± 3.9, and 39.1 ± 4.9 μg m-2, respectively) than from hairy vetch mulch plots (42 ± 6.0, 2.8 ± 5.0, and 4.3 ± 4.6 μg m-2, respectively) due to greater concentrations and larger runoff volumes. The increased runoff volume, soil loss, and off-site loading of pesticides measured in runoff from the polyethylene mulch suggests that this management practice is less sustainable and may have a harmful effect on the environment.
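The pesticide loads above are summarized as geometric means, the usual choice for strongly right-skewed runoff data. A minimal sketch of that statistic; the example loads are hypothetical, not values from the study:

```python
import math

def geometric_mean(values):
    """Geometric mean: the n-th root of the product of n positive
    values, computed via logs for numerical stability."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical per-event chlorothalonil loads (ug m^-2).
loads = [500.0, 900.0, 1200.0]
print(round(geometric_mean(loads), 1))
```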
Shelton, D.R., Sadeghi, A., McCarty, G.W.
Soil Science (0038075X) 165(4), pp. 365-371
Experiments were conducted to assess the effects of soil water content on denitrification during hairy vetch (Vicia villosa) decomposition. Hairy vetch plants were grown from seed to maturity in soil cores. Before and after kill, simulated rainfall was applied to cores weekly and leachate was analyzed for NO3- and NH4+. Denitrification incubations (ca. 48 h duration) were conducted 3, 17, 31, and 45 days after kill using the acetylene block method. Soil water content was varied systematically to give a range of percent water-filled pore space (WFPS) values from field capacity (60%) to saturation (100%). Little denitrification occurred on Day 3 (<2 mg N2O-N). Substantial denitrification occurred on Days 17, 31, and 45, with maxima of 44, 27, and 30 mg N2O-N produced in saturated cores, respectively, accounting for approximately 60 to 75% of the total inorganic N (NO3- + NH4+ + N2O) present in cores. There was an apparent linear relationship between denitrification and soil water content (WFPS), with a threshold for denitrification at ca. 60% WFPS. Cumulative N lost from cores during four denitrification incubations ranged from 1 to 48 kg N ha-1, depending on percent WFPS. Rates of N mineralization were relatively linear after denitrification incubations (55 days). Cumulative N mineralized from unsaturated cores was ca. 190 kg N ha-1 through 120 days after kill. These data indicate that substantial quantities of vetch-N may be lost during decomposition.
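The apparent linear response with a ~60% WFPS threshold can be written as a simple piecewise function. The slope below is an arbitrary placeholder, not a value fitted in the study:

```python
def denitrification_response(wfps, threshold=60.0, slope=1.0):
    """Piecewise-linear denitrification response to water-filled pore
    space (WFPS, %): negligible below the ~60% threshold reported in
    the abstract, increasing linearly toward saturation (100%).
    The slope is a placeholder, not a fitted parameter."""
    if wfps <= threshold:
        return 0.0
    return slope * (wfps - threshold)

print(denitrification_response(55.0))   # below threshold: 0.0
print(denitrification_response(100.0))  # saturated: 40.0 (relative units)
```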
Two long-term no-till corn production studies, representing different soil textures, consistently showed higher leaching of atrazine [2-chloro-4-(ethylamino)-6-(isopropylamino)-s-triazine] to groundwater in a silt loam soil than in a sandy loam soil. A laboratory leaching study was initiated using intact soil cores from the two sites to determine whether the soil texture could account for the observed differences. Six intact soil cores (16 cm dia. by 20 cm high) were collected from four-year-old no-till corn plots at each of the two locations (ca. 25 km apart). All cores were mounted in funnels and the saturated hydraulic conductivity (K(sat)) was measured. Three cores (from each soil texture) with the lowest K(sat) were mixed and repacked. All cores were surface treated with 1.7 kg a.i. ha-1 [ring-14C] atrazine and subjected to simulated rainfall at a constant 12 mm h-1 intensity until nearly three pore volumes of leachate were collected and analyzed for total 14C. On average, nearly 40% more atrazine was leached through the intact silt loam than the sandy loam soil cores. For both the intact and repacked cores, the initial atrazine leaching rates were higher in the silt loam than the sandy loam soils, indicating that macropore flow was a more prominent mechanism for atrazine leaching in the silt loam soil. A predominance of macropore flow in the silt loam soil, possibly due to greater aggregate stability, may account for the observed leaching patterns for both field and laboratory studies.
Journal of Plant Nutrition (01904167) 22(10), pp. 1551-1566
Shallow rooting and susceptibility to drought are believed to be caused, at least in part, by strongly acidic (pH <5.5, 1:1 soil-water), aluminum (Al)-toxic subsoils. However, this hypothesis has not been clearly confirmed under field conditions. The Al toxicity hypothesis was tested on a map unit of Matawan-Hammonton loam (0-2% slope) on unlimed and limed field plots (pH range 5.1 to 5.8) at Beltsville, MD, during 1994 to 1998. Aluminum-tolerant and sensitive pairs of barley (Hordeum vulgare L.), wheat [Triticum aestivum (L.)], snap bean (Phaseolus vulgaris L.), and soybean [Glycine max (L.) Merr.] cultivars were used as indicator plants. Eastern gamagrass [Tripsacum dactyloides (L.) L.], cultivar 'Pete', reported to tolerate both chemical and physical stress factors in soils, was grown for comparison. Shoots of Al-sensitive 'Romano' snap beans showed a significant response to liming of the 0-15 cm surface layer, but those of Al-tolerant 'Dade' did not, indicating that Al toxicity was a growth limiting factor in this acid soil at pH 5.1. Lime responses of the Al-tolerant and sensitive cultivars of barley, wheat, and soybean were in the same direction but not significant at the 5% level. Aluminum-tolerant and sensitive cultivars did not differ in abilities to root in the 15-30 cm soil depth. Only 9 to 25% of total roots were in this layer, and 75 to 91% were in the 0-15 cm zone. No roots were found in the 30-45 cm zone, which had a pH of 4.9. Soil bulk density values of 1.44 and 1.50 g cm-3 in the 15-30 and 30-45 cm zones, respectively, indicated that mechanical impedance was a primary root barrier. Results indicated that restricted shoot growth and shallow rooting of the Al-indicator plants studied in this acid soil were due to a combination of Al toxicity and high soil bulk density. Confounding of the two factors may have masked the expected response of indicator plants to Al. These two growth restricting factors likely occur in many, if not most, acid problem subsoils. Studies are needed to separate these factors and to develop plant genotypes that have tolerance to multiple abiotic stresses. Unlike the Al indicator cultivars, eastern gamagrass showed high tolerance to acid, compact soils in the field and did not respond to lime applications (pH 5.1-5.8).
McCarty, G.W., Shelton, D.R., Sadeghi, A.
Biology and Fertility of Soils (01782762) 30(3), pp. 173-178
There has been concern that the measurement of gas emissions from a soil surface may not accurately reflect gas production within the soil profile. However, there have been few direct assessments of the error associated with the use of surface emissions for estimating gas production within soil profiles at different water contents. To determine the influence of air porosity on the distribution of gases within soil profiles, denitrification assays were performed using soil columns incubated at different water contents to provide air porosities of 18%, 13%, and 0% (equivalent to 62%, 73%, and 100% water-filled pore space, respectively). The soil columns were formed by packing sieved soil into cylinders that could be sealed at the top to form a headspace for the measurement of surface emissions of soil gases. Gas-permeable silicone tubing was placed at three depths (4.5, 9, and 13.5 cm) within each soil core to permit the measurement of gas concentration gradients within the soil core. Assays for denitrification were initiated by the addition of acetylene (5 kPa) to the soil column, and gas samples were taken from both the headspace and the gas-permeable tubing at various times during a 46-h incubation. The results showed that at 18% air porosity, the headspace gases were well equilibrated with pore-space gases, and that gas emissions from the soil could provide good estimates of N2O and CO2 production. At air porosities of 13% and 0%, however, substantial storage of these gases occurred within the soil profiles, and measurements of surface emissions of gas from the soils greatly underestimated gas production. For example, the sole use of N2O emission measurements caused three- to fivefold underestimates of N2O production in soil maintained at 13% air porosity. It was concluded that the confounding influence of soil moisture on gas production and transport in soil greatly limits the use of surface emissions as a reliable indicator of gas production. This is particularly pertinent when assessing processes such as denitrification, in which N gas production is greatly promoted by the conditions that limit O2 influx and concurrently limit N gas efflux.
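The porosity equivalences quoted in the abstract (18%, 13%, and 0% air porosity corresponding to 62%, 73%, and 100% water-filled pore space) follow from the standard relation WFPS = (total porosity − air porosity) / total porosity. A minimal sketch, assuming a total porosity of about 0.475 (back-calculated here; the abstract does not state it):

```python
def wfps(total_porosity: float, air_porosity: float) -> float:
    """Water-filled pore space as a fraction of total pore volume."""
    return (total_porosity - air_porosity) / total_porosity

# An assumed total porosity of ~0.475 reproduces the abstract's pairings.
phi = 0.475
for air in (0.18, 0.13, 0.0):
    print(f"air porosity {air:.0%} -> WFPS {wfps(phi, air):.0%}")
```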
Many of the variables that control transport of agrochemicals and pathogens in the field are difficult to measure because parameters such as slope, soil and plant conditions, and rainfall cannot be adequately controlled in the natural environment. This paper describes the design, construction, operation and performance of a system useful for studying surface transport of agrochemicals and pathogens under controlled slope, rainfall and soil conditions. A turntable is used to support and rotate 4 soil chambers under oscillating dripper units capable of simulating rainfall intensities from 1 to 43 mm h-1. Chambers (35 x 100 x 18 cm i.d.) were constructed with an adjustable height discharge gate to collect runoff and three drains to collect leachate. Height adjustable platforms were constructed to support and elevate the chambers up to 20% slope. The chambers were uniformly packed with 35 to 45 kg of soil (bulk density 1.18-1.27 g cm-3) and initially saturated with two low intensity rain events. The coefficient of variation of the rainfall delivery over a range of 5 to 43 mm h-1 averaged 7.5%. An experiment to determine the variability between chambers in runoff amount and uniformity indicated that at least one runoff-equilibration cycle is needed to obtain steady state conditions for conducting runoff transport evaluations. Another experiment conducted to evaluate atrazine [2-chloro-4-(ethylamino)-6-(isopropylamino)-s-triazine] runoff under simulated crop-residue covered vs bare soil conditions indicated six times more runoff from bare than crop residue covered soil. The system is capable of precise application of simulated rain, the simultaneous collection of runoff and leachate at slopes up to 20% and can be easily modified to meet a wide range of research parameters.
This study was designed to compare rates of herbicide dissipation and leaching in side-by-side microplots that have been under no-till and plow-till practices for various time periods. Microplots were established within eight field plots (0.1 to 0.25 ha) that had been in no-till for 1 or 4 years, resulting in 1-year and 4-year no-till and 1-year and 4-year plow-till treatments. Before application of atrazine, alachlor, and cyanazine, surface crop residues were removed from the no-till treatments to ensure comparable and uniform applications to soil surfaces. Soil samples were collected at 8, 14, 21, and 32 days after application at depth increments of 0 to 1.5, 1.5 to 3, 3 to 5, 5 to 10, and 10 to 20 cm (weed root zone). The leaching rate was slower in the no-till than in the plow-till treatment. Atrazine and cyanazine levels in the top 1.5 cm of soil, relative to the remaining soil profile, were nearly 50% higher in the 4-year no-till than in the 4-year plow-till microplots. The leaching trend of the 4-year plow-till was similar to that of the 1-year treatments for both no-till and plow-till soils. Regardless of the tillage age differences, the concentrations of atrazine and cyanazine were higher in the no-till than in the plow-till microplots for the 8-day sampling, contrary to observations in previous studies where crop residues were present on the no-till system at the time of herbicide application. Dissipation rates for all three herbicides were approximately linear and in the order of cyanazine > atrazine > alachlor. Also, results indicate that the distribution of residues within the soil profile was different for atrazine and cyanazine than for alachlor. This observation is most likely caused by differences in herbicide formulations.
Studies were conducted to determine atrazine sorption (partitioning), bioavailability (soil solution concentrations), and dissipation in the top 0 to 1.5, 1.5 to 3, and 3 to 5 cm of soil as a function of tillage. Paired microplots (plow-till vs no-till) were established in replicated long-term tillage field plots, such that treatments included 4-year plow-till, 4-year no-till, 1-year plow-till, and 1-year no-till. Organic carbon content in the top 0 to 1.5 cm was about 75% greater in 4-year no-till soil than in 4-year plow-till soil; at lower depths, organic carbon contents were consistently lower in 4-year no-till soil. Soil solution concentrations of atrazine in the top 0 to 1.5 cm of soil were approximately twofold lower in 4-year no-till soil than in 4-year plow-till soil (5.4 μg mL-1 vs 10.1 μg mL-1) 8 days after application. This was caused by increased sorption and higher gravimetric moisture contents. Soil solution concentrations of atrazine in the 1.5 to 3- and 3 to 5-cm soil depths were also lower in 4-year no-till soil compared with plow-till soil, apparently as a result of increased leaching in plow-till. Soil solution concentrations for 1-year plow-till and 1-year no-till soils were intermediate. Relative percentages of atrazine recovered in the top 5 cm of soil were comparable with tillage treatments (> 80%) through Day 21, indicating that the bulk of atrazine remained within the zone of weed germination. Rates of dissipation and leaching (0-5 cm) were comparable for plow-till versus no-till soil. These data indicate that atrazine bioavailability is diminished significantly in no-till soils, which may contribute to losses of atrazine efficacy.
Crop residue and living vegetation in no-till fields can intercept large amounts of the pesticides applied at the time of planting. Previous studies have shown that the type of plant tissue intercepting the pesticide can affect the amount washed off. This report compares the washoff characteristics of two cover crops with dead crop residue before and after treatment with burn-down herbicides. Laboratory studies were conducted to determine the effect of the burn-down herbicides paraquat (1,1'-dimethyl-4,4'-bipyridylium dichloride) and glyphosate (N-(phosphonomethyl)glycine) on washoff of atrazine (2-chloro-4-(ethylamino)-6-(isopropylamino)-s-triazine) from ryegrass (Lolium perenne L.) and hairy vetch (Vicia villosa Roth.). Ryegrass and hairy vetch were treated with paraquat and glyphosate, and one or five days later 14C-atrazine was applied. Ryegrass, hairy vetch, and dead crop residue not treated with paraquat or glyphosate were included as controls. One day after application of atrazine, all treatments were subjected to 4.5 to 5 cm of simulated rainfall at 9 mm h-1; leachate was collected and analyzed for atrazine. Atrazine washoff from hairy vetch, ryegrass, and crop residue not treated with glyphosate or paraquat ranged from 29-37%, 43-49%, and 70-75%, respectively, of the amount applied. Paraquat was more effective than glyphosate in increasing the amount of atrazine washoff from both ryegrass and hairy vetch. Washoff increased when the time between application of the burn-down herbicides and atrazine was increased from one to five days, especially for the ryegrass treatments. Results indicate that the availability of herbicides applied to no-till cropping systems may be significantly affected by the type of vegetation and the burn-down herbicide treatment.
Journal of Thoracic and Cardiovascular Surgery (0022-5223) 116(1), pp. 28-35
Objective: A review of our recent experience of operating on infants weighing 2 kg or less who had congenital heart disease was performed to determine the outcome of early surgical repair or palliation. Methods: A retrospective review of hospital records was performed for infants who weighed 2 kg or less and who were identified to have undergone cardiac operation at our institution between January 1992 and June 1997. The data collected included age, weight, gestational age, cardiac diagnosis, surgical procedure, and outcome measures such as length of stay, morbidity, and mortality rate. Outpatient charts were reviewed for follow-up survival and cardiac status. Results: Thirty-three operations were performed on 30 patients. Median age at operation was 19.5 days (1 to 140 days), and median weight was 1.8 kg (1.1 to 2.0 kg). Cardiac diagnoses varied, with coarctation of the aorta and tetralogy of Fallot most common. Twenty-four patients were born at 37 or fewer weeks' gestation. Hospital survival was 83%, with no difference in mortality rates based on age, weight, or type of surgical procedure. Premature infants tended to have worse hospital survival. Median postoperative length of stay was 39 days (6 to 122 days). Median duration of mechanical ventilation in survivors was 6 days (2 to 24 days). Neurologic complications were documented in eight patients. Of the 25 hospital survivors, 20 (80%) are alive with good cardiac status at a mean follow-up of 13 months. Conclusion: Cardiac operations in a selected group of infants weighing 2 kg or less can provide acceptable hospital survival. In most instances, complete repair is possible with good medium-term outcome in the survivors. Investigation into neurologic outcomes in these patients is warranted.
Shelton, D.R., Sadeghi, A., McCarty, G.W., Isensee, A.R.
Soil Science (0038-075X) 162(7), pp. 510-517
A soil core method is described for monitoring rates and extent of N-mineralization and denitrification from intact leguminous cover crops (e.g., hairy vetch) as a function of soil water content. The method also allows for estimates of N-fixation in order to perform N mass balances. Field conditions were simulated by growing cover crops in soil cores from seed to biomass levels comparable to the field, followed by harvest/kill. Soil cores were wetted periodically using a rain simulator. After simulated rain events, samples of leachate were obtained and soil water content was adjusted by application of a vacuum (15 kPa) to the bottom of the cores. The use of a PVC/silica filter (bubble point = 30 kPa) allowed cores to be drained to field capacity without pulling ambient air through the soil. N-fixation (before harvest/kill) and N-mineralization (after harvest/kill) were determined by comparing NO3- leached from vetch cores with that from fallow cores. Denitrification was determined by periodically sealing cores, injecting and recirculating acetylene throughout the cores, and quantifying N2O production after 48 h. Preliminary experiments with hairy vetch (Vicia villosa) indicate that plants are reasonably efficient at taking up soil NO3-, intact roots decompose fairly rapidly in soil (<6 weeks), and there is potential for substantial losses of soil NO3- as a result of denitrification at soil water contents ≥70%.
Herbicide dissipation in soil has been reported extensively using soil cores/columns in the laboratory or in short-term field studies, but long-term persistence and movement under different tillage and year-to-year climatic differences have not been evaluated. We compared the persistence and movement of alachlor [2-chloro-N-(2,6-diethylphenyl)-N-(methoxymethyl)acetamide] and cyanazine [2-{[4-chloro-6-(ethylamino)-1,3,5-triazin-2-yl]amino}-2-methylpropanenitrile] in soil under no-till and conventional-till corn production plots that received equal amounts of herbicides from 1991 to 1994. Four large (two no-till and two conventional-till) field plots, established in 1986 to evaluate pesticide movement to groundwater, were used for this study. The tillage treatments for the respective field plots were reversed before the corn planting in 1993. Thus, the plots were 5- and 6-year-old no-till and conventional-till plots in 1991 and 1992, but only 1- and 2-year-old plots, respectively, in 1993 and 1994. Each year, after herbicide application, alachlor and cyanazine residues were determined at the soil surface, at time zero, and in the upper 50-cm soil profiles at 2, 4, and 8 weeks after application. For both herbicides, time zero recovery was about 90% of the amount applied. Over the 4-year period, the amount of herbicide intercepted by crop residue in the no-till plots ranged from 60 to 70% for alachlor and 43 to 55% for cyanazine. During the first 2 weeks after application, the amount of alachlor and cyanazine on crop residue decreased by an average (over 4 years) of 83 and 75%, respectively. Alachlor persisted in soil about 2 weeks longer than cyanazine, regardless of tillage practice, and overall persistence was nearly two times longer for the conventional-till than for the no-till. For all years, regardless of year-to-year rainfall differences, cyanazine leached deeper in the soil profile than alachlor under no-till, whereas the reverse was true under conventional-till. Yearly comparison of the influence of rainfall patterns on herbicide movement in soil during the first 2 weeks after application showed that the presence of macropores and greater movement of water through soil do not necessarily result in more herbicide leaching.
Laboratory studies were conducted to evaluate the effects of tillage reversal and rainfall on 14C-atrazine (2-chloro-4-ethylamino-6-isopropylamino-s-triazine) leaching patterns. Twelve intact soil cores (16 cm diam. x 20 cm deep) were collected from 8-yr no-till (NT) fields. Half the cores were tilled (5 cm deep) prior to 14C-atrazine treatment (2.7 mg core-1) of all cores. All cores received two rains (27 mm of rain in 1.5 h, one day after application, followed two days later by a 17 mm rain in 2.5 h) and leachate was collected and analyzed for atrazine. These rains simulated the timing, amount, and duration of natural rainfall events from a tillage reversal field study. During the first high-intensity rainfall event, a pulse (2.1 μg L-1) of atrazine leached through tilled cores, while the leaching rate was linear and decreased (1.25 to 0.9 μg L-1) through un-tilled cores. The leaching rate was linear for both the tilled and un-tilled cores during the second rain. Less atrazine was left in the surface 5 cm of tilled soil than un-tilled after the two rains. Results confirmed field observations and suggested that when tillage is reversed on well-structured soils, pesticide leaching may increase relative to un-tilled soil, but these effects are probably confined to the first rain events after application only.
Starr, J.L., Sadeghi, A., Parkin, T.B., Meisinger, J.J.
Journal of Environmental Quality (0047-2425) 25(4), pp. 917-923
The effectiveness of shallow groundwater areas to serve as a sink for NO3 is affected by many biological and physical properties. However, the direct impact of these properties on the fate of NO3 in shallow groundwater is not well understood, especially where the soils are intermittently saturated. This study was conducted to assess in situ reaction and transport of NO3-N in an intermittent shallow groundwater system. Tracer experiments were conducted within an imposed, constantly flowing shallow groundwater. A constant-head, single injection well technique was adapted for this study using multilevel soil water samplers placed at 14 locations around the center injection well. The use of Br as a tracer for NO3-N in these constant-flow experiments provided the means to assess in situ NO3-N removal both with and without added C. In experiments without added C, an average NO3 removal rate of 0.33 g N m-2 d-1 was estimated. In a second experiment, with dextrose as an added C source, an average NO3 loss rate of 1.06 g N m-2 d-1 was observed. The observed response to added dextrose indicates that the N removal processes were primarily microbial in origin, i.e., the NO3 was denitrified or immobilized into microbial biomass.
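The in situ removal estimate above rests on comparing observed NO3-N against what the conservative Br tracer predicts from dilution alone. A minimal sketch with hypothetical concentrations (none of these numbers are from the study):

```python
def nitrate_removed(no3_inj, no3_obs, br_inj, br_obs):
    """NO3-N removed (mg/L), using Br as a conservative tracer.

    The Br recovery fraction tells how much the injected plume was
    diluted; applying that fraction to the injected NO3-N gives the
    concentration expected if NO3 behaved conservatively, and the
    shortfall of the observed value is attributed to removal
    (denitrification or immobilization into biomass)."""
    dilution = br_obs / br_inj
    expected_no3 = no3_inj * dilution
    return expected_no3 - no3_obs

# Hypothetical sampler reading: half the Br recovered, so conservative
# behavior would predict 10 mg/L NO3-N; only 6 mg/L was observed.
removed = nitrate_removed(no3_inj=20.0, no3_obs=6.0, br_inj=40.0, br_obs=20.0)
print(removed)  # 4.0 mg/L attributed to removal
```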
Studies have demonstrated greater pesticide leaching to groundwater under well-established no-till (NT) than under conventional-till (CT). Increased leaching in NT is thought to be caused by preferential transport through macropores. The time required for preferential pathways to develop or dissipate when the tillage of well-established NT and CT plots is reversed is unclear. Therefore, a 3-year field study was conducted to determine the effect of reversing the tillage of 7-year-old NT and CT plots on the leaching of atrazine (6-chloro-N-ethyl-N'-(methylethyl)-1,3,5-triazine-2,4-diamine), alachlor (2-chloro-N-(2,6-diethylphenyl)-N-(methoxymethyl)acetamide), and cyanazine (2-[[4-chloro-6-(ethylamino)-1,3,5-triazine-2-yl]amino]-2-methylpropanenitrile) to groundwater. Groundwater samples were taken monthly before (January 1992 to April 1993) and after (May 1993 to December 1994) tillage reversal and analyzed for the herbicides. Atrazine concentrations in groundwater ranged from 0.15 to 8.9 μg L-1 and 0.07 to 4.9 μg L-1 under NT and CT, respectively, from January 1992 to July 1993 (before tillage reversal to three months after). Concentrations averaged 2.5 times (1.1 to 5) higher under NT than CT at each sampling. Atrazine levels were identical for both tillages from September 1993 until before herbicide application in May 1994. From June through December 1994, atrazine levels were again higher under NT than CT, but differences were smaller than before tillage reversal. Alachlor and cyanazine concentrations were consistently higher under NT than CT for all 3 years and decreased to nondetectable levels within 3 months of application. Results confirm that NT increases herbicide leaching compared with CT and that several years are required for preferential pathways to develop under NT.
Field studies comparing the fate of herbicides under various tillage practices have attributed the observed differences in herbicide leaching to tillage effects. It is not clear whether observed herbicide behavior is caused by tillage practice alone rather than by inherent variations associated with individual tillage treatment plots. Therefore, the objective of this field study was to evaluate the effect of reversing the tillage of 7-year-old no-till (NT) and conventional-till (CT) field plots on the leaching patterns and dissipation of atrazine [6-chloro-N-ethyl-N'-(1-methylethyl)-1,3,5-triazine-2,4-diamine] in soil. No-till and CT field plots, established in 1986, were reversed in 1993 prior to corn planting. Atrazine concentrations were determined in the crop residue and in the top 50 cm of soil from both tillage systems 0, 2, 4, and 8 weeks after application in 1992 (before tillage reversal) and in 1993 and 1994 (after tillage reversal). An average of 1.5 to 2 times more atrazine was recovered in the surface 10 cm of soil under CT than under NT in all 3 years. This difference was attributable to the interception of atrazine by crop residue in the NT plots, regardless of tillage reversal (73, 44, and 58% intercepted in 1992, 1993, and 1994, respectively). The continuation of the trend for more atrazine in the topsoil of CT plots than in NT plots after tillage reversal indicated the importance of crop residue in intercepting the atrazine spray. Differences in atrazine means between NT and CT plots in the 0- to 10-cm soil depth 2 weeks after application were significant only in 1993 and 1994 at the 80% confidence level. This was primarily caused by a decrease in the variability of atrazine residue levels in the new CT plots (after tillage reversal) rather than by an increase in the magnitude of the differences between the means. Tillage probably resulted in both a more homogeneous distribution of organic matter within the topsoil profile and disruption of the macropores in the new CT plots, resulting in a more uniform distribution of atrazine residues.
Journal of Environmental Quality (0047-2425) 22(3), pp. 389-391
To meet the global needs of a growing population, both increased productivity and additional land may need to be dedicated to agriculture. However, to effectively evaluate the impact of new farming strategies and agricultural chemicals on the environment, a broad perspective is needed to prevent simply shifting pollution from one part of the hydrologic cycle to another. The loss of agricultural chemicals to the environment may include a combination of processes such as volatilization, runoff, and leaching, each exhibiting considerable spatial and temporal dependency. Agricultural chemicals lost to the environment may also be transported offsite, having a potentially detrimental effect on the environment. This overview provides a brief introduction to the papers presented at a special USDA-ARS symposium entitled "Agricultural Water Quality Priorities: A Team Approach to Conserving Natural Resources".
Journal of Environmental Quality (0047-2425) 22(1), pp. 162-166
Volatilization of agricultural chemicals is one process whereby chemicals may enter parts of the environment where they were not intended. Starch encapsulation of pesticides has been proposed as a way of modifying pesticide behavior in the soil environment. This study was conducted to assess how starch encapsulation and temperature affect volatilization of atrazine [6-chloro-N-ethyl-N'-(1-methylethyl)-1,3,5-triazine-2,4-diamine] and alachlor [2-chloro-N-(2,6-diethylphenyl)-N-(methoxymethyl)acetamide]. Volatilization was measured using agroecosystem chambers as model systems. Herbicides were applied at rates of 1.7 kg ha-1 for atrazine and 2.8 kg ha-1 for alachlor, as either a commercial formulation or a starch-encapsulated formulation, to the surface of moist soils maintained at temperatures of 15, 25, and 35°C. Air was drawn through the chambers (2.5 m3 min-1) and herbicide in the vapor phase was trapped in polyurethane foam plugs. Volatilization of both herbicides increased as temperature increased. Volatilization of atrazine was less when applied as the starch-encapsulated formulation than as the commercial formulation. After 35 d, cumulative volatilization of atrazine ranged from <1% of that applied as the starch-encapsulated formulation at 15°C to 14% of that applied as the commercial formulation at 35°C. Cumulative volatilization of alachlor was greater when applied as the starch-encapsulated formulation than as the commercial formulation. After 35 d, cumulative volatilization of alachlor ranged from <2% of that applied as either formulation at 15°C to 32% of that applied as the starch-encapsulated formulation at 35°C. Differences in volatilization behavior between these herbicides are likely due to differences in their chemical properties.
Journal of Environmental Quality (0047-2425) 21(3), pp. 464-469
High variability of atrazine (2-chloro-4-ethylamino-6-isopropylamino-1,3,5-triazine) residues in soil and shallow groundwater has been reported under various agricultural management systems. This 2-yr study was conducted to evaluate atrazine residue levels in soil as influenced by no-till (NT) vs. conventional-till (CT) under natural rainfall conditions. Atrazine was applied annually (at 1.34 kg/ha), 1 d after corn (Zea mays L.) planting, to two NT and two CT plots. Atrazine residues within the 0- to 10-cm soil depth of CT plots were higher than in the NT plots, regardless of the difference in rainfall patterns. The higher (ca. 61%) mean atrazine residues in the CT plots over the NT plots in 1988 were most likely related to the rainfall that began 12 h after application. In contrast, in 1987, it rained 3 to 4 d after application and the residues in the CT plots were only 31% higher than in NT. These results indicate that even a subtle difference in temporal rainfall distribution can result in marked spatial variability in the distribution of atrazine.
Soil Science Society of America Journal (0361-5995) 56(2), pp. 600-603
A chamber was designed and used to simulate shallow groundwater flow in the field. The chamber, made of Plexiglas with dimensions of 120 by 60 by 60 cm, was filled to a depth of 30 cm with sand and had a multiport arrangement of 10-mm-diam. holes on a 5 by 5 cm grid on both end walls. As a first approximation, the flow and transport were assumed to be one-dimensional, and a convective-dispersive solute-transport model was applied to the Cl breakthrough data of each of the 50 outlet ports in order to quantify the spatial distribution of the dispersion-coefficient values at the outlet plate. Based on the inconsistency observed between measured and estimated pore-water velocities and dispersion coefficients for each of the 50 outlet ports, it appears that the one-dimensional model is not adequate to characterize transport parameters in this horizontal flow system.
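The breakthrough-curve fitting described above can be sketched with the step-input solution of the one-dimensional convection-dispersion equation, C/C0 = ½·erfc[(x − vt) / (2√(Dt))], evaluated at an outlet port, with the dispersion coefficient D recovered by minimizing the misfit. All numbers below are hypothetical, and only the leading term of the analytical solution is used:

```python
import math

def cde_step(x: float, t: float, v: float, D: float) -> float:
    """Relative concentration C/C0 for a step input to the 1-D
    convection-dispersion equation (first-term approximation)."""
    return 0.5 * math.erfc((x - v * t) / (2.0 * math.sqrt(D * t)))

# Hypothetical breakthrough at a port 100 cm downstream: generate data
# with a known v and D, then recover D by a coarse grid search.
x, v_true, D_true = 100.0, 5.0, 10.0           # cm, cm/h, cm^2/h
times = list(range(4, 40, 2))                  # sampling times, h
obs = [cde_step(x, t, v_true, D_true) for t in times]

best_D, best_err = None, float("inf")
for D in [d / 2 for d in range(2, 60)]:        # trial D: 1.0 .. 29.5 cm^2/h
    err = sum((cde_step(x, t, v_true, D) - c) ** 2 for t, c in zip(times, obs))
    if err < best_err:
        best_D, best_err = D, err
print(best_D)  # recovers 10.0
```

In the study itself, both v and D were fitted per port; fixing v here keeps the sketch to a one-parameter search.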
Soil Science Society of America Journal (0361-5995) 53(1), pp. 15-18
The objective of this study was to modify the parameters in an empirical equation of R.J. Papendick and G.S. Campbell and, if necessary, to develop a new relationship to estimate the value of the molecular diffusion coefficient of urea (Ds) in soils. Laboratory studies were conducted on seven soils in which the clay content ranged from 10 to 51%. Urea concentrations with depth at 48 h following surface application were measured and also computed using numerical techniques, with an initial estimate for Ds instead of computing it using Papendick and Campbell's equation. The Ds was modified incrementally until the difference between computed and measured concentrations was minimized. In all seven soils, good agreement was obtained between measured and computed urea concentrations with depth.
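The incremental fitting of Ds described above can be sketched with a plane-source solution of the diffusion equation standing in for the numerical model: a "measured" 48-h profile is generated with a known Ds and then recovered by stepping a trial value until the misfit is minimized. All numbers are illustrative, not from the study:

```python
import math

def urea_profile(z: float, t: float, Ds: float, M: float = 1.0) -> float:
    """Concentration at depth z (cm) and time t (h) for an instantaneous
    surface application M on a semi-infinite soil column (plane source)."""
    return M / math.sqrt(math.pi * Ds * t) * math.exp(-z * z / (4 * Ds * t))

t = 48.0                                    # h, matching the sampling time
depths = [0.5, 1.0, 2.0, 3.0, 4.0]          # cm
Ds_true = 0.02                              # cm^2/h (hypothetical)
measured = [urea_profile(z, t, Ds_true) for z in depths]

# Incrementally adjust a trial Ds and keep the value minimizing the misfit.
best_Ds, best_sse = None, float("inf")
Ds = 0.005
while Ds <= 0.05:
    sse = sum((urea_profile(z, t, Ds) - m) ** 2 for z, m in zip(depths, measured))
    if sse < best_sse:
        best_Ds, best_sse = Ds, sse
    Ds = round(Ds + 0.001, 3)               # next trial increment
print(best_Ds)
```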
Soil Science Society of America Journal (0361-5995) 52(1), pp. 46-49
Urea fertilizer is often applied at the soil surface, where it hydrolyzes and can form NH3. To quantify the volatilization of NH3, the molecular diffusion of urea into the soil must be described. The diffusion coefficient of urea in soil is related to its diffusion coefficient in water, which varies with temperature. We initially regressed the value of the urea diffusion coefficient in water from the International Critical Tables (ICT) on temperature for the range of 10 to 20°C. Since surface soil temperatures often fall outside this range, additional values for the urea diffusion coefficient were needed. The capillary tube method of Phillips and Ellis was used to measure the diffusion coefficient of urea in water at temperatures ranging from 0 to 50°C. The new regression equation allowed better agreement between actual and simulated urea concentrations in soil.
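The regression described above is an ordinary least-squares fit of the diffusion coefficient in water against temperature. A sketch with made-up (T, D) pairs — the ICT and capillary-tube values are not reproduced in the abstract:

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical (temperature in °C, D_w in 1e-5 cm^2/s) pairs standing in
# for the measured 0-50°C capillary-tube data.
T = [0, 10, 20, 30, 40, 50]
Dw = [0.68, 0.91, 1.18, 1.47, 1.79, 2.13]
slope, intercept = linear_fit(T, Dw)
print(f"D_w = {intercept:.3f} + {slope:.4f} * T")
```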
Keisling, T.C., Gilmour, J.T., Scott, H.D., Sadeghi, A., Baser, R.E.
(111)
The use of tile drains for alleviating soluble salt accumulation on a silt loam soil was investigated during 1984. Although the chemical analyses of the floodwater and tile drainage water were very similar, suggesting that the floodwater was moving to the tile drain, the overall results so far indicate that this is not a feasible solution owing to the lack of significant drainage. Application of DRAINMOD utilizing soil and weather data from Arkansas showed no significant effluent from the tile drains for our experimental site during rice production. This was attributed to the extremely slow saturated hydraulic conductivity values for this particular soil. However, more observations (concerning the operation of the tile field) are needed before it can be concluded that tile drain fields are not a viable solution to the problem.
Sadeghi, A., Hancock, G.D., Waite, W.P., Scott, H.D., Rand, J.A.
Water Resources Research (0043-1397) 20(7), pp. 927-934
Laboratory and field experiments were conducted to investigate the ability of microwave remote sensing systems to detect the moisture status of a silt loam soil exhibiting abrupt changes in moisture content near the surface. Laboratory soil profiles were prepared with a discontinuous moisture boundary in the subsurface. Reflectivity measurements of these profiles were made with a bistatic reflectometer operating over the frequency ranges of 1-2 and 4-8 GHz (wavelength ranges of 30-15 and 7.5-3.75 cm, respectively). These measurements exhibited a well-developed coherent interference pattern in good agreement with a simple two-layer reflectivity model. Field measurements of bare soil surfaces were conducted for initially saturated profiles and continued for extended periods of drying. During drying, coherent interference patterns similar to those observed in the laboratory were detected. These appear to be due to steep moisture gradients occurring between drying layers near the surface. The field results were modeled by a five-segment linear moisture profile with one or two steep segments and a multilayer reflectivity program. Agreement between model and field response over the frequency range was used to estimate the depth of drying layers within the soil. These depths were monitored over the second and third drying cycles. Formation of the drying layers under field conditions appears to be influenced by drying time, tillage, and evaporative demand. In any case, it appears that the coherent effects caused by nonuniform moisture profiles may substantially affect the reflectivity of even rough soil surfaces.
Lee, S., Sadeghi, A., Yeo, I.-Y., McCarty, G.W., Hively, W.D., Lang, M.W., Sharifi, A.
Climate change is expected to exacerbate water quality degradation in the Chesapeake Bay Watershed (CBW). Winter cover crops (WCCs) have been widely implemented in this region due to their high effectiveness at reducing nitrate loads. However, little is known about climate change impacts on the effectiveness of WCCs for reducing nitrate loads. The objective of this study is to assess climate change impacts on WCC nitrate uptake efficiency on the Coastal Plain of the CBW using the Soil and Water Assessment Tool (SWAT) model. We prepared climate change scenarios using General Circulation Models (GCMs) under three greenhouse gas emission scenarios (A1B, A2, and B1). Simulation results showed that WCC biomass increased by ~58% under climate change scenarios, due to climate conditions conducive to WCC growth. Prior to WCC implementation, annual nitrate loads increased by ~43% (5.3 kg N ha-1) under climate change scenarios compared to the baseline scenario. When WCCs were planted, nitrate loads were substantially reduced, and WCC nitrate reduction efficiency increased by ~5% under climate change scenarios relative to the baseline, due to increased WCC biomass. Therefore, the role of WCCs in mitigating nitrate loads should increase in the future given predicted climate change.
Sexton, A.M., Sadeghi, A., Shirmohammadi, A.
7, pp. 5291-5302
Hydrologic and water quality models are very sensitive to input parameter values, especially precipitation input data. With several different sources of precipitation data now available, it is quite difficult to determine which source is most appropriate under various circumstances. We used several sources of rainfall data in this study, including single-gauge rainfall data located outside the watershed boundary and next-generation radar (NEXRAD) rainfall data with different corrections, to examine the impact of such sources on Soil and Water Assessment Tool (SWAT) model streamflow predictions for a 50 km2 watershed located in the coastal plain of Maryland. For a watershed of that size, with an annual average precipitation of 43 inches, at least 3 rain gauges within the watershed would reduce the percentage error in measured average watershed rainfall amounts to less than 23% (for 0.5 inch storm events). The larger the amount of storm rainfall, the less error was associated with its measurement. Model simulation results indicated that the distance and location of the single rain gauge located outside the watershed boundary had a significant impact on simulating the hydrologic and water quality response of the watershed in the temperate region of Maryland. In the absence of a spatially representative network of rain gauges within the watershed, NEXRAD data produced more accurate estimates of streamflow than single-gauge data. This study concludes that one has to be mindful of the source and methods of interpolation of rainfall data for input into hydrologic and water quality models if accurate simulations are desired.
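The gauge-density claim above (three gauges bringing the error in watershed-average rainfall under 23% for 0.5-inch storms) can be illustrated with a Monte Carlo sketch in which point measurements scatter around the true areal depth and are averaged; the 40% point-to-point coefficient of variation assumed below is illustrative, not a figure from the study:

```python
import random

random.seed(1)
true_depth = 12.7      # mm, ~0.5 inch storm
point_cv = 0.40        # assumed spatial CV of point measurements
trials = 20_000        # Monte Carlo replicates

def mean_abs_pct_error(n_gauges: int) -> float:
    """Mean absolute % error of the n-gauge average vs. the true depth."""
    errs = []
    for _ in range(trials):
        est = sum(random.gauss(true_depth, point_cv * true_depth)
                  for _ in range(n_gauges)) / n_gauges
        errs.append(abs(est - true_depth) / true_depth)
    return 100 * sum(errs) / len(errs)

for n in (1, 3, 9):
    print(f"{n} gauge(s): mean |error| = {mean_abs_pct_error(n):.1f}%")
```

The averaging error shrinks roughly as 1/√n, which is why a few well-placed gauges recover most of the accuracy of a dense network.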
Lee, S., Yeo, I.-Y., Sadeghi, A., McCarty, G.W., Lang, M.W., Hively, W.D.
pp. 40-43
Elevated CO2 concentration, temperature, and changes in precipitation patterns driven by climate change are expected to cause significant environmental effects in the Chesapeake Bay Watershed (CBW). Although the potential effects of climate change are widely reported, few studies have been conducted to understand the implications for water quality and the response of agricultural watersheds to climate change. The objective of this study is to quantify changes in hydrological processes and nitrate cycling, as a result of climate variability, using the Soil and Water Assessment Tool (SWAT) model. Specifically, we assessed the performance of winter cover crops (WCC) as a means of reducing nutrient loss under climate change and evaluated their impacts on water quality at the watershed scale. WCC planting has been emphasized as the most cost-effective means for water quality protection and has been widely adopted via federal and state cost-share programs. Climate change data were prepared by modifying current climate data using the mean temperature and precipitation changes predicted for the future period (2070-2099) by four global climate models. In the climate change scenarios, CO2 concentration increased to 850 ppm, and temperature and precipitation increased by 4.5°C and 23%, respectively. Although the temperature increase reduced water and nitrate loads, nitrate loads were found to increase by 40% under baseline land management, and WCC were found to be less effective at reducing nitrate (nitrate increased by 4.6 kg/ha). Therefore, agricultural conservation practices are likely to be even more important in the future, but acreage goals may need to be adjusted to maintain baseline effects.