Human-Amplified Changes in Precipitation-Runoff Patterns in Large River Basins of the Midwestern United States

Hydrol. Earth Syst. Sci. Discuss., doi:10.5194/hess-2016-571, 2016. Manuscript under review for journal Hydrol. Earth Syst. Sci. Published: 11 November 2016. © Author(s) 2016. CC-BY 3.0 License.

Abstract. Complete transformations of land cover from prairie, wetlands, and hardwood forests to homogeneous row crop agriculture scattered with urban centers are thought to have caused profound changes in hydrology in the Upper Midwestern US since the 1800s. Continued intensification of land use and drainage practices, combined with increased precipitation, has caused many Midwest watersheds to exhibit higher streamflows today than in the historical past. While changes in crop type and farming practices have been well documented over the past few decades, changes in artificial surface (ditch) and subsurface (tile) drainage systems have not. This makes it difficult to quantitatively disentangle the effects of climate change and artificial drainage intensification on the observed hydrologic change, often spurring controversial interpretations with significant implications for management actions. In this study, we investigate four large (23,000-69,000 km²) Midwest river basins that span climate and land use gradients to understand how climate and agricultural drainage have influenced basin hydrology over the last 79 years. We use daily, monthly, and annual flow metrics to document streamflow changes and discuss those changes in the context of climate and land use change. While we detect similar timing of precipitation and streamflow changes in each basin, overall the magnitude and significance of precipitation changes are much less than those we detect for streamflows. In the basins with greater than 20% of their area drained by tile and ditches, we observe 2- to 4-fold increases in low flows and 1.5- to 3-fold increases in high and extreme flows. Monthly precipitation has increased slightly for some months in each basin, mostly in fall and winter months (August-March), but total monthly streamflow has increased in all months in the Minnesota (MRB), Illinois (IRB), and Red River of the North (RRB) basins.
The reported streamflow increases in the MRB, IRB, and RRB are large (18%-318%) and should have important implications for channel adjustment and sediment and nutrient transport. Acknowledging both the economic benefits and the apparent detrimental impacts of artificial drainage on river flows, sediments, and nutrients, we question whether any other human activity has comparably altered critical zone activities while remaining largely unregulated and undocumented. We argue that better documentation of existing and future drain tile and ditch installation is greatly needed. These results are consistent with our hypothesis that LULC changes in the MRB, RRB, and IRB necessarily change how precipitation is transformed into streamflow, and that increases in precipitation alone cannot explain changes in streamflow in these basins. Without pervasive LULC impacts in the CRB, flows there have not changed even though precipitation has increased slightly, likely due to increases in soil moisture, groundwater, and/or lake, wetland, and reservoir storage, and/or increases in evapotranspiration. Seasonal changes in storage shown in Fig. 10 suggest that the change in soil moisture, groundwater, and/or lake, wetland, and reservoir storage is negative in the spring and summer (not enough P and ETa to produce observed flows) and positive in the fall (more P and ETa than necessary to produce observed flows, and thus an increase in storage during the fall).


Whether humans, climate or both have caused streamflow change matters for water quality and watershed management
The magnitude and frequency of streamflows strongly influence water quality, sediment and nutrient transport, channel morphology, and habitat conditions of a river channel. While streamflows fluctuate naturally over event to millennial timescales, humans have also altered rainfall-runoff processes in pervasive and profound ways (Vörösmarty et al., 2004). For example, humans have substantially altered the timing and magnitude of evapotranspiration, have dammed, channelized, and leveed waterways, and have installed artificial drainage networks in former wetlands (Boucher et al., 2004; Dumanski et al., 2015; Rockström et al., 2014; Schottler et al., 2014; Vörösmarty et al., 2004). While it is inevitable that wetland removal and artificial drainage will change rainfall-runoff processes, the effects of drainage on the hydrologic cycle may be subtle and difficult to discern, and may manifest differently at different spatial scales and times of year (e.g., Bullock and Acreman, 2003; Foufoula-Georgiou et al., 2016; Irwin and Whiteley, 1983; O'Connell et al., 2007). Systematic increases in streamflows (peak flows, total flow, base flows) are widely reported in the Midwestern US and attributed to changes in land use (including widespread conversion from perennial vegetation, such as grasses, to annual row crops, primarily corn and soybean, and the addition of artificial drainage) and climate (increase in precipitation and earlier snowmelt) (e.g., Foufoula-Georgiou et al., 2015; Frans et al., 2013; Gerbert and Krug, 1996; Juckem et al., 2008; Novotny and Stefan, 2007; Schilling and Libra, 2003; Schottler et al., 2014; Zhang and Schilling, 2006). However, the question remains: how have climate and land use changes affected streamflows at the large (>10⁴ km²) watershed scale, the scale at which many state and federal programs are tasked with monitoring and evaluating water quality?
For waters impaired by sediment under the US Clean Water Act (CWA), the EU Water Framework Directive, and similar regulations in countries around the world, loads often consist of both natural and human-derived sediment sources (Belmont et al., 2011; Gran et al., 2011). Differentiating between these two sources is often very difficult, and yet essential for identifying and achieving water quality standards (Belmont et al., 2014; Trimble and Crosson, 2000; Wilcock, 2009). Sediment sources derived from near or within the channel itself (e.g., bank erosion from channel widening) are particularly sensitive to changes in streamflows (Lauer et al., in review; Schottler et al., 2014). Bank erosion is a significant sediment source in many alluvial rivers, contributing as much as 80% to 96% of the sediment that comprises a river's total sediment load (Belmont et al., 2014; Kronvang et al., 2013; Palmer et al., 2014; Schaffrath et al., 2015; Simon et al., 1996; Stout et al., 2014; Willett et al., 2012). For some agricultural basins, erosion of near-channel sources contributes more fine sediment than does agricultural field erosion (Belmont et al., 2011; Lenhart et al., 2012; Trimble, 1999). For example, in the Le Sueur River, south-central Minnesota, tall river bluffs contribute the greatest proportion (57%) of fine sediment (Belmont et al., 2011; Day et al., 2013). While bluffs have contributed to high sediment loads in the Le Sueur River since long before agricultural impacts, the total contribution of sediment from bluffs today is well above the Holocene average due to systematic increases in streamflows over the last 70 years (Belmont et al., 2011; Gran et al., 2013). This story is not unique to south-central Minnesota, as many basins across the Midwestern corn belt and around the world are experiencing greater runoff, higher sediment and nutrient loads, and accelerated habitat loss compared with the historical past (Blann et al., 2009).

What are the potential impacts of artificial drainage on streamflows in large watersheds?
The United States is the largest producer of corn and soybeans in the world (Boyd and McNevin, 2015; Guanter et al., 2014). Exceptionally high agricultural productivity over the past century and a half required massive conversion of grasslands, wetlands, and forests to agricultural lands (Dahl and Allord, 1996; Dahl, 1990; Marschner, 1974). Although many advances in cropping practices have led to the modern-day prosperity of the corn belt, artificial drainage has played a critical role for agriculture in the Midwestern USA. Today even the most poorly drained lands can grow row crops with the aid of artificial drainage. Throughout this paper "artificial drainage" is used as a general term that refers to both human-installed surface (ditches) and subsurface (tile) drainage. Tile drains and ditch networks are installed to ameliorate waterlogged soils, which are known to limit crop growth (Hillel, 1998; Sullivan et al., 2001; Wuebker et al., 2001). Modern tile drains are composed of corrugated plastic tubing and are typically installed at depths of 1-2 m to control the elevation of the water table below the soil surface (Hillel, 1998).
The economic benefits of artificial drainage are well understood by Midwestern farmers, who have invested heavily in drainage systems to reduce soil moisture in the spring for planting, reduce surface overland flow, and increase crop yields.
Benefits of artificial drainage include increased land value, simpler field geometry for more efficient operation of mechanical equipment, and increased production of first-class crops such as corn and soybeans (Burns, 1954; Fausey et al., 1987; Hewes and Frandson, 1952; Johnston, 2013; McCorvie and Lant, 1993). Installation or enhancement of tile drainage systems often occurs simultaneously with land conversion from wild hay and small grains to soybeans, as Fig. S1 in the Supplement demonstrates (Blann et al., 2009; Burns, 1954; Hewes and Frandson, 1952). And although the conversion of perennial grasses to corn and soybean rotations does not necessarily lead to a reduction in evapotranspiration (ET) over the course of an entire growing season, at least for well drained soils (Hamilton et al., 2015), several studies report a reduction of ET early in the growing season (Hickman et al., 2010; McIsaac et al., 2010; Schottler et al., 2014; Zeri et al., 2013). Thus changes in land cover (and ET) and drainage expansion have been found to alter watershed hydrology and increase mean annual flows (Kibria et al., 2016), base flows (Juckem et al., 2008; Robinson, 1990; Schilling and Libra, 2003), annual peak flows (Dumanski et al., 2015; Magner et al., 2004; Skaggs et al., 1980, 1994), and total flow volumes (Dumanski et al., 2015; Frans et al., 2013; Lenhart et al., 2011).
While it seems inevitable that altering ET and subsurface drainage efficiency should have measurable effects on streamflow, the combined effects have proven difficult to isolate empirically, especially across scales, due to measurement uncertainties, high temporal and spatial variability in antecedent moisture conditions and runoff processes, a shift toward a wetter climate relative to the historical past, and limited documentation of artificial drainage installation in the US.
A thorough review of artificial drainage by Robinson (1990) points out that the debate over whether subsurface drains increase peak flows (by draining soil water faster than subsurface groundwater flow) or decrease peak flows (by lowering the water table between storms, thus increasing the soil water capacity) is centuries old and is an over-simplified view of the effects of artificial drainage on streamflows. The effects on downstream river flows are complex and often highly sensitive to basin scale, weather patterns, watershed characteristics (slope, soil hydraulic conductivity, water table properties, land-cover change), and drainage patterns (depth of drainage, horizontal spacing of tiles, diameter of drains). Progress in understanding the effects of agricultural drainage on watershed hydrology has also been hindered by the fact that accurate data regarding the location, size, depth, efficiency, and connectivity of sub-surface drainage systems are rarely available.
In this paper we couple analysis of historical patterns in large (>10⁴ km²) river basin hydrology in the Midwestern USA with historical climate and land use data to identify how each of these factors has influenced streamflow patterns.
Specifically, we address the following questions: (1) how have land use, climate, and streamflows changed during the 20th and 21st centuries; (2) what are the time scales and times of year at which changes are most prominent; and (3) can changes in climate alone explain changes in streamflow? We hypothesize that in the most intensively managed agricultural basins, climate alone cannot explain streamflow patterns, and that land use changes in the Midwestern USA have amplified the expected hydrologic change associated with climate. We test this hypothesis in four large river basins with different histories and climates using a suite of quantitative methods that test the statistical significance of changes in streamflow and precipitation at multiple time scales. Finally, we present a water budget for each basin to evaluate how poorly constrained components of the water cycle would need to have changed if climate alone were to explain the observed flows.
We acknowledge that the conversion of precipitation to streamflow occurs by a complex suite of physical processes.
Inevitably, we lack the temporal and spatial coverage and resolution of all relevant hydrologic fluxes (e.g., groundwater, actual evapotranspiration, infiltration, soil water flux rates) needed to characterize the system completely, and we have limited ability to ascribe subtle changes to any given physical process, especially at large scales. Yet, with increasing concerns about water quality and aquatic biota, disentangling the effects of artificial drainage and changing precipitation patterns is important for evaluating economic costs, benefits, and risks, predicting the effects of future land and water management, and informing future policy.

Study areas: large river basins of the Midwest with varying degrees of climate and land use change
We analyze hydrologic and land use change in four large Midwestern watersheds during 1935-2013. We selected these basins because all are agricultural to various degrees, primarily producing corn and soybeans, were affected by continental glaciation, are located mainly within the Central Lowland physiographic province, and are characterized by a humid, temperate climate (Kottek et al., 2006). All four basins also contain waters impaired for excessive sediment under the US Clean Water Act. Therefore, deconvolving the climate and land use effects on basin hydrology is essential for developing and attaining sediment- and nutrient-related water quality standards. Despite the broad similarities between basins, we have intentionally selected watersheds that span a gradient of climate and land use change. We discuss the unique properties and histories of each basin below. From northwest to southeast, these include: the Red River of the North basin, upstream of Grand Forks, ND (67,005 km²); the Minnesota River basin, upstream of Jordan, MN (42,162 km²); the Chippewa River basin, upstream of Durand, WI (23,444 km²); and the Illinois River basin, upstream of Valley City, IL (69,268 km²) (Fig. 1).

Red River of the North Basin (RRB)
Roughly 45% of the Red River of the North basin (RRB) is cultivated for corn and soybeans (Fig. 1). In the heart of the RRB is the exceptionally flat and fertile Red River Valley, which was formerly the lakebed of Glacial Lake Agassiz. To the south and east, the valley is flanked by a large moraine. Further north, the basin is characterized by depressional peat bogs and wetlands. To the northwest, the Red River Valley climbs to the Pembina Escarpment, which separates the valley from a glacial drift plain that is characterized by closed basins, surface depressions, and wetlands (Rosenberg et al., 2005).
Overall the basin is characterized by fine textured soils and little relief.
The Red River of the North basin is the coldest and driest of the four study basins, although the last two decades (the 1990s and 2000s) have been the wettest in the historical record in the RRB. Precipitation records, lake level elevations, and paleoclimate studies indicate that the basin is prone to extreme climate variability (Fritz et al., 2000; Miller and Frink, 1984). On average the basin receives 589 mm of precipitation annually (PRISM annual long-term mean, 1981-2010), with 67% of precipitation falling in the spring (MAM) and summer (JJA) months (Fig. S2).
In general, surface and subsurface drainage projects began during initial agricultural settlement in the late 1800s and early 1900s, when enormous tracts of wetlands and tall grass prairie (millions of acres) were levelled and drained mainly by surface ditches and canals (Miller and Frink, 1984). Most of these initial projects were relatively ineffective (Miller and Frink, 1984). A series of severe floods in the Red River Valley, including in 1997, 2009, and 2011, and an increase in agricultural production driven by increased commodity prices and demand for biofuel over the past 20 years, have caused a resurgence of ditch and tile installation at unprecedented rates (Blann et al., 2009). Within the Bois de Sioux watershed (a sub-basin of the RRB where permits are required for drain tile installation), reported drain tile installation has increased from 3 miles in 1999 to 1,924 miles in 2015, for a total of 15,102 miles of new tile installed since 1999 (Bois de Sioux Watershed District, 2015).
In addition to agricultural farm drainage (ditches and tile), there are a handful of dams in the basin, including Red Lake Dam, Bald Hill Dam (Lake Ashtabula), Browns Valley Dam (Lake Traverse), and Orwell Dam (Stoner et al., 1993).

Minnesota River Basin (MRB)
The Minnesota River basin (MRB), a sub-basin of the Upper Mississippi River basin, is distinctive because over the last 13,000 years the tributaries to the Minnesota River have been rapidly incising through consolidated Pleistocene glacial sediments, with a steep knickzone present in the lowest 30-60 km of each tributary (Belmont et al., 2011; Gran et al., 2013). Work from Day et al. (2013) and Belmont et al. (2011) suggests that suspended sediment loads in tributary knickzones may be exacerbated by high river flows. Drastic sediment reduction measures would be needed to achieve the 80-90% sediment load reduction proposed for the Minnesota River to meet the current total maximum daily load (TMDL) target (Minnesota Pollution Control Agency, 2014).
Throughout much of the rest of the MRB, Pleistocene glaciations have left very little relief aside from the depressional areas, many of which were wetlands prior to widespread Euro-American settlement. Almost the entire river basin consists of poorly drained mollisol soils, with a very small area consisting of alfisols and entisols (Stark et al., 1996).
Artificial drainage has played an important role in the settlement and agricultural development of the basin (Burns, 1954).
Today the MRB contains the greatest percent area of corn and soybeans of the four study basins, comprising 78% of the watershed (Fig. 1). Novotny and Stefan (2007) found increased baseflows and peak flows during the latter part of the 20th century in the MRB. Hydrologic change has been attributed to both changes in land use (widespread conversion from perennial vegetation, such as grasses, to annual row crops, primarily corn and soybean, and the addition of artificial drainage) and climate (increase in precipitation) (Frans et al., 2013; Schottler et al., 2014). With respect to climate, the MRB, much like the RRB, is uniquely situated at a "climatic triple junction" where warm moist air from the Gulf of Mexico, cold dry air from the Arctic, and dry Pacific air dominate at different times of the year and have varied in relative dominance in the past (Dean and Schwalb, 2000; Fritz et al., 2000). The MRB receives 716 mm of precipitation annually, with 68% of precipitation falling in the spring (MAM) and summer (JJA) months, and is characterized by cold, dry winters (Fig. S2). In south-central Minnesota, Schottler et al. (2014) found an increase in September-October precipitation between 1940-1974 and the post-1975 period in 8 of 11 watersheds studied in the MRB. However, none showed significant increases in May-June precipitation.

Chippewa River Basin (CRB)
The 23,444 km² Chippewa River basin (CRB), originating in the Superior Upland and terminating in the Central Lowland physiographic province, is the smallest of the four major river basins. Only 12% of the basin is currently used for corn and soybean farming, as much of the northern part of the basin is characterized by forests, swamps, marshes, and small lakes (Fig. 1). The CRB is also characterized by rolling topography, with glacial deposits of irregular thickness draped over crystalline and sandstone bedrock (Smith, 1906). The dominant soil orders are alfisols and spodosols. The lower Chippewa River, especially in the Driftless Area downstream of Chippewa Falls and Eau Claire, flows through an incised valley, approximately 30 m deep, filled with sand and gravel outwash deposits (Barnes, 1997).
Despite widespread logging in the late 19th and early 20th centuries, much of the basin today is forested. Probably the most notable watershed alteration during the 20th century was hydropower development. The CRB has over 125 dams, of which 37 are for hydropower (Delong, 2005; Martin, 1965), and 33 are large dams, meaning they either exceed 50 feet in height, have a normal storage capacity of 5,000 acre-feet or more, or have a maximum storage capacity of 25,000 acre-feet or more (Hyden, 2010). The first hydropower dam on the Chippewa was constructed in 1878, and by 1950 there were a total of eight dams on the mainstem Chippewa River, which have heavily regulated streamflow in the basin (Barnes, 1997).
The CRB is closer to Lake Superior and Lake Michigan than the other basins, and thus temperature and humidity in the basin are more strongly influenced by the Great Lakes. The CRB receives 822 mm of precipitation annually, with 62% of precipitation falling in the spring (MAM) and summer (JJA) months, and is characterized by cold, dry winters (Fig. S2). Relatively little previous research has focused on Chippewa River basin hydrology. In the nearby Kickapoo River watershed (Wisconsin Driftless Area), Juckem et al. (2008) found an increase in river baseflow near 1970, which coincided with changes in precipitation but could not be explained by precipitation changes alone. In the Hay River watershed, a tributary to the Chippewa, low flows have increased but peak flows remain unchanged, though in nearby Driftless Area streams (adjacent to the CRB) low flows have increased while peak flows have simultaneously decreased (Gerbert and Krug, 1996).

Illinois River Basin (IRB)
The Illinois River basin (IRB) is the largest of the four river basins (69,268 km²) and differs from the other basins in several ways. First, the IRB is the only basin to contain a major metropolitan city, Chicago, though it is still largely an agricultural river basin, with 60% of the basin classified as corn and soybean land use (Fig. 1). Urban impacts on watershed hydrology most likely differ from the agricultural impacts discussed in this paper. Based on 1995 surface water use in the upper Illinois River Basin, daily withdrawals were approximately 2% of the mean annual flow (12,600 cfs) (Arnold et al., 1999; Groschen et al., 2000). Though historical and present water withdrawals are largely unknown, increased water use for industry, agriculture, and public drinking supply may offset some of the climate impacts of increased precipitation. Urban and suburban detention basins may also limit how much precipitation is converted to runoff.
The headwaters of the Illinois River drain Lake Michigan via a canal, which was completed in 1848 and has played a role in regulating river baseflows (Lian et al., 2012). By 1915 the river was almost entirely lined by floodplain levees installed by the Army Corps of Engineers. The Illinois River has an extensive system of locks and dams to facilitate navigation, all seven of which were completed in the 1930s (Lian et al., 2012), and at least five of which regulate modern river flows (Groschen et al., 2000). These river structures influence water storage, conveyance, and evaporation within the Illinois River Valley, and thus the historical streamflow record.
The Illinois River Basin is part of the Till Plains and Great Lakes sections of the Central Lowlands physiographic province and is an exceptionally flat landscape in which glacial features, such as Wisconsinan moraines and knob-and-kettle terrain (typically less than 100 ft of relief), and the Illinois River valley (100-400 ft of relief) create the only noteworthy topography (Arnold et al., 1999; Groschen et al., 2000). IRB soils are dominated by mollisols, containing around 1% organic matter and generally of low to very low permeability, with some presence of more permeable alfisols and entisols (Arnold et al., 1999; Groschen et al., 2000).
The Illinois River Basin receives 960 mm of precipitation annually, with 59% of precipitation falling in the spring (MAM) and summer (JJA) months, and is characterized by relatively cold, dry winters (Fig. S2). There is a precipitation gradient across the basin, with the southwest portion of the basin generally receiving more precipitation than the northeast section in all months.
Widespread European settlement and modification of the Illinois River Basin predate those of the other three basins, beginning as early as the 1820s and 1830s, when construction began on a canal to connect Lake Michigan to the Illinois River (Lian et al., 2012). However, for much of the basin in these early settlement years, permanent occupation was difficult without the aid of artificial drainage: wetlands and swamplands made travel slow and farming nearly impossible (except for pasturing in the dry season), and often posed health risks, including malaria and cholera (Beauchamp, 1987).
State legislation passed in 1878 created organized drainage districts that installed ditches and tile to drain many permanently or seasonally wet areas, creating more arable land (McCorvie and Lant, 1993). Between 1850 and 1930, Illinois lost an estimated 90% of its wetlands (McCorvie and Lant, 1993), while during this same time Minnesota and Wisconsin lost approximately 53% and 32% of their wetlands, respectively. Therefore most of the IRB was drained prior to stream gage records, though the efficiency, intensity, and density of drainage have likely increased (Arnold et al., 1999).

Data and Methods: land use/land cover, climate, and streamflow
The statistical analyses and water budgets presented herein rely upon USGS gage records of daily streamflow (Q), total monthly PRISM precipitation (P), and average monthly actual evapotranspiration (ETa) from Livneh et al. (2013). We have also compiled decadal surveys of artificial drainage from US Census reports and annual crop surveys from the National Agricultural Statistics Service to understand the histories of land use/land cover (LULC) change in each of the basins.

LULC: county-level NASS data for surface and subsurface artificial drainage and crop types
We compiled all available information regarding agricultural drainage practices for each of the four study basins.
We used county-level US Census of Agriculture drainage data from 1940, 1950, 1960, 1978, and 2012 (U.S. Bureau of the Census, 1942, 1952, 1961, 1981; U.S. Department of Agriculture, 2014a). Tabulations of land in drainage enterprises in all years (except 1940) do not include lands where surface ditches and subsurface tiles drain less than 500 acres (U.S. Bureau of the Census, 1922, 1952). In 1940 and 2012, the types of drainage enterprises (ditches and tile) were reported individually.

We summed the number of drained acres within each county across all counties within each watershed, weighting partial counties by area. To normalize the land area across basins of different sizes, we report the percentage of watershed area drained. While the uncertainties in these data are high and their resolution is coarse (county level), they are the best data available and can be used as a proxy for the relative drainage extent and expansion in each of the four large study basins (the smallest of which is still larger than 20 counties).
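The county-to-watershed aggregation described above can be sketched as follows (a minimal example; the function name and the county areas and drained fractions are hypothetical, not values from the census data):

```python
def percent_watershed_drained(counties, watershed_area_km2):
    """Aggregate county-level drained area to a watershed percentage.

    counties: list of dicts with keys:
      'drained_km2'       - drained area reported for the county
      'frac_in_watershed' - fraction of the county's area inside the watershed
    Partial counties are weighted by the fraction of their area that
    falls inside the watershed boundary.
    """
    drained = sum(c["drained_km2"] * c["frac_in_watershed"] for c in counties)
    return 100.0 * drained / watershed_area_km2

# Hypothetical example: three counties, one only half inside the basin
counties = [
    {"drained_km2": 400.0, "frac_in_watershed": 1.0},
    {"drained_km2": 250.0, "frac_in_watershed": 1.0},
    {"drained_km2": 300.0, "frac_in_watershed": 0.5},
]
print(percent_watershed_drained(counties, watershed_area_km2=4000.0))  # 20.0
```

Reporting a percentage rather than an absolute area is what makes basins of very different sizes directly comparable.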
Although county-level agricultural census drainage data are arguably the most complete artificial drainage inventory across the United States, these data are only available for five census years. Therefore, we also compiled annual crop survey data, since installation or enhancement of tile drainage often coincides with the conversion from wild hay and small grains to soybeans (Burns, 1954; Hewes and Frandson, 1952). We use these annual crop data as another indication of LULC changes, identifying land cover transitions (LCTs) as the year in which soybean acreage exceeds that of hay and small grains (Foufoula-Georgiou et al., 2015).
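The LCT criterion can be expressed as a short function (a sketch under the stated crossover definition; the acreage series below are hypothetical, not survey values):

```python
def land_cover_transition_year(years, soy_acres, hay_small_grain_acres):
    """Return the first year in which soybean acreage exceeds the combined
    acreage of hay and small grains, or None if no transition occurs."""
    for yr, soy, hsg in zip(years, soy_acres, hay_small_grain_acres):
        if soy > hsg:
            return yr
    return None

# Hypothetical basin-aggregated acreages (thousands of acres)
years = [1940, 1950, 1960, 1970, 1980]
soy   = [ 10,   40,   90,  160,  220]
hsg   = [200,  180,  120,  110,   60]
print(land_cover_transition_year(years, soy, hsg))  # 1970
```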

Climate: PRISM precipitation and modelled actual evapotranspiration
We compiled basin-averaged monthly PRISM (Parameter-elevation Regressions on Independent Slopes Model) precipitation depths for each watershed for 1935-2013 (PRISM Climate Group, 2004) using ArcGIS ModelBuilder (precip_model.tbx, available upon request), calculating spatially averaged monthly precipitation totals for each basin from 4 km grid precipitation rasters (Table 1). We report total monthly and annual precipitation as an average depth (cm month⁻¹) and volume (km³ month⁻¹) for each watershed.
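The depth and volume calculation can be sketched outside ArcGIS as follows (a synthetic grid and basin mask stand in for the PRISM rasters; uniform cell area is an assumption, and the function name is ours):

```python
import numpy as np

def basin_precip(precip_mm, basin_mask, cell_area_km2):
    """Spatially averaged monthly precipitation depth (cm) and volume (km^3)
    for one basin, from a gridded monthly precipitation total.

    precip_mm    : 2-D array of monthly precipitation totals (mm)
    basin_mask   : boolean array, True for cells inside the basin
    cell_area_km2: area of one grid cell (km^2), assumed uniform
    """
    depth_cm = precip_mm[basin_mask].mean() / 10.0  # mm -> cm
    # volume: summed depth (mm -> km) times the area of one cell
    volume_km3 = precip_mm[basin_mask].sum() * 1e-6 * cell_area_km2
    return depth_cm, volume_km3

# Synthetic 4 km grid (cell area 16 km^2): uniform 80 mm over 12 basin cells
precip = np.full((4, 4), 80.0)
mask = np.zeros((4, 4), dtype=bool)
mask[:3, :] = True
depth, volume = basin_precip(precip, mask, cell_area_km2=16.0)
print(depth)   # 8.0 (cm)
print(volume)  # ~0.01536 (km^3)
```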
We used the same approach to compile average monthly evapotranspiration data (et.mon.mean.nc) for 1935-2011, produced by Livneh et al. (2013), after converting NetCDF4 data into monthly rasters using the ArcGIS multidimensional toolbox and the NetCDF_time_slice_export tool. These modeled actual evapotranspiration (ETa) data were produced as a gridded product for the continental United States using the Variable Infiltration Capacity (VIC) model run at 3-hr time steps in energy balance mode (Livneh et al., 2013), consistent with the methods of Maurer et al. (2002). Hereafter we refer to Livneh et al. (2013) and Maurer et al. (2002) as L13 and M02. Estimates of VIC-modeled ETa produced at 3-hr time steps in energy balance mode are more conservative than estimates produced at daily time steps or using water balance mode (Haddeland, 2006). We have chosen these data over other available estimates of evapotranspiration because they cover the large spatial (the contiguous US) and temporal domain necessary for the study, at reasonable spatial (1/16°) and temporal (monthly) resolution, unlike other global and North American reanalysis products such as ERA-Interim (data available from 1979-2013 at 0.7°) and NARR (data available from 1979-2015 at 0.3°). Although the VIC model precipitation input used to generate the ETa data was not from PRISM, Livneh et al. (2013) gridded NCDC COOP station data and state that "gridded precipitation values were subsequently scaled on a monthly basis so as to match the long-term mean from the Parameter-elevation Regressions on Independent Slopes Model (PRISM; Daly et al. 1994); for consistency with M02, a 1961-90 PRISM climatology was used". Additionally, we directly compared monthly precipitation from L13 and PRISM (1935-2011) and found that for each of the four study basins the mean error was only 1% (Fig. S3).
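A comparison of this kind reduces to a mean relative error between two monthly series; one plausible formulation (the exact error metric used in the paper is not specified, and the series below are hypothetical) is:

```python
def mean_relative_error(a, b):
    """Mean of the month-by-month relative differences |a - b| / b,
    expressed as a percent."""
    return 100.0 * sum(abs(x - y) / y for x, y in zip(a, b)) / len(a)

# Hypothetical monthly precipitation depths (cm) from two gridded products
l13   = [5.1, 6.0, 7.9, 4.04]
prism = [5.0, 6.0, 8.0, 4.0]
print(round(mean_relative_error(l13, prism), 2))  # 1.06
```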

USGS mean daily streamflow gage data
We followed a multiscale approach to analyze streamflow change across each of the four river basins, evaluating annual, monthly (seasonal), and daily flow metrics.
Using multiple gages for a single basin, we compiled seven annual flow metrics: mean annual flow, 7-day low flow winter (November-April), 7-day low flow summer (May-October), peak flow spring (March-May), peak flow summer & fall (June-November), high flow days, and extreme flow days, using mean daily flow data from USGS gages within each basin (Fig. S2; Table 1), following the methods of Novotny and Stefan (2007). For each gage, we normalized the annual flow metric by the 1950-2010 mean to facilitate comparisons among basins and to observe similarities in trends among metrics.
The number of gages included in the flow metric calculations was 22 for the RRB, 12 for the MRB, 9 for the CRB, and 20 for the IRB (Table 1). Each gage record included a minimum of 62 years, and of the 63 gages analysed, 53 had continuous records.
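As an illustration of the annual-metric workflow, one of the seven metrics (the 7-day low flow, normalized by a long-term mean) can be sketched as follows; the synthetic daily flows are placeholders for the USGS records, and the normalization here uses the full synthetic record rather than 1950-2010:

```python
import numpy as np

def seven_day_low_flow(daily_q):
    """Minimum 7-day moving average of mean daily flow."""
    kernel = np.ones(7) / 7.0
    return np.convolve(daily_q, kernel, mode="valid").min()

# Hypothetical record: one value per day for ten 365-day "years"
rng = np.random.default_rng(0)
years = [rng.gamma(2.0, 50.0, size=365) for _ in range(10)]

metric = np.array([seven_day_low_flow(y) for y in years])

# Normalize each annual value by the long-term mean (1950-2010 in the
# paper; here simply the full record) to compare trends across gages
normalized = metric / metric.mean()
```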
For the outlet gage in each basin (Table 1) we computed monthly and annual streamflow average depths (cm month⁻¹) and volumes (km³ month⁻¹) for 1935-2013 for the MRB, RRB, and CRB, and 1939-2013 for the IRB due to missing gage data prior to 1939. We also calculated daily streamflow change exceedance probabilities, where dQ/dt>0 characterizes the rising limbs of daily hydrographs and dQ/dt<0 the falling limbs.
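The daily-change exceedance analysis can be sketched as below; the random-walk flow series is a hypothetical stand-in for a mean daily flow record:

```python
import numpy as np

def exceedance_curve(values):
    """Return sorted values and their empirical exceedance probabilities."""
    v = np.sort(values)[::-1]                    # largest first
    p = (np.arange(len(v)) + 1) / (len(v) + 1)   # Weibull plotting position
    return v, p

# Hypothetical mean daily flow record
rng = np.random.default_rng(1)
q = np.cumsum(rng.normal(0, 5, size=1000)) + 500.0

dq = np.diff(q)                  # day-to-day change, dQ/dt
rising = dq[dq > 0]              # rising-limb slopes
falling = -dq[dq < 0]            # falling-limb slope magnitudes

v_rise, p_rise = exceedance_curve(rising)
```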

Frequency-magnitude analysis of precipitation and streamflow
A common method for detecting and quantifying changes in the magnitude/frequency content of a time series is a localized time-frequency analysis using wavelets. The Continuous Wavelet Transform (CWT) of a signal x(t) is defined as the convolution of the signal with scaled and translated versions of a mother wavelet \psi(t):

T(a, b) = \frac{1}{\sqrt{a}} \int_{-\infty}^{\infty} x(t)\, \psi^{*}\!\left(\frac{t - b}{a}\right) dt,    (1)

where \psi\left(\frac{t - b}{a}\right) is the mother wavelet scaled by parameter a and translated by parameter b, and * denotes the complex conjugate. The mother wavelet is a function of compact support or sufficiently fast decay (to allow time-localized analysis) and has zero area under the curve, making it a band-pass filter (e.g., Daubechies, 1992). By changing a (i.e., dilating or contracting the mother wavelet by a scale factor) and changing b (centering the wavelet at different locations along the time axis), the CWT quantifies the localized energy or variance of a signal at different times and scales (frequencies). To every scale there corresponds a frequency, assigned as the central frequency of the Fourier transform of the wavelet at that scale; this relationship is analytically computable for a given mother wavelet. In this paper, we use the Morlet wavelet (Daubechies, 1992; Seuront and Strutton, 2003), which has proven effective for analyzing climate signals such as El Niño, streamflow, and precipitation, among others (e.g., Anctil and Coulibaly, 2004; Foufoula-Georgiou et al., 2015; Labat et al., 2001; Torrence and Compo, 1998 and the references therein). The Morlet wavelet is given as:

\psi(t) = \pi^{-1/4} \left( e^{i 2\pi \nu_0 t} - e^{-(2\pi \nu_0)^2 / 2} \right) e^{-t^2 / 2},    (2)

which is simply a complex wave with a Gaussian envelope, where \nu_0 is the central frequency of the mother wavelet. In practice, when \nu_0 \gg 0, the second term in the parenthesis of Eq. (2) becomes negligible and the Morlet wavelet simplifies to:

\psi(t) = \pi^{-1/4}\, e^{i 2\pi \nu_0 t}\, e^{-t^2 / 2}.    (3)

Typically \nu_0 is set equal to 0.849, as this choice offers the best time-frequency localization. As we increase the time localization of the wavelet (by reducing the size of its compact support) we decrease its frequency localization (since its Fourier transform becomes broader); however, the product of the uncertainties in the time and frequency localizations is constrained by a lower bound (Heisenberg's uncertainty principle). The Morlet wavelet with \nu_0 = 0.849 achieves this bound, giving the best compromise between time and frequency localizations. We used the CWT to detect changes in the monthly precipitation and streamflow time series data.
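A minimal numpy sketch of Eqs. (1) and (3), implementing the CWT by direct convolution on a synthetic monthly series with a known annual cycle (all inputs below are illustrative, not the study data):

```python
import numpy as np

def morlet(t, nu0=0.849):
    # Simplified Morlet wavelet, Eq. (3): complex wave with Gaussian envelope
    return np.pi ** -0.25 * np.exp(2j * np.pi * nu0 * t) * np.exp(-t ** 2 / 2)

def cwt(x, scales, nu0=0.849):
    """CWT, Eq. (1), by convolution with the scaled, conjugated wavelet."""
    n = len(x)
    t = np.arange(n) - n // 2                      # centered time axis
    out = np.empty((len(scales), n), dtype=complex)
    for i, a in enumerate(scales):
        psi = morlet(t / a, nu0) / np.sqrt(a)      # wavelet at scale a
        out[i] = np.convolve(x, np.conj(psi[::-1]), mode="same")
    return out

# Synthetic "monthly" series with an annual (12-sample) cycle
t = np.arange(480)
x = np.sin(2 * np.pi * t / 12)
scales = np.arange(2, 40)
power = np.abs(cwt(x, scales)) ** 2                # spectrogram |T(a,b)|^2
mean_power = power[:, 100:380].mean(axis=1)        # avoid edge effects
peak_scale = scales[np.argmax(mean_power)]         # near nu0 * 12, i.e. ~10
```

Since the wavelet's central frequency at scale a is roughly nu0/a, the power of a 12-sample cycle should peak near scale 10, which is what the spectrograms in this study resolve for annual streamflow energy.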

Statistical tests
We performed one-tailed Student's t-tests (and Wilcoxon rank-sum tests where the data did not meet parametric assumptions after testing log, square root, and arcsine transformations) and Kolmogorov-Smirnov (KS) tests using the statistical program R to analyze changes in the mean and distribution of annual and monthly total flow (Q at the basin outlet) and precipitation (P) volumes between the defined pre-period and post-period (R Core Team, 2013). We test the hypothesis that mean monthly water volumes have increased and their distributions have shifted right during the post-period. We selected 1974/75 as one breakpoint for the pre-period and post-period because it splits the time series into two roughly equal periods (40/39 years), coincides with the timing of widespread acceptance of cheaper and easier-to-install corrugated plastic tile (Fouss and Reeve, 1987), and other studies in the MRB and IRB have identified hydrologic change occurring around that time (e.g., Foufoula-Georgiou et al., 2015; Lian et al., 2012; Schottler et al., 2014). However, we acknowledge that 1974/75 may not be the hydrologically relevant breakpoint in all basins at this large scale. Therefore, we ran all statistical tests using the 1974/1975, LCT, and hydrologic breakpoints, the latter identified by fitting a piecewise linear regression (PwLR) using a least-squares approach to the monthly streamflow and precipitation volume time series in each basin (Liu et al., 2010; Tomé and Miranda, 2004; Verbesselt et al., 2010; Zeileis et al., 2003). We selected an alpha value of 0.05 (95% confidence level) for all statistical tests performed (286 t-tests and 286 KS tests using the annual and monthly P and Q data for each basin, plus 28 t-tests on the streamflow metrics described in Sect. 3.3, for a total of 600 statistical tests).
(Hydrol. Earth Syst. Sci. Discuss., doi:10.5194/hess-2016-571, 2016; manuscript under review for journal Hydrol. Earth Syst. Sci.; published 11 November 2016; © Author(s) 2016, CC-BY 3.0 License.)
In general the results of the statistical tests are not sensitive to the different breakpoints, spanning nearly four decades, and therefore we only report statistical results for the pre-period (1935-1974) and post-period (1975-2013), though all results are presented in Table S1 in the Supplement.
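The PwLR breakpoint identification could be sketched as a grid search over a single breakpoint that minimizes the total least-squares error of a two-segment linear fit; the step series below is synthetic, with a known shift at index 40 (cf. 1974/75 in a 1935-2013 record):

```python
import numpy as np

def best_breakpoint(t, y, min_seg=5):
    """Grid-search one breakpoint for a two-segment least-squares
    piecewise linear fit, returning the split that minimizes total SSE."""
    best_k, best_sse = None, np.inf
    for k in range(min_seg, len(t) - min_seg):
        sse = 0.0
        for ts, ys in ((t[:k], y[:k]), (t[k:], y[k:])):
            coef = np.polyfit(ts, ys, 1)          # slope, intercept
            sse += np.sum((ys - np.polyval(coef, ts)) ** 2)
        if sse < best_sse:
            best_k, best_sse = k, sse
    return best_k

# Synthetic series with a level shift at index 40
t = np.arange(80, dtype=float)
y = np.where(t < 40, 2.0, 5.0) + 0.1 * np.sin(t)
k = best_breakpoint(t, y)
```

Dedicated structural-change packages (e.g., the methods of Zeileis et al., 2003, cited above) refine this idea with multiple breaks and significance testing; the sketch only shows the least-squares core.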

Hydrologic budget
For a given watershed over a specified time period of integration, water inputs minus water outputs are equal to the change in storage per unit time:

P - ET - Q = \Delta S,

where P is average watershed precipitation (cm month⁻¹), ET is estimated average watershed actual evapotranspiration (cm month⁻¹), Q is runoff depth at the basin outlet (cm month⁻¹), and \Delta S is the depth of change in soil water, groundwater, and lake/reservoir storage per unit time.
We have computed average annual water budgets for each basin by accumulating monthly P, ET, and Q during the pre-period and post-period, as well as during the pre-period and post-period determined by the land cover transition (LCT) in each basin, to solve for the change in storage. We expect that P and Q have both increased from the pre-period to the post-period, and that ET has remained relatively constant at the annual timescale, though seasonal ET may have changed with crop type. Therefore, if the change in storage term increases from the pre-period to post-period we conclude that soil moisture, groundwater, and/or lake/reservoir storage has also increased and that climate likely explains most of the increase in Q. However, if the change in storage term decreases from the pre-period to post-period, then we conclude that soil moisture, groundwater, and/or lake/reservoir storage has decreased despite precipitation increases, indicating that LULC change has altered watershed storage and contributed (in addition to precipitation) to increased streamflows.
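The budget accumulation, including the conservative summer-ETa variant described below, can be sketched with hypothetical monthly depths (the real inputs are the basin-averaged PRISM P, L13 ETa, and outlet-gage Q series):

```python
import numpy as np

# Hypothetical monthly depths (cm) for one year
months = np.arange(1, 13)
P  = np.array([2, 2, 5, 7, 9, 10, 9, 8, 7, 5, 3, 2], dtype=float)
ET = np.array([1, 1, 2, 5, 9, 12, 12, 10, 6, 3, 1, 1], dtype=float)
Q  = np.array([1, 1, 4, 5, 3, 2, 2, 1, 1, 1, 1, 1], dtype=float)

# Conservative variant: reduce summer (JJA) ETa by 17% to offset the
# overestimation Livneh et al. (2013) report against AmeriFlux data
ET_cons = ET.copy()
ET_cons[(months >= 6) & (months <= 8)] *= 1 - 0.17

# Annual change in storage: dS = P - ET - Q accumulated over the year
dS_raw  = (P - ET - Q).sum()
dS_cons = (P - ET_cons - Q).sum()   # less negative by construction
```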
We acknowledge that there is uncertainty in all of the input data, and while we do not have high confidence in the magnitude of the storage term, it is useful for understanding the direction of change (i.e., is there more or less storage in recent times than in the past?). The largest uncertainty in a hydrologic budget is usually associated with evapotranspiration, which if overestimated or underestimated can easily alter the direction of the calculated change in storage. In this analysis, if ET is overestimated we are more likely to calculate a negative change in storage, which we interpret as decreased soil moisture/groundwater/reservoir storage. Therefore we rely upon conservative estimates of actual evapotranspiration, ETa (rather than potential evapotranspiration, ETp), base water budget interpretations on the relative differences between past and present storage, and have checked estimates of ETa against ET estimates from other sources. Livneh et al. (2013) reported 17% overestimation of ETa during the summer months when compared with AmeriFlux station data; therefore, in addition to the raw water budgets, we present water budgets where we have reduced monthly ETa by 17% during summer months (JJA). Reducing summer ETa estimates by 17% results in a more conservative water budget, where the change in storage term is less likely to be negative. Overall, the data from Livneh et al. (2013) used in computing the monthly water budgets are mostly consistent with other sources (Bryan et al., 2015; Diak et al., 1998).
The drainage census data show that the MRB has the greatest percentage of the watershed area drained by tile (19% in 1940; 35% in 2012) and ditches (7% in 1940; 10% in 2012), followed closely by the IRB, where tiles drained 10% and 34% of the watershed area in 1940 and 2012, respectively, and ditches drained 12% and 8% of the watershed area in 1940 and 2012, respectively (Fig. 2). The Red River Basin has experienced very little increase in total drainage since 1940, and most of its artificial drainage is ditches rather than tile drains. Although a dramatic increase in tile installation has been reported in the Red River Valley since the 1990s, the area of this expansion appears small relative to the watershed area, as the acres reported to be drained by tile in 2012 represent only 2% of the total watershed area. The Chippewa River Basin has very little agricultural land, and thus very little drainage throughout the basin, with less than 1.5% of the watershed area drained by tile and ditches in 2012 (Fig. 2).
The 1978 census data illustrate the uncertainty associated with variability in reporting and response rates, as it is unlikely for total drainage to have decreased between 1960 and 1978 in the RRB and MRB (Fig. 2) and Wisconsin, respectively, were less than 500 acres, and therefore were not included in survey results (U.S. Department of Agriculture, 2014b). These estimates are therefore likely to grossly underestimate the area drained by tile and ditches.
Although the 2012 census attempts to correct for incomplete and missing responses, because drainage enterprise records have traditionally been so poorly documented, it is difficult to know just how different the census-reported acreage is from the actual acreage. We also note that acres drained by tile and ditches does not convey the effectiveness of artificial drainage at transferring soil water to streams. Several factors influence the flow rate from soils to drains, including the hydraulic conductivity of the soil, depth of the water table, depth of the tile lines, diameter of the tile, slope of the tile or ditch, horizontal spacing of tiles and ditches, material composing the tile or ditch, as well as precipitation intensity and duration and antecedent soil conditions (Hillel, 1998). We simply do not have this level of information regarding artificial drainage in the Midwestern USA and suspect that the spatial variability in drainage management practices may be high. For example, Naz et al. (2009) mapped tile drains in a 202 km² Indiana watershed and found tile spacing that ranged from 17-80 m.
While we expect that the observed trends are relatively correct, we are cautious about drawing any definitive conclusions from the Census of Agriculture data regarding the actual extent of tile drainage and changes over time. It is clear that these estimates tend to underestimate the amount of drainage. Nevertheless, total drainage and tile drainage in the Minnesota River basin and Illinois River basin have increased considerably since 1940. It is known anecdotally, but not included in these data, that tile drainage spacing has decreased and intensity (drainage rate in mm h⁻¹) has increased on agricultural lands, often by a factor of two, as was done at the Lamberton Research Station, MN (L. Klossner, personal communication, November 17, 2015). While relative amounts of drainage in this inventory should be reliable (i.e., greatest in the MRB and IRB, relatively low but with tile drainage growing recently in the RRB, and very little drainage in the CRB), the lack of historical documentation on changes in location, density, and type of tile installed limits our ability to model hydrologic change at the large landscape scale.
The RRB is the only basin containing a significantly higher percentage of soybean acreage relative to corn (Table 3); on average since 1995, soybean acreage in the RRB has been more than twice that of corn, and in all other basins corn acreage exceeds soybean acreage by 3-8%. In the MRB, a rapid increase in soybeans and decrease in hay & small grains began around 1940, with the LCT at 1978/1979 (Fig. 3; Table 3). The LCT (1978/1979) in the MRB also occurs after the hydrologic breakpoints for precipitation (1958/1959) and streamflow (1967/1968) (Fig. 3; Table 3). In the CRB, hay & small grains were the dominant crop type for most of the period of record, with a gradual increase in corn and soybean acres harvested until the LCT, 2009/2010 (Fig. 3; Table 3). We identified 1995/1996 as the precipitation and streamflow breakpoint for the CRB.
The LCT in the IRB occurs earlier than in the other three basins (1961/1962), with rapidly increasing soybean acreage starting as early as 1925 and a rapid decline in hay & small grains occurring around 1950 (Fig. 3; Table 3). The LCT (1961/1962) in the IRB occurs before both the precipitation and streamflow transitions (1981/1982).
The flow and precipitation breakpoints roughly coincide, within 2, 9, 0, and 0 years of each other for the RRB, MRB, IRB, and CRB, respectively (Table 3). The flow and precipitation breakpoints differ slightly from those identified from LULC changes in each basin (Fig. 3). The LCT breakpoints are within 13, 10, 21, and 13 years of the streamflow breakpoints for the RRB, MRB, IRB, and CRB, respectively. Similar temporal coincidence of precipitation and streamflow breakpoints (in contrast to the LCT and streamflow breakpoints) may suggest that streamflow changes are tightly coupled with precipitation changes, but this information does not account for the potential effects of drainage, which could act to amplify the streamflow response to precipitation. Overall, changes in crop types occurred gradually in the MRB and IRB, and much more rapidly and recently in the RRB. The CRB is largely non-agricultural (only 9% of the basin grew corn, soy, and hay & small grains in 2015), and changes in the basin have been small during the period of record (Fig. 3). While we cannot directly ascribe these changes in crop type to changes in drainage practices or vice versa, they provide a relatively detailed history of LULC and whether the changes occurred gradually or rapidly, and recently or long ago, in each basin.

Time series of annual streamflow metrics aggregated from multiple gages within each basin

Figure 4a shows five-year running averages of seven streamflow metrics. Normalized values of 1 indicate that the annual value is equivalent to the 1950-2010 mean value. Stationary flow statistics vary around 1 for the entire time series, as is the case for the Chippewa River basin (Fig. 4). Non-stationary time series systematically deviate from 1, indicating that the mean condition has changed during the period of record. Qualitatively, all seven flow metrics in the Chippewa River basin have remained stable since the 1930s, although statistically seven-day low flows in winter have seen a 12% increase in the mean condition since 1975 (p<0.01) (Fig. 4).
Unlike the Chippewa, flow metrics in the Minnesota, Red, and Illinois river basins systematically and dramatically increase in recent decades, with nearly a two-fold increase or greater in almost all flow metrics since 1975 (Fig. 4). Seven-day low flows in summer (May-October) and winter (November-April) have increased most in these basins, with 67%-275% increases in means (p<0.001) since 1975 (Fig. 4b). High flow and extreme flow days have also increased significantly in the MRB (p<0.001), IRB (p<0.05), and RRB (p<0.001). Spring (March-May) peak daily flows have changed the least, with 14% (p>0.05), 37% (p<0.05), and 60% (p<0.05) increases in mean between 1934-1974 & 1975-2013 for the IRB, MRB, and RRB, respectively (Fig. 4b). The Minnesota River basin has seen the greatest percent increase in mean annual flow, peak daily flow summer & fall, 7-day low flow in winter, high flow days, and extreme flow days; peak daily flow in summer and 7-day low flow in summer have increased most in the Red River of the North basin (Fig. 4b).
All seven flow statistics in the Red River basin increase dramatically after the mid-1990s (Fig. 4a). Low flows have increased 3.5-4 fold (p<0.001) and high and extreme flows have increased 2.5-3 fold (p<0.001) in the RRB since 1995 (Fig. 4b). Flows in the Minnesota River basin have increased similarly, with a 3-4 fold increase in low flows (p<0.001) and a 3 fold increase in high and extreme flows (p<0.001) since the timing of the land cover transition (LCT). Changes in the Illinois River basin are less obvious, yet still significant, with a 2 fold increase in low flows (p<0.001) and a 1.5 fold increase in high and extreme flows (p<0.05) since the LCT.
The MRB and RRB exhibit an increase not only in the magnitude but also in the cyclicity and synchronicity of these metrics after about 1980 (Fig. 4a). Cyclicity could imply that climate is playing a role in the observed increase in flows. However, the extent to which agricultural land and water management practices may be amplifying this climate effect cannot be ascertained from this figure alone. The Illinois River basin exhibits the most change in summer and winter 7-day low flows, which increase after 1970, and this trend is even more pronounced when only examining gages within predominantly agricultural sub-basins that are unaffected by large dams (Fig. 4a). However, the changes in the RRB and MRB are much more obvious (and statistically significant) than those in the IRB.

Annual changes
Average annual runoff depth at the outlet gage of the MRB has increased 5.9 cm y⁻¹ between the pre-period (1935-1974) and post-period (1975-2013), and the difference in means between the two periods is significant (p<0.001). Average annual precipitation depth in the MRB has also increased, by 4.6 cm y⁻¹ (p=0.033). The average annual runoff ratio has increased from 0.11 to 0.18, equivalent to a 65% increase and consistent with the results of Vandegrift and Stefan (2010), who report a 70% increase in the runoff ratio at the MRB outlet gage using a shorter record, from 1946-1965 to 1986-2005.
In the RRB, the average annual runoff ratio has increased 65%, from 0.07 (pre-period) to 0.11 (post-period) at the outlet gage, a greater change than the 55% increase reported by Vandegrift and Stefan (2010). On average, the annual runoff and precipitation depths have increased 2.9 cm y⁻¹ and 4.1 cm y⁻¹, respectively, and the difference in means is significant for annual flow (p<0.01) and precipitation (p=0.019).
Average annual runoff and precipitation depths in the IRB have increased 5.4 cm y⁻¹ and 4.2 cm y⁻¹, respectively, between the pre-period (1939-1974) and post-period (1975-2013). The difference in means between the two periods is significant for runoff (p=0.011) but not precipitation (p=0.086). The average annual runoff ratio has increased from 0.30 to 0.34, a 14% increase.
The CRB average runoff ratio has decreased slightly (2%), from 0.37 to 0.36, between the pre-period and post-period. On average, annual runoff depth has not changed (0.00 cm y⁻¹) and average precipitation depth has increased 2.0 cm y⁻¹. The difference in annual means is not significant for annual flow (p=0.499) or precipitation (p=0.243). In summary, the MRB and RRB exhibit the greatest change in the annual runoff ratio, followed by the IRB, with negligible change in the CRB. Statistical results for annual changes in streamflow and precipitation for all breakpoints can be found in Table S1 in the Supplement.

Monthly changes
The spectrogram (i.e., the squared magnitude of the Morlet wavelet coefficients, |T(a, b)|²) of monthly streamflow and precipitation from Eq. (1) is plotted in Fig. 5. Figure 5 displays a significant increase in the annual and inter-annual energy of the basin outlet streamflow signal around 1975, 1980, and 1995 for the IRB, MRB, and RRB, respectively, while the CRB does not exhibit any striking changes in energy throughout the period of record. The timing of streamflow change identified qualitatively from Fig. 5 roughly coincides (within 5-12 years) with the breakpoints identified from the piecewise linear regression of the streamflow time series (Table 3). We observe minimal changes in the energy of the precipitation signal for any basin during the period of record (Fig. 5). Cumulative monthly precipitation, plotted in Fig. 6, indicates no systematic change in cumulative precipitation with time (i.e., constant slope) for any basin. However, cumulative monthly streamflow (1935-2013), also plotted in Fig. 6, indicates a sudden change in slope around 1973 in the IRB, 1980 in the MRB, and 1995 in the RRB, without a distinct change in slope in the CRB.
We ran 286 t-tests and 286 KS tests, splitting the monthly precipitation and flow data at the flow and precipitation breakpoints, the LCT breakpoints, as well as at the 1974/1975 breakpoint (Table 3; Table S1). Because the p-values of the statistical tests resulted in the same interpretations for 95% of the tests regardless of the breakpoint, for consistency Fig. 7 summarizes the results of these statistical tests for flow and precipitation in all basins using the 1974/1975 breakpoint.
Figure 7a illustrates the kernel density estimation (a non-parametric estimate of the probability density function) during the pre-period (1935-1974) and post-period (1975-2013) for June and September flows in each basin. Figure 7b reports 192 results (48 p-values per basin) from the monthly streamflow and precipitation t-tests and KS tests. Each color wheel displays 24 results (2 results per month for each basin) and shows significant p-values for t-tests (blue) and KS tests (grey) based on color. Color is inversely related to p-value such that smaller p-values (and thus more significant results) are shown in increasingly darker colors, with p-values greater than 0.05 colored white. As such, the streamflow color wheel in Fig. 7b for the Chippewa River basin is completely white, indicating there were no statistically significant changes in the mean or distribution of monthly streamflow volumes for any month, consistent with the qualitative assessment of the seven annual streamflow metrics. We report a significant increase in mean October precipitation in the CRB. Monthly results for flow and precipitation changes in the CRB are consistent with the annual changes reported earlier.
In stark contrast to the CRB, the streamflow color wheels for the MRB and RRB show significant changes in the mean or distribution of monthly streamflow for nearly all months (22 out of 24 for the MRB and 21 out of 24 for the RRB) (Fig. 7b). In the RRB, mean precipitation in October has increased, and the precipitation distributions have shifted to the right for September and October (Fig. 7b). In the MRB, there has been a significant increase in mean March precipitation (Fig. 7b).
The IRB exhibits fewer overall changes in streamflow than the RRB and MRB, with significant changes (in the mean or distribution) in monthly streamflow volumes for September, October, November, December, and March, and significant changes in August and November precipitation (Fig. 7b).
We acknowledge that due to high variability and small sample sizes, we may not have sufficient power to detect small but real changes in precipitation and streamflow using these statistical tests, and thus may be prone to Type II error. However, these results are consistent with the qualitative assessment of CWT and cumulative precipitation and streamflow trends, which indicate only slight changes in total precipitation across all basins, large increases in total flow in the MRB and RRB, moderate flow increases in the IRB, and no streamflow changes in the CRB.
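The monthly pre/post comparison behind Fig. 7 could be sketched with scipy; the gamma-distributed "volumes" below are synthetic stand-ins for one calendar month's pre-period and post-period samples, and the t-test shown is Welch's rather than the pooled-variance form:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical monthly flow volumes (km^3) for one calendar month:
# pre-period (1935-1974, n=40) vs. post-period (1975-2013, n=39)
pre  = rng.gamma(2.0, 0.5, size=40)
post = rng.gamma(2.0, 0.8, size=39)

# One-tailed t-test: has the post-period mean increased?
t_stat, p_two = stats.ttest_ind(post, pre, equal_var=False)
p_one = p_two / 2 if t_stat > 0 else 1 - p_two / 2

# Two-sample KS test: has the distribution shifted?
ks_stat, p_ks = stats.ks_2samp(pre, post)

# Kernel density estimate, as plotted in Fig. 7a
kde = stats.gaussian_kde(post)
density_at_mean = kde(post.mean())[0]
```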

Figure 7. a) Kernel density plots of monthly streamflow volumes for June and September for each basin. b) Corresponding significance results for t-tests and KS tests (α=0.05) of monthly streamflow and precipitation volumes, where a significant result indicates a positive shift (increase) in the mean or distribution between 1935-1974 & 1975-2013.
Amplification of streamflow in the MRB, RRB, and IRB is apparent from the seven annual flow statistics (Fig. 4a), the CWT (Fig. 5), cumulative streamflow (Fig. 6), and the t-tests and KS tests of monthly streamflow between the pre-period and post-period (Fig. 7b). Using these same analyses, precipitation changes at the monthly scale are almost entirely undetectable (Figs. 5, 6, and 7b). To understand whether the cause-and-effect interconnection of streamflow and precipitation has changed, we plotted the joint probability density functions (joint PDFs) of monthly P and Q, f(P, Q), for each basin (Fig. 8). The joint PDF of two random variables (here pairs of P and Q) describes the probability of their simultaneous occurrence. In Fig. 8 we illustrate three empirical quantiles of the joint PDFs through contour levels α ∈ {0.1, 0.6, 0.9}, where each contour level represents the boundary of a 2D region within which a (P, Q) pair falls with probability α. A shift in the contour levels in the vertical, rather than diagonal, direction suggests that precipitation changes alone cannot explain changes in streamflow, and some other component of the system must be amplifying the transformation of precipitation to runoff at the monthly timescale.
There is a shift toward larger monthly streamflow volume for the same volume of precipitation at the 10% and 60% quantiles in the MRB and the 60% and 90% quantiles in the RRB (Fig. 8). However, it appears the 90% exceedance contour for the MRB and the 10% exceedance contour for the RRB have shifted up and to the right, indicating that an increase in precipitation in the driest months in the MRB and wettest months in the RRB could be driving some of the change in flow (Fig. 8).
Certainly the largest observable change in the MRB and RRB during this time is a shift from small grains to soybeans and an increase in the density and efficiency of drain tile networks. While the analyses shown above documented significant changes in streamflow in the IRB (Figs. 4, 5, 6, and 7b), this change is not as obvious in the joint PDF contours, which indicate only a slight vertical shift in all quantiles (Fig. 8). Consistent with other analyses, the CRB does not demonstrate any shift in the P-Q relation, suggesting that streamflow has been largely unaffected by the observed slight increase in annual precipitation in the basin (Fig. 8).
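The empirical quantile contours of a joint PDF can be sketched as density thresholds from a 2D kernel density estimate; the correlated (P, Q) pairs below are synthetic, and contour-drawing itself (e.g., with matplotlib) is omitted:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical monthly (P, Q) pairs; real inputs are basin P and Q volumes
P = rng.gamma(4.0, 2.0, size=480)
Q = 0.3 * P + rng.normal(0, 0.5, size=480)

kde = stats.gaussian_kde(np.vstack([P, Q]))
dens = kde(np.vstack([P, Q]))        # density at each observed pair

def contour_level(dens, alpha):
    """Density threshold whose contour encloses a fraction alpha of the
    observations, i.e. the empirical alpha-quantile contour level."""
    return np.quantile(dens, 1 - alpha)

levels = {a: contour_level(dens, a) for a in (0.1, 0.6, 0.9)}
```

Lower thresholds correspond to larger enclosed regions, so the 0.9 contour sits outside the 0.6 contour, which sits outside the 0.1 contour.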

Daily changes
At the daily scale, we found an increase in the magnitude of streamflow change (hydrograph slopes) for both the daily rising limbs (dQ/dt>0) and falling limbs (dQ/dt<0) of the hydrographs for the RRB, MRB, and IRB outlet gages, suggesting an increase in the "flashiness", or daily rate of change, of the hydrologic response (Fig. 9). Although the greatest average daily rates of change are less in the post-period than the pre-period in the MRB for probabilities of exceedance less than 0.2%, this constitutes a small fraction of the total observations and can be linked to extreme events (Fig. 9). Figure 9 shows a slight decrease in the post-period curve for the CRB, indicating that the rising limb and falling limb flows may actually be less "flashy" in recent times than in the past. May-June is approximately the start of the growing season for soybean and corn and is the time that tiles are most active, as this time of year usually corresponds to high monthly rainfall, high antecedent moisture conditions from spring snowmelt, lower ET rates than the peak growing season due to lower crop water demands, and air temperatures that precede the annual peak. Considering rising and falling limbs exclusively in May-June, the magnitudes of change are even greater than for the "all months" period in the MRB and RRB (Fig. 9). For the CRB, the lower cumulative probability of daily rising and falling limbs in May-June indicates a decrease in the daily "flashiness" of the streamflow regime at the outlet of the basin, which may be related to climate, but more likely is related to dam building and operations upstream of the outlet.

Hydrologic budgets: determining whether climate alone explains changes in flow
While time series and statistical analyses reveal useful insights regarding the timing, magnitude, and significance of precipitation and streamflow changes, and provide a qualitative indication of whether changes in precipitation and streamflow may be correlated and proportional, they cannot fully deconvolve the influence of human activities (land conversion and agricultural drainage) and climate (precipitation increase) on streamflows. Therefore, we calculate water budgets for each basin as a tool to understand whether the observed changes in precipitation are large enough to account for the changes in streamflow (Healy et al., 2007). Table 4 reports the calculated average annual water budget terms (precipitation, streamflow, evapotranspiration, and change in storage) during the periods before and after the 1974/1975 and LCT breakpoints. Without changes in land use or climate, we would expect that, on an average annual basis, the change in storage (i.e., soil moisture, groundwater, and lakes/reservoirs) would equal zero. Increased precipitation that exceeds increases in the output terms translates into positive change in storage, while increased drainage or other increased output terms translate into negative change in storage. We also address how uncertainty in ETa may influence interpretations. Using the conservative estimates of ETa (reduced by 17% in JJA), we find that prior to the LCT in the MRB, the average change in storage is approximately +2.4 cm y⁻¹ (Table 4). After the LCT, the change in storage is about -2.7 cm y⁻¹, indicating a net reduction in water stored in soil, groundwater, and/or lakes, wetlands, or reservoirs (Table 4). To test the sensitivity of this result to ETa, we fix the change in storage so that it equals zero in both periods, and find that total ETa would need to be 4% greater than the conservative estimates in the pre-LCT period and 5% less in the post-LCT period for the budgets to close.
Hydrol. Earth Syst. Sci. Discuss., doi:10.5194/hess-2016-571, 2016. Manuscript under review for journal Hydrol. Earth Syst. Sci. Published: 11 November 2016. (c) Author(s) 2016. CC-BY 3.0 License.
The conservative estimates of ETa tend to overpredict estimates of ETa from the Ameriflux site in Rosemount, MN by 5% on an annual basis (Fig. S5). Thus, even if the ETa estimates overpredict actual ETa by roughly 5%, this offset would affect the time series systematically. Therefore, we would still expect to find that the change in storage term decreased between the pre- and post-LCT periods. The most parsimonious explanation for this reduction in water storage is the systematic removal of wetlands and lowering of the water table, accomplished through tile drainage installation and expansion.
The average annual change in storage in the IRB was -3.0 cm y^-1 and -6.1 cm y^-1 during the pre- and post-LCT periods, respectively (Table 4). Given that the conservative estimates of ETa tend to overpredict the average annual estimates of ETa at the Bondville, IL Ameriflux site by 7%, it is reasonable that both water budget change in storage terms are negative (Fig. S5). Letting the change in storage term equal zero in both periods suggests that total ETa would need to be 4% and 9% less in the pre- and post-LCT periods, respectively. Regardless of uncertainty in ETa, which we expect to affect both periods systematically, drainage intensification in the post-LCT period is likely responsible for the overall decrease in storage between the pre- and post-LCT periods.
Prior to the LCT in the RRB, the change in storage was about +8.4 cm y^-1, but only +5.8 cm y^-1 during the post-LCT period (Table 4). A positive change in storage in both periods indicates that some precipitation may go toward increasing soil moisture, groundwater, and/or lake, wetland, or reservoir storage during both time periods. In order to close the water budget such that the change in storage is zero, ETa would need to be 20% and 12% greater in the pre- and post-LCT periods, respectively. Estimates of ETa from a grassland Ameriflux site in Brookings, SD suggest that the conservative ETa estimate used in this study may underpredict ETa by as much as 34% annually (Fig. S5). Although we are uncertain as to why the Livneh et al. (2013) ETa estimates underestimate the Ameriflux ETa estimates so dramatically, especially during peak growing season, one possibility is misclassification of grassland as cropland; corn has been found to have lower ETa rates than some grasses (Hickman et al., 2010). Regardless of the cause, this underestimation likely explains in part why the pre-period and post-period storage terms were positive. Again, while the uncertainties in these water budgets are high, the direction of change in the storage term between the two periods is reliable. In the case of the RRB, the trend is toward less storage in the post-LCT period than the pre-LCT period, which most likely indicates the influence of factors other than climate, such as drainage, on increased streamflows.
For the CRB, which is not intensively drained (Fig. 2) and has seen relatively little change in crop type (Fig. 3), it seems most appropriate to discuss the hydrologic budget results for the pre- and post-periods using the 1974/1975 breakpoint, rather than the LCT breakpoint (2009/2010). The CRB has seen some increase in average annual precipitation, but no increase in runoff (Table 4), consistent with Figs. 8 and 9b.
Somewhat surprisingly, the CRB change in storage during the pre- and post-periods is -6.2 cm y^-1 and -5.2 cm y^-1, respectively, using the conservative estimates of ETa (ETa in JJA reduced by 17%). In the absence of widespread drainage activities in the CRB, these results suggest an overestimate of ETa. The conservative estimates of ETa used in this study are approximately 19% greater on an average annual basis than the estimates of ETa from the Willow Creek, WI Ameriflux station (Fig. S5). We believe this bias could be related to the presence of lakes within the watershed and the basin's proximity to Lake Superior and Lake Huron. Closing the water budget by letting the storage term equal zero requires ETa to be 10% and 12% lower in the pre- and post-1975 periods, respectively. Regardless of this uncertainty, the overall trend indicates that water storage in the basin may have actually increased slightly between the pre- and post-1975 periods, which could be accomplished through increased soil moisture, groundwater recharge, or reservoir storage in recent times.
The ETa estimates used in this study were calculated using the Hansen et al. (2000) global vegetation classification, which masks bodies of water, as the land cover input. Due to the coarse resolution of this global dataset, parts of southern Wisconsin appear to be misclassified as broadleaf deciduous forest instead of cropland. Some studies in the Great Lakes region report broadleaf deciduous forest to have slightly higher annual ETa rates than cropland (Mao and Cherkauer, 2009; Mishra et al., 2010). Likely of greater significance, Livneh et al. (2013) and Maurer et al. (2002) do not indicate that they considered lake and wetland effects on evapotranspiration, which can be significant in the Great Lakes region (Bryan et al., 2015).
Given the uncertainty in all estimates of ETa, and recognizing that it may be systematic, we believe it is most pertinent to consider the relative change in storage, i.e., the difference in the storage term between the pre- and post-periods (using the conservative summer estimates of ETa). The change in storage term has decreased by about 200%, 100%, and 30% in the MRB, IRB, and RRB, respectively, from the pre-LCT to the post-LCT period. In the CRB, the change in storage has increased by roughly 30% from 1935-1974 to 1975-2011. These relative changes suggest that soil moisture, groundwater, and/or lake, wetland, and reservoir storage has decreased between the pre-LCT and post-LCT periods in the MRB, RRB, and IRB and increased in the CRB between the pre- and post-periods. These results are consistent with our hypothesis that LULC changes in the MRB, RRB, and IRB necessarily change how precipitation is transformed into streamflow, and that increases in precipitation alone cannot explain changes in streamflow in these basins. In the CRB, which lacks pervasive LULC impacts, precipitation has increased slightly but flows have not changed, likely due to increases in soil moisture, groundwater, and/or lake, wetland, and reservoir storage and/or increases in evapotranspiration. Seasonal changes in storage shown in Fig. 10 suggest that soil moisture, groundwater, and/or lake, wetland, and reservoir storage change is negative in the spring and summer (P, after accounting for ETa, is insufficient to produce the observed flows) and positive in the fall (P, after accounting for ETa, exceeds what is needed to produce the observed flows, and thus storage increases during the fall).
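As a check on the relative-change figures quoted above, the following sketch assumes the percentages were computed as the inter-period difference in the storage term normalized by the magnitude of the pre-period value; the basin dS values are those reported in the text, but the exact convention used is our assumption.

```python
# Relative change in the average annual storage term between pre- and
# post-LCT periods, using the dS values (cm/yr) reported in the text.
# Convention (an assumption): (post - pre) / |pre|; negative = less storage.

def relative_change(ds_pre, ds_post):
    return (ds_post - ds_pre) / abs(ds_pre)

ds = {
    "MRB": (+2.4, -2.7),
    "IRB": (-3.0, -6.1),
    "RRB": (+8.4, +5.8),
}

changes = {basin: relative_change(pre, post)
           for basin, (pre, post) in ds.items()}

# MRB ~ -212% (roughly the ~200% decrease quoted),
# IRB ~ -103%, RRB ~ -31%
for basin, c in changes.items():
    print(f"{basin}: {c:+.0%}")
```

The CRB figure (~30% increase) is not reproduced here, since it may rest on inputs or a convention not stated in the text.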

Interpretations, implications, and conclusions
The impacts of climate and land use changes on Upper Midwest hydrology are complex and often scale dependent; what we might expect and observe as the impacts of artificial drainage at the field scale might be very different at the large watershed scale. This study reports the impacts of climate and land use changes on streamflows over a 79-year period in large (>10^4 km^2) watersheds, the scale at which state and federal agencies must regulate water quality.
Surface and subsurface drainage remains an economically beneficial, yet largely unregulated, land management practice in the Midwestern USA and Canada that affects enormous swaths of agricultural land (Cortus et al., 2011). Drainage data reported by the USDA Census of Agriculture indicate that the Minnesota River basin and Illinois River basin are the most extensively drained (at least 42%-45% of watershed area), have seen an increase in drainage activity since 1940, and that the majority of drainage in these basins today is subsurface tiling. While drainage in the Red River basin has not increased much since 1940, is less extensive (~23% of the RRB drained in 2012) than drainage in the IRB and MRB, and has traditionally consisted of surface ditches, tile drainage installation appears to be on the rise in the Red River Valley in response to recent floods. Drainage has never been a widespread practice in the CRB. Though drainage census data are prone to reporting inconsistencies and errors, systematically underestimate drainage by excluding farms of less than 500 acres, and do not provide the information necessary for modeling basin hydrology in large agricultural watersheds (such as drain size, depth, spacing, and extent), they are the most comprehensive inventory of drainage in the United States. This raises the question: why is such a widespread practice, with such potentially profound and pervasive impacts on watershed hydrology and water quality, so poorly documented and regulated? Until we have the information necessary to calibrate and validate watershed models, it will be difficult to deconvolve the proportional impacts of climate and land use on flows at large spatial scales.
We report increased streamflow based on annual and monthly metrics and increased flashiness of daily flows in the IRB, MRB, and RRB, with no discernible changes in these flow metrics, except for a decrease in flow flashiness, in the CRB during the period 1935-2013. We further document that average monthly precipitation totals have increased in very few months, mostly in the fall, in each of the four basins, with annual precipitation totals not statistically different in the CRB and IRB (p>0.05) and a significant increase in mean precipitation in the MRB (p=0.03) and the RRB (p=0.02). The joint probability distributions of streamflow and precipitation indicate a slight increase in precipitation in all basins, as well as a large increase in flows associated with the same magnitude of precipitation in the MRB and RRB (Fig. 8).
The Red and Minnesota river basins have some of the most poorly drained soils of the Upper Midwest and historically grew more hay and small grains than the other basins (Fig. 3). The introduction of artificial drainage, combined with the replacement of hay and small grains with soybeans and the lack of major dams and municipal and industrial water use, has resulted in pronounced streamflow amplification in response to land use and climate changes in the RRB and MRB relative to the IRB and CRB (Fig. 4). However, the extensively drained Minnesota River basin has seen the largest increases in flow for climatic change relatively similar to that of the IRB and RRB, likely because of the high degree of watershed hydrologic alteration and connectivity from drainage and the lack of other anthropogenic water uses.
Despite large uncertainties in Q, P, and ETa, conservatively constructed water budgets suggest that soil moisture, groundwater, and/or lake, wetland, and reservoir storage must be less during the post-LCT period than the pre-LCT period in the MRB, IRB, and RRB. Watershed storage appears to have increased in the CRB since 1975. Overall, the changes in precipitation and drainage seem to affect the water balance at different times of year, with precipitation increases most influential in the fall and drainage effects most pronounced in the spring (Fig. 10).
The combined results of this study lead us to three conclusions: 1) drainage expansion and intensification, especially of tile drainage, continues to occur in agricultural river basins of the Midwestern USA; 2) across multiple time scales (daily, monthly, annual) and for a range of flows (low, mean, extreme), streamflows have increased in intensively managed agricultural watersheds (IRB, MRB, and RRB) and have remained stationary in the more forested and heavily dammed CRB; 3) increases in precipitation in the Midwestern USA contribute to recently observed increases in streamflow, but the magnitude of precipitation increases alone cannot explain the observed increases in flow for agricultural basins. Therefore, it appears that the pervasive and extensive artificial drainage in agricultural basins has contributed to increased streamflow, not only at 10^2-10^3 km^2 watershed scales (e.g., Foufoula-Georgiou et al., 2015; Harrigan et al., 2014; Schilling and Libra, 2003; Schottler et al., 2014; Zhang and Schilling, 2006), but also at the scale of the very large basins studied here. Improved information regarding the size, spacing, depth, and extent of artificial drainage would greatly enhance our ability to model agricultural systems and predict downstream impacts.
Though artificial drainage reduces field erosion by reducing surface runoff, it has been shown to essentially shift the sediment source from fields to channels (Belmont et al., 2011). The cause of that shift is still linked to agricultural land use.
Basins experiencing increases in streamflow due to natural (climate) and anthropogenic (drainage) factors have excess stream power and are therefore expected to erode sediments and sediment-bound nutrients and contaminants. Gains in water quality, needed to come into compliance with water quality standards, might be achieved by reducing large, erosive flows, which generally occur in spring and early summer when tiles are actively draining soils and precipitation events are large.
Thus, these large agricultural basins may benefit from increased water storage to offset some of the land use impacts on river flows.