Dow, C.L. 1999. Journal of the American Water Resources Association 35(2):349–362.
An established trend analysis methodology was applied to the problem of identifying and quantifying stream base flow impacts from water withdrawals and water loss through interbasin transfers. Impacts were simulated using base flow values selected from two U.S. Geological Survey (USGS) continuous-record streamflow sites located within the Pinelands of southern New Jersey. Study site base flows were regressed against index site base flows, and monotonic and step trend tests were applied to the residuals from the regression model. The smallest percentage reduction detectable at a given significance level (α = 0.10) within a simulation served as an estimate of trend test sensitivity. Evaluation of the methodology led to the following practical considerations regarding trend test sensitivity. The proportion of study site base flow variability explained by index site base flows should be maximized while minimizing positive first-order autocorrelation in the regression residuals. Given the importance of detecting autocorrelation, missing values should be avoided or minimized. A quarterly (three-month) sampling interval reduced the magnitude of autocorrelation relative to a shorter two-month interval. Sensitivity appeared to improve when the number of values before and after a base flow impact was equalized, whereas seasonally biased sampling appeared to reduce it. Based primarily on past trend detection studies, nonparametric tests were deemed preferable to their parametric counterparts because they impose no stringent distributional requirements and lose little or no power even when applied to normally distributed data.
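The core of the methodology described above, regressing study site base flows on index site base flows, testing the regression residuals for a step change, and checking residual autocorrelation, can be illustrated with a minimal sketch. This is not the paper's code: the function name and simulated data are hypothetical, and the Mann-Whitney rank-sum test stands in as the nonparametric step trend test on residuals.

```python
import numpy as np
from scipy import stats

def residual_step_trend_test(study_bf, index_bf, change_idx):
    """Sketch of a residual-based step trend screen (illustrative).

    Regress study-site base flow on index-site base flow, then:
    1. test residuals before vs. after `change_idx` for a step change
       using the nonparametric Mann-Whitney rank-sum test, and
    2. report the lag-1 autocorrelation of the residuals, which the
       abstract notes should be kept small for the tests to behave well.
    Returns (r_squared, p_step, lag1_autocorr).
    """
    slope, intercept, r, _, _ = stats.linregress(index_bf, study_bf)
    resid = study_bf - (intercept + slope * index_bf)

    # Nonparametric step trend test on residuals split at the impact date
    _, p_step = stats.mannwhitneyu(
        resid[:change_idx], resid[change_idx:], alternative="two-sided"
    )

    # Lag-1 (first-order) autocorrelation of the residual series
    r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]
    return r ** 2, p_step, r1

# Illustrative simulation: a 20% base flow reduction imposed halfway
# through a synthetic record (all numbers are made up for the sketch).
rng = np.random.default_rng(0)
index_bf = rng.normal(10.0, 2.0, 80)                  # index site base flow
study_bf = 0.8 * index_bf + rng.normal(0, 0.3, 80)    # correlated study site
study_bf[40:] *= 0.8                                  # simulated withdrawal impact
r2, p_step, r1 = residual_step_trend_test(study_bf, index_bf, 40)
```

With a strong index-site relationship (high r²) and an imposed reduction of this size, the step test flags the change at α = 0.10; shrinking the imposed reduction until `p_step` first exceeds α mimics the paper's sensitivity estimate.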