Indian Forum for Water Adroit

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - Pankaj Dey

Pages: 1 [2] 3 4 ... 6

The detection and attribution of long-term patterns in hydrological time series have been important research topics for decades. A significant portion of the literature regards such patterns as ‘deterministic components’ or ‘trends’ even though the complexity of hydrological systems does not allow easy deterministic explanations and attributions. Consequently, trend estimation techniques have been developed to make and justify statements about tendencies in the historical data, which are often used to predict future events. Testing trend hypotheses on observed time series is widespread in the hydro-meteorological literature, mainly due to the interest in detecting consequences of human activities on the hydrological cycle. This analysis usually relies on the application of some null hypothesis significance tests (NHSTs) for slowly-varying and/or abrupt changes, such as Mann-Kendall, Pettitt, or similar, to summary statistics of hydrological time series (e.g. annual averages, maxima, minima, etc.). However, the reliability of this application has seldom been explored in detail. This paper discusses misuse, misinterpretation, and logical flaws of NHST for trends in the analysis of hydrological data from three different points of view: historic-logical, semantic-epistemological, and practical. Based on a review of NHST rationale, and basic statistical definitions of stationarity, nonstationarity, and ergodicity, we show that even if the empirical estimation of trends in hydrological time series is always feasible from a numerical point of view, it is uninformative and does not allow the inference of nonstationarity without assuming a priori additional information on the underlying stochastic process, according to deductive reasoning. This prevents the use of trend NHST outcomes to support nonstationary frequency analysis and modeling.
We also show that the correlation structures characterizing hydrological time series might easily be underestimated, further compromising the attempt to draw conclusions about trends spanning the period of records. Moreover, even though adjusting procedures accounting for correlation have been developed, some of them are insufficient or are applied only to some tests, while some others are theoretically flawed but still widely applied. In particular, using 250 unimpacted stream flow time series across the conterminous United States (CONUS), we show that the test results can dramatically change if the sequences of annual values are reproduced starting from daily stream flow records, whose larger sizes enable a more reliable assessment of the correlation structures.
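As an illustration of the kind of test the abstract critiques, here is a minimal sketch of the classical Mann-Kendall test in Python (not the authors' code). Note that this plain form assumes serially independent data with no ties, which is exactly the assumption the paper argues is often violated in hydrological records:

```python
import math

def mann_kendall(x):
    """Classical Mann-Kendall trend test on a sequence x.

    Returns the S statistic and the normal-approximation z score.
    No tie or autocorrelation correction is applied.
    """
    n = len(x)
    # S counts concordant minus discordant pairs.
    s = sum(
        (x[j] > x[i]) - (x[j] < x[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# A strictly increasing series attains the maximum S = n(n-1)/2.
s, z = mann_kendall(list(range(10)))
```

Under positive serial correlation, var_s underestimates the true variance of S, inflating z and producing spurious "significant trends" — the practical flaw the paper documents with CONUS streamflow records.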

Link to the paper:

Inspired by the work of Newton, Darwin, and Wegener, this paper tracks the drivers and dynamics that have shaped the growth of hydrological understanding over the last century. On the basis of an interpretation of this history, the paper then speculates about what kind of future is in store for hydrology and how we can better prepare for it. The historical narrative underpinning this analysis indicates that progress in hydrological understanding is brought about by changing societal needs and technological opportunities: new ideas are generated by hydrologists through addressing societal needs with the technologies of their time. We suggest that progress in hydrological understanding over the last century has expressed itself through repeated cycles of euphoria and disenchantment, which have served as stimuli for the progress. The progress, for it to happen, also needed inspirational leaders as well as a supportive scientific community that provided the backdrop to major advances in the field. The paper concludes that, in a similar way to how Newton, Darwin, and Wegener conducted their research, hydrology too can benefit from synthesis activities aimed at “connecting the dots.”



Droughts and other extreme precipitation events are predicted to increase in intensity, duration, and extent, with uncertain implications for terrestrial carbon (C) sequestration. Soil wetting from above (precipitation) results in a characteristically different pattern of pore-filling than wetting from below (groundwater), with larger, well-connected pores filling before finer pore spaces, unlike groundwater rise in which capillary forces saturate the finest pores first. Here we demonstrate that pore-scale wetting patterns interact with antecedent soil moisture conditions to alter pore-scale, core-scale, and field-scale C dynamics. Drought legacy and wetting direction are perhaps more important determinants of short-term C mineralization than current soil moisture content in these soils. Our results highlight that microbial access to C is not solely limited by physical protection, but also by drought or wetting-induced shifts in hydrologic connectivity. We argue that models should treat soil moisture within a three-dimensional framework emphasizing hydrologic conduits for C and resource diffusion.


Environmental simulation models, such as precipitation-runoff watershed models, are increasingly used in a deterministic manner for environmental and water resources design, planning, and management. In operational hydrology, simulated responses are now routinely used to plan, design, and manage a very wide class of water resource systems. However, all such models are calibrated to existing data sets and retain some residual error. This residual, typically unknown in practice, is often ignored, implicitly trusting simulated responses as if they are deterministic quantities. In general, ignoring the residuals will result in simulated responses with distributional properties that do not mimic those of the observed responses. This discrepancy has major implications for the operational use of environmental simulation models as is shown here. Both a simple linear model and a distributed-parameter precipitation-runoff model are used to document the expected bias in the distributional properties of simulated responses when the residuals are ignored. The systematic reintroduction of residuals into simulated responses in a manner that produces stochastic output is shown to improve the distributional properties of the simulated responses. Every effort should be made to understand the distributional behavior of simulation residuals and to use environmental simulation models in a stochastic manner.
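A minimal sketch of the residual-reintroduction idea described above, using synthetic data and a hypothetical damped "simulator" (not the paper's linear or precipitation-runoff models). The simulated series underestimates the spread of the observations; resampling calibration residuals back into the output widens the distribution toward that of the observations:

```python
import random
import statistics

random.seed(1)

# Hypothetical observed responses; in practice these would be gauged data.
observed = [random.gauss(10, 3) for _ in range(500)]

# A stand-in "calibrated model" that damps variability around the mean,
# mimicking the variance deflation typical of deterministic simulations.
simulated = [0.7 * (o - 10) + 10 for o in observed]

# Calibration residuals: observed minus simulated.
residuals = [o - s for o, s in zip(observed, simulated)]

# Reintroduce residuals by resampling, turning the deterministic output
# into stochastic output with more realistic distributional spread.
stochastic = [s + random.choice(residuals) for s in simulated]

var_obs = statistics.variance(observed)
var_sim = statistics.variance(simulated)
var_sto = statistics.variance(stochastic)
```

The simulated series has noticeably less variance than the observations, and adding resampled residuals moves the variance back toward the observed value, which is the distributional improvement the abstract describes.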

Link to the paper:

Programming / Colaboratory
« on: November 08, 2017, 11:14:45 AM »
Colaboratory is a research project created to help disseminate machine learning education and research. It’s a Jupyter notebook environment that requires no setup to use.


Data / CCI Toolbox
« on: November 07, 2017, 12:10:46 PM »
In 2009, ESA, the European Space Agency, launched the Climate Change Initiative (CCI), a programme responding to the need for climate-quality satellite data expressed by GCOS, the Global Climate Observing System that supports the UNFCCC, the United Nations Framework Convention on Climate Change.
In the ESA CCI programme 14 Essential Climate Variables (ECV) are produced by individual expert teams, and cross-cutting activities provide coordination, harmonisation and support. The CCI Toolbox and the CCI Open Data Portal are the two main technical support projects within the programme. The CCI Open Data Portal will provide a single point of harmonised access to a subset of mature and validated ECV-related data products. The CCI Toolbox will provide tools that support visualisation, analysis and processing across CCI and other climate data products.
Please follow the links for more information:

Precipitation data are important for hydrometeorological analyses, yet there are many ways to measure precipitation. The impact of station density is analyzed in the current study by comparing measurements from the Missouri Mesonet, available via the Missouri Climate Center, with Community Collaborative Rain, Hail, and Snow (CoCoRaHS) measurements archived at the program website. The CoCoRaHS program relies on citizen scientists to report precipitation data, providing much denser spatial resolution than is available through the Mesonet. Although previous research has shown the reliability of CoCoRaHS data, the results here demonstrate important differences in the details of the spatial and temporal distribution of annual precipitation across the state of Missouri between the two data sets. Furthermore, differences in the warm- and cold-season distributions are presented, some of which may be related to interannual variability such as that associated with the El Niño-Southern Oscillation (ENSO). The contradictory results from two widely used datasets demonstrate the importance of properly choosing among precipitation data that have vastly differing temporal and spatial resolutions. Given the significantly different yearly aggregated precipitation values, the authors stress caution in selecting one particular rainfall dataset, as the conclusions drawn could be unrepresentative of the actual values. This issue may be remediated by increased spatiotemporal coverage of precipitation data.


Data / The Global Streamflow Indices and Metadata Archive
« on: October 10, 2017, 07:43:46 PM »
This is the first part of a two-paper series presenting the Global Streamflow Indices and Metadata archive (GSIM), a worldwide collection of metadata and indices derived from 35,002 daily streamflow time series. This paper focuses on the compilation of the daily streamflow time series based on 12 free-to-access streamflow databases (seven national databases and five international collections). It also describes the development of three freely available metadata products: (1) a GSIM catalogue collating basic metadata associated with each time series, (2) catchment boundaries for the contributing area of each gauge, and (3) catchment metadata extracted from 12 gridded global data products representing essential properties such as land cover type, soil type, and climate and topographic characteristics. The second paper in the series then explores the production and analysis of streamflow indices. Having collated an unprecedented number of stations and associated metadata, GSIM can be used to advance large-scale hydrological research and improve understanding of the global water cycle.
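Annual indices of the kind GSIM derives from daily records can be sketched as follows, using a synthetic two-year discharge series (illustrative only; GSIM computes a much richer set of indices and handles gaps and quality flags):

```python
from collections import defaultdict
from datetime import date, timedelta

# Hypothetical daily discharge record (m^3/s): two synthetic years with a
# repeating seasonal ramp, keyed by calendar date.
start = date(2000, 1, 1)
daily = {start + timedelta(days=i): 5.0 + (i % 365) * 0.01 for i in range(730)}

# Group daily values by calendar year.
by_year = defaultdict(list)
for d, q in daily.items():
    by_year[d.year].append(q)

# GSIM-style annual indices: mean, maximum, and minimum flow per year.
indices = {
    yr: {"mean": sum(v) / len(v), "max": max(v), "min": min(v)}
    for yr, v in by_year.items()
}
```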

Link to paper:

The Variable Infiltration Capacity (VIC) hydrologic and river routing model simulates the water and energy fluxes that occur near the land surface and provides useful information regarding the quantity and timing of available water within a watershed system. However, despite its popularity, wider adoption is hampered by the considerable effort required to prepare model inputs and calibrate the model parameters. This study presents a user-friendly software package, named VIC-Automated Setup Toolkit (VIC-ASSIST), accessible through an intuitive MATLAB graphical user interface. VIC-ASSIST enables users to navigate the model building process through prompts and automation, with the intention to promote the use of the model for practical, educational, and research purposes. The automated processes include watershed delineation, climate and geographical input set-up, model parameter calibration, sensitivity analysis, and graphical output generation. We demonstrate the package's utilities in various case studies.


Programming / EcoHydRology: An R Package
« on: September 08, 2017, 05:59:57 PM »
This package provides a flexible foundation on which scientists, engineers, and policy makers can base teaching exercises, as well as tools for more applied modeling of complex eco-hydrological interactions.

Announcements / Re: Introduction to Monte Carlo Markov Chain Methods.
« on: September 01, 2017, 11:45:25 AM »
Today's talk is cancelled because Prof. K. B. Athreya has fallen ill and is not in a position to give the talk today.

Daily temperature values are generally computed as the average of the daily minimum and maximum observations, which can lead to biases in the estimation of daily averaged values. This study examines the impacts of these biases on the calculation of climatology and trends in temperature extremes at 409 sites in North America with at least 25 years of complete hourly records. Our results show that calculating daily temperature as the average of the minimum and maximum daily readings leads to an overestimation of the daily values of ~10% or more when focusing on extremes and values above (below) high (low) thresholds. Moreover, the effects of the data-processing method on trend estimation are generally small, even though the use of the daily minimum and maximum readings reduces the power of trend detection (~5-10% fewer trends detected in comparison with the reference data).
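The midpoint-versus-true-mean bias can be illustrated with synthetic hourly data: an idealized sinusoidal diurnal cycle plus a brief afternoon heat spike (not the study's observations; the sign and size of the bias depend on the shape of the cycle):

```python
import math

# Synthetic hourly temperatures for one day: a sinusoid with minimum near
# 03:00 and maximum near 15:00, plus a short warm spike.
hourly = [15 + 8 * math.sin(2 * math.pi * (h - 9) / 24) for h in range(24)]
hourly[14] += 4  # brief afternoon spike raises the daily maximum

# Conventional estimate: midpoint of the daily extremes.
tmaxmin_mean = (max(hourly) + min(hourly)) / 2

# Reference: true average of all hourly readings.
true_mean = sum(hourly) / len(hourly)

bias = tmaxmin_mean - true_mean  # positive: midpoint overestimates
```

A short-lived extreme shifts the min/max midpoint by half its amplitude but shifts the true hourly mean by only 1/24 of it, which is why the midpoint convention overestimates on days with sharp spikes.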


Announcements / Introduction to Monte Carlo Markov Chain Methods.
« on: August 28, 2017, 06:44:38 PM »
A very useful result in probability theory as applied to the real world is the law of large numbers. It says that the sample mean of iid observations converges to the population mean in some sense. The central limit theorem (CLT) is a refinement of this. About half a century ago this method was extended to Markov chains and a new tool known as MCMC was born. In this talk, we shall outline this method with an application.

 Speaker:    Prof Krishna B Athreya
             Department of Statistics
             Iowa State University, USA

Title:      Introduction to Monte Carlo Markov Chain Methods.
Venue:      EEB308 (old PE303)
Date:       Friday, Sep 1, 2017
Time:       4 pm - 5 pm.    (Tea/Coffee at 3:45 pm)
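A minimal random-walk Metropolis sampler targeting a standard normal distribution illustrates the MCMC idea the talk outlines (a generic textbook sketch, not material from the talk): averages along the chain converge to expectations under the target, a Markov-chain analogue of the law of large numbers.

```python
import math
import random

random.seed(0)

def metropolis_normal(n_steps, step=1.0):
    """Random-walk Metropolis sampler targeting a standard normal."""
    x, chain = 0.0, []
    for _ in range(n_steps):
        prop = x + random.uniform(-step, step)
        # Accept with probability min(1, pi(prop)/pi(x)); for a standard
        # normal target, log pi(prop) - log pi(x) = (x^2 - prop^2) / 2.
        if math.log(random.random()) < (x * x - prop * prop) / 2:
            x = prop
        chain.append(x)
    return chain

chain = metropolis_normal(50_000)
mean = sum(chain) / len(chain)  # should approach the target mean 0
```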

Due to inherent bias, climate-model-simulated precipitation and temperature cannot be used to drive a hydrological model without pre-processing, i.e., statistical downscaling. This often consists of reducing the bias in the climate model simulations (bias correction) and/or transformation of the observed data in order to match the projected changes (delta change). The validation of statistical downscaling methods is typically limited to the scale for which the transformation was calibrated and to the driving variables (precipitation and temperature) of the hydrological model. The paper introduces an R package, "musica", which provides ready-to-use tools for routine validation of statistical downscaling methods at multiple time scales, as well as several advanced methods for statistical downscaling. The musica package is used to validate simulated runoff. It is shown that using conventional methods for downscaling of precipitation and temperature often leads to substantial biases in simulated runoff at all time scales.
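The delta-change approach mentioned above can be sketched in a few lines (hypothetical numbers; the musica package implements far more careful multi-scale variants in R). Precipitation is commonly scaled multiplicatively by the ratio of future to control climate-model means, while temperature is shifted additively:

```python
# Hypothetical observed daily series (precipitation in mm, temperature in deg C).
obs_precip = [2.0, 0.0, 5.5, 1.2, 0.0, 3.3]
obs_temp = [14.0, 15.5, 13.2, 16.0, 14.8, 15.1]

# Hypothetical climate-model means for a control and a future period.
model_p_ctrl, model_p_fut = 3.0, 3.6    # mean precipitation
model_t_ctrl, model_t_fut = 14.5, 16.0  # mean temperature

# Delta change: multiplicative factor for precipitation, additive shift
# for temperature, applied to the observed series.
scaled_precip = [p * model_p_fut / model_p_ctrl for p in obs_precip]
shifted_temp = [t + (model_t_fut - model_t_ctrl) for t in obs_temp]
```

The abstract's point is that validating such transformations only at the calibration scale and only on precipitation and temperature can hide substantial biases once the series are routed through a hydrological model to produce runoff.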


Data / Relevant Datasets and their sources
« on: August 16, 2017, 05:42:38 PM »
Please find the attached document for datasets and their online links.

To read the "State of the Climate 2016" by American Meteorological Society, follow the link:

Thank you,
