Indian Forum for Water Adroit

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - Diwan

Pages: [1] 2
This article examines the effects of using leaves, something most students see every day and have some familiarity with, as an analogy for the concept of watersheds in an undergraduate water resources engineering course. The ultimate goal of the leaf/watershed analogy and associated instruction is to increase students’ understanding of hydrology principles, which in turn may facilitate better watershed management through increased public awareness, increased adoption of appropriate best management practices, and improved policy decisions. The assessment was performed with junior and senior undergraduate students enrolled in a Water Resource Engineering course. The assessment results showed that overall, students benefitted from the leaf analogy as a tool for learning watersheds. However, this effect varied depending on students’ learning style preferences.

Citation: Anandhi, A., Y. Yang, and M. Hubenthal. 2017. Using Leaves as a Model for Teaching Watershed Concepts in Natural Resources Science and Engineering Programs. Natural Sciences Education 46:170020. doi:10.4195/nse2017.09.0020
The following users thanked this post: Diwan

Programming / Re: Writing two dimensional netCDF file in python-Reg
« on: February 08, 2018, 05:30:35 PM »
Hi Diwan

I think the problem is with the variable names used. Coordinate variables associated with dimensions should have the same names as the dimensions. Change these lines

Code: [Select]
latitude = nc_file.createVariable('Latitude',float32,('lat',))
longitude = nc_file.createVariable('Longitude',float32,('lon',))

Modified:

Code: [Select]
#Creating variables
latitude = nc_file.createVariable('lat',float32,('lat',))
longitude = nc_file.createVariable('lon',float32,('lon',))
The following users thanked this post: Diwan

We present TerraClimate, a dataset of high-spatial-resolution (1/24°, ~4 km) monthly climate and climatic water balance for global terrestrial surfaces from 1958 to 2015. TerraClimate uses climatically aided interpolation, combining high-spatial-resolution climatological normals from the WorldClim dataset with coarser-resolution, time-varying (i.e., monthly) data from other sources to produce a monthly dataset of precipitation, maximum and minimum temperature, wind speed, vapor pressure, and solar radiation. TerraClimate additionally produces monthly surface water balance datasets using a water balance model that incorporates reference evapotranspiration, precipitation, temperature, and interpolated plant-extractable soil water capacity. These data provide important inputs for ecological and hydrological studies at global scales that require high-spatial-resolution, time-varying climate and climatic water balance data. We validated spatiotemporal aspects of TerraClimate using annual temperature, precipitation, and calculated reference evapotranspiration from station data, as well as annual runoff from streamflow gauges. TerraClimate datasets showed notable improvement in overall mean absolute error and increased spatial realism relative to coarser-resolution gridded datasets.
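For readers new to the climatic water balance idea, here is a one-bucket monthly sketch in Python. This is a Thornthwaite-type toy model for illustration only, not TerraClimate's actual model; the soil capacity, initial storage, and input series are assumptions:

```python
import numpy as np

def monthly_water_balance(precip, pet, capacity=100.0):
    """Return (actual ET, runoff, end-of-month soil moisture), all in mm."""
    soil = capacity / 2.0                    # initial storage (assumption)
    aet, runoff, storage = [], [], []
    for p, e in zip(precip, pet):
        supply = p + soil                    # water available this month
        a = min(e, supply)                   # ET limited by available water
        soil = supply - a
        r = max(soil - capacity, 0.0)        # excess above capacity runs off
        soil -= r
        aet.append(a); runoff.append(r); storage.append(soil)
    return np.array(aet), np.array(runoff), np.array(storage)

precip = np.array([120.0, 80.0, 20.0, 5.0])  # mm/month
pet = np.array([30.0, 60.0, 90.0, 110.0])    # reference ET, mm/month
aet, runoff, soil = monthly_water_balance(precip, pet)
```

The bucket conserves mass: precipitation equals actual ET plus runoff plus the change in storage, which is the kind of closure a water balance model enforces.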

Link to paper:
Link to data:
The following users thanked this post: Diwan

Recently, I have published a couple of papers related to satellite microwave soil moisture research in the Advances in Water Resources Journal.

The process of retrieving soil moisture from satellite microwave sensors depends on the type of sensor, i.e., active or passive microwave sensors. Over the past four decades, the microwave community has made progress in improving sensor design and retrieval algorithms so as to achieve accurate global-scale soil moisture observations.

The first paper, titled "Four decades of microwave satellite soil moisture observations: Part 1. A review of retrieval algorithms", gives a comprehensive overview of the developments in retrieval algorithms over the past four decades. The paper discusses the algorithmic developments for both active and passive sensors. We have also, for the first time, summarized the literature in the form of figures, one each for active and passive microwave soil moisture research (PFA). The review also covers the latest developments in the components of the algorithms and the challenges that need attention in the future. I can say that this paper serves as a starting point for anyone who wants to venture into the field of microwave soil moisture research.

The second paper, titled "Four decades of microwave satellite soil moisture observations: Part 2. Product validation and inter-satellite comparisons", focuses on assessing the accuracy of the soil moisture products developed over the past forty years. We considered eight passive (SMMR, SSM/I, TMI, AMSR-E, WindSat, AMSR2, SMOS, and SMAP), two active (ERS-Scatterometer and MetOp-ASCAT), and one combined active-passive (ESA-CCI combined) soil moisture product for the analysis. The Contiguous United States (CONUS) is considered as a case study. The validation is carried out using data from 1058 stations over the CONUS, along with model soil moisture simulations obtained from the VIC land surface model. We analyzed these products in terms of daily coverage (a figure in this context is attached with this post), temporal performance, and spatial performance. We also carried out inter-satellite comparisons to study the roles of sensor design and algorithms in retrieval accuracy. Part 1 serves as a prelude to this paper. Through these papers, one can get an idea of how the satellites and algorithms have progressed over four decades and significantly improved the accuracy of soil moisture retrievals.
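As an aside, the temporal-performance metrics typically used in such validations (bias, RMSE, unbiased RMSE, Pearson correlation) can be sketched as follows. This is my own illustration, not code from the papers, and the numbers are made up:

```python
import numpy as np

def performance_metrics(sat, insitu):
    """Bias, RMSE, unbiased RMSE and Pearson correlation."""
    bias = np.mean(sat - insitu)
    rmse = np.sqrt(np.mean((sat - insitu) ** 2))
    ubrmse = np.sqrt(max(rmse ** 2 - bias ** 2, 0.0))  # RMSE with mean bias removed
    r = np.corrcoef(sat, insitu)[0, 1]
    return bias, rmse, ubrmse, r

insitu = np.array([0.10, 0.15, 0.22, 0.30, 0.25])  # in-situ soil moisture, m^3/m^3
sat = insitu + 0.02                                # retrieval with a constant wet bias
bias, rmse, ubrmse, r = performance_metrics(sat, insitu)
```

A constant wet bias shows up entirely in the bias term: the unbiased RMSE is zero and the correlation stays perfect, which is why these metrics are usually reported together.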

Part 1 can be downloaded for free from here: (until November 4, 2017)
Part 2 can be downloaded for free from here: (until November 17, 2017)

Here are the permanent links:
Part 1:
Part 2:

You can write an email to me at for full-text requests or any other query related to papers.
The following users thanked this post: Diwan

The work carried out by Ila Chawla along with Prof. P.P. Mujumdar titled "Partitioning Uncertainty in Streamflow Projections under Nonstationary Model Conditions" was recently published in the Advances in Water Resources Journal.

In this work, the authors, in a novel attempt, have addressed the possibility of a nonstationary hydrological model (here, the VIC land surface model coupled with a routing model). They found that the hydrological model parameters, which are influenced by climate variables, vary over time and thus may not be assumed stationary. Further, the work attributes the total uncertainty in the streamflow projections to multiple sources, such as model parameters, GCM simulations, emission scenarios, land-use scenarios, the assumption of hydrological model stationarity, and the internal variability of the model. The Upper Ganga Basin (UGB) is considered as a case study for this analysis.

Further details on the work can be found here:

The authors are glad to share their article with interested readers.

Correspondence to:
The following users thanked this post: Diwan

The Variable Infiltration Capacity (VIC) hydrologic and river routing model simulates the water and energy fluxes that occur near the land surface and provides useful information regarding the quantity and timing of available water within a watershed system. However, despite its popularity, wider adoption is hampered by the considerable effort required to prepare model inputs and calibrate the model parameters. This study presents a user-friendly software package, named VIC-Automated Setup Toolkit (VIC-ASSIST), accessible through an intuitive MATLAB graphical user interface. VIC-ASSIST enables users to navigate the model building process through prompts and automation, with the intention to promote the use of the model for practical, educational, and research purposes. The automated processes include watershed delineation, climate and geographical input set-up, model parameter calibration, sensitivity analysis, and graphical output generation. We demonstrate the package's utilities in various case studies.


The following users thanked this post: Diwan

Announcements / INDIA-UK Exchange
« on: September 06, 2017, 09:32:25 AM »
The India-UK Water Centre is inviting proposals from members of its Open Network of Water Scientists to apply for funding under one of two researcher exchange schemes. Funding is available to support at least two researcher exchanges to be undertaken during the period 01 January 2018 - 30 June 2018: at least one exchange by an Indian water scientist to the UK and at least one exchange by a UK water scientist to India.
1st September 2017: call opens
22nd September 2017: deadline for submission of application webform
10th October 2017: applicants notified of outcome
The following users thanked this post: Diwan

Due to inherent bias, climate model simulated precipitation and temperature cannot be used to drive a hydrological model without pre-processing, i.e., statistical downscaling. This often consists of reducing the bias in the climate model simulations (bias correction) and/or transforming the observed data to match the projected changes (delta change). The validation of statistical downscaling methods is typically limited to the scale for which the transformation was calibrated and to the driving variables (precipitation and temperature) of the hydrological model. The paper introduces an R package, "musica", which provides ready-to-use tools for routine validation of statistical downscaling methods at multiple time scales, as well as several advanced methods for statistical downscaling. The musica package is used to validate simulated runoff. It is shown that using conventional methods for downscaling precipitation and temperature often leads to substantial biases in simulated runoff at all time scales.
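As a concrete illustration of the delta change transformation mentioned above, here is a minimal numpy sketch. This is my own illustration, not code from the musica package; the function name and the numbers are made up:

```python
import numpy as np

def delta_change(obs, model_ctrl, model_fut, multiplicative=True):
    """Scale (or shift) observations by the simulated change signal."""
    if multiplicative:                       # typical for precipitation
        return obs * (model_fut.mean() / model_ctrl.mean())
    return obs + (model_fut.mean() - model_ctrl.mean())  # e.g. temperature

obs = np.array([2.0, 0.0, 5.0, 1.0])         # observed precipitation, mm
ctrl = np.array([1.5, 0.5, 4.0, 2.0])        # model, control period
fut = ctrl * 1.2                             # model, future period (+20 %)
scenario = delta_change(obs, ctrl, fut)      # observations scaled by +20 %
```

Because only the mean change is transferred, the scenario series keeps the observed temporal structure, which is exactly the property multi-timescale validation is meant to probe.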

The following users thanked this post: Diwan

Data / Access to ESA Third party missions data
« on: May 19, 2017, 03:58:25 PM »

Union Science and Technology Minister Harsh Vardhan on Monday launched a new web portal - Nakshe Portal - providing free download of Survey of India's topographic maps of the entire country in PDF format to Indian citizens.
The following users thanked this post: Diwan

Post your question/information / Re: Quantile Mapping for bias correction
« on: January 26, 2017, 12:22:26 AM »
Dear Sat Kumar Tomer,

I have read many discussions in the forum and saw that you work with Python and have developed a script for bias correction using quantile mapping (I work with MATLAB and started with Python last week).

Here's my problem:
I have to apply a bias correction to an RCM dataset using quantile mapping. I made a MATLAB script to do it, but I'm wondering if Python could be faster. I'm using an ensemble of 60 simulations at 10 km resolution over North America at daily scale...
I attached my code in case it helps any MATLAB users... It deals with NetCDF files.

Many thanks in advance.
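Since the question is about doing this in Python, here is a minimal numpy sketch of empirical quantile mapping. This is my own illustration, not the poster's MATLAB script or Sat Kumar's code; the names and the synthetic data are made up:

```python
import numpy as np

def quantile_map(obs, model_hist, model_target):
    """Map model_target onto the observed distribution using the
    empirical quantiles of the calibration period."""
    quantiles = np.linspace(0.0, 1.0, 101)
    obs_q = np.quantile(obs, quantiles)
    mod_q = np.quantile(model_hist, quantiles)
    # Find each value's quantile in the model climatology, then read
    # off the observed value at that same quantile.
    ranks = np.interp(model_target, mod_q, quantiles)
    return np.interp(ranks, quantiles, obs_q)

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 3.0, size=5000)   # "observed" daily rainfall
model = obs * 0.7 + 1.0                # model output with a known bias
corrected = quantile_map(obs, model, model)
```

In practice you would calibrate on a historical period and then apply the mapping to the projection period (model_target = the future simulation); vectorizing over grid cells with numpy is where Python tends to beat loop-heavy MATLAB code.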
The following users thanked this post: Diwan

Programming / Re: R vs Python
« on: December 29, 2016, 03:31:06 PM »
I think MATLAB can be competent enough for GIS functionality; it's just that the toolbox is not well explored. MATLAB has a dedicated Mapping Toolbox. I had a task of plotting spatial maps to present my results and I was not satisfied with the colormaps in R and Python (maybe I was not patient enough to edit the preset colormaps to my requirements), which led me to MATLAB. I presented my findings from the toolbox in a MATLAB primer seminar series held at IISc. The following code plots a spatial map of mean daily rainfall (mm) over India. I plotted the same data in R and Python and felt MATLAB produced the most satisfactory outcome.

Code: [Select]
% Geo-Spatial Plotting in MATLAB using the Mapping Toolbox %

% Unzip data.rar and change directory to the unzipped folder; load requisite data for the plot
load data_spatial_mapping.mat;
% Read shapefile of India
S = shaperead('India.shp','usegeocoords',true);
% Define map projection
axesm('MapProjection', 'eqdcylin')
% Set map grid, frame, axis limits and labels
setm(gca,'GLineStyle','--', 'GColor',[0.83 0.82 0.78], 'Grid','on','Frame','on')
setm(gca, 'MapLatLimit',[8 37.2],'MapLonLimit',[68 97.5],...
    'MlabelLocation', 5,'MLabelParallel','north', 'MeridianLabel','on',...
    'PlabelLocation', 5,'ParallelLabel','on')   % label spacing here is illustrative
% Plot shapefile of India
geoshow(S.Lat,S.Lon,'LineWidth', 2,'Color',[0 0 0])
% Overlay with data
geoshow(lat_mesh,lon_mesh,data_mesh,'DisplayType','surface');
colormap hsv;
xlabel('Mean Daily Rainfall (mm)','FontSize',14)      % Add axis label to the map

I can say that 'setm' is the heart of this code; it sets the requisite parameters of the map.
The list of functions in the Mapping Toolbox can be found here:

I would encourage interested readers to edit this code and observe the resulting changes in the map.

I am still interested in learning in detail 'ggplot2' and the other important R packages that support GIS functionality. Although they are huge, I wish there were a tutorial covering all aspects of these packages.
The following users thanked this post: Diwan

Programming / Re: R vs Python
« on: December 23, 2016, 05:05:24 PM »
According to you, which is good for both statistical and non-statistical analysis? Based on your experience, can you suggest what types of problems we should use MATLAB, R, and Python for, respectively? Or by learning which one language can we solve the majority of problems?

R and MATLAB are relatively better than Python for advanced statistical analysis; however, the most commonly used statistics are available in all three.
If you learn any one of these, learning the others is very easy, so just start with any one. R and Python are better than MATLAB for handling GIS data. Python is relatively better for handling strings.

Which is easier to learn, since becoming really good at any one programming language, knowing its every nook and corner, is itself difficult and challenging?
MATLAB has fewer data types, so it is relatively easy to learn.

Is there any YouTube channel or website where they teach these languages from scratch?
Search on Google and select the video option; you will find lots of videos. I prefer to learn from books. For MATLAB, I prefer Basics of MATLAB and Beyond by Andrew Knight. For Python, you can use Python in Hydrology; other books are also mentioned in the forum, so just visit the Python section. For R, we added some notes, which you can find in the R section of the forum.

Among MATLAB, R, and Python, which has the most GIS functionality?
Answered above.

For beginners, what should one do to become extremely good in a specific programming language? The net is flooded with so many teaching modules that we can't check each one and come to a conclusion; if we try, we end up learning less with more time invested.
To learn programming, you need to program a lot; there is no other shortcut. Whatever assignment or problem you get, try to solve it only in that programming language.

Is there any online certification test conducted by a community to assess proficiency in MATLAB, R, or Python, similar to what the GIS Certification Institute (GISCI), a non-profit organization, offers the GIS community through its GISP (Certified GIS Professional) program, an industry-wide, internationally recognized, software-agnostic certification? If so, which institution offers it, for which language, and how much does it charge? Does such a certification increase employability in a developing country like India?
My advice: focus on the learning.

When you pose a question, please put it in a separate post.
The following users thanked this post: Diwan

MATLAB has a facility called MEX functions (MATLAB EXecutable functions), which is used to compile source code written in C, C++, or Fortran and invoke it from MATLAB. Through it you create a C/C++/Fortran wrapper (the MEX function in this context), which passes data from MATLAB to your source code, which does the computations and passes the outputs back to MATLAB.

MATLAB----> Prepare your data in MATLAB ----> Pass data to MEX function ----> MEX function passes data to C/C++/Fortran codes which does computation and produces output ----> Read output in MATLAB

Effectively, once you compile the MEX function, it acts just like any other MATLAB function, although it is written in C/C++/Fortran.

The biggest advantage of this procedure is that the computations are extremely fast because of the precise memory allocation you carry out in your C/C++/Fortran code.

So, you can write the core operations of your bulky MATLAB code in these compiled languages and call them using MEX functions.

You can find further information in the MATLAB documentation: 1) 2)

Some tutorial:

I would like to illustrate this with an example where the task is to add two matrices. I wrote the following MEX function in C (the file is named addmat.c) to carry out the task.

Code: [Select]
/********* addmat.c ************/
#include "mex.h"

void mexFunction( int nlhs, mxArray *plhs[],
                  int nrhs, const mxArray *prhs[])
{
    /* Declarations */
    double *A, *B, *C; /* A, B, C are pointers to the matrix data */
    mwSize m, n, i, j;

    /************** INPUT AND OUTPUT INITIALIZATIONS ***************/
    /* 'mxGetPr' returns a pointer to the data; 'mxGetM' & 'mxGetN' give the matrix size */
    A = mxGetPr(prhs[0]); /* First input matrix; prhs[0] is a pointer to the input */
    B = mxGetPr(prhs[1]); /* Second input matrix */
    m = mxGetM(prhs[0]);
    n = mxGetN(prhs[0]);

    /* Similar to prhs, 'plhs' is an array of pointers that pass the expected output to MATLAB */
    /* Create the output matrix, with dimensions consistent with A and B */
    plhs[0] = mxCreateDoubleMatrix(m, n, mxREAL);
    C = mxGetPr(plhs[0]); /* Whatever we write through C is reflected in the output */

    /************** CODE TO ADD TWO MATRICES ***************/
    /* MATLAB stores matrices column-major, hence the j*m + i indexing */
    for (i = 0; i < m; i++)
        for (j = 0; j < n; j++)
            C[j*m + i] = A[j*m + i] + B[j*m + i];
}
Once you write this C code, the MEX function can be compiled by running the line >> mex addmat.c in MATLAB, and you get the following outcome.
Code: [Select]
>> mex addmat.c
Building with 'gcc'.
MEX completed successfully.

Once the MEX file compiles successfully, you will see a file addmat.mexa64 (the extension is platform-specific; .mexa64 is 64-bit Linux) created in your working directory. That's it! Now you can use your MEX function in MATLAB as follows to add two matrices.
Code: [Select]
>> a=[1 2;3 4];b=[7 -2;6 -10];
>> a

a =

     1     2
     3     4

>> b

b =

     7    -2
     6   -10

>> c=addmat(a,b)

c =

     8     0
     9    -6

One can write much more complicated C/C++/Fortran code and invoke it from MATLAB.

Writing MEX functions is one of the most efficient ways of tackling cumbersome algorithms. Such a facility exists in R and Python too, and the concept is very similar to that of MATLAB.
The following users thanked this post: Diwan

Programming / R vs Python
« on: December 08, 2016, 05:46:27 PM »
This blog gives a nice introduction and comparison between R and Python. Some popular packages, and which scripting language to choose for your work, are described eloquently.
The following users thanked this post: Diwan
