This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.
Messages  Diwan
1
« on: January 17, 2018, 04:09:02 PM »
We present TerraClimate, a dataset of high-spatial-resolution (1/24°, ~4 km) monthly climate and climatic water balance for global terrestrial surfaces from 1958–2015. TerraClimate uses climatically aided interpolation, combining high-spatial-resolution climatological normals from the WorldClim dataset with coarser-resolution time-varying (i.e., monthly) data from other sources to produce a monthly dataset of precipitation, maximum and minimum temperature, wind speed, vapor pressure, and solar radiation. TerraClimate additionally produces monthly surface water balance datasets using a water balance model that incorporates reference evapotranspiration, precipitation, temperature, and interpolated plant-extractable soil water capacity. These data provide important inputs for ecological and hydrological studies at global scales that require high-spatial-resolution, time-varying climate and climatic water balance data. We validated spatiotemporal aspects of TerraClimate using annual temperature, precipitation, and calculated reference evapotranspiration from station data, as well as annual runoff from streamflow gauges. TerraClimate datasets showed notable improvement in overall mean absolute error and increased spatial realism relative to coarser-resolution gridded datasets.

Link to paper: https://www.nature.com/articles/sdata2017191
Link to data: http://doi.org/10.7923/G43J3B0R
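The water-balance idea in the abstract (storage updated by precipitation, depleted by evapotranspiration, overflowing as runoff) can be illustrated with a minimal single-bucket sketch. This is NOT the TerraClimate model — the function name, capacity, and forcing values below are made up purely for illustration:

```python
# Minimal monthly soil-water bucket sketch (illustrative only; NOT the
# TerraClimate water-balance model). All names and values are hypothetical.

def step_water_balance(storage, precip, pet, capacity):
    """Advance the bucket one month.

    storage  : soil water at start of month (mm)
    precip   : monthly precipitation (mm)
    pet      : reference (potential) evapotranspiration (mm)
    capacity : plant-extractable soil water capacity (mm)

    Returns (new_storage, aet, runoff).
    """
    water = storage + precip
    aet = min(pet, water)                # actual ET limited by available water
    water -= aet
    runoff = max(0.0, water - capacity)  # overflow above capacity becomes runoff
    storage = water - runoff
    return storage, aet, runoff

# Example: a wet month fills the bucket and generates runoff
s, aet, q = step_water_balance(storage=50.0, precip=120.0, pet=40.0, capacity=100.0)
print(s, aet, q)  # 100.0 40.0 30.0 -- storage capped at capacity, 30 mm spills
```

Real water-balance models (including TerraClimate's) add snow, temperature effects, and sub-monthly partitioning, but the bookkeeping above is the core idea.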
The following users thanked this post: Diwan
2
« on: October 23, 2017, 12:13:28 PM »
Recently, I published a couple of papers related to satellite microwave soil moisture research in the Advances in Water Resources journal. The process of retrieving soil moisture from satellite microwave sensors depends on the type of sensor, i.e., active or passive microwave. Over the past four decades, the microwave community has made progress in improving sensor design and retrieval algorithms so as to achieve accurate global-scale soil moisture observations.

The first paper, titled "Four decades of microwave satellite soil moisture observations: Part 1. A review of retrieval algorithms", gives a comprehensive overview of the developments that took place in the retrieval algorithms over the past four decades. The paper discusses the algorithmic developments for both active and passive sensors. We have also, for the first time, summarized the literature in the form of figures, one each for active and passive microwave soil moisture research (please find attached). The review also discusses the latest developments in components of the algorithms, as well as the challenges that need attention in the future. I can say that this paper serves as a starting point for someone who wants to venture into the field of microwave soil moisture research.

The second paper, titled "Four decades of microwave satellite soil moisture observations: Part 2. Product validation and intersatellite comparisons", is focused on assessing the accuracy of the soil moisture products developed over the past forty years. We considered eight passive (SMMR, SSM/I, TMI, AMSR-E, WindSat, AMSR2, SMOS, and SMAP), two active (ERS Scatterometer and MetOp-ASCAT), and one combined active-passive (ESA-CCI combined product) soil moisture products for the analysis, with the Contiguous United States (CONUS) as the case study.

The validation is carried out using data from 1058 stations over the CONUS, along with model soil moisture simulations obtained from the VIC land surface model. We analyzed these products in terms of daily coverage (a figure in this context is attached to this post), temporal performance, and spatial performance. We also carried out intersatellite comparisons to study the roles of sensor design and algorithms in retrieval accuracy. Part 1 serves as a prelude to this paper. Through these papers, one can get an idea of how the satellites and algorithms have progressed over the four decades and significantly improved the accuracy of soil moisture retrievals.

Part 1 can be downloaded for free from here: https://authors.elsevier.com/a/1VjkU16J1mlNrZ (until November 4, 2017)
Part 2 can be downloaded for free from here: https://authors.elsevier.com/a/1VoHy16J1mlNra (until November 17, 2017)

Here are the permanent links:
Part 1: http://www.sciencedirect.com/science/article/pii/S0309170817301859
Part 2: http://www.sciencedirect.com/science/article/pii/S0309170817301860

You can write an email to me at karthik120120@gmail.com for full-text requests or any other query related to the papers.
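For readers curious what the station-based temporal validation involves in practice, two commonly used soil-moisture metrics are the temporal (Pearson) correlation and the unbiased RMSE. The sketch below is generic and uses synthetic numbers — it is not code or data from the papers:

```python
# Sketch of two standard soil-moisture validation metrics: Pearson
# correlation and unbiased RMSE (ubRMSE). All series below are synthetic.
import math

def pearson_r(x, y):
    """Temporal correlation between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def ubrmse(sat, insitu):
    """RMSE after removing the mean bias between the two series."""
    n = len(sat)
    bias = sum(s - o for s, o in zip(sat, insitu)) / n
    return math.sqrt(sum((s - o - bias) ** 2 for s, o in zip(sat, insitu)) / n)

# Synthetic example: a retrieval that tracks the station with a constant dry bias
station   = [0.20, 0.25, 0.30, 0.28, 0.22]   # volumetric soil moisture (m3/m3)
satellite = [0.15, 0.20, 0.25, 0.23, 0.17]   # station minus 0.05 everywhere
print(round(pearson_r(satellite, station), 3))  # 1.0 -- perfectly correlated
print(round(ubrmse(satellite, station), 3))     # 0.0 -- bias fully removed
```

The example makes the point that a product can have a large bias yet still capture the temporal dynamics well, which is why both metrics are usually reported together.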
The following users thanked this post: Diwan
3
« on: October 23, 2017, 11:22:09 AM »
The work carried out by Ila Chawla along with Prof. P. P. Mujumdar, titled "Partitioning Uncertainty in Streamflow Projections under Nonstationary Model Conditions", was recently published in the Advances in Water Resources journal. In this work, the authors, in a novel attempt, have addressed the possibility of a non-stationary hydrological model (here, the VIC land surface model coupled with a routing model). They found that the hydrological model parameters, which are influenced by climate variables, vary over time and thus may not be assumed to be stationary. Further, the work involves attributing the total uncertainty in the streamflow projections to multiple effects, such as model parameters, GCM simulations, emission scenarios, land use scenarios, the assumption of hydrological model stationarity, and the internal variability of the model. The Upper Ganga Basin (UGB) is considered as the case study for this analysis.

Further details on the work can be found here: http://www.sciencedirect.com/science/article/pii/S0309170817300179

The authors are glad to share their article with interested readers. Correspondence to: pradeep@iisc.ac.in
The following users thanked this post: Diwan
4
« on: October 09, 2017, 11:18:02 AM »
The Variable Infiltration Capacity (VIC) hydrologic and river routing model simulates the water and energy fluxes that occur near the land surface and provides useful information regarding the quantity and timing of available water within a watershed system. However, despite its popularity, wider adoption is hampered by the considerable effort required to prepare model inputs and calibrate the model parameters. This study presents a user-friendly software package, named the VIC Automated Setup Toolkit (VIC-ASSIST), accessible through an intuitive MATLAB graphical user interface. VIC-ASSIST enables users to navigate the model-building process through prompts and automation, with the intention of promoting the use of the model for practical, educational, and research purposes. The automated processes include watershed delineation, climate and geographical input setup, model parameter calibration, sensitivity analysis, and graphical output generation. We demonstrate the package's utilities in various case studies.

Link: http://www.sciencedirect.com/science/article/pii/S1364815216308131
The following users thanked this post: Diwan
5
« on: September 06, 2017, 09:32:25 AM »
The India-UK Water Centre is inviting proposals from members of its Open Network of Water Scientists to apply for funding under one of two researcher exchange schemes. Funding is available to support at least two researcher exchanges to be undertaken during the period 1 January 2018 to 30 June 2018: at least one exchange by an Indian water scientist to the UK and at least one exchange by a UK water scientist to India.

1st September 2017: call opens
22nd September 2017: deadline for submission of application webform
10th October 2017: applicants notified of outcome

http://www.iukwc.org/opencallinvitationapplyresearcherexchange0
The following users thanked this post: Diwan
6
« on: August 26, 2017, 02:12:09 PM »
Due to inherent biases, climate-model-simulated precipitation and temperature cannot be used to drive a hydrological model without pre-processing, i.e., statistical downscaling. This often consists of reducing the bias in the climate model simulations (bias correction) and/or transforming the observed data in order to match the projected changes (delta change). The validation of statistical downscaling methods is typically limited to the scale for which the transformation was calibrated and to the driving variables (precipitation and temperature) of the hydrological model. The paper introduces an R package, "musica", which provides ready-to-use tools for routine validation of statistical downscaling methods at multiple time scales, as well as several advanced methods for statistical downscaling. The musica package is used to validate simulated runoff. It is shown that using conventional methods for downscaling of precipitation and temperature often leads to substantial biases in simulated runoff at all time scales.

LINK: https://cran.r-project.org/web/packages/musica/musica.pdf
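To make the delta-change transformation mentioned above concrete, here is a minimal sketch (in Python for illustration — musica itself is an R package, and this is not its implementation). Precipitation is usually scaled multiplicatively and temperature shifted additively; all numbers are invented:

```python
# Sketch of the delta-change transformation (illustrative; not the musica
# implementation). Multiplicative for precipitation, additive for
# temperature. All values are made up.

def delta_change_precip(obs, gcm_hist_mean, gcm_fut_mean):
    """Scale observed precipitation by the model-projected change ratio."""
    ratio = gcm_fut_mean / gcm_hist_mean
    return [p * ratio for p in obs]

def delta_change_temp(obs, gcm_hist_mean, gcm_fut_mean):
    """Shift observed temperature by the model-projected change."""
    delta = gcm_fut_mean - gcm_hist_mean
    return [t + delta for t in obs]

obs_p = [100.0, 80.0, 60.0]  # observed monthly precipitation (mm)
# Model projects a 10% wetter future (mean 90 mm -> 99 mm)
fut_p = delta_change_precip(obs_p, gcm_hist_mean=90.0, gcm_fut_mean=99.0)
print([round(p, 1) for p in fut_p])  # [110.0, 88.0, 66.0]
```

The paper's point is that validating such transformations only at the calibration time scale can hide large biases at other scales — e.g. a monthly-calibrated correction can still distort annual runoff.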
The following users thanked this post: Diwan
7
« on: May 19, 2017, 03:58:25 PM »
8
« on: April 13, 2017, 09:09:03 PM »
Union Science and Technology Minister Harsh Vardhan on Monday launched a new web portal, the Nakshe Portal, providing Indian citizens free downloads of Survey of India's topographic maps of the entire country in PDF format. http://soinakshe.uk.gov.in/
The following users thanked this post: Diwan
9
« on: January 26, 2017, 12:22:26 AM »
Dear Sat Kumar Tomer, I read many discussions in the forum and I saw that you work with Python and that you developed a script to perform bias correction using quantile mapping (I work with MATLAB and started to deal with Python last week). Here's my problem: I have to apply a bias correction to an RCM dataset using quantile mapping. I made a MATLAB script to do it, but I'm wondering if it could be faster with Python. I'm using an ensemble of 60 simulations at 10 km resolution over North America at the daily scale... I attached my code in case it helps any MATLAB users; it deals with NetCDF files. Many thanks in advance.
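For reference, the core of empirical quantile mapping can be sketched in a few lines of Python. This is a minimal sketch with synthetic data, not the script discussed in this thread; real applications typically work per month or season and need care with values outside the calibration range:

```python
# Minimal empirical quantile-mapping sketch (illustrative; not the script
# referred to in this thread). Each model value is replaced by the observed
# value at the same empirical quantile.
import bisect

def quantile_map(value, model_sorted, obs_sorted):
    """Map one model value onto the observed distribution."""
    n = len(model_sorted)
    # Empirical quantile of the value within the model's own climatology
    rank = bisect.bisect_left(model_sorted, value)
    q = min(rank, n - 1) / (n - 1)
    # Read the observed distribution at that quantile (nearest-rank lookup)
    idx = round(q * (len(obs_sorted) - 1))
    return obs_sorted[idx]

# Synthetic calibration data: the model is uniformly 2 units too wet
obs   = sorted([1.0, 2.0, 3.0, 4.0, 5.0])
model = sorted([3.0, 4.0, 5.0, 6.0, 7.0])
corrected = [quantile_map(v, model, obs) for v in [3.0, 5.0, 7.0]]
print(corrected)  # [1.0, 3.0, 5.0] -- the constant wet bias is removed
```

For 60 daily simulations on a 10 km grid, the speed question is less about the language than about vectorizing the per-pixel lookups (e.g. with NumPy) instead of looping cell by cell.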
The following users thanked this post: Diwan
10
« on: December 29, 2016, 03:31:06 PM »
I think MATLAB can be competent enough for GIS functionality; it's just that the toolbox is not well explored. MATLAB has a dedicated Mapping Toolbox. I had a task of plotting spatial maps to present my results and I was not satisfied with the colormaps of R and Python (maybe I was not patient enough to edit the preset colormaps to my requirements), which led me to MATLAB. I presented my findings from the toolbox in a MATLAB primer seminar series held at IISc. The following code plots a spatial map of mean daily rainfall (mm) over India. I plotted the same data in R and Python and felt MATLAB produced a satisfactory outcome.

% GeoSpatial Plotting in MATLAB using the Mapping Toolbox
% Unzip data.rar and change directory to the unzipped folder; load requisite data for the plot
load data_spatial_mapping.mat;

% Read shapefile of India
S = shaperead('India.shp', 'usegeocoords', true);

% Define map projection
axesm('MapProjection', 'eqdcylin')

% Set map axis limits and labels
setm(gca, 'GLineStyle', '--', 'GColor', [0.83 0.82 0.78], 'Grid', 'on', 'Frame', 'on')
setm(gca, 'MapLatLimit', [8 37.2], 'MapLonLimit', [68 97.5], ...
    'MlabelLocation', 5, 'MLabelParallel', 'north', 'MeridianLabel', 'on', ...
    'ParallelLabel', 'on', 'MlineLocation', 5, 'PlabelLocation', 4, 'PlineLocation', 4)

% Plot shapefile of India
geoshow(S.Lat, S.Lon, 'LineWidth', 2, 'Color', [0 0 0])

% Overlay with data
contour_num = 20;
contourfm(lat_mesh, lon_mesh, data_mesh, contour_num);
% geoshow(lat_mesh, lon_mesh, data_mesh, 'DisplayType', 'surface');
colormap hsv;
colorbar('SouthOutside');
tightmap

% Add a title to the map
xlabel('Mean Daily Rainfall (mm)', 'FontSize', 14)

I can say that 'setm' is the heart of this code; it sets the requisite parameters of the map. The list of functions in the Mapping Toolbox can be found here: https://in.mathworks.com/help/map/functionlist.html

I would suggest interested people edit this code and watch the changes happening in the map. I am still interested in learning in detail 'ggplot2' and the other important R packages that support GIS functionality. Although they are huge, I wish there were a tutorial that covers all aspects of these packages.
The following users thanked this post: Diwan
11
« on: December 23, 2016, 05:05:24 PM »
Q: According to you, which is good for both statistical and non-statistical analysis? Better, based on your experience, can you suggest what type of problems we should use MATLAB, R, and Python for, respectively? Or by learning which language can we solve the majority of problems?

A: R and MATLAB are relatively better than Python for advanced statistical analysis; however, the most commonly used statistics are available in all three. If you learn any one of these, learning the others is very easy, so just start with any one. R and Python are better than MATLAB for handling GIS data. Python is relatively better for handling strings.

Q: Which is easier to learn, since learning and becoming critically good — like knowing every nook and corner of any one programming language — is itself difficult and challenging?

A: MATLAB has few data types, so it is relatively easy to learn.

Q: Is there any YouTube channel or website where they teach these from scratch?

A: Search in Google and select the video option; you will find lots of videos. I prefer to learn from books. For MATLAB, I prefer Basics of MATLAB and Beyond by Andrew Knight. For Python, you can use Python in Hydrology; other books are also mentioned in the forum (just visit the Python section). For R, we added some notes, which you can find in the R section of the forum.

Q: Among MATLAB, R, and Python, which has the most GIS functionality?

A: Answered above.

Q: For beginners, what does one have to do to become extremely good in a specific programming language? The net is flooded with so many teaching modules that we can't check each one and come to a conclusion; if we try, we will be left learning fewer things with more time invested.

A: To learn programming, you need to program a lot. There is no other shortcut. Whatever assignment or problem you get, try to solve it only in that programming language.

Q: Is there any online certification test conducted by any community testing our potential in MATLAB, R, and Python? For example, the GIS Certification Institute (GISCI) is a non-profit organization that provides the GIS community with a complete certification program leading to GISP® (Certified GIS Professional) recognition. GISCI offers participants around the world, from the first early years on the job until retirement, a method of showing competence for professionals and employers in the GIS profession, and describes itself as the only industry-wide, internationally recognized, software-agnostic certification available to geospatial professionals. Similar to it, is there any institution which gives a complete certification program in any of these programming languages? If so, what is the language, what is the name of the institution, and how much do they charge? Does getting such a certification increase employability in a developing country like India?

A: My advice: focus on the learning. Also, when you pose a question, please put it in a separate post.
The following users thanked this post: Diwan
12
« on: December 17, 2016, 03:24:58 AM »
MATLAB has a facility called a MEX function (MATLAB EXecutable function), which is used to compile source code written in C, C++, or Fortran and invoke it from MATLAB. You create a C/C++/Fortran wrapper (the MEX function in this context), which passes data from MATLAB to the source code that does the computations, and passes the outputs back to MATLAB.

Flow: prepare your data in MATLAB -> pass data to the MEX function -> the MEX function passes data to the C/C++/Fortran code, which does the computation and produces output -> read the output in MATLAB.

Effectively, once you compile the MEX function, it acts just like any other MATLAB function, although it is written in C/C++/Fortran. The biggest advantage of this procedure is that the computations are extremely fast because of the precise memory allocations you carry out in your C/C++/Fortran code. So, you can write the core operations of your bulky MATLAB codes in these compiled languages and call them through a MEX function.

You can find further information at https://in.mathworks.com/help/matlab/matlab_external/introducingmexfiles.html

MATLAB documentation:
1) https://in.mathworks.com/help/pdf_doc/matlab/apiref.pdf
2) https://in.mathworks.com/help/pdf_doc/matlab/apiext.pdf

A tutorial: https://classes.soe.ucsc.edu/ee264/Fall11/cmex.pdf

I would like to illustrate this with an example where the task is to add two matrices. I write the following MEX function in C (the file is named addmat.c) to carry out the task.
/********* addmat.c ************/
#include "mex.h"

void mexFunction(int nlhs, mxArray *plhs[], int nrhs, const mxArray *prhs[])
{
    /* Declarations: A, B, C are pointers into the MATLAB arrays */
    double *A, *B, *C;
    int m, n, i, j;

    /* Input and output initializations.
       'mxGetPr' gives access to the data; 'mxGetM' & 'mxGetN' give the matrix size. */
    A = mxGetPr(prhs[0]);  /* First input matrix; prhs[0] is a pointer to the input */
    B = mxGetPr(prhs[1]);  /* Second input matrix; prhs[1] is a pointer to the input */
    m = mxGetM(prhs[0]);
    n = mxGetN(prhs[0]);

    /* Similar to prhs, plhs is an array of pointers which passes the expected
       output back to MATLAB. Create the output matrix with dimensions
       consistent with A and B. */
    plhs[0] = mxCreateDoubleMatrix(m, n, mxREAL);
    C = mxGetPr(plhs[0]);  /* Whatever we write through C is reflected in the output */

    /* Add the two matrices elementwise. MATLAB stores data column-major, but
       since the offsets j + i*n cover every one of the m*n entries exactly
       once, the elementwise sum is correct regardless of storage order. */
    for (i = 0; i < m; i++) {
        for (j = 0; j < n; j++) {
            C[j + i*n] = A[j + i*n] + B[j + i*n];
        }
    }
}

Once you write this C code, compile it by running mex addmat.c in MATLAB, which gives the following outcome:

>> mex addmat.c
Building with 'gcc'.
MEX completed successfully.

Once the MEX file is successfully compiled, you will see a file addmat.mexa64 in your working directory. That's it! Now you can use it in MATLAB to add two matrices:

>> a = [1 2; 3 4]; b = [7 2; 6 10];
>> a

a =

     1     2
     3     4

>> b

b =

     7     2
     6    10

>> c = addmat(a, b)

c =

     8     4
     9    14

One can write much more complicated C/C++/Fortran code and invoke it from MATLAB. Writing a MEX function is an efficient way of tackling cumbersome algorithms. A similar facility exists in R (https://www.rbloggers.com/threewaystocallccfromr/) and Python (https://docs.python.org/2/extending/extending.html), and the concept is very similar to that of MATLAB.
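Incidentally, the flat-index arithmetic in the MEX loop (C[j + i*n] = A[j + i*n] + B[j + i*n]) can be sanity-checked with a short sketch in plain Python: because every one of the m*n offsets is visited exactly once, the elementwise sum comes out right whether the buffers are stored row-major or column-major. The buffers below mimic MATLAB's column-major layout of the a and b matrices from the example:

```python
# Sanity check of the MEX loop's flat indexing: for i in [0, m) and
# j in [0, n), the offsets j + i*n cover 0 .. m*n-1 exactly once, so an
# elementwise add over flat buffers is correct for any storage order.
m, n = 2, 2

# a = [1 2; 3 4] and b = [7 2; 6 10], flattened column-major as MATLAB stores them
A = [1.0, 3.0, 2.0, 4.0]
B = [7.0, 6.0, 2.0, 10.0]
C = [0.0] * (m * n)

offsets = []
for i in range(m):
    for j in range(n):
        k = j + i * n
        offsets.append(k)
        C[k] = A[k] + B[k]

print(sorted(offsets))  # [0, 1, 2, 3] -- each offset touched exactly once
print(C)                # [8.0, 9.0, 4.0, 14.0], i.e. [8 4; 9 14] column-major
```

This matches the MATLAB session above: addmat([1 2; 3 4], [7 2; 6 10]) returns [8 4; 9 14].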
The following users thanked this post: Diwan
15
« on: November 21, 2016, 03:11:56 PM »
The Mann-Kendall (MK) test gives a holistic monotonic trend without any categorization of the time series into a set of clusters, whereas the innovative Şen method is cluster-based and provides categorical trend behavior in a given time series. The main purpose of this paper is to establish the important differences between these two approaches and their possible similarities. Please follow the link to access the paper: http://link.springer.com/article/10.1007/s11269-016-1478-4?wt_mc=alerts.TOCjournals
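For readers unfamiliar with the MK test, its core is just a count of concordant minus discordant pairs. A minimal sketch of the S statistic follows (the variance, tie correction, and significance test are omitted, as is Şen's method itself):

```python
# Minimal sketch of the Mann-Kendall S statistic (sign-based trend count).
# Only S is computed here; the variance, tie correction, and significance
# test of the full MK procedure are omitted.

def sign(x):
    """Sign of x: +1, 0, or -1."""
    return (x > 0) - (x < 0)

def mk_s(series):
    """S = number of increasing pairs minus number of decreasing pairs."""
    n = len(series)
    return sum(sign(series[k] - series[j])
               for j in range(n - 1)
               for k in range(j + 1, n))

print(mk_s([1, 2, 3, 4, 5]))  # 10: all 10 pairs increase -> strong upward trend
print(mk_s([5, 4, 3, 2, 1]))  # -10: monotonic decrease
print(mk_s([1, 3, 2, 4, 3]))  # 5: mixed signs -> weaker upward tendency
```

A single S value for the whole series is exactly the "holistic" behavior the paper contrasts with the Şen method's categorical, cluster-wise assessment.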
The following users thanked this post: Diwan
