Indian Forum for Water Adroit

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - Alok Pandey

Pages: [1] 2 3 ... 13
River linking may have the potential to accelerate global warming in the short term, in addition to its possible adverse effect on monsoon rainfall in India in the long term.

Article :
Rajamani, V., Mohanty, U.C., Ramesh, R., Bhat, G.S., Vinayachandran, P.N., Sengupta, D., PrasannaKumar, S. and Kolli, R.K., 2006. Linking Indian rivers vs Bay of Bengal monsoon activity. Indian Academy of Sciences.

Programming / List of R Packages for Hydro Research by Sam Zipper
« on: July 20, 2018, 03:42:50 PM »
Sam Zipper (@ZipperSam) has compiled a list of R packages that can be useful in water resources engineering and research.

Twitter Thread :
Link to Doc :

Models / Re: Statistical downscaling of GCMs
« on: October 18, 2017, 05:08:49 PM »

The problem you are facing needs a thorough look into the data used, the model created, and the results obtained. It will be very difficult for anyone to give a satisfactory answer without checking all the steps followed. I would suggest plotting the raw data of the variables at all grid points, the bias-corrected data, and the downscaled data, and then comparing all the plots for better understanding.
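As a rough sketch of such a side-by-side comparison in Python (the series below are entirely synthetic stand-ins; the mean-shift "bias correction" and all numbers are illustrative, not a recommended method):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, writes to file only
import matplotlib.pyplot as plt

# Synthetic stand-ins for the three stages (replace with your own series)
rng = np.random.default_rng(0)
raw = rng.normal(25.0, 3.0, 365)                # raw variable at one grid point
bias_corrected = raw - raw.mean() + 24.0        # toy mean-shift bias correction
downscaled = bias_corrected + rng.normal(0.0, 0.5, 365)

fig, ax = plt.subplots(figsize=(8, 4))
for label, series in [("raw", raw),
                      ("bias corrected", bias_corrected),
                      ("downscaled", downscaled)]:
    ax.plot(series, label=label, linewidth=0.8)
ax.set_xlabel("day")
ax.set_ylabel("temperature (deg C)")
ax.legend()
fig.savefig("comparison.png")
```

Plotting all three stages on one axis makes systematic offsets or sign errors between stages immediately visible.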

Programming / Re: Downscaling
« on: October 18, 2017, 05:01:31 PM »

We have discussed downscaling climate variables at length in this forum. A simple search would have yielded the desired results.

For example:
An R package for assessment of statistical downscaling methods for hydrological climate change impact,1307.msg3022.html#msg3022
Documentation of the Musica R package:

Discussion on statistical downscaling,1216.msg3050.html#msg3050,1239.msg2919.html#msg2919 (an article reference is also given; do read it for better understanding)

Also, it would be appreciated if the question precisely stated the problem faced in the research activity.

Models / Re: Statistical downscaling of GCMs
« on: October 05, 2017, 05:38:50 PM »
But in the results it was found that 50% of the downscaled temperature data has lower values for the higher RCP, i.e., the temperature for RCP 8.5 is lower than that for RCP 2.6 on certain days in the period 2020-2100. I have checked the procedure I followed, but I am not able to solve the above-mentioned problem.

Can you please suggest any solution for the above-mentioned problem?

My view :
That 50% of the downscaled temperature data has lower values for the higher RCP is an observation based on the analysis performed, not a problem. It also says nothing about trends in the results obtained (which may be present and should be looked into). Check the frequency of events (above a certain threshold) in the results.
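A minimal sketch of such a frequency check in Python, using synthetic toy series in place of actual downscaled output (the means, spreads, and threshold are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n_days = 29565  # roughly daily data for 2020-2100

# Synthetic downscaled daily temperatures for two scenarios (toy numbers)
rcp26 = rng.normal(26.0, 2.0, n_days)
rcp85 = rng.normal(27.5, 2.5, n_days)

# The "observation": fraction of days where RCP8.5 is cooler than RCP2.6
frac_cooler = float(np.mean(rcp85 < rcp26))

# The more informative check: frequency of days above a chosen threshold
threshold = 30.0
hot_26 = float(np.mean(rcp26 > threshold))
hot_85 = float(np.mean(rcp85 > threshold))
```

Even when a sizable fraction of individual days is cooler under the higher RCP, the exceedance frequency above a threshold can still be clearly larger, which is the climatologically relevant signal.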

Models / Re: Statistical downscaling of GCMs
« on: August 31, 2017, 04:17:57 PM »
You can use GrADS to regrid, and to automate regridding for many variables/files.

Models / Re: Statistical downscaling of GCMs
« on: August 11, 2017, 10:00:29 AM »
Not needed. After standardization, you can perform PCA.
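A bare-bones sketch of that sequence in Python with NumPy (the data matrix is synthetic; dimensions and names are illustrative):

```python
import numpy as np

# Synthetic predictor matrix: 1000 days x 8 grid-point variables (toy data)
rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 8)) @ rng.normal(size=(8, 8))  # correlated columns

# 1. Standardize each column to zero mean and unit variance
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# 2. PCA via eigendecomposition of the covariance matrix
cov = np.cov(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)     # returned in ascending order
order = np.argsort(eigvals)[::-1]          # sort descending by variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
scores = Z @ eigvecs                       # principal component time series
```

Standardizing first means the PCA operates on the correlation structure rather than being dominated by whichever variable has the largest units.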

Post your question/information / River Geomorphology Videos
« on: August 07, 2017, 07:55:04 AM »
Little River Research and Design (LRRD), under contract with the Missouri Department of Conservation, has produced these videos for educational purposes. Many of the videos were made using the Emriver movable-bed model.

Link :

Models / Re: Statistical downscaling of GCMs
« on: May 09, 2017, 07:54:09 PM »

The eigenvalues of the covariance matrix of the data matrix are used to select the number of newly formed variables (which have no direct physical significance).
To learn more about PCA, go through this educational video :
Lecture series on Stochastic Hydrology by Prof. P. P. Mujumdar, Department of Civil Engineering, IISc Bangalore
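A common way to apply this eigenvalue criterion is to keep enough components to explain a chosen fraction of the total variance. A sketch in Python (synthetic data; the 90% cutoff is an illustrative choice, not a universal rule):

```python
import numpy as np

# Synthetic standardized data: 500 samples x 10 variables (toy data)
rng = np.random.default_rng(3)
Z = rng.normal(size=(500, 10))
Z[:, 1] = Z[:, 0] + 0.1 * rng.normal(size=500)  # make two columns correlated
Z = (Z - Z.mean(axis=0)) / Z.std(axis=0)

# Eigenvalues of the covariance matrix, largest first
eigvals = np.sort(np.linalg.eigvalsh(np.cov(Z, rowvar=False)))[::-1]
explained = np.cumsum(eigvals) / eigvals.sum()

# Keep the smallest number of components explaining >= 90% of total variance
k = int(np.searchsorted(explained, 0.90) + 1)
```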

Models / Re: Statistical downscaling of GCMs
« on: April 27, 2017, 03:34:25 PM »

In downscaling, we usually take the grids that encompass the whole study area, not just the grid that falls within it. This way, the minimum number of grids for a study comes out to be 4 (for one climate variable). Thus a larger number of variables necessitates the use of PCA.

Abstract :
River piracy—the diversion of the headwaters of one stream into another one—can dramatically change the routing of water and sediment, with a profound effect on landscape evolution. Stream piracy has been investigated in glacial environments, but so far it has mainly been studied over Quaternary or longer timescales. Here we document how retreat of Kaskawulsh Glacier—one of Canada’s largest glaciers—abruptly and radically altered the regional drainage pattern in spring 2016. We use a combination of hydrological measurements and drone-generated digital elevation models to show that in late May 2016, meltwater from the glacier was re-routed from discharge in a northward direction into the Bering Sea, to southward into the Pacific Ocean. Based on satellite image analysis and a signal-to-noise ratio as a metric of glacier retreat, we conclude that this instance of river piracy was due to post-industrial climate change. Rapid regional drainage reorganizations of this type can have profound downstream impacts on ecosystems, sediment and carbon budgets, and downstream communities that rely on a stable and sustained discharge. We suggest that the planforms of Slims and Kaskawulsh rivers will adjust in response to altered flows, and the future Kaskawulsh watershed will extend into the now-abandoned headwaters of Slims River and eventually capture the Kluane Lake drainage.


Daniel H. Shugar, John J. Clague, James L. Best, Christian Schoof, Michael J. Willis, Luke Copland & Gerard H. Roe

Article Link :

Models / Re: Statistical downscaling of GCMs
« on: April 14, 2017, 09:01:26 AM »

"There are a lot of files with different variables in the NCEP-NCAR reanalysis data archive. Should I download all the files? How will we obtain NCEP-NCAR reanalysis data for the grid containing my study area?"
-> There is no need to download all the files. NCEP-NCAR provides the freedom to download a specific variable (pressure, humidity, air temperature, etc.) at a particular time scale (daily, monthly, etc.) corresponding to the region of interest. More information at the links below:

Models / Re: Statistical downscaling of GCMs
« on: April 12, 2017, 04:17:09 PM »

"Based on my understanding from the literature, in statistical downscaling, if we want future precipitation, first we have to select predictor variables (pressure, humidity, air temperature, etc.) from NCEP-NCAR reanalysis data and precipitation as the predictand (station data) for the same period. Then we find the relation between the predictand and the predictor variables in the NCEP-NCAR data, calibrate the model, then select the same predictor variables from the GCM (pressure, humidity, air temperature, etc.) and find the predictand (future station data). Is this true? Or is there any change from this?"
-> The above-mentioned approach is widely used. Many agencies now provide reanalysis data (e.g. NCEP-NCAR, ERA, JRA). The main assumption behind this approach is that the relationship between the reanalysis data (predictors) and station data (predictand) will be preserved in the future (i.e. a stationary relationship) and can thus be applied as-is to GCM data to obtain the final future projections of the predictand.
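The calibrate-then-transfer step can be sketched in Python with a simple least-squares model (all data below are synthetic; the three predictors, coefficients, and noise level are illustrative assumptions, and real studies use more elaborate regression or weather-typing schemes):

```python
import numpy as np

rng = np.random.default_rng(4)

# Calibration period: synthetic reanalysis predictors and station predictand
X_rean = rng.normal(size=(1000, 3))            # e.g. pressure, humidity, temp
beta_true = np.array([0.5, 1.2, -0.3])         # "true" relationship (toy)
y_station = X_rean @ beta_true + rng.normal(0.0, 0.1, 1000)

# Calibrate: ordinary least squares of predictand on reanalysis predictors
X1 = np.column_stack([np.ones(len(X_rean)), X_rean])
coef, *_ = np.linalg.lstsq(X1, y_station, rcond=None)

# Stationarity assumption: the same fitted relationship is applied,
# unchanged, to GCM predictors to project the future predictand
X_gcm = rng.normal(size=(200, 3))
y_future = np.column_stack([np.ones(len(X_gcm)), X_gcm]) @ coef
```

The last two lines are exactly where the stationarity assumption enters: nothing is re-estimated for the future period.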

"Or should we relate predictor variable in NCEP- NCAR data and predictor variables in GCM?"
-> I do not understand the rationale behind this. Where would this relationship be used in the analysis?

Models / Re: Statistical downscaling of GCMs
« on: April 07, 2017, 03:18:40 PM »
My view :

1. Can the GCMs of the CMIP5 project be statistically downscaled using the SDSM downscaling model? (In the example on the model webpage, only HadCM3 is used.)

-> Yes.

2. The meteorological parameters of SWAT are precipitation, min and max temperature, wind speed, humidity, and solar radiation. Is it necessary to downscale all parameters from the GCMs to use them in SWAT? Or can we use only precipitation and temperature?

-> Since all these parameters are also provided by the GCM, it is advisable or preferred (though not imposed) to use them in the analysis. However, if any other model provides these parameters as long-term forecasts, they can be used to enhance our perspective on the uncertainties associated with the models.

3. When downloading GCM output is it necessary to download files of all variables? Or need to download only precipitation and min and max temperature?

-> It is better to download only those files that will be used in the analysis.

4. Which is the best and simple method to statistically downscale GCM output?

-> Every method has its pros and cons; the word "best" is very subjective here. The complexity of downscaling methods can indeed be discussed. Useful link:,706.msg2064.html#msg2064

5.  If we are using SDSM downscaling model, on what basis we have to select predictor variables?

-> Predictor variables are selected based on their strong relationships (linear and/or non-linear) with the predictand. High correlation (also supported by the underlying physics) is one widely accepted measure.
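A minimal sketch of correlation-based screening in Python (all series are synthetic; the CMIP-style variable names are illustrative labels, not an endorsement of any particular predictor set):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 800
# Synthetic candidate predictors (names are illustrative CMIP-style codes)
candidates = {
    "slp": rng.normal(size=n),       # sea-level pressure
    "hus850": rng.normal(size=n),    # specific humidity at 850 hPa
    "ta500": rng.normal(size=n),     # air temperature at 500 hPa
}
# Toy predictand driven mostly by one candidate
predictand = 2.0 * candidates["hus850"] + rng.normal(0.0, 0.5, n)

# Rank candidates by absolute Pearson correlation with the predictand
corrs = {name: abs(np.corrcoef(series, predictand)[0, 1])
         for name, series in candidates.items()}
selected = max(corrs, key=corrs.get)
```

Correlation screening like this is only a first filter; the physical plausibility mentioned above should always be checked before a predictor is retained.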

P.S. Researchers are encouraged to add to and/or correct the points mentioned in this post.

Post your question/information / Hydrological data biases and errors
« on: March 28, 2017, 04:33:00 PM »
Article :

The ‘dirty dozen’ of freshwater science: detecting then reconciling hydrological data biases and errors

Abstract :

Sound water policy and management rests on sound hydrometeorological and ecological data. Conversely, unrepresentative, poorly collected, or erroneously archived data introduce uncertainty regarding the magnitude, rate, and direction of environmental change, in addition to undermining confidence in decision-making processes. Unfortunately, data biases and errors can enter the information flow at various stages, starting with site selection, instrumentation, sampling/measurement procedures, postprocessing and ending with archiving systems. Techniques such as visual inspection of raw data, graphical representation, comparison between sites, outlier and trend detection, and referral to metadata can all help uncover spurious data. Tell-tale signs of ambiguous and/or anomalous data are highlighted using 12 carefully chosen cases drawn mainly from hydrology ('the dirty dozen'). These include evidence of changes in site or local conditions (due to land management, river regulation, or urbanization); modifications to instrumentation or inconsistent observer behavior; mismatched or misrepresentative sampling in space and time; treatment of missing values, postprocessing and data storage errors. As well as raising awareness of pitfalls, recommendations are provided for uncovering lapses in data quality after the information has been gathered. It is noted that error detection and attribution are more problematic for very large data sets, where observation networks are automated, or when various information sources have been combined. In these cases, more holistic indicators of data integrity are needed that reflect the overall information life-cycle and application(s) of the hydrological data.

Link to article :
