Indian Forum for Water Adroit

Show Posts



Messages - ASHWATHI V K

1
Models / Statistical downscaling of GCMs with bias correction
« on: December 18, 2017, 04:20:15 PM »
Hi,

Please find below the procedure I have used for downscaling:

Downscaling Method:

GCMs - GFDL-CM3 and MRI-CGCM3 from the CMIP5 project, for RCP2.6, RCP4.5 and RCP8.5

Predictor variables - sea level pressure, surface air temperature, zonal wind speed at 500 hPa, zonal wind speed at 850 hPa, meridional wind speed at 500 hPa, meridional wind speed at 850 hPa, zonal wind speed at the surface, meridional wind speed at the surface, geopotential height at 500 hPa, and geopotential height at 850 hPa.

Calibration data - ERA-Interim daily data of all the predictor variables.

Procedure:
- Downloaded daily data of the above-mentioned predictor variables from the ERA-Interim website for the period 1979-2012 (2.5 degree resolution data for GFDL-CM3 and 1.125 degree resolution data for MRI-CGCM3). The data include all the grids surrounding the study area.
- Extracted the data from the NetCDF files to Excel files using an R program and removed the 29 February data of leap years.
- Standardised the data of every predictor variable for all the grids.
- Performed principal component analysis using SPSS and selected 50 principal components, which explain 97% of the variance.
- Divided the data into calibration and validation datasets.
- Performed multiple linear regression between the 50 principal components of the calibration dataset and the observed station temperature using SPSS, and computed the correlation coefficients.
- Using the regression coefficients from calibration, estimated the station temperature from the 50 principal components in the validation period and computed the correlation coefficients.
- Calibration and validation results are attached (PFA).
- Downloaded daily data of the above-mentioned predictor variables of the GFDL-CM3 and MRI-CGCM3 models from the CMIP5 project for the three scenarios for the period 2020 to 2100.
- Extracted and resampled the data of all the GCM predictor variables using an R program so that the final selected grids are the same as those of the ERA-Interim datasets, and converted them to Excel format. This was done separately for each scenario of each GCM.
- Standardised all the variables.
- Performed principal component analysis of all the standardised variables and selected 50 principal components, which explain 97% of the variance.
- Using the coefficients obtained from the multiple linear regression during calibration, computed the temperature for each station (an R sketch of these core steps is given below).
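
For reference, here is a minimal R sketch of the standardise / PCA / multiple-linear-regression steps above (which I actually did in SPSS), using synthetic placeholder data in place of the real predictor grids and station series; the sizes, calibration/validation split and 97% threshold are placeholders:

set.seed(1)
n_days <- 1000                                   # placeholder number of days
n_cols <- 120                                    # placeholder: grid points x predictor variables
predictors <- matrix(rnorm(n_days * n_cols), n_days, n_cols)
obs_temp   <- rnorm(n_days, mean = 28, sd = 2)   # placeholder observed station temperature

pred_std <- scale(predictors)                               # standardise each column (mean 0, sd 1)
pca      <- prcomp(pred_std, center = FALSE, scale. = FALSE)
var_expl <- cumsum(pca$sdev^2) / sum(pca$sdev^2)            # cumulative variance explained
n_pc     <- which(var_expl >= 0.97)[1]                      # number of PCs giving ~97% variance
scores   <- as.data.frame(pca$x[, 1:n_pc])

calib <- 1:700                                              # calibration period (placeholder split)
valid <- 701:n_days                                         # validation period

fit        <- lm(obs_temp[calib] ~ ., data = scores[calib, ])   # multiple linear regression on the PCs
pred_valid <- predict(fit, newdata = scores[valid, ])
cor(obs_temp[valid], pred_valid)                            # validation correlation coefficient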

After following the above procedure, I am not getting proper downscaled values. I want to check how my results change when I include bias correction.

How can I include bias correction in the above procedure? Can I apply bias correction directly to the downscaled result?
Please help me.
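
For reference, one simple form of bias correction I am considering is mean-and-variance scaling of the downscaled series against observations over a common baseline period; a minimal base-R sketch with placeholder vectors (obs_baseline, model_baseline and model_future are made up):

set.seed(1)
obs_baseline   <- rnorm(365, mean = 27.0, sd = 1.5)   # placeholder observed temperature, baseline period
model_baseline <- rnorm(365, mean = 28.2, sd = 2.0)   # placeholder downscaled temperature, same period
model_future   <- rnorm(365, mean = 29.0, sd = 2.0)   # placeholder downscaled temperature, RCP scenario

# remove the model's baseline mean bias, rescale its variability to the observed
# variability, then re-centre on the observed baseline mean
corrected <- (model_future - mean(model_baseline)) *
             (sd(obs_baseline) / sd(model_baseline)) + mean(obs_baseline)
summary(corrected)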


With Regards
Ashwathi V K


2
Models / Re: Statistical downscaling of GCMs
« on: October 25, 2017, 03:02:17 PM »
Hi,

Can anybody tell me the R function to get the values of the optimum parameters after performing support vector regression?
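
For illustration, one possibility I have come across is tune.svm() from the e1071 package, whose result carries the selected parameters in $best.parameters; a minimal sketch with made-up data (I am not sure whether this is the recommended function, hence my question):

library(e1071)

set.seed(1)
x   <- matrix(rnorm(200 * 5), 200, 5)                       # placeholder predictors (e.g. principal components)
y   <- as.numeric(x %*% rnorm(5)) + rnorm(200, sd = 0.3)    # placeholder predictand
dat <- data.frame(y = y, x)

# grid search over gamma and cost by cross-validation
tuned <- tune.svm(y ~ ., data = dat, gamma = 10^(-3:0), cost = 10^(0:2))

tuned$best.parameters            # optimum gamma and cost found by the search
tuned$best.performance           # cross-validated error at the optimum
best_model <- tuned$best.model   # SVR refitted with the optimum parameters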

With regards
Ashwathi V K

3
Models / Re: Statistical downscaling of GCMs
« on: October 06, 2017, 11:14:04 AM »
Hi Alok and Rohith,

But my doubt is that, for RCP8.5, most of the temperatures should be greater than those for RCP2.6 because of the higher emissions in RCP8.5, shouldn't they? Yet the average of the daily downscaled temperature for RCP2.6, 4.5 and 8.5 over the period 2020 to 2100 is exactly the same.

With Regards
Ashwathi V K

4
Models / Re: Statistical downscaling of GCMs
« on: September 28, 2017, 05:34:06 PM »
Hi,

I have a doubt about the results of PCA-regression statistical downscaling of temperature. I have downscaled GFDL-CM3 and MRI-CGCM3 of the CMIP5 project for scenarios RCP2.6, 4.5 and 8.5 and obtained temperature for the period 2020 to 2100 for different stations. I used ERA-Interim daily data for calibration. The correlation coefficients of the calibration and validation results for the different stations range between 0.65 and 0.8.
But in the results I found that 50% of the downscaled temperature data have lower values for the higher RCP, i.e. the temperature for RCP8.5 is lower than that for RCP2.6 on certain days in the period 2020-2100. I have checked my procedure but am not able to solve this problem.

Can you please suggest a solution for the above-mentioned problem?

With Regards
Ashwathi V K

5
Models / Re: Statistical downscaling of GCMs
« on: August 30, 2017, 05:25:32 PM »
Hi,
I have a NetCDF file (GCM data) with 2.5 x 2 degree resolution, which has 4 dimensions (lat, lon, time and pressure level) and 1 variable (for example, uwind). I also have ERA-Interim climate data at 2.5 x 2.5 degree resolution. I want to change the resolution of the GCM data to that of the ERA-Interim data so that the grid points of both files overlap, then extract data from the resampled GCM file for the points between latitude 7.5 and 15 degrees N and longitude 72.5 and 80 degrees E at the 500 hPa and 750 hPa pressure levels, and convert them to a CSV file, so that I can use the data directly for PCA. I have to do this for about 100 files. Is there any way to do this for all the files together? I tried with an R program but failed. Please help me.
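
For illustration, here is a rough sketch of what I am attempting with the raster package (the folder name, variable name and the pressure-level indices are placeholders, and the raster and ncdf4 packages are assumed to be installed):

library(raster)   # reads NetCDF files as RasterBrick objects (needs ncdf4 installed)

files <- list.files("gcm_data", pattern = "\\.nc$", full.names = TRUE)   # placeholder folder
dir.create("csv_out", showWarnings = FALSE)

# template grid matching the ERA-Interim resolution (2.5 x 2.5 degree), placeholder extent
template <- raster(xmn = 70, xmx = 82.5, ymn = 5, ymx = 17.5,
                   resolution = 2.5, crs = "+proj=longlat +datum=WGS84")
study_area <- extent(72.5, 80, 7.5, 15)   # lon 72.5-80 E, lat 7.5-15 N

for (f in files) {
  for (lev in c(5, 7)) {                  # placeholder level indices for 500 hPa and 750 hPa
    b      <- brick(f, varname = "ua", level = lev)        # "ua" is a placeholder variable name
    b_res  <- resample(b, template, method = "bilinear")   # regrid to the ERA-Interim grid
    b_crop <- crop(b_res, study_area)                      # keep only the study-area points
    df     <- as.data.frame(b_crop, xy = TRUE)             # one row per grid point, one column per time step
    out    <- file.path("csv_out",
                        paste0(tools::file_path_sans_ext(basename(f)), "_lev", lev, ".csv"))
    write.csv(df, out, row.names = FALSE)
  }
}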

With Regards
Ashwathi V K


6
Models / Re: Statistical downscaling of GCMs
« on: August 10, 2017, 11:11:46 AM »
Hi,

While doing statistical downscaling, after selecting the predictor variables for all the grids surrounding the study area, should I take the average over all the grids for each variable before doing principal component analysis?

Please help me.

With Regards
Ashwathi V K

7
Models / Re: Statistical downscaling of GCMs
« on: July 14, 2017, 11:28:08 AM »
Hi,

I have downloaded some predictor variables from the GFDL climate model of the CMIP5 project, which are in NetCDF format. The files have four dimensions (lat, lon, time and pressure level). I want to extract the data for one pressure level (500 hPa). Is there any method to extract these values? I have tried with an R program but failed. Please help me.
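
For illustration, a minimal ncdf4 sketch of one possible way (the file and variable names are placeholders; it assumes the usual CMIP5 dimension order lon, lat, plev, time and a pressure coordinate named plev in Pa, which should be checked with print(nc) first):

library(ncdf4)

nc <- nc_open("ua_day_GFDL-CM3_rcp85_r1i1p1_20200101-20241231.nc")   # placeholder file name
print(nc)                              # inspect variable and dimension names and their order

plev <- ncvar_get(nc, "plev")          # CMIP5 pressure levels are usually in Pa
k    <- which(plev == 50000)           # index of the 500 hPa level

# read count = 1 along the plev dimension (starting at level k), everything else in full
ua500 <- ncvar_get(nc, "ua",
                   start = c(1, 1, k, 1),
                   count = c(-1, -1, 1, -1))
dim(ua500)                             # now a lon x lat x time array
nc_close(nc)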


With regards
Ashwathi V K

8
Post your question/information / Calibration of SWAT model
« on: June 29, 2017, 02:36:34 PM »
I have simulated streamflow using the QGIS interface of the SWAT model (QSWAT) in a semi-arid catchment with forest and agricultural land cover, for a total of 13 years (3 years warm-up, 8 years calibration, 2 years validation). The catchment area is 1200 sq. km. After the simulation, R^2 and NS were found to be 0.78 and -0.21.

I tried calibrating the model using SWAT-CUP, following the paper http://www.sciencedirect.com/science/article/pii/S0022169415001985, for the calibration period. In SWAT-CUP, I initially set up the run with the initial model results, taking the calibration period as the simulation period. The NS was then found to be -1.16 and R^2 to be 0.54.

Since the simulated flow overestimates the observed flow, in the first two simulations I decreased CN and increased SOL_AWC and ESCO, and in the third and fourth simulations I increased GW_REVAP, GWQMN and GW_DELAY and decreased REVAPMN and ALPHA_BF. I have done 4 iterations of 150 simulations each, but there is not much improvement in performance. I have attached my output flow from the QSWAT model. I request you to suggest which parameters need to be changed during calibration so that I get good performance.


With Regards
Ashwathi V K

9
Models / Re: I need help to run Skehekin sample with VIC 5.0
« on: June 29, 2017, 10:59:18 AM »
Hi,

I had the same problem. The error is due to a mismatch between the input file path given in the global parameter file and the actual path.

With Regards
Ashwathi

10
Models / Re: Statistical downscaling of GCMs
« on: June 29, 2017, 10:02:15 AM »
Hi,

Thank you, Rohith.

Do you have any material on the quantile mapping method of downscaling?
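
For reference, my current understanding of empirical quantile mapping, as a minimal base-R sketch with placeholder data (I believe the CRAN package qmap provides ready-made functions for the same idea):

set.seed(1)
obs_cal   <- rgamma(1000, shape = 2, scale = 5)     # placeholder observed values, calibration period
model_cal <- rgamma(1000, shape = 2, scale = 7)     # placeholder model values, same period
model_fut <- rgamma(1000, shape = 2.2, scale = 7)   # placeholder model values, future period

# empirical quantile mapping: replace each future model value by the observed value
# at the same quantile of the calibration-period distributions
probs <- seq(0.01, 0.99, by = 0.01)
q_mod <- quantile(model_cal, probs)
q_obs <- quantile(obs_cal, probs)
mapped <- approx(x = q_mod, y = q_obs, xout = model_fut, rule = 2)$y   # rule = 2: constant beyond the ends

summary(model_fut)
summary(mapped)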


With regards
Ashwathi V K

11
Models / Re: Statistical downscaling of GCMs
« on: May 29, 2017, 04:19:30 PM »

Hi,

Are there any GCMs of finer resolution (1 degree x 1 degree) available under the CMIP5 project with the RCP2.6 scenario?

With Regards
Ashwathi V K

12
Models / Re: Statistical downscaling of GCMs
« on: May 25, 2017, 12:04:08 PM »
Hi,

For downscaling, I selected predictor variables such as air temperature at 2 m, air temperature at 250 hPa, u wind speed at 250 hPa, v wind speed at 250 hPa, geopotential height at 500 hPa, mean sea level pressure, specific humidity at 2 m and vertical wind velocity from the NCEP-NCAR reanalysis data. I standardised the data and performed principal component analysis using XLSTAT, obtaining 5 principal components. Then I subtracted the mean from the observed station data (predictand) and performed multiple regression using XLSTAT. After the regression my correlation is very poor, approximately 0.2. Is this acceptable? How can I improve the correlation? Is there any mistake in this? Please help me.


With regards
Ashwathi V K

13
Models / Re: Statistical downscaling of GCMs
« on: May 09, 2017, 05:07:14 PM »
Hi,

In the procedure for statistical downscaling of precipitation, I have selected 8 variables from the NCEP-NCAR reanalysis data and performed principal component analysis using XLSTAT. I have got some results after the PCA (attached). Based on which value in the results should I select the important variables out of the 8? I am totally confused. Please help me.
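
For illustration, a minimal R sketch (with placeholder data) of the two quantities I think the selection is usually based on: the cumulative proportion of variance explained by the components, and the loadings showing how strongly each original variable contributes to the retained components:

set.seed(1)
X <- matrix(rnorm(500 * 8), 500, 8)      # placeholder: 8 standardised predictor series
colnames(X) <- paste0("var", 1:8)        # placeholder variable names

pca <- prcomp(X, center = TRUE, scale. = TRUE)

summary(pca)$importance     # row "Cumulative Proportion": variance explained by the first k components
round(pca$rotation, 2)      # loadings: contribution of each original variable to each component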

With regards
Ashwathi 

14
Models / Re: Statistical downscaling of GCMs
« on: May 03, 2017, 01:45:16 PM »
Hi,

I have NCEP/NCAR reanalysis data in NetCDF format and want to convert it into a CSV file. I have an R script to convert NetCDF to CSV, but since the data in the NetCDF file are too large and exceed the number of columns allowed in a CSV (spreadsheet) file, I am not able to get the entire data in CSV format. How can I split the data in the .nc file into multiple CSV files? (One possible workaround is sketched after the script below.)

Script is as follows:

library(ncdf4)
getwd()
workdir <- "I:\\NCEP\\"
setwd(workdir)
# open the NetCDF file and list its contents
ncin <- nc_open("X27.251.228.22.106.3.5.59.nc")
print(ncin)
dname <- "hgt"   # name of the variable to extract (geopotential height)
lon <- ncvar_get(ncin, "lon")
nlon <- dim(lon)
head(lon)
lat <- ncvar_get(ncin, "lat", verbose = F)
nlat <- dim(lat)
head(lat)
print(c(nlon, nlat))
t <- ncvar_get(ncin, "time")
tunits <- ncatt_get(ncin, "time", "units")
nt <- dim(t)
tmp.array <- ncvar_get(ncin, dname)   # read the full data array for the chosen variable
dlname <- ncatt_get(ncin, dname, "long_name")
dunits <- ncatt_get(ncin, dname, "units")
fillvalue <- ncatt_get(ncin, dname, "_FillValue")
dim(tmp.array)
title <- ncatt_get(ncin, 0, "title")
institution <- ncatt_get(ncin, 0, "institution")
datasource <- ncatt_get(ncin, 0, "source")
references <- ncatt_get(ncin, 0, "references")
history <- ncatt_get(ncin, 0, "history")
Conventions <- ncatt_get(ncin, 0, "Conventions")
# split the time units string into fields
tustr <- strsplit(tunits$value, " ")
tdstr <- strsplit(unlist(tustr)[3], "-")
tmonth = as.integer(unlist(tdstr)[2])
tday = as.integer(unlist(tdstr)[3])
tyear = as.integer(unlist(tdstr)[1])
chron::chron(t, origin = c(tmonth, tday, tyear))
tmp.array[tmp.array == fillvalue$value] <- NA   # replace fill values with NA
length(na.omit(as.vector(tmp.array[, , 1])))
m <- 1                                          # index of the time slice to export first
tmp.slice <- tmp.array[, , m]
lonlat <- expand.grid(lon, lat)
tmp.vec <- as.vector(tmp.slice)
length(tmp.vec)
tmp.df01 <- data.frame(cbind(lonlat, tmp.vec))
names(tmp.df01) <- c("lon", "lat", paste(dname, as.character(m), sep = "_"))
head(na.omit(tmp.df01), 20)
csvfile <- "cru_tmp_1.csv"
write.table(na.omit(tmp.df01), csvfile, row.names = FALSE, sep = ",")
# reshape the whole array into a (grid point) x (time) matrix for export
tmp.vec.long <- as.vector(tmp.array)
length(tmp.vec.long)
tmp.mat <- matrix(tmp.vec.long, nrow = nlon * nlat, ncol = nt)
dim(tmp.mat)
head(na.omit(tmp.mat))
# create a dataframe
lonlat <- expand.grid(lon, lat)
tmp.df02 <- data.frame(cbind(lonlat, tmp.mat))
options(width = 110)
head(na.omit(tmp.df02, 20))
csvfile <- "cru_tmp_2.csv"
write.table(na.omit(tmp.df02), csvfile, row.names = FALSE, sep = ",")
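
One possible workaround for the column limit, sketched below with a placeholder chunk size and file names, is to write tmp.mat from the script above to several CSV files in chunks of columns:

chunk_size <- 1000                                  # columns (time steps) per CSV file
n_chunks   <- ceiling(ncol(tmp.mat) / chunk_size)
for (i in seq_len(n_chunks)) {
  cols  <- ((i - 1) * chunk_size + 1):min(i * chunk_size, ncol(tmp.mat))
  chunk <- data.frame(lonlat, tmp.mat[, cols, drop = FALSE])
  write.table(chunk, paste0("cru_tmp_part", i, ".csv"), row.names = FALSE, sep = ",")
}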

In the case of GCMs, I tried downloading one GCM output, but it took nearly 10 hours for a single variable on my laptop. Are there any laptop and internet connection requirements for downloading and using GCM data?

Please give me a solution.

With regards
Ashwathi

15
Models / Re: Statistical downscaling of GCMs
« on: April 27, 2017, 12:23:37 PM »
Hi,

If there is only one reanalysis grid point in the study area, is it necessary to do principal component analysis before doing regression?


With regards
Ashwathi
