~~~ file structure ~~~

There are 5 directories for SWH_proj:

data**  - input folder
lib     - contains all parameters
out**   - output folder
run     - the Fortran codes and bash scripts to run
tmp**   - tmp folder for interim datasets

** path can be changed in run/commondata.f

~~~ data ~~~

This is where the program looks for the 2x2 (180x91) 6HLY SLP datasets. Some codes used during testing are also kept here; they convert annual SOSEI 6HLY SLP netcdf datasets into one continuous 2x2 binary dataset. Filenames must follow a form similar to 'SOSEI_hist_r1_SLP_6HLY_Grid2x2_1951010100-2010123118.dat'. See data/run.sh.

~~~ lib ~~~

Contains the following:

coef         -- coefficients for the model trans(Hs) ~ f(Pt,trGt,eof_Pt,eof_trGt,lag1-5);
                also statistics for ERAint Hs, max(Gt), and the std of trHs used in the
                final projection
eof          -- EOFs for Pt and trGt (ERAint)
eraint       -- base period std of ERAint's PtPCs and trGtPCs
lambda       -- coefficients of the Box-Cox transform for ERAint Hs and Gt
locations    -- location information for the 1x1 Hs grid (for final output)
means        -- base period climate for ERAint SLP/Pt/trGt, and for CMIP5 hist-run SLP,
                since rcp-run SLP must be adjusted to the hist runs' climate
std          -- base period std for ERAint SLP/Pt/trGt, and for CMIP5 hist-run SLP
                (same reason as above)
surroundings -- location information for the 4 closest points (2x2 SLP field) around
                each 1x1 Hs grid point

The CMIP5 base period climate is stored because these codes are also used with CMIP5 model datasets. It is not used with the SOSEI datasets; for those, the base period climate is calculated instead.

~~~ out ~~~

This is where the output files are written. The outputs are 6HLY 1x1 Hs binary datasets, indices, and statistics. Some codes used during testing are also kept here; they convert SOSEI binary datasets into netcdf datasets. See out/run.sh.

~~~ run ~~~

This is where all the Fortran code and bash scripts are stored. On our computers we used pgf90 to compile the codes.
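Before running, the binary SLP input in data/ can be sanity-checked against its expected size. The sketch below assumes 4-byte values in plain direct-access records with no Fortran record markers; these assumptions are not stated in this README, so adjust them to match run/commondata.f.

```shell
# Expected size of a 2x2 (180x91) 6HLY SLP binary dataset, assuming
# 4-byte values and no record markers (check run/commondata.f).
nlon=180
nlat=91
bytes_per_value=4
nstep=87660   # e.g. 1951010100-2010123118, 6-hourly, gregorian calendar
expected=$((nlon * nlat * bytes_per_value * nstep))
echo "expected file size: ${expected} bytes"

# Compare against an actual file, e.g.:
# actual=$(wc -c < data/SOSEI_hist_r1_SLP_6HLY_Grid2x2_1951010100-2010123118.dat)
# [ "$actual" -eq "$expected" ] || echo "size mismatch"
```

A mismatch here usually means a different precision, a record-marker convention, or a wrong time-step count rather than a corrupt file.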
The bash script 'compile.sh' compiles all the necessary codes, and the bash script 'run.sh' runs them in the correct order. Many interim files are generated during the calculation but are cleaned up at the end.

To run multiple datasets in batch, list the files and their information in 'inputs.txt'. An example line in 'inputs.txt' is:

SOSEI_hist 1 87660 1951 1 1 0 1 SOSEI_hist_r1_SLP_6HLY_Grid2x2_1951010100-2010123118.dat

The 9 parameters are, in order:

model_exp
run
number_of_time_steps
first_year
first_month
first_day
first_hour
calendar_type (0=noleap, 1=gregorian, 2=360days)
input_filename

The code sequence is as follows:

SLP -> slp_get.f -> slp_mean.f -> slp_std.f
    -> base period climate SLP mean and standard deviation
       (used in slp_both_adjusted.f, hist r1 only)
SLP -> slp_both_adjusted.f -> gt_get.f -> gt_mask_SQ.f -> gt_boxcox_SQ.f -> gt_mean_SQ.f
    -> base period climate trGt mean (used in gt_pc_get.f, hist r1 only)
SLP -> slp_both_adjusted.f -> pt_get.f -> pt_mask.f -> pt_standardized.f -> pt_surroundings.f
    -> Pt (1x1)
SLP -> slp_both_adjusted.f -> gt_get.f -> gt_mask.f -> gt_boxcox.f -> gt_standardized.f -> gt_surroundings.f
    -> trGt (1x1)
SLP -> slp_both_adjusted.f -> pt_pc_get.f -> PtPCs (30)
SLP -> slp_both_adjusted.f -> gt_get.f -> gt_mask_SQ.f -> gt_pc_get.f -> trGtPCs (30)
Pt + trGt + PtPCs + trGtPCs -> proj_PM.f -> Hs (1x1)
Hs regional,seasonal (ASCII) -> Hs global,seasonal (binary) -> Hs global,annual (binary)
Hs (binary) -> Hs (netcdf)

~~~ tmp ~~~

This is where all the interim files are stored. These files are large and are cleaned up after each dataset. Only the mean and standard deviation datasets remain, as they are needed for future calculations and are small.
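The 9-parameter 'inputs.txt' format described under run can be read with a plain shell `read`, one line per dataset. This is a sketch of how a batch script might split the fields (the variable names follow the README; the example line is the README's own):

```shell
# Parse one line of run/inputs.txt into its 9 parameters.
line='SOSEI_hist 1 87660 1951 1 1 0 1 SOSEI_hist_r1_SLP_6HLY_Grid2x2_1951010100-2010123118.dat'

read -r model_exp run nstep first_year first_month first_day first_hour calendar_type input_filename <<EOF
$line
EOF

# 87660 steps is consistent with the example period: 1951-2010 gregorian
# = 60*365 + 15 leap days = 21915 days, times 4 six-hourly steps per day.
echo "$model_exp run=$run steps=$nstep start=$first_year-$first_month-$first_day ${first_hour}h cal=$calendar_type"
echo "file: $input_filename"
```

Looping `while read ...` over 'inputs.txt' processes every listed dataset in turn; a malformed line leaves trailing variables empty, which is easy to check before launching a run.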