Domain Information:
Here are the details:
Number of CPUs used in Bluevista: 32 (4 nodes)
Model run: 1960/1/1~1960/2/29 (winter), daily coupling
 
RSM
Domain size: 121*66 (114.2400E~245.0400E, 8.4890N~65.0010N)
dx=1.09 deg. (~114.88km)
dy=0.98 deg. (~108.8km) to 0.442 deg. (~49.565km)
dt=300s for this winter run
 
ROMS
Domain size: 481*261
dx=0.2725 deg. (28.721km)
dy=0.2455 deg. (27.283km) ~ 0.11505 deg. (12.28km)
dt=1600s throughout the integration.
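As a quick consistency check on the grid specs above (a minimal sketch; the western longitude and spacing are taken directly from the RSM numbers listed):

```python
# Check the RSM zonal grid listed above: 121 points at dx = 1.09 deg
# should span 114.24E to 245.04E, since the span is (nx - 1) * dx degrees.
nx, dx = 121, 1.09
lon_west = 114.24
lon_east = lon_west + (nx - 1) * dx
print(f"RSM eastern edge: {lon_east:.2f}E")  # 245.04E, matching the domain
```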


Bluevista GAU charge webpage:
http://www.cisl.ucar.edu/computers/bluevista/queue.charge.html

GAUs = wallclock hours * number of nodes * number of processors per node * computer factor * queue charging factor

bluevista: 8 processors per node.
computer factor=0.87 for bluevista and blueice
queue charging factor=1
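The charging formula can be written as a small helper (a sketch; the function name is mine, the constants are from the charge page above):

```python
# GAU charge formula from the CISL page above:
# GAUs = wallclock hours * nodes * processors per node
#        * computer factor * queue charging factor
def gaus(wallclock_hours, nodes, procs_per_node=8,
         computer_factor=0.87, queue_factor=1.0):
    return (wallclock_hours * nodes * procs_per_node
            * computer_factor * queue_factor)

# e.g. one wallclock hour on 4 bluevista nodes (32 CPUs):
print(f"{gaus(1.0, 4):.2f} GAUs per wallclock hour")  # 27.84
```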

When using 4 bluevista nodes (32 CPUs):

Total: 92 s (wallclock) per model day
RSM: 39s (42%)
ROMS: 45s (49%)
Coupler: 8s (9%)
(RSM->ROMS: 7s, ROMS-> RSM: 1s)
 
Wallclock hours: 9.32777 for 1 year (365 days)
 
GAUs charged for 1 year would be
GAUs = 9.32777 * 4 * 8 * 0.87 * 1 = 259.68511.
 
A 50-year run requires 12984.2555 GAUs
 
6-member ensemble: 77905.533 GAUs
 
2 more complementary uncoupled runs: 5453.38731 GAUs (for RSM) + 6362.285195 GAUs (for ROMS)
 
Total GAUs requested: 89721.20551
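The estimate above can be reproduced end to end (a sketch using the timings and charge factors listed; small differences from the figures above come from intermediate rounding):

```python
# Reproduce the request totals from the per-model-day timing above.
seconds_per_model_day = 92                            # RSM 39 + ROMS 45 + coupler 8
hours_per_year = seconds_per_model_day * 365 / 3600   # ~9.3278 wallclock h/yr
gaus_per_year = hours_per_year * 4 * 8 * 0.87 * 1     # ~259.685 GAUs
gaus_50yr = 50 * gaus_per_year                        # ~12984 GAUs per run
gaus_ensemble = 6 * gaus_50yr                         # ~77906 GAUs, 6 members
uncoupled = 5453.38731 + 6362.285195                  # RSM-only + ROMS-only runs
total = gaus_ensemble + uncoupled
print(f"total request: {total:.0f} GAUs")             # ~89721
```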
 
 
Here is Art Miller's GAU request to CHAP
 
Hi Ginger,  I realize this is very late, but perhaps we can
still circumvent another panel review by responding to this.
 
Dr. Hyodeo Seo has been doing a post-doc at SOEST
and has gotten the coupled model to run on the bluevista.
There are still some problems on getting it to run on blueice.
 
The estimates that we had originally made were actually quite close.
 
Below is the updated response to the key questions.
 
Please let me know if we really need to re-submit the
whole proposal.  Thanks, Art
 
 
============================================================
 
1. Number of GAUs required per run.
 
The coupled ocean-atmosphere model (SCOAR) in the North Pacific
domain requires 13,000 GAU for a 50-year run at unit priority.
(We hope to run at lower priority for the long runs.)
 
2. CPU time required per run.
 
12,500 days of total CPU time per 50-year run at
unit priority (if I understand the relation between GAU and CPU correctly).
 
3. number of nodes * number of processors per node * wall clock time.
 
On bluevista:
 
   4 nodes * 8 processors/node * (8 hours/year * 50 years) = 12,800
 
This is based on a two-year run, that eventually went unstable,
so the time step may need to be decreased or other modifications
may need to be made.  Our testing allocation is now used up
so we cannot pursue this further.
 
4. Include memory requirements.
 
   Roughly 5 Gbytes of core memory for this run.
 
5. number of runs required.
 
We expect to require at least 6 runs for our ensemble
to have respectable statistical significance.
Plus, we will want to run uncoupled ocean and uncoupled
atmospheric runs for comparison of response.
So add 2 more "split" coupled runs for a total of 8.
 
6. special requirements
 
   B. maximum number of GB stored: 500 Gbyte/run = 4 Tbyte.
 
7. evidence that your program code is well optimized.
 
Both ROMS and RSM, which comprise SCOAR,
are well-known to be well-optimized on MPI
machines.
 

These are the reviews from the committee on our initial GAU estimate
 
 
 
>              NATIONAL CENTER FOR ATMOSPHERIC RESEARCH
>           Computational & Information Systems Laboratory
>                      1850 Table Mesa Drive
>               P.O. Box 3000, Boulder, CO  80307-3000
>
>
> Dear Dr. Miller:
>
> Your request for the Computational & Information Systems Laboratory
> (CISL) computing support was reviewed by the Advisory Panel on April 5,
> 2007.  The panel has not recommended an allocation at this time, pending
> receipt of a revised proposal or supplemental information addressing the
> reviewers’ questions.  Please email your response to Ginger Caldwell,
> cal@ucar.edu .  The reviews are provided below.
>
> Please contact Ginger Caldwell at cal@ucar.edu or 303-497-1229 if you
> have any questions.
>
>                                   Best regards,
>
>                                   Al Kellie
>                                   Director CISL
>
> cc:
> C. Jacobs, NSF
> E. Itsweire, NSF
>
> =======================================================================
> Review #1
> Miller, Scripps Institution of Oceanography, UCSD, “Collaborative
> Research: Decadal Coupled Ocean-Atmosphere Interactions in the North
> Pacific”
> NSF Award: OCE-0647815
>
> While we are supposed to evaluate if the proposed request is an adequate
> use of the CISL resources, the methodology needs to be described in
> order to be able to perform such an evaluation. Specifically, there is
> no description on how the downscaling is going to be performed and how
> the NCEP reanalysis fields will be used with the proposed coupled
> simulations. Overall, the request is an adequate use of the resources,
> but the total estimated amount of GAUS is not well justified. First, it
> is not clear that the PI has a clear idea of how well the coupled model
> performs on the NCAR computers and how many GAUS are truly needed. The
> number of GAUs is estimated from some runs performed by the PI's
> graduate student, but are not for the exact proposed configuration. At a
> minimum, the PI should use their already allocated 3000 GAUs to perform
> benchmarks and truly estimate their needs. Second, the number of
> requested GAUS is computed using days instead of hours, making this
> request off by a factor of 24. The above should be addressed in a
> revised proposal before any recommendation can be made.
>
> Review #2
>
> This project represents an ambitious effort to compute the coupled
> response of the North Pacific ocean using a mesoscale ocean model (ROMS)
> and regional spectral atmospheric model.  The work represents one of the
> first attempts to simulate a coupled ocean response with resolved
> mesoscale eddies and potential feedbacks between the ocean SST structure
> and atmospheric wind stress forcing.  The PIs give a couple of examples
> where changes in the ocean SST affect the atmospheric boundary layer,
> which alters the flux and ocean circulation.
>
> The PI and his former student state that they have extensive experience
> working in a parallel environment; mostly on a cluster at their home
> institution.  However, they gave very little detail on how their coupled
> model is designed and how load balance would be achieved on a
> large-scale system.  There is some suggestion that the two models run in
> a shared way, swapping processor usage, but more details are needed,
> especially with regard to the load balance.
>
> A more glaring problem with the proposal is the calculation of GAUs.
>  From my reading,  Miller did not include the number of hours in each
> day for his calculation.  He gives a GAU value for one model run based
> on 56 days x 32 nodes x 8 processors = 14,000.  This would actually be
> 24 x 0.87 times larger using 24 hrs/day and the bluevista GAU factor.
> Consequently, the computational requirements of this study appear to be
> much larger than the request, i.e. about 20 times greater or ~800,000 GAU.
>
> I would recommend that the committee suggest that the PI perform a more
> thorough analysis of computational needs and do a test simulation on
> bluevista or blueice.  This would provide a much better baseline than
> the estimate based on the Linux cluster, and hopefully avoid the error
> noted above.  A more detailed description of the modeling system is also
> needed to make sure that the PI has an efficient coupled modeling
> strategy.