[Nsa] NCAR Scientists' Assembly Summer 2023 Newsletter

Ben Johnson johnsonb at ucar.edu
Tue Jun 27 07:01:05 MDT 2023




Summer 2023 Newsletter

Celebrating the arrival of Derecho, an HPE Cray EX supercomputer and the
third-generation machine at the NCAR-Wyoming Supercomputing Center, this
issue focuses primarily on NCAR's long-standing relationship with Cray
computers. It includes a brief history of NCAR's acquisition of the Cray-1
in the 1970s, remembrances of what scientific computing was like in that
era, and summaries of new research that will be conducted on Derecho. This
issue took longer than anticipated to produce, in part because of the care
necessary to record and report that history with the diligence it deserves.
As always, this newsletter serves to share stories among NCAR's labs and
generations of scientists. We welcome articles that contribute to this goal.
*NCAR'S ACQUISITION OF THE CRAY-1*


* The signing of the contract between UCAR and Cray Research, Inc. to
acquire the CRAY-1 on May 12, 1976. Top row: Lionel Dicharry, Paul Rotar,
G. Stuart Patterson, Clifford Murino, and John Firor, all of NCAR. Bottom
row: Noel Stone, assistant secretary of Cray Research, Inc., Seymour Cray,
Francis Bretherton, and Harriet Crowe of UCAR.*

Scientific computing has experienced a period of relative stability in the
three decades since the adoption of the Message-Passing Interface standard,
which saw computations distributed among increasingly powerful central
processing units. But with the arrival of Derecho, an HPE Cray
supercomputer, NCAR transitions into a new era of computing in which
general-purpose graphics processing units make up a significant portion of
the machine's computing power.

The 1970s were a similar period of computing transition, heralding not only
the arrival of NCAR's first Cray supercomputer, the Cray-1, but also a
shift in I/O and data management.

G. Stuart Patterson, then the director of NCAR's Computing Facility,
describes the era, "Where we were going with the Computing Facility was a
total change of how it worked, the equipment it had and what you could do
with it. Number one at the heart of it was to get people off of punch cards
and to get them onto terminals. The second was to have a really massive
data management facility. We had 25,000 half-inch tapes sitting beneath the
front steps of NCAR where it would leak in a rainstorm."

The Computing Facility also needed to replace its workhorse supercomputer,
the Control Data Corporation (CDC) 7600.

Paul Rotar, a longtime NCAR employee who eventually became head of the
Systems Section, knew Seymour Cray, the brilliant engineer who co-founded
CDC. At the time, CDC was one of the major computer companies in the United
States. Although tech geniuses weren't as widely known in the 1970s as
they are today, Cray was a revered figure. He graced magazine covers, and a
Washington Post article reflecting on his life described him as "the Thomas
Edison of the supercomputing industry."

Rotar said he would check in with CDC to get some insight into what their
next generation computer would look like. He came back from a phone call,
and according to Patterson's retelling, said, "Hey! Guess what? Seymour's
left CDC and he's working on a new computer up in Chippewa Falls."

Rotar and Patterson scheduled a trip to visit the newly organized Cray
Research, Inc. in Chippewa Falls to get a sense of how near the new machine
was to a finished product. Patterson remembers, "It had a lot of aspects of
reality about the whole project. It wasn't close to being finished. The
software was way out there but it looked like something that was going to
be very doable and would have a whole lot of power to it. Particularly the
vector architecture that he was talking about…that was just off the wall!"

The trip convinced Patterson and Rotar that Cray's machine would be a
viable candidate for NCAR's next supercomputer. Upon returning to Boulder,
Patterson and a team consisting of Ed Wolff, then NCAR director John Firor,
and others, worked for a year to properly structure the request for
proposal (RFP) for the new machine. "We made sure that we were in total
control of putting out an RFP. So when that went out we had full control
over what we were benchmarking, how we were benchmarking it and we were
divorced from the government regulations on how you did an RFP. That was a
big legal step. We had to separate UCAR, make sure UCAR was an independent
legal operational entity from NSF and the federal government."

Structuring the financing for the acquisition also required creativity.
Patterson explained, "The price to us for the Cray was $10 million. The way
the government works is that you can't buy anything. So you'd have to lease
it from them." Based on the funding from NSF, it would have taken four
years to accumulate enough funds to acquire $10 million but the computer
companies expected the funds on a shorter timescale – two years.

Fortunately, Ray Chamberlain, then the president of Colorado State
University (CSU) was on the UCAR Board of Trustees. In Patterson's
retelling, Chamberlain spoke up at a UCAR board meeting, "You know that's
crazy. You shouldn't have to pay the lease rate. I tell you what, CSU can
float an industrial development bond for $10 million and then we can lease
it back to you at a reasonable interest rate and you can pay it off at a
reasonable rate."

With the financing in place and Cray Research, Inc. selected as the winning
proposal, Seymour Cray visited NCAR to sign the contract to deliver the
newly designed Cray-1. Patterson arranged for Cray to deliver a rare public
lecture to NCAR staff after the signing.

"We set up a conference room auditorium and we probably had about fifty
people from the Computing Facility come in, some other people." Patterson
continued, "Seymour gave his talk and afterwards I said, 'Seymour would be
happy to answer your questions' and there was total silence from the
audience. Not a word."

Patterson invented a couple of questions to break the tension, the lecture
adjourned, and the auditorium emptied. He approached some of the staff
afterwards wondering, "Why didn't you ask any questions?" The staff looked
at each other, trying to adequately convey how intimidating it was to be in
the presence of such a genius. Finally, one staff member responded, "How do
you talk to God?"

After the initial fanfare, the staff got to work making the new computing
system available for the scientists. There were some hiccups. Patterson
remembers, "We did have big problems with the memory system, the
large-scale, the terabit memory system. So I spent a lot of time on that.
And we had a lot of problems with the networking. Networking turned out to
be extremely difficult to the point where we had to shut down a project
that we had going on for well over a year and start it again."

The Computing Facility had the benefit of having an incredibly capable
manager: Margaret Drake. "Margaret at that time was probably managing all
of the programming support that the Computing Facility was supporting to
scientists. She had about forty programmers, I think, working for her, of
which about a third were women," said Patterson. "In my experience I've
never seen a better manager in my life."

For the computer itself, he remembers, "The Cray was not a big deal to get
running and running well. I don't remember any significant issues that I
had to deal with ... it probably took at least a year to break in things or
something like that."

With the numerical models up and running, producing copious amounts of
output, Patterson envisioned NCAR becoming a leader in the field of
computer graphics but anticipated some difficulties, "I felt it was going
to be a tough sell to the scientists at NCAR who were just really very much
focused on computation and not focused on the presentation or analysis of
the data. That's where I wanted to go and I felt that was going to be a
hard political battle."

Patterson remained unsure of how to proceed, until one day Seymour Cray
came to him to ask, "How would you like your own R&D lab?"

Seymour Cray developed a pattern over the course of his career of starting
up a company, developing a successful computer, getting frustrated with the
management and then leaving to start a new company. So, just like he left
CDC to start Cray Research, Inc., he planned to start a new company to work
on the next iteration of supercomputer design. Patterson joined him,
resigning from NCAR to become president of Cray Labs, and set out to
construct a building to house the new company. After considering a few
locations, they settled on a site north of downtown Boulder and began
construction of a C-shaped building at 3375 Mitchell Lane.

While the company didn't last, the building remains. It was purchased by
UCAR in 2009 to expand the Foothills Lab campus. Only now it is known by a
different name: the Anthes Building.


*G. Stuart Patterson*



*This article was authored by Ben Johnson (CISL) after an informal
conversation with G. Stuart Patterson. Patterson earned his SB in chemical
engineering (1957) and SM in nuclear engineering (1959) from MIT. He
completed his PhD in mechanics (1966) at Johns Hopkins University under the
supervision of Owen Phillips and Stanley Corrsin. Patterson spent a
sabbatical year at NCAR from 1970-1971 while he was a professor of
engineering at Swarthmore College. He returned to NCAR as the director of
NCAR's Computing Facility from 1973-1979 before leaving to become president
of Cray Laboratories. He became a serial entrepreneur, founding several
companies over his career, culminating with a position as Chief Technical
Officer for OR Manager, Inc., a company co-founded by his wife, Pat
Patterson.*
*A Brief Recollection of My Early Days of Computing at NCAR*
*Contributed by Annick Pouquet, Part-Time Research Scientist, LASP; &
Emeritus, NCAR*

As an Advanced Study Program (ASP) post-doctoral scientist at NCAR starting
in the Fall of 1973, I had a discussion with G.S. Patterson (SCD, NCAR)
together with U. Frisch (Observatoire de Nice), and I quickly realized the
potential of performing accurate pseudo-spectral direct numerical
simulations (DNS) of turbulent flows [1 <https://doi.org/10.1063/1.1693365>,
2 <https://doi.org/10.1002/sapm1972513253>], in my case having in mind the
problem of the generation of magnetic fields (or dynamo effect) in the Sun,
the stars and the Universe at large. I started this project in my second
year at NCAR and pursued it for quite a while, using periodic boundaries [4
<https://doi.org/10.1017/S0022112078000658>, 7
<https://doi.org/10.1103/PhysRevLett.47.1060>, 9
<https://ui.adsabs.harvard.edu/abs/1983JMTAS......191F>, 11
<https://doi.org/10.1103/PhysRevA.33.4266>], while Peter Gilman and his
colleagues at the time were tackling at HAO the solar dynamo problem in a
rotating spherical shell, a very complex task [6
<https://ui.adsabs.harvard.edu/abs/1981ApJS...46..211G>]. A bit earlier,
Ulrich Schumann and Jack Herring were comparing properties of closures of
turbulence and DNS [3 <https://doi.org/10.1017/S0022112076000888>]; Eric
Siggia and Stu Patterson were already measuring intermittency properties of
turbulent flows [5 <https://doi.org/10.1017/S0022112078001287>] and of
vortex tubes [8 <https://doi.org/10.1017/S002211208100181X>], while Greg
Holloway was studying, again with models and DNS, the stirring of tracer
fields in the ocean [10 <https://doi.org/10.1017/S0022112084000720>]. NCAR
graphical tools were also being developed at the time and proved very
useful to many. One should note that several of these early papers were
co-authored by ASP fellows, ASP playing a central role in the dissemination
of NCAR savoir-faire for the community at large.

Only much later, after I came back to NCAR in the 2000s, did I move with my
team to the task of unraveling some of the properties of rotating and/or
stratified turbulence as it occurs in the atmosphere and the ocean,
stressing in particular the central role the waves play in dynamically
shaping the structures, governing the scaling laws, as well as the
transport and dissipation properties of such complex media [12
<https://doi.org/10.1017/jfm.2012.99>, 13
<https://link.aps.org/doi/10.1103/PhysRevLett.111.234501>, 14
<https://link.aps.org/doi/10.1103/PhysRevLett.114.114504>, 15
<https://doi.org/10.1063/1.4921076>, 16 <https://doi.org/10.1063/1.5114633>,
17 <https://link.aps.org/doi/10.1103/PhysRevFluids.7.033801>], as Alfven
waves are shaping in some way the dynamics of conducting flows.

I did not realize that, in the 70s, not only were the numerical method and
impressive computational power entirely new to me, but the pseudo-spectral
methodology was also rather new for the open scientific community at large.
Indeed, the advance in speed procured by the Fast Fourier Transform
algorithm was phenomenal and, following the leadership of G.S. Patterson
and S.A. Orszag [1 <https://doi.org/10.1063/1.1693365>, 2
<https://doi.org/10.1002/sapm1972513253>] and under the strong guidance of
Stu Patterson, we were able to produce perhaps the first such DNS in the
magnetohydrodynamic (MHD) "turbulence" framework, at a grand numerical
resolution of 32³ grid points on the CDC 7600 [4
<https://doi.org/10.1017/S0022112078000658>]. Rather impressive at the
time, this computation is feasible today on your smartphone, and of course
the turbulence was not quite there yet, due to the lack of a sufficiently
large ratio of excited scales.
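
For readers who have not used the method, the short sketch below
illustrates the core pseudo-spectral idea in today's notation: derivatives
are taken in Fourier space via the Fast Fourier Transform, while nonlinear
products are formed point-wise in physical space. It is a minimal modern
Python/NumPy illustration on a 32³ periodic grid, not the original CDC 7600
Fortran; the test field, box size, and variable names are purely
illustrative.

    # Minimal pseudo-spectral illustration: derivatives in Fourier space,
    # nonlinear products in physical space, with the FFT providing the
    # O(N log N) transform between the two.
    import numpy as np

    N = 32                                     # 32^3 grid, as in the early MHD runs
    L = 2.0 * np.pi                            # periodic box of length 2*pi
    x = np.linspace(0.0, L, N, endpoint=False)
    X, Y, Z = np.meshgrid(x, x, x, indexing="ij")

    k = np.fft.fftfreq(N, d=L / N) * 2.0 * np.pi   # integer angular wavenumbers
    KX = k[:, None, None]                          # broadcastable kx array

    # A smooth periodic test field u(x, y, z)
    u = np.sin(X) * np.cos(2.0 * Y) * np.cos(Z)

    # Pseudo-spectral x-derivative: FFT -> multiply by i*kx -> inverse FFT
    u_hat = np.fft.fftn(u)
    dudx = np.real(np.fft.ifftn(1j * KX * u_hat))

    # The nonlinear term u * du/dx is formed point-wise in physical space,
    # which is what distinguishes the *pseudo*-spectral method from a pure
    # (convolution-based) spectral method.
    nonlinear = u * dudx

    # Check against the analytic derivative cos(x)*cos(2y)*cos(z)
    print(np.max(np.abs(dudx - np.cos(X) * np.cos(2.0 * Y) * np.cos(Z))))  # ~1e-13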

Soon after, the Cray-1 arrived at NCAR and I pursued this line of work and
made use of the Cray with other French colleagues to tackle a few other
problems of MHD turbulence, such as showing the growth of magnetic fields
even in the absence of helicity [7
<https://doi.org/10.1103/PhysRevLett.47.1060>], or the dynamics of current
sheets in two-dimensional ideal (non-dissipative) MHD [9
<https://ui.adsabs.harvard.edu/abs/1983JMTAS......191F>], or the problem of
growth of the correlations between the velocity and magnetic field as a
signature of the role of Alfven waves in the dynamics of conducting
turbulent flows [11 <https://doi.org/10.1103/PhysRevA.33.4266>]. Progress
was still slow at the time: everything had to be learned, and the scientific
community had to be convinced of the realizability and reliability of
such an approach. Difficult to believe today perhaps, but this research was
then regarded not so much as numerical experiments as oddities. I do recall
long evenings going into the night where the room was often filled with the
French team on the one hand, and the Spanish one (e.g., J.M. Massaguer),
working on convective flows with Juri Toomre and Niels Hurlburt, on the
other. We did achieve, and held for a while, a few world records for NCAR,
including on the NCAR computers, up until the early 2010s
[12 <https://doi.org/10.1017/jfm.2012.99>, 15
<https://doi.org/10.1063/1.4921076>]. Indeed, we have pursued this type of
work to this day in the US, in France and elsewhere as a succession of
Crays were being made available, then followed by many other computers with
numerous architectures and various technical improvements. THAT is another
story ...

*Supporting Cutting-Edge Science in NCAR's Computing Facility in the 1970s*
*Contributed by Dick Valent*

Before the CRAY-1 Serial 3 arrived at NCAR in 1977, I'd worked in the
Computing Facility long enough to know that many of our resident and
university scientists wanted to solve larger problems than our existing
6600 and 7600 CDC computers could accommodate. I often heard this lament
when I was helping the scientists and programmers use the computers. The
7600 offered 65K 60-bit words of small core memory [18
<https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.917.3051&rep=rep1&type=pdf>,
pages 2-4], and the 6600 the same [19
<https://www2.cisl.ucar.edu/ncar-supercomputing-history/cdc6600>].
In fact, one could manage out-of-core computations thanks to the 7600's
Large Core Memory hardware and its accompanying LCM subroutines. These
routines allowed you to transfer blocks of data between the 7600's small
core memory and its large core memory, and also to overlap transfers with
computation for better runtime performance.  Using this strategy, you could
boost the 7600's usable memory to 250K 60-bit words [18
<https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.917.3051&rep=rep1&type=pdf>,
pages 2-4].
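
The pattern those LCM routines enabled can be sketched loosely in modern
terms: keep one block of data in fast memory for computation while the next
block is being transferred in the background. The snippet below is a Python
analogue of that double-buffering idea, not the 7600's actual LCM
subroutines (whose names and calling sequences are not reproduced here);
the file name, block size, and the sum-of-squares reduction are
illustrative only.

    # A loose modern analogue of double-buffered out-of-core processing:
    # while one block is processed in fast memory, the next block is read
    # in the background, overlapping the slow transfer with computation.
    import numpy as np
    from concurrent.futures import ThreadPoolExecutor

    def process(block):
        # Stand-in for the real per-block computation.
        return float(np.sum(block * block))

    def out_of_core_sum_of_squares(path, n_blocks, block_len, dtype=np.float64):
        data = np.memmap(path, dtype=dtype, mode="r")   # the "large", slow store
        total = 0.0
        with ThreadPoolExecutor(max_workers=1) as pool:
            # Start reading block 0, then always keep one transfer in flight.
            future = pool.submit(np.array, data[:block_len])
            for i in range(n_blocks):
                block = future.result()                 # wait for current block
                if i + 1 < n_blocks:                    # prefetch the next block
                    start = (i + 1) * block_len
                    future = pool.submit(np.array, data[start:start + block_len])
                total += process(block)                 # compute during the transfer
        return total

    # Tiny demonstration: write 8 blocks of 1,000 values to disk, then reduce them.
    blocks, length = 8, 1_000
    np.arange(blocks * length, dtype=np.float64).tofile("scratch.bin")
    print(out_of_core_sum_of_squares("scratch.bin", blocks, length))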

As you may guess, using this larger memory was costly. The reads and
writes were slow compared to accesses to data that fit in the 7600's small
core memory, and setting up the calls to the routines was exacting and
time-consuming. And you would need to facilitate the program's
save-restarts in LCM, as well. As anyone who has worked with this sort of
out-of-core computation knows, it makes one wish, I mean really wish, for a
computer with larger memory.  Happily, this desire was realized in the
CRAY-1 with its million 64-bit words of "core".  Also very important, the
CRAY-1 offered greater computational speed via vectorization. Many NCAR
scientists made the additional effort of learning how to write vectorized
code for their applications.
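
As a rough modern analogue of what vectorization buys, the snippet below
compares an element-by-element loop with the same arithmetic expressed as
whole-array operations; NumPy here merely stands in for the CRAY-1's vector
hardware and Fortran compilers, and the array size and timings are
illustrative.

    # Scalar loop versus "vectorized" whole-array arithmetic.
    import time
    import numpy as np

    n = 1_000_000
    a = np.random.rand(n)
    b = np.random.rand(n)

    # Scalar-style loop: one element at a time.
    t0 = time.perf_counter()
    c_loop = np.empty(n)
    for i in range(n):
        c_loop[i] = 2.0 * a[i] + b[i]
    t_loop = time.perf_counter() - t0

    # Vectorized form: the same arithmetic expressed on whole arrays.
    t0 = time.perf_counter()
    c_vec = 2.0 * a + b
    t_vec = time.perf_counter() - t0

    assert np.allclose(c_loop, c_vec)
    print(f"loop: {t_loop:.3f} s, vectorized: {t_vec:.4f} s")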

You can imagine the community's elation when scientists and programmers
were at last able to run larger and more efficient applications. Our
overall excitement and sense of unity were enhanced by seeing Seymour Cray
walking the halls here at the NCAR Mesa Lab, checking on the new computer.
Also, for several months after the CRAY-1's arrival, the Computing Facility
hosted a "War Room" at the Mesa Lab staffed by our engineers and
programmers, where CRAY-1 users could bring their codes and discuss their
current problems.

Readers interested in the era of the CRAY-1 Serial 3 at NCAR will find
substantially more information in NCAR's 1978 Scientific Report [20
<https://opensky.ucar.edu/islandora/object/archives%3A7306/datastream/OBJ/view>,
pages 180-191]. I'd like to give a nod here to the UCAR Opensky archives:
there's a huge amount of historical information on these pages, both for
NCAR computing and the science done here. And please consider archiving
some of your own materials for the projects you are working on.

In closing, I thank the Scientists' Assembly for inviting me to contribute
these memories. So many of the old-timers who helped bring the CRAY-1 to
NCAR are gone now.  Also gone are many of the early users, the scientists
and programmers who used the Serial 3 within its first few years at NCAR.
If they were here to help, this note would be more informative. If you have
questions or corrections about it, you may contact me, Dick Valent at
valent at ucar.edu.
*NCAR-LED ACCELERATED SCIENTIFIC DISCOVERY PROJECTS*

The Accelerated Scientific Discovery (ASD) program gives users early access
to NCAR high-performance computing systems so that they can complete
computationally ambitious projects during the first few months after
acceptance testing has been completed. Six university-led projects and ten
NCAR-led projects (nine primary and one alternate) were selected for ASD
allocations.
*Data-inspired MURaM Simulations of Flares Resulting from Sunspot
Collisions *
*Matthias Rempel, Yuhong Fan, Anna Malanushenko (HAO), Georgios
Chintzoglou, Mark Cheung (Lockheed/LMSAL), MURaM GPU team (CISL, University
of Delaware, & Max Planck Institute for Solar System Research)*
Major solar eruptions often originate from complex active regions,
specifically in active regions that are composed of several bipolar spot
groups that interact with each other. AR 11158 (Feb 2011) is a well-studied
example in which two opposite polarities collide and form a
flare-productive collisional polarity inversion line (cPIL). We propose a
data-inspired simulation of AR 11158 with the MURaM radiative MHD code in
which the observed spot motions will be imposed at the lower
(sub-photospheric) boundary. Synthetic observables covering visible to EUV
observations will be computed and compared to the available observations
from NASA/SDO. The investigation will focus on connecting changes in the
magnetic topology prior to flares to available observables, specifically
constraining the build-up and release of magnetic energy. Unlike earlier
MURaM simulations, this simulation aims for the first time at reproducing
processes in a specific observed active region through data-constrained
boundary driving.

*Urban Air Quality Across the Globe with MUSICA*
*Louisa Emmons (ACOM), Simone Tilmes, Gabriele Pfister, Rebecca Buchholz,
Duesong Jo, Wenfu Tang, & David Edwards (ACOM); Behrooz Roozitalab
(University of Iowa)*
Air quality is primarily driven by local anthropogenic emissions sources,
but it can also be strongly influenced by long-range transport of
pollutants and regional influences (natural emissions, chemistry,
meteorology, climate). In turn, local air quality can have impacts that
extend all the way to the global scale. Global models including
CESM2(CAM-chem) with comprehensive chemistry in the troposphere and
stratosphere usually perform well in reproducing distributions of important
air pollutants, including those of ozone and particulate matter (PM2.5).
However, over highly polluted urban regions the model's coarse horizontal
resolution is often unable to capture local peak emissions of specific
precursors for ozone. So far, however, global models have not been able to
increase the horizontal resolution sufficiently because of the large
computing resources needed to transport 200-300 chemical tracers. MUSICAv0,
a configuration of CESM2.2(CAM-chem) with variable resolution, now has the
unique capability to simultaneously simulate urban-scale air quality at
high horizontal resolution and the regional-to-hemispheric-to-global
influences and impacts of pollutants, while still using a comprehensive,
computationally expensive chemistry and aerosol scheme.
This project will perform simulations of MUSICAv0 with a custom variable
resolution mesh with 3 refined regions of special interest targeting the
United States, Europe and southern and eastern Asia. Our plan is to use a
base resolution of ne60 (0.5 degree), zooming into ne240 (~1/8 degree)
resolution over the 3 regions.  Using this unique setup will allow us to
better quantify the impact of urban pollution simultaneously on local,
regional and hemispheric scales.

*Deep Learning-based Large Ensemble for Subseasonal Prediction of Global
Precipitation*
*Lead: Maria J. Molina (CGD), Co-Lead: Katie Dagon (CGD); Collaborators:
Jadwiga Richter (CGD), David John Gagne (CISL), Gerald Meehl (CGD), Kirsten
Mayer (CGD), Judith Berner (MMM/CGD), John Schreck (CISL), William Chapman
(ASP), Aixue Hu (CGD), Anne Glanville (CGD), and Abby Jaye (MMM)*
Every year, extreme precipitation and drought disrupt life, destroy
infrastructure, and result in fatalities across the United States and the
world. Skillful precipitation forecasts with a lead time of several weeks
(i.e., subseasonal) can help stakeholders of societally-relevant public
sectors (e.g., water management, agriculture, and health) understand
imminent threats and take protective actions to mitigate harm. Our proposal
aims to improve subseasonal prediction of precipitation using a
data-driven, deep learning approach. With capabilities provided by Derecho,
and as part of the Accelerated Scientific Discovery opportunity, we will
use a data-driven, deep learning approach to create a 100-member ensemble
of subseasonal forecasts of global precipitation. We will leverage deep
learning approaches with observational and reanalysis products to improve
already existing subseasonal reforecasts created using the Community Earth
System Model version 2 (CESM2). Motivating our large-ensemble approach is
the fact that an ensemble mean of subseasonal precipitation predictions can
yield more skill than individual forecasts, but the large computational
cost of running global numerical-model subseasonal hindcasts precludes
creating such an ensemble dynamically. Moreover, recent studies have shown
that deep learning models can
produce subseasonal-to-multiyear forecasts with skill that exceeds current
dynamical forecasting systems, making this an ideal time to take advantage
of the unique opportunity that the Accelerated Scientific Discovery program
presents.

*High-resolution Simulations of Wildland Fires and Long-Range Smoke
Transport During the 2020 August Complex Fires in California*
*Timothy Juliano, Masih Eghdami, Rajesh Kumar, & Branko Kosović (RAL);
Gabriele Pfister & Rebecca Buchholz (ACOM); Hamed Ebrahimian & Kasra
Shamsaei (University of Nevada)*
Wildfires are among the most destructive natural disasters on Earth, often
with devastating effects. There is no doubt that wildland fire
activity in the United States (U.S.) and across the world has increased
significantly over recent decades, and is projected to increase in
forthcoming years. Currently, for air quality and solar energy forecasts,
the fire emissions are estimated based on satellite observations that lag
behind the actual emissions and therefore may not accurately represent the
wildfire evolution. In light of this gap in scientific knowledge, we will
use a multiphysics modeling approach and focus on improving air quality
forecasts during the 2020 U.S. wildfire season. A very intense period of
wildfire activity plagued a large portion of the U.S. in August and
September 2020.
We propose to conduct coupled simulations using a multiscale (i.e.,
spanning the mesoscale and microscale) approach. The WRF-Fire
wildfire-atmosphere modeling system will simulate fire behavior at fine
resolution. While the proposal team has extensive experience using the
WRF-Fire model, we have been limited computationally to conducting
relatively small-domain (order tens of kilometers) simulations at
large-eddy simulation (LES) resolution. However, during large
conflagrations, such as
the 2020 August Complex, much larger LES domains are required to accurately
capture the wildfire spread and smoke production and transport. We will
then use the biomass burning results from WRF-Fire to inform WRF-Chem and
the MUlti-Scale Infrastructure for Chemistry and Aerosols (MUSICA)
configuration of the Community Earth System Model (CESM), both of which
contain complex chemical processes. Such hierarchical multiscale and
multiphysics simulations will advance predictive science in support of air
quality management and enhance warning systems protecting human health.

*Nonlinear Multiscale Coupled Data Assimilation: Designing the Future of
Air Quality Forecasting*
*B. Gaubert, W. Tang, F. Lacey, L. K. Emmons, S. Tilmes, M. Dawson, M.
Barth, & G. Pfister (ACOM); K. Raeder, M. Gharamti, & J. L. Anderson
(CISL); A. Arellano (University of Arizona)*
*This project is designated as an alternate and will be elevated in the
event that other projects fail to make progress.*
Applying concurrent data assimilation of chemical and physical observations
in coupled chemistry meteorology models is often overlooked because of
computational limitations. Unstructured grids with regional refinements
have never been explored in chemical Data Assimilation (DA). This project
aims to apply ensemble DA to a global online coupled chemistry-meteorology
variable-resolution model. We will assess how improving the dynamics and
physics via higher resolution impacts the chemical surface fluxes and state
of the atmosphere. We will explore ensemble representations of physical and
chemical uncertainties to disentangle errors stemming from emissions,
transport and chemistry. The system is built on the coupling between the
Multi-Scale Infrastructure for Chemistry and Aerosols (MUSICA) and the Data
Assimilation Research Testbed (DART).
It uses the spectral element (SE) dynamical core of the Community
Atmosphere Model with full chemistry (CAM-chem), with horizontal mesh refinement
defining a Regionally Refined (RR) domain, denoted as CAM-chem-SE-RR. The
global grid has a resolution of ne30 (∼111 km) and the refinements reach
ne240 (30x8, or ∼14 km) over the conterminous United States. The first
objective is to assess the performance of the meteorological data
assimilation and compare it to current specified-dynamics approaches.
The second objective is to evaluate the impact of spatial resolution on
initial-state optimization and flux inversion of carbon monoxide (CO).
The set of chemical data assimilation experiments will focus on the
assimilation of CO from the TROPOMI instrument.

*Extreme Weather Events Under a Wide Range of Climates in High-Resolution
Coupled CESM*
*Bette Otto-Bliesner (CGD), Jiang Zhu (CGD), Esther Brady (CGD), Jesse
Nusbaumer (CGD), Chijun Sun (ASP), Jessica Tierney (University of Arizona),
Ran Feng (University of Connecticut), Clay Tabor (University of
Connecticut), Andrew Walters (University of Arizona)*
We propose an unprecedented, landmark set of fully coupled high-resolution
(HR) climate simulations for past greenhouse and icehouse climates to study
the dynamics that govern the characteristics of extreme weather events in
both atmosphere and ocean under altered climate states. We target
well-studied paleoclimate intervals with higher and lower atmospheric
CO2, including the preindustrial, the Last Glacial Maximum, the Pliocene,
and the Early Eocene. We employ scientifically validated and extensively
tested CESM code and configuration, the iHESP (International Laboratory for
High-Resolution Earth System Prediction) HR CESM1.3 (~0.25° atmosphere/land
and ~0.1° ocean/sea ice) with water isotopes. The unique water isotope
capability enables unprecedented integration of information from model and
paleoclimate observational data. The paleo-HR simulations will complement
the preindustrial, historical and RCP8.5 future simulations available from
the iHESP project, resulting in HR simulations to investigate the dynamics
that connect past and future climate changes. The proposed work will
greatly expand our fundamental understanding of how elevated CO2 levels
affect the pattern and intensity of extreme weather events, thus
contributing to future projections of climate change and the physical
science basis for actionable policies.

*Benchmark Simulations Using a Lagrangian Microphysics Scheme to Study
Cloud-Turbulence Interactions: From Direct Numerical Simulations of a
Laboratory Cloud Chamber to High-Resolution Large-Eddy Simulations of
Clouds*
*Hugh Morrison (MMM), Kamal Kant Chandrakar (MMM), Wojciech W. Grabowski
(MMM), George H. Bryan (MMM), Lulin Xue (RAL), Sisi Chen (RAL), Raymond A.
Shaw (Michigan Technological University), and Greg McFarquhar (University
of Oklahoma)*
Clouds involve an enormous range of scales from the ~1 mm dissipation
microscale to synoptic scales. Accurate representation of clouds in
atmospheric models across these scales poses a significant challenge and is
a critical source of uncertainty in weather and climate models. Our
high-resolution simulations on Derecho will use a novel approach to
simulating cloud/rain droplets and lead to better understanding of
multi-scale processes in clouds. They can serve as benchmarks for
developing and testing "traditional" cloud parameterizations in weather and
climate models. These datasets can also be used to train artificial
intelligence and machine learning algorithms for parameterization
development.
The proposed simulations will utilize a Lagrangian particle-based
microphysics scheme called the "super-droplet method" (SDM). SDM provides a
major advancement for representing cloud microphysics in models compared to
traditional bin and bulk microphysics schemes. For example, it is free from
numerical diffusion, unlike bin schemes. SDM is available in NCAR's CM1
model and runs efficiently on other supercomputing systems. The CM1-SDM
framework was successfully applied to study the effects of turbulence and
entrainment on cloud droplet size distributions. We will use three related
model configurations in a hierarchical approach from direct numerical
simulation (DNS) of small-scale turbulence to cloud-scale and mesoscale
dynamics using large eddy simulation (LES).

*Global Convection-Permitting Simulations with GPU-MPAS*
*Falko Judt (MMM), Andreas Prein (MMM), Bill Skamarock (MMM), Supreeth
Suresh (CISL), Roy Rasmussen (RAL), Tim Schneider (RAL)  *
We will produce a series of global convection-permitting simulations using
GPU-MPAS. Our main goal is to assess the "added value" of
convection-permitting resolution in (1) simulating structure and life cycle
of mesoscale convective systems across different climate zones, (2)
capturing the diurnal cycle and the duration, frequency, & intermittency of
precipitation, (3) predicting extreme weather from local to global scales,
and (4) representing orographic precipitation. Our secondary goal is to
better understand the dynamics of tropical convection, and the
predictability of the atmosphere in different climate zones.
We will run 4 pairs of 40-day simulations on a globally
quasi-uniform 3.75-km mesh, where one pair consists of a control run and a
stochastically perturbed run. The number of simulation days on the 3.75-km
mesh will be 4*2*40 = 320 (i.e., almost one year of global 3.75-km
resolution data). In addition, there will be 15 km, 30 km, and 120 km mesh
counterparts (again 4 pairs à 40 days) for added value assessments. These
simulations will be identical to the 3.75-km runs except with reduced
horizontal resolution.
The four 40-day simulation periods will cover the following events: April
2011, simulating the Super Outbreak, the largest, costliest, and one of the
deadliest tornado outbreaks ever recorded; August/September 2017,
simulating Hurricanes Harvey, Irma, and Maria; December 2018–January 2019,
simulating a Madden-Julian oscillation event that initiated over the Indian
Ocean and propagated across the Maritime Continent into the western
Pacific; and June/July 2021, simulating a series of record-shattering
extreme events that happened within a 4-week period in early summer 2021.

*Response of Tropical Cyclone Rainfall to Thermal Forcing in Long-Term
Convection-Permitting Simulations*
*George H. Bryan (MMM), Andreas Prein (MMM), Brian Medeiros (CGD), Jonathan
Martinez (ASP), Kelly Nunez Ocasio (ASP), and Kerry Emanuel (Massachusetts
Institute of Technology)*
Recent Hurricanes Maria, Harvey, Lane, and Florence had something in
common: they brought record-breaking, catastrophic rainfall to their
landfall locations. Their associated rainfall amounts were considered
"extreme" in our current climate, but those amounts may become more common
if our planet continues to warm at the projected rates. At the same time,
an increasing body of literature suggests that interactions between clouds
and radiation—known as cloud-radiative feedbacks (CRFs)—impact several
processes in the tropical atmosphere, including convective organization,
precipitation extremes, and tropical cyclone formation. Given the impactful
nature of tropical cyclone rainfall, this proposed study will use
convection-permitting idealized simulations to investigate if tropical
cyclone rainfall will increase under a projected 4-K warming while also
investigating if CRFs affect extreme rainfall in tropical cyclones.

*Enhancing Earth System Predictability by Diagnosing Model Error in Mode
Water Regions*
*Ben Johnson (CISL), Moha Gharamti (CISL), Anna-Lena Deppenmeier (CGD), Ian
Grooms (University of Colorado)*
Major modes of climate variability such as the Pacific Decadal Oscillation
(PDO) have off-equatorial dipole cores that coincide with regions of mode
water formation. Mode waters are ocean mixing pathways that connect
near-surface waters to the deeper central and intermediate waters beneath
them. Simulations of these regions produced by eddy-parameterizing ocean
models diverge from observations to such an extent that data assimilation
schemes fail to assimilate many observations. This project conducts twin
data assimilation experiments using the Data Assimilation Research Testbed
(DART) with eddy-parameterizing (~1.0° horizontal resolution) and
eddy-resolving (~0.1° horizontal resolution) eighty-member ensembles of
POP2. The ensembles, which are forced by the CAM6 Reanalysis, are designed
to diagnose model error in these regions and improve earth system
predictability. These experiments use DART's capability to identify
observations that exceed an ensemble's outlier threshold in an attempt to
associate specific model processes with model errors.
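
For readers unfamiliar with that mechanism, the sketch below shows the
basic idea of an ensemble outlier check: an observation is flagged when its
distance from the prior ensemble mean is large relative to the expected
spread, which combines the ensemble variance with the observation-error
variance. This is a minimal Python illustration of the concept, not DART's
actual implementation; the function name, threshold value, and example
numbers are assumptions made for the example.

    # Minimal sketch of an ensemble outlier-threshold check.
    import numpy as np

    def exceeds_outlier_threshold(obs_value, obs_error_var, prior_ensemble, threshold=3.0):
        """Return True if the observation lies more than `threshold` expected
        standard deviations from the prior ensemble mean."""
        prior_mean = np.mean(prior_ensemble)
        prior_var = np.var(prior_ensemble, ddof=1)        # ensemble (sample) variance
        expected_sd = np.sqrt(prior_var + obs_error_var)  # total expected spread
        return abs(obs_value - prior_mean) / expected_sd > threshold

    # Example: an 80-member prior ensemble of sea-surface temperature (deg C)
    rng = np.random.default_rng(0)
    prior = 18.0 + 0.2 * rng.standard_normal(80)
    # The observation below sits roughly 7 expected standard deviations from
    # the ensemble mean, so it is flagged (prints True).
    print(exceeds_outlier_threshold(19.5, obs_error_var=0.05**2, prior_ensemble=prior))
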
*Early Career Scientists' Assembly (ECSA) Steering Committee*

ACOM: Brett Palm
CGD: Dan Amrhein, Meg Fowler
CISL: Agbeli Ameko, Charlie Becker
EOL: Carol Ruchti*
HAO: Soudeh Kamali
MMM: Mariana Cains
RAL: Nick Lybarger
Also serving: Anna del Moral Mendez, Darcy Jacobson

*NCAR Scientists' Assembly Executive Committee (NSA-EC)*

ACOM: Eric Apel, Kelley Barsanti
CGD: Peter Lawrence
CISL: Allison Baker, Ben Johnson
EOL: Tammy Weckwerth, Holger Vömel
HAO: Anna Malanushenko, Kevin Pham
MMM: Peter Sullivan, Erin Towler
RAL: Andrew Newman*, Fei Chen

*Denotes committee chairs. The ECSA committee chairs concurrently serve as
members of the NSA-EC.
*The NCAR Scientists' Assembly represents all members of the NCAR
scientific staff (Scientists, Associate Scientists, Project Scientists,
Post-Docs and Visiting Scientists) and research engineering staff. This
newsletter aims to provide curated information to the scientific community
at NCAR and UCAR member institutions. To contribute content, please
email nsa-ec at ucar.edu.*

*This material is based upon work supported by the National Center for
Atmospheric Research, a major facility sponsored by the National Science
Foundation and managed by the University Corporation for Atmospheric
Research. Any opinions, findings and conclusions or recommendations
expressed in this material do not necessarily reflect the views of the
National Science Foundation.*