CEDAR email: AMS Session: Open Env. Datasets for AI Applications: Benchmarking Needs, Frameworks, Lessons Learned

Rob Redmon - NOAA Federal rob.redmon at noaa.gov
Mon Aug 1 09:00:00 MDT 2022


Dear Colleagues,

We’d like to encourage you to submit an abstract to the next *AMS annual
meeting (January 8-12, 2023)* sharing your experiences and inspiring
critical conversations on community needs, frameworks and standards
development, and lessons learned for developing benchmark datasets.
Abstracts are due via the submission portal by *August 24th.* We look
forward to discussing this important topic with you online or in person in
beautiful Colorado!

Best wishes,
Rob Redmon

*Session Topic Title:* Open Environmental Datasets for AI Applications:
Benchmarking Needs, Frameworks, Lessons Learned
*Session Topic ID:* 61571
*Conference:* 22nd Conference on Artificial Intelligence for Environmental
Science

*AMS abstract portal:*
https://annual.ametsoc.org/index.cfm/2023/program-events/conferences-and-symposia/22nd-conference-on-artificial-intelligence-for-environmental-science/

*Session Description:*
Benchmark datasets, such as ImageNet, are instrumental for innovation in
artificial intelligence (AI) and machine learning (ML). The infusion of AI, ML,
and other advanced data science (DS) techniques into Earth system and space
science problems is expanding rapidly. Thus, developing benchmark datasets,
standards, and frameworks for evaluation, use, and publishing, and sharing
lessons learned in a coordinated manner is needed to ensure that AI/ML/DS
applications continually
increase our ability to predict complex physical processes with high levels
of trust and explainability. Benchmark datasets that are highly
AI/ML/DS-ready will empower research in Earth and space science and the
transition of research to decision-making services by lowering the cost of
curiosity for getting started with baseline models and interactive
notebooks. Use cases and community-driven benchmarking frameworks built on
open science principles will also foster collaborative development by
providing common evaluation metrics, ontologies for labeling features, and
mechanisms for capturing user feedback for trustworthy AI applications.
Benchmarking will support efficient research and development on topics of
societal importance, including tackling climate change, improving weather
forecasts, protecting satellite observing systems and other technologies,
safeguarding ecosystems, and addressing social inequities. This session
invites presentations sharing experiences and inspiring critical
conversations on community needs, frameworks and standards development, and
lessons learned for developing benchmark datasets. We invite experiences
exploring domain-agnostic benchmark standards and framework development, as
well as domain-specific topics such as climate and weather science,
environmental justice, fire weather, ocean conservation, hydrology, space
weather, and any other relevant topics.