Slide 1: AeroCom organisation
Core team: Christiane Textor, Sarah Guibert, Stefan Kinne, Joyce Penner, Michael Schulz, Frank Dentener (LSCE, MPI-M, JRC, UMI)
Initial questionnaire (Oct 2002) to define the model output
Four intensive workshops since 2003 (Paris, Ispra, New York, Oslo; ca. 40 participants each)
Open call for model participation (ca. 20 models); 3 experiments (original model, emissions 2000, emissions 1750)
Central model output database (~2 TB) at LSCE and public web interface to image catalogues; joint papers
Funding through: EU projects PHOENICS and CREATE, NASA (indirect effect study), the institutes themselves, ACCENT, and the European Science Foundation
Slide 2: AeroCom Workup Structure
[Workflow diagram, flattened in this transcript. Recoverable elements:]
Model simulation and model diagnostics -> submission (FTP, email; netCDF per protocol, plus model description in Excel) -> check of format (ncdump, NCO/ncregrid), reformat, regrid -> feedback to modellers
Intercomparison database (storage and access, RAID server) -> analysis and analysis storage (IDL) -> visualisation (IDL) -> image database (png) -> WWW access (Perl/JavaScript website)
Also linked: observational database; output: scientific papers
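The submission step above (check format, reformat, regrid, feedback) can be illustrated with a minimal sketch of a compliance check. This is a hypothetical example, not AeroCom code: the required metadata fields and variable names below are assumptions made for illustration.

```python
# Hypothetical sketch of a submission format check, loosely modelled on the
# workflow above (check -> feedback to the modeller). The required-field and
# required-variable lists are illustrative assumptions, not the real protocol.

REQUIRED_GLOBALS = {"model", "experiment", "year"}   # assumed metadata fields
REQUIRED_VARS = {"emioa", "loadoa", "od550aer"}      # assumed tracer variables

def check_submission(metadata, variables):
    """Return a list of problems found in a submitted model file."""
    problems = []
    for field in sorted(REQUIRED_GLOBALS - set(metadata)):
        problems.append(f"missing global attribute: {field}")
    for var in sorted(REQUIRED_VARS - set(variables)):
        problems.append(f"missing variable: {var}")
    return problems

# Example feedback that would be sent back to the modeller:
report = check_submission({"model": "INCA", "year": 2000}, {"od550aer"})
```

In a real workup chain this check would run on the netCDF headers (e.g. ncdump output) before any regridding is attempted, so that feedback reaches the modeller early.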
Slide 3: WWW access
http://nansen.ipsl.jussieu.fr/AEROCOM/data.html
Image database (png), served via Perl/JavaScript
Slide 4: Why we should cooperate
AeroCom tools have already been exported to other projects: the ACCENT emission database visualisation, the RETRO 40-year simulation visualisation, and the INCA chemical weather forecast. Shared elements include the harmonised structure, the interface to the image database, the AeroCom comparison-to-data interfaces, and the Perl/JavaScript WWW access to the png image database.
Slide 5: How can we cooperate? Intercomparisons as a communication problem
[Architecture diagram, flattened in this transcript. Recoverable components:]
Model data database (distributed, DODS?) and meta database (MySQL)
Interfaces: to the model, to check tools, to analysis tools, to interactive analysis tools, to the web
Intermediate analysis results database and image database
Feedback loops to the modeller and to the analyst/observer
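The point of the standard interfaces sketched on slide 5 is that any analysis tool can talk to any database through one small agreed contract instead of ad-hoc file access. The following is a toy sketch of that idea; the class and method names are illustrative assumptions, not part of the proposal.

```python
# Toy sketch of a standard tool/database interface (slide 5). The in-memory
# store stands in for the (possibly distributed) model data database; names
# are assumptions made for illustration.

class ModelDatabase:
    """Minimal stand-in for the model data database."""
    def __init__(self):
        self._store = {}

    def put(self, model, variable, field):
        self._store[(model, variable)] = field

    def get(self, model, variable):
        return self._store[(model, variable)]

    def models(self):
        return sorted({m for m, _ in self._store})

def ensemble_mean(db, variable, models):
    """An analysis tool that depends only on the interface, not on storage."""
    fields = [db.get(m, variable) for m in models]
    return sum(fields) / len(fields)

db = ModelDatabase()
db.put("INCA", "od550aer", 0.12)
db.put("TM5", "od550aer", 0.18)
mean = ensemble_mean(db, "od550aer", db.models())
```

Because the analysis tool sees only `put`/`get`/`models`, the storage behind the interface could be swapped (local files, DODS/OPeNDAP server, SQL) without touching the tool, which is exactly the decoupling the slide argues for.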
Slide 6: How fuzzy should the interface definition be?
[Diagram illustrating communication success between model output and analysis, flattened in this transcript. Recoverable elements:]
Chain: model output -> metadata base -> results database -> image database -> web, with checking, reformatting, regridding, analysis, visualisation and interactive analysis in between
Possible outcomes where model output meets the analysis tools: perfect functioning; analysis works after foreseen adaptation (from the analysis-tool side or from the model side); foreseen adaptation that proves useless; analysis does not work; analysis functions not used; model output not used
Slide 7: Proposal for cooperation: « Atmospheric Tracer Model Intercomparison Tools Initiative »
General goals:
Accelerate the analysis of models and the feedback to model participants
Jointly develop intercomparison tools through transformation, integration, adaptation and development of existing tools
Allow participants to join the analysis of an intercomparison more easily
Specific objectives to which the network should contribute for any intercomparison:
Model documentation, model quality control, model comparison, model benchmarking, model improvement, scientific understanding
Slide 8: Proposal part II: « Atmospheric Tracer Model Intercomparison Tools Initiative »
Procedure:
First, tutorial institutions are identified which provide general support for the planning and implementation of the initiative (JRC, LSCE/IPSL + ??)
A steering committee is put in place to develop the initiative and report to the tutorial institutions.
A work plan is elaborated by early summer 2006 to structure and prioritise the actions to be undertaken.
A pilot project is the support of the planned first phase of the intercomparison under the HTAP convention, reusing and integrating tools prepared for EuroDelta, ACCENT and AeroCom.
Present the initiative at forthcoming meetings (e.g. AeroCom, 17-19 October; HTAP workshops; etc.) and promote its existence through publication (IGAC newsletter? +?)
Slide 9: Proposal part III: « Atmospheric Tracer Model Intercomparison Tools Initiative »
Workplan components: the workplan aims to produce a detailed implementation plan covering what steps are taken, who is doing them, and when they are ready.
1) Identify the formatting standards needed (units; CF or less strict??)
2) Identify standard names for specific tracer variables; report them to CF
3) Identify standard interface specifications between any of the tools and any of the databases (see slide 5), so that different tools can call and address each database
4) Develop a standard protocol form to be used for different intercomparisons and subparts of them (reference + compliance with check tools)
5) Define standards for file names and image names in the databases
6) Develop compliance test tools to check whether data and tools fulfill the standards set under 1-5
7) Implement a pilot database based on automod for testing different existing tools (AeroCom catalogues etc.)
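Workplan items 5 and 6 above (file-name standards plus compliance test tools) can be sketched together: a naming convention and a tool that tests names against it. The pattern below is an assumption made for the sake of example, not the convention the initiative would actually adopt.

```python
# Illustrative sketch for workplan items 5-6: an assumed file-name convention
# <project>_<model>_<experiment>_<variable>_<year>.nc and a compliance check
# against it. The pattern itself is a hypothetical placeholder.

import re

FILENAME_RE = re.compile(
    r"^(?P<project>[A-Za-z]+)_(?P<model>[A-Za-z0-9-]+)_(?P<experiment>[A-Za-z0-9]+)"
    r"_(?P<variable>[a-z0-9]+)_(?P<year>\d{4})\.nc$"
)

def check_filename(name):
    """Return the parsed fields, or None if the name is non-compliant."""
    match = FILENAME_RE.match(name)
    return match.groupdict() if match else None

ok = check_filename("AEROCOM_INCA_B_od550aer_2000.nc")
bad = check_filename("inca-results-final.nc")
```

A compliance tool built this way gives the modeller immediate, specific feedback (which field of the name is wrong) rather than a silent rejection, which supports the feedback loops shown on slide 5.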
Slide 10: Proposal part IV: « Atmospheric Tracer Model Intercomparison Tools Initiative »
Workplan components (continued):
8) Build a web-based repository where users can find tools to prepare CF-compliant model output (Fortran routines), to rename and reformat files (NCO examples), to compute specific diagnostics (regional budgets, aerosol size fractions etc.), to regrid in compliance with the standards, and to handle/replace/identify missing data
9) Develop check tools for the physical meaning of data (budgets, units, order of magnitude, ocean/land contrast)
10) Develop observational-data comparison tools which interface to the model data as defined under 1-4
11) Develop tools to produce higher-order analysis results (ensemble averages, spatial correlation)
12) Develop a documentation tool for keeping track of model version characteristics
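Items 8 and 11 in the list above (missing-data handling and higher-order results such as spatial correlation) can be combined in one small sketch. This is an illustrative toy, not proposal code: the grid, the values, and the use of None to mark a missing cell are assumptions.

```python
# Sketch for workplan items 8 and 11: drop missing grid cells, then compute a
# higher-order result (Pearson correlation between two model fields on a
# shared grid). None marks a missing cell; all values here are toy data.

import math

def pairwise_complete(a, b):
    """Keep only grid cells where both models report a value."""
    return [(x, y) for x, y in zip(a, b) if x is not None and y is not None]

def spatial_correlation(a, b):
    """Pearson correlation over the pairwise-complete cells."""
    pairs = pairwise_complete(a, b)
    n = len(pairs)
    mean_a = sum(x for x, _ in pairs) / n
    mean_b = sum(y for _, y in pairs) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in pairs)
    var_a = sum((x - mean_a) ** 2 for x, _ in pairs)
    var_b = sum((y - mean_b) ** 2 for _, y in pairs)
    return cov / math.sqrt(var_a * var_b)

# Two toy model fields on a 4-cell grid, one cell missing in model B:
model_a = [0.10, 0.20, 0.30, 0.40]
model_b = [0.12, 0.18, None, 0.35]
r = spatial_correlation(model_a, model_b)
```

The point relevant to the workplan is that the missing-data policy (here, pairwise deletion) must be standardised too: two analysis tools that treat gaps differently will report different correlations for the same model pair.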