Add json schema for combined pp yaml #19

Draft: wants to merge 2 commits into main
189 changes: 189 additions & 0 deletions FRE/fre_pp.json
@@ -0,0 +1,189 @@
{
"$schema": "http://json-schema.org/draft-06/schema#",
"type": "object",
"additionalProperties": false,
"properties": {
"name": {
"description": "The name of the experiment",
"type": "string"
},
"platform": {
"description": "The platforms listed in the command",
"type": "string"
},
"target": {
"description": "The targets listed in the command",
"type": "string"
},
"build": {
"type": "object",
"additionalProperties": false,
"properties": {
"compileYaml": {
"description": "Path to the compile yaml.",
"type": "string"
},
"platformYaml": {
"description": "Path to the platform yaml.",
"type": "string"
}
}
},
"directories": {
"description": "FRE shared directories",
"type": "object",
"items": {"$ref": "&/$defs/dirs"}
},
"postprocess": {
"description": "FRE post-processing information",
"type": "object",
"items":{"$ref": "#/$defs/pp"}
}
},
"$defs": {
"dirs": {
"history_dir": {
"description": "Diectory path to raw model output.",
"type":"string"
},
"pp_dir": {
"description": "Directory path to post-processing output.",
"type":"string"
},
"ptmp_dir": {
"description": "Directory to use for history file cache.",
"type":"string"
},
"preanalysis_script":{
"description": "Filepath to the user script.",
"type":["string","null"]
},
"history_refined":{
"description": "",
"type":["string","null"]
},
"analysis_dir":{
"description": "",
"type":["string","null"]
}
},
"pp": {
"type": "object",
"properties": {
"settings": {
"type:": "object",
"properties": {
"history_segment": {
"description": "Amount of time covered by a single history file (ISO8601 datetime)",
"type":"string"
Contributor
Thanks Dana, this looks great. Is it possible to define a custom "type" for the schema, do you know? That would be the best case scenario, if it were possible.

Contributor Author
Hm, I'm not sure. What would you want the type to be? Or do you mean custom as in it could be multiple types?

Contributor
The ISO8601 dates and durations would be great to validate, but seem hard to validate beyond "string". An enumerated list wouldn't quite work, since there are so many possibilities.

The way we "validate" the dates and durations now is to attempt to parse them with metomi.isodatetime.parsers and check whether parsing succeeds or fails.

It would be great to catch typos like "P1M"/"PM1" during validation, but it may not be possible.
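For reference, a minimal sketch of that parse-and-check approach, assuming the metomi-isodatetime package is installed; the broad except clause is a hedge, since the exact exception class raised on a bad string isn't pinned down in this thread:

# Rough sketch of the existing parse-and-check validation for ISO8601 durations.
# Assumes the metomi-isodatetime package; the broad except is a hedge because
# the exact exception class raised for a bad string may vary by version.
from metomi.isodatetime.parsers import DurationParser

def is_valid_duration(value):
    """Return True if value parses as an ISO8601 duration, e.g. 'P1M'."""
    try:
        DurationParser().parse(value)
        return True
    except Exception:
        return False

print(is_valid_duration("P1M"))  # True
print(is_valid_duration("PM1"))  # False -- the typo this thread wants to catch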

Contributor Author (@singhd789, Jan 23, 2025)
Ah I see. It does look like you can specify specific values: https://www.tutorialspoint.com/yaml/yaml_json_schema.htm

Maybe we could have a list of values that would work.

@J-Lentz might know more on this too!

Contributor
Regular expression matching would be really nice. Let's keep an eye out for that.

Contributor Author (@singhd789, Jan 23, 2025)
Aha!! Uriel came in clutch in a previous commit he's already done:
646e68c

https://json-schema.org/understanding-json-schema/reference/regular_expressions

Contributor
Fantastic. Enumerations and regular expressions are about all we had for the XML schema, I think.
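As a rough illustration of the enum/regex idea above, a "pattern" constraint with a simplified (not exhaustive) ISO8601-duration regex would reject a typo like "PM1" at validation time; the regex and the use of the jsonschema package here are illustrative assumptions, not part of this PR:

# Illustrative sketch: a JSON Schema "pattern" that catches duration typos like "PM1".
# The regex is a simplified, date-only approximation of ISO8601 durations.
from jsonschema import validate
from jsonschema.exceptions import ValidationError

duration_schema = {
    "type": "string",
    "pattern": r"^P(\d+Y)?(\d+M)?(\d+W)?(\d+D)?$",  # e.g. P1Y, P6M, P1Y6M
}

for value in ("P1M", "PM1"):
    try:
        validate(instance=value, schema=duration_schema)
        print(f"{value}: valid")
    except ValidationError:
        print(f"{value}: invalid")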

},
"site": {
"description": "",
"type":"string"
},
"pp_chunk_a": {
"description": "Amount of time covered by a single postprocessed file (ISO8601 datetime).",
"type":"string"
},
"pp_chunk_b": {
"description": "Secondary chunk size for postprocessed files, if desired (ISO8601 datetime). Divisble by pp_chunk_a.",
"type":"string"
},
"pp_start": {
"description": "Start of the desired postprocessing (ISO8601 datetime).",
"type":"string"
},
"pp_stop": {
"description": "End of the desired postprocessing (ISO8601 datetime)."
"type":"string"
},
"pp_components": {
"description": "Space-separated list of user-defined components, discussed in more detail below."
"type":"string"
},
"pp_grid_spec": {
"description": "Path to FMS grid definition tarfile."
"type":"string"
},
"refine_diag_scripts": {
"description": "Path(s) to FMS grid definition tarfile."
"type":"array",
"items": {"type": "string"}
}
}
},
"switches": {
"type": "object",
"properties": {
"clean_work": {
"description": "Switch to remove intermediate data files when they are no longer needed.",
"type":"boolean"
},
"do_mdtf": {
"description": "Switch to run MDTF on generated pp output.",
"type":"boolean"
},
"do_statics": {
"description": "Switch to turn on/off statics processing.",
"type":"boolean"
},
"do_timeavgs": {
"description": "Switch to turn on/off time-average file generation.",
"type":"boolean"
},
"do_refinediag": {
"description": "Switch to run refine-diag script(s) on history file to generate additional diagnostics.",
"type":"boolean"
},
"do_atmos_plevel_masking": {
"description": "Switch to mask atmos pressure-level output above/below surface pressure/atmos top.",
"type":"boolean"
},
"do_preanalysis": {
"description": "Switch to run a pre-analysis script on history files",
"type":"boolean"
},
"do_analysis": {
"description": "Switch to launch analysis scripts",
"type":"boolean"
},
"do_analysis_only": {
"description": "Switch to only launch analysis scripts",
"type":"boolean"
}
}
},
"components": {
"type": "array",
"properties": {
"type": {
"description": "Component name",
"type":"string"
},
"sources": {
"description": "",
"type":"array",
"item": {"type": "string"}
},
"sourceGrid": {
"description": "",
"type":"string"
},
"xyInterp": {
"description": "",
"type":"string"
},
"interpMethod": {
"description": "The interpolation method.",
"type":"string"
},
"inputRealm": {
"description": "",
"type":"string"
}
}
}
}
}
}
}
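
For completeness, a minimal sketch of how a combined pp yaml could be checked against this schema using the pyyaml and jsonschema packages; the yaml file name below is hypothetical:

# Minimal sketch: validate a combined pp yaml against FRE/fre_pp.json.
# "combined_pp.yaml" is a hypothetical file name; point it at a real combined pp yaml.
import json

import yaml  # pyyaml
from jsonschema import validate
from jsonschema.exceptions import ValidationError

with open("FRE/fre_pp.json") as schema_file:
    schema = json.load(schema_file)

with open("combined_pp.yaml") as yaml_file:
    instance = yaml.safe_load(yaml_file)

try:
    validate(instance=instance, schema=schema)
    print("combined pp yaml passed schema validation")
except ValidationError as err:
    print(f"validation failed: {err.message}")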