Along with this process, a contractor who had worked at the CMMI Institute (CMMI, Capability Maturity Model Integration, is a software engineering process improvement training and appraisal program) was hired to improve software processes, with a focus on code rejuvenation of the surface temperature processing code, in particular the GHCN-M dataset. The first NCDC/NCEI surface temperature software to be put through this rejuvenation was the pairwise homogeneity adjustment portion of processing for the GHCN-Mv4 beta release of October 2015.

Step 2 involves applying various adjustments to the data, including bias adjustments, and provides as output the adjusted and unadjusted data on a standard grid. Step 3 involves application of a spatial analysis technique (empirical orthogonal teleconnections, EOTs) to merge and smooth the ocean and land surface temperature fields and provide these merged fields as anomaly fields for ocean, land, and global temperatures. Rigorous ORR for each of these steps in the global temperature processing began at NCDC in early 2014.

Figure 2. Generic data flow for NCDC/NCEI surface temperature products.
I was able to implement such policies, with the help of many colleagues, through the NOAA Climate Data Record (CDR) policies [link].
There are three steps to the processing, and two of the three are performed separately for the ocean and land data.
Step 1 is the compilation of observations either from ocean sources or land stations.
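The three processing steps can be sketched as a minimal pipeline. Everything below is an illustrative assumption: the function names, the simple binning, and the baseline subtraction are stand-ins, not the actual NCDC/NCEI algorithms (the real Step 3, for instance, uses EOT analysis rather than a plain average).

```python
import statistics

def compile_observations(raw_records):
    """Step 1 (sketch): compile observations from ocean sources or land stations."""
    # Keep only records that carry a usable temperature value.
    return [r for r in raw_records if r.get("temp_c") is not None]

def adjust_and_grid(observations, bias_c=0.0, cell_deg=5.0):
    """Step 2 (sketch): apply a bias adjustment and bin onto a standard grid.

    Returns both unadjusted ("raw") and adjusted ("adj") values per grid cell.
    """
    grid = {}
    for r in observations:
        cell = (int(r["lat"] // cell_deg), int(r["lon"] // cell_deg))
        grid.setdefault(cell, {"raw": [], "adj": []})
        grid[cell]["raw"].append(r["temp_c"])
        grid[cell]["adj"].append(r["temp_c"] - bias_c)
    return grid

def merge_fields(land_grid, ocean_grid, baseline_c=14.0):
    """Step 3 (sketch): merge land and ocean fields into one anomaly field.

    The real processing uses empirical orthogonal teleconnections (EOTs);
    here a cell mean minus a fixed baseline stands in for that analysis.
    """
    combined = {}
    for grid in (land_grid, ocean_grid):
        for cell, vals in grid.items():
            combined.setdefault(cell, []).extend(vals["adj"])
    return {cell: statistics.mean(v) - baseline_c for cell, v in combined.items()}
```

A run of the sketch chains the steps in order: `merge_fields(adjust_and_grid(compile_observations(land_records), bias_c=0.5), adjust_and_grid(compile_observations(ocean_records)))`, yielding one anomaly value per grid cell.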
The incident report found that unidentified coding errors in the GHCN-M processing caused unpredictable results, producing different output each time the code was run.
The generic flow of data used in processing of the NCDC/NCEI global temperature product suite is shown schematically in Figure 2.