National Aeronautics and Space Administration

NASA's Arctic-Boreal Vulnerability Experiment

ABoVE

Project Data Management Plan Requirements and Guidelines

Referenced on page A.4-15 of the NASA Research Announcement for Terrestrial Ecology: Arctic-Boreal Vulnerability Experiment – Phase 3 (NNH21ZDA001N-TE)

The Data Management Plan (DMP) for an ABoVE proposal is a short document that describes (1) the investigator's commitment to the ABoVE Data Policy and (2) how the investigator plans to manage the data generated by the proposed research during and after the project. NASA-funded projects selected for ABoVE are expected to develop a more detailed DMP to address the dissemination and sharing of research results.

  • ABoVE Data Policy: This policy covers the full life-cycle of data during ABoVE, from data collection, through quality checking and analysis, to distribution to ABoVE science team members and stakeholders, and finally to deposit of finalized products in a long-term archive.
  • ABoVE Standard Projection and Reference Grid: ABoVE data products are produced in the ABoVE standard projection and reference grid. To facilitate integration, synthesis, and modeling in Phase 3, proposers should plan to use this projection and reference grid.

The DMP for NASA-funded ABoVE investigators is a living document with periodic (~annual) updates to the status and schedule for data set production. Additional information, including best practices for data management and guidance for writing a DMP, is available at the ORNL DAAC website.

1. Description of data to be generated/collected in the research

  • Indicate the data products to be generated/collected in the proposed research. For each data product, provide a summary that contains, as appropriate, the following information:
    • A descriptive title for the data product
    • Parameters/measurements, including units and descriptive names
    • Data collection platform/instrument
    • Type of data: swath, gridded, vector, tabular, etc.
    • Data file formats to be used
    • Estimated number of files to be compiled
    • Expected data volume (e.g., in MB or GB)
    • Organization of data files and the file naming convention
    • Additional products that could be useful to a data user, such as maps and plots
  • Planned/desired data collection protocols
    • Describe (or reference) the simulation protocols for modeled data.
    • Reference relevant sections of the Scientific/Technical plan, if applicable
  • How will the data be processed?
    • Briefly describe the software, algorithms, and workflows for data processing.
    • Briefly describe plans for reprocessing and the conditions expected to trigger reprocessing.
    • If there are other documents that describe the processing and reprocessing activities, point to such documents, and provide a summary.
  • Describe the plans and processes used for assessing product quality as well as for collecting, documenting, and conveying the quality information of data products. Recommended items to be covered by the Data Quality section of the DMP include (but are not limited to):
    • General: Describe the process for assuring data quality. Include data flows and organizations/groups involved in assuring data quality.
    • Errors/Uncertainties: Indicate how errors/uncertainties in the input data used to produce the products will be accounted for, minimized through improved calibration (i.e., to meet the error budget constraints if required for Cal/Val), and/or propagated in higher level products.
    • Calibration/Validation (Cal/Val): Cal/Val is applicable only to missions and funded projects in which it is explicitly mandated. Provide the targeted error budget that will be used to assess Cal/Val performance.
    • Ancillary datasets: Describe plans for managing, archiving, and distributing the ancillary datasets used for QA/QC, Cal/Val, error budget validation, uncertainty quantification, and uncertainty characterization.
    • Quality Flags/Indicators: Indicate how quality flags and/or indicators are defined and used in the generated products.
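
For example, quality flags in a gridded NetCDF product can be made self-describing with the CF flag_values/flag_meanings convention. The sketch below is illustrative only and is not an ABoVE requirement; the variable names, flag scheme, grid size, and file name are hypothetical placeholders.

```python
# Minimal sketch: encoding per-pixel quality flags in a gridded NetCDF
# product using the CF "flag_values"/"flag_meanings" convention.
# Variable names, grid size, flag scheme, and file name are hypothetical.
import numpy as np
import xarray as xr

ny, nx = 100, 100
burn_severity = xr.DataArray(
    np.random.rand(ny, nx).astype("float32"),
    dims=("y", "x"),
    attrs={"long_name": "burn severity index", "units": "1"},
)
quality_flag = xr.DataArray(
    np.zeros((ny, nx), dtype="int8"),
    dims=("y", "x"),
    attrs={
        "long_name": "burn severity quality flag",
        "flag_values": np.array([0, 1, 2, 3], dtype="int8"),
        "flag_meanings": "good cloud_contaminated gap_filled out_of_range",
    },
)
ds = xr.Dataset(
    {"burn_severity": burn_severity, "quality_flag": quality_flag},
    attrs={"Conventions": "CF-1.8",
           "title": "Example gridded product with quality flags"},
)
ds.to_netcdf("example_product.nc")
```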

2. Compliance with ABoVE Policies for access, sharing, and re-use

  • Describe the data sharing planned during and after the ABoVE project.
  • Are there any ethical, privacy, intellectual property, and copyright issues for the data set?
  • What is the schedule for delivery of data and related metadata to the ABoVE project? Early delivery of products useful for ABoVE synthesis activities is encouraged and should be noted on the schedule.
  • Geospatial data products generated in support of the campaign should at a minimum be produced using the ABoVE Standard Projection (vector and raster) and Reference Grid (raster).
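
As a hedged illustration of producing raster deliverables in the ABoVE Standard Projection, the sketch below reprojects a GeoTIFF with rasterio. The target CRS shown is an assumption (a Canada Albers Equal Area Conic definition); the authoritative projection parameters and grid specification should be taken from the ABoVE Standard Projection and Reference Grid documentation. File names are placeholders.

```python
# Minimal sketch: reprojecting a raster toward the ABoVE Standard Projection.
# The target CRS below is an assumption; verify it against the ABoVE
# Reference Grid documentation before producing deliverables.
import rasterio
from rasterio.crs import CRS
from rasterio.warp import calculate_default_transform, reproject, Resampling

target_crs = CRS.from_user_input("ESRI:102001")  # assumed Canada Albers Equal Area Conic

with rasterio.open("input_product.tif") as src:
    transform, width, height = calculate_default_transform(
        src.crs, target_crs, src.width, src.height, *src.bounds
    )
    profile = src.profile.copy()
    profile.update(crs=target_crs, transform=transform, width=width, height=height)

    with rasterio.open("output_product_above_projection.tif", "w", **profile) as dst:
        for band in range(1, src.count + 1):
            reproject(
                source=rasterio.band(src, band),
                destination=rasterio.band(dst, band),
                src_transform=src.transform,
                src_crs=src.crs,
                dst_transform=transform,
                dst_crs=target_crs,
                resampling=Resampling.nearest,
            )
```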

3. Information about existing data to be used by the research

  • If you will use existing data, describe the data and how it will be obtained (you may reference relevant sections of the Scientific/Technical plan, if applicable).
  • What data will be requested from the ABoVE project office and from other ABoVE investigators? Include estimates of the type and amount of data to be requested and the desired time of delivery.

4. Metadata and Data Product Documentation

  • Describe the metadata formats and standards that will be used to document the data so that data files can be self-descriptive and readily used by others.
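
As a hedged illustration of self-descriptive files, the sketch below adds CF- and ACDD-style global attributes to a NetCDF file with xarray. The attribute values and file names are placeholders (the input file reuses the hypothetical product from the earlier sketch), and the metadata standards actually required should follow the guidance of the designated archive (e.g., the ORNL DAAC).

```python
# Minimal sketch: making a NetCDF file self-descriptive with CF- and
# ACDD-style global attributes. Attribute values and file names are
# placeholders; follow the metadata guidance of the designated archive.
import xarray as xr

ds = xr.open_dataset("example_product.nc")
ds.attrs.update({
    "Conventions": "CF-1.8, ACDD-1.3",
    "title": "Example ABoVE data product (placeholder)",
    "summary": "Short abstract describing the measurement, method, and coverage.",
    "creator_name": "Investigator name",
    "institution": "Investigator institution",
    "time_coverage_start": "2022-06-01T00:00:00Z",
    "time_coverage_end": "2022-08-31T23:59:59Z",
    "geospatial_lat_min": 60.0,
    "geospatial_lat_max": 70.0,
    "geospatial_lon_min": -150.0,
    "geospatial_lon_max": -140.0,
})
ds.to_netcdf("example_product_with_metadata.nc")
```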

5. Long-term storage and archival

  • Which of the data generated will become mature enough to submit to a long-term archive (e.g., a DAAC) for sharing with the broader scientific community? Describe the ways data products are likely to be used by other scientists or by external stakeholders.
  • Most data products generated during ABoVE with NASA funding will be archived at the ORNL DAAC, and cross-referenced with the archive centers of Partner Programs (DOE NGEE-Arctic, Polar Knowledge Canada) where appropriate.
  • Develop a plan for communicating with the designated DAAC and supporting publication of the final data there. Contact the designated DAAC at an early stage and allocate enough data management resources to ensure smooth publication of high-quality data.
  • What steps need to be taken to prepare the data and documentation for the archive? Who will be responsible for preparing the data for the archive? When will the data be submitted to the archive?