
Objective: The UTG content is created by running loaders against V2, V3 and FHIR source files. Version 1 of the UTG published content is scheduled to be released at the end of March, so it is crucial that the UTG content is vetted by those with a high level of knowledge about the content prior to the initial published release.

The UTG Source of Truth content is maintained via GitHub at [link] (not needed for testing).

The rendered pages for the UTG Source of Truth content are available at [link]. These pages are updated each time a change is committed to the GitHub Source of Truth content. Testing of UTG content should include a comparison of the original source files (V2 Chapter 2c, coremif, FHIR, etc.) against the rendered pages.

Content testing is dependent on completion of critical content-related tasks, documented at Critical UTG Content Tasks.


The main goal of testing is to ensure that the UTG resources are correct so that Version 1 can be published by the end of March. To accomplish this, testers will compare V2, V3, and FHIR source data to the resources UTG generates from that data. For example, V2 content will be QA'd by comparing the data in the Chapter 2c Word document to the rendered vocabulary pages at [link]. Each tester will be given a handful of resources to verify for correctness and completeness. Instructions and training will be provided to smooth the learning curve for testing.
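For FHIR content, the comparison amounts to checking each code and display in the source CodeSystem resource against what the rendered UTG page shows. A minimal sketch of pulling out those pairs (the inline resource below is an illustrative fragment, not the full published ObservationCategoryCodes system):

```python
import json

# Illustrative fragment of a FHIR CodeSystem resource (not the full
# published content) -- the real source data comes from the FHIR site.
SOURCE = """
{
  "resourceType": "CodeSystem",
  "name": "ObservationCategoryCodes",
  "concept": [
    {"code": "vital-signs", "display": "Vital Signs"},
    {"code": "laboratory", "display": "Laboratory"}
  ]
}
"""

def list_codes(resource_json):
    """Return (code, display) pairs for side-by-side comparison with UTG."""
    resource = json.loads(resource_json)
    return [(c["code"], c.get("display", "")) for c in resource.get("concept", [])]

for code, display in list_codes(SOURCE):
    print(f"{code}\t{display}")
```

Printing the pairs in source order makes it easy to scan down the rendered UTG table and tick off each row.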

While the primary goal is to check the correctness and completeness of the content, we are also looking for feedback on usability. For example, the columns in the tables might be more useful in a different order.


Testing is scheduled to begin in early March, once a handful of critical dependencies have been completed (see Critical UTG Content Tasks).

Testing will run for five days, and each tester should expect to spend one to two hours verifying their assigned content.

Getting Involved

To participate in UTG content testing, please email [contact].

Current testers include: Carmela A. Couderc, Julie James, Susan Matney, Reuben Daniels, Frank Oemig, and Craig Newman.

Test Data (Source Files)

V2: v29_CH02C_Tables201912Final.doc.docx

V3: DEFN=UV=VO=1451-20191215.coremif (requires RoseTree to view)

FHIR: [link] (FHIR External Tab)

CDA: Need list of CDA content finalized!

Required Tooling 

Rose Tree III is required to visualize the V3 source content and compare it to what is seen in UTG. Note that Rose Tree is only supported on Windows.

Download the '5.0.44' package at: [link]

Run the installer by following the prompts. You will receive an error (a known bug); click 'Continue' to complete the install and 'Close' to exit.

Open Rose Tree III and select File → Open RIM or VocabularyMIF File. Select the DEFN=UV=VO=1451-20191215.coremif file; the application should load the vocabulary content.

Click on Concept Domains (left), Code Systems and their codes (middle), or Value Sets and their codes (right). If you are unsure of what an icon is indicating, select Help → Icon Interpretation.
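If you prefer to scan the coremif outside Rose Tree, it is XML and can be inspected with any XML parser, for example to sanity-check how many code systems and value sets are present. A rough sketch (the inline sample and its element names are illustrative assumptions, not the exact MIF schema):

```python
import xml.etree.ElementTree as ET

# Tiny illustrative stand-in for a coremif fragment; the real file's
# schema differs in detail, so treat these element names as assumptions.
SAMPLE = """
<vocabularyModel>
  <codeSystem name="ActPriority"/>
  <codeSystem name="RoleClass"/>
  <valueSet name="ActPriority"/>
</vocabularyModel>
"""

def count_by_tag(xml_text):
    """Tally elements by local tag name, e.g. how many codeSystem entries."""
    root = ET.fromstring(xml_text)
    counts = {}
    for elem in root.iter():
        tag = elem.tag.split('}')[-1]  # strip any XML namespace prefix
        counts[tag] = counts.get(tag, 0) + 1
    return counts

print(count_by_tag(SAMPLE))
```

Counting by local tag name sidesteps XML namespaces, so the same sketch works whether or not the file declares one.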

Testing Instructions

Please follow these general instructions for testing:

  1. Install Rose Tree III (instructions above) if reviewing V3 content
  2. Download the source data (or simply open it in the browser for FHIR)
    1. V2: v29_CH02C_Tables201912Final.doc.docx

    2. V3: DEFN=UV=VO=1451-20191215.coremif (requires RoseTree to view)

    3. FHIR: (FHIR External Tab)

  3. Review assigned source data resources in applicable tool/file 
    1. Note: tooling is only required to view V3 data via the coreMIF
  4. Compare the source data resources to how they appear in UTG (i.e. the rendered pages at [link])
    1. To find a resource you must navigate through the tabs first by object type (Code System or Value Set) and then by product family (V2, V3, FHIR). Then click on the object's name to see the content.
    2. The test scripts provide information in the 'Expected Results' column that guide testers on what to look for and the expected result of the test. 
  5. Fill out your test scripts (see below) according to the results
    1. In the 'Status' column, 'N' indicates that a test needs to be done
    2. Once a test is complete, enter 'P' if it passes, or 'F' if issues are found. The counts of tests passed and failed will be updated automatically at the bottom of the scripts to keep track of progress.
    3. More information about the issues should be placed in the 'Tester Notes' column
    4. Additionally, screenshots can be provided by creating additional tabs and pasting them in with an explanation
    5. Note: The tests in bold are more of an overall check of the components of a section. This is a great spot to leave feedback on how these sections look and feel in UTG.
    6. Note: Red text within a script indicates a known issue with that information or that it is not testable by the testing group. If not testable, the implementation team will verify it.
  6. Submit the test script document back to [contact]
    1. Issues will be compiled and prioritized
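The pass/fail bookkeeping in step 5 can also be double-checked programmatically once a script tab is exported to rows. A minimal sketch using in-memory rows (the real test scripts are Excel workbooks, and the column names here are assumptions):

```python
# Each row mirrors one test line: the 'Status' column holds
# 'N' (not done), 'P' (pass), or 'F' (fail / issues found).
rows = [
    {"Test": "Code system title matches source", "Status": "P"},
    {"Test": "All codes present",                "Status": "F"},
    {"Test": "Definitions match source",         "Status": "N"},
]

def tally(rows):
    """Summarize statuses the way the spreadsheet footer counts do."""
    counts = {"P": 0, "F": 0, "N": 0}
    for row in rows:
        counts[row["Status"]] = counts.get(row["Status"], 0) + 1
    return counts

print(tally(rows))  # {'P': 1, 'F': 1, 'N': 1}
```

A tally of zero remaining 'N' entries is a quick signal that a tester's assigned script is complete.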

Test Scripts and Assignments

Please download a copy of your test scripts to guide and keep track of testing statuses (see Testing Instructions section above for more information about using the test scripts).

UTG Content Test Scripts.xlsx

UTG Content Test Scripts_Carmela.xlsx (V2 Confidentiality Code System and Value Set, V3 ActPriority Code System and Value Set, FHIR ObservationCategoryCodes CodeSystem)

UTG Content Test Scripts_Craig.xlsx (V2 CyclicEntryExitIndicator Code System and Value Set, FHIR AdverseEventCategory CodeSystem)

UTG Content Test Scripts_Susan.xlsx (V3 OrderableDrugForm Code System and Value Set, FHIR AdmitSource Code System)

UTG Content Test Scripts_Julie.xlsx (V3 RoleClass Code System and Value Set, FHIR ConsentCategoryCodes)

Reporting Content Issues

Any issues related to the content can be added directly in the 'UTG Content Test Scripts.xlsx'. Use the 'Tester Notes' column to document any issues found. 

Create a new tab to provide additional information or screenshots about the issue if required. 

Once complete, test scripts should be sent to [contact]. Jess (the PM) will compile the list of issues and summarize them below.

Test Results

Please see the testing report for information about the testing requirements, process and results: UTG Project Content Test Report.pdf
