
Publication Request

1. Published Name of the Standard for which request is being made

HL7 Informative Document: C-CDA Scorecard Rubric, Release 1 - US Realm

2. Standards Material/Document


3. Date of Request

Apr 30, 2020

4. Use Period

5. Reason for extension, timeline, and actions

6. Original Publication Date

7. End date of the current STU period

8. Length of the requested extension

9. Review Process

10. HL7 Work Group making this request and date

Structured Documents

10a. Requesting WG Date

Apr 30, 2020

11. URL of approval minutes

12. HL7 Product Management Group

CDA Management Group

12a. Management Group Date of Approval

May 06, 2020

13. URL of approval minutes

14. Is the artifact ready for final publication?


15. If not ready, please describe remaining steps.

16. Tool name used to produce the machine processable artifacts in the IG


17. The name of the “IG artifact” within the context of the above mentioned tool.


18. Balloted Name of the standard for which request is being made

HL7 Informative Document: C-CDA Scorecard Rubric, Release 1 - US Realm

19. Requested name for published standard

HL7 Informative Document: C-CDA Scorecard Rubric, Release 1 - US Realm

20. If CMET, list IDs balloted

21. Project Insight Number


22. Document Realm


23. Ballot cycle in which the document was successfully balloted

May 2019

24. Results of that ballot (following reconciliation activities):

(not needed for errata, STU extension, or unballoted STU update)

25. Affirmative


26. Negative


27. Abstentions


28. Not Returned


29. Total in ballot pool


30. Date on which final document/standards material was supplied to HQ

31. URL of publication material/ SVN repository

32. Publishing Facilitator

Matt Rahn

33. Special Publication Instructions

34. URL of ballot reconciliation document

35. Has the Work Group posted its consideration of all comments received in its reconciliation document on the ballot desktop?


36. Substantive Changes Since Last Ballot?

37. Product Brief Reviewed By

Structured Documents Workgroup

38. Date Product Brief Reviewed

Apr 30, 2020

39. Has the Product Brief changed?

Product Brief

40. Family


41. Section

Implementation Guides, Rules and References

42. Topic


43. Please Describe the Topic

C-CDA Data Quality

44. Product Type

Data Types, Document, Methodology, Specifications, Services, Standard, Reference Materials, Structures

45. Parent standard


46. Parent Standard Status


47. Update/replace standard

48. Common name/search keyword

Rubric, Rubric Criteria, C-CDA Scoring Rubric, Scoring

49. Description

This informative implementation guide contains rubric criteria created through an ongoing project in the HL7 Structured Documents Work Group (SDWG), originating in 2016. Throughout 2018, a group of HL7 members created a new set of rubrics to add to the original rubric. This project identified key problem areas in real system-generated C-CDA documents where similar data was consistently misrepresented or incomplete. Variably constructed data makes it impossible to share and compare data reliably, adversely impacting interoperability. The rubric criteria indicate where the implementation community, using the C-CDA standard, has agreed the constraints need to be tightened in order to support a greater degree of interoperability. The guide describes the rubrics in prose that facilitates the encoding of testable rules in tooling.
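As a rough illustration of how a prose rubric might be encoded as a testable rule, the sketch below checks a toy criterion against a C-CDA fragment. The criterion itself (day-level precision on observation effectiveTime), the element selection, and the scoring shape are illustrative assumptions, not rules taken from the published rubric.

```python
# Hypothetical sketch: encoding one rubric criterion as a testable rule.
# The criterion and scoring here are illustrative, not from the published rubric.
import xml.etree.ElementTree as ET

CDA_NS = {"cda": "urn:hl7-org:v3"}

def check_effective_time_precision(doc_xml: str) -> dict:
    """Toy rule: each observation effectiveTime should carry at least
    day-level precision (a value of the form YYYYMMDD, 8+ digits)."""
    root = ET.fromstring(doc_xml)
    times = root.findall(".//cda:observation/cda:effectiveTime", CDA_NS)
    checked = len(times)
    passed = sum(1 for t in times if len(t.get("value", "")) >= 8)
    return {"checked": checked, "passed": passed,
            "score": passed / checked if checked else None}

sample = """<ClinicalDocument xmlns="urn:hl7-org:v3">
  <observation><effectiveTime value="20200430"/></observation>
  <observation><effectiveTime value="2020"/></observation>
</ClinicalDocument>"""

result = check_effective_time_precision(sample)
```

A scorecard tool would run many such rules over a document and aggregate the per-rule scores; this sketch shows only the single-rule shape.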



These are categories of potential users, implementers, or other interested parties, such as those indicated on the Project Scope Statement under “Stakeholders/Vendors/Providers”. Select those that are applicable, or suggest others:

50. Stakeholders

Regulatory Agency, Payors

51. Vendors

EHR, PHR, Health Care IT, HIS

52. Providers

53. Benefits

The goal is to promote the expansion of nationwide interoperability by allowing providers and health IT developers to identify inconsistencies in data representation in C-CDA documents and proactively adopt tighter constraints that eliminate the variability, using these rubric criteria.
C-CDA quality assessment tooling built on these criteria will support the expansion of useful nationwide interoperability.

54. Implementations/Case Studies

ONC Scorecard

55. Development Background

ONC Scorecard