Perspectives on Credential Maintenance: Part 2

by NBRC Vice President of Examinations, Robert C. Shaw, Jr., PhD, RRT, FAARC

Part 1 of the ‘Perspectives on Credential Maintenance’ article series focused on how practitioners choose to either respond to Credential Maintenance Program (CMP) assessments or ignore them. Choosing to respond to CMP assessments indicates a practitioner who shows the highest regard for public protection. An employer with this same level of regard for public protection could choose to make participation in the CMP assessments an employment expectation. The purpose of this article is to review participation information that became available after CMP assessments went into full effect in January 2020. The NBRC first released CMP assessments as a pilot program in 2019.

Engagement Level

Notices about the release of CMP assessments were sent to about 82,500 practitioners whose credentials have an expiration date associated with them. Assessments are released in January, April, July, and October. As of this writing, the first quarter set of assessments had gone through a full three months of availability.


NBRC policy allows practitioners who do not complete an assessment during the current quarter to finish it by the last day of the next quarter, so the final participation level is not fully known but is still fairly clear in Figure 1. Two-thirds of those with expiring credentials chose not to participate in the assessments. Among the one-third of practitioners who engaged with assessments, most completed them.


Typical Response Load

The information in Table 1 will help with interpreting Figure 2. Most practitioners who started assessments responded to 10 items, which is consistent with an intention to maintain a CRT or RRT credential. It is possible that some of the practitioners who submitted 10 responses were focusing on two 5-item specialty assessments, but most of those described in the second bar of Figure 2 were maintaining CRT or RRT credentials. The next largest subgroup, shown in the third bar of Figure 2, submitted 15 responses; most of these practitioners were maintaining the CRT or RRT credential along with one specialty credential.

Table 1. Assessment Details

Figure 2. Number of Assessment Responses during the First Quarter for Each Practitioner

Potential to Decrease Continuing Education Credits

It may sound contrary to propose that documenting fewer continuing education credits is aligned with maintained competence. The rationale becomes clearer when the reduced number reflects evidence of higher performance on standardized formative assessments, and when assessment content covers topics that change rapidly and put the public at risk.

The Part 1 article described how each practitioner can access a color-coded dashboard. Practitioners can use the dashboard to do several things including the following:

  • Review the stem and options of each assessment item including the best response to which a panel of peers agreed.
  • Review whether a correct response was submitted to an item.
  • Learn the percentage of peers who selected the correct response to an item.
  • Review the explanation of each option.
  • View the reference citation on which the item approval panel relied so the content can be further studied.
  • Learn the number of correctly answered assessment items and the performance color zone that number falls under, as defined in the Part 1 article.
  • Anticipate the number of continuing education credits to be documented after 16 quarters.

Among those who responded to assessments, color zones observed after the first quarter indicate that 80% of the subgroup should anticipate a reduction in the number of continuing education credits to be documented. An assumption behind that statement is that each practitioner in the subgroup will tend to perform at the same level on future assessments.

Figure 3. Zone Results from First Quarter


It is fair to conclude that if more practitioners from the purple slice of Figure 1 were to participate in future assessments, 80% of this subgroup would see a reduction in the number of continuing education credits to be documented, assuming their abilities are the same as those of the subgroup who responded to assessments.

A final point pulled from the Part 1 article is that the act of responding to CMP assessments is more important than the number of assessment items answered correctly. Each assessment exposes respondents to ideas that are linked to rapidly changing underlying knowledge and high risk to the public. If larger proportions of practitioners can be encouraged to respond to assessments, then public protection will be the beneficiary. However, when two-thirds of the population choose to ignore assessments, help from employers in creating incentives for practitioners to participate would be welcome.