
Trauma Registry: How to build validation rules that improve data quality (and simplify back-end data validation)


Every trauma center must create a process for monitoring the validity of trauma registry data. In most centers, the focus is on retrospective review: periodically re-abstracting a sample of charts to check inter-rater reliability (IRR). In fact, the ACS Committee on Trauma highlights this approach in Resources for Optimal Care of the Injured Patient.

However, “after the fact” review is not the only data validation tool available to trauma program leaders — or the most effective one. This article explains how to build concurrent review into your data validation strategy. The key is to use validation rules to drive a real-time data quality process.

“Wait, we already use validation rules”

Yes, validation rules are nothing new. Extensive validation checks are already incorporated into every trauma registry system at several levels:

  • National: Every trauma registry platform incorporates edit checks from the National Trauma Data Bank (NTDB).
  • State: Most registry software systems also integrate state-specific validation rules based on state registry inclusion criteria.
  • Local: All trauma registry systems come with a wide range of prepackaged user prompts, and they also allow registry managers to create unique validation rules and edit checks.

However, these validation tools are limited in two ways.

First, while national, state and vendor-packaged rules provide an important layer of data validation, they do not cover all possible data quality problems. Every trauma program has specific data needs and struggles with a unique set of data challenges.


Second, many trauma programs do not use custom validation rules effectively. Validation rules are supposed to function as a data quality check, but in my experience they are often used as a registrar performance check.

Say a hospital’s trauma registry routinely misses key ED data fields, such as initial vital signs and GCS scores. Since the root issue is poor documentation, it seems unfair to registrars to continually flag these gaps. As a result, the validation rules are rewritten to give registrars a fighting chance to show 100% completion. However, this misses the original point of a validation rule — to identify opportunities to improve trauma data quality.

The bottom line is that most trauma programs have a significant opportunity to improve data quality using existing software capabilities. Following are several strategies for leveraging system prompts to provide strong concurrent data validation.

Create validation rules to target specific data problems

Every trauma registry struggles with specific data points. Start by running a series of data completeness reports to identify the data fields that cause consistent problems. When you identify a trouble spot, design a custom validation rule to help improve data quality.
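A completeness report of this kind can be sketched in a few lines. This is a minimal illustration, not registry software: the record layout, the field names (initial_sbp, gcs_total), and the set of "missing" values are all assumptions made for the example.

```python
# Values treated as missing for completeness purposes (assumed convention)
MISSING = {None, "", "Not Known/Not Recorded"}

def completeness_report(records, fields):
    """Return the fraction of records with a usable value for each field."""
    report = {}
    for field in fields:
        filled = sum(1 for r in records if r.get(field) not in MISSING)
        report[field] = filled / len(records) if records else 0.0
    return report

# Three hypothetical registry records with gaps in ED data fields
records = [
    {"initial_sbp": 128, "gcs_total": 15},
    {"initial_sbp": None, "gcs_total": 14},
    {"initial_sbp": 90, "gcs_total": None},
]
print(completeness_report(records, ["initial_sbp", "gcs_total"]))
```

Fields that score consistently low in a report like this are the natural candidates for a custom validation rule.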

For a simple example, let’s return to the issue noted above — routine gaps in ED data fields. One solution is to create a conditional (if/then) rule based on patient type:

If the patient is not a direct admit, then every ED field must contain an appropriate value.

This rule will not solve the problem completely. Registry leaders still need to address the documentation issues that are leading to data gaps. However, the associated system prompt will help ensure registrars spend time reviewing the chart to make sure ED data is as complete as possible.
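The if/then logic above can be expressed as a simple check. This is a sketch only: in practice the rule is built in the registry software's own rule editor, and the field names here (direct_admit, ed_sbp, and so on) are hypothetical.

```python
# ED fields the rule requires for non-direct-admit patients (assumed names)
ED_FIELDS = ["ed_sbp", "ed_pulse", "ed_resp_rate", "ed_gcs_total"]

def ed_fields_rule(record):
    """Return the ED fields that violate the rule (empty list = rule passes)."""
    if record.get("direct_admit"):
        return []  # direct admits bypass the ED, so the rule does not apply
    return [f for f in ED_FIELDS if record.get(f) in (None, "")]
```

A record with a blank ED GCS would return `["ed_gcs_total"]`, which the system would surface to the registrar as a prompt.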

Configure validation rules to improve NTDB submissions

According to the NTDS Data Dictionary, trauma centers should only submit data on procedures that were performed at the hospital. However, many trauma registries collect data on procedures performed by EMS personnel or staff at referring hospitals. These centers need to make sure their NTDB submission file does not include data on any prehospital or referring hospital procedures.

The solution is to create a conditional rule based on procedure time and location. For example:

If Procedure Date/Time is prior to Hospital Arrival Date/Time,
then Procedure Location must be PTA (Referring Hospital) or PTA (Prehospital).

In this example, system prompts will remind the registrar to include the location, which can be used by the software to exclude non-trauma center procedures from the NTDB data submission. This will result in higher quality data for both trauma center management and national benchmarking.
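The date/time comparison behind this rule looks roughly like the following. The location value strings are taken from the rule text above; the record structure is an assumption for illustration.

```python
from datetime import datetime

# PTA location values named in the rule above
PTA_LOCATIONS = {"PTA (Referring Hospital)", "PTA (Prehospital)"}

def procedure_location_rule(procedure, hospital_arrival):
    """Prompt when a pre-arrival procedure is not marked with a PTA location."""
    if procedure["datetime"] < hospital_arrival and procedure.get("location") not in PTA_LOCATIONS:
        return "Procedure predates hospital arrival: Procedure Location must be a PTA value"
    return None  # rule passes
```

Once the location is captured, the software can filter PTA procedures out of the NTDB submission file automatically.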

Here is another example: A data completeness report shows that patient sex is recorded as Not Applicable for a percentage of patients. Granted, this is a relatively small issue. On the other hand, patient sex data is critical to many research studies. In addition, the NTDB does not accept Not Applicable as a sex field value.

What can be done to improve the quality of this data? First, in the small number of cases where the patient’s sex is truly unavailable in the chart, the registrar should always select Not Known/Not Recorded. If your system allows it, the simplest approach is to inactivate the Not Applicable value for the patient sex field. Alternatively, create a validation rule that generates a system prompt:

The Sex field cannot contain Not Applicable and it cannot be blank.

True, the NTDB validator will catch improper values for this data field. However, depending on when your software applies the NTDB validation rules, that might not be until months later. Using system prompts to validate this field at the point of data entry is much more efficient. The registrar can correct the file immediately, rather than needing to go back later and reopen the record to make a change.
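Logically, this point-of-entry rule reduces to a single membership check. A minimal sketch, assuming a record dictionary with a hypothetical `sex` key:

```python
def sex_field_rule(record):
    """Prompt if Sex is blank or Not Applicable; Not Known/Not Recorded is allowed."""
    if record.get("sex") in (None, "", "Not Applicable"):
        return "Sex cannot be blank or Not Applicable; select Not Known/Not Recorded if unavailable"
    return None  # rule passes
```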

One more note about NTDB submissions: Some registry software systems automatically insert a different value into the data submission export if the collected value is not accepted by the NTDB. In the patient sex example, a system may replace Not Applicable with Not Known/Not Recorded. As a result, the export will pass the validation check but the bad data will still exist in the hospital’s record.

Use validation rules to support specific PI goals

Trauma registry validation rules can also help ensure the success of a performance improvement (PI) initiative. For instance, a trauma program might want to improve timely antibiotic administration for fracture patients. The NTDB collects data on 24-hour antibiotic administration, but a hospital’s PI initiative is targeting a 12-hour benchmark. The registry manager could create the following rule:

If the patient has open fracture(s), then IV Antibiotic Therapy Within 12 Hours cannot be null.

This rule helps make sure the registry captures key data points to support an important PI initiative.
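As a sketch, the PI rule is a null check gated on the open-fracture flag. Field names (open_fracture, iv_antibiotics_within_12h) are assumptions standing in for whatever custom fields the registry manager defines.

```python
def open_fracture_antibiotic_rule(record):
    """Prompt when an open-fracture patient has no value for the 12-hour antibiotic field."""
    if record.get("open_fracture") and record.get("iv_antibiotics_within_12h") is None:
        return "IV Antibiotic Therapy Within 12 Hours cannot be null for open-fracture patients"
    return None  # rule passes or does not apply
```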

Use validation rules to improve data completeness

Data completeness depends on documentation quality. However, there are many opportunities to use validation rules to make sure registrars capture as much available data as possible.

One strategy is to build a series of validation rules that link related data fields. For example, transfer delays can have a big impact on patient outcomes. Unfortunately, registries often fail to capture the reason for delayed transfer. Here is one way to address this problem with a string of validation rules:

If the ED length of stay is greater than 120 minutes,
then the Transfer Delay field must equal Yes or No.

If the Transfer Delay field equals Yes,
then the Reason for Transfer Delay field cannot be blank.

If the Reason for Transfer Delay field equals EMS,
then the Detailed Reason for EMS Transfer Delay field must contain
No ALS units available or No air transport units available
or No hospital staff available to accompany BLS unit.
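The three chained rules above can be sketched as one function that returns every prompt the record triggers. The field names and the ED length-of-stay representation (minutes) are assumptions; the approved EMS delay reasons are taken from the rule text.

```python
# Approved values for the detailed EMS delay reason (from the rule above)
EMS_DELAY_REASONS = {
    "No ALS units available",
    "No air transport units available",
    "No hospital staff available to accompany BLS unit",
}

def transfer_delay_rules(record):
    """Apply the three chained rules; return a list of prompt messages."""
    prompts = []
    if record.get("ed_los_minutes", 0) > 120 and record.get("transfer_delay") not in ("Yes", "No"):
        prompts.append("ED LOS > 120 min: Transfer Delay must be Yes or No")
    if record.get("transfer_delay") == "Yes" and not record.get("delay_reason"):
        prompts.append("Transfer Delay is Yes: Reason for Transfer Delay cannot be blank")
    if record.get("delay_reason") == "EMS" and record.get("ems_delay_detail") not in EMS_DELAY_REASONS:
        prompts.append("EMS delay: Detailed Reason must be one of the approved values")
    return prompts
```

Because each rule fires off the answer to the previous one, the registrar is walked step by step from the long ED stay all the way down to a specific, reportable delay reason.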

Multiple benefits for data quality

Validation rules and system prompts do not replace periodic IRR review and other retrospective approaches to data quality. However, by helping to ensure data quality upstream, this approach can help reduce downstream data validation work for registry managers and others.

In addition, validation rules provide registrars with immediate feedback, so they are a great tool for building the skills of a registry team.

Finally, strong data validation rules can help clarify that certain data problems are not registrar performance issues, they are system issues rooted in poor documentation and other problems. This will help program leaders identify true opportunities for improvement in data quality.

Craig Rees is product manager at ImageTrend.

ImageTrend Patient Registry is a web-based, multi-discipline registry solution that collects and analyzes trauma, stroke and cardiac data. Dynamic forms and configurable data validation tools allow you to gather information more efficiently, and advanced reporting and analysis tools help transform data into knowledge. State-of-the-art integrations with EMS systems and the hospital EMR enable increased collaboration across the continuum of care. The system’s scalable architecture and relational database make it perfect for stand-alone hospitals, large healthcare networks, and regional care systems. Find out more at ImageTrend Patient Registry.
