Reliability of Dutch obstetric telephone triage

Bernice Engeltjes*, Ageeth Rosman, Loes C.M. Bertens, Eveline Wouters, Doug Cronie, Fedde Scheele

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review


Abstract

Background:
Safety and efficiency of emergency care can be optimized with a triage system that uses urgency levels to prioritize care. The Dutch Obstetric Telephone Triage System (DOTTS) was developed to provide a basis for assessing the urgency of unplanned obstetric care requests made by telephone. Reliability and validity are important components in evaluating such (obstetric) triage systems.

Objective:
To determine the reliability of the Dutch Obstetric Telephone Triage System by calculating its inter-rater and intra-rater reliability.

Methods:
To evaluate the urgency levels of DOTTS by testing inter-rater and intra-rater reliability, 90 vignettes of possible care requests were developed. Nineteen participants, recruited from hospitals where DOTTS had been implemented, each rated a set of ten vignettes in two rounds. The five urgency levels and five presenting symptoms were spread equally across the vignettes, and each vignette had to be rated in accordance with DOTTS. Urgency levels were dichotomized into high urgency and intermediate urgency. Inter-rater reliability was defined as the degree of agreement between two different participants rating the same vignette; intra-rater reliability was defined as the agreement of the same participant with their own earlier rating at a different moment in time. Both were tested using weighted Cohen's kappa and the intraclass correlation coefficient (ICC).
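The weighted kappa used above penalizes disagreements by how far apart the two ratings are, which suits ordinal urgency levels. As an illustrative sketch only (not the study's analysis code; the function name and numeric level labels are hypothetical), a linearly weighted Cohen's kappa for two raters could look like:

```python
from collections import Counter

# Five ordered urgency levels (numeric placeholders, not official DOTTS labels).
URGENCY_LEVELS = [1, 2, 3, 4, 5]

def weighted_kappa(rater_a, rater_b, categories=URGENCY_LEVELS):
    """Linearly weighted Cohen's kappa for two raters over ordinal categories.

    Disagreements are penalized proportionally to the distance between the
    chosen categories, so rating 1 vs 2 costs less than rating 1 vs 5.
    """
    n = len(rater_a)
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    # Linear penalty matrix: 0 on the diagonal, 1 at maximal distance.
    w = [[abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    # Observed weighted disagreement across paired ratings.
    obs = sum(w[idx[a]][idx[b]] for a, b in zip(rater_a, rater_b)) / n
    # Expected (chance) weighted disagreement from the marginal distributions.
    pa, pb = Counter(rater_a), Counter(rater_b)
    exp = sum(w[i][j] * pa[categories[i]] * pb[categories[j]] / n**2
              for i in range(k) for j in range(k))
    return 1 - obs / exp

# Example: two raters who disagree mildly on two of six vignettes.
print(weighted_kappa([1, 2, 3, 4, 5, 3], [1, 2, 3, 5, 5, 2]))
```

For two categories, as with the dichotomized high/intermediate urgency, the weighted and unweighted kappa coincide.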

Results:
The agreement of urgency level between participants and the predefined urgency level per vignette was 90.5% (95% CI 87.5–93.6) [335 of 370]. Agreement of urgency level between participants was 88.5% (95% CI 84.9–93.0) [177 of 200] and 84.9% (95% CI 78.3–91.4) [101 of 119] after re-rating. Inter-rater reliability of DOTTS expressed as Cohen's kappa was 0.77 and as ICC 0.87; intra-rater reliability expressed as Cohen's kappa was 0.70 and as ICC 0.82.
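As an illustration (not the authors' code; the function name is hypothetical), the reported point estimate can be reproduced and an approximate confidence interval obtained with a normal-approximation (Wald) interval for a proportion; small differences from the published bounds would reflect rounding or a different interval method:

```python
import math

def agreement_ci(agree, total, z=1.96):
    """Proportion of agreement with a normal-approximation (Wald) 95% CI."""
    p = agree / total
    se = math.sqrt(p * (1 - p) / total)
    return p, p - z * se, p + z * se

# 335 of 370 ratings agreed with the predefined urgency level.
p, lo, hi = agreement_ci(335, 370)
print(f"{p:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```

This yields roughly 90.5% with a CI close to the reported 87.5–93.6.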

Conclusion:
Inter-rater and intra-rater reliability of DOTTS showed substantial agreement, comparable to findings in other studies. Therefore, DOTTS is considered reliable.
Original language: English
Pages (from-to): 3247-3254
Journal: Risk Management and Healthcare Policy
Volume: 14
Publication status: Published - 2021

Keywords

  • obstetrics
  • triage system
  • inter-observer agreement
  • intra-observer agreement
  • undertriage and overtriage
