Please use this identifier to cite or link to this item: http://hdl.handle.net/11375/27287
Title: Establishing validity evidence for the use of VIMEDIX-AR automated metrics in assessment of FAST exam skills
Authors: Ward, Mellissa
Advisor: Engels, Paul
Department: Health Science Education
Keywords: simulation; health science education; ultrasound; automated metrics
Publication Date: 2021
Abstract: Introduction: Simulation has an increasing role in medical education, offering the ability to learn and practice in a safe environment. Ultrasound is a key tool for many clinicians; however, it requires significant experience to gain expertise. The most common way to gain that experience is through training courses with volunteers, where experts provide one-on-one teaching, which is time- and labour-intensive. Commercial ultrasound simulators are increasingly available with software capable of generating automated metrics. We sought validity evidence to support the use of automated metrics as a tool for assessing learners completing a Focused Assessment with Sonography in Trauma (FAST) exam.

Methods: Three groups with differing expertise were recruited: novices with no ultrasound training, intermediates who had completed a formal course within the previous six months, and experts with at least five years of clinical experience. All participants were recorded while completing a FAST exam, and automated metrics of time, path length, angular movement, and percent area viewed were obtained. The recordings were then scored with the Quality of Ultrasound Imaging and Competence (QUICk) tool by two expert assessors. Participants also completed ten find-fluid exercises, for which automated metrics were generated. Automated metrics from the recorded FAST exams and QUICk scores were compared across groups using Kruskal-Wallis tests to assess for differences by expertise. Correlations between QUICk scores and the automated metrics were assessed using Pearson's correlation coefficient, and the find-fluid exercises were analysed with repeated-measures one-way ANOVA models.

Results: Time, angular movement, and percent area viewed in the left upper quadrant (LUQ) differed significantly across groups, with novices requiring more time and angular movement, and viewing a higher percent area in the LUQ, than experts. QUICk scores were significantly higher for experts and intermediates than for novices. The overall and checklist QUICk scores did not correlate with any automated metric, although the individual components of positioning and handling, probe handling, and image scrolling were negatively correlated with percent area viewed LUQ. Overall, the QUICk tool could differentiate novices from both intermediates and experts when using the VIMEDIX-AR simulator, and several automated metrics could differentiate expertise. Further work should develop a composite score of automated metrics to assess learners.
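As a rough illustration of the analysis pipeline described in the Methods (group comparisons with Kruskal-Wallis tests, metric-to-score correlations with Pearson's r, and repeated-measures one-way ANOVA over the find-fluid trials), a minimal Python sketch using SciPy and statsmodels might look like the following. The file names and column names ("group", "time_s", "quick_total", "pct_area_luq", "participant", "trial") are hypothetical placeholders, not the thesis's actual data layout.

    # Minimal sketch of the statistical comparisons named in the abstract.
    # All file and column names below are assumptions for illustration only.
    import pandas as pd
    from scipy.stats import kruskal, pearsonr
    from statsmodels.stats.anova import AnovaRM

    df = pd.read_csv("fast_metrics.csv")  # one row per participant (hypothetical)

    # Kruskal-Wallis: does an automated metric differ across expertise groups
    # (novice / intermediate / expert)?
    groups = [g["time_s"].values for _, g in df.groupby("group")]
    h_stat, p_value = kruskal(*groups)
    print(f"Kruskal-Wallis on time: H = {h_stat:.2f}, p = {p_value:.3f}")

    # Pearson correlation: QUICk score vs. one automated metric.
    r, p = pearsonr(df["quick_total"], df["pct_area_luq"])
    print(f"QUICk vs. percent area viewed LUQ: r = {r:.2f}, p = {p:.3f}")

    # Repeated-measures one-way ANOVA across the ten find-fluid exercises
    # (long format: one row per participant per trial; hypothetical columns).
    long_df = pd.read_csv("find_fluid_long.csv")
    res = AnovaRM(long_df, depvar="time_s", subject="participant",
                  within=["trial"]).fit()
    print(res)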
URI: http://hdl.handle.net/11375/27287
Appears in Collections: Open Access Dissertations and Theses

Files in This Item:
File: Ward_Mellissa_202112_MSc.pdf
Description: Open Access
Size: 1.15 MB
Format: Adobe PDF


Items in MacSphere are protected by copyright, with all rights reserved, unless otherwise indicated.
