Dealing with brittleness in the design of expert systems for immunohematology


Immunohematology

American National Red Cross

Subject: Medical Laboratory Technology


ISSN: 0894-203X
eISSN: 1930-3955


Volume 12, Issue 3 (September 1996)


Stephanie A. Guerlain, Philip J. Smith, Jodi Heintz Obradovich, Sally Rudmann, Patricia L. Strohm, Jack W. Smith, John Svirbely

Citation Information: Immunohematology, Volume 12, Issue 3, Pages 101-107. DOI: https://doi.org/10.21307/immunohematology-2019-758

License: Transfer of Copyright

Published Online: 16-November-2020


ABSTRACT

In recent years, there has been increased discussion about the potential of expert systems to support medical decision-making tasks, including applications in clinical laboratory settings. This study provides data on the cognitive errors that technologists make in an important problem-solving task: the identification of antibodies in a patient’s blood. It explores alternative designs for expert systems intended to reduce such errors and evaluates the effects of these designs on users’ ability to stay effectively “in the loop,” applying their own expertise and judgment while using the computer as a tool to assist with their analyses. A pilot study involving 32 certified medical technologists compared two alternative roles for the computer: (1) automatically completing subtasks upon request, and (2) serving as a monitoring device that critiques technologists as they complete the analyses themselves. The design that automatically completed subtasks for the technologist induced a 29 percent increase in errors relative to the design that critiqued technologists as they completed the analyses themselves.
