A Model for Types and Levels of Human Interaction with Automation

Raja Parasuraman, Thomas B. Sheridan, Fellow, IEEE, and Christopher D. Wickens
Abstract—Technical developments in computer hardware and software now make it possible to introduce automation into virtually all aspects of human-machine systems. Given these technical capabilities, which system functions should be automated and to what extent? We outline a model for types and levels of automation that provides a framework and an objective basis for making such choices. Appropriate selection is important because automation does not merely supplant but changes human activity and can impose new coordination demands on the human operator. We propose that automation can be applied to four broad classes of functions: 1) information acquisition; 2) information analysis; 3) decision and action selection; and 4) action implementation. Within each of these types, automation can be applied across a continuum of levels from low to high, i.e., from fully manual to fully automatic. A particular system can involve automation of all four types at different levels. The human performance consequences of particular types and levels of automation constitute primary evaluative criteria for automation design using our model. Secondary evaluative criteria include automation reliability and the costs of decision/action consequences, among others. Examples of recommended types and levels of automation are provided to illustrate the application of the model to automation design.
Index Terms—Automation, cognitive engineering, function allocation, human-computer interaction, human factors, human-machine systems, interface design.
I. INTRODUCTION
CONSIDER the following design problem. A human operator of a complex system provided with a large number of dynamic information sources must reach a decision relevant to achieving a system goal efficiently and safely. Examples include an anesthesiologist given various vital signs who must decide whether to increase the dosage of a drug to a patient undergoing surgery; an air defense operator given various sensor readings who has to decide whether to shoot down a potentially hostile enemy aircraft; or a securities analyst given various financial data who must judge whether to buy a large block of stocks. Technical developments in computer hardware and software make it possible to automate many aspects of the system, i.e., to have a computer carry out certain functions that the human operator would normally perform.
Manuscript received January 26, 1999; revised February 7, 2000. This work was supported by grants from NASA Ames Research Center, Moffett Field, CA (NAG-2-1096) and NASA Goddard Space Research Center, MD (NAG5-8761) to R.P., and from NASA Ames Research Center to T.B.S. (NAG-2-729). This paper was recommended by Associate Editor R. Rada.
R. Parasuraman is with the Cognitive Science Laboratory, The Catholic University of America, Washington, DC 200 USA.
T. B. Sheridan is with the Massachusetts Institute of Technology, Cambridge, MA 02165 USA.
C. D. Wickens is with the University of Illinois, Champaign, IL 61820 USA.
Publisher Item Identifier S 1083-4427(00)03579-7.
The automation can differ in type and complexity, from simply organizing the information sources, to integrating them in some summary fashion, to suggesting decision options that best match the incoming information, or even to carrying out the necessary action. The system design issue is this: given these technical capabilities, which system functions should be automated and to what extent? These fundamental questions increasingly drive the design of many new systems. In this paper we outline a model of human interaction with automation that provides a framework for answers to these questions. The human performance consequences of specific types and levels of automation constitute the primary evaluative criteria for automation design using the model. Secondary evaluative criteria include automation reliability and the costs of action consequences. (Both these sets of criteria are described more fully later in this paper). Such a combined approach—distinguishing types and levels of automation and applying evaluative criteria—can allow the designer to determine what should be automated in a particular system. Because the impact of the evaluative criteria may differ between systems, the appropriate types and levels of automation for different systems can vary widely. Our model does not therefore prescribe what should and should not be automated in a particular system. Nevertheless, application of the model provides a more complete and objective basis for automation design than do approaches based purely on technological capability or economic considerations.
II. AUTOMATION
Machines, especially computers, are now capable of carrying out many functions that at one time could only be performed by humans. Machine execution of such functions—or automation—has also been extended to functions that humans do not wish to perform, or cannot perform as accurately or reliably as machines. Technical issues—how particular functions are automated, and the characteristics of the associated sensors, controls, and software—are major concerns in the development of automated systems. This is perhaps not surprising given the sophistication and ingenuity of design of many such systems (e.g., the automatic landing of a jumbo jet, or the docking of two spacecraft). The economic benefits that automation can provide, or are perceived to offer, also tend to focus public attention on the technical capabilities of automation.
In contrast to the voluminous technical literature on automation, there is a small but growing research base examining the human capabilities involved in work with automated systems [1]–[8]. This work has shown clearly that automation does not simply supplant human activity but rather changes it, often in
ways unintended and unanticipated by the designers of automation [8], and as a result poses new coordination demands on the human operator [7]. Until recently, however, these findings have not had much visibility or impact in engineering and design circles. Examination of human performance issues is especially important because modern technical capabilities now force system designers to consider some hard choices regarding what to automate and to what extent, given that there is little that cannot be automated. In the present paper we propose a model for types and levels of automation that provides a framework and an objective basis for making such choices. Our approach was guided by the concept of "human-centered automation" [9] and by a previous analysis of automation in air traffic control (ATC) [10].1
Let us begin by defining automation, because the term has been used in many different ways. The Oxford English Dictionary (19) defines automation as
1) automatic control of the manufacture of a product through a number of successive stages;
2) the application of automatic control to any branch of industry or science;
3) by extension, the use of electronic or mechanical devices to replace human labor.
The original use of the term implies automatic control (automatic having many alternative definitions suggesting reflexive action, spontaneity, and independence of outside sources). Automatic control can be open loop as well as closed loop, and can refer to electronic as well as mechanical action. Automation does not simply refer to modernization or technological innovation. For example, updating a computer with a more powerful system does not necessarily constitute automation, nor does the replacement of electrical cables with fiber optics. The present paper is concerned with human performance in automated systems. We therefore use a definition that emphasizes human-machine comparison and define automation as a device or system that accomplishes (partially or fully) a function that was previously, or conceivably could be, carried out (partially or fully) by a human operator [8].
III. A MODEL FOR TYPES AND LEVELS OF AUTOMATION

In our definition, automation refers to the full or partial replacement of a function previously carried out by the human operator. This implies that automation is not all or none, but can vary across a continuum of levels, from the lowest level of fully manual performance to the highest level of full automation. Several levels between these two extremes have been proposed [11], [12]. Table I shows a 10-point scale, with higher levels representing increased autonomy of computer over human action [10], based on a previously proposed scale [11]. For example, at a low level 2, several options are provided to the human, but the system has no further say in which decision is chosen.
1 In principle, our approach does not exclude the possibility of full automation, without any human operator involvement. This might suggest that our model is not needed if total automation is technically feasible. As we discuss later, however, full automation does not necessarily eliminate a human role in automated systems [8].
TABLE I
LEVELS OF AUTOMATION OF DECISION AND ACTION SELECTION
Fig. 1. Simple four-stage model of human information processing.
At level 4, the computer suggests one decision alternative, but the human retains authority for executing that alternative or choosing another one. At a higher level 6, the system gives the human only a limited time for a veto before carrying out the decision choice. Automated systems can operate at specific levels within this continuum. For example, a conflict detection and resolution system that notifies an air traffic controller of a conflict in the flight paths of two aircraft and suggests a resolution would qualify as level 4 automation. Under level 6 or higher, the system would automatically execute its own resolution advisory, unless the controller intervened.
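To make the scale concrete, the following sketch (in Python; the function names and the 30-s veto window are hypothetical) shows how a conflict detection and resolution aid might behave at the levels just described. It covers only levels 2, 4, 6, and 10 as characterized in the text and is an illustration of the idea, not a rendering of Table I.

    # Illustrative sketch only: a hypothetical conflict-resolution aid behaving at a
    # few levels of decision and action selection automation as described in the text.
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Advisory:
        options: List[str]               # candidate resolutions shown to the controller
        suggestion: Optional[str]        # the single alternative the system recommends
        veto_window_s: Optional[float]   # time allowed for a human veto before execution
        executes: bool                   # whether the system acts on its own advisory

    def conflict_aid(level: int, resolutions: List[str]) -> Advisory:
        """Behavior of a hypothetical aid at selected levels of automation."""
        if level == 2:   # several options offered; the human chooses
            return Advisory(resolutions, None, None, False)
        if level == 4:   # one alternative suggested; the human retains authority
            return Advisory(resolutions, resolutions[0], None, False)
        if level == 6:   # executes its own advisory unless vetoed within a limited time
            return Advisory(resolutions, resolutions[0], 30.0, True)
        if level == 10:  # full automation: acts without consulting the human
            return Advisory([], resolutions[0], None, True)
        raise ValueError("level not modeled in this sketch")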
In the proposed model we extend Table I to cover automation of different types of functions in a human-machine system. The scale in Table I refers mainly to automation of decision and action selection, or output functions of a system. However, automation may also be applied to input functions, i.e., to functions that precede decision making and action. In the expansion of the model, we adopt a simple four-stage view of human information processing (see Fig. 1).
The first stage refers to the acquisition and registration of multiple sources of information. This stage includes the positioning and orienting of sensory receptors, sensory processing, initial pre-processing of data prior to full perception, and selective attention. The second stage involves conscious perception, and manipulation of processed and retrieved information in working memory [13]. This stage also includes cognitive operations such as rehearsal, integration and inference, but these operations occur prior to the point of decision. The third stage is where decisions are reached based on such cognitive processing. The fourth and final stage involves the implementation of a response or action consistent with the decision choice.
This four-stage model is almost certainly a gross simplification of the many components of human information processing as discovered by information processing and cognitive psychologists [14]. The performance of most tasks involves inter-dependent stages that overlap temporally in their processing operations [15].
Fig. 2. Levels of automation for independent functions of information acquisition, information analysis, decision selection, and action implementation. Examples of systems with different levels of automation across functional dimensions are also shown.
The stages can also be considered to be coordinated together in "perception-action" cycles [16] rather than in a strict serial sequence from stimulus to response. Our goal is not to debate the theoretical structure of the human cognitive system but to propose a structure that is useful in practice. In this respect, the conceptualization shown in Fig. 1 provides a simple starting point with surprisingly far-reaching implications for automation design. Similar conceptual models have been found to be useful in deriving human factors recommendations for designing systems in general [17].
The four-stage model of human information processing has its equivalent in system functions that can be automated. Accordingly, we propose that automation can be applied to four classes of functions (see also [18] and related proposals in [9] and [19]): 1) information acquisition; 2) information analysis; 3) decision and action selection; and 4) action implementation.
Each of these functions can be automated to differing degrees, or many levels. The multiple levels of automation of decision making as shown in Table I can be applied, with some modification, to the information acquisition, information analysis, and action implementation stages as well, although the number of levels will differ between the stages. Fig. 2 provides a schematic of our model of types and levels of automation. As a convenient shorthand, we refer to the four types as acquisition, analysis, decision, and action automation. We also occasionally refer jointly to acquisition and analysis automation as information automation.
A particular system can involve automation of all four dimensions at different levels. Thus, for example, a given system (A) could be designed to have moderate to high acquisition automation, low analysis automation, low decision automation, and low action automation. Another system (B), on the other hand, might have high levels of automation across all four dimensions.
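One minimal way to record such a design choice, using hypothetical names and a 1–10 scale for every type, is a profile that assigns a level to each of the four functions; the sketch below encodes systems A and B as just described.

    # Illustrative sketch: a profile assigns a level of automation to each of the
    # four functional types. Names and the numeric scale are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class AutomationProfile:
        acquisition: int   # information acquisition automation
        analysis: int      # information analysis automation
        decision: int      # decision and action selection automation
        action: int        # action implementation automation

    # System A: moderate to high acquisition automation, low levels elsewhere.
    system_a = AutomationProfile(acquisition=7, analysis=2, decision=2, action=2)
    # System B: high levels of automation across all four dimensions.
    system_b = AutomationProfile(acquisition=9, analysis=9, decision=9, action=9)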
A. Acquisition Automation
Automation of information acquisition applies to the sensing and registration of input data. These operations are equivalent to the first human information processing stage, supporting human sensory processes. At the lowest level, such automation may consist of strategies for mechanically moving sensors in order to scan and observe. For example, the radars used in commercial ATC acquire information on aircraft by scanning the sky in a fixed pattern, but in military ATC the radars may "lock on" as a function of detected targets. Artificial visual and haptic sensors could also be used with an industrial robot to allow it to find and grasp an object, thereby providing information about that object. Moderate levels of automation at this stage may involve organization of incoming information according to some criteria, e.g., a priority list, and highlighting of some part of the information. For example, "electronic flight strips" for air traffic controllers could list aircraft in terms of priority for handling; and the electronic data block showing aircraft on the controller's radar display (which itself represents an earlier form of acquisition automation) could be highlighted to indicate a potential problem with a particular aircraft. Note that both organization and highlighting preserve the visibility of the original information ("raw" data). This is not necessarily the case with a more complex operation at this stage of automation, filtering, in which certain items of information are exclusively selected and brought to the operator's attention. Highlighting and filtering can lead to differing human performance consequences, as described in a later section in a discussion of automation reliability.
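The difference between highlighting and filtering can be sketched as follows (hypothetical call signs and priority test; a sketch, not a description of any fielded display): highlighting marks items while keeping all of the raw data visible, whereas filtering returns only the selected items.

    # Illustrative sketch of two acquisition-automation operations.
    from typing import Callable, List, Tuple

    def highlight(items: List[str], is_priority: Callable[[str], bool]) -> List[Tuple[str, bool]]:
        """Return every item, tagged as highlighted or not; raw data remains visible."""
        return [(item, is_priority(item)) for item in items]

    def filter_only(items: List[str], is_priority: Callable[[str], bool]) -> List[str]:
        """Return only the selected items; the remaining raw data is no longer shown."""
        return [item for item in items if is_priority(item)]

    aircraft = ["UAL123", "DAL456", "AAL789"]            # hypothetical call signs
    in_conflict = lambda callsign: callsign == "DAL456"  # hypothetical priority test
    print(highlight(aircraft, in_conflict))    # all three aircraft, one marked
    print(filter_only(aircraft, in_conflict))  # only the marked aircraft remains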
B. Analysis Automation

Automation of information analysis involves cognitive functions such as working memory and inferential processes. At a low level, algorithms can be applied to incoming data to allow for their extrapolation over time, or prediction. For example, predictor displays have been developed for the cockpit that show the projected future course of another aircraft in the neighboring airspace [20], [21]. Trend displays have also been developed for use in process control (e.g., nuclear power plants), in which a model of the process is developed and used to show both the current and the anticipated future state of the plant [22]. A higher level of automation at this stage involves integration, in which several input variables are combined into a single value. One example is to use a display with an emergent perceptual feature such as a polygon against a background of lines [23]. Another example of information analysis automation in ATC is the converging runway display aid (CRDA), which eliminates the need for the controller to mentally project the approach path of one aircraft onto that of another landing on a converging runway [24]. In both these examples, information integration serves the purpose of augmenting human operator perception and cognition. More complex forms of analysis automation include "information managers" that provide context-dependent summaries of data to the user [45].
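As a toy instance of the lowest level of analysis automation, the sketch below extrapolates an aircraft's recent reported positions linearly to estimate where it will be a short time ahead; the numbers are made up, and real predictor and trend displays use far richer models.

    # Illustrative sketch: linear extrapolation of a track, the simplest kind of
    # prediction an information-analysis function might perform.
    from typing import List, Tuple

    def predict_position(track: List[Tuple[float, float, float]],
                         lookahead_s: float) -> Tuple[float, float]:
        """Extrapolate the last two (t, x, y) samples forward by lookahead_s seconds."""
        (t0, x0, y0), (t1, x1, y1) = track[-2], track[-1]
        dt = t1 - t0
        vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
        return x1 + vx * lookahead_s, y1 + vy * lookahead_s

    track = [(0.0, 10.0, 5.0), (10.0, 12.0, 6.0)]  # two position reports (t, x, y)
    print(predict_position(track, 60.0))           # projected position 60 s ahead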
C. Decision Automation

The third stage, decision and action selection, involves selection from among decision alternatives. Automation of this
stage involves varying levels of augmentation or replacement of human selection of decision options with machine decision making, as described previously in Table I. For example, expert systems are designed with conditional logic (i.e., production rules) to prescribe a specific decision choice if particular conditions exist [25]. Examples can be found in medicine [26], military command and control [27], and in route planning for pilots to avoid bad weather [28]. As with the analogous decision-making stage in human performance, such systems depart from those involved in inference (analysis automation) because they must make explicit or implicit assumptions about the costs and values of different possible outcomes of the decision process, and the nature of these outcomes is uncertain in a probabilistic world. The different levels of automation at this stage are best defined by the original taxonomy proposed by Sheridan [11] and shown in Table I, which defines a continuum that progresses from systems that recommend courses of action, to those that execute those courses. For example, in comparing proposed and existing designs for decision automation in avoiding aircraft–ground collisions, the current ground proximity warning system (GPWS) is positioned at level 4, in which a single maneuver is recommended, but the pilot can choose to ignore it. But a proposed automatic ground collision avoidance (autoGCAS) system for combat aircraft is defined at level 7, in which automation will automatically take control if the pilot does not [29].
D. Action Automation

The final stage of action implementation refers to the actual execution of the action choice. Automation of this stage involves different levels of machine execution of the choice of action, and typically replaces the hand or voice of the human. Different levels of action automation may be defined by the relative amount of manual versus automatic activity in executing the response. For example, in a photocopier, manual sorting, automatic sorting, automatic collation, and automatic stapling represent different levels of action automation that can be chosen by the user. A somewhat more complex example from ATC is the automated "handoff," in which transfer of control of an aircraft from one airspace sector to another is carried out automatically via a single keypress, once the decision has been made by the controller. On the flight deck, systems are also being considered in which a flight plan, uplinked from the ground, can be "autoloaded" into the plane's flight management computer by a single keypress, rather than through more time-consuming manual data entry [30]–[32]. Finally, action automation includes "agents" that track user interaction with a computer and execute certain sub-tasks automatically in a contextually appropriate manner [45].
E. Adaptive Automation

Levels of automation across any of these functional types need not be fixed at the system design stage. Instead, the level (and perhaps even the type) of automation could be designed to vary depending on situational demands during operational use. Context-dependent automation is known as adaptive automation [33]–[35]. Two examples will illustrate the concept. In an air defense system, the beginning of a "pop-up" weapon delivery sequence could lead to the automation at a high level of all aircraft
defensive measures [36]. The automation is adaptive because if this critical event does not occur, the automation is not invoked or is set at a low level. In another example, the decision to continue or abort an aircraft takeoff following an engine malfunction might be automated at either a low or a high level depending upon the time criticality of the situation (e.g., how close the aircraft is to the critical speed V1 for takeoff) [37]. Considerable empirical research on adaptive automation has been reported in recent years [38]–[44]. However, we do not describe this work because it raises several complex ancillary issues, the discussion of which would take us far afield from the primary purpose of this paper.
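A bare-bones reading of the takeoff example, with hypothetical thresholds and level numbers, is sketched below: the level of decision automation is chosen from the time criticality of the situation rather than fixed at design time.

    # Illustrative sketch of adaptive automation: the level of decision automation
    # depends on how close the aircraft is to V1. Thresholds and levels are hypothetical.
    def abort_decision_level(current_speed_kts: float, v1_kts: float) -> int:
        margin = v1_kts - current_speed_kts
        if margin <= 10.0:    # highly time-critical: automation decides and acts
            return 7
        if margin <= 40.0:    # moderately critical: automation suggests one alternative
            return 4
        return 1              # ample time: leave the decision to the crew

    print(abort_decision_level(current_speed_kts=135.0, v1_kts=140.0))  # -> 7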
IV. A FRAMEWORK FOR AUTOMATION DESIGN
The model we have outlined provides a framework for examining automation design issues for specific systems. How can the framework be used? We propose a series of steps and an iterative procedure that can be captured in a flowchart (see Fig. 3). The first step is to realize that automation is not all-or-none but can vary by type. One can ask whether automation should be applied to information acquisition, information analysis, decision selection, or to action implementation. Automation of one class of function (e.g., information analysis), of different combinations of functions, or of all four functional domains, can be entertained.
At a subsequent stage of design, one can ask what level of automation should be applied within each functional domain. There is probably no simple answer to this question, and trade-offs between anticipated benefits and costs are likely. However, the four-dimensional model we have proposed can provide a guiding framework. As shown in Fig. 3, multiple levels of automation can be considered for each type of automation. We propose that any particular level of automation should be evaluated by examining its associated human performance consequences. These constitute primary evaluative criteria for levels of automation. However, human performance is not the only important factor. Secondary evaluative criteria include automation reliability and the costs of decision/action consequences.2 These should also be applied to evaluate the feasibility and appropriateness of particular levels of automation. We envisage the application of these criteria and their evaluation as constituting a recursive process (see Fig. 3) that could be made part of an iterative design procedure. We emphasize, however, that the model should not be treated as a static formula or as a prescription that decrees a particular type or level of automation. Rather, when considered in combination with the primary and secondary evaluative criteria we have described, the model can provide principled guidelines for automation design.
We provide examples where, following consideration of these evaluative criteria, particular levels of automation are recommended for each of the four types or stages of automation. Such recommendations refer to the appropriate upper bound on the level of automation, i.e., the maximum, but not necessarily the required level.
2 This is not an exhaustive list of criteria. Others that are important include ease of system integration, efficiency/safety tradeoffs, manufacturing and operating costs, and liability issues.
Fig. 3. Flowchart showing application of the model of types and levels of automation. For each type of automation (acquisition, analysis, decision, and action), a level of automation between low (manual) and high (full automation) is chosen. This level is then evaluated by applying the primary evaluative criteria of human performance consequences, and adjusted if necessary, in an iterative manner as shown. Secondary evaluative criteria are then also iteratively applied to adjust the level of automation. The process is then repeated for all four types of automation.
In other words, we recommend that automation could be designed to go as high as that particular level, but no further. But the designer could choose a level lower than this maximum if necessary, particularly after considering evaluative criteria other than the ones we discuss (e.g., ease of system integration, or cost). The lower bound on the level of automation can also be determined by applying the same evaluative criteria. Acceptable system performance may require a certain minimal level of automation.
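Read as pseudocode, the iterative procedure of Fig. 3 amounts to the loop sketched below; the two evaluation callables are placeholders for the primary and secondary criteria, and lowering the level is only one possible form of the "adjust" step, so this is a hypothetical rendering rather than a prescribed algorithm.

    # Illustrative sketch of the iterative design loop of Fig. 3. All names are
    # hypothetical; the evaluators stand in for the primary (human performance)
    # and secondary (reliability, outcome cost) criteria.
    from typing import Callable, Dict

    TYPES = ["acquisition", "analysis", "decision", "action"]

    def design_levels(initial: Dict[str, int],
                      primary_ok: Callable[[str, int], bool],
                      secondary_ok: Callable[[str, int], bool],
                      max_iterations: int = 20) -> Dict[str, int]:
        """For each type of automation, adjust the level until both sets of
        evaluative criteria are satisfied (or the lowest level is reached)."""
        levels = dict(initial)
        for auto_type in TYPES:
            for _ in range(max_iterations):
                level = levels[auto_type]
                if primary_ok(auto_type, level) and secondary_ok(auto_type, level):
                    break
                levels[auto_type] = max(1, level - 1)  # one possible adjustment
        return levels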
A. Human Performance Consequences: Primary Evaluative Criteria for Automation Design
An important consideration in deciding upon the type and level of automation in any system design is the evaluation of the consequences for human operator performance in the resulting system (i.e., after automation has been implemented). As shown in Fig. 3, particular types and levels of automation are evaluated by examining their associated human performance consequences. To take a hypothetical example, suppose prior research has shown (or modeling predicts) that compared to manual operation, both human and system performance are enhanced by
level 4 automation but degraded by automation above level 6. Application of our framework would determine the lower and upper bounds of automation to be 4 and 6, respectively. This initial specification would then be evaluated again with respect to the secondary evaluative criteria, in an iterative manner, and a final choice of level within this range could be made (see Fig. 3).

Over the past two decades, researchers have examined a number of different aspects of human interaction with automated systems. This research, which has included theoretical analyses, laboratory experiments, simulation and modeling, field studies, and analyses of real-world incidents and accidents, has found that automation can have both beneficial and negative effects on human performance [1]–[10], [45]–[48]. We briefly discuss four human performance areas: mental workload, situation awareness, complacency, and skill degradation.

1) Mental Workload: The evidence suggests that well-designed information automation can change human operator mental workload to a level that is appropriate for the system tasks to be performed. At the simplest level, organizing information sources, e.g., in a priority list, will help the operator in picking the information relevant to a decision. Data summaries can also help by eliminating time-consuming search or communication operations. As mentioned previously, the electronic data block on the air traffic controller's radar display replaces the need for the controller to communicate with pilots to determine the aircraft position and altitude. Other information automation operations that are beneficial include highlighting, and integration, in which different information sources are collated and presented together [10]. Cockpit predictor displays have also shown that pilot workload decreases and hazard detection performance improves with the addition of predictive information concerning the flight path of neighboring aircraft [21]. Data transformation, for example graphic presentation of information, can also be beneficial. Transformation and integration of raw data into a form (graphical or otherwise) that matches the operator's representation of system operations has been found to be a useful design principle [49]. A good example is the horizontal situation indicator in the cockpit, which provides the pilot with a graphic display of the projected flight plan and the current position of the aircraft. This, more than any other automated system in the cockpit, has been credited with reducing the workload of the pilot [50].
These results should not be construed to mean that automation always results in balanced operator workload. Instances of automation increasing workload have also been found [8], [50]. These mostly involve systems in which the automation is difficult to initiate and engage, thus increasing both cognitive workload [51] and, if extensive data entry is required, the physical workload of the operator. Such systems have been referred to as implementing "clumsy" automation [50]. In general, the effect of automation on mental workload has been mirrored by the similarly mixed record of automation in improving human productivity and efficiency [52].
In addition to unbalanced mental workload, other human performance costs have been linked to particular forms of automation. We briefly consider three such costs.
2) Situation Awareness: First, automation of decision-making functions may reduce the operator's awareness
of the system and of certain dynamic features of the work environment. Humans tend to be less aware of changes in environmental or system states when those changes are under the control of another agent (whether that agent is automation or another human) than when they make the changes themselves [53]–[56]. Also, if a decision aid, expert system, or other type of decision automation consistently and repeatedly selects and executes decision choices in a dynamic environment, the human operator may not be able to sustain a good "picture" of the information sources in the environment because he or she is not actively engaged in evaluating the information sources leading to a decision. This might occur in systems where operators act as passive decision-makers monitoring a process to determine when to intervene so as to prevent errors or incidents [53]. Note that such a cost may occur even as the use of automation of information analysis, e.g., data integration, may improve the operator's situation awareness.
3) Complacency: Second, if automation is highly but not perfectly reliable in executing decision choices, then the operator may not monitor the automation and its information sources and hence fail to detect the occasional times when the automation fails [57], [58]. This effect of over-trust or "complacency" is greatest when the operator is engaged in multiple tasks and less apparent when monitoring the automated system is the only task that the operator has to perform [58]. The complacency effect in monitoring has recently been modeled using a connectionist architecture [59]: the analysis suggested that complacency reflects differential learning mechanisms for monitoring under manual control and automation.
Automation of information analysis can also lead to complacency if the algorithms underlying filtering, prediction, or integration operations are reliable but not perfectly so. A recent study of a simulated air-ground targeting task [60] found that a cue that incorrectly directed attention away from the target led to poorer detection performance even though pilots were informed that the cue was not perfectly reliable. Automated cueing (attention guidance) can lead operators to pay less attention to uncued areas of a display than is appropriate [61]. Thus complacency-like effects can also be obtained even if automation is applied to information acquisition and analysis and not just to decision-making. It is not known, however, whether such effects of unreliable automation apply equally strongly to all stages of information processing. There is some evidence to indicate that although complacency can occur with both information automation and decision automation, its effects on performance are greater with the latter. In a study of decision aiding, both forms of automation benefited performance equally when the automation was perfectly reliable [62]. When the automation was unreliable, however, performance suffered much more when unreliable recommendations were given by decision automation than when only incorrect status information was provided by information automation. This study, however, is the only one to date that has directly compared the effects of automation unreliability at different stages of automation. The issue of whether automation unreliability has similar negative effects for all four stages of automation in our model needs further examination.

4) Skill Degradation: Third, if the decision-making function is consistently performed by automation, there will come a time
when the human operator will not be as skilled in performing that function. There is a large body of research in cognitive psychology documenting that forgetting and skill decay occur with disuse [63]. Degradation of cognitive skills may be particularly important following automation failure. A recent simulation study of human control of a telerobotic arm used for movement of hazardous materials found that following automation malfunction, performance was superior with an intermediate level of decision automation compared to higher levels [53].

These potential costs—reduced situation awareness, complacency, and skill degradation—collectively demonstrate that high-level automation can lead to operators exhibiting "out-of-the-loop" unfamiliarity [47]. All three sources of vulnerability may pose a threat to safety in the event of system failure. Automation must therefore be designed to ensure that such potential human performance costs do not occur. Human performance costs other than the areas we have discussed should also be examined. Automation that does not lead to unbalanced mental workload, reduced situation awareness, complacency, or skill loss may nevertheless be associated with other human performance problems that ultimately impact on system performance, including mode confusion and low operator trust in automation [1]–[10], [45]–[48].
By considering these human performance consequences, the relative merits of a specific level of automation can be determined. However, full application of our model also requires consideration of other criteria. We consider two other secondary criteria here, automation reliability and the cost of decision and action outcomes.
B. Secondary Evaluative Criteria
1) Automation Reliability: The benefits of automation on operator mental workload and situation awareness noted previously are unlikely to hold if the automation is unreliable. Hence ensuring high reliability is a critical evaluative criterion in applying automation. Several procedures for estimating reliability have been proposed, including fault and event tree analysis [64] and various methods for software reliability analysis [65]. The use of these techniques can be helpful, so long as their results are interpreted cautiously. In particular, what appear to be "hard numbers," such as a reliability of .997, or a mean time to failure of 100 000 hours, must be viewed with some skepticism because such values represent an estimate of a mean, whereas what is required is the variance around the mean, which can be considerable. The complexity and size of software in many automated systems may also preclude comprehensive testing for all possible faults, particularly those that arise from interaction with the existing system in which the automated sub-system is placed [10]. Furthermore, automation reliability cannot always simply be defined in probabilistic terms. Failures may occur not because of a predictable (in a statistical sense) malfunction in software or hardware, but because the assumptions that are modeled in the automation by the designer are not met in a given operational situation [8].
Automation reliability is an important determinant of human use of automated systems because of its influence on human trust [66], [67]. Unreliability lowers operator trust and can therefore undermine potential system performance benefits of the
automation. Automated systems may be underutilized or disabled because of mistrust, as in the case of alarm systems that frequently give false alerts [8]. Signal detection analysis [68] can be used to determine the alerting threshold that balances the competing requirements of timely detection (to allow for effective action), a near-zero missed detection rate (because of potentially catastrophic consequences—e.g., a collision), and a low false alert rate [69]. To ensure alert reliability, the probability that an alarm reflects a true hazardous event must also be maximized to the extent possible: this can be examined by combining signal detection theory and Bayesian statistics [70].

If information automation can be made extremely reliable, then pursuing very high levels of information automation can be justified. Of course, high reliability cannot be guaranteed in many cases. As mentioned previously, the inherently uncertain nature of information sources, either due to sensor imprecision or to changes in operator priorities, means that there will always exist conditions in which the algorithms used by the automation are inappropriate for those conditions. Nevertheless, information acquisition and analysis automation may still be retained at a relatively high level, as long as the operator has access to the raw data (e.g., highlighting, but not filtering), and the operator is aware of (calibrated to) the level of unreliability, such that some attention will be allocated to the original information [60], [71].

Although many examples of highly reliable information automation exist, more sophisticated forms of such automation are being developed in which complex algorithms are applied to the raw data in order to predict future events. For example, traffic displays in the cockpit, and conflict prediction tools for the air traffic controller, both attempt to project the future flight paths of aircraft. Projecting the future is inherently less than perfectly reliable, particularly if carried out far enough out in time (e.g., 20 min for ATC conflict prediction). Further work needs to be done to evaluate not only the reliability of the algorithms underlying these predictor systems, but also their susceptibility to noise in the raw data, and the consequences for human performance of information automation unreliability. Some emerging research is beginning to define the conditions under which unreliability does or does not influence human performance. For example, two recent studies found that when feedback is provided as to the occasional errors made by information automation, appropriate calibration of the operator's trust in the automation can take place fairly rapidly, and the benefits of information automation can still be realized [60], [71]. This suggests that the negative effects of over-trust, noted earlier for decision automation, may be less apparent for information automation. However, as discussed previously, only one study has directly compared information and decision automation [62]. Thus the issue of whether automation unreliability has greater negative effects for later stages of automation requires further examination.
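Returning to the alerting-threshold point above, the probability that an alert reflects a true hazard can be examined with Bayes' rule applied to signal detection quantities; the sketch below uses made-up rates and is a generic illustration, not the analysis of [70].

    # Illustrative sketch: posterior probability that an alert is a true alarm.
    # The hit rate, false alert rate, and hazard base rate below are made-up values.
    def p_hazard_given_alert(hit_rate: float, false_alert_rate: float, base_rate: float) -> float:
        """P(hazard | alert) = P(alert | hazard) P(hazard) / P(alert)."""
        p_alert = hit_rate * base_rate + false_alert_rate * (1.0 - base_rate)
        return (hit_rate * base_rate) / p_alert

    # Even a sensitive alerting system produces mostly false alerts when hazards are rare.
    print(p_hazard_given_alert(hit_rate=0.99, false_alert_rate=0.05, base_rate=0.001))  # ~0.02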
2) Costs of Decision/Action Outcomes: Our analysis so far indicates that high levels of automation may be associated with potential costs of reduced situation awareness, complacency, and skill degradation. This is not to say that high levels of automation should not be considered for decision and action automation. However, assessing the appropriate level of automation for decision automation requires additional consideration of the costs associated with decision and action outcomes.

The decisions and associated actions that humans and automated systems take in most systems vary in the costs that occur if the actions are incorrect or inappropriate. Many routine actions have predictable consequences that involve little or no cost if the actions do not go as planned. The risk associated with a decision outcome can be defined as the cost of an error multiplied by the probability of that error. For decisions involving relatively little risk, therefore, out-of-the-loop problems are unlikely to have much impact, even if there is a complete automation failure. Such decisions are strong candidates for high-level automation. In fact, if human operators had to be continually involved in making each of these relatively simple decisions, they could be so overloaded as to prevent them from carrying out other more important functions.
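Stated compactly (our notation, not the paper's), Risk = C_error × P_error, the cost of an erroneous outcome multiplied by its probability. For instance, an error costing 10 000 units that occurs with probability 0.001 and an error costing 100 units that occurs with probability 0.1 both carry a risk of 10 units, so a low-probability failure is not automatically a low-risk one.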
Note that high-level automation of decision selection and action may also be justified in highly time-critical situations in which there is insufficient time for a human operator to respond and take appropriate action. For example, if certain serious problems are detected in the reactor of a nuclear power plant, control rods are automatically lowered into the core to turn off the reactor, without any human operator intervention. Bypassing the human operator is justified in this case because the operator cannot reliably respond in time to avoid an accident. As previously discussed, automating the decision to abort or continue the takeoff of an aircraft when an engine malfunction occurs too near in time to the critical V1 speed for appropriate pilot action would represent another qualifying example [37], as would the decision to take control of the aircraft if a fighter aircraft is about to run into the ground [29].
It is also appropriate to consider high-level automation for decisions involving high risk in situations in which human operators have time to respond. In this case, the costs of adverse consequences define major evaluative criteria for determining appropriate levels of automation. The examples in anesthesiology, air defense, and the stock market with which we began this paper qualify as involving high-cost decisions. System designers can certainly consider implementing decision automation above low to moderate levels for such systems, e.g., at levels at or above level 6 in Table I, in which computer systems are given autonomy over decision making. This would be appropriate if the human operator is not required to intervene or manage the system in the event of automation failure. In fact, in this case even full automation (Level 10) could be justified.3 However, if the human operator is ever expected under abnormal circumstances to take over control, then our analysis suggests that high levels of decision automation may not be suitable because of the documented human performance costs associated with such automation. The burden of proof should then be on the designer to show that their design will not lead to the problems of loss of situation awareness, complacency, and skill loss that we have discussed.
3 Full automation requires highly reliable error handling capabilities and the ability to deal effectively and quickly with a potentially large number of anomalous situations. In addition to requiring the technical capability to deal with all types of known errors, full automation without human monitoring also assumes the ability to handle unforeseen faults and events. This requirement currently strains the ability of most intelligent fault-management systems.
A system designer may object to the recommendation that decision automation should not exceed a moderate level for high-risk situations on the grounds that if information automation can be made highly reliable, then decision automation can also be, so why not implement high-level automation for this function too? The answer is that although decision-aiding systems can be engineered to be highly reliable for many known conditions, the "noisiness" of the real world, with unplanned variations in operating conditions, unexpected or erratic behavior of other system components or human operators, system malfunctions, etc., as well as the inherent unreliability of predicting the future, will mean that there will always be a set of conditions under which the automation will reach an incorrect decision. If under such conditions of system failure the human operator is required to intervene and salvage the situation, the problem of out-of-the-loop unfamiliarity may prevent the operator from intervening successfully or in a timely manner [8], [47], [55].
Finally, the inter-dependence of the decision automation and action automation dimensions for high-risk functions should be noted. A system could be designed to have high-level decision automation, in which decision choices are selected without human involvement or veto power. For example, currently an air traffic controller issues a verbal clearance to a pilot, who acknowledges and then executes a flight maneuver consistent with the clearance. With the development of two-way electronic data link communications between aircraft and ATC, however, the clearance (which itself may be a computer choice) could be uplinked and loaded in the aircraft's flight management system (FMS) automatically. The aircraft could then carry out the maneuver, without pilot intervention. If the consequences of an incorrect or inappropriate decision are great, however, then it would be prudent to require that the action automation level be sufficiently low so that the (automated) decision choice is executed by the pilot (i.e., by actively pressing a button that "loads" the proposed flight plan into the FMS). Giving the pilot the opportunity to review the decision choice and forcing a conscious overt action provides an "error-trapping" mechanism that can guard against mindless acquiescence in computer-generated solutions that are not contextually appropriate. Note that we are not implying that some degree of human action is always needed for the purposes of error trapping. The need only arises at the last action implementation stage if the previous decision selection stage has been highly automated. In this situation having some human involvement at the action stage provides a "last chance opportunity" to trap errors.
Recent studies have examined the relative effects of low and high levels of action automation on use of the FMS [30], [31]. Use of a lower level of automation of action selection—in entering data-linked flight information into the flight management computer—allowed for more errors of decision-making automation to be caught than a higher level, in which data entry was accomplished by pressing a single "accept" button. Of course this advantage for error trapping must be balanced against the added workload, and possible error source, of less automated (manual) data entry [32]. Certainly cumbersome and clumsy data entry remains a viable candidate for automation. But to reiterate the linkage between decision and action automation, if high automation is selected for the latter, then designers should resist the temptation for high automation levels of decision making.
C. Application Example

Our multi-stage model of human-automation interaction can be applied to specific systems in conjunction with a consideration of evaluative criteria, of which we have discussed three in this paper—human performance consequences, automation reliability, and the costs of decision/action consequences. To further illustrate application of the model, we briefly consider its use in the design of future ATC systems, based on analyses previously presented in [10].
ATC systems are being redesigned because the volume of air traffic is likely to double over the next two decades, posing a significant threat to handling capacity [72]. One alternative is Free Flight [73], which would allow user-preferred routing and free maneuvering, among other changes aimed at minimizing ATC restrictions [74]. Another approach is to supplement the current system of ground-based ATC with additional automation to support air traffic controllers in the management of an increasingly dense airspace [10]. Elements of both alternatives are likely to be implemented, but the increasing complexity of future airspace will require automation tools to support both air traffic controllers and pilots. Automation tools will be needed for planning, traffic management, conflict detection and resolution, etc.
Application of our model suggests the following recommendations for future ATC automation. (We again emphasize that each recommendation represents an upper bound or maximum level of automation, not a required level.) High levels of information acquisition and analysis automation can be pursued and implemented if the resulting system can be shown to be reliable. This recommendation is represented by the arrows on the left part of the scales in Fig. 4. Several examples of such automation (such as CRDA) already exist and others are being developed. For decision and action automation, however, high levels should be implemented only for low-risk situations (indicated by the upper arrow in the middle scale in Fig. 4). For all other situations, the level of decision automation should not exceed the level of the computer suggesting (but not executing) a preferred alternative to the controller (indicated by the lower arrow). For example, in risky situations, as when a climb clearance has to be issued to resolve a crossing conflict in dense airspace, conflict resolution automation can provide alternatives to the controller but should not select one of them without controller involvement. If relatively high-level decision automation is implemented in risky situations, however, then we recommend that some degree of human action be retained by having a moderate level of action automation. As discussed previously, this allows for last-stage error trapping. This recommendation is indicated by the right-most arrow in Fig. 4.
V. ALTERNATIVES, LIMITATIONS, AND EXTENSIONS

Before concluding, we briefly consider two alternative approaches to the implementation of automation, and discuss some limitations and extensions of our framework. One alternative to our approach is to automate everything that one can.
Fig. 4. Recommended types and levels for future ATC systems, consistent with three evaluative criteria: human performance consequences, automation reliability, and costs of actions.
This can be a viable option and to some extent has been the default strategy used in most systems that have been automated to date, often because increasing efficiency or reducing costs are major driving forces for automation. However, a problem with this strategy is that the human operator is left with functions that the designer finds hard, expensive, or impossible to automate (until a cleverer designer comes around). This approach therefore defines the human operator's roles and responsibilities in terms of the automation [8]. Designers automate every subsystem that leads to an economic benefit for that subsystem and leave the operator to manage the rest. Technical capability or low cost are valid reasons for automation, given that there is no detrimental impact on human performance in the resulting whole system, but this is not always the case. The sum of subsystem optimizations does not typically lead to whole system optimization.

A second alternative is to use task allocation methods to match human and machine capabilities, as in the Fitts list approach [75]. That is, tasks that are putatively performed better by machines should be automated, whereas those that humans do better should not. Unfortunately, although function allocation methods are useful in principle, it has proved difficult in practice to use procedures such as the Fitts List to determine which functions should be automated in a system [76].
Some limitations of our model for types and levels of automation should also be noted. First, while we used Sheridan's 10 levels of automation [11] for decision automation, we did not explicitly specify the number of levels for the other types of automation, e.g., information automation. One reason is that while there is extensive research pointing to the benefits of information automation vs. no automation (e.g., as in predictor displays for CDTI, see [20], [21]), there is as yet little empirical work explicitly comparing the effects on human performance of different levels of automation for information acquisition and analysis. Another reason is that any proposed taxonomy is likely to be superseded by technological developments in methods for information integration and presentation, so that new levels will need to be specified.
Second, in proposing human performance benefits and costs as evaluative criteria for determining appropriate types and levels of automation, we did not discuss how the relative benefits and costs should be weighed. Should the benefit (of a particular automation level) of balanced mental workload be outweighed by the cost of reduced situation awareness or increased likelihood of complacency? What is the relative weighting of the human performance costs we discussed in this paper, as well as of those we did not? Similarly, which is the most important of the several secondary evaluative criteria we have listed, such as automation reliability, costs of action outcomes, ease of system integration, efficiency/safety tradeoffs, manufacturing and operating costs, and liability? These are difficult issues to which there are no simple answers. Of course, as a qualitative model our approach is meant to provide a framework for design, not a set of quantitative methods. Nevertheless, one way forward might be to examine the possibility of formalizing the model. More generally, it would be desirable to have quantitative models that could inform automation design for human-machine systems [77]. Several computational models of human-automation interaction have been put forward very recently, including models based on expected value statistics [37], [78], task-load models [79], cognitive-system models [80], and a model based on state-transition networks [81] (for a recent review of these models, see [82]). As these and related models mature and are validated, it may be possible to improve automation design by supplementing the qualitative analysis presented here with quantitative modeling.
VI. CONCLUSIONS
Automation design is not an exact science. However, neither does it belong in the realm of the creative arts, with successful design dependent upon the vision and brilliance of individual creative designers. (Although such qualities can certainly help the "look and feel" and marketability of the automated system—see [83]). Rather, automation design can be guided by the four-stage model of human-automation interaction we have proposed, along with the consideration of several evaluative criteria. We do not claim that our model offers comprehensive design principles but a simple guide. The model can be used as a starting point for considering what types and levels of automation should be implemented in a particular system. The model also provides a framework within which important issues relevant to automation design may be profitably explored. Ultimately, successful automation design will depend upon the satisfactory resolution of these and other issues.
ACKNOWLEDGMENT
The authors thank the members of the Panel on Human Factors in Air Traffic Control Automation of the National Research Council (Anne Mavor, Study Director) for their contributions to this work. They also thank P. Hancock, D. Kaber, N. Moray, U. Metzger, and M. Scerbo for useful comments on this work.
REFERENCES
[1] E. L. Wiener and R. E. Curry, "Flight-deck automation: Promises and problems," Ergonomics, vol. 23, pp. 995–1011, 1980.
[2] L. Bainbridge, "Ironies of automation," Automatica, vol. 19, pp. 775–779, 1983.
[3] N. Chambers and D. C. Nagel, "Pilots of the future: Human or computer?," Commun. ACM, vol. 28, pp. 1187–1199, 1985.
[4] R. Parasuraman, "Human-computer monitoring," Human Factors, vol. 29, pp. 695–706, 1987.
[5] T. B. Sheridan, Telerobotics, Automation, and Supervisory Control. Cambridge, MA: MIT Press, 1992.
[6] R. Parasuraman and M. Mouloua, Automation and Human Performance: Theory and Applications. Mahwah, NJ: Erlbaum, 1996.
[7] D. D. Woods, "Decomposing automation: Apparent simplicity, real complexity," in Automation and Human Performance: Theory and Applications, R. Parasuraman and M. Mouloua, Eds. Mahwah, NJ: Erlbaum, 1996, pp. 1–16.
[8] R. Parasuraman and V. A. Riley, "Humans and automation: Use, misuse, disuse, abuse," Human Factors, vol. 39, pp. 230–253.
[9] C. E. Billings, Aviation Automation: The Search for a Human-Centered Approach. Mahwah, NJ: Erlbaum, 1997.
[10] C. D. Wickens, A. Mavor, R. Parasuraman, and J. McGee, The Future of Air Traffic Control: Human Operators and Automation. Washington, DC: National Academy Press, 1998.
[11] T. B. Sheridan and W. L. Verplank, "Human and Computer Control of Undersea Teleoperators," MIT Man-Machine Systems Laboratory, Cambridge, MA, Tech. Rep., 1978.
[12] V. Riley, "A general model of mixed-initiative human-machine systems," in Proc. 33rd Annual Human Factors Society Conf., Santa Monica, CA, 19, pp. 124–128.
[13] A. D. Baddeley, Working Memory. Oxford, U.K.: Clarendon, 1996.
[14] D. E. Broadbent, Perception and Communication. London, U.K.: Pergamon, 1958.
[15] C. D. Wickens and J. Hollands, Engineering Psychology and Human Performance, 3rd ed. Englewood Cliffs, NJ: Prentice-Hall, 1999.
[16] J. J. Gibson, The Ecological Approach to Visual Perception. Boston, MA: Houghton-Mifflin, 1979.
[17] C. D. Wickens, S. E. Gordon, and Y. Liu, An Introduction to Human Factors Engineering. New York: Longman, 1998.
[18] T. B. Sheridan, "Rumination on automation," in Proc. JFAC-MMS Conf., Kyoto, Japan, 1998.
[19] J. D. Lee and T. F. Sanquist, "Maritime automation," in Automation and Human Performance: Theory and Applications, R. Parasuraman and M. Mouloua, Eds. Mahwah, NJ: Erlbaum, 1996, pp. 365–384.
[20] S. G. Hart and T. E. Wempe, "Cockpit Display of Traffic Information: Airline Pilots Opinions about Content, Symbology, and Format," NASA Ames Research Center, Moffett Field, CA, NASA Tech. Memo. 78601, 1979.
[21] M. E. Morphew and C. D. Wickens, "Pilot performance and workload using traffic displays to support Free Flight," in Proc. 42nd Annual Human Factors and Ergonomics Society Conf., Santa Monica, CA, 1998, pp. 52–56.
[22] N. Moray, "Human factors in process control," in Handbook of Human Factors and Ergonomics, 2nd ed., G. Salvendy, Ed. New York: Wiley, 1997, pp. 1944–1971.
[23] K. Bennett and J. M. Flach, "Graphical displays: Implications for divided attention, focused attention, and problem solving," Human Factors, vol. 34, pp. 513–533, 1992.
[24] A. Mundra, "A New Automation Aid to Air Traffic Controllers for Improving Airport Capacity," The Mitre Corporation, McLean, VA, Technical Report MP-W00034, 19.
[25] A. Madni, "The role of human factors in expert systems design and acceptance," Human Factors, vol. 30, pp. 395–414, 1988.
[26] E. H. Shortliffe, Computer-Based Medical Consultation: MYCIN. Amsterdam, The Netherlands: Elsevier, 1976.
[27] L. L. Schlabach, C. C. Hayes, and D. E. Goldberg, "FOX-GA: A genetic algorithm for generating and analyzing battlefield courses of action," J. Evol. Comput., vol. 7, Spring, pp. 45–68, 1999.
[28] C. Layton, P. J. Smith, and C. E. McCoy, "Design of a cooperative problem-solving system for en-route flight planning: An empirical evaluation," Human Factors, vol. 36, pp. 94–119, 1994.
[29] W. B. Scott, "Automatic GCAS: You can't fly any lower," Aviation Week and Space Technology, pp. 76–79, February 1, 1999.
[30] W. Olson and N. B. Sarter, "Supporting informed consent in human-machine collaboration: The role of conflict type, time pressure, display design, and trust," in Proc. Human Factors and Ergonomics Society 43rd Annual Meeting, Santa Monica, CA, 1999, pp. 1–193.
[31] E. W. Logdson, S. E. Infield, S. Lozito, A. McGann, M. Macintosh, and A. Possolo, "Cockpit data link technology and flight crew communications procedures," in Proc. 8th Int. Symp. Aviation Psychology, Columbus, OH, 1995, pp. 324–239.
[32] E. E. Hahn and J. Hansman, "Experimental Studies on the Effect of Automation on Pilot Situational Awareness in the Datalink ATC Environment," SAE International, PA, Tech. Paper 922922, 1992.
[33] P. A. Hancock, M. H. Chignell, and A. Lowenthal, "An adaptive human-machine system," in Proc. 15th Annual IEEE Conf. Syst., Man, Cybern., Washington, DC, 1985, pp. 627–629.
[34] R. Parasuraman et al., "Theory and Design of Adaptive Automation in Aviation Systems," Naval Air Warfare Center, Warminster, PA, Tech. Rep. NAWCADWAR-92033-60, 1992.
[35] W. B. Rouse, "Adaptive aiding for human/computer control," Human Factors, vol. 30, pp. 431–438, 1988.
[36] M. Barnes and J. Grossman, "The Intelligent Assistant Concept for Electronic Warfare Systems," Naval Warfare Center, China Lake, CA, Tech. Rep. NWC TP 5585, 1985.
[37] T. Inagaki, "Situation-adaptive autonomy: Trading control of authority in human-machine systems," in Automation Technology and Human Performance: Current Research and Trends. Mahwah, NJ: Erlbaum, 1999, pp. 1–158.
[38] E. A. Byrne and R. Parasuraman, "Psychophysiology and adaptive automation," Biol. Psychol., vol. 42, pp. 249–268, 1996.
[39] B. Hilburn, P. G. Jorna, E. A. Byrne, and R. Parasuraman, "The effect of adaptive air traffic control (ATC) decision aiding on controller mental workload," in Human-Automation Interaction: Research and Practice, M. Mouloua and J. Koonce, Eds. Mahwah, NJ: Erlbaum, 1997, pp. 84–91.
[40] D. B. Kaber and J. M. Riley, "Adaptive automation of a dynamic control task based on workload assessment through a secondary monitoring task," in Automation Technology and Human Performance: Current Research and Trends, M. W. Scerbo and M. Mouloua, Eds. Mahwah, NJ: Erlbaum, 1999, pp. 129–133.
[41] N. Moray, T. Inagaki, and M. Itoh, "Adaptive automation, trust, and self-confidence in fault management of time-critical tasks," J. Exper. Psych.: Appl., vol. 6, pp. 44–58.
[42] R. Parasuraman, M. Mouloua, and R. Molloy, "Effects of adaptive task allocation on monitoring of automated systems," Human Factors, vol. 38, pp. 665–679, 1996.
[43] S. Scallen, P. A. Hancock, and J. A. Duley, "Pilot performance and preference for short cycles of automation in adaptive function allocation," Appl. Ergon., vol. 26, pp. 397–403, 1995.
[44] M. Scerbo, "Theoretical perspectives on adaptive automation," in Automation and Human Performance: Theory and Applications, R. Parasuraman and M. Mouloua, Eds. Mahwah, NJ: Erlbaum, 1996, pp. 37–63.
[45] M. Lewis, "Designing for human-agent interaction," Artif. Intell. Mag., Summer, pp. 67–78, 1998.
[46] N. Sarter, D. D. Woods, and C. E. Billings, "Automation surprises," in Handbook of Human Factors and Ergonomics, 2nd ed., G. Salvendy, Ed. New York: Wiley, 1997, pp. 1926–1943.
[47] C. D. Wickens, "Designing for situation awareness and trust in automation," in Proc. IFAC Conf. Integrated Systems Engineering, Baden-Baden, Germany, 1994.
[48] D. D. Woods and N. Sarter, "Evaluating the impact of new technology on human-machine cooperation," in Verification and Validation of Complex Systems, J. A. Wise, V. D. Hopkin, and P. Stager, Eds. Berlin: Springer-Verlag, 1993, pp. 133–158.
[49] K. J. Vicente and J. Rasmussen, "Ecological interface design: Theoretical foundation," IEEE Trans. Syst., Man, Cybern., vol. 22, pp. 4–506, 1992.
[50] E. L. Wiener, "Cockpit automation," in Human Factors in Aviation, E. L. Wiener and D. C. Nagel, Eds. New York: Academic, 1988, pp. 433–461.
[51] A. Kirlik, "Modeling strategic behavior in human-automation interaction: Why 'aid' can (and should) go unused," Human Factors, vol. 35, pp. 221–242, 1993.
[52] T. K. Landauer, The Trouble with Computers. Cambridge, MA: MIT Press, 1995.
[53] D. B. Kaber, E. Omal, and M. R. Endsley, "Level of automation effects on telerobot performance and human operator situation awareness and subjective workload," in Automation Technology and Human Performance: Current Research and Trends. Mahwah, NJ: Erlbaum, 1999, pp. 165–170.
[54] M. Endsley, "Automation and situation awareness," in Automation and Human Performance: Theory and Applications, R. Parasuraman and M. Mouloua, Eds. Mahwah, NJ: Erlbaum, 1996, pp. 163–181.
[55] M. Endsley and E. O. Kiris, "The out-of-the-loop performance problem and level of control in automation," Human Factors, vol. 37, pp. 381–394, 1995.
[56]N.SarterandD.D.Woods,“’Strong,silent,andout-of-the-loop’:Prop-ertiesofadvanced(cockpit)automationandtheirimpactonhuman-au-tomationinteraction,”CognitiveSystemsEngineeringLaboratory,OhioStateUniversity,Columbus,OH,TechnicalReportCSEL95-TR-01,1995.
[57]E.L.Wiener,“Complacency:Isthetermusefulforairsafety?,”inProc.
26thCorporateAviationSafetySeminar.Denver,CO,1981.
[58]R.Parasuraman,R.Molloy,andI.L.Singh,“Performanceconsequences
ofautomation-induced’complacency’,”Int.J.AviationPsychology,vol.3,pp.1–23,1993.
[59]S.FarrellandS.Lewandowsky,“Aconnectionistmodelofcomplacency
andadaptiverecoveryunderautomation,”J.Exper.Psychol.:Learn.,Memory,Cogn.,vol.26,pp.395–410.
[60]C.D.Wickens,R.Conejo,andK.Gempler,“Unreliableautomated
attentioncueingforair-groundtargetingandtrafficmaneuvering,”inProc.34thAnnualHumanFactorsandErgonomicsSocietyConf..SantaMonica,CA,1999.
[61] M. Yeh, C. D. Wickens, and F. J. Seagull, "Target cueing in visual search: The effects of conformality and display location on the allocation of visual attention," Human Factors, vol. 41, 1999.
[62] W. M. Crocoll and B. G. Coury, "Status or recommendation: Selecting the type of information for decision aiding," in Proc. 34th Annu. Human Factors and Ergonomics Society Conf., Santa Monica, CA, 1990, pp. 1524–1528.
[63] A. M. Rose, "Acquisition and retention of skills," in Application of Human Performance Models to System Design, G. McMillan, Ed. New York: Plenum, 1989.
[64] A. Swain, "Human reliability analysis: Needs, status, trends, and limitations," Reliab. Eng. Syst. Saf., vol. 29, pp. 301–313, 1990.
[65] D. L. Parnas, A. J. van Schouwen, and S. P. Kwan, "Evaluation of safety-critical software," Commun. ACM, vol. 33, pp. 636–648, 1990.
[66] J. D. Lee and N. Moray, "Trust, control strategies, and allocation of function in human-machine systems," Ergonomics, vol. 35, pp. 1243–1270, 1992.
[67] A. J. Masalonis and R. Parasuraman, "Trust as a construct for evaluation of automated aids: Past and present theory and research," in Proc. Human Factors and Ergonomics Society 43rd Annual Meeting, Santa Monica, CA, 1999, pp. 184–188.
[68] J. A. Swets and R. M. Pickett, Evaluation of Diagnostic Systems: Methods from Signal Detection Theory. New York: Academic, 1982.
[69] J. Kuchar, "Methodology for alerting-system performance evaluation," J. Guidance, Control, and Dynamics, vol. 19, pp. 438–444, 1996.
[70] R. Parasuraman, P. A. Hancock, and O. Olofinboba, "Alarm effectiveness in driver-centered collision-warning systems," Ergonomics, vol. 40, pp. 390–399, 1997.
[71] J. L. Merlo, C. D. Wickens, and M. Yeh, "Effects of reliability on cue effectiveness and display signalling," University of Illinois Aviation Research Lab, Savoy, IL, Technical Report ARL-99-4/Fedlab-99-3, 1999.
[72] "Answers to the gridlock," Aviation Week and Space Technology, pp. 42–62, February 2, 1998.
[73] "Report of the RTCA Board of Director's Select Committee on Free Flight," RTCA, Washington, DC, 1995.
[74] R. van Gent, J. M. Hoekstra, and R. C. J. Ruigrok, "Free flight with airborne separation assurance," in Proc. Int. Conf. Human Computer Interaction in Aeronautics, Montreal, Canada, 1998, pp. 63–69.
[75] P. M. Fitts, Human Engineering for an Effective Air Navigation and Traffic Control System. Washington, DC: National Research Council, 1951.
[76] T. B. Sheridan, "Allocating functions rationally between humans and machines," Ergonomics in Design, vol. 6, no. 3, pp. 20–25, 1998.
[77] R. W. Pew and A. S. Mavor, Modeling Human and Organizational Behavior: Application to Military Simulations. Washington, DC: National Academy Press, 1998.
[78] T. B. Sheridan and R. Parasuraman, "Human vs. automation in responding to failures: An expected-value analysis," Human Factors, to be published.
[79] Z. Wei, A. P. Macwan, and P. A. Wieringa, "A quantitative measure for degree of automation and its relation to system performance," Human Factors, vol. 40, pp. 277–295, 1998.
[80] K. Corker, G. Pisanich, and M. Bunzo, "Empirical and analytic studies of human/automation dynamics in airspace management for free flight," in Proc. 10th Int. CEAS Conf. Free Flight, Amsterdam, The Netherlands, 1997.
[81] A. Degani, M. Shafto, and A. Kirlik, "Models in human-machine systems: Constructs, representation, and classification," Int. J. Aviation Psychology, vol. 9, pp. 125–138, 1999.
[82] R. Parasuraman, "Designing automation for human use: Empirical studies and quantitative models," Ergonomics, to be published.
[83] D. A. Norman, The Invisible Computer. Cambridge, MA: MIT Press, 1998.
Raja Parasuraman received the B.Sc. degree (first class honors) in electrical engineering from Imperial College, University of London, U.K., in 1972, and the M.Sc. and Ph.D. degrees in applied psychology from the University of Aston, Birmingham, U.K., in 1973 and 1976, respectively.
From 1978 to 1982, he was a Research Fellow at the University of California, Los Angeles. In 1982, he joined the Catholic University of America, Washington, DC, as Associate Professor and was promoted to Full Professor in 1986. He is currently Director of the Cognitive Science Laboratory and also holds a visiting appointment at the Laboratory of Brain and Cognition at the National Institute of Mental Health, Bethesda, MD. His research interests are in the areas of attention, automation, aviation and air traffic control, event-related brain potentials, functional brain imaging, signal detection, vigilance, and workload.
Dr. Parasuraman is a Fellow of the American Association for the Advancement of Science (1994), the Human Factors and Ergonomics Society (1994), and the American Psychological Society (1991). He is also currently serving on the National Research Council Panel on Human Factors.
Thomas B. Sheridan (M'60–SM'82–F'83–LF'96) received the B.S. degree from Purdue University, West Lafayette, IN, the M.S. degree from the University of California, Los Angeles, the Sc.D. degree from the Massachusetts Institute of Technology (MIT), Cambridge, and the Dr. (honorary) degree from Delft University of Technology, The Netherlands.
For most of his professional career he has remained at MIT, where he is currently Ford Professor of Engineering and Applied Psychology Emeritus in the Department of Mechanical Engineering and Department of Aeronautics and Astronautics, continuing to teach and serve as Director of the Human-Machine Systems Laboratory. He has also served as a visiting professor at the University of California, Berkeley, Stanford, Delft University, Kassel University, Germany, and Ben Gurion University, Israel. His research interests are in experimentation, modeling, and design of human-machine systems in air, highway, and rail transportation, space and undersea robotics, process control, arms control, telemedicine, and virtual reality. He has published over 200 technical papers in these areas. He is co-author of Man-Machine Systems (Cambridge, MA: MIT Press, 1974, 1981; USSR, 1981), coeditor of Monitoring Behavior and Supervisory Control (New York: Plenum, 1976), author of Telerobotics, Automation, and Human Supervisory Control (Cambridge, MA: MIT Press, 1992), and co-editor of Perspectives on the Human Controller (Mahwah, NJ: Erlbaum, 1997). He is currently Senior Editor of the MIT Press journal Presence: Teleoperators and Virtual Environments and serves on several editorial boards. He chaired the National Research Council's Committee on Human Factors, and has served on numerous government and industrial advisory committees. He is principal of Thomas B. Sheridan and Associates, a consulting firm.
Dr. Sheridan was President of the IEEE Systems, Man, and Cybernetics Society, Editor of the IEEE TRANSACTIONS ON MAN-MACHINE SYSTEMS, and received their Norbert Wiener and Joseph Wohl awards, the IEEE Centennial Medal, and the Third Millennium Medal. He is also a Fellow of the Human Factors and Ergonomics Society, recipient of their Paul M. Fitts Award, and was President of HFES. He received the 1997 National Engineering Award of the American Association of Engineering Societies and the 1997 Oldenburger Medal of ASME. He is a member of the National Academy of Engineering.
Christopher D. Wickens received the A.B. degree from Harvard College, Cambridge, MA, in 1967 and the Ph.D. degree from the University of Michigan, Ann Arbor, in 1974.
He served as a commissioned officer in the U.S. Navy from 1969 to 1972. He is currently a Professor of Experimental Psychology, Head of the Aviation Research Laboratory, and Associate Director of the Institute of Aviation at the University of Illinois at Urbana-Champaign. He also holds an appointment in the Department of Mechanical and Industrial Engineering and the Beckman Institute of Science and Technology. His research interests involve the application of the principles of human attention, perception, and cognition to modeling operator performance in complex environments, particularly aviation, air traffic control, and data visualizations.
Dr. Wickens is a member and Fellow of the Human Factors Society and received the Society's Jerome H. Ely Award in 1981 for the best article in the Human Factors Journal, and the Paul M. Fitts Award in 1985 for outstanding contributions to the education and training of human factors specialists. He was elected to the Society of Experimental Psychologists, elected Fellow of the American Psychological Association, and in 1993 received the Franklin Taylor Award for Outstanding Contributions to Engineering Psychology from Division 21 of that association.