Res. Agr. Eng., 2025, 71(2):69-79 | DOI: 10.17221/73/2024-RAE

Original Paper

Perception of bimodal warning cues during remote supervision of autonomous agricultural machines

Anita Chidera Ezeagba 1, Cheryl Mary Glazebrook 2, Daniel Delmar Mann 1
1 Department of Biosystems Engineering, University of Manitoba, Winnipeg, Manitoba, Canada
2 Faculty of Kinesiology and Recreation Management, University of Manitoba, Winnipeg, Manitoba, Canada

Agricultural machines that are fully autonomous will still require human supervisors to monitor them and troubleshoot system failures. Recognising an emergency as soon as possible is crucial to reducing its adverse effects. Warning systems typically support this recognition by presenting visual, auditory, or tactile cues, and cues differ in how quickly they prompt a response. The study's objective was to compare the effectiveness of two bimodal warnings (i.e., visual-auditory and visual-tactile) at eliciting supervisor perception (which equates to level one situation awareness). Twenty-five participants engaged in an autonomous sprayer simulation. Two realistic remote supervision scenarios (i.e., in-field and close-to-field) were used to examine two bimodal warning cues: (i) visual-auditory and (ii) visual-tactile. The effectiveness of each bimodal warning was assessed using two measures: (i) response time and (ii) noticeability. There was no significant difference in response time between the bimodal warning cues when tractor sound was present in the experimental environment (reflecting the in-field remote supervision scenario); however, visual-tactile cues yielded shorter response times than visual-auditory cues when the experimental environment was quiet (reflecting the close-to-field remote supervision scenario). There were no statistically significant differences between visual-auditory and visual-tactile warnings in noticeability. Participants' subjective responses indicated that they preferred the visual-tactile cues over the visual-auditory cues. It is concluded that visual-tactile warnings are preferred over visual-auditory warnings for enabling perception during remote supervision of autonomous agricultural machines (AAMs).
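The response-time comparison described in the abstract is a within-participant contrast between two warning conditions. A minimal sketch of such an analysis is given below, using a paired t-statistic computed with the Python standard library; all response-time values and the sample size are invented for illustration and are not the study's data.

```python
# Hypothetical sketch of a paired response-time comparison between
# visual-auditory (VA) and visual-tactile (VT) warning conditions in a
# quiet (close-to-field) environment. All numbers are invented.
from math import sqrt
from statistics import mean, stdev

va = [1.42, 1.35, 1.58, 1.47, 1.51, 1.39, 1.62, 1.44]  # VA response times (s)
vt = [1.21, 1.18, 1.33, 1.25, 1.30, 1.17, 1.41, 1.22]  # VT response times (s)

# Paired design: each participant contributes one difference score.
diffs = [a - b for a, b in zip(va, vt)]

# Paired t-statistic: mean difference divided by its standard error.
t_stat = mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

print(f"mean VA-VT difference: {mean(diffs):.3f} s, paired t = {t_stat:.2f}")
```

In a within-participant design like this, the difference scores remove between-participant variability, which is why a paired rather than an independent-samples comparison is appropriate; the t-statistic would then be referred to a t-distribution with n - 1 degrees of freedom.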

Keywords: warning systems; situation awareness; human supervision; automated farm machinery

Received: September 2, 2024; Accepted: February 19, 2025; Prepublished online: April 24, 2025; Published: June 18, 2025



This is an open access article distributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International licence (CC BY-NC 4.0), which permits non-commercial use, distribution, and reproduction in any medium, provided the original publication is properly cited. No use, distribution, or reproduction is permitted which does not comply with these terms.