
Michael Jenkin

Professor, EECS Department
Member, Centre for Vision Research
Member, IC@L
Core Member, VISTA
Visiting Professor, Samsung AI Center, Montreal

Website | Email

2021 – 2022 Research Highlights

Robust and Interactive Autonomous Machines

Several projects related to human-robot interaction (HRI) produced solid results this year. The critical question here is: what is the best way of providing effective HRI for non-roboticists interacting with a robot? There were two major results:

(a) development of technology to provide an interactive avatar (a face if you will) for HRI:

  1. Altarawneh, E. and Jenkin, M. System and method for rendering of an animated avatar. US Patent 10,580,187 B2, March 3, 2020.
  2. Altarawneh, E., Jenkin, M. and MacKenzie, I. S. Is Putting a Face on a Robot Worthwhile? Proc. Workshop on Active Vision and Perception in Human(-Robot) Collaboration (with the 29th IEEE Int. Conf. on Robot and Human Interactive Communication). Held online. September 2020.

(b) work with Samsung on the use of ‘haptic air’ to provide a socially acceptable interaction cue:

  1. Friedman, N., Goedicke, D., Zhang, V., Rivkin, D., Jenkin, M., Degutyte, Z., Astell, A., Liu, X. and Dudek, G. Capturing attention with wind. Proc. Workshop on Integrating Multidisciplinary Approaches to Advanced Physical Human-Robot Interaction (with IEEE ICRA 2020, Paris, France). May 31, 2020.
  2. Friedman, N., Goedicke, D., Zhang, V., Rivkin, D., Jenkin, M., Degutyte, Z., Astell, A., Liu, X. and Dudek, G. Out of my way! Exploring different modalities for robots to ask people to move out of the way. Proc. Workshop on Active Vision and Perception in Human(-Robot) Collaboration (with the 29th IEEE Int. Conf. on Robot and Human Interactive Communication). Held online. September 2020.
  3. Zhang, V., Friedman, N., Goedicke, D., Rivkin, D., Jenkin, M., Liu, X. and Dudek, G. The answer is blowing in the wind: directed air flow for socially acceptable human-robot interaction. Proc. Int. Conf. on Robotics, Computer Vision, and Intelligent Systems (ROBOVIS) 2020. Held online.

Work on multi-cue integration produced results on how humans integrate competing cues to orientation and self-motion. The self-motion work focussed on citizen science research conducted at the Ontario Science Centre prior to COVID.

  1. Jenkin, M., Harris, L. R. and Herpers, R. Long-duration head down bed rest as an analog of microgravity: effects on the perception of upright. 23rd International Academy of Astronautics Humans in Space Conference. April 2021. Moscow, Russia (Held Online).
  2. Bury, N.-A., Jenkin, M. R., Allison, R. S. and Harris, L. R. (2020). Perceiving jittering self-motion in a field of lollipops from ages 4 to 95. PLoS ONE, 15(10): e0241087.

Work on robotic technology concentrated on deploying autonomous surface vessels for invasive plant monitoring and on novel robotic sensor technology.

  1. Codd-Downey, R., Jenkin, M., Dey, B. B., Zacher, J., Blainey, E. and Andrews, P. Monitoring re-growth of invasive plants using an autonomous surface vessel. Frontiers in Robotics and AI, Jan. 2021.
  2. Hogan, F. R., Jenkin, M., Rezaei-Shoshtari, S., Girdhar, Y., Meger, D. and Dudek, G. Seeing through your skin: recognizing objects with a novel visuotactile sensor. Proc. WACV 2021. Held online.
  3. Dey, B. B. and Jenkin, M. Design and construction of the DragonBall. Proc. ROMANSY 2020, Sapporo, Japan. September 2020.

This past year, while COVID has made new data collection difficult, we pressed forward with work on invasive plant surveying. This work (now published) demonstrates how USVs can be used to monitor aquatic events, in this case invasive aquatic plants. This summer, COVID permitting, we will deploy the robot roughly every two weeks to monitor an infestation in a lake north of Toronto.

The IDEAS project has begun to show results. Again, COVID has complicated some data collection, although we are running field trials roughly once a quarter with individuals located at different sites around the city.

Work on the CSA-funded VECTION project continues on the ISS, with ground data being collected in Houston by embedded volunteers who perform the data collection while we supervise remotely.