Robots, Devices & Interfaces

Our Robots

We explore teleoperation across different scenarios and robots.
These robots are currently available at ISR-Lisboa.
If you would like to use our robots for an MSc or PhD thesis, or for collaborations and projects, please contact our PIs.

Raposa-NG: Search and Rescue UGV

Research Purposes:

Augment the teleoperation user interface to enhance the operator's situation awareness.

Onboard sensors and functionalities

Sensors
Laser: Hokuyo UTM-30LX
Stereo Camera: Point Grey Bumblebee2 BB2-03S2C
Depth Camera: Intel RealSense D435
IMU: Microstrain 3DM-GX2
RGB Camera: Generic USB Webcam

Functionalities
Tracked-wheel robot
Front body can be tilted
Laser mounted on a gimbal that keeps it level
Stereo camera on a pan & tilt gimbal
Controllable via either tether or wireless connection

Research work developed with this robot

Research grantee
Rute Luz: Haptic tablet for UGV teleoperation (journal IEEE Access – link), in collaboration with the Inria Centre at the University of Lille
        + Video explaining how it works: https://youtu.be/USiuOQ7r6o0
 
MSc theses
Filipe Jesus: Ogre interface (delivered – link)
Rute Luz: Haptic devices for traction awareness (delivered – link)

PhD thesis
Jéssica Corujeira: Augmentation of Situation Awareness Through Multimodal Interfaces in Mobile Robot Teleoperation. (delivered)

Tested scenarios:

The robot has been used in collaboration with Portuguese search and rescue forces:
– GNR GIPS
– RSBL

And in collaboration with the Portuguese Explosive Ordnance Disposal squadron:
– FAP ERIEE

Trident: Underwater ROV

Won in the Science Exploration Education Initiative organized by National Geographic and OpenROV.

Research Purposes

  • Development platform to study teleoperation in underwater scenarios
  • Create an immersive underwater experience

Onboard sensors and functionalities

Platform
DDS-based communication
Off-the-shelf platform (no longer available; previously sold by OpenROV)
Currently installed payload: 360º 3D camera

Research work developed with this robot

PhD thesis
Rui Xavier: title (under development)

MSc theses
João Nascimento: title (under development)

Research Grantee
Rute Luz: Creating immersive underwater experiences

KayJay: Planetary Rover

The story behind the KayJay name:

– Tribute to Katherine Johnson
– Aligned with our group's philosophy of promoting inclusion in research

Research Purposes

Develop a platform to test our onboard algorithms and remote teleoperation

Build a deployable container (physical or virtual) that can be easily integrated into various ground rovers

Reuse a mobile platform with added modern computing and sensors

Onboard Sensors and Functionalities
Differential mobile platform (Pioneer-based)
RGB-D camera
GNSS receiver
IMU
Integration with ROS1 and ROS2

Onboard functionalities include
Traction detection algorithm
Onboard autonomous navigation (under development)
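
A traction-loss check of this kind can be sketched by comparing the commanded body speed with the speed actually estimated from the GNSS/IMU. The function name, thresholds, and classification rules below are illustrative assumptions, not the algorithm deployed on KayJay:

```python
def detect_traction_loss(cmd_speed, est_speed, stuck_thresh=0.05, slip_ratio=0.5):
    """Classify traction state from commanded vs. estimated body speed (m/s).

    Returns one of: 'ok', 'stuck', 'sliding'.
    Thresholds are illustrative placeholders.
    """
    if abs(cmd_speed) > stuck_thresh and abs(est_speed) < stuck_thresh:
        return "stuck"    # wheels commanded to move, but the body barely moves
    if abs(cmd_speed) < stuck_thresh and abs(est_speed) > stuck_thresh:
        return "sliding"  # body moves with no commanded motion (e.g. on a slope)
    if abs(cmd_speed) > stuck_thresh and abs(est_speed) < slip_ratio * abs(cmd_speed):
        return "sliding"  # significant wheel slip while driving
    return "ok"
```

In practice the estimated speed would come from the onboard state estimator fusing GNSS and IMU data, and the thresholds would be tuned per terrain.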

Research work developed with this robot

MEROP robotics team @ AMADEE analog missions – (link to amadee page)

MSc theses
Rui Abrantes: title (under development)
Gonçalo Coelho: title (under development)
Margarida Pereira: title (under development)

PhD thesis
Rute Luz: title (under development)


Operator Control Unit

In our group we have built a custom operator control unit for teleoperation, together with haptic devices that provide feedback to the operator and tackle challenges identified in various field experiments (attitude, traction, and traversability awareness).

Photograph credits: Gonçalo Gouveia, Instituto Superior Técnico – MDN

Short Description: An easily deployable, self-contained teleoperation console comprising the teleoperation GUI and haptic devices. The case contains a laptop to run the MEROP teleoperation software and holds the haptic devices for attitude and traction feedback; it also stores the joystick used to control the robot and the teleoperation functionalities, a video camera, and replacement parts for the haptic devices.

Haptic devices

Haptic Attitude Feedback Device

The haptic attitude feedback device enhances the operator's attitude awareness and perception of terrain texture when teleoperating an uncrewed ground vehicle through rough or unstructured environments. It uses upper-limb proprioception to convey the attitude feedback.

Description
– Provides absolute attitude orientation in roll and pitch
– Provides the rate of change of attitude (angular velocity and acceleration)
– Is sensitive enough to convey terrain texture
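
The absolute roll and pitch displayed by a device like this come from the robot's attitude estimate. As a minimal sketch, roll and pitch can be derived from a gravity-aligned accelerometer reading; this is a common static approximation, assumed here for illustration rather than taken from the device firmware:

```python
import math

def roll_pitch_from_accel(ax, ay, az):
    """Estimate roll and pitch (radians) from an accelerometer reading that
    measures gravity in the body frame (x forward, y left, z up).

    Static estimate only; a real system fuses this with gyro rates
    (e.g. a complementary or Kalman filter) to track fast motion.
    """
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch
```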

Published Papers
"User Study Results on Attitude Perception of a Mobile Robot". J. Corujeira, J. L. Silva and R. Ventura. In Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction (HRI '18), Association for Computing Machinery, New York, NY, USA, pp. 93-94.
"Attitude Perception of an Unmanned Ground Vehicle Using an Attitude Haptic Feedback Device". J. Corujeira, J. L. Silva and R. Ventura. 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Nanjing, China, 2018, pp. 356-363.

Traction Glove

The operator feels vibration when the rover is stuck or sliding.

Description
– Custom made glove with vibration motors
– Operators feel different vibration patterns when the rover loses traction, depending on the state of the rover
– When the rover is stuck or sliding, the glove vibrates to warn the operator
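
The state-to-pattern mapping can be sketched as a simple lookup from traction state to a vibration duty cycle. The pattern values and state names below are hypothetical, chosen only to illustrate the idea; the actual patterns used by the glove differ:

```python
# Hypothetical mapping from rover traction state to a vibration pattern,
# expressed as (on_ms, off_ms) duty cycles sent to the glove's motors.
PATTERNS = {
    "ok":      None,        # no vibration
    "stuck":   (500, 500),  # slow, insistent pulsing
    "sliding": (100, 100),  # fast buzzing
}

def vibration_pattern(state):
    """Return the (on_ms, off_ms) pulse pattern for a traction state,
    or None when no feedback should be rendered."""
    return PATTERNS.get(state)
```

The same mapping idea applies to the traction cylinder device below, with rotation patterns in place of vibration duty cycles.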
 
Published paper
"Traction Awareness Through Haptic Feedback for the Teleoperation of UGVs". 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Nanjing, 2018, pp. 313-319.

Traction Cylinder Device

Description
– Custom made rotation device (cylinder)
– Operators feel different rotation patterns on the palm of the hand when the rover loses traction, depending on the state of the rover
– When the rover is stuck or sliding, the cylinder rotates to warn the operator

Published paper
"Traction Awareness Through Haptic Feedback for the Teleoperation of UGVs". 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Nanjing, 2018, pp. 313-319.

Teleoperation Graphical User Interfaces (GUI)

Raposa-NG GUI

The GUI used with the Raposa-NG robot can be visualised on a computer monitor or within a Head-Mounted Display (HMD). 

This GUI has the following elements:
– Camera feeds that can be cycled through, and pan-tilt control of the main camera, either through the joystick or through head rotation when wearing the HMD.
– Robot location in the map
– Attitude indicator, which is turned off when using the haptic attitude feedback device
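
The head-rotation control can be sketched as a direct mapping from HMD yaw/pitch to camera pan/tilt commands, clamped to the gimbal's mechanical range. The limits and function name below are illustrative assumptions, not the values used on Raposa-NG:

```python
def hmd_to_pan_tilt(yaw_deg, pitch_deg, pan_limit=90.0, tilt_limit=45.0):
    """Map HMD head rotation (degrees) to camera pan/tilt commands,
    clamped to the gimbal's (assumed) mechanical limits."""
    def clamp(value, limit):
        return max(-limit, min(limit, value))
    return clamp(yaw_deg, pan_limit), clamp(pitch_deg, tilt_limit)
```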

 

MEROP GUI for AMADEE-20

The GUI used within the AMADEE-20 mission can be visualised on a computer monitor.

This GUI has the following elements:
– Camera feeds, that can be cycled through
– Robot location in the map
– Attitude indicator, which is turned off when using the haptic attitude feedback device
– Information about communication latency and bandwidth when these degrade beyond a threshold, as well as a warning when there is a connection failure
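
The warning logic can be sketched as simple threshold checks on the link state. The threshold values, timeout, and warning strings below are illustrative assumptions, not those used in the MEROP software:

```python
def link_warnings(latency_ms, bandwidth_kbps, last_msg_age_s,
                  max_latency_ms=500, min_bandwidth_kbps=256, timeout_s=2.0):
    """Return the list of warnings the GUI should display for the link state.

    A connection failure (no messages within timeout_s) overrides the
    latency and bandwidth indicators. Threshold values are placeholders.
    """
    if last_msg_age_s > timeout_s:
        return ["connection failure"]
    warnings = []
    if latency_ms > max_latency_ms:
        warnings.append(f"high latency: {latency_ms} ms")
    if bandwidth_kbps < min_bandwidth_kbps:
        warnings.append(f"low bandwidth: {bandwidth_kbps} kbps")
    return warnings
```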

MEROP GUI for AMADEE-24

The GUI used within the AMADEE-24 mission can be visualised on a computer monitor.

This GUI has the following elements:
– Camera feeds that can be cycled through
– Robot location in the map. The map also shows georeferenced user notes, points-of-interest, regions of interest, and Wi-Fi signal strength
– Icons represent information about Wi-Fi signal strength and battery state
– There are two modes of operation: teleoperation mode and Avatar mode (semi-autonomy)