The goal of this project is to develop a robotic guide that interacts with visitors to HKUST, guiding them from an interactive kiosk to their desired destination. The robot will first be installed in the ECE department, but we anticipate that, if successful, it can also be rolled out to other departments and parts of the university. In the ECE version, visitors at the interactive kiosk will indicate which faculty member they wish to visit. A prompt will then ask the visitor whether they would like to be guided to that office. If the visitor answers in the affirmative, a robot will meet them at the kiosk, guide them to the desired office, and then return to its charging station. To complete this project successfully, students will need to master concepts in multi-sensor self-localization and mapping, robot navigation, people detection, obstacle avoidance, human-robot interfaces, network communication, and graphical user interface design.
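The visit workflow described above (kiosk request, confirmation prompt, guided walk, return to the charging station) can be sketched as a small state machine. The state and event names below are illustrative assumptions for discussion, not part of the project's actual design:

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()       # robot docked at its charging station
    MEETING = auto()    # travelling to the kiosk to meet the visitor
    GUIDING = auto()    # leading the visitor to the requested office
    RETURNING = auto()  # heading back to the charging station

def step(state: State, event: str) -> State:
    """Advance the guidance workflow by one event (hypothetical events)."""
    transitions = {
        (State.IDLE, "guide_confirmed"): State.MEETING,
        (State.MEETING, "arrived_at_kiosk"): State.GUIDING,
        (State.GUIDING, "arrived_at_office"): State.RETURNING,
        (State.RETURNING, "docked"): State.IDLE,
    }
    # Unrecognized events (e.g. the visitor declines the prompt)
    # leave the robot in its current state.
    return transitions.get((state, event), state)
```

One full visit then corresponds to the event sequence `guide_confirmed`, `arrived_at_kiosk`, `arrived_at_office`, `docked`, which cycles the robot from `IDLE` back to `IDLE`.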
Note that the work packages of this project were partially completed in the previous semester. The new project will place greater emphasis on integration and implementation, with adaptation to new platforms.
UROP1100 UROP2100 UROP3100 UROP4100
This project is best suited to a team of students who split it into smaller subprojects, each concentrating on a particular aspect of the system and a particular concept from the list above. Students will be responsible for algorithm design and robot specification; however, the robot platform will be built from pre-existing commercial systems as much as possible, unless students have a strong interest in building their own robot system. Students interested in this project should have a combination of strong practical skills, programming skills, and a mathematical/signal-processing background.
Applicant's Learning Objectives:
1. Students will enhance their knowledge of robotics.
2. Students will become familiar with topics such as multi-sensor self-localization and mapping, robot navigation, people detection, obstacle avoidance, human-robot interfaces, network communication, and graphical user interface design.
3. Students will gain practical experience in implementing human-computer interfaces and algorithms/systems for robotic navigation.