About


  [Google Scholar] [LinkedIn]


  I am an Assistant Professor in the Department of Mechanical, Industrial, and Mechatronics Engineering at Toronto Metropolitan University (formerly Ryerson) in Toronto, Canada. My research focuses on robotics, SLAM, focal-plane sensor-processor arrays (FPSP), and deep learning.


 

From July 2018 to July 2019, I was a Dyson Research Fellow at Imperial College London, working with Prof. Andrew Davison and Prof. Stefan Leutenegger on semantic SLAM. Before that, I was a Research Associate in the Robot Vision Group, where I worked on the PAMELA project with Prof. Andrew Davison and Prof. Paul Kelly. At Imperial College London, I worked on a range of cutting-edge robotics and vision algorithms and devices, including focal-plane sensor-processor arrays (FPSP), deep learning, and active vision.


 

In 2015, I worked at 2G Robotics (now Voyis Imaging Inc.) on underwater robotics and perception. Developing machine vision algorithms for underwater environments is very challenging because of nonlinear light refraction at the water-glass-air interfaces of the camera housing. I was actively involved in the following projects:

Underwater Stereo Vision: Using stereoscopic vision underwater, we can build 3D models of submerged structures and wrecks. Building these models in real time allows underwater robots to navigate autonomously. In this project, I designed the stereoscopic system, calibrated the stereo cameras, extracted depth from stereo images, performed visual odometry, and built 3D point cloud maps (a minimal sketch of the depth-extraction step follows these project descriptions). The mapping application was developed in C++ with OpenCV under Linux.

Omnidirectional Perception: In this project, a unique omnidirectional catadioptric camera system combined with an underwater laser was developed. The goal was to build 3D models of unknown environments (implementation in C++/Linux).
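
As an illustration of the stereo-depth step mentioned above, here is a minimal C++/OpenCV sketch that computes a disparity map from a rectified image pair and reprojects it to a 3D point cloud. The file names, matcher parameters, and the placeholder Q matrix are illustrative assumptions, not details of the 2G Robotics system.

// Minimal sketch (C++/OpenCV) of the stereo-depth step: disparity from a
// rectified image pair, then reprojection to a 3D point cloud. File names,
// matcher parameters, and the placeholder Q matrix are illustrative only.
#include <opencv2/opencv.hpp>

int main() {
    cv::Mat left  = cv::imread("left.png",  cv::IMREAD_GRAYSCALE);  // assumed rectified
    cv::Mat right = cv::imread("right.png", cv::IMREAD_GRAYSCALE);
    if (left.empty() || right.empty()) return 1;

    // Semi-global matching; numDisparities must be a multiple of 16.
    cv::Ptr<cv::StereoSGBM> sgbm =
        cv::StereoSGBM::create(/*minDisparity=*/0, /*numDisparities=*/128, /*blockSize=*/5);

    cv::Mat disp16, disp;
    sgbm->compute(left, right, disp16);          // fixed-point disparity, scaled by 16
    disp16.convertTo(disp, CV_32F, 1.0 / 16.0);  // disparity in pixels

    // Q is the 4x4 reprojection matrix from cv::stereoRectify (calibration step);
    // an identity matrix is used here purely as a placeholder.
    cv::Mat Q = cv::Mat::eye(4, 4, CV_64F);
    cv::Mat points3d;
    cv::reprojectImageTo3D(disp, points3d, Q);   // per-pixel (X, Y, Z) coordinates

    return 0;
}

In a real pipeline, Q would come from the stereo calibration, and the resulting point clouds would feed the visual odometry and mapping stages described above.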


 

In 2014, I was a postdoctoral fellow in the COBRA Lab at the University of New Brunswick, under the supervision of Dr. Howard Li, working on 3D SLAM for autonomous quadrotor aircraft. The project involved autonomous navigation, target localization, and 2D/3D SLAM with a scanning laser rangefinder and an RGB-D camera in GPS-denied environments, with real-time 3D maps rendered on multiple tablets.

3D Mapping for Autonomous Quadrotor Aircraft: With a quadrotor in a GPS-denied environment, I accomplished autonomous navigation, target localization, and 2D/3D mapping using onboard sensors such as an IMU, laser rangefinder, and camera. The 3D maps and targets were rendered in real time on multiple remote tablets (implementation in C++/Linux).

Localization and Mapping Dataset: I collected imagery, laser, inertial, and GPS data for a fixed-wing aircraft, a quadrotor, and multiple ground robots for system benchmarking (implementation in ROS).

In February 2014, I received my PhD from the University of New Brunswick (2009-2014). The title of my thesis was “Multiple-robot simultaneous localization and mapping”. I worked on several robotics projects as follows:

1. I designed and developed several novel algorithms for multiple-robot SLAM;

2. I designed and implemented an autonomy stack, including perception, path planning, and exploration, for an autonomous quadrotor;

3. I studied underwater navigation systems.

Here are more details about my projects:


 

Perception and Navigation for Autonomous Rotorcraft, funded by DRDC - Suffield: In this project, I designed and developed a fully autonomous UAV to support dismounted soldiers. Acting as a third eye above the ground, the autonomous UAV increases soldiers' and other operators' awareness of potential targets. It also has many other applications, such as border patrol, first response, natural resource management, and environmental monitoring. Research and development of this kind can significantly improve quality of life (implementation in C++/Linux, ROS).

Multiple-robot Simultaneous Localization and Mapping, funded by NSERC: When robots team up, tasks can be completed faster and more accurately; however, this advantage comes at the price of complicated coordination between the robots. In this project, I developed multiple novel algorithms to address these coordination problems. Using these algorithms, a swarm of robots can be deployed to perform challenging tasks, such as search and rescue and marine spill removal, more efficiently (implementation in C++/Linux).

Underwater Mapping and Navigation, funded by Defence R&D Canada - Atlantic: We have more information about the Moon and Mars than we do about our own oceans. In this project, I studied and reviewed state-of-the-art underwater navigation techniques and proposed new methods. These methods, combined with the algorithms proposed for multi-robot systems, allow us to explore the underwater world and increase our knowledge so that we can better monitor and preserve our resources (implementation in C++/Linux, MATLAB).