Projects
1. ROS-WebXR Integration for Intuitive AR Robot Control
2. BRUH (Battlefront Robot Using Hand-tracking)
3. ME-134y the Force Be With You
4. Object Detection and Prediction using Kalman Filter
5. MONKEY BAR's
ROS-WebXR Integration for
Intuitive AR Robot Control
Background:
- Interacting with robots through Augmented Reality (AR) offers a better understanding of the robot's perception of its environment.
- Enhancing human-robot interaction through AR faces challenges such as frequent recompilation for minor updates, the need for users to download and learn AR applications, and the requirement of ROS proficiency for robot control.
Project Objectives:
- Utilize the Robot Operating System (ROS) in conjunction with WebXR, a web-based Augmented Reality (AR) platform, to enable intuitive interactions with robots through a web browser on any mobile device.
- Optimize robot interaction for non-experts by eliminating any process that relies on background knowledge of robot control or AR applications.
Project Details:
- By simply accessing a designated link, any user with a mobile device can start interacting with a robot.
- The robot can be teleoperated with buttons on the screen.
- LIDAR data, displayed in AR, can be observed by pointing the camera at a marker image attached to the robot.
- For simulation, a TurtleBot3 Burger is used in a Gazebo-simulated environment, and RViz is used to visualize the LIDAR data.
- ROS, an HTTPS server (Apache 2), and a WebSocket connection (rosbridge) are hosted on a main device.
- Tested with:
  - Ubuntu 20.04
  - ROS Noetic
  - iPhone 11 (iOS 16.3.1)
  - Galaxy S8+ (Android 9)
  - Chrome mobile (version 119.0.6045.169)
  - Safari (version 16.3)
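As an illustration of the rosbridge side of this setup, the sketch below builds the JSON messages a web client would send over the WebSocket to drive the robot. The topic name `/cmd_vel` and the helper function names are assumptions for illustration, not taken from the project code; only the `op`/`topic`/`msg` envelope follows the rosbridge v2 protocol.

```python
import json

def advertise(topic, msg_type):
    """Build a rosbridge 'advertise' message announcing a topic."""
    return json.dumps({"op": "advertise", "topic": topic, "type": msg_type})

def drive_command(linear_x, angular_z, topic="/cmd_vel"):
    """Build a rosbridge 'publish' message carrying a geometry_msgs/Twist."""
    twist = {
        "linear": {"x": linear_x, "y": 0.0, "z": 0.0},
        "angular": {"x": 0.0, "y": 0.0, "z": angular_z},
    }
    return json.dumps({"op": "publish", "topic": topic, "msg": twist})

# A "forward" button on the web UI might send:
forward = drive_command(0.2, 0.0)
# and a "turn left" button:
left = drive_command(0.0, 0.5)
```

In the actual system these strings would be sent over the WebSocket that rosbridge exposes; any browser that can open a WebSocket can therefore teleoperate the robot without a native app.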


BRUH:
Battlefront Robot Using Hand-tracking
Design & Creativity:
- A four-wheel robot resembling a tactical vehicle.
- It is fully independent of all external sources: no Wi-Fi, no cables, no laptops, nothing.
- It can be powered on anywhere - from your room to the middle of the desert - and work as expected.
- All parts except for the wheels are designed in CAD and 3D-printed.
- Omnidirectional wheels allow the robot to move freely in all directions.
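The omnidirectional drive can be sketched with the standard four-mecanum-wheel mixing equations; the wheel layout, half-wheelbase `L`, and half-track-width `W` below are assumed values for illustration, not the robot's actual geometry.

```python
def mecanum_wheel_speeds(vx, vy, wz, L=0.1, W=0.1):
    """Mix body velocities (forward vx, strafe vy in m/s, yaw wz in rad/s)
    into wheel speeds for a standard four-mecanum-wheel layout, returned in
    order: front-left, front-right, rear-left, rear-right."""
    k = L + W
    return [
        vx - vy - k * wz,  # front-left
        vx + vy + k * wz,  # front-right
        vx + vy - k * wz,  # rear-left
        vx - vy + k * wz,  # rear-right
    ]
```

For example, a pure forward command spins all four wheels equally, while a pure strafe command spins diagonal pairs in opposite directions.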
Technical Details:
- Hand gestures detected by a mounted camera are used to teleoperate the robot.
- The mounted camera rotates to keep the human hand in view.
- The camera creates its own access point, which the Raspberry Pi connects to in order to receive the video data.
- The Raspberry Pi processes the video and sends the detected hand commands to an ESP32 via UART serial communication.
- The ESP32 controls each individual motor according to the received commands, with the help of LIDAR and IMU sensor data.
- The robot is programmed in C++, Python, and MATLAB.
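The Pi-to-ESP32 link described above needs the gesture commands framed as bytes on the UART. The project's actual wire format is not documented here, so the sketch below is a hypothetical minimal framing (start byte, command, speed, checksum) to show the idea.

```python
import struct

# Hypothetical single-byte command codes for detected gestures.
CMD_STOP, CMD_FORWARD, CMD_BACKWARD, CMD_LEFT, CMD_RIGHT = range(5)
START_BYTE = 0xAA

def encode_command(cmd, speed):
    """Frame a gesture command as [start, cmd, speed, checksum] for UART."""
    checksum = (cmd + speed) & 0xFF
    return struct.pack("BBBB", START_BYTE, cmd, speed, checksum)

def decode_command(frame):
    """Validate and unpack a 4-byte frame; returns (cmd, speed),
    or None if the frame is corrupt."""
    start, cmd, speed, checksum = struct.unpack("BBBB", frame)
    if start != START_BYTE or checksum != (cmd + speed) & 0xFF:
        return None
    return cmd, speed
```

A start byte plus checksum lets the ESP32 resynchronize and drop garbled frames, which matters on a noisy serial link between two boards.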
Limitations & Possible Improvements:
- Wireless communication between the camera and the Raspberry Pi causes a considerable amount of delay.
- The robot requires the user's hand to be at roughly the same height as the camera, since the camera must detect the hand for the robot to move.
- Switching to a camera that transfers data over a cable would reduce the delay significantly.
- Utilizing AR, VR, or MR to remove the camera-angle limitation could make the robot more viable for real-world use.


ME-134y the Force Be With You

Design & Creativity:
- A robotic arm featuring a "Star Wars" theme, with the capability to perform writing tasks.
- The arm is mounted upside down and hidden inside a box.
- Only the pen at the end of the arm is visible, creating the cinematic view of Baby Yoda using "the Force" to move the "Lightsaber-pen".
- All arm components are designed in CAD and 3D-printed.
Technical Details:
- A highly affordable robotic arm actuated by some of the cheapest servo motors on the market.
- It is a 4-degrees-of-freedom system that uses two plastic micro servos (elbow and wrist) and two metal-gear servos (base and shoulder).
- The lengths of all the arm links are calculated so the arm can write on a sheet of A4 paper.
- The arm writes letters of the alphabet using a custom inverse kinematics library built from trigonometric functions.
- A plastic tube covers the pen so that the LED light behind the pen travels through the tube, giving it a Lightsaber-like glow.
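The core of a trigonometry-based inverse kinematics library like the one described is the planar two-link (shoulder-elbow) solution via the law of cosines. The sketch below shows that piece only; the link lengths are placeholder values, not the arm's real dimensions, and the full 4-DOF solution would add the base rotation and wrist angle on top.

```python
import math

def two_link_ik(x, y, l1=0.12, l2=0.10):
    """Solve shoulder and elbow angles (radians) for a planar two-link arm
    reaching point (x, y); l1, l2 are placeholder link lengths in meters.
    Returns None if the target is out of reach."""
    r2 = x * x + y * y
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)  # law of cosines
    if abs(c2) > 1:
        return None  # target outside the arm's workspace
    elbow = math.acos(c2)  # elbow-down branch
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def forward(shoulder, elbow, l1=0.12, l2=0.10):
    """Forward kinematics, used here to sanity-check an IK solution."""
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y
```

Checking every IK solution by running it back through forward kinematics is a cheap way to catch sign and branch errors before commanding real servos.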
Limitations & Possible Improvements:
- Because of the low-cost, imprecise servo motors (the project was underfunded), the writing is not precise.
- The inverse kinematics library written for this project is only applicable to a robotic arm with this exact configuration.
- Switching to more accurate servo motors, or motors with encoders, would improve the accuracy.


Object Detection and Prediction
using Kalman Filter
Project Details:
- A Kalman filter was used to detect and predict the location of a moving object.
- Based on the previous state of the object, the current state is updated by fusing the predicted and measured positions of the detected object.
- Python was the base language, and libraries such as OpenCV and imutils were used for video streaming and image processing.
- No Kalman filter libraries were used.
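A from-scratch 2D tracker of this kind typically boils down to a predict/update pair over a position-velocity state. The sketch below uses a generic constant-velocity model; the project derived its parameters from the ball's physics, so the `dt`, `Q`, and `R` values here are illustrative assumptions only.

```python
import numpy as np

# Constant-velocity model: state = [x, y, vx, vy].
dt = 1.0 / 30.0                             # assumed frame interval
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)   # state transition
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)   # we only measure position
Q = np.eye(4) * 1e-3                        # process noise (assumed)
R = np.eye(2) * 1e-1                        # measurement noise (assumed)

def predict(x, P):
    """Propagate the state and covariance one frame forward."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    """Correct the prediction with a measured position z = [x, y]."""
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P
```

During occlusion, the update step is simply skipped and the filter coasts on `predict` alone, which is what makes the occluded-ball condition below work.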
Experiment:
- For simplicity, both the target object and the environment were kept as simple as possible.
- A ball with a uniform, solid green color was chosen as the target object.
- The parameters for the Kalman filter were calculated using the physics of the ball (calculation details are in the link).
- To find the contour of the ball, a color mask for green, constructed with predefined HSV values, was used.
- The red circle represents the current measured location.
- The blue circle represents the current predicted location.
- The experiment was conducted under four different conditions:
  - Steady - detect and predict a steady ball
  - Bouncing - detect and predict a bouncing ball
  - Rolling - detect and predict a rolling ball
  - Occluded - detect and predict a rolling ball while it is not visible

1. Steady

2. Bouncing

3. Rolling

4. Occluded


MONKEY BAR's

Design & Creativity:
- A moving bar for monkeys that traverses across monkey bars.
- The course consists of different sections, such as sections with different grip sizes or different distances between bars.
- The robot completes the course with three "tracks" fitted with special hooks designed for the different grip sizes.
- All components are designed in CAD and 3D-printed.
Technical Details:
- One of the biggest challenges in this project was gravity.
- The robot achieves stability against gravity through two design aspects: the three tracks and the hook design.
- With three tracks, the robot always holds onto two bars, moving only one track at a time.
- Using three sets of these unconventional hooks, which firmly grab the bar like a real hand, the robot was able to overcome tilting due to gravity.
- The robot has two separate operational modes: teleoperated and autonomous.
- A human user can control the robot with hand gestures, or the robot can autonomously traverse the bars using its sensors.
- In autonomous mode, using the sensor data from the LIDARs on each track and an IMU, the robot cycles through the tracks with its internal states.
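The autonomous cycling through the three tracks can be pictured as a small state machine. The state names, the track rotation order, and the idea that the moving track's LIDAR confirms the next bar is in range are illustrative assumptions, not the project's actual logic.

```python
class TrackCycler:
    """Minimal sketch of one-track-at-a-time cycling over monkey bars."""

    def __init__(self):
        self.moving_track = 0          # which of the three tracks is in the air
        self.state = "RELEASE"

    def step(self, bar_detected):
        """Advance one tick; bar_detected means the moving track's LIDAR
        reports the next bar within grabbing range."""
        if self.state == "RELEASE":    # let go with the moving track
            self.state = "ADVANCE"
        elif self.state == "ADVANCE":  # slide the track toward the next bar
            if bar_detected:
                self.state = "GRAB"
        elif self.state == "GRAB":     # hook the new bar, rotate track roles
            self.moving_track = (self.moving_track + 1) % 3
            self.state = "RELEASE"
        return self.state
```

The invariant the real robot relies on is visible here: a track only enters RELEASE after the previous one has finished GRAB, so two hooks are always engaged.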
Limitations & Possible Improvements:
- Because the robot must hold onto two different bars at all times, the tracks had to span three times the distance between bars, making the entire body extremely long and heavy.
- Due to this significant weight, the robot still showed some instability while traversing the bars.
- If the design were modified so that each component is lighter while the length is maintained, the system would be more stable.