Projects

From Introduction to Robotics (CS460)

This page contains some ideas for course projects. If you are interested in any of the projects, send an email to the instructor with your choice.

Fall 2016

Face-based drone motion control

The objective of this project is to develop a PID controller for drone motion using face detection.
You will need to use the Computer Vision ROS Package for Face Detection.
Put this package into the folder ~/catkin_ws/src/gaitech_edu/src.
To run the Python face-detection code, follow these steps. First, start roscore:

roscore

Set the pixel format parameter of the USB camera:

rosparam set usb_cam/pixel_format yuyv

Start the USB camera driver:

rosrun usb_cam usb_cam_node

List the topics related to the USB camera using:

rostopic list

We will use the topic /usb_cam/image_raw.
You can now view the video from the webcam using this command:

rosrun image_view image_view image:=/usb_cam/image_raw

To launch the face tracker application, use the following command:

rosrun gaitech_edu face_tracker.py

You can now see your face being tracked. Observe in the terminal that the coordinates of the bounding box's top-left corner, together with its width and height, are displayed.

Your Task

You need to modify the attached face_tracker.py code and create a new node called face_drone_controller.py that controls the motion of a drone based on the face location. Your face will act as a joystick to make the drone move.

  • When your face is first detected, its position becomes the reference location (0,0).
  • When you move your face in a given direction (left or right), the drone should rotate in the same direction.
  • When you move closer to the screen, the drone should move backward.
  • When you move farther from the screen, the drone should move forward.

You can play with the diagonal of the bounding box to determine how close the face is to the screen.
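As a starting point, the mapping from bounding box to velocity commands can be prototyped as plain Python before wiring it into ROS. The sketch below is an assumption-laden illustration: the function names, gains, and sign conventions are all hypothetical, and in the real face_drone_controller.py node the returned values would be published as geometry_msgs/Twist messages via rospy.

```python
import math

class PID:
    """Textbook PID controller on a scalar error (gains are placeholders)."""
    def __init__(self, kp, ki=0.0, kd=0.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, dt=0.1):
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

# One controller per degree of freedom; here only the P term is tuned.
yaw_pid = PID(kp=0.005)
fwd_pid = PID(kp=0.01)

def face_to_cmd(x, y, w, h, ref_cx, ref_diag):
    """Map the current face bounding box to (yaw_rate, forward_speed).

    ref_cx and ref_diag come from the first detection (the (0,0) reference).
    """
    cx = x + w / 2.0
    diag = math.hypot(w, h)
    # Face left/right of the reference -> rotate toward that side.
    yaw_rate = yaw_pid.step(ref_cx - cx)
    # Larger diagonal means the face moved closer -> negative speed (fly
    # backward); smaller diagonal means farther away -> fly forward.
    forward_speed = fwd_pid.step(ref_diag - diag)
    return yaw_rate, forward_speed
```

Keeping the control logic in pure functions like this also makes it easy to evaluate the PID performance offline with recorded bounding boxes.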

What to submit

You need to submit:

  • the code of your project, by email, by JAN 10, 2016
  • a video showing the control of the simulated drone using your face
  • a demonstration of your code to me on the real drone (you can test your code in advance using the AR Drone)
  • an evaluation of the performance of your PID controller

Grading

The project is worth 20 points.


Fall 2015

Drone Control

The objective of this project is to develop a ROS node that controls the motion of the simulated Parrot AR.Drone 2.0 using the tum_simulator and ardrone_autonomy packages.
The ROS node should perform the following (sample) actions:

  • take off and land the drone (by publishing to the corresponding topics)
  • move the robot in 3D
  • stop the robot
  • rotate the robot
  • return to initial location
  • create a new ROS service that sends an emergency landing command to the drone, making it land immediately

You may add other actions, which will be counted as a bonus.
Think of making a text menu with several options for the user.
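One way to structure the text menu is as a dispatch table mapping user choices to actions. In the sketch below, the takeoff and landing topic names follow the ardrone_autonomy convention (/ardrone/takeoff, /ardrone/land); the menu layout and helper names are assumptions. In the real node each entry would publish a std_msgs/Empty or geometry_msgs/Twist message via rospy, and the emergency landing would be exposed as a ROS service rather than a menu item.

```python
# Hypothetical menu -> action mapping for the drone-control node.
MENU = {
    "1": ("take off",        "/ardrone/takeoff"),
    "2": ("land",            "/ardrone/land"),
    "3": ("move in 3D",      "/cmd_vel"),
    "4": ("stop",            "/cmd_vel"),
    "5": ("rotate",          "/cmd_vel"),
    "6": ("return to start", "/cmd_vel"),
}

def resolve(choice):
    """Look up a menu choice; returns (label, topic) or None if invalid."""
    return MENU.get(choice.strip())

def menu_text():
    """Render the menu string shown to the user in the terminal."""
    return "\n".join("%s) %s" % (k, v[0]) for k, v in sorted(MENU.items()))
```

A dispatch table keeps the input loop trivial: read a line, call resolve(), and publish on the returned topic, which also makes adding bonus actions a one-line change.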

Robot Arm Control

The objective of this project is to develop a ROS node that controls the motion of the PhantomX Pincher robot arm of the Turtlebot. You will use the turtlebot_arm ROS package, as well as a ROS package that we have developed ourselves.
The ROS node should perform the following (sample) actions:

  • Open/close the gripper
  • Set the arm into pre-defined positions
  • Print the joints of the arm
  • Pick an object from a pre-defined position and place it at another pre-defined position

You may add other actions, which will be counted as a bonus.
Think of making a text menu with several options for the user.
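The pick-and-place action above is really a fixed sequence of the other primitives. The sketch below only plans that sequence as data; the pose names and action labels are placeholders, since how each step is executed (e.g. through the turtlebot_arm package's joint topics or MoveIt) depends on the arm interface you use.

```python
# Assumed names for two pre-defined arm poses; the real poses would be
# defined in the turtlebot_arm configuration or taught interactively.
PICK_POSE = "pick_position"
PLACE_POSE = "place_position"

def pick_and_place_plan(pick=PICK_POSE, place=PLACE_POSE):
    """Return the ordered primitive steps of a simple pick-and-place routine."""
    return [
        ("open_gripper", None),   # make sure the gripper is open
        ("move_arm", pick),       # reach the object's pre-defined position
        ("close_gripper", None),  # grasp the object
        ("move_arm", place),      # carry it to the target position
        ("open_gripper", None),   # release the object
    ]
```

Separating the plan from its execution lets the same menu node print the joint states, run single primitives, or replay the whole sequence.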

Arduino Sensors with ROS

The objective of this project is to develop a ROS node that interacts with Arduino sensors. There are two important ROS packages to use: rosserial_arduino, which depends on the native rosserial package. The latter maintains a serial connection between a device and ROS.
There are several easy tutorials to start with. The ROS node should perform the following (sample) actions:

  • create publishers and subscribers for the topics of the Arduino sensors (button, ultrasonic, and other sensors available in the kit)
  • Print the distance generated by the ultrasonic sensor
  • Print the status of the push button (Pressed/Released)
  • Integrate the button and the ultrasonic sensor into the turtlebot_stage simulator so that the robot starts moving when the button is pressed, and stops when the ultrasonic sensor reports a distance of less than 3 cm.
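The start/stop behavior in the last bullet can be kept as a small pure function, independent of the topic callbacks. The 3 cm threshold comes from the task; the function and state names are assumptions, and in the real node the two arguments would be updated by rosserial subscribers while the result drives the cmd_vel publisher.

```python
STOP_DISTANCE_CM = 3.0  # threshold from the task description

def next_state(moving, button_pressed, distance_cm):
    """Return whether the robot should be moving after this sensor update."""
    if distance_cm < STOP_DISTANCE_CM:
        return False      # obstacle too close: always stop
    if button_pressed:
        return True       # button press starts the robot
    return moving         # otherwise keep the current state
```

Testing this logic with fake sensor values first makes debugging the serial link much easier, since any remaining problem is then in the wiring rather than the behavior.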

We have a collection of Arduino sensors that students can borrow and use for the project. You may add other actions, which will be counted as a bonus.
Think of making a text menu with several options for the user.