
Goal: Use the Kinect, Hokuyo, and Segway base to create a people-hunting machine.

People to Yell at

  • Oliver Croomes
  • Ji Won Min

Plan

General overview

The segbot is already able to detect gestures and avoid obstacles. We will use the velocity-assigning code from Assignment 1 of CS378 this past spring to integrate our projects and have the robot chase after a person while avoiding obstacles. The robot will only begin the chase when the person makes the "Kung Fu" pose, and it will stop once it gets close enough to the person.

Kung Fu pose: see 1:50 in the linked video [1].
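As a rough illustration of the intended control loop, here is a minimal rospy sketch. It assumes a gesture node publishes an Empty message on /kung_fu_detected once the pose is recognized, that openni_tracker is broadcasting the usual /torso_1 tf frame, and that the reference frame follows the ROS convention (x forward, y left); the topic names, frame names, gains, and stop distance are all assumptions, not confirmed project values, and the asg1 obstacle-avoidance layer is omitted.

```python
#!/usr/bin/env python
# Hypothetical sketch of the chase behavior: wait for the "Kung Fu"
# gesture, then drive toward the tracked torso frame until close enough.
# Topic/frame names and constants are assumptions, not project values.
import math
import rospy
import tf
from std_msgs.msg import Empty
from geometry_msgs.msg import Twist

STOP_DISTANCE = 0.8  # meters; halt the chase once this close (assumed)

class Chaser:
    def __init__(self):
        self.chasing = False
        self.listener = tf.TransformListener()
        self.cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
        rospy.Subscriber('/kung_fu_detected', Empty, self.start_chase)

    def start_chase(self, _msg):
        self.chasing = True

    def step(self):
        if not self.chasing:
            return
        try:
            # openni_tracker broadcasts skeleton joints as tf frames.
            (trans, _rot) = self.listener.lookupTransform(
                '/openni_depth_frame', '/torso_1', rospy.Time(0))
        except (tf.LookupException, tf.ConnectivityException,
                tf.ExtrapolationException):
            return  # no skeleton visible this cycle
        distance = math.sqrt(trans[0] ** 2 + trans[1] ** 2)
        cmd = Twist()
        if distance > STOP_DISTANCE:
            cmd.linear.x = min(0.4, 0.5 * distance)  # capped approach speed
            cmd.angular.z = 2.0 * math.atan2(trans[1], trans[0])  # steer
        else:
            self.chasing = False  # close enough: publish zeros and stop
        self.cmd_pub.publish(cmd)

if __name__ == '__main__':
    rospy.init_node('people_hunter')
    chaser = Chaser()
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        chaser.step()
        rate.sleep()
```

In the actual integration, the velocity command would go through the asg1 obstacle-avoidance code rather than straight to /cmd_vel.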

Timeline

  • Jun 11: Create an outline of project details and a timeline for subtasks.
    - Update: Bender is the only segbot with openni_tracker, which is needed for gesture recognition. However, Bender is also the only robot without a laser scanner, which is needed for obstacle avoidance.
  • Jun 14: Create messages for skeleton_listener to send to asg1. Revise asg1 to listen to the new messages rather than blob messages.
  • Jun 24 - 28: Continue with code integration.
    - Update: The code compiles, but the messages are not going through from skeleton_listener to asg1.
  • July 1 - 5: Messages are still not going through from skeleton_listener to asg1; we need assistance with the ROS message setup. All of the code is written. However, a new rule for deciding when to halt the robot should be implemented because of the noise from openni_tracker (see the sketch after this timeline).
  • July 8 - 12:

  • July 15 - 19:
  • July 22 - 26:
  • July 29 - Aug 2nd:
  • Aug 5 - 9:
  • Aug 12 - 16:
  • Aug 19 - 23:
  • Aug 26 - 30:
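The "messages not going through" symptom from the Jun 24 and July 1 entries is usually caused by mismatched topic names or message types between the publisher and subscriber; running `rostopic list`, `rostopic info`, and `rostopic echo` on both ends will show what is actually being published. For the halting problem noted on July 1 - 5, below is a minimal sketch of one possible noise-tolerant halt rule, assuming the chase loop supplies one person-distance reading per cycle; the window size, threshold, and consecutive-count values are illustrative assumptions, not tuned project parameters.

```python
# Sketch of a noise-tolerant halt rule for the noisy openni_tracker
# readings. All constants below are illustrative assumptions.
from collections import deque

class HaltFilter:
    """Median-filter recent distances and require several consecutive
    below-threshold readings before declaring 'close enough'."""

    def __init__(self, window=5, threshold=0.8, required=3):
        self.readings = deque(maxlen=window)  # last few raw distances (m)
        self.threshold = threshold            # halt distance in meters
        self.required = required              # consecutive hits needed
        self.below_count = 0

    def should_halt(self, distance):
        self.readings.append(distance)
        # The median is more robust than the latest reading to spikes.
        median = sorted(self.readings)[len(self.readings) // 2]
        if median < self.threshold:
            self.below_count += 1
        else:
            self.below_count = 0
        return self.below_count >= self.required

# Example: a single noisy spike at 0.3 m does not stop the robot.
if __name__ == '__main__':
    f = HaltFilter()
    for d in [2.0, 1.9, 0.3, 1.8, 0.7, 0.7, 0.6, 0.6]:
        print(d, f.should_halt(d))
```

Requiring several consecutive below-threshold medians trades a small stopping delay for immunity to single-frame tracker glitches.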