Relevant publication: Yezhou Yang, Yiannis Aloimonos, Cornelia Fermuller. Detection of Manipulation Action Consequences (MAC). IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2013, Portland, Oregon.
ZIP file: ROS PACKAGE V1.0
Special thanks to the authors of the following works for sharing the active segmentation and particle-filter-based tracking code:
Mishra, Ajay, Yiannis Aloimonos, and Cornelia Fermuller. "Active segmentation for robotics." IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2009.
Nummiaro, Katja, Esther Koller-Meier, and Luc Van Gool. "A color-based particle filter." First International Workshop on Generative-Model-Based Vision, Copenhagen, Denmark, 2002.
1) First install ROS (only tested on Diamondback). For more information about ROS, please refer HERE.
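   On Ubuntu, a typical Diamondback installation (assuming the ROS package repository has already been added to your apt sources) might look like:

       sudo apt-get install ros-diamondback-desktop-full
       echo "source /opt/ros/diamondback/setup.bash" >> ~/.bashrc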
2) Install the ROS package rosTrackSeg from the .zip file, check its dependencies, and then run "rosmake".
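   For example, assuming the archive is named rosTrackSeg.zip and ~/ros_workspace is on your ROS_PACKAGE_PATH (both names are only illustrative):

       unzip rosTrackSeg.zip -d ~/ros_workspace
       rosdep install rosTrackSeg    # resolve system dependencies
       rosmake rosTrackSeg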
3) Publish a live camera or Kinect input on the topic /camera/rgb/image_color, or play back a recorded .oni file as a live stream on the same topic.
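   For a Kinect, the openni_camera driver shipped with Diamondback publishes this topic (the exact launch file name may differ with your driver version):

       roslaunch openni_camera openni_node.launch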
4) run "rosrun rosTrackSeg trackSeg". There will be a window with name "select a fixation point", jump out. Select a fixation point within the object under manipulation and then press ENTER.
5) The active monitoring process now starts, and you can see the result in the label window. You are welcome to modify our program to save the segmentation results to disk.
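   A minimal sketch of such a modification, assuming the label image is available as a cv::Mat at the point where it is displayed (the names labelImg and frameCounter are hypothetical, not taken from the package):

       #include <opencv2/core/core.hpp>
       #include <opencv2/highgui/highgui.hpp>
       #include <sstream>
       #include <iomanip>

       // Write the current segmentation label image to a numbered PNG file.
       void saveLabelFrame(const cv::Mat& labelImg, int frameCounter)
       {
           std::ostringstream name;
           name << "segmentation_" << std::setw(5) << std::setfill('0')
                << frameCounter << ".png";
           cv::imwrite(name.str(), labelImg);
       }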
Questions? Please contact yzyang "at" cs dot umd dot edu