
Computer Vision Lab


Lecturers: Friedrich Fraundorfer, Kevin Koeser

Assistant: Lorenz Meier

Schedule

The course starts with an info event at which the possible projects are presented. All students interested in the course should therefore attend this info event.


MID-TERM PRESENTATION: Tue, November 15, 14-16h (Lab). FINAL PRESENTATION: Tue, December 20, 14-16h (Lab).



Course Video

This video shows some demos that are in the spirit of the course goals. Each student works on his or her project throughout the semester towards a final demo showing the work on the real system.





Course Description

The goal of this course is to learn how to develop a computer vision system and to gain hands-on experience. Important components of a computer vision system, such as camera systems, camera interfaces, image processing libraries, and camera calibration, will first be explained in tutorials and then used in projects to gain an in-depth understanding.



Grading

Grading will be based on the implementation, the final demo, and the written report.

Prerequisites

There are no formal prerequisites; however, Computer Vision I or II, or 3D Photography, is recommended.

Course Outline

During this course students participate in tutorial sessions and carry out a project. Tutorials are held at the beginning of the course. Each student is assigned one topic and works continuously over the whole semester towards the goals of his or her project. The work is carried out partly at home and partly in the experimental lab in CNB D 102.1, where Ubuntu Linux computers with a direct connection to the helicopter are available.

Tutorial Lectures

The following tutorials will be held. Slides will be available ahead of lectures.

  • Information Event (September 21, 2011) (slides)
  • OpenCV Basics and Camera Calibration (see the calibration sketch after this list)
  • Pixhawk System (Middleware, QGroundControl GUI)
  • Autonomous Flight with the Vicon Capture System (half day)
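
As a taste of the OpenCV Basics and Camera Calibration tutorial, here is a minimal calibration sketch using OpenCV's Python bindings and a planar checkerboard. The checkerboard geometry, the image folder name, and the choice of the Python API are assumptions made for illustration only, not part of the tutorial material.

    import glob
    import cv2
    import numpy as np

    # Assumed checkerboard: 9x6 inner corners, 25 mm squares (illustrative values).
    PATTERN = (9, 6)
    SQUARE_SIZE = 0.025  # metres

    # 3D corner coordinates in the board frame (z = 0 plane).
    objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_SIZE

    obj_points, img_points, image_size = [], [], None

    for path in glob.glob("calib_images/*.png"):  # hypothetical folder of calibration shots
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, PATTERN)
        if not found:
            continue
        # Refine the detected corners to sub-pixel accuracy.
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)
        image_size = gray.shape[::-1]

    # Estimate the intrinsic matrix K and the lens distortion coefficients.
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, image_size, None, None)
    print("RMS reprojection error:", rms)
    print("Camera matrix K:\n", K)

The resulting camera matrix and distortion coefficients can then be used to undistort images (cv2.undistort) before any metric computation.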

Possible Student Projects

This term (Fall 2011) all projects involve our Pixhawk Micro Aerial Vehicle. Each project implements a Linux process that performs one of the tasks from the list below and runs onboard the helicopter. The software will be developed on standard Ubuntu desktop computers or laptops and then tested on the quadrotor. Students may use the Vicon motion capture system for ground truth and localization. Feel free to propose your own topics as well; if they fit, we will try to accommodate them. Taken topics are marked as deleted.
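
To make the shape of such an onboard process concrete, the following is a minimal sketch of a per-frame processing loop, written with OpenCV's Python bindings and assuming the camera shows up as video device 0 on the development machine; the actual onboard camera interface and the Pixhawk middleware hooks are covered in the tutorials and are not shown here.

    import cv2

    # Assumption: a camera (e.g. a USB webcam on the development desktop) is
    # available as video device 0; the onboard camera interface differs.
    cap = cv2.VideoCapture(0)

    while True:
        ok, frame = cap.read()
        if not ok:
            break

        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # The per-project vision task (detection, tracking, odometry, ...) would
        # run here, and its result would be passed on to the flight controller
        # through the Pixhawk middleware.

        cv2.imshow("onboard view", gray)       # visualization for desktop testing only
        if cv2.waitKey(1) & 0xFF == ord('q'):  # press 'q' to quit
            break

    cap.release()
    cv2.destroyAllWindows()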

  • Person/People tracking (in 3D) with avoidance (detect a person in the camera image, let the helicopter know the obstacle position and avoid it)
  • Car detection and tracking (detect a car in images or aerial images and calculate its position)
  • Crowd tracking from aerial images (detect multiple people and track the joint movement)
  • People/face identification (detect a face in the image and identify individual persons)
  • Gesture interpretation for control commands (detect hand/body position and derive control commands)
  • Photorealistic environment modeling (map out the environment from images and create photorealistic models)
  • Ball catching (calculate the trajectory of a thrown ball)
  • Fly through hoop (detect the hoop as a circle and fly through it; see the circle-detection sketch after this list)
  • Visual odometry (incrementally estimate the current position based on camera images)
  • Change detection (detect changes in images / in the 3D scene, e.g. a moved chair)
  • Automatic 3D object scanner (take images from multiple sides from the same object and fuse them into a 3D model)
  • Semantic segmentation (segment the image into walls, floors, ceiling, objects)
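
As an example of how one of the listed tasks could be approached, the sketch below detects a hoop as a circle in a single image using OpenCV's Hough circle transform (see the fly-through-hoop project above). The image file name, the blur and Hough parameters, and the assumption that the hoop projects to a roughly circular shape are illustrative choices, not a prescribed solution.

    import cv2
    import numpy as np

    # Hypothetical test image; in the project this would be a frame from the onboard camera.
    frame = cv2.imread("hoop_frame.png")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)  # suppress noise before the circle transform

    # Hough circle transform; the radius bounds (in pixels) narrow the search.
    circles = cv2.HoughCircles(
        gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=100,
        param1=100, param2=60, minRadius=30, maxRadius=200)

    if circles is not None:
        x, y, r = [int(v) for v in np.round(circles[0, 0])]  # strongest circle candidate
        cv2.circle(frame, (x, y), r, (0, 255, 0), 2)          # draw the detected hoop
        cv2.circle(frame, (x, y), 3, (0, 0, 255), -1)         # mark its centre
        print("Hoop candidate at pixel (%d, %d), radius %d px" % (x, y, r))

    cv2.imshow("hoop detection", frame)
    cv2.waitKey(0)

Estimating the hoop's 3D position from the detected circle additionally requires the calibrated camera intrinsics and the known physical hoop radius.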
