This video summarizes the research on event-based vision carried out by the Robotics and Perception Group at the University of Zurich between 2013 and 2017. Event-based sensors enable the design of very agile, low-power robots that respond within microseconds to changes in the environment, far faster than robots based on standard cameras, which have a latency of 200 ms or more. We investigate new methods that allow our robots to perceive and understand the environment using neuromorphic sensors. In our work, we strive to demonstrate that these sensors can tackle real-world robotics problems that are out of reach of traditional cameras, particularly in high-speed and/or high-dynamic-range (HDR) conditions.
Related videos:
* Event-based, 6-DOF Pose Tracking for High-Speed Maneuvers using a Dynamic Vision Sensor – https://youtu.be/LauQ6LWTkxM
* EVO: A Geometric Approach to Event-based 6-DOF Parallel Tracking and Mapping in Real-time – https://youtu.be/bYqD2qZJlxE
* Accurate Angular Velocity Estimation with an Event Camera – https://youtu.be/v1sXWoOAs_0
* The Event-Camera Dataset and Simulator – https://youtu.be/bVVBTQ7l36I
* EMVS: Event-based Multi-View Stereo – https://youtu.be/EUX3Tfx0KKE
* Low-Latency Visual Odometry using Event-based Feature Tracks – https://youtu.be/RDu5eldW8i8
* Feature Detection and Tracking with the Dynamic and Active pixel Vision Sensor DAVIS – https://youtu.be/nglfEkiK308
* Event-based, 6-DOF Camera Tracking for High-Speed Applications – https://youtu.be/iZZ77F-hwzs
Publications:
* E. Mueggler, G. Gallego, H. Rebecq, D. Scaramuzza, “Continuous-Time Visual-Inertial Trajectory Estimation with Event Cameras,” arXiv preprint, 2017. https://arxiv.org/pdf/1702.07389.pdf
* H. Rebecq, T. Horstschaefer, G. Gallego, D. Scaramuzza, “EVO: A Geometric Approach to Event-based 6-DOF Parallel Tracking and Mapping in Real-time,” IEEE Robotics and Automation Letters, 2016. http://rpg.ifi.uzh.ch/docs/RAL16_EVO.pdf
* G. Gallego and D. Scaramuzza, “Accurate Angular Velocity Estimation with an Event Camera,” IEEE Robotics and Automation Letters, 2016. http://rpg.ifi.uzh.ch/docs/RAL16_Gallego.pdf
* E. Mueggler, H. Rebecq, G. Gallego, T. Delbruck, D. Scaramuzza, “The Event-Camera Dataset and Simulator: Event-based Data for Pose Estimation, Visual Odometry, and SLAM”. Int. Journal of Robotics Research, 2016. https://arxiv.org/pdf/1610.08336.pdf
* H. Rebecq, G. Gallego, D. Scaramuzza, “EMVS: Event-based Multi-View Stereo,” British Machine Vision Conf., 2016. Best Industry Paper Award! http://rpg.ifi.uzh.ch/docs/BMVC16_Rebecq.pdf
* G. Gallego, J.E.A. Lund, E. Mueggler, H. Rebecq, T. Delbruck, D. Scaramuzza, “Event-based, 6-DOF Camera Tracking for High-Speed Applications,” arXiv preprint, 2016. http://rpg.ifi.uzh.ch/docs/Arxiv16_Gallego.pdf
* B. Kueng, E. Mueggler, G. Gallego, D. Scaramuzza, “Low-Latency Visual Odometry using Event-based Feature Tracks”, IEEE/RSJ Int. Conf. Intelligent Robots and Systems, 2016. Best Application Paper Award Finalist! Highlight Talk. http://rpg.ifi.uzh.ch/docs/IROS16_Kueng.pdf
* D. Tedaldi, G. Gallego, E. Mueggler, D. Scaramuzza. “Feature Detection and Tracking with the Dynamic and Active-pixel Vision Sensor (DAVIS)”. Int. Conf. Event-Based Control, Communication and Signal Processing, 2016. http://rpg.ifi.uzh.ch/docs/EBCCSP16_Tedaldi.pdf
* E. Mueggler, G. Gallego, D. Scaramuzza, “Continuous-Time Trajectory Estimation for Event-based Vision Sensors”, Robotics: Science and Systems, 2015. http://rpg.ifi.uzh.ch/docs/RSS15_Mueggler.pdf
* G. Gallego, C. Forster, E. Mueggler, D. Scaramuzza, “Event-based Camera Pose Tracking using a Generative Event Model,” arXiv preprint, 2015. https://arxiv.org/pdf/1510.01972.pdf
* E. Mueggler, N. Baumli, F. Fontana, D. Scaramuzza. “Towards Evasive Maneuvers with Quadrotors using Dynamic Vision Sensors”, Eur. Conf. Mobile Robots, 2015. http://rpg.ifi.uzh.ch/docs/ECMR15_Mueggler.pdf
* E. Mueggler, C. Forster, N. Baumli, G. Gallego, D. Scaramuzza, “Lifetime Estimation of Events from Dynamic Vision Sensors,” IEEE Int. Conf. Robotics and Automation, 2015. http://rpg.ifi.uzh.ch/docs/ICRA15_Mueggler.pdf
* E. Mueggler, B. Huber, D. Scaramuzza, “Event-based, 6-DOF Pose Tracking for High-Speed Maneuvers”, IEEE/RSJ Int. Conf. Intelligent Robots and Systems, 2014. http://rpg.ifi.uzh.ch/docs/IROS14_Mueggler.pdf
* A. Censi, D. Scaramuzza, “Low-Latency Event-Based Visual Odometry”, IEEE Int. Conf. Robotics and Automation, 2014. http://rpg.ifi.uzh.ch/docs/ICRA14_Censi.pdf
* A. Censi, J. Strubel, C. Brandli, T. Delbruck, D. Scaramuzza, “Low-latency localization by Active LED Markers tracking using a Dynamic Vision Sensor”, IEEE/RSJ Int. Conf. Intelligent Robots and Systems, 2013. http://rpg.ifi.uzh.ch/docs/IROS13_Censi.pdf
Tutorial on event-based cameras: http://www.rit.edu/kgcoe/iros15workshop/papers/IROS2015-WASRoP-Invited-04-slides.pdf
Our research page on event-based vision:
http://rpg.ifi.uzh.ch/research_dvs.html
Robotics and Perception Group, University of Zurich, 2017
http://rpg.ifi.uzh.ch/