Cassie vs Spin: A True Story
Cassie Blue gets her independence on July 4th, 2019! In this video, Cassie Blue is navigating autonomously. For now, her autonomy is restricted to turning left at intersections, and we demonstrate her abilities at the Wave Field on the University of Michigan campus. If you’re thinking, “That’s not a lot of independence,” then you’re right, but it’s a great first step away from a human with an RC controller!

Using a RealSense RGBD camera, an IMU, and our version of an InEKF with contact factors, Cassie Blue builds a 3D semantic map in real time that identifies sidewalks, grass, poles, bicycles, and buildings. From the semantic map, occupancy and cost maps are computed, identifying the sidewalk as walkable area and all other features as obstacles. A planner then sets a goal approximately 50 cm to the right of the sidewalk's leftmost edge and plans a path around obstacles and corners using D*. The path is translated into waypoints that are achieved via Cassie Blue's gait controller.
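To make the map-to-goal step concrete, here is a minimal sketch of how a cost map and the "50 cm right of the leftmost sidewalk edge" goal could be derived from a semantic grid. The label IDs, cell size, and function names are illustrative assumptions, not the actual pipeline's API, and the real system runs D* over the full cost map rather than this simplified per-row goal selection.

```python
import numpy as np

# Hypothetical label IDs; the real label set comes from the
# semantic segmentation network and may differ.
SIDEWALK, GRASS, POLE, BICYCLE, BUILDING = 0, 1, 2, 3, 4

def cost_map(semantic, obstacle_cost=1e6):
    """Sidewalk cells are walkable (cost 1); everything else is an obstacle."""
    return np.where(semantic == SIDEWALK, 1.0, obstacle_cost)

def goal_right_of_left_edge(semantic, row, cell_size_m=0.1, offset_m=0.5):
    """Place the goal ~offset_m to the right of the sidewalk's leftmost
    edge on the given map row (columns increase to the right).
    Returns None if the row contains no sidewalk."""
    cols = np.flatnonzero(semantic[row] == SIDEWALK)
    if cols.size == 0:
        return None
    return (row, int(cols[0] + round(offset_m / cell_size_m)))

# Toy example: a 5 x 10 grid where columns 2..8 are sidewalk.
semantic = np.full((5, 10), GRASS)
semantic[:, 2:9] = SIDEWALK
costs = cost_map(semantic)
goal = goal_right_of_left_edge(semantic, row=2)  # → (2, 7)
```

With 10 cm cells, the 50 cm offset is 5 cells, so the goal lands 5 columns right of the sidewalk's left edge; a grid planner such as D* would then connect the robot's current cell to this goal through low-cost cells.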
While there is a lot left to do, we knew that Cassie Blue would enjoy celebrating Independence Day with you!
Packages used in this experiment:
Extrinsic calibration between the LiDAR and the camera [Paper] [GitHub]
Invariant EKF [GitHub]
BKI Mapping [GitHub]
Cassie Controller [GitHub]