Jun 23, 2022

Robotic Dogs and Landmark Detection

Computer Vision
DepthAI

Time for a fun post for anyone interested in robotics. We'd like to share how our OAK-D-Lite was used in combination with the open-source Mini Pupper (more on it here, in case you're not familiar with it).

[GIF: Making Mini Pupper follow hand commands like a Border Collie]

Seeing something like this is a dream come true for us. A robotic dog that will literally obey your commands! It feels like science fiction. And to be able to build this from scratch in a weekend is just so neat!

This example is implemented in ROS, runs on a Raspberry Pi, and uses the OAK-D-Lite for the AI, CV, and depth perception.
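For a taste of what the "AI, CV, and depth perception" part looks like, here is a minimal DepthAI pipeline sketch that streams a color preview and stereo depth from an OAK-D-Lite to the host (the Raspberry Pi in this setup). It is an illustrative snippet rather than the demo's actual code; the stream names and preview size are arbitrary choices for this example.

```python
# Illustrative sketch only: stream a color preview and stereo depth
# from an OAK-D-Lite to the host over a DepthAI pipeline.
import depthai as dai

pipeline = dai.Pipeline()

# Color camera for the AI/CV side
cam = pipeline.create(dai.node.ColorCamera)
cam.setPreviewSize(300, 300)
cam.setInterleaved(False)

# Stereo mono pair for depth perception
left = pipeline.create(dai.node.MonoCamera)
right = pipeline.create(dai.node.MonoCamera)
left.setBoardSocket(dai.CameraBoardSocket.LEFT)
right.setBoardSocket(dai.CameraBoardSocket.RIGHT)
stereo = pipeline.create(dai.node.StereoDepth)
left.out.link(stereo.left)
right.out.link(stereo.right)

# Send both streams back to the host
xout_rgb = pipeline.create(dai.node.XLinkOut)
xout_rgb.setStreamName("rgb")
cam.preview.link(xout_rgb.input)

xout_depth = pipeline.create(dai.node.XLinkOut)
xout_depth.setStreamName("depth")
stereo.depth.link(xout_depth.input)

with dai.Device(pipeline) as device:
    rgb_q = device.getOutputQueue("rgb", maxSize=4, blocking=False)
    depth_q = device.getOutputQueue("depth", maxSize=4, blocking=False)
    color_frame = rgb_q.get().getCvFrame()  # BGR frame for CV / neural nets
    depth_frame = depth_q.get().getFrame()  # uint16 depth map in millimeters
```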

We're so excited not only because this is the kind of thing we've seen in movies since we were kids, but more importantly because it's the convergence of so many open-source, community-driven efforts:

  • Robot Operating System (ROS) from Open Robotics
  • The Raspberry Pi 4 from the Raspberry Pi Foundation
  • Pupper from Stanford (here), improved into the Mini Pupper and re-open-sourced by MangDang (here)
  • OAK-D-Lite (open-source hardware, software, and more)
  • Hand tracking (and a bunch more) open-source pipelines for OAK by geax (here); a rough sketch of turning gestures into robot commands follows below
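And here is a minimal, hypothetical sketch of the "Border Collie" part: a ROS 1 node that maps a recognized hand gesture to velocity commands for the dog. The gesture labels, the /cmd_vel topic, and the read_gesture() stub are assumptions for illustration only; in the real demo the gestures come from geax's hand-tracking pipeline running on the OAK-D-Lite.

```python
#!/usr/bin/env python3
# Hypothetical sketch: map hand gestures from an OAK-D-Lite hand-tracking
# pipeline to Mini Pupper velocity commands. Gesture names, the topic name,
# and the read_gesture() stub are illustrative assumptions, not the demo's code.
import rospy
from geometry_msgs.msg import Twist


def read_gesture():
    """Placeholder for the hand-tracking pipeline output (e.g. geax's
    hand tracker running on the OAK-D-Lite). Returns a gesture label
    such as "FIVE" or "FIST", or None when no hand is detected."""
    return None  # replace with real tracker output


# Assumed mapping from gesture label to (forward speed m/s, turn rate rad/s)
GESTURE_TO_MOTION = {
    "FIVE": (0.10, 0.0),   # open hand: walk forward
    "FIST": (0.0, 0.0),    # fist: stop
    "ONE":  (0.0, 0.5),    # one finger: turn in place
}


def main():
    rospy.init_node("gesture_teleop")
    pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
    rate = rospy.Rate(10)  # publish at 10 Hz
    while not rospy.is_shutdown():
        gesture = read_gesture()
        linear, angular = GESTURE_TO_MOTION.get(gesture, (0.0, 0.0))
        cmd = Twist()
        cmd.linear.x = linear
        cmd.angular.z = angular
        pub.publish(cmd)
        rate.sleep()


if __name__ == "__main__":
    main()
```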

So, a robotic dog that responds to hand signals is one for the win column. What would you like to see next? What have you been dreaming about transforming from science fiction into reality? Let us know! Everything we do is because of our customers and our community. You speak, we listen and build.


Erik Kokalj
Director of Applications Engineering