Hungry Hungry Hippos Autoplay
Posted: June 10, 2017
A group of my coworkers play board games during lunch on Fridays and at one point invited me to play. Unfortunately, all they play are a bunch of modern hipster games, while I wanted to play Hungry Hungry Hippos. No one wanted to play with me, so I decided to build circuitry that would take their place. Pushing the plastic lever eventually got really tiring, so I decided to just make the game play by itself.
The microprocessor of choice for this project is the W65C265SXB, with the code written in 65C816 assembly. The image processing is done with a Sony PlayStation 3 camera connected to a small Intel Atom based netbook, with code written in C++. After getting the autoplay working, I figured I'd do a quick extra-credit project and made the hippos controllable through Amazon's Alexa (the Echo Dot). To connect the SXB to Alexa, an ElectricImp was used.
The source code below for the W65C265SXB board can be assembled with naken_asm.
A video of the Hungry Hungry Hippos game auto-playing. The first part of the video shows the image processing on the netbook, the second part is just a single ball, and the last part is the hippos fighting over a few balls. After a hippo swipes, there's a short delay before it tries again. https://youtu.be/YcaVS7d7cN4
A video of Hungry Hungry Hippos controlled by Alexa through an ElectricImp. https://youtu.be/JB0nCFxV3PQ
This is a closeup of the green Hungry Hungry Hippo with a servo motor.
Here's the Hungry Hungry Hippos board along with all the electronics. Above the board is a Sony PlayStation 3 USB camera attached to a microphone stand. To the far right is the W65C265SXB board. The breadboard next to it is simply there to provide 5V to all the motors.
The hardest part of this project was getting the motors to pull the levers down. I tried a few different configurations of servo motors and DC motors and finally got it (fairly) stable by putting an eyehook below the lever and attaching it to a servo with kite string. The servo has a popsicle stick on it to extend the length of the arm so the arc of the swing is long enough to pull far enough on the string. Everything is hot-glued together so it can be taken apart easily and the motors reused. Even with this setup I tried two different servos that seemed equivalent, but the only one strong enough to pull the lever down was the PowerHD 6001HB.
I wrote all the image processing code in C++. The code basically looks for the blue color of the board and draws a bounding box around it. It then finds the center of that bounding box and scans up, down, left, and right until it hits a hippo head. That bounding area is the hot spot where the code then looks for balls. If a ball is in front of a hippo head, a yellow bounding box is drawn in front of it. I know I'm going to get asked this question, so I'll answer it here: no, I did not use OpenCV for this. The algorithms were pretty simple and weren't hard to do without a library.
The microcontroller I used is the W65C265, which allowed me to salvage a bunch of code from the PANCAKE-ROM project. I originally wanted to use the Mensch board, but it was a little easier to just develop on the W65C265SXB, and by the time it was done I didn't feel like trying it on another board. Actually, that might have just been another 20 minutes of work... oh well :(.
After getting the autoplay to work, I thought it might be fun to try controlling the board with Alexa. The complex part with Alexa is that she needs an "endpoint" on the internet to talk to. This ended up being a perfect use for the ElectricImp. So my speech goes to Amazon's cloud, Amazon's cloud uses my "skill" to talk to the Imp's cloud, the Imp's cloud talks to my ElectricImp, and my Imp sends a command to the SXB board. To set up the Alexa "skill," I pretty much copied what I had already done in a previous project: Alexa TV Remote.
Copyright 1997-2020 - Michael Kohn