Update: I added the video I had looping at ITP’s Winter Show at the bottom of the post. Also, the project got picked up on Engadget – check it out here! You can see my setup at the show and hear me talk about it for a bit.
My final project for Phys Comp is Cat Car, a “Feline Fitness Frenzy!” It was intended as a cat exercise toy; ultimately, though, the cats I tested it on didn’t care much for it. But that’s not the point! I learned quite a bit about accelerometers, gyroscopes, and XBees along the way, which I’ll share here.
Cat Car lets humans “drive” their cats by steering a laser pointer from side to side. The laser pointer is attached to a harness worn by the cat, and its angle is determined by a steering wheel. The cat, wanting to follow the laser, will go wherever it points. We hope.
I’m behind on sharing! Here’s “The Big Bang,” a sound-based galaxy maker, built with DD and Edward for our Phys Comp midterm.
Cat Car is a cat exercise device. Using a wireless steering wheel, you control a laser mounted on a servo (mounted on a cat). Steering wheel left, laser left, cat left.
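The core of that “steering wheel left, laser left” chain is a mapping from the wheel’s tilt to a servo position. Here’s a minimal sketch of that mapping; the ±45° full-lock range and the 0–180 servo scale are my assumptions for illustration, not values from the actual project:

```cpp
#include <cassert>
#include <algorithm>

// Hypothetical mapping from the steering wheel's roll angle (degrees,
// negative = left) to a servo position in [0, 180]. The assumed full-lock
// angle of ±45° maps to the servo's extremes, with 0° resting at center (90).
int steeringToServo(float rollDeg) {
    const float maxRoll = 45.0f;  // assumed full-lock angle
    float clamped = std::max(-maxRoll, std::min(maxRoll, rollDeg));
    // Linear map: -45° -> 0 (full left), 0° -> 90 (center), +45° -> 180.
    return static_cast<int>((clamped + maxRoll) * (180.0f / (2.0f * maxRoll)));
}
```

Clamping first means over-rotating the wheel just pins the laser at full left or full right rather than wrapping around.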
I’m using an MPU-6050 “6 degrees of freedom” accelerometer/gyroscope. I’ll write a longer post later detailing all the issues I’ve encountered (some solved, some I’m still dealing with). But for now, here’s a little video.
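A common source of MPU-6050 headaches is that neither sensor alone gives a clean angle: the gyro drifts and the accelerometer is noisy. A complementary filter is the standard fix for fusing the two. This is a sketch of the general technique, not the project’s actual code; the blend factor and time step are assumed:

```cpp
#include <cmath>

// Fuse the MPU-6050's gyro and accelerometer into a stable roll angle
// using a complementary filter. alpha (assumed 0.98) and dt are tuning
// choices, not values from the project.
float complementaryRoll(float prevRoll, float gyroRateDegPerSec,
                        float accelY, float accelZ,
                        float dt, float alpha = 0.98f) {
    const float RAD_TO_DEG = 180.0f / 3.14159265f;
    // Gyro: integrate angular rate (responsive, but drifts over time).
    float gyroRoll = prevRoll + gyroRateDegPerSec * dt;
    // Accel: absolute roll from gravity's direction (noisy, but drift-free).
    float accelRoll = atan2f(accelY, accelZ) * RAD_TO_DEG;
    // Blend: trust the gyro short-term, the accelerometer long-term.
    return alpha * gyroRoll + (1.0f - alpha) * accelRoll;
}
```

Called once per sensor reading, the small accelerometer contribution continually pulls the integrated gyro angle back toward the true roll, canceling drift.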
See part 1. Quick update. Today I built an “enclosure” for my sound sensor. Behold:
The microphone is set back because of how closely I initially soldered it to the PCB. This was hurting its ability to capture sound, so after taking the pictures I went back and resoldered it so that it sits flush with the wood. The next step will be to modify my Processing sketch to output a CSV file of all the sensor readings, so I can run it for hours and analyze the data later.
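The CSV output itself only needs two columns: a timestamp and the raw reading. A minimal sketch of the line formatting (in C++ here rather than Processing; the column names and units are my assumptions):

```cpp
#include <string>
#include <sstream>

// Assumed CSV header for the sound-sensor log: a millisecond timestamp
// plus the raw analog reading. Column names are illustrative.
std::string csvHeader() { return "millis,sound_level"; }

// Format one sensor reading as a CSV row.
std::string csvLine(unsigned long millis, int soundLevel) {
    std::ostringstream out;
    out << millis << "," << soundLevel;
    return out.str();
}
```

Writing one row per reading keeps the file trivially parseable by any spreadsheet or plotting tool after an hours-long run.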
The other day, I was talking with Jay about tracking street noise. I thought it would be neat to record a video of the street for however many hours and give it away to anyone who wanted to extract data from it: taxi frequency, direction of pedestrians, or noise level, for example. I mentioned how the noise from the street was affecting my sleep, Jay said something about tracking sound, and at that moment, voilà! An idea was born.
For the audio lab in PComp, I built the Drinking Buddy. He just wants to sing German drinking songs with you! And even though he probably thinks the more he drinks, the better he gets… that just isn’t true. As your breath alcohol increases, more error is introduced into the playback of the song. To wit:
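One way to implement “more alcohol, more error” is to widen a random detune range as the sensor reading climbs. This is a hedged sketch of that approach, not the Drinking Buddy’s actual code; the 0–1023 range (a 10-bit Arduino ADC) and the ±5-semitone cap are assumptions:

```cpp
// Map a breathalyzer reading (assumed 0-1023 ADC range) to a maximum
// detune in semitones: sober plays clean, drunk allows up to ±5.
int maxDetune(int alcoholReading) {
    return (alcoholReading * 5) / 1023;
}

// Shift a MIDI-style note pitch by a random offset within ±maxDetune.
// 'rnd' is any non-negative random integer, passed in so the logic
// stays deterministic and testable.
int drunkPitch(int notePitch, int alcoholReading, int rnd) {
    int d = maxDetune(alcoholReading);
    if (d == 0) return notePitch;          // sober: play it straight
    int offset = (rnd % (2 * d + 1)) - d;  // uniform-ish in [-d, d]
    return notePitch + offset;
}
```

Because the detune range grows with the reading, light drinking only slightly sours the song while heavy readings make it fall apart entirely.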
Reading Tom Igoe’s “Greatest Hits” article reinforces the point that it isn’t so much about what form of interaction you choose, but the idea behind the interaction. Ideas give meaning to technology: they turn a bunch of pressure sensors on a glove into a portable drum kit, or a networked LED into a remote hug. Ideas don’t care about how you make things: when you can describe the idea without even mentioning what sensor you’re using or what neat trick you used to hook everything together, users can fully connect with your project and experience it on a more meaningful level.
So what makes it so compelling to categorize all these different forms of interaction? When you put the technology first, users are going to see the technology first. As the world of physical computing continues to grow, more people will look at the touch glove and recognize it as “another drum glove project” or the networked LED as “another remote hug project.” Of course this is not to say nobody should do those projects: they are well documented, meaning they have a great potential to be learned from and developed with a strong idea. But for big projects, it does mean they need to be pushed and they need to have a “reason to be” beyond a desire to play around with the technology.
This weekend I took a trip with some classmates to the excellent American Museum of Natural History here in New York. There were plenty of interactive components in the exhibits, mostly in the (rather disappointing) Creatures of Light exhibit. One particular setup had users press a button to flash a light, mimicking a firefly’s bioluminescence, which when done in the correct pattern would cause a group of LEDs to flash in response. The goal was to simulate how males use certain patterns of flashing to attract females.
Though the interaction itself was simple, the instructions and lack of feedback caused confusion for most users. A placard next to the light switch prominently displayed four light-flashing patterns in Morse code-style notation, two for males and two for females. Further down the placard, beyond where I imagine most people stopped reading, it confusingly stated that in this scenario the user plays the role of the male. This left the users I observed (and me) unsure of what to do next.
The next issue was feedback. The only feedback given was for success: when a user flashed the correct pattern, the “females” flashed back. If the user didn’t flash the right pattern, they received no indication of what they did wrong. Flashing at the wrong speed, flashing the wrong pattern, and not holding down the button long enough were three problems I could imagine users having, but because there was no feedback, I’m not sure whether any of those things even mattered.
This is something one often sees on websites: if a product’s nonfunctioning is the user’s fault but they receive no indication of that fact, they will mistake nonfunctioning for malfunctioning and blame the product. That is what happened here.
For this week’s Physical Computing labs, we covered two topics: digital inputs and analog inputs. Click through to see the details.