Lucas Ainsworth is a research scientist at Intel Labs investigating methods for playful visualization of data. Working with Carlos Montesinos, also from Intel Labs, he created the Data Monster using the Intel Galileo platform. The Data Monster was showcased at Maker Faire Rome.
Fun fact about Lucas: he has been making things his whole life! He used to make kites for being towed on skis, and also made walking cardboard animal kits. Here’s a video of a giant one he and his fiancée built for last year’s Maker Faire Bay Area: http://vimeo.com/65432365. Image from Maker Faire here: http://www.evilmadscientist.com/2013/bamf/
Carlos and Lucas (wearing the cap) are the ones wearing blue in this photo.
So far, industry has been successful in creating big-data visualization algorithms that produce charts, maps, and graphs. Lucas took data visualization to the next level by creating a playful way to mix data streams.
Question: Why would one need a playful way to visualize data?
Answer: What if one could visualize data in a more natural and playful way? Charts and graphs are a great way to explain data, but we were interested in exploring data through gesture and behavior. Today, when multiple data streams are combined, the resulting visualization can be complex and may be deemed “not successful” if it doesn’t reveal concrete, actionable insights. With their research, Lucas and Carlos are able to translate data into physical expression that may be more or less tangible than points on a chart, but is certainly more engaging from 10 feet away. This playful way of visualizing data through behavior may one day enable new games, services, and business models.
Question: What is a Data Monster?
Answer: Data Monsters are creatures that respond to you. They can see you and respond to your presence and movement. In addition to responding to immediate interactions, they can also be influenced by events happening in the world outside. Just as a person’s moods can be influenced by traffic jams, political climate, local events, the season and weather, Data Monsters can have their own opinions about anything you want them to listen to.
Question: Why did Intel Labs Makers, Lucas and Carlos create this Data Monster?
Answer: The Data Monster started as an Intel Labs research project on kids, data, and play. The first Data Monster was built as an experimentation platform for expressive movement. The goal was to create a playful way to mix data streams for emotive visualization. This research project is also being used to identify opportunities in the data economy revolution for both personal and societal benefits.
Question: How long did it take to make the Data Monster?
Answer: This effort had been simmering for some time, but once we got the call, we had just 4 weeks!
Question: How has the Data Monster evolved?
Answer: Initially it was a web-based app that allowed users to create a virtual monster and set its attributes to be affected by Twitter trends.
Although a Data Monster can take any form, this kit’s structure is similar to a desk-lamp, and can express dominance by standing upright and alert, and passivity by cowering or shying-away from things. There is a wide range of emotions and behaviors that can be explored with this physical structure. By listening to data online, Data Monsters can express more nuanced behavior than a robot that responds to physical presence alone. At this point, the project is very basic, but we hope the concept can extend into sophisticated behaviors and data visualization.
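As a rough illustration of how a single data value could drive the posture described above (the joint names match the kit, but the angles and the “mood” scale are our own invented placeholders, not values from the actual Data Monster code), a mood score might interpolate the lamp between cowering and upright:

```cpp
#include <cassert>

// Hypothetical joint angles (degrees) for the 5-joint lamp structure.
// These extreme poses are made-up examples, not calibrated values.
struct Pose {
    float waistRotation, waistElevation, midBodyElevation, neckRotation, headMovement;
};

// Linear interpolation helper.
static float lerp(float a, float b, float t) { return a + (b - a) * t; }

// Map a mood score in [0,1] (0 = passive/cowering, 1 = dominant/alert)
// onto a pose between two hand-tuned extremes.
Pose poseForMood(float mood) {
    const Pose cowering = {0, 20, 30, 0, -40};  // hunched down, head low
    const Pose upright  = {0, 80, 70, 0,  10};  // standing tall, head level
    if (mood < 0) mood = 0;
    if (mood > 1) mood = 1;
    return {
        lerp(cowering.waistRotation,    upright.waistRotation,    mood),
        lerp(cowering.waistElevation,   upright.waistElevation,   mood),
        lerp(cowering.midBodyElevation, upright.midBodyElevation, mood),
        lerp(cowering.neckRotation,     upright.neckRotation,     mood),
        lerp(cowering.headMovement,     upright.headMovement,     mood),
    };
}
```

Smoothly blending between a small set of hand-tuned poses like this is one simple way to get lifelike motion out of only a few joints.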
Question: Where can we download Data Monster code and design files?
Question: What technologies did Lucas use to build the data monster?
Answer: A Data Monster requires 3 main things:
1) physical structure -
The physical structure uses commonly available materials and a relatively easy-to-build wooden kit pattern, so that the physical form “gets out of the way” as much as possible. If you cut this kit and put it together, you will have a robot with 5 joints: waist rotation, waist elevation, mid-body elevation, neck rotation, and head movement.
2) sensors -
For this version, we’re using 3 long-range active IR sensors for simplicity and low cost. This sensor pack estimates object location in 3D space. A next generation could use a webcam and OpenCV to add face detection and motion tracking in addition to presence.
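To give a feel for how readings from the three IR sensors could become a coarse object location, here is a sketch. The power-law curve is a common community approximation for the Sharp GP2Y0A02YK0F, not a datasheet formula, and should be calibrated against your own sensors; the left/center/right fan arrangement is our assumption.

```cpp
#include <cassert>
#include <cmath>

// Convert a 10-bit analog reading (5V reference) into an approximate
// distance in cm for a Sharp GP2Y0A02YK0F. This power-law fit is an
// illustrative approximation -- calibrate against your own sensor.
// The sensor is only rated for roughly the 20-150 cm band.
float sharpDistanceCm(int raw) {
    float volts = raw * 5.0f / 1023.0f;
    if (volts <= 0.4f) return 150.0f;          // too far / out of range
    return 65.0f * std::pow(volts, -1.10f);
}

// With three sensors fanned left/center/right, a crude location estimate:
// take the bearing of whichever sensor sees the nearest object.
enum Bearing { LEFT, CENTER, RIGHT, NONE };

Bearing nearestBearing(int rawL, int rawC, int rawR, float maxCm = 140.0f) {
    float d[3] = { sharpDistanceCm(rawL), sharpDistanceCm(rawC), sharpDistanceCm(rawR) };
    int best = 0;
    for (int i = 1; i < 3; ++i) if (d[i] < d[best]) best = i;
    if (d[best] >= maxCm) return NONE;         // nothing in range
    return static_cast<Bearing>(best);
}
```

In practice you would also want to average several readings per sensor, since these IR sensors are noisy near the edges of their range.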
3) software -
This is where the fun is and where the most work remains to be done. We have code for the Arduino IDE (written for the Intel Galileo board) that you can use to calibrate and control your monster. If you use our code unchanged, you’ll have some basic reactions to objects and a connection over WiFi to ThingSpeak. ThingSpeak (www.thingspeak.com) is an easy-to-use repository for data collected from the internet or from any data sources you create.
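ThingSpeak can return the most recent value of a channel field as plain text (see the ThingSpeak API docs for the exact endpoint for your channel). Once you have that response body, turning it into a bounded mood input for the monster might look like this sketch; the function name, neutral fallback, and normalization range are our own illustrative choices, not part of the Data Monster code:

```cpp
#include <cassert>
#include <cstdlib>
#include <string>

// Parse a plain-text field value fetched from ThingSpeak and normalize
// it into a mood score in [0,1]. If the body is unparseable, fall back
// to a neutral 0.5 so the monster doesn't react to garbage.
float moodFromResponse(const std::string& body, float minVal, float maxVal) {
    char* end = nullptr;
    float v = std::strtof(body.c_str(), &end);
    if (end == body.c_str()) return 0.5f;       // unparseable: stay neutral
    if (v < minVal) v = minVal;                 // clamp to the expected range
    if (v > maxVal) v = maxVal;
    return (v - minVal) / (maxVal - minVal);    // normalize to [0,1]
}
```

Keeping the network-facing parsing defensive like this matters on a robot: a malformed response should degrade to a calm default rather than an extreme pose.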
The wooden parts of this kit are designed to be cut from 1/8", 12"x24" plywood using a laser cutter. Acrylic would probably also work fine. If you don’t have access to a laser cutter at your local school or hackerspace, www.ponoko.com and www.pololu.com are good online laser-cutting services.
You’ll need 5 servos. We use:
2x Parallax standard servo (180 degrees)
1x Power HD Mini Servo HD-1711MG
2x Power HD Micro Servo HD-1600A
For the sensor module, we use active IR sensors. Sonar sensors would probably also work fine (and, in the future, a webcam):
3x Sharp GP2Y0A02YK0F Analog Distance Sensor 20-150cm
3x 3-Pin Female JST PH-Style Cable for Sharp Distance Sensors
-a few pull-down resistors
-one capacitor, ~10 µF
-three LEDs as status indicators
-Intel Galileo board
-Intel® Centrino® Wireless-N 135 WiFi card (to use wireless internet)
-Power supply for servos - anything that puts out 4.2 V or 5 V DC
-wood glue or Elmer’s glue
-some machine screws and nuts, all 1/8":
1x 1.5” long
1x 1” long
2x 3/4” long
6x 1/2” long
12x matching 1/8” hex nuts
-some tiny zip-ties
-some thin stranded servo wire and jumper wires.
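When wiring up the five servos, it helps to clamp every commanded angle to a per-joint safe range in software, so a bad sensor reading or data value can’t slam a joint against its mechanical stop. This is a generic sketch of that idea; the 1000–2000 µs pulse range is typical for hobby servos, and the limits shown in the test are placeholders you would tune per joint, not measured values for this kit:

```cpp
#include <cassert>

// Per-joint safety limits in degrees, tuned by hand during calibration.
struct JointLimits { int minDeg, maxDeg; };

// Clamp an angle to the joint's safe range, then map it linearly to a
// servo pulse width: 0 deg -> 1000 us, 180 deg -> 2000 us.
int servoPulseUs(int angleDeg, JointLimits lim) {
    if (angleDeg < lim.minDeg) angleDeg = lim.minDeg;  // clamp low
    if (angleDeg > lim.maxDeg) angleDeg = lim.maxDeg;  // clamp high
    return 1000 + (angleDeg * 1000) / 180;
}
```

On the Galileo you would feed the result to something like the Arduino Servo library’s writeMicroseconds(); the clamping step is what keeps a runaway mood value from stripping a gear.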