Lunch Time for the Puppy is an interactive children’s fabric book. The book is made from felt and other fabric materials with different textures, and is embedded with soft sensors and electronic output elements to create a rich storytelling experience. We produced a proof-of-concept prototype of the design by making one page from the book for demonstration. Our vision for the project is an entire book fabricated with similar methods, with the detachable interactive puppy as the main character. The complete story would contain various scenarios of the puppy at places around the house.
The puppy is designed to be detachable and responsive, able to be reattached to various positions throughout the fabric book. The fabric book contains different scenes of a puppy’s day and each page corresponds to different behaviors of a puppy. For example, petting the puppy in the shower is different from petting the puppy before going for a walk.
With our successful implementation and modification of the methods from the Multi-Touch Kit (reference 1) research paper, we believe that capacitive sensing with an off-the-shelf microcontroller board can be used for prototyping and designing touch-screen-like interactions on fabric, static flat surfaces, and the surfaces of other 3D objects. The technique can be used to augment children’s fabric books with digital interactions in addition to the original material-based interactions. The soft sensor grid can be integrated with the design of other soft materials for a more unified look of the book.
We see the possibility of creating open-ended storylines by adding different states of behavior. The rich interactive quality also helps reduce screen time for children.
In this section, we will discuss sensor iterations and software modifications during the prototyping process.
We fabricated our sensors following instructions from the original research project. The materials and tools we used were copper tape, thin double-sided tape, paper, scissors, and X-Acto knives.
The very first grid we made was of size 3×3. Strips of copper tape were taped to opposite sides of a thin plastic sheet. We made this to test whether the provided multi-touch kit worked at all.
We looked for ways to ease the fabrication process. We found the sensor design guideline and printed out the patterns for easy tracing. After a few attempts we fabricated the sensors using the following process:
The second grid we made was of size 6×6. The top layer of copper tape was taped on a plastic sheet, the bottom layer of copper tape was taped on paper, and the plastic sheet was taped on top of the paper. This sensor grid had issues recognizing light touches, so we suspected the cause was the gaps between the plastic sheet and the paper.
The third and final grid we made was also of size 6×6. The copper tape was taped the same way as on the first sensor grid: to opposite sides of a thin plastic sheet. This sensor worked relatively reliably, and light touches could be detected.
We built our software implementation using the Processing sketch from the original research project. Here is what the sketch does:
We extracted the result of blob detection to do gesture recognitions.
Initially, we wanted to implement recognition for the set of gestures below.
We then ran into trouble when trying to detect sliding behavior, partly because the grid’s resolution is low: a slide registers as multiple positions touched at roughly the same time rather than as a continuous motion. Thus we set sliding aside and used the other interactions.
Above is the API we ended up implementing. Note that we use the number of continuous taps to detect slides. Using the available gestures, we were able to map them to output behaviors shown below.
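As a rough illustration of this tap-counting approach, the classification step could look like the following plain C++ sketch. The function name and the thresholds are hypothetical (our actual implementation lives in the Processing sketch); the idea is simply that the number of consecutive frames containing a touch blob distinguishes taps, slides, and holds:

```cpp
#include <string>

// Hypothetical sketch: classify a touch event from the number of
// consecutive frames in which a blob was detected ("continuous taps").
// Thresholds are illustrative, not the values from our sketch.
std::string classifyGesture(int continuousTaps) {
    if (continuousTaps == 0) return "none";
    if (continuousTaps == 1) return "tap";    // a single brief touch
    if (continuousTaps < 5)  return "slide";  // a few frames in a row
    return "hold";                            // sustained contact
}
```

A caller would feed this the per-frame blob count accumulated since the last release, then map the returned label to an output behavior.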
In this section, we will discuss the successes and failures of our choices, things we’ve learned, and how future work would further improve the results.
Because our gesture detection only looks at the blobs present each time loop() is called, we are not taking advantage of the rich possibilities offered by location information. Future work could explore more gestures such as directional slides, pinches, and other gestures that a 2D touch screen can recognize. One thing to keep in mind is that although we are mimicking a touch screen using fabric, we should still remember the unique tangible interaction opportunities that soft fabric brings. For example, pinching fabric is drastically different from pinching a hard touch screen, as the fabric itself gets pinched and folded as well.
Due to limited resources, we weren’t able to integrate as many different textures as we had hoped for. Moving forward, we believe that adding more textures to the body of the puppy would allow a more diverse tactile experience.
Equal contributions:
Catherine’s additional contribution:
Yanwen’s additional contribution:
Narjes Pourjafarian, Anusha Withana, Joseph A. Paradiso, and Jürgen Steimle. 2019. Multi-Touch Kit: A Do-It-Yourself Technique for Capacitive Multi-Touch Sensing Using a Commodity Microcontroller. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST ’19). Association for Computing Machinery, New York, NY, USA, 1071–1083. DOI:https://doi.org/10.1145/3332165.3347895
Jie Qi and Leah Buechley. 2010. Electronic popables: exploring paper-based computing through an interactive pop-up book. In Proceedings of the fourth international conference on Tangible, embedded, and embodied interaction (TEI ’10). Association for Computing Machinery, New York, NY, USA, 121–128. DOI:https://doi.org/10.1145/1709886.1709909
Irene Posch. 2021. Crafting Stories: Smart and Electronic Textile Craftsmanship for Interactive Books. In Proceedings of the Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI ’21). Association for Computing Machinery, New York, NY, USA, Article 100, 1–12. DOI:https://doi.org/10.1145/3430524.3446076
We used the Multi-Touch Kit software toolkit, with the sensor grid attached to the back of the left ear of the puppy. The only other digital input is a fabric button hidden underneath the puppy’s belly.
For outputs, we combined visual, audio, and haptic feedback using LEDs, the laptop speaker, and vibration motors.
To avoid the predictable feeling that the same action always produces the same response, we implemented a relatively naive state machine so that the puppy can be in different moods.
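To give a sense of what such a naive state machine might look like, here is a plain C++ sketch. The mood names, input labels, and transition rules are assumptions for illustration, not our exact Arduino code; the point is that the same touch yields a different response depending on the puppy’s current mood:

```cpp
#include <string>

// Hypothetical mood state machine: gentle interactions nudge the puppy
// toward Happy, rough ones toward Unhappy, one step at a time.
enum class Mood { Neutral, Happy, Unhappy };

Mood nextMood(Mood current, const std::string& input) {
    if (input == "gentle_pet")
        return current == Mood::Unhappy ? Mood::Neutral : Mood::Happy;
    if (input == "hard_press" || input == "rapid_swipe")
        return current == Mood::Happy ? Mood::Neutral : Mood::Unhappy;
    return current;  // unrecognized inputs leave the mood unchanged
}
```

The output side (LED patterns, sounds, vibration) would then be selected based on the current mood rather than directly on the input.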
Due to time constraint, we used conductive copper tape instead of conductive yarn to fabricate the sensor grid. Between the aesthetics and the normality of interactions, we chose aesthetics to hide the sensor grid. If we fabricated the sensor grid using conductive yarn, we could interact with the top of the furry ears with gestures that are more similar to how one would pet a puppy.
Due to material constraints, we weren’t able to integrate as many different textures as we would have liked. Adding more textures to the body of the puppy would allow a more diverse tactile experience.
This is only a prototype of a single page; we envisioned a puppy-themed interactive book in which the interactive puppy acts like a bookmark. Every page is a different setting, and placing the puppy on a page triggers the start of the interaction, telling stories about the puppy’s different behaviors and reactions in each setting.
We programmed the microcontrollers, and our setup involves two Arduinos:
The startHappyLed() function starts a sequence in which all of the colored LEDs blink synchronously, followed by a sequence in which only a subset of those colored LEDs blinks at a time.
The startUnhappyLed() function starts a sequence in which all of the red LEDs blink synchronously, followed by a sequence in which only a subset of those red LEDs blinks at a time.
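The two-phase blink pattern these functions produce can be simulated off-device with a small plain C++ sketch. The LED count and frame counts below are assumptions, not our actual Arduino timing code; each frame is a bitmask where bit i means LED i is lit:

```cpp
#include <vector>

// Illustrative two-phase blink pattern: first all LEDs toggle together,
// then only one LED (a subset of size one, rotating) lights per frame.
std::vector<unsigned> blinkPattern(int numLeds, int syncFrames, int subsetFrames) {
    std::vector<unsigned> frames;
    const unsigned all = (1u << numLeds) - 1;
    for (int f = 0; f < syncFrames; ++f)
        frames.push_back(f % 2 ? all : 0u);     // phase 1: synchronous blink
    for (int f = 0; f < subsetFrames; ++f)
        frames.push_back(1u << (f % numLeds));  // phase 2: rotating subset
    return frames;
}
```

On the Arduino, each frame would be written out to the pixels with a delay between frames; the happy and unhappy variants differ only in which LEDs (colored vs. red) the mask drives.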
We finished making the page and started integrating the elements:
Besides the multi-touch grid, we decided to use four NeoPixels and one fabric button to control the vibration disc. The video shows the effect of the NeoPixels and the multi-touch interactions. We will integrate vibration feedback later.
We tried adding a fabric layer (felt, muslin, and t-shirt fabric) on top of the sensing grid to see whether touches could still be detected. Unfortunately it did not work, and even layering a single sheet of paper significantly weakened the raw data.
After looking into the blob detection library together with Garth, the next step on the software side is writing code to detect swiping and the speed of swiping. We plan to use the normalized coordinates of the blob’s center returned by the blob detection function.
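The planned swipe detection could be sketched as follows in plain C++. The function name is hypothetical, and the direction convention assumes screen-style coordinates (y grows downward); it takes the blob center’s normalized coordinates in two consecutive frames plus the frame interval:

```cpp
#include <cmath>
#include <string>

struct Swipe {
    double speed;           // in normalized grid widths per second
    std::string direction;  // coarse 4-way direction
};

// Hypothetical sketch: estimate swipe speed and direction from two
// consecutive normalized (0..1) blob-center positions.
Swipe detectSwipe(double x0, double y0, double x1, double y1, double dtSeconds) {
    double dx = x1 - x0, dy = y1 - y0;
    double dist = std::sqrt(dx * dx + dy * dy);
    std::string dir = std::fabs(dx) >= std::fabs(dy)
        ? (dx >= 0 ? "right" : "left")
        : (dy >= 0 ? "down" : "up");  // assumes y increases downward
    return { dist / dtSeconds, dir };
}
```

Averaging over more than two frames would smooth out the jitter we expect from the low-resolution grid.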
We discussed possible page designs and materials needed. Below is a rough sketch of the possibilities.
For the upcoming week, we will work on different tasks:
We figured out a reasonably efficient way to create the sensor grid, fabricated a 6×6 sensor grid (which still took over two hours), and got it connected and working with both the Arduino code and the Processing code.
Though some issues remain:
Action item for next week:
Four week plan
| Week | Goals |
|------|-------|
| 4.12–4.18 | Decide on what gestures can be detected |
| 4.19–4.25 | Test sound output; decide uses of different outputs (auditory & visual); decide sensor grid materials: conductive tape / conductive yarn |
| 4.26–5.2 | Start on fabricating the page from a textile learning book |
| 5.3–5.9 | Finish fabricating the page from a textile learning book |
We also brainstormed on how to integrate the technology into our fabric book in a specific scenario. We decided to extend the Jellycat If I Were a Puppy board book and integrate digital tactile inputs and digital outputs (sound, lights, and haptics) with materials of different tactile qualities (rubbery, velvety, furry, etc.). Below is a sketch of what we envision.
One thing we discussed was that the multi-touch kit enables gesture recognition, which we want to take advantage of. Since we didn’t want to simply make a fabric touch screen, we decided to focus on the properties of touches. For example, puppies generally don’t like a rapid swipe, and gentle touches are preferred over hard presses. Feedback on the puppy’s preferences will be delivered through the digital outputs.
There are also possibilities in mapping different sound properties to the location of touches, but we want to first focus on properties of touches/swipes.
For this week, our parts should be arriving soon, and we will play with the multi-touch kit, each of us creating a first version of a multi-touch sensor. We plan to use conductive tape and/or conductive fabric for the first prototype.
Narjes Pourjafarian, Anusha Withana, Joseph A. Paradiso, and Jürgen Steimle. 2019. Multi-Touch Kit: A Do-It-Yourself Technique for Capacitive Multi-Touch Sensing Using a Commodity Microcontroller. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST ’19). Association for Computing Machinery, New York, NY, USA, 1071–1083. DOI:https://doi.org/10.1145/3332165.3347895
Our project is a new kind of interactive book where the outcome of the story is determined by the child’s choices, with the purpose of teaching kids how to eat healthier. As the child starts to read the book, they are given a set of raw foods (e.g. vegetables, chicken, eggs, fruit, milk). Each page of the book accepts a subset of the raw foods, and depending on the choice the child makes (sensed by sensors), the book returns a finished product (e.g. grilled chicken vs. fried chicken, a fruit platter vs. ice cream). At the end of the book, the child places the returned food into a ‘mouth’. The food then goes through a digestive process, where different tactile/visual/auditory outputs are given based on how “good” or “bad” the returned food is.
For our proof-of-concept experiment, we will experiment with how the sensors can contribute to the storyline of converting raw ingredients into cooked food. We will try to pair the sensors with specific food making processes and integrate the sensors into their individual pages. We will ignore the appearances of the food pieces for now, and focus on the modes of interaction and types of feedback first.
Our bill of material remains very much the same as what we had:
| Items | Quantities | Unit Price | Total Cost by Items | Notes / Store Locations |
|-------|------------|------------|---------------------|-------------------------|
| Adafruit Flora | 1 | 50.99 | 50.99 | link |
| Conductive thread | | | | (included in the Flora budget pack) |
| Resistive thread | | | | Had trouble finding where to buy |
| Resistive yarn | | | | Had trouble finding where to buy |
| Conductive fabric | 1 | 8.99 | 8.99 | link (20 cm) |
| x6-Inch by 1/16 (Pack of 3) Neoprene | 1 | 10.42 | 10.42 | link |
| Colored yarns | 1 | 2.97 | 2.97 | link |
| Colored threads/floss | 1 | 5.29 | 5.29 | link |
| Assorted Color Felt Pack | 1 | 10.88 | 10.88 | link |
We found the technical papers for the tutorials Garth shared in class:
Perner-Wilson, Hannah, and Leah Buechley. “Handcrafting textile mice.” In Proceedings of the 8th ACM Conference on Designing Interactive Systems, pp. 434-435. 2010.
Perner-Wilson, Hannah, Leah Buechley, and Mika Satomi. “Handcrafting textile interfaces from a kit-of-no-parts.” In Proceedings of the fifth international conference on Tangible, embedded, and embodied interaction, pp. 61-68. 2010.
Along with the different materials, we want to attempt to make the following sensors, with reliable data processing/filtering, for possible applications:
| Items | Quantities | Unit Price | Total Cost by Items | Notes / Store Locations |
|-------|------------|------------|---------------------|-------------------------|
| Adafruit Flora | 1 | 50.99 | 50.99 | link |
| Conductive thread | | | | (included in the Flora budget pack) |
| Resistive thread | | | | Had trouble finding where to buy |
| Resistive yarn | | | | Had trouble finding where to buy |
| Conductive fabric | 1 | 8.99 | 8.99 | link (20 cm) |
| x6-Inch by 1/16 (Pack of 3) Neoprene | 1 | 10.42 | 10.42 | link |
| Colored yarns | 1 | 2.97 | 2.97 | link |
| Colored threads/floss | 1 | 5.29 | 5.29 | link |
| Assorted Color Felt Pack | 1 | 10.88 | 10.88 | link |
We are making a themed children’s fabric book with soft sensors to create an interactive playful experience.
By adding digital interactive elements to the fabric book, we hope to explore innovative ways of letting young children learn and practice skills through tactile experiences with the help of soft electronics.
The fabric book will be made from materials like felt (and/or other types of traditional fabric), yarn, conductive fabric, and conductive yarn, with sensors like capacitive touch and pressure sensors to trigger events. The events triggered by interacting with these sensors could include lighting up LEDs, playing sounds, small vibrations, shape changes in the fabric, etc.
Proof-of-concept experiment
[Illustration: the interaction will be like: putting an element to connect the circuit for LED to light up/pressing part of the book for sound/rubbing part of the book to trigger vibration]
For the proof-of-concept experiment we will build prototypes with different soft sensors and interactions and test them to find the most suitable ones. We will start by making one page from the fabric book and integrating the interactions with the fabric material of the book’s base. The page we are making for the proof of concept is like a sample ‘book’ of possible interactions.
The questions we are trying to answer here are:
Related art or design projects
Traditional children’s fabric books
Papers and resources
A similar approach of creating PCB islands can be applied to woven fabric too. The major questions here are: how would these PCB islands be integrated into the woven fabric, and what activity recognition/sensing could they accomplish?
Revised project statement
A piece of woven fabric integrated with small electronic components. The woven fabric has such qualities:
Hsin-Liu Cindy Kao, Abdelkareem Bedri, and Kent Lyons. 2018. SkinWire: Fabricating a Self-Contained On-Skin PCB for the Hand. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2, 3, Article 116 (September 2018), 23 pages. DOI:https://doi.org/10.1145/3264926