tianhony@andrew.cmu.edu – Creative Soft Robotics
https://courses.ideate.cmu.edu/16-480/s2021
An exploration of soft robotics research and art.

Lunch time for the puppy — an interactive fabric book.
Posted Tue, 11 May 2021
https://courses.ideate.cmu.edu/16-480/s2021/3033/lunch-time-for-the-puppy-an-interactive-fabric-book/

Project Objectives

Lunch Time for the Puppy is an interactive children’s fabric book. The book is made from felt and other fabric materials with different textures, and is embedded with soft sensors and electronic output elements to create a rich storytelling experience. We produced a proof-of-concept prototype of the design by making one page from the book for demonstration. Our vision for the project is an entire book fabricated with similar methods, centered on the main character: a detachable, interactive puppy. The complete story would contain various scenarios of the puppy at places around the house.

Interaction and Story Outline

The puppy is designed to be detachable and responsive, able to be reattached to various positions throughout the fabric book. The fabric book contains different scenes of a puppy’s day and each page corresponds to different behaviors of a puppy. For example, petting the puppy in the shower is different from petting the puppy before going for a walk.

Taking the puppy off of the book

Creative Design Opportunities

With our successful implementation and modification of the methods from the Multi-Touch Kit research paper (reference 1), we believe that capacitive sensing with an off-the-shelf microcontroller board can be used to prototype and design touch-screen-like interactions for fabric, static flat surfaces, and the surfaces of other 3D objects. The technique can augment children’s fabric books with digital interactions in addition to the original material-based interactions, and the soft sensor grid can be integrated with the design of other soft materials for a more unified look of the book.
We see the possibility of creating open-ended storylines by adding different behavior states. The rich interactive quality also helps reduce screen time for children.

Prototyping Process

In this section, we will discuss sensor iterations and software modifications during the prototyping process.

Sensor Iterations

We fabricated our sensors following instructions from the original research project. The materials and tools we used were copper tape, thin double-sided tape, paper, scissors, and X-Acto knives.

The very first grid we made was of size 3×3. Copper tape strips were taped to opposite sides of a thin plastic sheet. We made this to test whether the provided multi-touch kit worked at all.

We looked for ways to ease the fabrication process. We found the sensor design guidelines and printed out the patterns for easy tracing. After a few attempts, we fabricated the sensors using the following process:

  • Cut a strip of the grid pattern out of the printed sheet
  • Tape the strip to the back of the conductive tape
  • Cut the taped conductive tape along the traces
  • Use the X-Acto knife to cut the thin lines connecting the diamonds
  • Use the scissors to cut along the diamond outlines
  • Peel off the backing of the conductive tape and apply it, using the printed pattern as a guide

The second grid we made was of size 6×6. The top layer of copper tape was taped on a plastic sheet, the bottom layer was taped on paper, and the plastic sheet was taped on top of the paper. This sensor grid had issues with not recognizing light touches, and we suspected the gaps between the plastic sheet and the paper were the cause.

The third and final grid we made was still of size 6×6. The copper tape was applied the same way as in the first sensor grid: taped to opposite sides of a thin plastic sheet. This sensor worked relatively reliably, and light touches could be detected.

Software Modifications

We built our software implementation using the Processing sketch from the original research project. Here is what the sketch does:

  • Read analog values from the pins
  • Set a baseline
  • Use BlobDetection to identify and locate touches
  • Use OpenCV to visualize the touches

We extracted the result of blob detection to do gesture recognition.
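In outline, the sensing loop can be sketched as below. This is a simplified Python stand-in for the Multi-Touch Kit’s Processing sketch, not the actual code: the grid size matches ours, but the threshold value and all function names are our own inventions for illustration.

```python
# Illustrative sketch of the touch pipeline: baseline, thresholding, and a
# simple connected-component blob detector standing in for BlobDetection.

GRID = 6          # 6x6 sensor grid, as in our prototype
THRESHOLD = 15    # assumed: reading must exceed baseline by this much

def set_baseline(frames):
    """Average several no-touch frames to get a per-cell baseline."""
    n = len(frames)
    return [[sum(f[r][c] for f in frames) / n for c in range(GRID)]
            for r in range(GRID)]

def active_cells(frame, baseline):
    """Cells whose reading rises above the baseline by more than THRESHOLD."""
    return {(r, c) for r in range(GRID) for c in range(GRID)
            if frame[r][c] - baseline[r][c] > THRESHOLD}

def detect_blobs(cells):
    """Group 4-connected active cells into blobs; one blob ~ one touch."""
    blobs, seen = [], set()
    for start in cells:
        if start in seen:
            continue
        stack, blob = [start], []
        seen.add(start)
        while stack:
            r, c = stack.pop()
            blob.append((r, c))
            for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if (nr, nc) in cells and (nr, nc) not in seen:
                    seen.add((nr, nc))
                    stack.append((nr, nc))
        blobs.append(blob)
    return blobs

def blob_centers(blobs):
    """Normalized (x, y) centers in [0, 1], like the BlobDetection output."""
    return [(sum(c for _, c in b) / len(b) / (GRID - 1),
             sum(r for r, _ in b) / len(b) / (GRID - 1)) for b in blobs]
```

A touch covering cells (0,0) and (0,1) plus a second touch at (4,4) would yield two blobs with normalized centers near (0.1, 0.0) and (0.8, 0.8).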

Initially, we wanted to implement recognition of the gesture set below.

Initial set of gestures to implement

We then ran into trouble when trying to detect sliding, partly because the grid’s resolution is low and a slide registers as touches at multiple positions at the same time rather than as one continuous motion. We therefore set sliding aside and used the other interactions.

Above is the API we ended up implementing. Note that we use the number of continuous taps to detect slides. Using the available gestures, we were able to map them to the output behaviors shown below.
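The tap-counting idea can be sketched roughly as follows. This is an illustrative Python version of the logic, not our Processing code; the timing constants and the slide threshold are assumptions we made up for the sketch.

```python
# Hypothetical sketch of tap-based gesture classification: a run of closely
# spaced taps is treated as a "slide", a long contact as a "press".

TAP_GAP = 0.3     # assumed: max seconds between taps to count as continuous
SLIDE_TAPS = 3    # assumed: this many continuous taps is treated as a slide
HOLD_TIME = 0.8   # assumed: a contact held this long counts as a press

def classify(events):
    """events: list of (timestamp, kind) pairs with kind in {'down', 'up'}."""
    taps, last_up, down_at = 0, None, None
    for t, kind in events:
        if kind == 'down':
            down_at = t
        else:  # 'up'
            if t - down_at >= HOLD_TIME:
                return 'press'
            # count this tap as continuous if it followed the last one quickly
            taps = taps + 1 if last_up is None or down_at - last_up <= TAP_GAP else 1
            last_up = t
    return 'slide' if taps >= SLIDE_TAPS else 'tap'
```

For example, three quick down/up pairs classify as a slide, while a single 1-second contact classifies as a press.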

Outcomes

In this section, we will discuss the successes and failures of our choices, things we’ve learned, and how future work would further improve the results.

Successes
  • Successful implementation of the multi-touch kit with software modifications tailored to our purpose: With the limited resources we had, a lot of our time was spent figuring out how to fabricate the sensor grid. Though our hand-cutting method is nowhere near scalable, it is sufficient for a small sensing area like ours.
  • Exploration of an interactive fabric book with more complex interactions: Earlier work on tangible interactive books has focused on simple e-textile inputs (e.g., buttons and sliders) and thermochromic paint. We believe our project is a proof of concept for an interactive fabric book with unconstrained interaction sites. In other words, the entire page, or a detachable piece of the page, could act like a touch screen. Further research would be needed to determine the effects of such interactions.
  • Variations of behaviors: To bring the puppy more to life, we implemented a very naive state machine so that the puppy has different states representing different moods. The states are very limited right now and not all state changes have corresponding outputs; future work could extend the complexity of the state machine.
Failures
  • Sensor grid robustness: The sensor grid has a low resolution because it is very small. With a 6×6 grid there are 36 ‘pixels’ in total, but a typical touch contact can easily cover up to 10 of them. The use of blob detection also introduces errors: a single touch with too large a contact area can be detected as two separate touches. We suspect that some of these detection errors might be eliminated with a larger sensor grid, i.e., a larger sensing surface. One might ask about the diamond size and spacing; we chose the values recommended by the original research project. Further work could look into replacing blob detection with other algorithms for detecting touches.
  • Missing opportunities provided by the multi-touch kit: Many of our current interactions can be done with simpler inputs. For example, taps/presses can be detected using a fabric button or single-point capacitive touch sensing. Though we do use touch coordinates to decide whether the touches are the same as before each time loop() is called, we are not taking advantage of the rich possibilities offered by location information. Further work could explore more gestures such as directional slides, pinches, and other gestures that a 2D touch screen can recognize. One thing to keep in mind is that although we are mimicking a touch screen with fabric, we should still remember the unique tangible interaction opportunities that soft fabric brings. For example, pinching fabric is drastically different from pinching a hard touch screen, as the fabric itself gets pinched and folded.
  • Unnatural interactions compared with interacting with a real puppy: Right now the sensor grid is placed behind the ear of the puppy, which makes reaching around to the back a somewhat unnatural interaction. It was a decision between aesthetics and naturalness of interaction, and we chose to hide the sensor grid. A limitation of the multi-touch kit is that fingers must directly contact the top conductive layer. The question of using resistive sensing instead of the capacitive sensing we chose was raised; we believe resistive sensing complicates the fabrication process and makes the interaction even less natural, as it requires pressing firmly on the sensor. For future work, one could use conductive thread to sew the sensors directly onto the fabric and place them at more natural locations.

Due to limited resources, we weren’t able to integrate as many different textures as we hoped for. Moving forward, we believe that adding more textures to the body of the puppy would allow a more diverse tactile experience.

Sources

  • Hardware requirements and schematics can be found from the tutorial by the original research project.
  • Software source code (the README has instructions on using the Arduino and Processing sketches):

Group member contributions

Equal contributions:

  • Preliminary research
  • Project scope definition
  • Storybook storyline
  • Interaction design
  • Testing/prototyping the sensor grid
  • Weekly reports
  • Final system troubleshooting

Catherine’s additional contribution:

  • Software implementation
  • Troubleshooting on a breadboard

Yanwen’s additional contribution:

  • Fabrication of the fabric page
  • Integration of soft technology components with fabric materials

References

Narjes Pourjafarian, Anusha Withana, Joseph A. Paradiso, and Jürgen Steimle. 2019. Multi-Touch Kit: A Do-It-Yourself Technique for Capacitive Multi-Touch Sensing Using a Commodity Microcontroller. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST ’19). Association for Computing Machinery, New York, NY, USA, 1071–1083. DOI: https://doi.org/10.1145/3332165.3347895

Jie Qi and Leah Buechley. 2010. Electronic popables: exploring paper-based computing through an interactive pop-up book. In Proceedings of the fourth international conference on Tangible, embedded, and embodied interaction (TEI ’10). Association for Computing Machinery, New York, NY, USA, 121–128. DOI: https://doi.org/10.1145/1709886.1709909

Irene Posch. 2021. Crafting Stories: Smart and Electronic Textile Craftsmanship for Interactive Books. In Proceedings of the Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI ’21). Association for Computing Machinery, New York, NY, USA, Article 100, 1–12. DOI: https://doi.org/10.1145/3430524.3446076

5/5 Catherine&Yanwen Final Critique
Posted Wed, 05 May 2021
https://courses.ideate.cmu.edu/16-480/s2021/3017/5-3-catherineyanwen-final-critique/

Demo of a page in a puppy-themed interactive fabric book

We used the Multi-Touch Kit software toolkit and attached the sensor grid to the back of the left ear of the puppy. The only other digital input is a fabric button hidden underneath the puppy’s belly.

For outputs, we combined visual, audio, and haptic feedback using LEDs, the laptop speaker, and vibration motors.

So that the response does not always feel like the same expected change after each action, we implemented a relatively naive state machine that lets the puppy be in different moods.
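The shape of such a mood state machine can be sketched as follows. This is an illustrative Python version, not our actual Arduino/Processing code; the mood names, gesture names, and transitions are all placeholders we invented for the sketch.

```python
# Minimal sketch of a naive mood state machine: each (mood, gesture) pair
# maps to a next mood, and unknown pairs leave the mood unchanged.

TRANSITIONS = {
    ('neutral', 'pet'):  'happy',
    ('neutral', 'poke'): 'annoyed',
    ('happy',   'pet'):  'happy',
    ('happy',   'poke'): 'neutral',
    ('annoyed', 'pet'):  'neutral',
    ('annoyed', 'poke'): 'annoyed',
}

class Puppy:
    def __init__(self):
        self.mood = 'neutral'

    def handle(self, gesture):
        # Unlisted (mood, gesture) pairs keep the current mood, which is
        # also why not every state change needs a corresponding output.
        self.mood = TRANSITIONS.get((self.mood, gesture), self.mood)
        return self.mood
```

Petting a neutral puppy moves it to a happy mood; poking a happy puppy drops it back to neutral, and poking again annoys it.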

Due to time constraints, we used conductive copper tape instead of conductive yarn to fabricate the sensor grid. Between aesthetics and the naturalness of the interactions, we chose aesthetics and hid the sensor grid. If we had fabricated the sensor grid using conductive yarn, we could interact with the top of the furry ears using gestures closer to how one would pet a real puppy.

Due to material constraints, we weren’t able to integrate as many different textures as we would like. Adding more textures to the body of the puppy would allow a more diverse tactile experience.

This is only a prototype of a single page; we envisioned a puppy-themed interactive book in which the interactive puppy acts like a bookmark. Every page is a different setting, and placing the puppy on a page triggers the start of an interaction that tells stories about the puppy’s behaviors and reactions in that setting.

5/3 Catherine&Yanwen Updates
Posted Mon, 03 May 2021
https://courses.ideate.cmu.edu/16-480/s2021/2884/5-3-catherineyanwen-updates/

On the technical side:

We programmed the microcontrollers, and our setup involves two Arduinos:

  1. One is connected to the multi-touch grid. This Arduino is controlled by a Processing sketch and sends signals to the other Arduino for all outputs other than sound.
  2. One is connected to all other input and output components: NeoPixel LEDs, vibration motors, and fabric buttons. This Arduino is controlled by an Arduino sketch and receives signals from the other Arduino.
Important logic in the Processing sketch
Important logic in the Arduino sketch
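The split of responsibilities between the two boards implies a simple signaling scheme over serial. As a minimal sketch, one byte per command suffices; the command names and byte values below are placeholders we invented for illustration, not our actual protocol.

```python
# Sketch of a one-byte command protocol between the two boards.
# Byte values are illustrative placeholders.

COMMANDS = {
    'happy_led':   b'H',
    'unhappy_led': b'U',
    'vibrate':     b'V',
}
DECODE = {v: k for k, v in COMMANDS.items()}

def encode(command):
    """Sender side (Processing-controlled board): one serial byte per request."""
    return COMMANDS[command]

def decode(byte):
    """Receiver side (output board): map a received byte to an action, or None."""
    return DECODE.get(byte)
```

A round trip recovers the command, and unknown bytes are simply ignored, which keeps the receiving sketch robust to serial noise.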

startHappyLed() starts a sequence in which the colored LEDs blink synchronously, followed by a sequence in which only a subset of those colored LEDs blinks at a time.

startUnhappyLed() does the same with the red LEDs: a sequence of synchronous blinking, followed by synchronous blinking of only a subset of the red LEDs at a time.
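The blink sequencing described above can be sketched as a schedule generator. This is an illustrative Python version of the timing logic only (the real code drives NeoPixels from the Arduino sketch); the repeat counts and subset sizes are made-up defaults.

```python
# Sketch of the blink schedule behind startHappyLed()/startUnhappyLed():
# all LEDs blink together, then one subset at a time.

def blink_sequence(leds, all_blinks=3, subset_size=2, subset_blinks=2):
    """Yield the set of LEDs that should be lit at each step (on, then off)."""
    for _ in range(all_blinks):
        yield set(leds)   # every LED in the group on
        yield set()       # all off
    for i in range(0, len(leds), subset_size):
        subset = set(leds[i:i + subset_size])
        for _ in range(subset_blinks):
            yield subset  # only this subset on
            yield set()   # all off
```

Driving four LEDs with one all-together blink and one blink per two-LED subset yields six steps: all on, all off, {0, 1} on/off, then {2, 3} on/off.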

On the page fabrication side:

We finished making the page and started integrating the elements:

Initial testing with the code

Besides the multi-touch grid, we decided to use 4 NeoPixels and 1 fabric button to control the vibration disc. The video shows the NeoPixel effects and the multi-touch interactions; we will integrate vibration feedback later.

4/19 Catherine & Yanwen Updates
Posted Mon, 19 Apr 2021
https://courses.ideate.cmu.edu/16-480/s2021/2852/4-19-catherine-yanwen-updates/

This week, Yanwen rewired her sensor grid and moved it to a different surface (from a cutting board to the desk surface), and the readings became much cleaner. Both single- and multi-touch detection became much more reliable, so we think we now have a sensor grid fabrication method that we can reliably test with.

We tried adding a fabric layer (felt, muslin, and t-shirt fabric) on top of the sensing grid to see if detections could still be made. Unfortunately it did not work; even layering a single sheet of paper significantly weakens the raw signal.

Together with Garth, we took another look at the blob detection library used. The next step on the software side is writing code to detect swiping and the speed of swiping. We plan to use the normalized coordinates of the blob’s center returned by the blob detection function.
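The planned swipe detection from normalized blob centers could look roughly like this. This is a hedged Python sketch, not the library’s API: the minimum-distance threshold and the function name are assumptions for illustration.

```python
# Sketch of swipe detection from a time series of normalized blob centers:
# compare the first and last centers to estimate direction and speed.

import math

def swipe_from_centers(samples, min_distance=0.3):
    """samples: list of (t, x, y) blob centers, x and y normalized to [0, 1].
    Returns (direction, speed) or None if the motion is too short."""
    if len(samples) < 2:
        return None
    (t0, x0, y0), (t1, x1, y1) = samples[0], samples[-1]
    dx, dy = x1 - x0, y1 - y0
    dist = math.hypot(dx, dy)
    if dist < min_distance or t1 <= t0:
        return None  # too little travel (or bad timestamps) to call a swipe
    direction = 'horizontal' if abs(dx) >= abs(dy) else 'vertical'
    return direction, dist / (t1 - t0)   # speed in normalized units per second
```

A center moving from x = 0.1 to x = 0.9 in 0.2 s would be reported as a horizontal swipe at 4 normalized units per second, while a tiny 0.05-unit wiggle returns None.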

We discussed possible page designs and the materials needed. Below is a rough sketch of the possibilities.

For the upcoming week, we will work on different tasks:

  • Catherine: Work on the software.
  • Yanwen: Investigate output mappings.

4/12 Catherine & Yanwen Update
Posted Mon, 12 Apr 2021
https://courses.ideate.cmu.edu/16-480/s2021/2789/4-12-catherine-yanwen-update/

We rewired the 3×3 sensor grid we had last week and the signals became much cleaner, so we think the noisy signals from last week were caused by wiring errors.

We figured out a somewhat efficient way to create the sensor grid, fabricated a 6×6 sensor grid (which still took over two hours), and got it connected and working with both the Arduino code and the Processing code.

Some issues remain, though:

  • Light touches are rarely detected; accuracy increases with firm presses. We played a little with the tunable parameters and thresholds, but the issue remains. We think a potential cause is the fabrication of the sensor grid: the 3×3 sensor had copper tape taped on both sides of one plastic sheet, but this 6×6 sensor has one layer on paper, a plastic sheet layered on top of that, and the second layer of copper tape taped on top of the plastic sheet.
  • Multi-touch is far from robust. Again, we think the fabrication could be the issue: because we have to press hard on the grid to get a detection, it is difficult to produce a single contact point from a single touch.
  • Sliding/stroking behavior is not detected clearly. One reason could be that while sliding/stroking, the finger stays at each contact position for too short a time, so we need to go back to the paper for more information. And similarly, the fabrication could be a cause.

Action item for next week:

  • Remake the 6×6 grid using the 3×3 method for a more robust sensor grid
  • Look at the paper again
  • Decide on what gestures can be detected

Four week plan

Week       Goals
4.12-4.18  Decide on what gestures can be detected
4.19-4.25  Test sound output; decide uses of different outputs (auditory & visual); decide sensor grid materials: conductive tape / conductive yarn
4.26-5.2   Start fabricating the page from a textile learning book
5.3-5.9    Finish fabricating the page from a textile learning book
Mar 29th, Catherine&Yanwen Update
Posted Mon, 29 Mar 2021
https://courses.ideate.cmu.edu/16-480/s2021/2671/mar-29th-catherineyanwen-update/

We decided to use the open-source Multi-Touch Kit and purchased the parts needed for it.

We also brainstormed how to integrate the technology into our fabric book in a specific scenario. We decided to build on the Jellycat If I were a Puppy Board Book and integrate digital tactile inputs and digital outputs (sound, lights, and haptics) with materials of different tactile qualities (rubbery, velvety, furry, etc.). Below is a sketch of what we envision.

(Link to full-sized sketch.)

One thing we discussed was that this Multi-Touch Kit enables gesture recognition, which is something we want to take advantage of. Since we also didn’t want to simply make a fabric touch screen, we decided to focus on the properties of touches. For example, puppies don’t generally like a rapid swipe, and gentle touches are preferred over hard presses. The puppy’s preferences will be conveyed through digital feedback.

There are also possibilities in mapping different sound properties to the location of touches, but we want to first focus on properties of touches/swipes.

For this week, our parts should be arriving soon, and we will play with the Multi-Touch Kit; each of us will create a first version of a multi-touch sensor. We plan to use conductive tape and/or conductive fabric for the first prototype.

Narjes Pourjafarian, Anusha Withana, Joseph A. Paradiso, and Jürgen Steimle. 2019. Multi-Touch Kit: A Do-It-Yourself Technique for Capacitive Multi-Touch Sensing Using a Commodity Microcontroller. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST ’19). Association for Computing Machinery, New York, NY, USA, 1071–1083. DOI:https://doi.org/10.1145/3332165.3347895

Research Plan
Posted Sun, 14 Mar 2021
https://courses.ideate.cmu.edu/16-480/s2021/2581/research-plan/

We are making a children’s fabric book with soft sensors to create an interactive, playful experience. It is an interactive storytelling experience augmented by sensor inputs and outputs that create different story outcomes. By adding digital interactive elements to the fabric book, we hope to explore innovative ways of letting young children learn and practice skills through tactile experiences with the help of soft electronics. The interaction starts with the child touching and playing with the soft fabric sensors; the sensors pick up the inputs and process events for outputs, and the outputs guide the child onward through the storyline.

Our project is a new kind of interactive book in which the outcome of the story is determined by the child’s choices, with the purpose of teaching kids how to eat healthier. As the child starts to read the book, he or she is given a set of raw foods (e.g., vegetables, chicken, eggs, fruit, milk). Each page of the book accepts a subset of the raw foods, and depending on the choice the kid makes (sensed by the sensors), the book returns a finished product (e.g., grilled chicken vs. fried chicken, a fruit platter vs. ice cream). At the end of the book, the child places the returned food into a ‘mouth’. The food then goes through a digestive process, where different tactile/visual/auditory outputs are given based on how ‘good’ or ‘bad’ the chosen food is.

For our proof-of-concept experiment, we will experiment with how the sensors can contribute to the storyline of converting raw ingredients into cooked food. We will try to pair the sensors with specific food making processes and integrate the sensors into their individual pages. We will ignore the appearances of the food pieces for now, and focus on the modes of interaction and types of feedback first. 

Our bill of material remains very much the same as what we had:

Items                                  Quantities  Unit Price  Total Cost  Notes / Store Locations
Adafruit Flora                         1           50.99       50.99       link
Conductive thread                      (included in the Flora budget pack)
Resistive thread                                                           Had trouble finding where to buy
Resistive yarn                                                             Had trouble finding where to buy
Conductive fabric                      1           8.99        8.99        link (20 cm)
x6-Inch by 1/16 (Pack of 3) Neoprene   1           10.42       10.42       link
Colored yarns                          1           2.97        2.97        link
Colored threads/floss                  1           5.29        5.29        link
Assorted Color Felt Pack               1           10.88       10.88       link

We found the technical papers for the tutorials Garth shared in class:

Perner-Wilson, Hannah, and Leah Buechley. “Handcrafting textile mice.” In Proceedings of the 8th ACM Conference on Designing Interactive Systems, pp. 434-435. 2010.

Perner-Wilson, Hannah, Leah Buechley, and Mika Satomi. “Handcrafting textile interfaces from a kit-of-no-parts.” In Proceedings of the fifth international conference on Tangible, embedded, and embodied interaction, pp. 61-68. 2010.

Bill of Material
Posted Wed, 10 Mar 2021
https://courses.ideate.cmu.edu/16-480/s2021/2551/bill-of-material/

For the initial proof of concept for the children’s fabric book, we will create a list of selected sensors and run tests with them. This will be a more technical prototyping stage, and these prototypes will help us decide which input means fit best with the responses, materials, and story narrative.

Along with the different materials, we want to attempt to make the following sensors, with reliable data processing/filtering, for possible applications:

  • bend sensors
  • squeeze sensors(using bend sensors)
  • stroke sensors
  • pressure sensors
Items                                  Quantities  Unit Price  Total Cost  Notes / Store Locations
Adafruit Flora                         1           50.99       50.99       link
Conductive thread                      (included in the Flora budget pack)
Resistive thread                                                           Had trouble finding where to buy
Resistive yarn                                                             Had trouble finding where to buy
Conductive fabric                      1           8.99        8.99        link (20 cm)
x6-Inch by 1/16 (Pack of 3) Neoprene   1           10.42       10.42       link
Colored yarns                          1           2.97        2.97        link
Colored threads/floss                  1           5.29        5.29        link
Assorted Color Felt Pack               1           10.88       10.88       link
Clarifying Research Scope
Posted Mon, 08 Mar 2021
https://courses.ideate.cmu.edu/16-480/s2021/2539/clarifying-research-scope/

Project synopsis

We are making a themed children’s fabric book with soft sensors to create an interactive playful experience. 

By adding digital interactive elements to the fabric book, we hope to explore innovative ways of letting young children learn and practice skills through tactile experiences with the help of soft electronics.

The fabric book will be made from materials like felt (and/or other traditional fabrics), yarn, conductive fabric, and conductive yarn, with sensors like capacitive touch and pressure sensors to trigger events. The events triggered by interacting with these sensors could include lighting up LEDs, playing sounds, small vibrations, shape changes in the fabric, and so on.

Proof-of-concept experiment

[Illustration: the interaction will be like: putting an element to connect the circuit for LED to light up/pressing part of the book for sound/rubbing part of the book to trigger vibration]

For the proof-of-concept experiment we will build prototypes with different soft sensors and interactions and test them to find the most suitable ones. We will start by making one page from the fabric book and integrating the interaction with the fabric material of the book’s base. The page we are making for the proof of concept is like a sampler of possible interactions.

The questions we are trying to answer here are:

  • How to translate tactile interactions into digital inputs?
  • How to integrate digital outputs into soft fabric materials while maintaining the softness?

Related art or design projects

Traditional children’s fabric books

Papers and resources

  1. I. Posch, “Crafting Stories: Smart and Electronic Textile Craftsmanship for Interactive Books,” in Proceedings of the Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction, New York, NY, USA, Feb. 2021, pp. 1–12, doi: 10.1145/3430524.3446076.
  2. KOBAKANT DIY WEARABLE TECHNOLOGY DOCUMENTATION. Url: https://www.kobakant.at/DIY/ 
  3. J. Qi and L. Buechley, “Electronic popables: exploring paper-based computing through an interactive pop-up book,” in Proceedings of the fourth international conference on Tangible, embedded, and embodied interaction, New York, NY, USA, Jan. 2010, pp. 121–128, doi: 10.1145/1709886.1709909.
Research Study: B
Posted Sun, 28 Feb 2021
https://courses.ideate.cmu.edu/16-480/s2021/2494/research-study-b/

I was inspired by the self-contained on-skin PCB for the hand, where electronic components are separated into individual PCB islands, distributed over the body surface, and connected through a novel skin-wiring approach that deposits conformal multi-stranded metallic wires on thin silicone substrates through a sewing-based technique.

A similar approach of creating PCB islands can be applied to woven fabric too. The major questions here are: how would these PCB islands be integrated into the woven fabric, and what activity recognition/sensing could they accomplish?

Revised project statement

A piece of woven fabric integrated with small electronic components. The woven fabric has the following qualities:

  1. Electronic components aren’t visually obvious
  2. Electronic components do not affect the softness and the flexibility of the fabric
  3. The fabric can sense things that would normally rely on hard components (i.e., beyond capacitive touch/slider sensors).

Hsin-Liu Cindy Kao, Abdelkareem Bedri, and Kent Lyons. 2018. SkinWire: Fabricating a Self-Contained On-Skin PCB for the Hand. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2, 3, Article 116 (September 2018), 23 pages. DOI:https://doi.org/10.1145/3264926
