Assignments – Creative Soft Robotics
https://courses.ideate.cmu.edu/16-480/s2021
An exploration of soft robotics research and art.

5/3 Catherine & Yanwen Updates
Mon, 03 May 2021

On the technical side:

We programmed the microcontrollers, and our setup involves two Arduinos:

  1. One is connected to the multi-touch grid. This Arduino is controlled by a Processing sketch and sends signals to the other Arduino for all outputs other than sound.
  2. One is connected to all other input/output components: NeoPixel LEDs, vibration motors, and fabric buttons. This Arduino is controlled by an Arduino sketch and receives signals from the other Arduino.
Important logic in the processing sketch
Important logic in the arduino sketch
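The split between the two boards can be sketched as a tiny one-byte command protocol: the touch-grid Arduino decides what should happen and the output Arduino only decodes and acts. The command names and byte codes below are placeholders, not the actual values in our sketches; this Python sketch just models the encode/decode logic that the serial link carries.

```python
# Hypothetical one-byte command protocol between the two Arduinos.
# The touch-grid Arduino encodes a decision; the output Arduino decodes it.
COMMANDS = {
    "happy_led": b"H",    # placeholder codes, not the real sketch's values
    "unhappy_led": b"U",
    "vibrate": b"V",
}

def encode_command(name):
    """Translate an interaction outcome into the byte sent over serial."""
    return COMMANDS[name]

def decode_command(byte):
    """Inverse lookup on the receiving Arduino's side."""
    for name, code in COMMANDS.items():
        if code == byte:
            return name
    return None  # unknown bytes are ignored rather than acted on
```

Keeping the protocol this small means the output Arduino never needs to know anything about the Processing sketch's state.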

The startHappyLed() function starts a sequence in which all of the colored LEDs blink synchronously, followed by a sequence in which only a subset of these colored LEDs blinks at a time.

The startUnhappyLed() function starts a sequence in which all of the red LEDs blink synchronously, followed by a sequence in which only a subset of these red LEDs blinks at a time.
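The two-phase blink pattern both functions share can be modeled as a pure function of the blink step: first every LED toggles on a shared clock, then only a subset keeps toggling. This is a hedged Python sketch of that logic; the phase length, LED count, and subset indices are placeholders, not the values in our Arduino sketch.

```python
def led_states(step, num_leds=4, full_phase=6, subset=(0, 1)):
    """Return a tuple of on/off states for each LED at a blink step.

    For the first `full_phase` steps every LED blinks together;
    after that only the LEDs in `subset` keep blinking. All constants
    here are placeholders for illustration.
    """
    on = step % 2 == 0  # synchronous blink: all LEDs share one clock
    if step < full_phase:
        return tuple(on for _ in range(num_leds))
    return tuple(on and (i in subset) for i in range(num_leds))
```

On the Arduino side the same function could drive the NeoPixels from a millis()-based step counter, keeping the pattern logic separate from the timing code.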

On the page fabrication side:

We finished making the page and started integrating the elements:

Initial testing with the code

Besides the multi-touch grid, we decided to use 4 NeoPixels and 1 fabric button to control the vibration disc. The video shows the effect of the NeoPixels and the multi-touch interactions; we will integrate the vibration feedback later.

4/26 Catherine & Yanwen Updates
Mon, 26 Apr 2021

We have developed our own working API with the following events:

We ran into some trouble when trying to detect sliding behavior, partly because the grid is actually low resolution and a slide cannot be distinguished from touching multiple positions at the same time. We therefore decided to set sliding aside and use the other interactions.
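The remaining touch interactions can be classified per grid frame without any tracking across time, which is what made them more robust than sliding on our grid. Below is a hedged Python sketch of that per-frame classification; the frame format (a 2D list of cell activations) and threshold are assumptions for illustration, not our actual API.

```python
def classify_frame(frame, threshold=1):
    """Classify one multi-touch grid frame into 'none', 'single', or 'multi'.

    `frame` is a 2D list of per-cell activation values (placeholder
    format). A sliding detector would additionally need to track touch
    positions across frames, which a low-resolution grid cannot
    support reliably.
    """
    touched = [(r, c)
               for r, row in enumerate(frame)
               for c, v in enumerate(row)
               if v >= threshold]
    if not touched:
        return "none"
    return "single" if len(touched) == 1 else "multi"
```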

Here are the output behaviors we planned out for now:

We also tested the vibration motor and LED in preparation for integrating them with the fabric prototype.

For the prototype, we made a sample page based on the scenario where the puppy is sitting beside the table and wants to eat human food. We made a sample sketch and a paper template for cutting the fabric pieces.

sample page
(breakdown of the page -> some parts (ears, tail) are movable)
templates

For the prototype and final demo, we will be working with a single fabric page and one multi-touch pad (positioned on one ear), but the full version would be a book with multiple pages showing different scenes of the puppy around the house.

For this week, we will be working on:
– Developing possible different states of the puppy
– Testing out the API with actual events
– Fabricating and assembling the fabric pieces

4/21 Catherine & Yanwen Updates
Wed, 21 Apr 2021

Following the suggestions on mapping out possible user behaviors toward the touch pad and designing states and unpredictability into the setup, we started writing down the categories in a Google Sheet:

While waiting for the components to be ready for pickup, we will continue finishing and refining these listed interactions and outputs before applying them to the actual making of the page.

For the upcoming week, we will be working on producing our own API for the data as well as starting to fabricate initial pieces for the fabric page.

Catherine & Yanwen Project Update
Mon, 22 Mar 2021

We did more literature research with the goal of focusing on a single sensing technique, and we realized that work in this space is much more advanced and complex than the approaches we were looking at before. The two technical directions we eventually narrowed down to are touch tracking with electric field tomography and capacitive sensing.

Electrick uses electric field tomography in concert with an electrically conductive material to detect touch inputs on large surfaces, without depending on the distribution of the electrodes. Its follow-up work, Pulp Nonfiction, further explores using Electrick with paper interfaces to track finger and writing-instrument inputs.

We took inspiration from this and wondered if we could use the same system on fabric, exploring where to place conductive yarn and how dense it needs to be. One immediate difficulty we see with this approach is that Electrick uses a custom sensor board and its own sensing system, so we are concerned about spending a long time reinventing the wheel, and about whether we are even able to reinvent it.

Another inspiration was the I/O Braid. It uses conductive and passive yarns to create a helical sensing matrix, and is able to detect many gestures such as twisting, flicking, and sliding based on capacitive signal strengths. We wanted to explore how we can apply the same technique in 2D. The paper mentions using a PSoC® 4 S-Series Pioneer Kit, but we weren't able to find much information about how it differs from other capacitive sensing chips.

For both approaches, there is much to explore to achieve the same goal of detecting gestures on a textile surface (even something as simple as detecting where the textile is being touched, i.e. a textile touchscreen). This adds to our kids' storybook narrative: it allows a wider interaction space and thus a more open-ended book, since we could combine this position and touch sensing technique with the interaction of placing assets onto, or directly touching, different locations of the page.

For the first approach, we can't think of a clear proof-of-concept experiment as of now because of our lack of technical understanding of electric field tomography.

For the second approach, the proof-of-concept experiment could be hooking up a capacitive sensing board with several different sensing 'matrices' made from conductive copper tape (thicker and easier to work with than conductive yarn for now).
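One part of that proof of concept is software: turning raw capacitance readings from a copper-tape pad into touch and release events by thresholding against a resting baseline. This Python sketch models that logic only; the margin value is a placeholder, and a real pad would need its own calibration (on an Arduino this would sit on top of readings from something like the CapacitiveSensor library).

```python
def touch_events(readings, baseline, margin=50):
    """Convert raw capacitance readings from one pad into events.

    Emits ("touch", i) when a reading rises above baseline + margin
    and ("release", i) when it falls back below. `margin` is a
    placeholder; real pads need per-pad calibration.
    """
    events, touching = [], False
    for i, reading in enumerate(readings):
        if not touching and reading > baseline + margin:
            touching = True
            events.append(("touch", i))
        elif touching and reading < baseline + margin:
            touching = False
            events.append(("release", i))
    return events
```

Running several of these in parallel, one per copper-tape matrix, would let us compare how the different electrode layouts behave.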

Our aim for the outcome is to focus on one method of sensing touch input and extend the interaction model from the basic "touch triggers output" to capturing different types of touch inputs (like the microinteractions from the second paper) or localized inputs.

We'd love feedback and advice, especially on the feasibility of this direction, the anticipated technical challenges, and related technologies/references.

References:

Yang Zhang, Gierad Laput, and Chris Harrison. 2017. Electrick: Low-Cost Touch Sensing Using Electric Field Tomography. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI ’17). Association for Computing Machinery, New York, NY, USA, 1–14. DOI:https://doi-org.proxy.library.cmu.edu/10.1145/3025453.3025842

Yang Zhang and Chris Harrison. 2018. Pulp Nonfiction: Low-Cost Touch Tracking for Paper. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, New York, NY, USA, Paper 117, 1–11. DOI:https://doi.org/10.1145/3173574.3173691

Alex Olwal, Jon Moeller, Greg Priest-Dorman, Thad Starner, and Ben Carroll. 2018. I/O Braid: Scalable Touch-Sensitive Lighted Cords Using Spiraling, Repeating Sensing Textiles and Fiber Optics. In Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology (UIST ’18). Association for Computing Machinery, New York, NY, USA, 485–497. DOI:https://doi-org.proxy.library.cmu.edu/10.1145/3242587.3242638

Alex Olwal, Thad Starner, and Gowa Mainini. 2020. E-Textile Microinteractions: Augmenting Twist with Flick, Slide and Grasp Gestures for Soft Electronics. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI ’20). Association for Computing Machinery, New York, NY, USA, 1–13. DOI:https://doi-org.proxy.library.cmu.edu/10.1145/3313831.3376236

Research Part B
Mon, 01 Mar 2021

My project brief remains largely the same as before: to create a robot capable of licking lollipops in an eerily biomimetic manner, counting those licks, and, before finishing the lollipop, biting the candy with a hidden jaw mechanism.

The concept and background I discussed in the last post left a big technical hole: how to keep the tongue wet. The previous iteration of this project used a sponge for the tongue, but this presents both visual issues (it's hard to make a sponge convincingly mimic a tongue) and mechanical issues (actuating complex movement within a sponge may not be feasible). Furthermore, the previous work I looked at for this new version primarily used silicone, and I have some concerns about the ability of a material like silicone to hold onto water in a useful way.

Doing some research, I stumbled onto hydrogel materials: complex polymer structures that absorb water, deform easily, and are biocompatible/biodegradable. They have wide applications in the medical and biotech fields, and are starting to be introduced into soft robotics. Considering the tongue design discussed in the last blog post, which consists of 6 inflating/deflating chambers, a similar motion could be achieved using an electro-responsive hydrogel (1): sending current through the material makes it swell, replacing the pneumatic air chambers in the silicone tongue. Switching to a hydrogel would allow the tongue to hold water in a manner hopefully similar to the sponge used in the previous iteration.

Furthermore, adding a microfiber layer on top of the hydrogel tongue could reduce the amount of water that evaporates from the tongue (2). Reducing evaporation may allow the tongue to perform more licks between rewettings and therefore allow for a more fluid, organic motion. This approach does increase friction, though, and may have an adverse effect on the tongue’s ability to wet the lollipop on consecutive licks.

There are some concerns about the integrity of hydrogels in structural applications (1), and the extent to which this will be an issue for this particular project might require some testing. It is somewhat unclear to me where the line is drawn between structural and nonstructural applications.

With regards to manufacturing these materials, two approaches seem feasible. First, hydrogels can be directly 3D-printed using a direct ink writing printer (similar in concept to a standard fused deposition modeling printer, but extruding a liquid that cures rather than a melted solid that hardens as it cools). Mixing a carbomer solvent with the base materials for a hydrogel can create an ink capable of being 3D-printed without the use of complex, and sometimes problematic, support structures (3). Second, and likely even more feasible, viable hydrogels can be cast in a rigid mold (2). Molds for this purpose can be produced in easily accessible materials such as PLA plastic (meaning molds can be 3D-printed); the constituent ingredients for a hydrogel are then poured into the rigid mold and removed after being properly cured.

Cited Sources:

  1. Hritwick Banerjee, Suhail Mohamed, Hongliang Ren. “Hydrogel Actuators and Sensors for Biomedical Soft Robots: Brief Overview with Impending Challenges.” Biomimetics, Volume 3, Issue 3. September 2018. doi: 10.3390/biomimetics3030015.
  2. Shuma Kanai, Yosuke Watanabe, MD Nahin Islam Shiblee, Ajit Khosla, Jun Ogawa, Masaru Kawakami, Hidemitsu Furukawa. “Skin-Mimic Hydrogel Materials with Water-Perspiration Control for Soft Robots Developed by 3D Printing.” ECS Transactions, Volume 98, Number 13, Pages 23-27. September 2020. doi: 10.1149/09813.0023ecst.
  3. Zhe Chen, Donghao Zhao, Binhong Liu, Guodong Nian, Xiaokeng Li, Jun Yin, Shaoxing Qu, Wei Yang. “3D Printing of Multifunctional Hydrogels.” Advanced Functional Materials, Volume 29, Issue 20, Pages 1900971. 2019. doi: 10.1002/adfm.201900971.

Additionally Consulted Sources:

  1. Hritweick Banerjee, Hongliang Ren. “Optimized Double-Network Hydrogel for Biomedical Soft Robots.” Soft Robotics, Volume 4, Number 3. 1 September 2017. doi: 10.1089/soro.2016.0059.
  2. Yin Cheng, Kwok Hoe Chan, Xiao-Qiao Wang, Tianpeng Ding, Tongtao Li, Xin Lu, Ghim Wei Ho. “Direct-Ink-Write 3D Printing of Hydrogels into Biomimetic Soft Robots.” ACS Nano, Volume 13, Issue 11, Pages 13176-13184. 26 November 2019. doi: 10.1021/acsnano.9b06144.
  3. Xuanming Lu, Weiliang Xu, Xiaoning Li. “PneuNet Based Control System for Soft Robotic Tongue.” IEEE 14th International Workshop on Advanced Motion Control, Pages 353-357. 2016. doi: 10.1109/AMC.2016.7496375.

Exercise 8: Research Study B
Mon, 01 Mar 2021

Themes: Design and fabrication of soft textile-based sensors

Inspired by existing artworks of responsive and interactive spaces, I want to explore making soft interfaces with textile-based materials and sensors. Instead of building an entire environment, I plan on creating smaller-scale prototypes (in the form of swatches or interactive soft objects) that use conductive fabric and thread to produce soft sensors. The primary choices of sensors are capacitive touch sensors and pressure sensors. For the interface's input and output, users will be able to interact with the sensors and trigger sounds as output.

From previous assignments:

  1. R. Sun, R. Onose, M. Dunne, A. Ling, A. Denham, and H.-L. (Cindy) Kao, “Weaving a Second Skin: Exploring Opportunities for Crafting On-Skin Interfaces Through Weaving,” in Proceedings of the 2020 ACM Designing Interactive Systems Conference, New York, NY, USA: Association for Computing Machinery, 2020, pp. 365–377.
  2. S. H. Yoon et al., “iSoft: A Customizable Soft Sensor with Real-time Continuous Contact and Stretching Sensing,” in Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology, New York, NY, USA, Oct. 2017, pp. 665–678, doi: 10.1145/3126594.3126654. (touch and stretch)

Additional references:

  1. A. Vogl, P. Parzer, T. Babic, J. Leong, A. Olwal, and M. Haller, “StretchEBand: Enabling Fabric-based Interactions through Rapid Fabrication of Textile Stretch Sensors,” in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, New York, NY, USA, May 2017, pp. 2617–2627, doi: 10.1145/3025453.3025938. (stretch, machine stitching)
  2. F. Heller, S. Ivanov, C. Wacharamanotham, and J. Borchers, “FabriTouch: exploring flexible touch input on textiles,” in Proceedings of the 2014 ACM International Symposium on Wearable Computers, New York, NY, USA, Sep. 2014, pp. 59–62, doi: 10.1145/2634317.2634345. (touch based)
  3. R. Aigner, A. Pointner, T. Preindl, P. Parzer, and M. Haller, “Embroidered Resistive Pressure Sensors: A Novel Approach for Textile Interfaces,” in Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, New York, NY, USA, Apr. 2020, pp. 1–13, doi: 10.1145/3313831.3376305. (touch based, machine embroidery)
  4. S. Mlakar and M. Haller, “Design Investigation of Embroidered Interactive Elements on Non-Wearable Textile Interfaces,” in Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, New York, NY, USA, Apr. 2020, pp. 1–10, doi: 10.1145/3313831.3376692. (design, touch based, stitching)
Exercise 7: Research Study Part A
Wed, 24 Feb 2021
teamLab

The question that emerges from my last assignment is why we should design responsive and interactive spaces, and how we might create them. Man-made environments have long stayed static and dominated by visual elements. Integrating interactive components can change the passive relationship between people and space into an active one. Large-scale interactive installations stand at the intersection of art, architecture, and technology, and their purposes can be divided into creating artistic expressions and creating functional outcomes. Art projects like the digitized natural environments created by teamLab or Voice Tunnel by Rafael Lozano-Hemmer both mention removing the boundaries between artwork and audience, co-creation, and reshaping the narrative of the space.

Projects that emphasize the functionality of interactive space discuss the possibilities of expanding tangible user interface to larger scales. A branch of this direction is interactive environments for educational purposes. Multisensorial environments like the Magika room and Social Sensory Architectures both look into how to design interactive spaces that integrate play and inclusiveness for autistic children.

I would like to research the functionality of interactive space, while borrowing the concepts of co-creation and playfulness from artistic projects. Soft materials and technology could be a way of making technology approachable. By using soft sensors and actuators, it is possible to make interfaces with a more flexible presentation.

Papers on soft sensors and actuators:

  1. S. Nakamaru, R. Nakayama, R. Niiyama, and Y. Kakehi, “FoamSense: Design of Three Dimensional Soft Sensors with Porous Materials,” in Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology, New York, NY, USA, Oct. 2017, pp. 437–447, doi: 10.1145/3126594.3126666. 
  2. S. H. Yoon et al., “iSoft: A Customizable Soft Sensor with Real-time Continuous Contact and Stretching Sensing,” in Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology, New York, NY, USA, Oct. 2017, pp. 665–678, doi: 10.1145/3126594.3126654.
  3. L. Albaugh, S. Hudson, and L. Yao, “Digital Fabrication of Soft Actuated Objects by Machine Knitting,” in Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, New York, NY, USA, May 2019, pp. 1–13, doi: 10.1145/3290605.3300414.
  4. J. Forman, T. Tabb, Y. Do, M.-H. Yeh, A. Galvin, and L. Yao, “ModiFiber: Two-Way Morphing Soft Thread Actuators for Tangible Interaction,” in Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, New York, NY, USA, May 2019, pp. 1–11, doi: 10.1145/3290605.3300890.
Research Study: Revisiting a Previous Project
Wed, 24 Feb 2021

Two years ago, with another student, I built a robot to lick lollipops. The robot functions, but I think more can be done with the concept; the piece consistently elicits references to the 1970 Tootsie Pop commercial (the one that posits the question of how many licks it takes…) from those I've shown it to, although it was never intended as a reference to that question. Creating a new piece with a tongue more mimetic of an actual tongue (compared to the sponge used in the original), a lick counter, and potentially an [initially hidden] ability to bite the lollipop before licking to the center could turn this largely meaningless robot into a piece of art that toys with human curiosity, satisfyingly references an influential pop-culture artifact, and includes elements that elicit surprise or frustration in viewers.

This piece primarily bases its inspiration in two aforementioned sources: my former lollipop licking robot (1) and the 1970 Tootsie Pop commercial (2). A compelling presentation of the robot in video format could also take influence from the format of a show like Mythbusters (3), with its habit of over-engineering solutions to answer largely scientifically unimportant questions in the popular consciousness. Lastly, the piece would likely take an appearance of disembodied, artificial body parts housed in a sterile, inorganic chassis, similar in aesthetic to work such as Simone Giertz’s Musical Instrument Made from Teeth (4).

This new iteration of the licking robot might integrate a simplified version of Lu, Xu, and Li's soft robotic tongue (5) from the first homework assignment. The ability to incorporate biomimetic movement into the piece would be necessary in visually telling the full story. Furthermore, an artificial jaw would be implemented in the piece to give the biting action somewhat reasonable articulation. This jaw might be inspired by the mastication robot by Lee, Kim, Chun, and Park (6).

  1. Paul Park, Sebastian Carpenter. “Lollipop Licker.” Computing for Creative Practice, F18, Prof. Golan Levin, Carnegie Mellon University. November 2018. url: https://youtu.be/DW7IsEc858w.
  2. “How Many Licks.” Tootsie Roll Industries. 1970. url: https://youtu.be/O6rHeD5x2tI.
  3. Peter Rees. Mythbusters. Discovery Channel. 23 January 2003 – 28 February 2018.
  4. Simone Giertz. “Building a Musical Instrument Out of Teeth.” 16 September 2020. url: https://youtu.be/83yXCMHGr9A.
  5. X. Lu, W. Xu, X. Li. “A Soft Robotic Tongue—Mechatronic Design and Surface Reconstruction.” IEEE/ASME Transactions on Mechatronics, Volume 22, Number 5, Pages 2102-2110. 2017. doi: 10.1109/TMECH.2017.2748606.
  6. Seung-Ju Lee, Bum-Keun Kim, Yong-Gi Chun, Dong-June Park. “Design of Mastication Robot with Life-Sized Linear Actuator of Human Muscle and Load Cells for Measuring Force Distribution on Teeth.” Mechatronics, Volume 51, Pages 127-136. 2018. doi: 10.1016/j.mechatronics.2017.11.013.

Exercise 6
Mon, 22 Feb 2021

Lumen is an immersive interactive installation created by Jenny Sabin Studio. It was exhibited at MoMA PS1 as the winner of the Young Architects Program in 2017. The digitally knitted structure uses responsive textile that can emit glowing color at night by absorbing sunlight throughout the day. The installation was created with the concept of being “socially and environmentally responsive”, and aimed for a collective immersive experience for the visitors. 

I think this piece has great potential for implementing soft technology. Starting with the why questions: since the installation has already experimented with solar-active fiber, exploring interactivity with soft sensors would be a suitable next step. Also, the YAP installations serve as settings for the PS1 Warm Up music performances; connecting knitted or woven sensors with collaborative music creation could be a possible application for soft technology.

Possible technology:

Rebecca Stewart and Sophie Skach. 2017. Initial Investigations into Characterizing DIY E-Textile Stretch Sensors. In Proceedings of the 4th International Conference on Movement Computing (MOCO ’17). Association for Computing Machinery, New York, NY, USA, Article 1, 1–4. DOI:https://doi.org/10.1145/3077981.3078043 

Paper: Ruojia Sun, Ryosuke Onose, Margaret Dunne, Andrea Ling, Amanda Denham, and Hsin-Liu (Cindy) Kao. 2020. Weaving a Second Skin: Exploring Opportunities for Crafting On-Skin Interfaces Through Weaving. Proceedings of the 2020 ACM Designing Interactive Systems Conference. Association for Computing Machinery, New York, NY, USA, 365–377. DOI:https://doi.org/10.1145/3357236.3395548 

Exercise 5
Wed, 17 Feb 2021

Phase In, Phase Out is an artwork created by EJTECH studio. Working with electronic textiles and experimental interfaces, EJTECH connected materiality and sound by turning textile into a multichannel electroacoustic transducer.

This artwork is related to Judit Eszter Kárpáti’s PhD research Soft Interfaces – Crossmodal Textile Interactions.

Artwork: Phase In, Phase Out. Sound installation / Performance. Horizont Gallery, Budapest, 2019 – ongoing. URL: https://ejtech.studio/Phase-In-Phase-Out 

Research: Judit Eszter Kárpáti. “Soft Interfaces – Crossmodal Textile Interactions,” PhD diss., (Moholy-Nagy University of Art and Design, 2018.) https://corvina.mome.hu/dsr/access/735c7f88-8d75-4506-b5c3-5ff149524f54
