## SooA Kim – Final Project

``````/* SooA Kim
sookim@andrew.cmu.edu
Final Project
Section C
*/

//webcam and image texture variables
var myCaptureDevice;
var img = [];
var frames = [];
var framelist = 0;

//particle variables
var gravity = 0.1;   // downward acceleration
var springy = 0.9; // how much velocity is retained after bounce
var drag = 0.0001;    // drag causes particles to slow down
var np = 50;
var particles = [];

var Image1 = [];
var Image2 = [];
var Image3 = [];

var link1 = [
    "https://i.imgur.com/qAayMED.png",
    "https://i.imgur.com/86biLWe.png",
    "https://i.imgur.com/r5UE5kg.png",
    "https://i.imgur.com/I1BBRfH.png",
    "https://i.imgur.com/D1l3aXi.png",
    "https://i.imgur.com/NAHr7yB.png",
    "https://i.imgur.com/F0SB9GM.png",
    "https://i.imgur.com/B1kvNaM.png",
    "https://i.imgur.com/Th9sahL.png",
    "https://i.imgur.com/c9cuj2k.png"
];

var link2 = [
    "https://i.imgur.com/z7BAFyp.png",
    "https://i.imgur.com/8Ww7Vuo.png",
    "https://i.imgur.com/QFjlLr2.png",
    "https://i.imgur.com/0sdtErd.png"
];

var link3 = [
    "https://i.imgur.com/ZD7IoUP.png"
];

var textures = [Image1, Image2, Image3];
var textureIndex = 0;

function preload() {
    // load each list of sprite-texture URLs into its image array
    for (var z = 0; z < link1.length; z++){
        Image1.push(loadImage(link1[z]));
    }
    for (var z = 0; z < link2.length; z++){
        Image2.push(loadImage(link2[z]));
    }
    for (var z = 0; z < link3.length; z++){
        Image3.push(loadImage(link3[z]));
    }
}

function setup() {
    createCanvas(600, 480);
    myCaptureDevice = createCapture(VIDEO);
    myCaptureDevice.size(600, 480); // attempt to size the camera.
    myCaptureDevice.hide(); // this hides an extra view.
}

function mousePressed(){
    // pick a new texture set; the loop prevents the same texture
    // from being selected again
    var textureNew = floor(random(textures.length));
    while (textureNew == textureIndex){
        textureNew = floor(random(textures.length));
    }
    textureIndex = textureNew;
}

function particleStep() {
    this.age++;
    this.x += this.dx;
    this.y += this.dy;
    this.dy = this.dy + gravity; // force of gravity
    var vs = Math.pow(this.dx, 2) + Math.pow(this.dy, 2);
    // d is the ratio of old velocity to new velocity
    var d = vs * drag;
    d = min(d, 1);
    // scale dx and dy to include the drag effect
    this.dx *= (1 - d);
    this.dy *= (1 - d);
}

function makeParticle(px, py, pdx, pdy, imglink) {
    var p = {x: px, y: py,
             dx: pdx, dy: pdy,
             age: 0,
             step: particleStep,
             draw: particleDraw,
             image: imglink  // the sprite image is a parameter
    };
    return p;
}

function particleDraw() {
    //tint(255, map(this.age, 0, 30, 200, 0)); //fade out image particles at the end
    image(this.image, this.x, this.y);
}

function isColor(c) {
    return (c instanceof Array);
}

function draw() {
    tint(255, 255);
    background(220);
    imageMode(CORNER);
    myCaptureDevice.loadPixels(); // this must be done on each frame.
    image(myCaptureDevice, 0, 0);
    framelist += 1;
    imageMode(CENTER);
    // get the color value of every 20th pixel of the webcam output
    for (var i = 0; i < width; i += 20) {
        for (var j = 0; j < height; j += 20) {
            var currentColor = myCaptureDevice.get(i, j);

            var rd = red(currentColor);
            var gr = green(currentColor);
            var bl = blue(currentColor);
            // targetColor: green
            if (rd < 90 && gr > 100 && bl < 80){
                for (var s = 0; s < textures[textureIndex].length; s++) {
                    // spawn at (i, j), the tracked green pixel
                    var pImage = makeParticle(i, j, random(-10, 10), random(-10, 10), textures[textureIndex][s]);
                    if (particles.length < 10000) {
                        particles.push(pImage);
                    }
                }
            }
        }
    }
    var newParticles = [];
    for (var i = 0; i < particles.length; i++) { // for each particle
        var p = particles[i];
        p.step();
        p.draw();
        if (p.age < 20) { // cull particles early to keep the sketch from lagging
            newParticles.push(p);
        }
    }
    particles = newParticles;
}

// moving frames (unused helper: frames[] is never populated in this version)
function framePush(){
    framelist += 1;
    image(frames[framelist % frames.length], 0, 0);
}
``````

Mouse click the screen!

For this final project, I created a variety of sprite textures to apply to a particle-effect animation that replaces the green pixels in the webcam feed. The sprite texture changes when the mouse is clicked. I approached this project from a place of environmental concern: I wanted to convey a message about climate change through imagery of air pollution, using sprite textures that imitate dust and smoke.
I also tried to be playful with this project by providing different textures. I started the code by tracking the green pixels first, then moved on to figuring out how to apply the image textures to the particle effect.

The coding process was very challenging, but I am really happy with the outcome.

sprite textures

first try on getting the sprite texture to the image

## SooA Kim: Project – 12 – Proposal

For my final project, I would like to create particle-effect animations using sprite textures that react to green pixels in a live capture device (i.e., a webcam). Anything green on the webcam screen will be covered with smoke particles. This work is based on an everyday environmental concern: air pollution that is slowly becoming more vicious and affecting the whole world. I want to convey a message, and make the user fully conscious of, what happens when there is no green in what you see on the screen, only smoke that can signify smog or, more broadly, the toll of climate change. If the smoke-effect animations don't work using sprite textures and particles with the live webcam, I will try to replace them with some other animated content.

i.e. sprite textures that I’m planning to use

## SooA Kim: Looking Outwards – 11

Hyojung Seo is a media artist and an Associate Professor at Samsung Art and Design Institute. She creates interactive performance works through installations and various platforms of communication that allow humans to treat media as a new sense, a sixth sense beyond our five. The process and methods behind the works are a crucial part of her creative practice. She has many interesting works, but I really admire one called “rock scissors paper game with face” (2014). Using FaceOSC and ofxFaceTracker (by Kyle McDonald) from GitHub, she figured out how to capture data about the user’s expression. She then made a face version of Rock-Paper-Scissors, where to represent Paper you open your mouth wide, for Scissors you stretch your lips wide, and for Rock you make your face bigger (meaning your face should be closer to the camera). Rock-Paper-Scissors is known as a game of luck, but she noted that this version is more about effort, because you have to change your face intensely to make the right sign.
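The mapping she describes could be sketched roughly like this; the field names (mouthHeight, mouthWidth, faceScale) and thresholds are invented stand-ins for the expression data FaceOSC actually reports:

```javascript
// Toy classifier (not Seo's actual code): turn a face reading into a
// Rock-Paper-Scissors sign the way the post describes.
function classifySign(face) {
    if (face.mouthHeight > 0.6) return "paper";    // mouth wide open
    if (face.mouthWidth > 0.7) return "scissors";  // wide, stretched lips
    if (face.faceScale > 1.3) return "rock";       // face close to the camera
    return "none";
}

console.log(classifySign({ mouthHeight: 0.8, mouthWidth: 0.2, faceScale: 1.0 })); // "paper"
console.log(classifySign({ mouthHeight: 0.1, mouthWidth: 0.9, faceScale: 1.0 })); // "scissors"
console.log(classifySign({ mouthHeight: 0.1, mouthWidth: 0.2, faceScale: 1.5 })); // "rock"
```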

## SooA Kim: Project-10-Sonic-Sketch

I made my own audio segments, where if you press “play all” you can hear the music in one piece. Each character is assigned with different musical instruments.

``````/* SooA Kim
Section C
sookim@andrew.cmu.edu
Week 10 Project - Sonic Sketch
*/

var inst1;
var inst2;
var inst3;
var bell;
var img;

var myImageURL = "https://i.imgur.com/kxau9oQ.jpg";

function preload() {
    img = loadImage(myImageURL);
    // the loadSound() calls that load inst1, inst2, inst3, and bell
    // were lost from this post; each sound is loaded here before its
    // volume is set
    inst1.setVolume(1);
    inst2.setVolume(1);
    inst3.setVolume(1);
}

function setup() {
    // you can change the next 2 lines:
    createCanvas(400, 300);

    //======== call the following to use sound =========
    useSound();
}

function soundSetup() { // setup for audio generation
    // you can replace any of this with your own audio code:
}

function draw() {
    background(156, 202, 255);
    image(img, 0, 0);

    //play all instruments
    fill(255);
    textSize(32);
    textAlign(CENTER, CENTER);
    textFont("Apple Chancery");
    text("Play All", width / 2, 250);
}
//plays each instrument when mouse is pressed
//plays each instrument when mouse is pressed
function mousePressed() {
    if (mouseX > 0 && mouseX < 100 && mouseY > 0 && mouseY < 200) {
        inst1.play();
    } else {
        inst1.pause();
    }

    if (mouseX > 100 && mouseX < 200 && mouseY > 0 && mouseY < 200) {
        inst2.play();
    } else {
        inst2.pause();
    }

    if (mouseX > 200 && mouseX < 300 && mouseY > 0 && mouseY < 200) {
        inst3.play();
    } else {
        inst3.pause();
    }

    if (mouseX > 300 && mouseX < 400 && mouseY > 0 && mouseY < 200) {
        bell.play();
    } else {
        bell.pause();
    }

    if (mouseX > 0 && mouseX < 400 && mouseY > 200 && mouseY < 300) {
        playAll();
    }
}

function playAll(){
    inst1.play();
    inst2.play();
    inst3.play();
    bell.play();
}
/*
function pauseAll(){
    inst1.pause();
    inst2.pause();
    inst3.pause();
    bell.pause();
}
*/

``````

## SooA Kim: Looking Outwards-09

I’m citing week 5 of Looking Outwards, whose topic was 3D computer graphics. As someone who follows this artist’s work on Instagram, I thought Sydney Salamy’s take on Tyson Ibele’s work would be engaging.

I agree with my peer’s assessment of the psychological play of the imagery in the work. The artist draws the viewer in by using realistic textures and simulations to create the animation. Nowadays, most 3D animation software packages ship with these simulations already scripted, which gives 3D artists the flexibility to simply apply a soft-body or cloth-simulation tag to their objects/polygons. Some of these packages, such as Maya, also provide a script window where VFX artists can write and run their own Python code. Every time I watch this video post, it gives me chills, as if my limbs were getting cut off; pretty similar to the reaction when you see someone get a paper cut. And yet there is a weird visual pleasure in watching his generative 3D animation loop.

## SooA Kim – 09 – Portrait

For this project, I did a portrait of my dad and our two dogs, using text emojis to generate the pixels. It took a lot of time to cover the image with text, so I applied mouseMoved() to generate the photo faster.

``````/* SooA Kim
sookim@andrew.cmu.edu
Section C
Project-09-Portrait
*/

var baseImage;
var dogs = ["♡°▽°♡", "(◕‿◕)♡"];
var brush = ["°˖✧◝(⁰▿⁰)◜✧˖°"];

var myImageURL = "https://i.imgur.com/Jvb8wfq.jpg";

function preload() {
    baseImage = loadImage(myImageURL);
}

function setup() {
    createCanvas(baseImage.width, baseImage.height);
    imageMode(CENTER);
    noStroke();
    background(255);
    frameRate(5000000); // request a very high frame rate so the text fills in quickly
}

function draw() {
    var bx = random(width); // randomize pixels to initialize
    var by = random(height);

    var cx = constrain(floor(bx), 0, width - 1);
    var cy = constrain(floor(by), 0, height - 1);

    var paintPix = baseImage.get(cx, cy);

    noStroke();

    fill(paintPix);
    textSize(random(12));
    text(dogs[0], random(0, width / 2), by); // filling the left half of the canvas

    textSize(random(12));
    fill(paintPix);
    text(dogs[1], random(width / 2, width), by); // filling the other half of the canvas
}

//using mouse to generate faster
function mouseMoved(){
    var paintPixMouse = baseImage.get(mouseX, mouseY);
    fill(paintPixMouse);
    textSize(15);
    text(brush, mouseX, mouseY);
}

``````

## SooA Kim: Looking Outwards – 08

Béatrice Lartigue is a French new-media artist and designer, as well as an art director and teacher in Paris. She creates interactive installation works with designers, artists, and others with technological backgrounds, including Lab212, a Paris-based pluridisciplinary art collective. She has a background in architecture, where the physical setting matters, and it shapes her installation work, bringing a poetic and humanistic approach. I admire one of her projects, “Portée, Monumental Unfold of a Music Score” (2014). This interactive installation gives the audience a collective, multi-sensorial experience: electroluminescent threads are installed with a grand piano inside a church, and visitors are invited to wander around and experience the qualities of music as they touch and trigger the luminous threads. A musical note or melody is associated with each thread, and every time a thread vibrates from a visitor’s interaction, it triggers the electro-mechanical grand piano to play that melody. There is a lot to learn from her working process; she makes her ideas clear to collaborators by providing everything from initial concept sketches to a blueprint storyboard of how the project will be executed.
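The thread-to-note interaction described above could be sketched like this; the particular note assignments are invented for illustration:

```javascript
// Hedged sketch of Portée's interaction model: each luminous thread is
// associated with a note, and a vibrating thread asks the player piano
// for that note. The pairings below are made up.
var threadNotes = ["C4", "E4", "G4", "B4"];

function onThreadVibrated(threadIndex) {
    // in the installation this message would drive the
    // electro-mechanical grand piano
    return "play " + threadNotes[threadIndex % threadNotes.length];
}

console.log(onThreadVibrated(2)); // "play G4"
```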

## SooA Kim – Looking Outwards – 07

They Rule is a data-visualization site where you can browse and identify the relationships of the US ruling class. It shows you the boards of the most powerful U.S. corporations, such as Google, Yahoo, and Amazon. If you click one of the board members at one of the companies, it shows the connections that individual has to the other companies he or she is involved in. In the process, the visualization expands from a single connection into a web of people and companies, showing how everything is connected to everything else. This information design reveals the structure of large corporations and the distribution of power in the U.S. economy.

Website: Theyrule.net
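The core idea behind the visualization can be sketched as a tiny shared-director lookup; the companies and names here are made up, not They Rule's actual data:

```javascript
// Minimal sketch of They Rule's underlying structure: boards as name
// lists, and connections found via directors two companies share.
var boards = {
    "Company A": ["Smith", "Jones", "Lee"],
    "Company B": ["Lee", "Park"],
    "Company C": ["Jones", "Park", "Kim"]
};

function sharedDirectors(c1, c2) {
    // a shared director is the edge that links two company nodes
    return boards[c1].filter(function (name) {
        return boards[c2].indexOf(name) !== -1;
    });
}

console.log(sharedDirectors("Company A", "Company B")); // ["Lee"]
console.log(sharedDirectors("Company A", "Company C")); // ["Jones"]
```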

## SooA Kim: Project-07-Curves

For this project, I tried to understand how curves work using cos() and sin(), so I spent time figuring out how to draw a hypotrochoid after numerous failed attempts at other curve forms.

Other forms that appear when you move your mouse:

``````/*SooA Kim
Section C
sookim@andrew.cmu.edu
Project 7: Composition with Curves
*/

var nPoints = 360;
var nPoints1 = nPoints * 10;
var angle = 1;

function setup() {
    createCanvas(480, 480);
}

function draw() {
    frameRate(24);
    var g = map(mouseX, 0, 480, 255, 0); // changing green color
    background(255, g, 100);
    angle += 10;

    // replicate the hypotrochoid in a grid using nested loops
    for (var px = 0; px < 1000; px += 240) {
        for (var py = 0; py < 1000; py += 240) {
            push();
            translate(px, py);
            rotate(angle);
            Hypotrochoid();
            pop();
        }
    }
    drawCurveOne();
}

function Hypotrochoid() {
    var r = map(mouseX, 0, 480, 0, 255);  // red channel follows mouseX
    var bl = map(mouseY, 0, 480, 0, 255); // blue channel follows mouseY

    strokeWeight(1.5);
    stroke(r, 255, bl);
    noFill();
    beginShape();

    // hypotrochoid parameters: fixed-circle radius a, rolling-circle
    // radius b, and distance h from the rolling circle's center
    var a = map(mouseX, 0, 480, 10, 120);
    var b = map(mouseY, 0, 480, 2, 10);
    var h = map(mouseY, 0, 480, 40, 120);

    for (var i = 0; i < nPoints1; i++) {
        var t = map(i, 0, nPoints1, 0, TWO_PI);
        // Reference from <mathworld.wolfram.com/Hypocycloid>
        var x = (a - b) * cos(t) + h * cos((a - b) / b * t);
        var y = (a - b) * sin(t) + h * sin((a - b) / b * t);

        vertex(x, y);
    }

    endShape();
}

function drawCurveOne() {
    stroke(255);
    strokeWeight(1);
    noFill();
    beginShape();

    // mouseX controls the step size between vertices of the curve
    var nuV = map(mouseX, 0, width, HALF_PI, TWO_PI);

    for (var i = 0; i < nPoints; i += nuV) {
        // applied i to cos() and sin() to see what happens
        var x = 180 * cos(i) + width / 2;
        var y = 180 * sin(i) + height / 2;

        vertex(x, y);
    }
    endShape();
}

``````

## SooA Kim: Looking Outwards – 06

Takeshi Murata is an American media artist; one of his works, Monster Movie (2005), uses a digital file that has been datamoshed. In his work he tends to use video and animation techniques to create this datamosh effect. I enjoy watching the randomness of the images and pixels that pop out as a result of datamoshing. I believe he achieves the effect by manipulating the different kinds of compressed video frames: I-frames, P-frames, and B-frames. Removing one of these frame types means the video player keeps decoding without realizing that the underlying image content has changed. The result is datamoshing: an abstract motion of pixels, as the remaining frames carry their predictions onto the wrong imagery.
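A toy model of why removing I-frames moshes the image (this is just an illustration of the decoding idea, not Murata's actual tooling; real P-frames carry motion vectors, not simple per-pixel deltas):

```javascript
// Each "I" frame is a full refresh; each "P" frame is a delta applied
// to whatever frame came before. Dropping an I-frame makes the deltas
// smear over the previous scene's pixels instead.
function decode(stream) {
    var frame = [0, 0, 0];
    var out = [];
    for (var i = 0; i < stream.length; i++) {
        if (stream[i].type === "I") {
            frame = stream[i].pixels.slice();   // full refresh
        } else {                                // "P": apply deltas
            frame = frame.map(function (v, k) { return v + stream[i].deltas[k]; });
        }
        out.push(frame.slice());
    }
    return out;
}

var stream = [
    { type: "I", pixels: [10, 10, 10] },
    { type: "I", pixels: [90, 90, 90] },  // scene cut
    { type: "P", deltas: [1, 2, 3] }
];
console.log(decode(stream)[2]);                 // [91, 92, 93]
// Remove the second I-frame: the same deltas now mosh the old scene.
console.log(decode([stream[0], stream[2]])[1]); // [11, 12, 13]
```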