Project 05 – Wallpaper

I wanted to make a wallpaper that looked like a stitched flower quilt pattern.

sketch

//SRISHTY BHAVSAR
//15-104 PROJECT 05 
//SECTION C



// COLORS
var w = 255; // white
var lbrown = [196, 164, 132]; // light brown (stored as an RGB array)

// lengths

var s = 50; // square (diamond) side length



function setup() {
    createCanvas(600, 600);
    background(194,197,201);
    text("p5.js vers 0.9.0 test.", 10, 15);
}

function draw() {
    background(194,197,201); //light blue


    // RED DIAMOND LOOP
    push();
    translate(300,-300);
    rotate(radians(45)); // rotates squares to be diamonds
    for( var x = 0; x < 1200; x+= s/2) {
        for( var y = 0; y < 1200; y+= s/2){
            reddiamonds(x,y);
        } 
    }

    // FLOWER DIAMOND LOOP
    pop();
    push();
    translate(265,-300);
    rotate(radians(45));
    noFill();
    for( var i = 0; i < 2000; i+= s) {
        for( var j = 0; j < 2000; j+=s) {
            flowerdiamonds(i,j);
            //square(i,j,s);
        } 
    }

    pop();

}

function reddiamonds(x,y) {
    translate(x,y); // origin moves along row
    push();
    stroke(183, 113, 121, 70); // light red
    strokeWeight(2);
    noFill();
    square(x, y, s); // side length s = 50
    pop();
    translate(-x,-y); // move the origin back
}

function flowerdiamonds(i,j) {
    // lacy white rim of ellipses that traces the diamond
    noFill();
    stroke(w);
    strokeWeight(1);
    translate(i,j);
    // draw 4 lacy rims that together form a square
    push();
    for (var x = 0; x < 60; x +=10) {
        for(var y = 0; y <10; y += 10) {
            ellipse(x,y, 6, 4);
        }
    }
    rotate(radians(90));
    for (var x = 0; x < 60; x +=10) {
        for(var y = 0; y <10; y += 10) {
            ellipse(x,y, 6, 4);
        }
    }

    translate(0, -50);
    for (var x = 0; x < 60; x +=10) {
        for(var y = 0; y <10; y += 10) {
            ellipse(x,y, 6, 4);
        }
    }
    translate(50,50);
    rotate(radians(-90));
    for (var x = 0; x < 60; x +=10) {
        for(var y = 0; y <10; y += 10) {
            ellipse(x,y, 6, 4);
        }
    }

    pop();

    //FLOWER STEM
    push();
    translate(-4,-30);
    rotate(radians(-40));
    noFill();
    stroke(w);
    strokeWeight(1);
    curve(6, 30, 59, 50, 60, 80, 40, 40);
    pop();


    //FLOWER PETALS
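    // five petals: after the first petal is drawn, the coordinate system is
    // rotated 72 degrees (360/5) and re-translated before each remaining
    // petal, so the ellipses fan out around the flower's center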
    push();
    strokeWeight(1);
    fill(196, 164, 132); // light brown
    translate(6,6);
    ellipse(10,18,13,9);
    rotate(radians(72));
    translate(6,-30);
    ellipse(10,18,13,9);
    translate(-1,-67);
    rotate(radians(72));
    ellipse(10,18,13,9);
    rotate(radians(72));
    translate(-23,-71);
    ellipse(10,18,13,9);
    rotate(radians(72));
    translate(-8,-77);
    ellipse(10,18,13,9);
    pop();
    translate(-i,-j);

}

Looking Outwards 05 – Thanos’s Creation: 3D Computer Graphics in Marvel

Thanos, created by the studio Digital Domain

Over the last 12 years of MCU filmmaking, Marvel has worked with many VFX studios, such as Weta Digital, Framestore, and Industrial Light & Magic. Nearly every major 3D computer graphics package was used on the films: Maya, 3ds Max, and Modo for modeling and animation; ZBrush and Mudbox for sculpting; Mari and Substance Painter for texture painting; and Nuke alongside After Effects for compositing 3D projections.

To create Thanos, Digital Domain worked with Marvel Studios to produce effects shots using Masquerade, a facial-capture application built on machine learning algorithms. Over 340 Digital Domain artists created 513 shots. The system was developed and tested for three to four months before filming, and it can capture a high-resolution image of an actor’s face at a rate of 40 to 50 frames per second.

Thanos was played by the actor Josh Brolin. It was important to Digital Domain that Thanos’s movements feel organic and realistic, so motion-capture cameras were used: Brolin wore a mocap suit and a helmet rig with cameras and motion-capture dots to record his movements. Digital Domain’s facial capture picked up the smallest details, such as the wrinkles and curvature of Brolin’s face. From there, the animation team could enhance features of the face, like the eyes, until Brolin’s face was transformed into Thanos’s purple one.

This project and artwork interest me because I had no idea that so many programs and machine learning algorithms are used to create fictional characters in movies featuring real humans. Rather than going through the struggle of using prosthetics or costumes to create a villain like Thanos, they were able to create an animated character that could then be used throughout the film.

Sources:

https://digitaldomain.com/case-studies/avengers-infinity-war/

https://inspirationtuts.com/what-3d-software-does-marvel-use/

Project 04 – String Art

Sunset at the Beach

//Srishty Bhavsar
//15-104
//Section C
var dx1;
var dy1;
var dx2;
var dy2;
var numLines = 50;

function setup() {
    createCanvas(400, 300);
    background(200);
    text("p5.js vers 0.9.0 test.", 10, 15);
    line(50, 50, 150, 300);
    line(300, 300, 400, 100);
}




function draw() {
    background(193,242,254); // sky blue background
    // step sizes: divide the span between each guide line's endpoints into numLines increments
    dx1 = (150-50)/numLines;
    dy1 = (300-50)/numLines;
    dx2 = (350-300)/numLines;
    dy2 = (100-300)/numLines;
    
// faint mountains in background
    var x1 = -100;
    var y1 = -30;
    var x2 = 500;
    var y2 = -200;

    for (var i = 0; i <= numLines; i += 1) {
        x1 += dx1;
        y1 += dy1;
        x2 += dx2;
        y2 += dy2;
        push();
        stroke(167,84,41,40);
        translate(200,150);
        rotate(PI/3); // flipped so they look like mountains
        line(x1/2, y1, x2, y2);
        pop();
        push();
        stroke(167,84,41,30);
        translate(100,150);
        rotate(PI/3);
        line(x1/2, y1, x2, y2);
        pop();
    }


    for (var i = 0; i <= numLines; i += 1) {

        var x1 = 0;
        var x2 = (i*width)/50;
        var y1 = (i*width)/50;
        var y2 = height;

        //Ocean blue waves (light to dark)
        stroke(219,247,253);
        line(x1 + 50, y1 + 100, x2, y2);
        stroke(0,157,196); // brightest blue
        line(x1, y1 + 80, x2, y2);
        line(width + (x2), height - y1, x2, y2);
        stroke(1,90,112); // oceanside blue
        line(width+500, height-y1, x2, y2);
        line(width+500, height-y1, x2 + 100 , y2);
        line(x1, y1 + 150, x2, y2);
        line(x1, y1 + 200, x2, y2);


        //sand
        stroke(246, 240, 210); // dark sand color
        line(x1, height - 20, x2 - 100, height);
        stroke(205, 170, 109); //lighter sand color
        line(x1, height-30, x2 + 10, height);
        stroke(255,255,240); //ivory sand color
        line(x1, height -10, x2 -100, height);

    }

    //setting sun
     for (var i = 0; i <= 50 + numLines; i += 1) {
        
        // sun sets position above water
        var x1 = 200;
        var y1 = 250;
        stroke(250, 180, 20, 100); // sunny yellow 
        strokeWeight(2);
        push(); // save the transform state
        translate(x1, y1);
        rotate((i*PI)/numLines); // rotate each ray about the sun's center
        line(0, 0, 0, 100); // rays
        pop(); // restore the transform state

    }


    noLoop();
}


Srishty’s Looking Outwards: Sound Art

Purform’s White Box, Audio Visual Performance

The project I found is a performance called “Purform: White Box.” The performance was programmed by Jean-Sébastien Rousseau and Peter Dines, with music by Alain Thibault and visuals by Yan Breuleux. It consists of three white rectangular screens angled together to form a wall, across which black parametric and geometric visuals twist, break, and undergo many other transformations. When I first watched the video, the first thing that struck me was how different the audio sounded with earbuds and without them. When I wore my AirPods, I could hear the movement of the sound spatially because of their spatial audio feature. This made me think not only about the computational aspects of the audio within the project, but also about the technology we use to interpret it.

The music of the performance matched the visuals tightly, creating a surreal and daunting experience. The visual artist correlated sharp breaks in the imagery with the musician’s staccato notes, and vibrations and faster tempos tracked the speed of the visuals. The darkness of the exhibit room allowed the transforming visuals to stand out against the white background.

The main software technique used by the programmers is called white box testing: a testing technique in which the software’s internal structure, design, and code are examined to verify input-output flow and to improve design, usability, and testing. White Box is new software based on an older way of generating A/V compositions in real time, and it is the newest piece in a cycle that began with Black Box, which exhibited inputs, outputs, and transfer functions. Purform mixes two layers with their video tapes. Using Quartz Composer compositions, the programmers can easily change the relationship between the music and the video, since the piece is constructed from a database of clips controlled with Lemur.
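The tight coupling Purform achieves between sound and image can be sketched in miniature with p5.js and the p5.sound library. The example below is only my own illustrative sketch, not Purform’s Quartz Composer/Lemur pipeline: it maps the live microphone level to the size and twist of a simple geometric form, the same general idea of driving visuals from audio in real time.

// Minimal audio-reactive sketch (requires the p5.sound library).
// Illustrative only -- a stand-in for the kind of audio/visual mapping
// described above, not Purform's actual system.

var mic;

function setup() {
    createCanvas(400, 400);
    mic = new p5.AudioIn(); // capture the default microphone
    mic.start();
    rectMode(CENTER);
    noFill();
    stroke(255);
}

function draw() {
    background(0);
    var level = mic.getLevel(); // amplitude, roughly 0.0 - 0.3 in practice
    var size = map(level, 0, 0.3, 20, 380, true); // louder -> larger (clamped)
    push();
    translate(width / 2, height / 2);
    rotate(frameCount * 0.01 + level * 10); // louder -> faster twist
    rect(0, 0, size, size);
    pop();
}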

Sources: 

Srishty’s Looking Outwards 03 – Computational Fabrication

Eyebeam’s computational dress

In 2015, a startup called Eyebeam showed many of its computational fashion pieces at New York Fashion Week. Computational fashion aims to touch on themes such as aesthetics, ergonomics, and intellectual property. What I admire about computational fabrication in fashion is that it is extremely innovative and predictive of the future. Because traditional garments are made of fabric, they are fluid in nature, and fluidity has become a popular style in design and architecture today. Architects such as Zaha Hadid have been inspired by the fluidity of fashion pieces and have reflected it in their architecture.

However, the three main issues computational fashion aims to address are flexibility, recharge-ability, and affordability. 3D printing has become increasingly popular for designers when modeling, but one of the biggest downsides of a 3D-printed model is that it lacks malleability and flexibility. Designers at the company have found that by printing different materials and structuring them as interlocking springs, they can make a naturally stiff material hang loosely like fabric or textile. Designer Bradley Rothenberg prints in nylon, polymers, and sometimes metals. He used Python scripting for Rhino in the past, but now uses C++ so that he can create more advanced structures. By varying parameters in his code and the geometric properties they generate, he can better control the material properties.
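The idea of steering material behavior by varying geometric parameters in code can be illustrated with a small parametric sketch. The p5.js example below is my own rough analogy rather than Rothenberg’s Rhino/Python or C++ workflow: a single spacing parameter controls how dense an interlocking-ring lattice is, the kind of knob a designer might tune to trade stiffness for drape.

// Parametric lattice sketch: one parameter (spacing) controls how densely
// the rings overlap. Purely illustrative -- not the actual fabrication code.

var spacing = 24; // smaller spacing -> denser, stiffer-looking lattice

function setup() {
    createCanvas(400, 400);
    noFill();
    stroke(60);
    noLoop();
}

function draw() {
    background(245);
    // offset alternate rows by half a cell so the rings interlock
    for (var row = 0; row * spacing < height + spacing; row++) {
        var xOffset = (row % 2 === 0) ? 0 : spacing / 2;
        for (var x = xOffset; x < width + spacing; x += spacing) {
            ellipse(x, row * spacing, spacing * 1.4, spacing * 1.4);
        }
    }
}

function mousePressed() {
    spacing = random(12, 48); // clicking re-samples the parameter
    redraw();
}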

Fashion technologies need to work throughout the day, so an important factor for computational fashion designers is recharge-ability. Eyebeam’s project director advised against having to plug a garment into your smartphone because it is inconvenient. Instead, Professor Dan Steingart of Princeton University has been exploring energy options such as body heat, wind-up and solar power, and bendable batteries. The third important factor is affordability. The minimum printing resolution for 3D printing is 500 microns; because the resolution is not yet fine enough, significant investments will have to be made in fashion technology.

Source:

https://www.vice.com/en/article/53wdx3/haute-and-heavy-exploring-the-possibilities-of-computational-fashion

Srishty’s Project 2: Variable Face

sketch
//Srishty Bhavsar
//Section C
//15-104


// GLOBAL VARIABLES 

var headsize = 100;
var eye = 1;
var mouth = 1;
var earring = 1;
var shirtcol = 1;
var hair = 1;

function setup() {
    createCanvas(500, 300);
    background(220);
    text("p5.js vers 0.9.0 test.", 10, 15);
}

function draw() {
    if (mouseY < height/2) { // if the mouse is in the top half of the canvas the background is yellow; in the lower half, blue
        background(252,239,145);
    } else { 
        background(204, 204, 255);
    }
    strokeWeight(0);
    triangle(240,180,260,180,250,195); //mouth
    fill(164,116,73); // TAN
    rect(240,208,20,40); //neck

//HEAD CHANGE, HAIR CHANGE, EAR CHANGE 
    if (headsize <= 88 ) { 
        fill(0);
        ellipse(250,280,180,400); // hair     
        fill(164,116,73); // TAN
        rect(240,208,20,40); //neck
        ellipse(200,160,15,25); //ear
        ellipse(300,160,15,25); //ear 2 

        //EARRING CHANGE WITHIN IF      
        if (earring >= 10 & earring <= 20) {
            fill(255); // white
            ellipse(300,170,3,10); //earrings
            ellipse(200,170,3,10); //earrings
        } else if (earring >= 20 & earring <= 30) {
            fill('pink'); // pink
            ellipse(300,170,5,5); //earrings
            ellipse(200,170,5,5); //earrings 
        } else {
            fill(207,255,229); //mint
            rect(297, 170, 3, 8); // rect earrings
            rect(200, 170, 3, 8); // rect earrings
        }

        fill(189,154,122); //  skin color
        ellipse(250,150,100,120); // head + ears
        fill(0);
        arc(280, 105, 55, 70, 1, PI + QUARTER_PI, CHORD); //front bang
        rect(270,93,10,23); //bangs
        rect(260,93,10,23); //bangs
        rect(280,101,10,23); //bangs
        rect(289,110,9,23); //bangs
        triangle(200,140,210,103,260,82); // strand
        ellipse(225,133,17,10); // one eyebrow
        fill(189,154,122); // SKIN
        ellipse(225,135,17,10); // skin carve
 
    } else if (headsize >= 88 & headsize <= 100) {
        fill(132, 34, 34); // AUBURN HAIR
        ellipse(250,150,130,150); // TIED UP
        fill(164,116,73); // TAN
        rect(240,208,20,40); //neck
        ellipse(290,160,15,25); //ear adjusted slim
        ellipse(210,160,15,25); //ear 2 adjusted slim

        //EARRING CHANGE WITHIN IF
        if ( earring >= 10 & earring <= 20) {
            fill(255); // white
            ellipse(290,170,3,10); //earrings
            ellipse(210,170,3,10); //earrings
        } else if (earring >= 20 & earring <= 30) {
            fill('pink'); // pink
            ellipse(290,170,5,5); // circle earrings
            ellipse(210,170,5,5); // circle earrings 
        } else {
            fill(207,255,229); //mint 
            rect(290, 170, 3, 8); // rect earrings
            rect(210, 170, 3, 8); // rect earrings
        }

        fill(189,154,122); // skin color
        ellipse(250,150,75,120); // slim head + ears 
        fill(132, 34, 34);
        arc(280, 105, 55, 70, 1, PI + QUARTER_PI, CHORD); //front bang
        rect(270,93,10,23); //bangs
        rect(260,93,10,23); //bangs
        rect(280,101,10,23); //bangs
        rect(289,110,9,23); //bangs
        fill(132, 34, 34); // AUBURN EYEBROWS
        ellipse(225,133,17,10); // one eyebrow
        fill(189,154,122); // SKIN
        ellipse(225,135,17,10); // skin carve

    } else {
        fill(108,25,96); // MAGENTA HAIR
        ellipse(250,280,200,400); // hair
        fill(164,116,73); // TAN
        rect(240,208,20,40); //neck
        ellipse(310,160,15,25); //ear adjusted wide
        ellipse(190,160,15,25); //ear 2 adjusted wide
        fill(63, 20, 20);
        ellipse(310,170,3,10); //earrings
        ellipse(190,170,3,10); //earrings 
        fill(189,154,122);
        ellipse(250,150,120,120); // widest head + wider hair + ears
        fill(108,25,96); // MAGENTA eyebrow
        ellipse(225,133,19,10); // one eyebrow
        ellipse(275,133,19,10); //second eyebrow
        fill(189,154,122); // SKIN
        ellipse(225,135,19,10); // skin carve
        ellipse(275,135,19,10); //skin carve 
        fill(108,25,96);
        triangle(180,140,210,103,255,80); // hair strand
    }   

    if (eye >= 10 & eye <= 20){
        fill(72,60,50); // BROWN 
        ellipse(225,145,10,10); // one eye
        ellipse(275,145,10,10); //second eye 
    } else if (eye >= 20 & eye <= 30) { 
        fill(72,60,50); // BROWN 
        ellipse(225,145,10,10); // one eye
        ellipse(275,145,10,10); //second eye
        fill(189,154,122); // SKIN
        ellipse(225,147,10,10); // one eye SKIN COLOR
        ellipse(275,147,10,10); //second eye SKIN COLOR
    } else if (eye >= 30 & eye <= 40) {
        fill(72,60,50); // BROWN
        ellipse(225,145,10,10); // one eye
        ellipse(275,146,10,10); //second eye
        fill(189,154,122); // SKIN
        ellipse(275,148,10,10); //second eye  
    } else {
        fill(72,60,50); // BROWN 
        ellipse(225,145,10,10); // one eye
        ellipse(275,145,10,10); //second eye

    }

    strokeWeight(0);
    fill(0); 
    stroke(72,60,50); // dark brown outline
    fill(242,212,215); // blush pink
    ellipse(220,170,10,5); //blush1
    ellipse(280,170,10,5); //blush2
    strokeWeight(.5);
    stroke(101,67,33); // DARK BROWN
    line(255,140,255,160); // nose long
    line(245,160,255,160); // nose short
    strokeWeight(0);

// SHIRT COLOR CHANGE 

    if (shirtcol >= 10 & shirtcol <= 20) {
        fill(79, 121, 66); //DARK GREEN
        ellipse(250,330,140,200); //arms
        fill(134,169,111);
        ellipse(250,360,115,280); // body
    } else if (shirtcol >= 20 & shirtcol <= 30) {
        fill(172, 112, 136); //DUSTY ROSE
        ellipse(250,330,140,200); //arms
        fill(222, 182, 171);
        ellipse(250,360,115,280); // body
    } else if (shirtcol >= 30 & shirtcol <= 40) {
        fill(241, 166, 97); //ORANGE
        ellipse(250,330,140,200); //arms
        fill(255, 216, 169);
        ellipse(250,360,115,280); // body    
    } else if (shirtcol >= 40 & shirtcol <= 50) {
        fill(134, 88, 88); //BROWNISH RED
        ellipse(250,330,140,200); //arms
        fill(142, 127, 127);
        ellipse(250,360,115,280); // body 
    } else if (shirtcol >= 50 & shirtcol <= 60) {
        fill(110, 133, 183); //SLATE BLUE
        ellipse(250,330,140,200); //arms
        fill(196, 215, 224);
        ellipse(250,360,115,280); // body     
    } else { 
        fill(79, 121, 66); //DARK GREEN
        ellipse(250,330,140,200); //arms
        fill(134,169,111);
        ellipse(250,360,115,280); // body
    }

// MOUTH CHANGE

    if (mouth >= 10 & mouth <= 20) {
        fill(255,182,193); // pink
        triangle(240,180,260,180,250,195); // MOUTH HAPPY
    } else if (mouth >= 20 & mouth <= 30) {
        fill(203, 76, 78); // light red
        ellipse(250,180,10,20); // MOUTH SHOCKED
    } else if (mouth >= 30 & mouth <= 40) {
        strokeWeight(1);
        stroke(0); //BLACK
        line(240,180,260,180); //PLAIN MOUTH
    } else {
        strokeWeight(1);
        stroke(0);
        line(240,180,260,180);
    }

}

function mousePressed() {
    //when the user clicks, the variables of the following features are reassigned to random values within their respective ranges
    
    headsize = random(75,120); // head width
    eye = random(10,40); // 4 eye types
    mouth = random(10,40); // 4 mouth types
    earring = random (10,30); // 3 earring types
    shirtcol = random(10,60); // 6 color shirts
} 

L.A. Philharmonic Light Show at the “Walt Disney Concert Hall.”

Srishty Bhavsar

One of the first buildings that caught my attention when I was younger was the Walt Disney Concert Hall by Frank Gehry in downtown Los Angeles. I remember being taken aback by its cluster of large, winged metal walls that stood out among the surrounding buildings. As I walked by the building, I noticed how whimsical, symphonic, and extravagant it was, and today I admire how fitting those characteristics are to its function as a hall for orchestras and bands. The building itself was designed using CATIA, a C++ software package designed for and used by aerospace engineers. Through this software, Gehry was able to achieve impeccable acoustics within the concert hall.

In 2018, the L.A. Philharmonic light show was an installation performance that transformed the facade of the Walt Disney Concert Hall at night. The installation was designed by Refik Anadol with Google Arts and Culture. Using deep neural networks, Anadol and Google created a data universe that translated data points from the L.A. Philharmonic’s digital archives into projections of light and color. The installation followed a parametric data-sculpture approach, in which machine learning algorithms sorted the music into thematic compositions. Inside the concert hall, visitors could interact with mirrored walls that showcased the Philharmonic’s archives. Anadol’s light show is a great example of how generative visual art, combined with audio and a computational structure, can create a visceral and immersive experience.
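To get a feel for what a data-to-light mapping means computationally, here is a toy p5.js sketch of my own. It only illustrates the general idea: each value in a small, made-up “archive” array is mapped to the hue and height of a band of light. It is not Anadol’s machine-learning pipeline.

// Toy data-to-light mapping. The archive array is hypothetical stand-in data,
// not the LA Philharmonic's actual digital archives.

var archive = [3, 15, 8, 22, 11, 30, 5, 18, 26, 9];

function setup() {
    createCanvas(500, 200);
    colorMode(HSB, 360, 100, 100);
    noStroke();
    noLoop();
}

function draw() {
    background(0);
    var bandW = width / archive.length;
    for (var i = 0; i < archive.length; i++) {
        var hue = map(archive[i], 0, 30, 180, 300);     // data value -> hue
        var bandH = map(archive[i], 0, 30, 20, height); // data value -> height
        fill(hue, 80, 100);
        rect(i * bandW, height - bandH, bandW - 2, bandH);
    }
}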

Sources:

https://en.wikiarquitectura.com/building/walt-disney-concert-hall/

https://www.archdaily.com/902277/s-walt-disney-concert-hall-will-be-lit-by-algorithms-in-dream-like-light-show