Looking Outwards 03

I’m not sure my chosen project fits the brief exactly, but its recency and potential make it quite relevant. Cornell published an article just four days ago on the world’s first 3D-printed home. An industrial-sized 3D printer pours layers of concrete in toothpaste-like rows; each row builds on the one beneath it until a fully inhabitable two-story home takes shape. Apart from sparing human labour, this hybrid manufacturing process produces minimal waste and creates more resilient buildings. The project combines 3D-printing technology with more conventional framing methods for the soundest structures. This hybrid approach creates great potential for “mass-customized” architectural projects, where advanced fabrication methods strategically combine different materials. These emergent methods will hopefully scale up to mixed-use developments and serve as a viable solution to housing shortages.

Looking Outwards: 03

Mingjing Lin’s parametric approach to fashion is fueled by her curiosity about the human body. Lin has created her own definitions of parametric thinking and design, both drawing inspiration from the body: she describes “parametric thinking 2.0” as an “emphasis on the awareness of the human body,” and designs parametrically in those terms. Lin’s body-oriented parametric design is inspirational to me because it promotes inclusivity through an unconventional medium. The dynamic body, paired with the fluidity of parametric modeling, creates a new approach to fashion. As the industry evolves to include body types beyond the standard, there is a greater need for inclusive fashion, and through parametric modeling, fashion can become much more accessible to the masses.

Project 03

This is my project of the moon phases!

sketch
// Natalie Koch
// nataliek
// Section A

// The Phases of the Moon

var diam = 300;
var value = 255;
var canvasX = 600;
var canvasY = 450;
function setup() {
    createCanvas(canvasX, canvasY);

}

function draw() {
    background(0);
    mouseX = constrain(mouseX, 25, 575); // keep the moon inside the canvas
    mouseY = constrain(mouseY, 225, 225); // lock the moon to a fixed height
    if (width/2 - mouseX >= 0) {
        diam = mouseX;
    } else { // size variations as the moon moves past center
        diam = width - mouseX;
    }
    value = (mouseX/canvasX)*255; // brightness follows horizontal position
    fill(255, 255, value);
    ellipse(mouseX, mouseY, diam, diam); // moon
    ellipse(50,30,10,10)
    ellipse(100,50,10,10)
    ellipse(120,70,10,10)
    ellipse(160,30,10,10)
    ellipse(200,60,10,10)
    ellipse(250,30,10,10) // stars
    ellipse(300,40,10,10)
    ellipse(350,20,10,10)
    ellipse(400,50,10,10)
    ellipse(450,30,10,10)
    ellipse(500,60,10,10)
    ellipse(550,30,10,10)
    fill(0)
    ellipse(-75,height/2,400,400) //black circles on sides so moon can crescent
    ellipse(675,height/2,400,400)
}
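The sketch’s size-and-brightness rule can be pulled out of p5.js into plain functions, which makes the piecewise logic easier to see and test. This is a sketch of that rule under the same globals as above (the function names are mine, not part of the original):

```javascript
// Piecewise rule from the sketch: the moon's diameter grows with mouseX
// until the center of the canvas, then shrinks symmetrically on the far side.
function moonDiameter(mx, canvasX) {
    if (canvasX / 2 - mx >= 0) {
        return mx;           // left half: diameter tracks mouseX
    }
    return canvasX - mx;     // right half: mirror of the left half
}

// The blue channel rises linearly with mouseX, tinting the moon yellow
// on the left (low blue) and white on the right (high blue).
function moonBlue(mx, canvasX) {
    return (mx / canvasX) * 255;
}
```

With canvasX = 600, the moon is largest (300 px) exactly at the canvas midpoint and shrinks toward either edge.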

LO 03: Reverberating Across the Divide by Madeline Gannon

This project revolves around using real-time context scanning as data input and converting it into physical objects through chronomorphologic modeling. The most interesting part is the integration of real-time scanning, computational generative modeling, and rapid prototyping tools to realize the design. The complexity of the object can be controlled through parameters in the computational modeling software, which in turn provides control over the model at both local and global scales.

True to its title, the final form reverberates with the sensitivity between the context, the computer-generated model, and the fabricated physical object.

Reverberating Across the Divide: Digital Design Meets Physical Context from Madeline Gannon on Vimeo.

Looking Outwards 03

For this assignment, I read about “Robots in Architecture.” I thought the crossover between these two disciplines was really interesting, specifically because, even though I don’t know much about them, I find robots fascinating. In this project, robots are used to help construct different architectural works. I don’t know much about the algorithms involved, but coding and engineering had to play a key role in making these robots function successfully. The website says that KUKA|prc is a program that simulates the different positions of the robot, and it cites a range of tools (such as Grasshopper) that coders can build upon to make sure the robot functions smoothly. The creators had to have a very specific vision in mind for the buildings, and an even greater handle on the coding needed for the robots to carry it out successfully. This is a great example of how STEM and the arts can intersect to form something powerful and innovative, and I would love to watch a project like this carried out firsthand.

Article link:
https://www.robotsinarchitecture.org/kukaprc

Watch the robot at work here:

Project 03 – Dynamic Drawing

  • move the mouse up and down to see changes in color
  • move the mouse left and right to change spacing between the squares
sketch
let angle = 0;
var size = 100

function setup() {
    createCanvas(700, 500);
    background(220);
    text("p5.js vers 0.9.0 test.", 10, 15);
}

function draw() {
    background(50);

    //square in the center
    push();
    translate(width/2, height/2);
    rotate(angle);
    fill(mouseX, 43, 172);
    stroke('black');
    strokeWeight(5);
    rectMode(CENTER);
    square(0, 0, 300);
    pop();

    //inner ring of squares
    for (let a = 0; a < radians(360); a += radians(30)) {
        push();
        translate(width/2, height/2);
        rotate(a);
        translate(0, size);
        rotate(angle);
        rectMode(CENTER);
        blendMode(MULTIPLY);
        fill(mouseY, 43, 100);
        square(0, 0, 200);
        pop();
    }

    //outer ring of squares
    for (let a = 0; a < radians(360); a += radians(30)) {
        push();
        translate(width/2, height/2);
        rotate(a);
        translate(0, size + 100);
        rotate(angle);
        rectMode(CENTER);
        blendMode(MULTIPLY);
        fill(mouseY, 43, 172);
        square(0, 0, 200);
        pop();
    }

        //makes it so that the squares don't disappear when mouseX goes to 700
        size = max(mouseX/2, 100);



    angle += radians(1);

}
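The translate-then-rotate pattern in the two loops above is equivalent to placing each square at an explicit point on a circle: rotating the offset (0, r) by angle a lands at (−r·sin a, r·cos a) relative to the center. As a rough standalone sketch of that equivalence (the function name and radius parameter are mine, not from the sketch):

```javascript
// Positions produced by: translate(cx, cy); rotate(a); translate(0, r);
// for a = 0, 30, ..., 330 degrees (count = 12 gives 30-degree spacing).
function ringPositions(cx, cy, r, count) {
    const positions = [];
    for (let i = 0; i < count; i++) {
        const a = (2 * Math.PI / count) * i;
        positions.push({
            x: cx - r * Math.sin(a), // rotated x of the offset (0, r)
            y: cy + r * Math.cos(a), // rotated y of the offset (0, r)
        });
    }
    return positions;
}
```

For a 700×500 canvas with r = 100, the first square sits directly below the center at (350, 350), and the rest step around the ring every 30 degrees.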

Blog-03

Designer Hasan Ragab creates immersive digital art using an AI text-to-image generator for Parametric Architecture. The model, called Midjourney, is hosted on a Discord server. After you input a prompt, the AI bot produces four variations of the result, and you can then generate further variations from the existing ones. Hasan Ragab brings his Egyptian heritage into his work, creating futuristic Egyptian temples and reimagining Gaudi-esque buildings. Ideas are processed through a “parametric copy-paste,” and new possibilities come with each variation. A perk of Midjourney is that it prioritizes artistic over realistic results. Taking less than a minute per image, Midjourney is also very accessible and efficient, as easy as playing a game on your phone. A community of architects and designers has formed around the model to push its boundaries. Hasan Ragab views AI tools as a new path in architecture. AI tools will shift our views on creativity; whether that is for better or worse is up for debate.

Project 3 – Dynamic Drawing

Controls:
– upper right corner = scale up
– lower left corner = scale down
– up & down = triangles move to the right
– click & hold on triangle = rotate
– hover over triangle = fade to white

Originally I wanted to make a field of triangles, where a triangle and its neighbors would flip out into a ripple. Initial generation of the triangular grid was easy, but when I wanted to proceed with more complicated maneuvers, I found myself going back and rewriting my code to make it more structured and to centralize all the important parameters my operations depended on.


In particular, I really struggled with the rotation implementation. P5’s transform operations (like rotate and scale) act as a blanket technique, operating on all the geometry called after them, whereas I wanted triangles to rotate individually, meaning I had to parse through every single triangle to find the one I wanted to rotate. When rotating, wiggling the mouse causes the rotation to flicker, because draw() keeps looping and updating to the new mouse position, so keeping the mouse still while rotating pieces is advised. The rest was relatively straightforward to implement.

sketch
// Tsz Wing Clover Chau
// Section E


function setup() {
    createCanvas(600, 600);
    background(220);
    text("p5.js vers 0.9.0 test.", 10, 15);
    frameRate(200);
}

var init = true;

var row = 0;
var col = 0;
var selCol = 0;
var selRow = 0;

var offset = 30;

var n = 3;
var l = 0;


var leftX = 0;
var rightX = 0;
var ptX = 0;
var leftY = 0;
var rightY = 0;
var ptY = 0;

var mX = 0;
var mY = 0;

var wait = 70;

var sizeCounter = 1;


let neighC = [255, 255, 255];

let triList = [];


function area(x1, y1, x2, y2, x3, y3) {
    return abs(((x1*(y2-y3)) + (x2*(y3-y1))+ (x3*(y1-y2)))/2);
}

function inbounds(x1, y1, x2, y2, x3, y3, mouseX, mouseY){
    let A = area(x1, y1, x2, y2, x3, y3);
    let A1 = area(mouseX, mouseY, x2, y2, x3, y3);
    let A2 = area(x1, y1, mouseX, mouseY, x3, y3);
    let A3 = area(x1, y1, x2, y2, mouseX, mouseY);
    // compare with a small tolerance: exact float equality can fail from rounding
    return abs(A - (A1 + A2 + A3)) < 0.01;
}



function draw() {
    noStroke();
    background(220);
    
    l = (width-(offset*(n-1)))/n;
    var h = sqrt(3)/2 * l; 

    // init triangle grid
    if (init){
        if (row %4 == 0){
            leftX = (col*(l+offset)) + offset;
            rightX = leftX + l;
            ptX = leftX + l/2;

            leftY = (row/2*(h+offset/2)) + offset;
            rightY = leftY;
            ptY =  leftY + h;

            midY = leftY + h/2;


        } else if (row %4 == 2) {
            leftX = ((col-0.5)*(l+offset)) + offset;
            rightX = leftX + l;
            ptX = leftX + l/2;


            leftY = ((row/2)*(h+offset/2)) + offset;
            rightY = leftY;
            ptY =  leftY + h;

            midY = leftY + h/2;

        } else if (row %4 == 3) {
            leftX = ((col - 0.5)*(l+offset)) + l/2 + offset*(3/2);
            rightX = leftX + l;
            ptX = leftX + l/2;

            leftY = (int(row/2)*(h+offset/2)) + h + offset;
            rightY = leftY;
            ptY = leftY - h; 

            midY = leftY - h/2;


        } else {
            leftX = ((col-1)*(l+offset)) + l/2 + offset*(3/2);
            rightX = leftX + l;
            ptX = leftX + l/2;

            leftY = (int(row/2)*(h+offset/2)) + h + offset;
            rightY = leftY;
            ptY = leftY - h;  

            midY = leftY - h/2;
        }
        midX = ptX;

        var cShift = false;
        var a = 90;
        let selC = [0, 0, 0];

        append(triList, [leftX, leftY, rightX, rightY, ptX, ptY, selC, cShift,
                         a, midX, midY, row, col]);

    } else {

        //controlling SCALE
        if (wait == 0 && mouseX < width/8 && mouseY < height/8){
            sizeCounter += 0.01;
        } else if (sizeCounter > 1 && mouseX > width/4 && mouseY > height*3/4){
            sizeCounter -= 0.01;
        }
        scale(sizeCounter);
        
        

        for (let i = 0; i< triList.length; i++) {
            mX = mouseX;
            mY = mouseY;
            elem = triList[i];


            fill(elem[6]);

            //controlling POSITION
            if ((mouseY > elem[2] && mouseY > elem[5]) ||
                (mouseY > elem[2] && mouseY < elem[5])){
                    elem[0] -= 0.5;
                    elem[2] -= 0.5;
                    elem[4] -= 0.5;
                    elem[9] -= 0.5;
                }

            if (inbounds(elem[0], elem[1], elem[2], elem[3], 
                         elem[4], elem[5], mX, mY)) {
                elem[7] = true;

                //controlling SHADE
                if (elem[7]){
                    for (let i = 0; i< 3; i++){
                        elem[6][i] += 10;
                    }
                }
                // controlling ROTATION  (don't move mouse during rotation - it flickers)
                if (mouseIsPressed){
                    push();
                    fill(255, 255, 255);
                    translate( elem[9], elem[10]);
                    rotate(radians(elem[8])); // 2D rotate; pivot handled by the translate calls
                    translate(-elem[9], -elem[10]);
                    triangle(elem[0], elem[1], elem[2], elem[3], elem[4], elem[5]);
                    pop();

                    elem[8] += 3;
                }
            } else {
                elem[7] = false;
            }
            if (elem[8] == 90){
                triangle(elem[0], elem[1], elem[2], elem[3], elem[4], elem[5]);
            }
        }
    }


    if (col > n){
        row ++;
        col = 0;
        if (rightY +h > height) {
        init = false;
        }
    } else {
        col ++;
    }

    if (wait > 0){
        wait --;
    }
}
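The area/inbounds pair in the sketch is the classic barycentric area test: a point lies inside a triangle exactly when the three sub-triangles it forms with the edges sum to the whole triangle’s area. A standalone, p5-free version looks like this (the tolerance and names are my additions; the underlying math matches the sketch):

```javascript
// Unsigned triangle area via the shoelace formula.
function triArea(x1, y1, x2, y2, x3, y3) {
    return Math.abs((x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2);
}

// A point is inside the triangle iff the three sub-triangle areas
// (point vs. each edge) add up to the full area, within a float tolerance.
function pointInTriangle(x1, y1, x2, y2, x3, y3, px, py) {
    const A  = triArea(x1, y1, x2, y2, x3, y3);
    const A1 = triArea(px, py, x2, y2, x3, y3);
    const A2 = triArea(x1, y1, px, py, x3, y3);
    const A3 = triArea(x1, y1, x2, y2, px, py);
    return Math.abs(A - (A1 + A2 + A3)) < 1e-9;
}
```

For a point outside the triangle, the three sub-triangles overlap beyond the original, so their areas sum to more than A and the test fails.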

Looking Outwards 03: Computational Fabrication

Aguahoja (1 – 3)
Contributors: Neri Oxman & MIT Mediated Matter Group
2018 – 2021

Physical installation at MIT Media Lab in February 2018

The project I have chosen is the Aguahoja collection (Aguahoja 1 & Aguahoja 2) by the Mediated Matter Group at MIT: a series of pavilions and associated artifacts made from digitally fabricated biodegradable composites (i.e., a cellulose-chitosan-pectin-calcium carbonate compound).
After use, these ‘structural skins’ can be programmed to degrade in water in a process called “environmental programming”.

Prototypes at MIT Media Lab in February 2018

What I admire is how ‘self-sufficient’ the project is and how integrally the ‘cradle-to-cradle’ aspect of its design concept was incorporated. The secondary load-bearing structure conventionally needed for stability and form is forgone in favour of a material with these features embedded in its molecular structure, giving it a unique elegance. I also appreciate the rigor the researchers put into developing their ‘library’ of functional biopolymers, allowing them to develop an extensive range of biocomposites that can respond to different stimuli (temperature, humidity, light, etc.).

According to the paper “Flow-based Fabrication: An integrated computational workflow for design and digital additive manufacturing of multifunctional heterogeneously structured objects”, published by the Mediated Matter Group in 2020, their workflow “encodes for, and integrates domain-specific meta-data relating to local, regional and global feature resolution of heterogeneous material organisations.” I interpreted this to mean that they had a mesh-free geometric primitive onto which they mapped the material properties and variable flow rates of various water-based materials, and then tested and demonstrated the physical properties of these simulations with a robot arm and a multi-syringe, multi-nozzle deposition system. This, in the context of another published paper, “Designing a Tree: Fabrication Informed Digital Design and Fabrication of Hierarchical Structures”, implies that the biomolecules of these biomaterials are deliberately chosen to “maximize desired basic-to-acidic and hydrophobic-to-hydrophilic transitions”, while “decay maps” show the degradation of the material over time in relation to various environmental factors. Clearly, a lot of categorization and material mapping occurs at the nano scale, followed most likely by an optimization algorithm (potentially a machine-learning program) that determines an optimal molecular organisation to be fed into the robot arm and multi-nozzle deposition system, depending on the environmental parameters encoded for.

Robot arm depositing composite material fibers

Arguably, this is a manifestation of the “Form Follows Function” creed, where an artifact or structure’s physical form emerges as a result of the functions it has to serve. This is supported by the quote: “The Aguahoja 1 platform is… where shape and material composition are directly informed by physical properties (eg. stiffness and opacity), environmental conditions (eg. load, temperature, and relative humidity) and fabrication constraints (eg. degrees-of-freedom, arm, speed, and nozzle pressure), among others.” Thus, it can be inferred that any artistic sensibilities exhibited by the work were encoded via the prioritization and curation of specific environmental and physical factors.

Links:
https://www.media.mit.edu/projects/aguahoja/overview/
https://www.media.mit.edu/projects/aguahoja-iii/overview/
https://web.archive.org/web/20211015194534/https://mediatedmattergroup.com/publications-2-1/2018/10/16/designing-a-tree-fabrication-informed-digital-design-and-fabrication-of-hierarchical-structures
https://web.archive.org/web/20211015184725/https://mediatedmattergroup.com/publications-1/2018/10/7/flow-based-fabrication-an-integrated-computational-workflow-for-design-and-digital-additive-manufacturing-of-multifunctional-heterogeneously-structured-objects

Project – 02-Variable Face

sketch
// Emily Franco
// efranco
// Section C

//color only randomized when page is loaded
var x = 0;

//-----SLIDER VARS-----
//stores latest mouseX position for slider
var xPos;
//stores past x positions
var pastXPos=0;
//bar height
var barH = 20;
//bar width
var barWidth = 10;
//tracks if mouse has been pressed
var pressed = 0;

//-----DEFAULT FACE VARS----
var eyeWidth = 16;
var eyeHeight = 24;

function setup() {
    createCanvas(640, 480);
    background(220);
    text("p5.js vers 0.9.0 test.", 10, 15);
}

function draw() {
	//reference for position of face elements
	var y_ref_pos = width/2.5;
	strokeWeight(0);
	background (138,176,162);	

	//header 
	fill(0);
	textSize (20);
	text ('Slide the arrow to pick a face for me.',10,barH+barWidth+20);

	//----EMOTION METER----
	//meter slider mark
	fill("black");
	triangle (((width/5)*2)+(width/10),barH-2,((width/5)*2)+(width/10)-3,barH-7,((width/5)*2)+(width/10)+3,barH-7);
	if (mouseIsPressed){
		//draw over 1st triangle background
		background (138,176,162);	
		xPos = mouseX;
		triangle (xPos,barH-2,xPos-3,barH-7,xPos+3,barH-7);
		pressed = 1;
	}else if (pressed==1){
		//draw over 1st triangle background
		background (138,176,162);	
		triangle (xPos,barH-2,xPos-3,barH-7,xPos+3,barH-7);
	}
	
	//meter
	fill (85,180,220); //blue
	//very happy
	rect(0,barH,(width/5),barWidth);
	//happy
	fill(193,230,90); //green
	rect(width/5,barH,(width/5),barWidth);
	//meh...
	fill(225,181,37); //yellow
	rect((width/5)*2,barH,(width/5),barWidth);
	//shock
	fill(252,65,18); //red
	rect((width/5)*3,barH,(width/5),barWidth);
	//angry
	fill(137,5,5); //dark red
	rect((width/5)*4,barH,(width/5),barWidth);

	//--------HAIR-------
	//back hair
	fill (104, 66, 17); //dark brown
	ellipse (width/2, y_ref_pos+28,260,400);

	//--------CLOTHES-------
	fill (220, 96, 46); //orange
	arc((width/2)-32+44,y_ref_pos+158,280,70,Math.PI,0);
	//shirt 
	rect((width/2)-87,y_ref_pos+140,181,180);

	//------DEFAULT FACE-----
	strokeWeight (.25);
	//base ears 
	fill (238, 217, 197); //beige
	ellipse ((width/2)-106,y_ref_pos,32,60);
	ellipse ((width/2)+106,y_ref_pos,32,60);
	//neck 
	fill (238, 217, 197);//beige
	ellipse((width/2)+1, y_ref_pos+130,90,60);
	strokeWeight (0);
	rect((width/2)-44, y_ref_pos+90,90,40);
	//base face
	stroke("black");
	strokeWeight (.5);
	ellipse (width/2,y_ref_pos,200,232);

	if (pressed == 1){
	//nose 
	strokeWeight (0);
	fill (229, 155, 99); //orange
	triangle (width/2,y_ref_pos-20,(width/2)-20,y_ref_pos+40, width/2,y_ref_pos+38);
	}

	//-----EXPRESSIONS----
	//mouse position over emotion meter changes face expression
	//VERY HAPPY
	if (xPos<width/5){
			//outer eye
			strokeWeight (0.25);
			fill (242,239,234); //white
			stroke (58,37,22); //dark brown
			circle ((width/2)-46,y_ref_pos-20,eyeWidth+40);
			circle ((width/2)+46,y_ref_pos-20,eyeWidth+40);
			//eye pupil
			fill (58,37,22);  //dark brown
			circle ((width/2)-46,y_ref_pos-20,eyeWidth+30);
			circle ((width/2)+46,y_ref_pos-20,eyeWidth+30); 
			//eye highlight
			fill (242,239,234); //white
			circle ((width/2)-46,y_ref_pos-20,eyeWidth);
			circle ((width/2)+46,y_ref_pos-20,eyeWidth);
			//eye small highlights
			fill (242,239,234); //white
			ellipse ((width/2)-56,y_ref_pos-30,eyeWidth-10);
			ellipse ((width/2)+56,y_ref_pos-30,eyeWidth-10); 
			//mouth 
			strokeWeight (1);
			stroke("black");
			fill (233, 161, 135); //pink
			arc((width/2)-2,y_ref_pos+55,80,50,0,3.15);
			line ((width/2)+38,y_ref_pos+55,(width/2)-42,y_ref_pos+55);
			//cheeks
			fill (233, 161, 135); //pink
			strokeWeight (0);
			circle((width/2)+54,y_ref_pos+30,40);
			circle((width/2)-60,y_ref_pos+30,40);
		} 
		//HAPPY
		else if (xPos<(width/5)*2 && xPos>=width/5){
			//eyes 
			fill (58,37,22); //dark brown
			ellipse ((width/2)-44,y_ref_pos-20,eyeWidth,eyeHeight);
			ellipse ((width/2)+44,y_ref_pos-20,eyeWidth,eyeHeight);
			//mouth
			strokeWeight (1);
			stroke("black");
			noFill();
			arc((width/2)-2,y_ref_pos+70,20,14,0,3);	
			//cheeks
			fill (233, 161, 135); //pink
			strokeWeight (0);
			circle((width/2)+44,y_ref_pos+30,40);
			circle((width/2)-50,y_ref_pos+30,40);
		} 
		//MEH
		else if (xPos<(width/5)*3 && xPos>=(width/5)*2){
			//mouth 
			strokeWeight (1);
			stroke("black");
			line ((width/2)+40,y_ref_pos+65,(width/2)-40,y_ref_pos+65);
			//cheeks
			fill (233, 161, 135); //pink
			strokeWeight (0);
			circle((width/2)+44,y_ref_pos+30,40);
			circle((width/2)-50,y_ref_pos+30,40);
			//eyes
			fill (58,37,22); //dark brown
			circle ((width/2)-46,y_ref_pos-20,eyeWidth);
			circle ((width/2)+46,y_ref_pos-20,eyeWidth);
		}
		//SHOCK 
		else if (xPos<(width/5)*4 && xPos>=(width/5)*3){
			//eyes 
			fill (58,37,22); //dark brown
			ellipse ((width/2)-44,y_ref_pos-30,eyeWidth+6,eyeHeight*2);
			ellipse ((width/2)+44,y_ref_pos-30,eyeWidth+6,eyeHeight*2);
			//mouth 
			strokeWeight (1);
			stroke("black");
			fill (233, 161, 135); //pink
			arc((width/2)-2,y_ref_pos+95,40,90,3.15,0);
			line((width/2)+18,y_ref_pos+95,(width/2)-22,y_ref_pos+95);
			//cheeks
			fill (233, 161, 135); //pink
			strokeWeight (0);
			circle((width/2)+60,y_ref_pos+30,40);
			circle((width/2)-60,y_ref_pos+30,40);
		} 
		//ANGRY
		else if (xPos>(width/5)*4){
			//eyes 
			fill (58,37,22); //dark brown
			arc((width/2)-50,y_ref_pos-20,50,25,0,3.15);
			arc((width/2)+50,y_ref_pos-20,50,25,0,3.15);
			//eyebrows
			strokeWeight (3);
			stroke(58,37,22); //dark brown
			line ((width/2)-75,y_ref_pos-35,(width/2)-25,y_ref_pos-25);	
			line ((width/2)+75,y_ref_pos-35,(width/2)+25,y_ref_pos-25);	
			//mouth
			strokeWeight (2);
			stroke("black");
			noFill();
			arc((width/2)-2,y_ref_pos+80,30,40,3.1,0);
		
		}

	//------BODY-----
	//shoulders
	strokeWeight (0);
	fill (238, 217, 197); //beige
	circle((width/2)-120, y_ref_pos+182,80);
	circle((width/2)+126, y_ref_pos+180,80);
	//arms
	rect((width/2)-160,y_ref_pos+180,80,140);
	rect((width/2)+86,y_ref_pos+180,80,140);

	//-----DETAILS----
	//earings
	fill (111, 115, 210); //purple
	square ((width/2)-114,y_ref_pos+30,16); 
	square ((width/2)+100,y_ref_pos+30,16);
	//bangs
	push();
	strokeWeight(0);
	fill (104, 66, 17); //brown 
	rotate (-0.9);
	ellipse (width/2-230, y_ref_pos+140,40,150);
	rotate (1.7);
	ellipse (width/2-5, y_ref_pos-330,40,150);
	pop();
	//hairclip 
	//random color generated in first loop and only changes when page is reloaded
	x = x + 30;
	if (x == 30) {
		//first frame only: pick the random color once
		r = random(200);
		g = random(200);
		b = random(200);
	}
	stroke(r, g, b);
	strokeWeight(4);
	line(width/2+50,y_ref_pos-60,(width/2)+80,y_ref_pos-80);
	//shirt details
	strokeWeight(8);
	stroke(r,g,b);
	point(width/2, y_ref_pos+200);
	

}
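The slider in this sketch maps xPos onto five equal-width emotion zones with a chain of if/else range checks. The same mapping can be written as a single index computation; here is a standalone sketch of that idea (the function name and labels array are mine), assuming the 640-pixel canvas width used above:

```javascript
// Map a slider x-position to one of five equal-width emotion zones,
// mirroring the if/else chain in the sketch (each zone is width/5 wide).
const EMOTIONS = ["very happy", "happy", "meh", "shock", "angry"];

function emotionFor(xPos, width) {
    const zone = Math.floor(xPos / (width / 5)); // 0..4 for in-range x
    const clamped = Math.min(Math.max(zone, 0), 4); // guard the canvas edges
    return EMOTIONS[clamped];
}
```

With width = 640, each zone is 128 px wide, so x = 0 reads “very happy” and x = 639 reads “angry”, matching the meter’s left-to-right color bands.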