jwchou-LookingOutwards-2

This post features the work of an artist/designer right here at CMU! His name is Kyuha Shim, or “Q” for short. He teaches communication design in the School of Design.

Q has done multiple projects based on generative typography. He used custom software to generate patterns and forms that present traditional letterforms/type in interesting and unique ways.

His work on generative type: http://generativetypography.com

Typography is very delicate. It depends on a set of somewhat-flexible rules that govern letterforms, stroke thickness, and how close characters should sit to each other. I assume that in order for Q’s type to still read as type, his algorithms included many rules and boundaries to ensure that the generated type retained many of the important characteristics of standard type.

Q’s artistic sensibilities are reflected in the different typefaces, colors, textures, and patterns he used. By extension, the sensibilities of the type designers who designed the typefaces he used also influenced the pieces.

This project inspires me because it is still incredibly dynamic and beautiful, even though typography has a reputation for having a lot of intricate rules! If I were to change something, I would have loved to see some generative type at a bigger scale. A lot of his projects focus on a word or a short phrase. What would a paragraph look like if it were pushed through his algorithm?

 

jwchou-Project-02-VariableFace

sketch 2

//Jackie Chou
//Section E
//jwchou@andrew.cmu.edu
//Project 2

var eyeSize = 20;
var pupilX = 268;
var pupilY = 124;
var pupilSize = 6.5;
var armRotation = 0;
var MouthWidth = 50;
var MouthHeight = 30;
var RightEyeY = 120;
var mouth = 1;
var mouthExpressions = [1,2];


function setup() {
    createCanvas(640, 480);
}

function draw() {
	//background will change color based on mouseX coordinates
	background(172, 230, 226); // blue
    if(mouseX < (width/3)){
      background(226, 225, 163); // yellow
    }
    if(mouseX > (2/3)*width){
      background(244,189,100); //orange
    }

	noStroke();

	//right eye
	fill(0);
	ellipse(349, RightEyeY, eyeSize, 1.5*eyeSize);

	//head
	fill(43, 132, 212);
	beginShape();
	curveVertex(228, 169);
	curveVertex(228, 169);
	curveVertex(267, 85);
	curveVertex(330, 41);
	curveVertex(355, 78);
	curveVertex(350, 146);

	//torso
	curveVertex(347, 178);

	//legs
	curveVertex(358, 261);
	curveVertex(359, 311);
	curveVertex(367, 369);
	curveVertex(387, 409);
	curveVertex(404, 417);
	curveVertex(417, 437);
	curveVertex(386, 440);
	curveVertex(352, 435);
	curveVertex(343, 422);
	curveVertex(315, 365);

	//pelvis
	curveVertex(296, 337);

	//left leg
	curveVertex(286, 358);
	curveVertex(276, 396);
	curveVertex(273, 402);
	curveVertex(282, 416);
	curveVertex(289, 432);
	curveVertex(272, 438);
	curveVertex(246, 436);
	curveVertex(234, 429);
	curveVertex(236, 378);
	curveVertex(238, 340);
	curveVertex(238, 298);
	curveVertex(238, 298);
	curveVertex(229, 230);
	curveVertex(229, 230);
	endShape();

   
    //left eye
    fill(255);
    ellipse(267, 130, eyeSize, 1.5*eyeSize);
    fill(0);
    ellipse(264, 130, eyeSize, 1.5*eyeSize);
    fill(255);
    ellipse(pupilX, pupilY, pupilSize, pupilSize);


     //white body
    fill(255);
    beginShape();
    curveVertex(300, 307);
    curveVertex(265, 232);
    curveVertex(265, 192);
    curveVertex(277, 143);
    curveVertex(292, 104);
    curveVertex(309, 72);
    curveVertex(334, 55);
    curveVertex(338, 57);
    curveVertex(344, 87);
    curveVertex(344, 113);
    curveVertex(342, 149);
    curveVertex(340, 177);
    curveVertex(337, 201);
    curveVertex(340, 227);
    curveVertex(327, 264);
    curveVertex(312, 293);
    endShape();

    //mouth
    fill(0);

    //frown
    if(mouth == 1){
      arc(309, 155, MouthWidth, MouthHeight, PI, 2*PI);
    }
    //smile
    if(mouth == 2){
    	arc(309, 135, MouthWidth, MouthHeight, 0, PI);
    }

    //left arm
    fill(43, 132, 212);
    rotate(armRotation*PI);
    beginShape();
    curveVertex(258, 251);
    curveVertex(251, 280);
    curveVertex(232, 270);
    curveVertex(218, 240);
    curveVertex(201, 218);
    curveVertex(182, 180);
    curveVertex(162, 138);
    curveVertex(152, 120);
    curveVertex(151, 112);
    curveVertex(156, 107);
    curveVertex(162, 106);
    curveVertex(152, 120);
    curveVertex(151, 112);
    curveVertex(156, 107);
    curveVertex(162, 106);
    curveVertex(172, 114);
    curveVertex(183, 126);
    curveVertex(204, 144);
    curveVertex(226, 166);
    curveVertex(256, 196);
    curveVertex(260, 224);
    endShape();

    //right arm
    beginShape();
    curveVertex(340, 118);
    curveVertex(342, 205);
    curveVertex(345, 224);
    curveVertex(345, 256);
    curveVertex(347, 267);
    curveVertex(360, 243);
    curveVertex(366, 234);
    curveVertex(376, 220);
    curveVertex(388, 206);
    curveVertex(398, 191);
    curveVertex(410, 176);
    curveVertex(417, 167);
    curveVertex(430, 145);
    curveVertex(446, 118);
    curveVertex(453, 102);
    curveVertex(447, 96);
    curveVertex(439, 100);
    curveVertex(424, 108);
    curveVertex(409, 122);
    curveVertex(390, 134);
    curveVertex(370, 153);
    curveVertex(340, 188);
    endShape();
}

function mousePressed() {
    // when the user clicks, these variables are reassigned
    // to random values within specified ranges. For example,
    // 'pupilSize' gets a random value between 6.5 and 15.
    pupilX = random(258, 271);
    pupilY = random(122, 138);
    pupilSize = random(6.5, 15);
    armRotation = random(-0.01, 0);
    MouthWidth = random(15, 50);
    MouthHeight = random(15,40);
    RightEyeY = random(105,135);
    mouth = random(mouthExpressions); // randomly picks frown (1) or smile (2)

}

For this project, a friend suggested that I recreate Left Shark, who took over the internet after Katy Perry’s 2015 Super Bowl Halftime Show.

To plot all the distinct points, I opened an image file in Adobe Illustrator and used the “info” window to find the pixel coordinates for the various shapes.

Almost everything was pretty straightforward, but I did have a hard time getting the arms (fins) to move and the color of the background to change. In fact, I couldn’t figure out how to make the colors change on click, so I made them change based on mouse movement. I got the fins to rotate, but I need to learn how to change the point of rotation from the origin to another point.
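Both problems have fairly standard solutions in p5.js. Rotating about a point other than the origin means translating to the pivot, rotating, and translating back (wrapped in push()/pop() so the transform doesn’t leak into later shapes), and changing the background on click just means advancing an index inside mousePressed(), which fires once per click. Here is a sketch of the underlying logic; shoulderX/shoulderY are hypothetical pivot coordinates, not names from my code:

```javascript
// Math behind p5.js's translate(cx, cy); rotate(a); translate(-cx, -cy):
// rotate point (x, y) about pivot (cx, cy) by "a" radians.
function rotateAbout(x, y, cx, cy, a) {
  const dx = x - cx;
  const dy = y - cy;
  return [cx + dx * Math.cos(a) - dy * Math.sin(a),
          cy + dx * Math.sin(a) + dy * Math.cos(a)];
}

// Background colors from the sketch; advance the index once per click.
const palette = [[172, 230, 226],  // blue
                 [226, 225, 163],  // yellow
                 [244, 189, 100]]; // orange
let bgIndex = 0;
function nextBackground() {
  bgIndex = (bgIndex + 1) % palette.length;
  return palette[bgIndex];
}

// In the p5.js sketch these would be used roughly like this:
//
//   function mousePressed() { nextBackground(); }
//
//   function draw() {
//     background(...palette[bgIndex]);
//     push();
//     translate(shoulderX, shoulderY);   // move origin to the fin's joint
//     rotate(armRotation * PI);          // rotate about that joint
//     translate(-shoulderX, -shoulderY);
//     // ...draw the fin's curveVertex shape here...
//     pop();
//   }
```

The push()/pop() pair is the important part: without it, the rotation applied for one fin would also tilt everything drawn afterward.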

JackieChou-project 01-face

sketch 2

//Jackie Chou
//Section E
//jwchou@andrew.cmu.edu
//Project-01
function setup() {
    createCanvas(600, 585);
    background(251,220,13);
    text("p5.js vers 0.5.12 test.", 10, 15);
}

function draw() {
	

	//neck
	noStroke();
	fill(211,166,161);
	rect(255,400,90,100);

	//head
	fill(224,195,200);
	ellipse(300,300,200,240);

	//hat
	fill(60);
	arc(300,240,170,150,PI,0);
	fill(0);
	ellipse(300,160,20,20);

	//face shape
	fill(251,220,13);
	quad(200,300,250,430,200,400,200,300);
	quad(400,300,350,430,400,400,400,300);

	//ears
	fill(224,195,200);
	ellipse(210,290,50,60);
	ellipse(390,290,50,60);

	//mask
	fill(20);
	rect(210,255,179,65);

	//masksides
	arc(210,287,33,64,HALF_PI,PI+HALF_PI);
	arc(389,287,33,64,PI+HALF_PI,HALF_PI);

	//maskholes
	fill(211,166,161);
	ellipse(263,283,45,35);
	ellipse(343,283,45,35);

	//maskholes
	fill(224,195,200);
	ellipse(258,283,45,35);
	ellipse(338,283,45,35);

	//eyes
	fill(245);
	ellipse(259,281,30,15);
	ellipse(341,281,30,15);
	fill(3);
	ellipse(252,281,10,10);
	ellipse(334,281,10,10);

	//mouth2
	stroke(173,140,137);
	strokeWeight(5);
	line(270,370,330,370);

	//nose
	noStroke();
	fill(211,166,161);
	arc(300,337,30,10,0,PI);


	//nose outline (left invisible: the noFill() and noStroke() calls below disable this shape)
    noFill();
	strokeWeight(3);
	stroke(136,104,101);
	noStroke();
	beginShape();
	curveVertex(300,290);
	curveVertex(297,290);
	curveVertex(315,320);
	curveVertex(292,322);
	curveVertex(293,310);
	endShape();
	noStroke();

	//shirt
	fill(50);
	ellipse(300,584,370,270);
	noFill();
	beginShape();
	vertex(70,600);
	vertex(150,515);
	vertex(255,475);
	vertex(345,475);
	vertex(450,515);
	vertex(530,600);
	endShape(CLOSE);

	//shirt-stripes
	fill(210);
	rect(170,500,260,10);
	rect(145,530,300,10);
	rect(135,560,330,10);



}

Robber, 2017
Digital self-portrait.
JavaScript.

For this project, I went through multiple iterations and phases. The toughest part was first learning how to use the different commands to create the shapes I wanted. Also, I didn’t learn how to organize my code until late in the process, which made managing it harder. However, I picked everything up fairly quickly.

I also experimented with and iterated on how I wanted to represent my nose and my mouth.

I first went from a generic self portrait:

To a robber (I don’t know why, but I felt that it represented me).

JackieChou-LookingOutwards-1

 

eCloud, installed in Terminal B at San Jose International Airport in San Jose, CA.

eCloud

eCloud is an art installation in Terminal B at San Jose International Airport in San Jose, CA. I know this exhibit well because SJC is my home airport, and I have flown through it fairly often.

According to the piece’s official website, eCloud was created by Dan Goods, Nik Hafermaas, and Aaron Koblin. It is composed of hundreds of polycarbonate panels suspended from the terminal’s ceiling in the rough shape of a cloud. The panels can change transparency based on real-time weather data from cities around the globe. The effect is a digital cloud that shifts subtly as each panel’s transparency changes.

The installation also includes a large LCD panel installed on an adjacent wall that shows the weather of the city currently being represented, as well as a simulated preview of the cloud itself. This project really inspired me because its innovative use of technology truly represents Silicon Valley, and it beautifully shows how technology can contribute to something simple. These days, a lot of thinkers are concerned about technology’s harmful impact on our generation, but we should be reminded that technology isn’t bad in itself; we just have to be careful in how we choose to use it.

The artwork was commissioned by the city of San Jose. While the terminal was being built, the city called for applicants to propose their own ideas, and eCloud was eventually selected. The applicants appear to have started prototyping the piece in 2007, but the terminal with the final installation did not open until 2010.

The team was led by three main artists/designers, who also worked with professionals with more specific expertise.

The team that created the piece had plenty of previous experience in environmental design, having designed spaces and environments in many settings such as airports and museums. And why a cloud? According to a VICE article, the art world at the time was drawing heavily on clouds for inspiration.

John Baldessari’s Brain/Cloud

Here is a video of the installation:

According to the official website, the project relied on a lot of custom software so that the panels could communicate with each other and display the proper transparency.

Is the project effective? I don’t know! After all, it’s a piece of modern art, not quite a product, so it doesn’t feel right to critique it. However, I think it works incredibly well in its setting. Perhaps I wish the panels were darker by default, so the more transparent “white” panels would stand out with greater contrast.