I chose to look at the snow in the movie Frozen. I have always admired how the creators braved the task of creating realistic-looking snow. My high school teacher was involved with this project, and since watching the movie it has been apparent to me how real the snow looked and behaved compared to the snow animations in other movies. The animators explain that they used a method in which they created very small particles of snow and assigned each a random volume and size. After factoring in velocities as well as collision variables for each of those particles, each grain of snow is able to move. Another amazing aspect that was taken into consideration was the different consistencies of snow in different situations and temperatures. They were also able to use the snow as a narrative cue within the movie: much of Elsa's emotion is manifested through the snow's velocity and color.
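The particle approach described above can be sketched in plain JavaScript. This is only a toy illustration under my own assumptions (the constants and names here are invented); Disney's actual snow solver is far more sophisticated than this.

```javascript
// Toy snow particle: each flake gets a random size and velocity,
// accelerates under gravity, and settles when it collides with the ground.
// (Illustrative only -- not the studio's actual simulation.)
const GRAVITY = 0.05;   // hypothetical per-frame acceleration
const GROUND = 400;     // hypothetical ground height in pixels

function makeFlake() {
  return {
    x: Math.random() * 480,
    y: 0,
    size: 1 + Math.random() * 3,   // random size per flake
    vx: Math.random() - 0.5,       // slight horizontal drift
    vy: Math.random() * 0.5,
    settled: false,
  };
}

function updateFlake(f) {
  if (f.settled) return f;
  f.vy += GRAVITY;                 // gravity pulls the flake down
  f.x += f.vx;
  f.y += f.vy;
  if (f.y >= GROUND) {             // collision with the ground
    f.y = GROUND;
    f.vy = 0;
    f.settled = true;
  }
  return f;
}

// simulate one flake until it lands
const flake = makeFlake();
let steps = 0;
while (!flake.settled && steps < 10000) {
  updateFlake(flake);
  steps++;
}
console.log(flake.settled, flake.y); // prints: true 400
```

Running one flake to rest shows the two ingredients the animators mention: per-particle randomness and collision handling.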
Researchers at the University of California San Diego have found a way to improve “all that glitters” in computer graphics.
The collective outcome is a more eye-catching suit for Iron Man, or a shinier shield for Captain America, but what I admire is the complex algorithm based on countless individual rays of light. The algorithm more accurately calculates and reproduces the way light interacts with surface details.
Previous computer graphics software has assumed that all surfaces are smooth at the pixel level, which is untrue in real life. This algorithm breaks each pixel down further into what are called microfacets. The vector perpendicular to each microfacet is then computed in relation to the surface material. By combining this information into an approximate normal distribution, the surface is rendered in much higher definition.
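A rough sketch of that idea, not the researchers' actual algorithm: give each tiny facet a normal jittered by a roughness value I made up for illustration, then aggregate the facets into a mean normal, the simplest stand-in for an approximate normal distribution.

```javascript
// Sketch of the microfacet idea: a single pixel's surface is split into
// many tiny facets, each with its own perturbed normal. Aggregating them
// approximates the distribution of normals a renderer would sample.
// (Roughness model and numbers are invented for illustration.)
function perturbedNormal(roughness) {
  // base normal (0, 0, 1), jittered in x and y by the surface roughness
  const nx = (Math.random() - 0.5) * roughness;
  const ny = (Math.random() - 0.5) * roughness;
  const nz = 1;
  const len = Math.sqrt(nx * nx + ny * ny + nz * nz);
  return [nx / len, ny / len, nz / len];
}

function facetDistribution(count, roughness) {
  let sx = 0, sy = 0, sz = 0;
  for (let i = 0; i < count; i++) {
    const [x, y, z] = perturbedNormal(roughness);
    sx += x; sy += y; sz += z;
  }
  // normalized mean normal of all microfacets in the pixel
  const len = Math.sqrt(sx * sx + sy * sy + sz * sz);
  return [sx / len, sy / len, sz / len];
}

const mean = facetDistribution(10000, 0.4);
console.log(mean); // close to the smooth normal [0, 0, 1]
```

The point of the sketch: a smooth-surface renderer keeps only the mean, while the UCSD method keeps the full spread of facet normals, which is what makes glints possible.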
I am excited to see this computer graphics software be applied to metal finishes for cars and electronics on HD screens.
Elastic is a company that produces advertisements, main titles for TV shows and movies, animations, broadcasts, and other video productions. I really like this video, the main title they produced for the first season of HBO’s Westworld. (They also produced the main title for the second season; it’s on their website!) A team of CG artists, motion graphics artists, and designers, under creative director Patrick Clair, painstakingly built sets and models digitally. The programs they used include ZBrush, Cinema 4D, Maya, After Effects, and Octane. With these, Elastic modelled and rendered 3-D assets that reflected the Wild West and a distorted human-robot symbiosis. Watching this in isolation always gives me the chills, especially with the music composed by Ramin Djawadi. The music and the scenes tie together to create an eerie feeling, and I find the detail in the rendering comes close to the uncanny valley (the point where realism in androids produces a very unsettling effect).
This is the scene from Dr. Strange whose CG effects I thought were the most interesting: the “open your eyes” scene.
The 3D computer graphics project that stood out to me the most is this scene from the movie Dr. Strange. I thought that this scene was immensely visually engaging and incorporates so many different elements, such as entities and even a butterfly. What I admire most is simply how realistic it is and how it pulls the audience into a whole new reality. By referencing what we are familiar with, such as the planet Earth and a butterfly, we are able to see just how surreal the experience is. The point of CG is typically to make untrue things seem as realistic as possible, but this project’s goal was the opposite: to make things seem as bizarre as possible, yet with enough technical skill that the surreal looks realistic and blends well with the actor himself. I admire these aspects because the result is so different and unique, and at the same time memorable.
I think that the algorithms used to generate these scenes definitely included 3D techniques such as camera transformations to heighten the sense of depth. I also think they used scale and translate in the parts where locations zoom out while the characters stay in place. They must also have used recursion, because there are many recurring elements when Dr. Strange travels through the different dimensions.
The director’s artistic sensibilities manifested in the final form when he was able to tell such a compelling story with such wild graphics. He used a voiceover to explain what is going on in the scene and how it relates to the story. Doing so anchors the intense, otherworldly CG in the story’s rock-hard reality; rather than confusing the audience, it teaches them. I think this particular scene sets up the story of the whole movie very well.
Source:
Movie: Dr. Strange
Director: Scott Derrickson
Actor in scene: Benedict Cumberbatch
/*
Min Jun Kim
minjunki@andrew.cmu.edu
15104-B
Project 5
*/
function setup() {
    createCanvas(480, 400);
}

function draw() {
    noStroke();
    background(161, 211, 234);
    fill(250, 250, 200, 220);
    //moves the whole pattern up past the top-left corner
    translate(-10, -20);
    //draws the grass lines, triangles, and ice cream tops
    for (var j = 0; j < height + 30; j += 60) {
        for (var h = 0; h < width + 30; h += 80) {
            push();
            //makes the triangles and the dashes
            fill(200, 180, 100);
            translate(h + 12, j + 10);
            triangle(10, 28, 30, 28, 20, 55);
            fill(100, 180, 100);
            text("///", -22, 0);
            pop();
            push();
            //makes more dashes, rotated and colored differently
            translate(h + 12, j + 10);
            rotate(-1);
            text("///", -22, 0);
            pop();
            //calls the ice cream function
            drawn(h, j + 0.5);
        }
    }
}

function drawn(e, f) {
    push();
    //calls the drawIt function repeatedly to add more complexity
    for (var z = 0; z < 10; z += 1) {
        translate(e, f);
        drawIt(25, -30);
    }
    pop();
}

function drawIt(a, b) {
    push();
    translate(a, b);
    //colors it pink
    fill(251, 156, 180);
    for (var x = 0; x < 8; x += 1) {
        //draws ellipses while constantly scaling and translating
        translate(x + 1, 5);
        scale(0.45, -0.7);
        ellipse(x + 15, 10, 60, 43);
    }
    pop();
}
I wanted to see if I could create unique patterns just by drawing one ellipse, so I messed around with for loops on scale and translate and came across very unique patterns. At first it came out as a thin feather, and when I input a negative value into the scale it came out with a very unique, deformed sort of pattern. I thought it looked a lot like an ice cream cone, so I incorporated other elements to make it look more alike and added more patterns to make the wallpaper interesting. Below are other iterations of this project that I made but didn’t feel like elaborating on.
I thought it was very interesting how making minor tweaks to the iteration can change the picture as a whole. This project helped me learn more about calling functions within functions.
Jonathan Zawada is a motion graphics artist who has done work at both corporate and independent levels. Much of his work, similar to that of other artists featured as “Looking Outwards” suggestions, is based around organic form in the natural world. What sets Zawada’s work apart for me, though, is that the blurring of organic and artificial is very visually apparent. In other posts where I’ve referred to this relationship, it has been as organic objects portrayed through digital code.
Although I am unfamiliar with the field of 3-D motion graphics, it is evident that there are multiple layers to what is being displayed. In the embedded music video, there were quick cuts between simulations of metallic stem growth and chain link interactions, oftentimes with the two of them physically overlapping in a layer-like nature. From this, I would assume that there were different sequences of code for each of the elements in the video, which are triggered in response to drastic changes in pitch, tempo, or bass.
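My guess about that triggering could be sketched as a simple threshold detector on a per-frame amplitude signal. All the values and names here are hypothetical; a real pipeline would more likely run onset detection on FFT bands rather than raw amplitudes.

```javascript
// Toy onset detector: flag a "trigger" whenever the amplitude jumps
// sharply from one frame to the next -- a crude stand-in for reacting
// to drastic changes in bass or tempo. (All values hypothetical.)
function detectTriggers(amplitudes, jump) {
  const triggers = [];
  for (let i = 1; i < amplitudes.length; i++) {
    if (amplitudes[i] - amplitudes[i - 1] > jump) {
      triggers.push(i); // switch visual sequence at this frame
    }
  }
  return triggers;
}

const levels = [0.1, 0.12, 0.11, 0.6, 0.58, 0.2, 0.9];
console.log(detectTriggers(levels, 0.3)); // logs the spiking frames, 3 and 6
```

Each trigger index would then cut between the metallic-stem and chain-link sequences described above.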
Nike’s Air Max 2017 launch utilizes 3D computer and motion graphics in an incredibly captivating way, playing with a diverse range of textures, interesting use of negative space, and metaphorical explorations of air and “lightness”. As with other Nike marketing and products, an immense team is dedicated to every aspect of creating these ads and fine-tuning all the details, so the end result is nothing short of hyperrealistic. Nike creates a variety of different styles of the Air Max by composing it through several different 3D computer graphics “mediums”, such as colored sand, a toothpaste-like texture, gum-like strings, soft bunched-up fabric, physical air-balloon pockets, rubber-sole-like material, and so on. In addition, because many of these 3D computer graphics also evolved into motion graphics, they enabled another dimension in which the viewer could understand how the “material” behaved and reacted to gravity, pressure, and tension.
The Berlin-based studio Onformative created an interactive art installation called “ANIMAS” (2015). Physically, the installation is a giant glowing sphere measuring two meters in diameter; on the interior, a powerful wide-angle projector and fisheye lens produce images in a full 360-degree directional beam onto the sphere.
The orb, which can be seen from all angles, is constantly moving, producing a dynamic, computer-generated “texture”. Modulating frequencies audible in the installation respond to those in the installation space as sound is picked up and resonated back. While the project is not necessarily “groundbreaking” technologically, it is certainly well executed and a beautiful art piece.
Print Fiction was a completely digital art gallery produced and curated by Micheal Seibert in 2012. He put together digital art by various artists (including himself) and displayed it all in a digital gallery (about 8 rooms total) created using Unity. Visitors can explore the gallery using the usual controls a first-person shooter might have. The artists all contributed different “types” of digital art, from 3D sculptures to flat paintings, which creates a large mass of different types of 3D graphics, some shown as 2D images and some explored in 3D space as 3D physical art. I like this project because it really does seem like “the way of the future” to start opening larger virtual art galleries, which I could see becoming a big part of the unfolding AR/VR market.
This is the full space of the exhibit, separated into rooms.
I am using 1 of my grace days for this late submission.
For this week’s Looking Outwards, I chose to read up on the Apparatum, created by panGenerator. In a nutshell, the Apparatum is a custom-made apparatus with a digital interface that emits purely analogue sound.
It’s a tribute to the Polish Radio Experimental Studio, which recorded electronic and utility pieces. Established in 1957 and operating until 2004, it was one of the first studios in the world producing electroacoustic music.
I think it’s cool how multifunctional this installation is. The Apparatum is equipped with optical analog sound generators and tape samplers. It has a seamless UI design and a touch screen that allows users to compose their own music. It then prints the user’s score and uploads the audio file.
I also admire the black and white minimalist aesthetic of the physical design. The physical form of the apparatus is inspired by the general aesthetics of the Studio’s famous “Black Room” designed by Oskar Hansen. I think it suits the tempo and range of electronic sounds the apparatum produces.
I enjoy the mixture of old and new technology. The new tech: the interface/GUI software is Electron (Node.js), and the microcontroller elements are C running on a Teensy 3.2. The older tech: the panGenerator team decided to use two types of “tape samplers”, two 2-track loop samplers and three one-shot linear tape samplers. To obtain noise and basic tones, the apparatus utilises purely analog optical generators based on spinning discs with graphical patterns.