gabagoo-FinalProject

CHA(i)Rmageddon



For my final project, I made several large plots composed of some distribution of chairs. Since I did not make the chairs, nor did I improve or add much to the .svg generation, the bulk of my compositional input was in the distribution of the chairs. The two main kinds of distributions I used were a planar distribution and a large pile formed from a physics simulation.

Inspiration

- Doris Salcedo's Installations (above)
- SIGGRAPH 2007
- Chris Jordan's Photography

Process Overview

1. Data scraping
2. 2D Experiments
3. 3D File Cleanup
4. Composition Experiments
5. Material Experiments

data scraping
I spent quite a bit of time trying to create my own chair generator, only to come to the conclusion that chairs are incredibly diverse. Naturally, the next bright idea that one might have is Googling 'chairs dataset.' The first result was this machine learning paper from 2014. The dataset included 2D renderings of chairs for image detection as well as the 3D files that the images were rendered from. The 3D files that were linked (left) were broken, but I was able to create a web scraping script that would download whatever files were still available.
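Roughly, the scraper was a short Python script along these lines (the index URL and file layout here are placeholders, not the actual dataset page):

```python
import os
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

# Placeholder index page; the real dataset page and link format differed.
INDEX_URL = "https://example.com/chairs/index.html"
OUT_DIR = "dae_files"

os.makedirs(OUT_DIR, exist_ok=True)

html = requests.get(INDEX_URL, timeout=30).text
soup = BeautifulSoup(html, "html.parser")

for link in soup.find_all("a", href=True):
    href = link["href"]
    if not href.lower().endswith(".dae"):
        continue
    url = urljoin(INDEX_URL, href)
    try:
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
    except requests.RequestException:
        continue  # many of the original links were simply dead
    with open(os.path.join(OUT_DIR, os.path.basename(href)), "wb") as f:
        f.write(resp.content)
```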


2d experiments
Before I figured out how to fix the deprecated links to the 3D models, I did some experimentation with OpenCV to detect contours for plotting the chairs. At the time, my intention was to annotate the legs of the chairs so that I could superimpose human arms and legs onto the chair legs.
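A rough sketch of that OpenCV step, assuming white-background renderings like the ones in the dataset (the threshold and simplification values are guesses):

```python
import cv2

# Load one of the 2D chair renderings and binarize it so the chair
# silhouette becomes white on black.
img = cv2.imread("chair_render.png", cv2.IMREAD_GRAYSCALE)
_, binary = cv2.threshold(img, 200, 255, cv2.THRESH_BINARY_INV)

# Extract the outer and inner contours of the chair.
contours, _ = cv2.findContours(binary, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)

# Lightly simplify each contour before turning it into plotter paths.
paths = [cv2.approxPolyDP(c, 1.0, True) for c in contours]
print(f"found {len(paths)} contours")
```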


3d file cleanup
After figuring out the web scraping, there still remained quite a bit of standardization to be done across the 3D models. The major issues involved creating a standard scale across all the chairs and fixing inverted normals (left, middle). The 3D-to-2D .svg pipeline that I used is Blender's built-in Freestyle SVG exporter.
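The cleanup itself was scripted with Blender's Python API. A sketch of the kind of per-object pass I mean, recalculating normals and normalizing heights (the target height is illustrative):

```python
import bpy

TARGET_HEIGHT = 1.0  # illustrative: normalize every chair to roughly the same height

for obj in bpy.context.scene.objects:
    if obj.type != 'MESH':
        continue

    bpy.context.view_layer.objects.active = obj
    obj.select_set(True)

    # Recalculate normals so inverted faces point outwards again.
    bpy.ops.object.mode_set(mode='EDIT')
    bpy.ops.mesh.select_all(action='SELECT')
    bpy.ops.mesh.normals_make_consistent(inside=False)
    bpy.ops.object.mode_set(mode='OBJECT')

    # Uniformly rescale so the bounding-box height matches the target.
    if obj.dimensions.z > 0:
        obj.scale *= TARGET_HEIGHT / obj.dimensions.z

    obj.select_set(False)
```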

The files I downloaded from 3dwarehouse were in .DAE format. I converted them all to .STLs using Blender, but the conversion did not always work. From the ~1600 links, I was able to scrape ~250 .dae files (I reached their download limits and was too lazy to download more). Of those ~250 .dae files, I was able to successfully convert ~170 into a workable format within Blender.
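The conversion can be batched headlessly with `blender --background --python convert.py`; a sketch along these lines, assuming a Blender build that still ships the Collada importer and the legacy STL export operator (paths are placeholders):

```python
import bpy
import glob
import os

SRC_DIR = "dae_files"   # placeholder input directory
DST_DIR = "stl_files"   # placeholder output directory
os.makedirs(DST_DIR, exist_ok=True)

for dae_path in glob.glob(os.path.join(SRC_DIR, "*.dae")):
    # Start from an empty scene for each file.
    bpy.ops.wm.read_factory_settings(use_empty=True)
    try:
        bpy.ops.wm.collada_import(filepath=dae_path)
    except RuntimeError:
        continue  # some .dae files simply refuse to import

    name = os.path.splitext(os.path.basename(dae_path))[0] + ".stl"
    bpy.ops.export_mesh.stl(filepath=os.path.join(DST_DIR, name))
```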


composition experiments
physics simulation
While sorting through the mess of STLs that my previous work had generated, I ended up scaling all the chairs to a similar size and organizing them into a simple lattice pattern (left).

Around this time, the large AxiDraw had arrived and not many people were using it, so I set my sights on creating a super intricate large plot. Hence, I arrived at using the physics simulation built into Blender (top, right).
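The rigid-body setup was roughly: drop every chair as an active rigid body onto a passive ground plane and bake the simulation. A sketch, with counts, spreads, and drop heights as placeholders:

```python
import bpy
import random

# Passive ground plane for the chairs to land on.
bpy.ops.mesh.primitive_plane_add(size=50)
ground = bpy.context.active_object
bpy.ops.rigidbody.object_add()
ground.rigid_body.type = 'PASSIVE'

# Scatter the chairs above the plane and make them active rigid bodies.
for obj in bpy.context.scene.objects:
    if obj.type != 'MESH' or obj is ground:
        continue
    obj.location = (random.uniform(-5, 5), random.uniform(-5, 5), random.uniform(5, 20))
    obj.rotation_euler = (random.uniform(0, 6.28), random.uniform(0, 6.28), random.uniform(0, 6.28))
    bpy.context.view_layer.objects.active = obj
    bpy.ops.rigidbody.object_add()
    obj.rigid_body.collision_shape = 'CONVEX_HULL'

# Bake the simulation and pick a late frame as the piled-up composition.
bpy.context.scene.frame_end = 250
bpy.ops.ptcache.bake_all(bake=True)
```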

After the major milestone critique, I decided to revisit the idea of planar distributions, as they felt less overwhelming than the dumpster of chairs I had previously plotted. What I ended up with was a mixture of Perlin noise fields and a bit of manual object manipulation within Blender.
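The planar layouts boiled down to jittering a lattice of chairs with a noise field before the manual pass; a sketch using Blender's `mathutils.noise`, where the spacing, jitter amount, and rotation mapping are all placeholder values:

```python
import bpy
import math
from mathutils import Vector, noise

SPACING = 1.5   # base lattice spacing (placeholder)
JITTER = 0.6    # how far the noise pushes each chair off the lattice

chairs = [o for o in bpy.context.scene.objects if o.type == 'MESH']
cols = max(1, int(math.sqrt(len(chairs))))

for i, obj in enumerate(chairs):
    x = (i % cols) * SPACING
    y = (i // cols) * SPACING
    n = noise.noise(Vector((x * 0.2, y * 0.2, 0.0)))   # scalar Perlin-style noise in ~[-1, 1]
    obj.location = (x + n * JITTER, y + n * JITTER, 0.0)
    obj.rotation_euler = (0.0, 0.0, n * math.pi)       # let the same field drive rotation
```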


material experiments
Throughout the majority of this project, I had been plotting with a 0.38 MUJI pen and a Pilot G-2 05. These allowed individual chairs in the resulting plots to read with a lot of clarity. I also did a test experimenting with a more t-shirt-like pattern using a thicker Sharpie, which I then plotted over with pen as well.

Takeaways

Technically, I learned a lot about creating large-scale workflows for generating the kinds of pieces that I wanted. In the process I learned web scraping and Blender Python scripting, and improved my skills with the vpype CLI, OpenCV scripting, and the Blender plug-in Sverchok.

I suppose the larger insight that I have gained from this is that my approach to a project now has a much broader scope, because I know I am capable of combining various kinds of skills into a larger pipeline to achieve an overarching goal.