
Project 2: Max Hexagon

For my second project, I created a game heavily inspired by Super Hexagon. This game, which I call “Max Hexagon”, bases all of its randomization on aspects of a given sound file.

In the game, the player is a cursor on one side of a spinning hexagon. As the board spins, so does the player. The player can move left and right around the hexagon, and must dodge the incoming formations for as long as possible. By default, the entire board spins at 33.3 RPM, the angular speed of a 12″ record. As the player moves, the song plays faster or slower, based entirely on the player's angular speed in proportion to the speed of a record.
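
A rough sketch of that mapping (the names here are mine, not the ones used in the game):

<pre><code>
# Hypothetical sketch: the playback rate sent to Max scales with the player's
# angular speed, relative to the 33.3 RPM of a 12" record.
RECORD_RPM = 100 / 3  # 33.3 RPM

def playback_rate(player_rpm):
    """Speed multiplier for the song at the current angular speed."""
    return player_rpm / RECORD_RPM

print(playback_rate(50))  # spinning at 50 RPM plays the song at 1.5x speed
</code></pre>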

The stage itself is generated in a number of ways. Aspects of the song's FFT are used to create pseudo-random shapes and rotations, while the note rate of the song determines game speed. In addition, visual effects are created from the music: the maximum value of the FFT sets the color, the integral of a frame of the song determines how large the center hexagon is, and the beat of the song changes the background pattern. Beat detection is both inaccurate and computationally intensive, which is why it does not play a larger role in the game.
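
As a rough illustration of the kind of per-frame mapping described above (not the game's exact code; assumes NumPy and a mono audio frame as a float array):

<pre><code>
import numpy as np

def frame_visuals(frame):
    """Hypothetical sketch: derive visual parameters from one frame of audio."""
    spectrum = np.abs(np.fft.rfft(frame))
    color = spectrum.max()                # peak FFT magnitude drives the color
    hexagon_size = np.abs(frame).sum()    # integral of the frame sets the center hexagon size
    return color, hexagon_size
</code></pre>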

The game itself was created using Python and Tkinter. The script that runs the game is multi-threaded, allowing Tkinter and an OSC server to run in parallel. The OSC server updates specific variables so that Python and Max can communicate. The general pattern is that either Python sends a message to Max, which Max acts on immediately, or Python requests new data from Max, which is promptly sent back over OSC.
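
The sketch below shows the general shape of that setup, assuming python-osc; the OSC address and variable names are placeholders rather than the ones the game actually uses:

<pre><code>
import threading
import tkinter as tk
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import ThreadingOSCUDPServer

state = {"note_rate": 0.0}  # variables that Max updates over OSC

def on_note_rate(address, value):
    state["note_rate"] = value  # runs on the OSC server thread

dispatcher = Dispatcher()
dispatcher.map("/maxhexagon/note_rate", on_note_rate)

server = ThreadingOSCUDPServer(("127.0.0.1", 5005), dispatcher)
threading.Thread(target=server.serve_forever, daemon=True).start()

root = tk.Tk()   # Tkinter stays on the main thread
root.mainloop()  # the game loop runs inside Tkinter's event loop
</code></pre>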

The game itself is extremely computationally intensive, and must be run at a 1920×1080 resolution. It is, unfortunately, difficult for the game to keep up with Max while other tasks are running on the same hardware. If the game crashes due to insufficient hardware, the framerate can be changed in constants.py, and the tick rate (the frame interval in milliseconds) can be changed in Max.

The game itself requires a few external modules:

Python OSC: https://pypi.org/project/python-osc/

Beat~: http://web.media.mit.edu/~tristan/maxmsp.html

Beat~ is a soft requirement: it can be removed if necessary, and most of the game will still function, barring one visual effect. Beat~ requires 32-bit Max, and will thus not run in Max 8.

My project can be downloaded here: https://drive.google.com/open?id=1gsfdUkBEIh-JGZjIu0__g8CKnZvktxMx

Python spews a number of errors when the program is closed. This is normal behavior, caused by the lack of a way to properly shut down the OSC server along with the rest of the game.

Project 1: These sounds look nice.

For my first project, I created a Max patch which takes an input video and synthesizes a corresponding audio file whose waveform representation looks like the video.

To convert a video to a waveform representation, a few things must happen. First, the video needs to be simplified; I use edge detection for that. The edge-detected video then needs to be converted to matrices of sample values, which a scope will plot as x and y coordinates in a signal. Since the video is edge detected, we need only look for “visible” points and then determine their values, which correspond to their position in the matrix. More concretely, the top-right of the screen is x=1, y=1 and the bottom-left is x=-1, y=-1. Once we have our list of values, we need to align them so that the output looks “correct”. This is done in Python, as it allows for more complex manipulations. The aligned matrix is then written to a .jxf file for later use in Max. That matrix represents one frame of our video, as audio.
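
A minimal sketch of that pixel-to-coordinate step (assuming NumPy and an edge-detected frame as a 2-D array; the names are my own):

<pre><code>
import numpy as np

def frame_to_scope_points(edges, threshold=0.5):
    """Map "visible" pixels of an edge-detected frame to scope coordinates.

    Returns x and y values in [-1, 1], with the top of the image at y = 1
    and the right edge at x = 1.
    """
    rows, cols = np.nonzero(edges > threshold)
    height, width = edges.shape
    x = 2.0 * cols / (width - 1) - 1.0   # left edge -> -1, right edge -> 1
    y = 1.0 - 2.0 * rows / (height - 1)  # top -> 1, bottom -> -1
    return x, y
</code></pre>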

Interfacing Max with Python required a bit of creativity. I wound up using OSC to send messages between Python and Max, with most real data being sent in the form of matrices saved to .jxf files. The exception to this is the patch's playback function, which has Python sending many read values to Max under very strict timing.
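
As an illustration of the Python side of that exchange (assuming python-osc; the port and OSC addresses here are placeholders):

<pre><code>
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 7400)  # Max listening with [udpreceive 7400]

# Tell Max a new frame matrix has been written to disk as a .jxf file
client.send_message("/frame/ready", "frame_0001.jxf")

# During playback, stream read positions under tight timing
client.send_message("/playback/read", 0.25)
</code></pre>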

Rendering the video to audio takes a long time: around one second of 24 fps video takes a minute to render. I’ve included a short video along with its corresponding audio and representation. My project zip also includes a scope patch to allow for dependency-free viewing on any computer with Max.

The patch itself has several requirements, with the primary ones being Python-OSC, Python 3, and xray.jit.sift. These are either included or else explained in the README of my project.

PythonOSC: https://pypi.org/project/python-osc/

xray.jit: https://github.com/weshoke/xray.jit/releases

My project can be downloaded here: https://drive.google.com/file/d/1Vk19SfcogpdhtJHHTioFyniZq4UuaqDk/view?usp=sharing

 

Assignment 4: Some colorful music.

For this assignment, I created a patch which takes an arbitrary sound file and uses it to convolve a video source; in my case, I used my webcam. The video is convolved at the color level: the first bin of the sound’s FFT becomes the first item of the red matrix, the second bin the first item of the green matrix, then blue, and so on, repeating until there is a 5×5 matrix for each color channel of the video to be convolved with. The video is rendered at about 14 fps to keep the render from lagging, although this could probably be increased. The scale for the FFT can also be increased: each bin is normalized to cap at 1, and the scale multiplies this by some arbitrary factor. I’ve found that a factor of three works best in my dorm.
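
A rough sketch of how the FFT bins could be interleaved into the three 5×5 kernels (assuming NumPy; scale plays the role of the arbitrary factor mentioned above):

<pre><code>
import numpy as np

def fft_to_kernels(bins, scale=3.0):
    """Interleave normalized FFT bins into 5x5 convolution kernels for R, G, B.

    bin 0 -> red, bin 1 -> green, bin 2 -> blue, bin 3 -> red, and so on.
    """
    needed = 3 * 25
    values = np.clip(np.asarray(bins[:needed], dtype=float), 0.0, 1.0) * scale
    values = np.pad(values, (0, needed - len(values)))  # pad if too few bins
    red, green, blue = values.reshape(25, 3).T          # de-interleave the bins
    return red.reshape(5, 5), green.reshape(5, 5), blue.reshape(5, 5)
</code></pre>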

Video of patch in use: https://youtu.be/ySZOQbweElg

Top level patch: https://gist.github.com/TheDreadedAndy/81a7b759f354c176c8ff729a377cd441

fft patch: https://gist.github.com/TheDreadedAndy/7152747cdf6ef599fcdbdcd3f06cc878

Assignment 3 – Convolution

For my convolution, I recorded a brief statement using my microphone in my dorm.

I then created 4 impulse response recordings.

This impulse response is the sound effect for collecting a ring in the game “Sonic the Hedgehog.” It was obtained by extracting the sound effect from a YouTube video using a conversion website, and was then normalized in Audacity.

This impulse response is the first three notes of the Song of Storms from the Legend of Zelda series. It was recorded using the audio input of an EasyCap USB capture card connected to a Nintendo Wii. The song was played in-game, in The Legend of Zelda: Majora’s Mask, and the recording was normalized in Audacity.

This impulse response was recorded under the UC overhang across from the entrance to Entropy. To record it, I determined which spot seemed to have the most interesting acoustics and placed a Zoom recorder there. I then popped a balloon a few yards away, still under the UC, and normalized the result in Audacity.

This impulse response was recorded in Doherty Hall, outside the first-floor elevator. To record it, I set a Zoom H4n Pro to record and placed it on a desk in the corner of the room. I then popped a balloon in the center of the room and normalized the recording in Audacity.

Using these impulse response recordings, I convolved my original signal.
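
For reference, the equivalent offline operation looks something like this in Python (a sketch assuming SciPy and soundfile, not the Max workflow I actually used; file names are placeholders):

<pre><code>
import numpy as np
import soundfile as sf
from scipy.signal import fftconvolve

dry, rate = sf.read("statement.wav")         # the recorded statement
impulse, _ = sf.read("doherty_balloon.wav")  # one of the impulse responses

# keep the example mono for simplicity
if dry.ndim > 1:
    dry = dry.mean(axis=1)
if impulse.ndim > 1:
    impulse = impulse.mean(axis=1)

wet = fftconvolve(dry, impulse)              # convolve the signal with the IR
wet /= np.max(np.abs(wet))                   # normalize to avoid clipping
sf.write("statement_doherty.wav", wet, rate)
</code></pre>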

Assignment 2 – A delayed change in pitch.

For my assignment, I began experimenting with tapin~/tapout~ and line~. Through experimentation I learned that, when you change the parameters of line~, there is a pitch shift as it transitions to the new delay time. I used a random number generator and zl reg to set up a patch where, whenever line~ reached its target delay, its target would be changed again and its start would be set to the previous end, resulting in a delayed pitch shift at the output.

Next, I incorporated feedback using the same method with a shorter delay and an input that gets progressively quieter (so as not to trample my original sound in an explosive manner). The output of my patch is a version of the audio with a distinct echo and pitch shift; below are the results for the song I used.

<pre><code>
----------begin_max5_patcher----------
2488.3oc6bkziqaiD9rCP9OPHDfAXF2cqMaYmSSv6z.7xjKYN0IvfVh1lokH
0HR0u1IH8u8gKZ2Vtks7VdiNzV1bSU8whUUh7S8e7seyHikz2PLCv2CdFLZz
eHJYjpLYIixKXjQD7M+PHS0PCeZTDhvMFmUIG8FWUwm1.IqQ.NE7i+z+9mdr
nAjzHLIDwUc2JuTbfpSzk+1Cyp1TZJOusl4EGC49avj0KRP9bszZO09Qyw.a
G0EKKS0uDeB907dw3aCQpahgpj+7a+F4Uwkw8WY+L8KnDFXEBErD5+B3UZXZ
DBvn.9FwE7ZBLDDPQLxeiC9ECzawgz.zuXTBKgXBxmlRTi2jNCVSOAvZhsqB
kl373jwf4Vxe3M4biUDzWDB3tPULPppODfBgaefsglv6towjVzVqCnstyLqZ
ZX5n9kcKpadQ5AmuMFoGEC8bnQYmhSPLgs.jiojJ2NuYZKwIp6Z9GU5lTBQI
YHWNzMxXEND8pvFRLZU0nQFv33JkOpZmjH9uQUik23xxvDcYNkkkfdEmODSJ
KFlHPKtP1SSz5+aScMpLRBazDRJVIPYkJsAxEsRs1QYQ44ofX2oyT+x1ohhK
rWVGR8eAET07T.zwHBlTELqWe.ZELMjuXEkvY3eWIlVRHdeMXEzG0d2IvHsV
9CIXwTYQaVmfCnDofTG4kkmeKeVrbQunolNoZBAFuutKLXE3Sa0xDJaJaILQ
NwrTa9YWTKmRCqWWYOCQq3Y0GiIjl.JmFefZSvq2bndujJpM5fCupJ1hTht5
EhU77EL3qMPdNLLLyGPi6vaPBNBxQbrd9v1rrVDAJT3ML+DZXXc8VW0q6qp.
g4sO5K3.9F0Mql4gnC33bCKixo8.7ZDi2nPNbMqQQ65cPTV5xrExK3nn3Pg1
znE0BkVaQaUOm0q3vQap6F8m2fYfLGC.5JUbFkHA1.Y.HfEIveTBHARBnQxK
qQ0FoZAbbpVydcC2zU7biFcYOAeZykrikiNLsmL.jk1Uo6rpqr1WTnQkQhx8
DcYv1+EHkg.+dHHAsVl.CCwAzv.Ufq2AHRfrPgscVAhkxI7GOmfqWO.WsSXa
6pg6tivVkcq1PUkB.SY4BSCvTApKTE.GJbr7j3SgRCfRvVz.MRuJk3KWI+H3
mUl6IBO5bUhWzjrl.SPY17BO2AiySAS92ZpHILeUZohUHLQUhkNhqXtXhFJV
LK+pbnERHZ0JAhheEEtcW4TJTrM3UbcgwXg1bf4+4G87+zdO+6nVSYMUkbmk
i00b9WHuKEY3bzZs6g7oX+Q9TL018yUF7ZWJ0xxqMe46MaOiw.ikBCEi5cuv
hagNXjZ937ie0xbtQ1yvW.l.y1AW6V.WmCfsVe.1JBPqfScps8FaA+5UEx3.
LX4waMZ2CiQKMRYMaV4ZwSGvvReoE1iWUrSGC73s1r5O3Y6nR318LrR9hfZm
nSNauyfSNMz7UqSt+AXh0wayYOoOt3bxhTpWw51+UrW4UpY446XdBwFrc6Ox
MyKeajtCAtHDiAa77OEH22YMF7c1hTkOIn6LDW0wVcwy8uVwUE9HRnJXC7Og
pbkAmxpVq9a6oiQXYOqe.3sH.q5gVNATyrGQQxM67z95l1OTKaiQuQInneLw
2OTFwsEItOgKx1xBak6N8CY0aD75ibXx6BM4DfN2dCcNtpj7l50OnSnD9TBQ
M3me3SADsiNls8nV182xxr7AGbL6sgU0Vku6L6zcLI.8V0s5+7AjZI63sybb
5wFgnQxIldm.Td5fTkCGP5auss.VIuxFzB7wnoI94yi4Y2BZH7AHFGSJNzjm
KdRefU8FRSBzG4zQL0dAkP4y.0rkEhn48gHd.Iz9RJgtcVByzkKovX0Ygw8h
KLVGi8u4Mv9uyBnbeNuMl+65YX+Rn0k2zpqfkkVlujhhYWkkqvJNmNuhy7hK
LcWVbt7xRmMXt7dhbuerW5dRAGdJJuhR1cXHOQ+fEZRLr.x4I3kobc9L0Htx
Qd.6qCoKggYGddAiLN7osWdn7ER4ISKo3P31PLi+d2YcjqWKzNZxAncjyb0C
3nYWU1Qb6XdhrNZ79+l7u.r5DQgIaqvMo.HGtyDkeHNtszQkrOpXx3Se+S+G
FJg8zOPB19zOlxv9O8Y416HO2yOifIDvCfOkjx1HeNkn3pIoqFoWDYIqFI0w
kJKoZKXnPjeo4oduapmWtQHkF2HGeQN4DtHm7ERlyz1BDIkRX7DjX5Hazqu3
Pxnohc6QnWM25a4I4pNY2rihn4iKrhlDAIsU6+MEFh4aOvMfEizDpo8w1mlj
TCe1iH9AsglfWKbADJolBM6tYuyi9HWIr2tuhFFR+hdgZkgn4SNElljyfmmA
6L54xfvtZMeSkoZwbj+KLiC05HVKCpBR4oDrbilTj6ysS9zN6jX7u+tP3lre
WH16yEROXtn0rrMapmDW71CBtm.H3LKaWc8tsfPbHsRLi3DJmJG+BmlQaef4
SiQePXkHXhvPmUk0j1yKXJpiWUEKfFAECAbIJTuicxUa6CW+HXUTbLESzkNa
l8g3F7TswliFtMmos.m93jR4RR5QeZnljoxUoSFW6iJpfN.f77DSxbZ6ZUPN
u8dDF40oHrVotCihEtVSCJA.IE7DyIUinUxK1JhmkFcsl4L0yo52p6.1fuQ3
WhfzS2VOVtI5FJraQoYFgRpDPSsMOUpUuqOkUqOn316tH7HW+neMqQO+uPQu
rFzZrnRjNRq0dpptHAS1cjEFiuHhyWw4r.crUyg5Oc7JmUmpKRWxL4kF.X1f
plyxFypUqlPKUlGrdbRyJKTlI61sV0EcCpoJOXUrhpgLpabUQT2VC8i7p9pI
vXr1pwvBTMNUVTpEusyjQVEaypnjpz2Adccle5dcc7leeF5o5KQP2Sj24TB.
4p7XIxcWSwc6y0qOvWkuG.O3jcXXylL7p.L7p.L7p.bMdU.Fnq9.c0Gnq9.c
0Gnq9.c0Gnq9.c0Gnq90lt5Vl2V9p6Z8WU9p6NvW8A9pOvW8A9pOvW8A9pOv
W8A9pOvW8uN3q94hYZekQXu6IdLdOwuygWuggWugq9q2v8DYquqdWKtqnD+c
zaJv8y6Ow8+KaxE8804+e3qdaLfI.5+d2IAj0I7uIzLN.ky+EuCR7kF5mtM0
ynLSeaZHkq06l0V8nYcvAzdw2Ne+5zsyw8rc+bcOlaXAyhpu1sehfSmDglSD
Q3fR5j9LvchhgfN5c4eplxn0+kkq9vQlMW8O7O2Z+ylswh8qBnNuEP8rHBdc
BTOiFRd0hKc4ugMguCFD7hfwcWBrtsKct4Kda0T2d+traPyP0cuI8BaRsv8P
qvCPovcnSnRFDe7+.CRGsm.
-----------end_max5_patcher-----------
</code></pre>

Assignment 1 – Repeated capturing of gameplay footage.

For my assignment, I first used my capture card to record part of the opening to the game The Legend of Zelda: Majora’s Mask. I then transferred that footage to my laptop, where it was played back and recorded again on my main computer. I repeated this re-recording from my laptop many times. To keep the video from being destroyed by darkening alone, I added a bit of brightness after each iteration. Additionally, each recording was converted from AVI to MP4 before being iterated on. I successfully recorded a total of 16 iterations, at which point the video component was destroyed.

Since the file is too large to upload here, I have uploaded it to YouTube: https://youtu.be/yrOUTDxm8y4