Author Archives: Will Walters

Small Production Environment – Will Walters Final Project

For my final project, I created what I’m calling a Small Production Environment, or SPE. Yes, it’s a bad name.

The SPE consists of three parts. The first is the subtractive synth from my last project, with some quality-of-life and functionality improvements. This serves as the lead of the SPE.

The second is a series of four probabilistic sequencers. These give the SPE the ability to play four separate samples, with a trigger probability specified for each sixteenth note in a four-beat measure. This serves as the rhythm of the SPE.
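The core idea of a probabilistic sequencer lane can be sketched in a few lines of Python. This is an illustrative sketch, not the actual Max patch: it assumes a 16-step pattern where each sixteenth-note step stores an independent trigger probability, and each pass through the measure flips one weighted coin per step.

```python
import random

def step_triggers(probabilities, rng=random.random):
    """One pass through the measure: a weighted coin flip per step.

    probabilities -- one trigger probability in [0, 1] per sixteenth note.
    Returns a list of booleans saying whether each step fires this pass.
    """
    return [rng() < p for p in probabilities]

# A lane that always fires on beats 1 and 3 and never anywhere else:
four_on_floor = [0.0] * 16
four_on_floor[0] = four_on_floor[8] = 1.0
```

Setting a probability strictly between 0 and 1 is what makes the rhythm vary from measure to measure while keeping its overall feel.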

Finally, the third part is an automated bass line. This plays a sample at a regular (user-defined) interval. It also detects the key the lead is playing in and shifts the sample to match.
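The "shift to match" step amounts to transposing a sample by some number of semitones, which for a resampling-style shift corresponds to a playback-rate ratio of 2^(n/12). A hedged sketch of that arithmetic (not the patch's actual logic; the helper name and the choice to take the shorter of the up/down shifts are my own):

```python
def semitone_ratio(from_pitch_class, to_pitch_class):
    """Playback-rate ratio moving a sample between two pitch classes (0-11).

    Picks the smaller of the up/down transpositions, so the shift stays
    within -6..+5 semitones and the bass doesn't jump a near-octave.
    """
    shift = (to_pitch_class - from_pitch_class) % 12
    if shift > 6:
        shift -= 12  # going down is closer than going up
    return 2 ** (shift / 12)
```

For example, moving a C sample to a G would be treated as a shift down a fourth (5 semitones) rather than up a fifth.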

The SPE also contains equalization controls for the bass and drums (jointly), as well as for the lead. In addition, many controls are alterable via MIDI keyboard knobs. A demonstration of the SPE is below.

The code for the main section of the patch can be found here. Its pfft~ subpatch is here.

The embedded sequencer can be found here.

The embedded synth can be found here. Its poly~ subpatch is here.

Thanks to V.J. Manzo for the Modal Analysis library, An0va for the bass guitar samples, Roland for making the 808 (and whoever first extracted the samples I downloaded), and Jesse for his help with the probabilistic sequencers.

 

Project 1 – 3x Oscillator – Will Walters

For this project, I created a synthesizer instrument called a 3x Oscillator. It does more or less what it says on the tin: the user controls three oscillators which can be played via MIDI input. When a note is played, the oscillators produce sound in tandem, creating a far fuller sound than a single tone. The oscillators can be tuned and equalized relative to each other, and the waveform of each can be selected – sinusoid, sawtooth, or square. Other options for customization include total gain of the instrument; independent control of the attack, decay, sustain, and release; and a filter with both type and parameters customizable.
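The fuller sound comes from summing slightly detuned copies of the same tone. A minimal Python sketch of that idea (assumed, not the Max patch itself; the detune and gain values here are arbitrary examples, and only the sine waveform is shown):

```python
import math

def three_osc_sample(t, base_freq,
                     detunes=(0.0, 0.05, -0.05),   # per-osc detune, semitones
                     gains=(1.0, 0.7, 0.7)):        # per-osc level
    """One output sample of three summed sine oscillators at time t (seconds)."""
    total = 0.0
    for detune, gain in zip(detunes, gains):
        freq = base_freq * 2 ** (detune / 12)  # detune as a pitch ratio
        total += gain * math.sin(2 * math.pi * freq * t)
    return total / sum(gains)  # normalize so peaks stay within [-1, 1]
```

Because the detuned oscillators drift in and out of phase, the summed tone beats slowly, which is what gives the "fat" ensemble character.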

Here’s a video of me noodling around with it:

(sorry about the audio in some places, it’s my capture, not the patch itself)

The main patch can be found here.

The patch used inside poly~ can be found here.

Assignment 4 – Multislider EQ – Will Walters

A video of the patch in action (sorry about the clipping):

In this project, I connect a multislider object to a pfft~ instance to allow for real-time volume adjustment across the frequency spectrum. The multislider updates a matrix which is read via the jit.peek object inside the pfft~ subpatch. The subpatch reads the appropriate value of this matrix for the current bucket and adjusts the amplitude accordingly. This amplitude is written into a separate matrix, which is itself read by the main patch to create a visualization of amplitude across the frequency spectrum.
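The per-bucket job of the subpatch is simple to state outside Max: look up a gain for each FFT bucket and scale that bucket by it. A rough Python analogue (assumed, not the patch; real pfft~ work happens per-frame on complex bins):

```python
def apply_bin_gains(spectrum, gains):
    """Scale each complex FFT bucket by its slider-controlled gain."""
    return [bin_val * gains[i] for i, bin_val in enumerate(spectrum)]

# Example: mute the DC bucket while passing everything else through.
spectrum = [complex(1.0, 0.0)] * 8
gains = [0.0] + [1.0] * 7
```

Applying the gain to the complex bin (rather than just its magnitude) leaves the phase untouched, which is why the EQ doesn't smear the signal in time.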

At first, the multislider had as many sliders as buckets. However, this was too cumbersome to easily manipulate, and so I reduced the number of sliders, having one slider control the equalization for multiple buckets. At first I divided these equally, but this led to the problem of the first few buckets controlling the majority of the sound, and the last few controlling none. This stems from the fact that humans are better at distinguishing low frequencies from each other. Approximating the psychoacoustic curve from class as a logarithmic curve, I assigned volume sliders to buckets based on the logarithm of the bucket. After doing this, I was happy with what portion of the frequency spectrum was controlled by each slider.
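One way to write down that logarithmic assignment (a sketch of the mapping described above, with my own function name and a clamp at the top slider; the patch's exact formula may differ):

```python
import math

def slider_for_bucket(bucket, num_buckets, num_sliders):
    """Map an FFT bucket index to the slider that controls it.

    Uses log(bucket) / log(num_buckets) so low buckets get sliders nearly
    to themselves while many high buckets share one slider.
    """
    if bucket <= 1:
        return 0
    idx = int(num_sliders * math.log(bucket) / math.log(num_buckets))
    return min(idx, num_sliders - 1)
```

With 512 buckets and 16 sliders, buckets 2 and 3 already land on different sliders, while the last slider covers roughly the top third of the buckets.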

(Also interesting: to visualize the frequency, I took the logarithm of the absolute value of the amplitude. In the graph, the points you see are actually mostly negative – this means the original amplitudes were less than one. I took the logarithm to take away peaks – the lowest frequencies always registered as way louder and so kinda wrecked the visualization.)

This is the code for the subpatch running inside pfft~.

<pre><code>
----------begin_max5_patcher----------
824.3ocyX98aaBCDG+4To8+fEuToMJx1X90dZ+eLMMQHtotkfQFSappV9aeG
1IocYjE2HSydATNr4quO9tyG4kOc0rf4x07t.zWQeGMa1KfkYFaCVlsyvrfU
kqqpK6LCLng+jb98AgaelluVarWKWtYu01Rc0chlk+Twqz12OKEGgCQIzjga
wrgqTZDF8icSR1qq4Z8ysb6LB5DKaJqCdcDc5mqMOLXuRM8qDMvzLKN5arZe
aFyjclEKLSFV+2PRCLF+0mtZ3NbKzYDTIWsh2n+aF7jRn4nxUs0Bc+BdGRKQ
kntV.BJ4RUY6c6mSsngWI6az+w5dDtkhyixwXLqHDXX9.1HI3cLzc1PFmM3w
XS1DvFX.v1I5548UO.Ze8V1TKVvUHH3oDX2cqDUk00OeFThhixJJJxyA7jkG
w.hgKFPEaZQE67Q0wxjtWniZkOv2XhbPDD9emWkt2yS.OevwSIfmGScHEahR
tn9mJky613JGXowQTfCDJr0mMEUZHuCXPlnPDN+gMHP+tSEhjvX6QCM0jPPv
4tikv2UbxQPCcLzf8OZTP4hEaPjHz2Z3kJdm9c.GpsVKNFHDM6BeDUg+gyM.
XbEFjh3nTfETnXhS0RlRVj6eVLzvBDlj6LORx1yijKc6KY9mGe48QCx1CYgR
rrjKLMR8OMt8VMH5IRWhAfj.9Oydra7QhLNSOcr9MR7um1Jq0xpR0FG7z37W
8Ta+7NeBhu1+G6bjInIrO6DNnolj.JC.QwfkKXRPr+g.DUnkP7gKrfPIQIEC
UG9uJzfNIkFDMmpxPBY.AVPvxNGbD5u1RiGq2qCAiUEymbc3eIfwAGdvA3pS
1qp14.1VVBQuwIgOAFPUoVHadyfft9Li53aLNqWtK5U3M4xbQtbuIWpKxk4M
4XtHWhWki3hbDuHWrKd2AHPpVvUl7pOpE.7whiuBvdXEPcg3LuQbpKNbr2hm
HNktPn9SPhSBl5OAwtRT+rAN3ezSIW5Dlx3Df83FpK4GzOV7RNHo8nED1d7c
Ya6ibU21WhUbnsl6klIkEZ+snw9aaiAAJ9ihcSgYMUpfVZzP+L8JaCGqSYAa
msDVBM8hc6xl0.b42.BIVgH
-----------end_max5_patcher-----------
</code></pre>

And this is the code for the main patch.

<pre><code>
----------begin_max5_patcher----------
926.3ocyW0saaCBE95Lo8Nf35zHC9mztq16wzTEwljPK13B31zUs9ruiASpS
liqUaxztHACb37y2gyGvKe8KyvqT63FL5aneflM6EXjYtwZGYVXfY3R1tbIy
3DDWKYOKEF6q34cSWyr4aEUatUyysdkQHwKhlihuw0PRib8f+Q+LrJUiUxs1
mq49kfMhMULId9ve09qPjaEpJl9Y7a5wXeV5TAduCU0TJp.k6bXRuQ81zMbZ
XXQgawpU2cEIduJJXVVGJDfgY3bondOZ0eFXt0BIuhU58jZsnbgTopWvDqC5
LH08hJuIYMEBU6H8kvvkbWT5siC3H8vsVYZ0b+Py4apJKuxdqwxr7fmefKBP
tVrQ.3okWVq5RTznCU9LWBty3GNgUTxMVMGx1CNeP8.PrwtszrOBNTrGZXRg
syD3ULiHGenDlZNuny+Nd0qURo5oMR0pChiiiAA3jlsh01SnlRUAeLOP1naC
2SDBN0mqz5CRUG6n5RVk8cj5PLqWNGP676M3A04IhIWfXapfRQuDIIfD6E32
e8KgugOCSDFsqEZlOYpfJ9SPYy9ZFKemy0vOTxsZEhh9NCh7G4HxXbE2jk41
ju7ZWCIssgRGkqXEqMFmLG.cXN.x.b.zX7GFOtSXWT+DTdqdZr.95DW.mPSW
jBrhot.NNab1QGG3ayOEJN5PgGEe1S2sg8FMqdK56ZHuv2JPjnEccjJzUPuQ
OrHJJwAAzktTez0SXG.XzaKYVsX2QHyG77fAAKxkAr79MBn4xQDzZohYioHZ
Z16TnbsGeR7Gpls7+DXhjc9goBV9nWvfRc0Pow9aVL.T7A4DhFJ.u.6CpWu1
9Jh+vUllUnTBczBjkTGggu4loj26t3zmNeODEII5hVV.NfoWYAgNdUQRlaKP
R2UMyR9OopH9x.Rv8EkNHZJXB0sgYRDEWPVzONPT1HsBiTTv06cCCXpfyQFC
CBml3ZhSW5eIR1mDEZumc3wAjwglZlFdPfkquEdYvJuh1SuXD+xGBzdAFnEH
9OwM+mBS7wPsewXon5udlmCxZm3nDfQ0nyC3Q3UQ8t1Jt.dD.bk02tRafTvI
0oy0S2fYSwfvgymKCFT06XP54yfSCRyNeFbJ1K97kAilT7cDrqzs04s0D+ac
Ax.tPz4vElztpAA8t5VVc8ibsoaEdKALj2ob93x499hJe+XeeM+QQXII9gXZ
fczBTiMZOc1trDb2pgGBqqZDAb24Cve+A.Sh8m.
-----------end_max5_patcher-----------
</code></pre>

Project Proposal 1 – Will Walters

I’d like to make a playable 3x Oscillator in Max. The basic functionality will be three separate oscillators with switchable waveforms which can be (de)tuned and volume-adjusted separately. On top of hooking this up to a keyboard (and maybe adding functionality to read from USB MIDI input?) I could also implement a bunch of user-customizable options like hi/lo-pass filters, panning, reverb, and EQ options. I could also add some visualizations of the resulting waveform.

Assignment 3 – Will Walters

My original signal is a demo version of the song ‘You or Your Memory’ by the Mountain Goats.

The original:

[audio clip]

 

My first two signals were created by popping a balloon on the other side of a door from the recorder, and by recording the sound through Snapchat, then playing it back.

Here’s the IR and the song from the other side of a door:

[audio clips]

 

And here’s the IR and the song through Snapchat:

[audio clips]

 

Next, I used as my IR the sound of me knocking on my desk with my recorder pressed to the desk. There was a plate on the desk, and the sound of a fork rattling against it created a pitch.

Here are the IR and the song convolved with this IR:

[audio clips]

I was a bit disappointed to hear that it sounded similar to the first two, but it is cool to note that the frequency from the fork and plate causes a resonance in the song.

 

Finally, I recorded a short clip of myself eating yogurt and used that as the IR. I’d like to thank my roommate for donating his yogurt for the sake of art. Here’s that IR and the resulting song:

[audio clips]

 

Sorry that the IR for this is so gross. But, the different spikes in the yogurt IR do create a cool preverb effect in the song.
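The operation underlying all of these experiments is the same: convolving the song with each impulse response. A plain-Python sketch of that operation (the actual processing was done elsewhere; this direct form is far too slow for real audio, where an FFT-based method would be used):

```python
def convolve(signal, ir):
    """Direct convolution: each input sample launches a scaled copy of the IR."""
    out = [0.0] * (len(signal) + len(ir) - 1)
    for i, s in enumerate(signal):
        for j, h in enumerate(ir):
            out[i + j] += s * h
    return out
```

This also explains the preverb effect above: a spike early in the IR adds a copy of the song offset by that spike's position, so energy in the IR before the main impulse shows up as echoes arriving ahead of the dry sound.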

Assignment 2 – Will Walters

For this assignment, my first attempt was to filter video feedback through a convolution matrix which could be altered by the user, allowing for variable effects, such as edge detection, blurring, and embossing, to be fed back on themselves. However, using common kernels for this system with the jit.convolve object yielded transformations too subtle to be fed back without being lost in noise. (The system I built for doing this is still in this patch, off to the right.)

My second attempt was to abandon user-defined transforms and instead utilize Max’s built-in implementation of the Sobel edge detection kernel to create the transform. However, applying the convolution to the feedback itself led to the edge detection being run on itself, causing values in the video to explode. This was solved by applying the edge detection on the input itself, and then adding the camera footage before the final output. (It looks maybe cooler without the original image added, depending on the light, so I included both outputs in the final patch.)
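For reference, the Sobel step can be sketched in plain Python using the standard 3x3 kernels (an assumed illustration of the technique; whatever Max's built-in implementation does internally may differ, and borders are simply left at zero here):

```python
# Standard Sobel kernels for horizontal and vertical gradients.
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_magnitude(img):
    """Gradient magnitude of a grayscale image (list of rows of floats)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(SOBEL_X[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(SOBEL_Y[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out
```

Because the output is large exactly where the input changes sharply, running it on its own output keeps amplifying those transitions, which is the value-explosion described above.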

<pre><code>
----------begin_max5_patcher----------
2071.3oc2bs2abhCD+uSktuCVnJcOZZD9I3S5N06yQUUEKqSBsrvd.aRZqtu
6G1CaVxlEhMwfRpp5RLgEl4mm2yP9wu7lyBVUdmpN.8mnOhN6rezdlyLmSel
y1ehyB1jbWZdRs4BC9RVyEauMqXc4sAm2cAk6ZxUMMeaqBtWAAm29ezm1+6K
1sIqn8JL2A79ytMoI85rhq9bkJsA9hDF4hvyQBlTefRh0GHj1Sd3lks1PGkq
9x6wXbPumAPFlGBwb1+6Wdi9X6gysl+JT21dqu+11ntyPaAUnzjMivwCvsgi
vsBASyebtgMYg.y1mWqa9Vt4IDDbR1Ob.1G6c1WusqtaaE5ClOyJ9X3mdW6m
3OMLlz9c97ljlpr6FSdfLBBwhADRZNfm.DEJWLIDMDUWtRk+bQjwzPXRP0PF
Y.Fr6.Baw.jZUwZzpxlqCblM4TXiGaTMDTWYS4Pa6gSlK0auop77U4koe896
9kkEMWljpdft9i12yypaL63m+Hai5aPc12MWIVa8aTcD8UWjrAX7+oJK4fv1
IPQLUZzXnBtQ+ADZh3CXNMN99aVZYt4ASOMJx1e5pxagqy6BO+6FUSUIhf9P
RZS1MJDdX0pUIEWMEyK3vHiXFQX.FLl6pbFW5cKvoka1nJZdLh7MTZYY0Z2U
lZECD.iArIGLiFZOaxndWcZP17tmKahk3K3ShMoxkiMuIIem5YvjB7D2Kobu
yjs2mUppwCGbLczwYXFtmvK24P.nCI61yxPUqI0FU0mUEIqfa0KavnSD2cvf
D+SHX.pBS.LDKLXLfqtsIoeEEp+m64WPGCf5RmhPM90DBmAHh28tsQUWmbk5
TAK1nitB8VB5sTTq0QzawtCGi5smDB4SBN8wXmipjfWz7sRKKtoL+F0blgEN
jXPCJ2HiD4dFV9OehYsDCXhzvorHHXOoQEAGGMPPwQKZ9ivlIBitLuLogRPT
DcNSmDiiDPoV.Xf3bDvjWWa+BlwyofQgDin5UDsNvoSIR7Zt.SbLvdPdetW8
jXxKf5K8GgWHl0ZLEAkeKpKXBhynj+KB4fYMrVkm7MTS1FE52tTGaR8u6dND
xPiNOSLwLHhjupxfHl0iccOJwnnWFQI9NDsWUsdDBk0JtLAo+3tnffRevbtd
hQzEyDQ1kn2lgQ+8e0Foby0pByJUdsB09XI5U92cQDT.MbDEhRvY7QrbNPdO
hF5e4iPS3gPTRT2Ye9hIdz5+rbWOaldwCZTKypSzF7P3L2ubJGMnLT1369mO
pPvnZAsa7D9dSnDIDyjC3.Y47PZRhjfpTISnDhwhtvfMrHT0bW7LJv92y3PM
W4Hwcq4QZDzAIHYXnSqNUx6nEiGmgPh63dLYhgDyE9WkdFi8oicIb4zB9gyd
YD7iQVWU054+ICBpWa1FwbG2BPCar8Csf1IPaHycrIiKq10zTVLsNfYg7ANB
x5gs+SqYU7B5ga0Xd3LHvzcwIvfkenhoTmUTXxE1E2spesRgtsJqokQZyWto
zcWA5APQ2or17A0GHlRi3TeAi8tuf4RRuaTRvPsvcWRmIVLI8um2F9xUSKYF
xnkCBDqwl8Y288w7eXbC1Nf1691cMckC02cBPHgjdgAuR5be.XgupBBnKGtt
nZmPGT+Ipog6syGxlHVrv8LbPEjrh0pYPy.x2kPfVp5bGCojELe+GLZY9of4
PENvfPh6UCitbCjY8CxMx9AqqKZVSXeL26.pbwR96P6vpUMsdqBQLT50IUHA
q8GiCmy1hw6lKrtYsIJ1YbhuXk8aIFTNpIa433INkbxkaLBRyKqU90l39VDG
ByRoyELId439xspBOy7gxCa8T2q.p2484c3.5FXFRWQhLN.zE8+jMGltniFv
UUIql0QA.3cL7RI3dwtssKnv8KHOq3Qu5IFpR+KNBipK2UktmS22xUTOJasp
tMM3jlr1LGObUlWOj9WVY0513JGeGxZZv7pE7zDgdBBzWkOdh522CKXaCgg8
xizRf9Xn3dfF6EflYEPi8FPaGWKGhoC8.InCfvRd1S6zD61oYy3Nskz.aFwc
pUpX5Jgnw88W0lr0aKyJZp2GjcnofZvHkzekW1prBkzuLGymzo06S3YRTgXk
JJwalD3xk84QDV8772CTOl7V9.I9gCis9A5GKbDqjZI9yMhUhLGIGejkDln2
qxBle3EawO17e9DHMDOmTnd3qdZRTOkIiXNNlyOzHuXx8K7CEZkdCKbTTLtq
raPAYhBOrxKzndHj5iOmlFOlSNhFkht1CA0Bf2MbDmZUG2DF4S5+4hwQR4rh
wVEvFMdb4.XfcwXPTs2J+PiVIqJHiiiTxg5fD0FaSDGV4GRjXsB+HjXGUE2G
F8lFufXi1DULNIxgQpBluPAz5UouLaJd9F1I334zvN2JRj+DNeh6nQSb18W4
GZzp7N4Q1PijXZOZzrxOznU5KGyICPiTSzF8W4GZzpX04DqnQRXebj3K4Qlz
579FlF6Tiwwgve0JflIcxUX48Wo+n+m1nzSP+TBjoJC9qt.TPvGtJVb7J+P+
BOf+LIsed2cXr2x6lYWBDi6apipHwF.jGSuekenwPOPi6CNZlnQpU4cHDVEp
DGBmCBtyrxOzncxiRqrGvX8bxaV4GZzGxicZxy0dM1JenQ9qF4VEE9nHx9Xc
hk8i7IVdwbUMM6JkkzFhlf6LOe+BKp+VWGYR1t8FUUc2iEH2fMIeoz7khNGV
mU.qgW77fJ0MY6+JveUXBRpRuNqQk1rqB5VzchtoHNXSYKITrKaOUXng1O9e
.8f5j6B
-----------end_max5_patcher-----------
</code></pre>

Assignment 1 – Will Walters

 

The above video is the result of a beautifying filter from a photo retouching app (Meitu) being applied to the same picture (of myself) about 120 times. The filter itself was the one the app called ‘natural’, set to be as subtle as possible. (A slider in the app alters how drastic the changes are.)

To follow the themes presented in the in-class examples, this project was meant to demonstrate certain fundamental artifacts and assumptions in the process itself – in this case, what does the app consider beautiful? We see in the final result of the process those artifacts extracted and brought to their extremes.

The music on the video doesn’t have anything to do with feedback loops, I just thought having it in the background was funny.