This post walks through the process of completely bypassing those steps. After the walkthrough, you should be able to:
I have not tested this walkthrough…
Let me know if anything breaks or doesn’t work, and I can help you fix it.
This assumes you have git installed, as well as the Xcode command line tools.
If you're not familiar with them, you can just ask me or something.
You can follow the confusing version here, or…
OS X users can install the toolchain with Homebrew:
brew install cmake
brew tap PX4/homebrew-px4
brew update
brew install gcc-arm-none-eabi-49
arm-none-eabi-gcc --version # should now say v4.9.x
brew install dfu-util
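To sanity-check the flashing tool as well (a quick, optional check; dfu-util should report its version if the install worked):
dfu-util --version # should print the dfu-util version, e.g. 0.8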
If you’re on Windows… may the odds be ever in your favor…
No Mercy :*(
The firmware contains the code that makes the Photon operate: the libraries, routines, etc. You need to download the firmware so your computer has access to all the files necessary to compile the binaries that will be written to the Photon's memory.
# working_dir is whatever directory you want to develop in
cd "<working_dir>"
git clone https://github.com/spark/firmware.git
cd firmware/modules
git checkout latest
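To confirm the checkout worked (a minimal, optional check, assuming the clone above succeeded):
git status # should report that you are on the latest branch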
The directories should look something like this
This is a trick I found recently: if you add the following environment variables, make will automatically put the Photon into DFU mode.
# /dev/tty.usbmodem12345 is the USB-connected Photon port
export PARTICLE_SERIAL_DEV=/dev/tty.usbmodem12345
# Set the platform default
export PLATFORM=photon
If you want them to stay after you close the terminal, ask someone how to add them to your bash profile.
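Here's a minimal sketch of what that looks like, assuming the same device path as above (adjust it to match your own Photon):
# Append the exports to ~/.bash_profile so they persist across terminal sessions
echo 'export PARTICLE_SERIAL_DEV=/dev/tty.usbmodem12345' >> ~/.bash_profile
echo 'export PLATFORM=photon' >> ~/.bash_profile
# Reload the profile in the current shell
source ~/.bash_profile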
This will clean and build the file at firmware/user/src/application.cpp, which should contain the default Tinker app. It will build the system firmware as well as the app.
PUT THE BOARD INTO DFU MODE
If you didn't put in the env variables above, or it didn't automatically put the board into DFU mode (orange-ish yellow), or you just wanna do it manually because you're a badass and nobody tells you what to do…
make clean all PLATFORM=photon -s program-dfu
# This will transfer the code to the photon
make program-dfu
# EXAMPLE OUTPUT
[snow@snow modules]$ make all PLATFORM=photon APP=test program-dfu
/Users/snow/Documents/dev/hardware/particle/firmware/modules/photon/system-part1/makefile
/Users/snow/Documents/dev/hardware/particle/firmware/modules/photon/system-part2/makefile
/Users/snow/Documents/dev/hardware/particle/firmware/modules/photon/user-part/makefile
/Applications/Xcode.app/Contents/Developer/usr/bin/make -C /Users/snow/Documents/dev/hardware/particle/firmware/modules/photon/system-part1/ all program-dfu APP=test PLATFORM=photon
.....
..... Cut out the middle man .....
Copyright 2011-2012 Stefan Schmidt, 2013-2014 Tormod Volden
This program is Free Software and has ABSOLUTELY NO WARRANTY
Please report bugs to dfu-util@lists.gnumonks.org
Suffix successfully added to file
Serial device PARTICLE_SERIAL_DEV : not available
Flashing using dfu:
dfu-util -d 0x2B04:0xD006 -a 0 -s 0x80A0000:leave -D ../../../build/target/user-part/platform-6-m/test.dfu
dfu-util 0.8
Copyright 2005-2009 Weston Schmidt, Harald Welte and OpenMoko Inc.
Copyright 2010-2014 Tormod Volden and Stefan Schmidt
This program is Free Software and has ABSOLUTELY NO WARRANTY
Please report bugs to dfu-util@lists.gnumonks.org
Opening DFU capable USB device...
ID 2b04:d006
Run-time device DFU version 011a
Claiming USB DFU Interface...
Setting Alternate Setting #0 ...
Determining device status: state = dfuIDLE, status = 0
dfuIDLE, continuing
DFU mode device DFU version 011a
Device returned transfer size 4096
DfuSe interface name: "Internal Flash   "
Downloading to address = 0x080a0000, size = 6812
Download  [                         ]   0%
Download  [                         ]   0%
Download  [===============          ]  60%          409
Download  [=========================] 100%         6812 bytes
Download done.
File downloaded successfully
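If the flash step can't find the board, you can check whether the Photon is actually showing up in DFU mode; dfu-util can list DFU-capable devices (the 2b04:d006 ID matches the output above):
dfu-util -l # a Photon in DFU mode should show up with ID 2b04:d006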
You can use any/all of these targets: clean, all, program-dfu.
make
make clean all program-dfu PLATFORM=photon -s APPDIR=~/app_name
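For reference, and this is just my reading of how the makefiles treat these variables (double-check against the docs linked below): APP builds an app that lives inside the firmware tree, while APPDIR points at a directory outside it.
# Build an app located at firmware/user/applications/<name> (test, as in the example output above)
make all PLATFORM=photon APP=test program-dfu
# Build an app located anywhere on disk (~/app_name here is just a placeholder directory)
make all PLATFORM=photon APPDIR=~/app_name program-dfu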
SUPER MEGA ADVANCED
To Escape the Hardware Abstraction Layer (enabling direct hardware calls)
make clean
make APPDIR=~/app_name SPARK_NO_PLATFORM=y
There are more detailed docs at:
The GitHub wiki. It's hard to navigate, and I imagine it's a bit confusing.
The Getting Started page and other potentially helpful GitHub wiki pages.
CHOSEN: I chose this robot because it has very good documentation and because it is one of the many sumo bots that use extreme speed as their main weapon. Additionally, everything on the robot is mounted to a single base plate, which is our intended sumo bot construction technique.
CRITIQUED: The designer made a very effective sumo robot which did very well in several competitions. The design was steadily iterated over time in order to achieve this bot. Also, they included multiple sensors to find both opponents and the edge of the arena. However, the robot seems to only really smash straight forward at the beginning of the match, which may allow random factors such as traction in different parts of the arena to affect the outcome of the match. It would have been nice to see more use of the sensors intended to track the other robot.
RELATED: Like many robots of this size, it uses extreme speed and a low-angled blade to knock its opponents out of the ring. Other examples are:
LINK: http://web.mit.edu/jlramos/www/sumo_robots.html
CHOSEN: I chose this robot because it is one of the few small sumo robots to use an attack other than ramming to get its opponent out of the ring. Additionally, the designer was able to fit a lot into the small package (including an air tank, a large LiPo, motors, etc.) while keeping it under 1 kg. This could help us figure out how to keep our sumos small as well.
CRITIQUED: The designer did a very good job of fitting a lot into a small space while staying beneath the 1 kg weight limit for the competition this robot was entered in. The choice of flipper actuation (pneumatics) also proved to be very strong and effective in the ring. However, this robot's main weakness in the ring was its sides. Ultrasonic sensors on the front and back allowed it to avoid enemies from those directions, but although the enclosure has holes for side sensors, those sensors were never added.
RELATED: This robot uses a similar lifting mechanism to larger battle bot robots. Some examples are:
LINK: http://www.instructables.com/id/Autonomous-Sumo-Battle-Bot-with-Pneumatic-Flipper-/?ALLSTEPS
Chosen:
Unlike the other two drawing machines, I chose this piece for its deeper contextual implications. While other drawing bots (and, really, artists in general) create impermanent work, this project challenges the notion of longevity and preservation that artists often face when creating a work. The drawings created by this initiative have the potential to survive for millennia, which also raises the question: what image is worth preserving that will outlast, possibly, humanity?
Critiqued:
Although I love how the Moon Arts team is challenging public perception of art by, literally, going where no artist has gone before, I dislike how so many images were chosen as part of this series. In Spring 2015, an open call was put out for 10,000 images, and over 9000 artists responded. With an open call like this, it’s natural to assume that many were inspired by fleeting ideas and current trends. In fact, in ten years, I wonder how many of the drawings will hold the significance they do today.
Related:
This project was spawned from the collaborative efforts of the Moon Arts Project and NASA to orchestrate a cultural mission to space. Another project, to be completed in 2016, is the launch of a physical sculpture, "The Moon Ark," which will exist in space for, potentially, billions of years.
Using an actuator-based reel system, artist Harvey Moon is able to draw complex images with this machine.
Chosen:
Like the first piece I chose, I loved how this drawing machine uses simple motion to create dynamic results. The pen is controlled by two motors, but through careful calculation, the paper becomes a grid that is then used to precisely place each pen stroke.
Critiqued:
Like all drawing machines, the performance is initially captivating to the viewer, challenging them to speculate what the image is before it is completed. Something I dislike, however, is the simplicity of the performance. The motion very directly correlates to what is being drawn. Unlike the sand drawing piece where the viewer is left to wonder how the pendulum motion is able to create a pattern, the viewer very plainly sees what is happening as it occurs.
Also, since the final drawing can be completely removed from the method used to create it, I can't help but wonder if the work is a performance or simply a stylized printer.
Related:
The cable mechanism in this piece is similar to the "Four Cable Drawing Machine," in which artist David Bynoe moves an object across a bed of sand to create simple patterns.
http://dbynoe.blogspot.ca/2013/12/four-cable-drawing-machine.html
Project Video:
The Drawing Machine: Harvey Moon from Make: on Vimeo.
I recently went to a show opening at the Miller Gallery by the name of “Maximum Minimum In Unum.” One of the pieces displayed was “Entropic Order” by Laleh Mehran.
http://lalehmehran.com/Entropic-Order
Explained:
In this work, a pendulum-like device is used to draw patterns into a bed of sand.
Chosen:
I thought this was a unique take on a drawing bot. Instead of having a miniature car-like mechanism that is programmed to create patterns, this piece facilitates an elegant performance in which the patterns serve as both a product and a record of motion.
Critiqued:
While I love the simplicity of the motion, I am still left to wonder what there is to gain from watching a pendulum fill a large grid with the same pattern. While the physicality of the machine is captivating, as the performance continues, nothing unexpected is revealed.
Related:
The patterns found in this piece are derived from the designs used in “Sacred Geometry.” These repeating patterns have been used in the construction of churches, mosques, temples and other places of worship. Such patterns are also not rare in the world of art. Artist and programmer Sakari Lehtonen developed a web app that allows users to generate their own patterns based on the principles of sacred geometry (http://geokone.net/).
Entropic Order from Laleh Mehran on Vimeo.
Explained: A robot kit sold by Parallax, at $135 for a single robot or $225 for the complete kit. It uses BASIC Stamp control boards and comes equipped with servos, QTI line sensors, and other components, all mounted on an aluminum chassis.
Chosen: I chose this project because it seemed like the most similar product currently on the market.
Critiqued: I think the company definitely made a robust product by using aluminum, and it is surprisingly light at 400 grams. While there is room to change the circuitry and hardware, the shape and look of the robot seem limited. It's a 10x10x10 box that you have to work with.
Related: The company doesn't really say what inspired the SumoBot, but it seems like they were targeting hobbyists and hardware enthusiasts. I think this probably informed the Zumo robot shield that Adafruit now sells for the Arduino (https://www.adafruit.com/products/1639), but I don't know of earlier hobbyist or commercially available battling robots.
Created by Alessandro G.
EXPLAINED: A machine that draws input data on paper with high accuracy using a swiveling polar arm.
CHOSEN: This project was chosen because it could be modified to clean designated areas of a whiteboard table if an eraser were added. It could even handle specific commands if we decided to have it coordinate with a camera.
CRITIQUED: This project does an efficient job of using a simple angle-and-radius control system while producing very accurate results. I think the documentation could have been better, for example with no outside noises (like a baby) in the background and better camera handling. Overall it displays the concept of the project very well.
RELATED: This work is related to small sized gantry systems. Some examples are:
– https://en.wikipedia.org/wiki/Gantry_crane
EXPLAINED: A drawing robot exhibition in which robots draw visitors' portraits using image processing.
CHOSEN: This project is a good example of combining art and technology, which gives the robots a different charm. The use of computer vision can also be a good reference for our Project 1.
CRITIQUED: It's not the first time people have used a camera to capture a portrait and trained a robot to draw it. However, this exhibition is a good attempt at attaching artistic elements to a robot, making the topic more appealing rather than just an impassive technical demo.
RELATED: There are several related projects that also use OpenCV edge-detection algorithms to extract portraits.
For our Project 1, we adopted a gear-and-pinion control system similar to the project shown below.
The link and video for this project:
http://patricktresset.com
EXPLAINED: A cleaning robot with a vision control system built in.
CHOSEN: This cleaning robot integrates a camera that can be used not only for real-time control but also for analysis and recording. The robot's mechanical design is also a good example of designing for a specific use case.
CRITIQUED: A good feature of this robot is its vision-based control system, which gives it the ability to work remotely with more flexibility. It can also adapt to its working environment by assembling chains in six different directions.
RELATED: The six-chain duct-climbing system reminds me of the wall-climbing robot the professor showed in class, which does an awesome job of omnidirectional climbing.
VertiGo – A Wall-Climbing Robot including Ground-Wall Transition
Link: Follow this link to see the Instructables page for it.
VIDEO: