The final stage of building the Mutable Instruments Edges module was to program the chip. It's a different kind of chip than I have used before, so I had to figure out what software and hardware would do it. The following is what I did that got it to work. I don't really understand some of it, but as I get more info, I'll fill this in.
Old friend of the Cranes, Jennie Vee, has released a video with some of my animation in it.
Great song, a collaboration with John Fryer, who coincidentally once engineered a remix by Ivo Watts-Russell of a Cranes song.
The animation footage was nicked from an old video for the song Inescapable; there are even a few frames of MY HAND playing the violin in it, ha ha.
We first became acquainted with a very young Jennie when she was too young to get into our gig at Lee's Palace in Toronto and stood outside the fire exit with her frozen ear to the door.
I've been experimenting a lot with my new Oculus Rift, trying to get my head around what VR actually is… or what I want to use it for. I'm getting somewhere with that, but this post isn't anything to do with that. My "meet-up" group had a special arrangement yesterday to check out the Microsoft HoloLens.
It was admirably demonstrated by group member Edd Smith, who explained that although Microsoft are billing it as a "Mixed Reality" device, it's really just AR (augmented reality) with glasses. Speaking as someone who has always been a bit *meh* about AR, I was very interested to see whether it being full-face and immersive with the real world would help… and whether that would bring me any ideas for the USE of AR, apart from dancing dinosaurs on the breakfast table or pop-up directions on the road.
First let me outline my misgivings about AR. Firstly… there is already too much visual NOISE in the world, and I really do NOT want to add more, permanently, to my field of view. I don't want ads popping up as I walk past shops, I don't want TV touch controls on the sofa cushion, and I don't want "helpful" dancing paper clips telling me how to use a kitchen appliance.
AR has always been about quite cheesy and ad-centric things. We've even considered its use at work, for creating a "virtual" teacher that could pop up around the workplace and give you tasks, quizzes, directions and rewards… It never happened though. Just too expensive and, yes… cheesy.
But then along comes Pokemon GO!, and the world goes AR crazy…
Firstly, Pokemon GO! is a great game: well thought out, interesting features and a new application of… AR??? NO… a new application of GPS. It's NOT an AR game AT ALL… OK, it's cute that you can turn your camera on and SEE the Pokemon in the living room/street/playground; however, AR is not CORE GAMEPLAY. It's NOT needed… in fact, I noticed that my son Nikolas has turned off that feature and just uses the pre-generated backgrounds (maybe a battery-saving action). NO… the REAL gameplay is in the unique use of GPS. The fact that you MUST walk to hatch an egg is genius, that you must wander about to grab them poke-balls is fab, and that you can fight against others (anonymously) at Poke-gyms and leave your "winner" there is super-cool. There is no doubt that P-GO! is a great game… but it's got nothing to do with AR… it didn't need it!!!
So yeah… AR.. Convince me…
So I'm going to put on these $3000 glasses and see 3D objects in the room with me? I'll be able to make gestures in space to interact, and I'll be able to speak commands to it?
Well not so much.
Firstly, the field of view is TERRIBLE… it's a tiny 30-degree square in the middle of your field of view, so things disappear over that edge ALL THE TIME… the only way to look at things is to stand a long way away, or go in close and only look at one small thing. This to me is the biggest drawback: if it's not FULL SCREEN, it's basically useless. There is no difference, to me, between looking through glasses at a tiny square of AR and holding up an iPad in front of my face.
Secondly, and I admit I hadn't thought of this… no BLACK! So no shading and no shadows! OF COURSE! The view is NOT a screen but glass (that's the difference from iPad AR, where the background real world is ALSO an image, from a camera; on the HoloLens it's see-through glass), so all images are bright colour overlays on reality, not additions to reality. Now, it's pretty good, but you won't see a SOLID object; you will see a transparent object.
I must say this is also such a big immersion killer for me that I might have tried, for example, dark glasses with a default level of "dark"… then use brightness to bring reality up to normal, add objects, but still be able to NOT add brightness to areas that should be shadows. But I'm sure they can't NOT have thought of that, so it's probably unfeasible.
Thirdly and fourthly, the input methods: gestures and speech. Speech seemed to work pretty well despite a noisy room, but I could NOT get gestures to work AT ALL!!! I must say this is probably just practice, as Edd was having no trouble.
OH, and fifthly… THREE THOUSAND DOLLARS??? OK, I get this is just for R&D folk right now, trying to get in on the ground floor in case it's the next big thing, and it may be… just not now, with THIS tech… it's just not very good. The future might be interesting, but if it's just gonna be all about advertising, architectural modelling and directions… I dunno… someone has to come up with something REALLY cool for this to be useful or worth it. For example, I've seen a pretty interesting video about controlling the Behringer DeepMind synthesiser with the HoloLens… could be good… then again, it's an expensive add-on for an average synth… as in about 5 times MORE expensive than the synth… anyhow…
Pokemon GO! is gonna be fucking great on it though!
I’m turning Subscribing off… too much spam.
If you are one of my 3,567 "subscribers" who is a real person and not a crawling spam bot, and you are genuinely interested in getting notifications of my junk… ha ha… let me know and I'll add you manually…
Here's a film made during the summer of 1985, by me and my friends Simon Ginns and Cathy Barrow. Believe me, back then ALL films looked like this…
I'll just put this here… just in case.
The first of my VR experiments is almost finished. The Window is a one-room claustrophobic experience for Google Cardboard. I like the way that VR has kind of brought back the idea of "the short game"! Wearing a Cardboard device for more than about 5 minutes is painful and impractical, so we have to make short interactive "ideas" fit that format. It brings to mind the typical quirky YouTube clip, or before that the single-floppy "demos" on the Amiga that I loved so much… had a big collection… oh wait a minute… and before that… THE SINGLE!
What is it with that time slot? 4 or 5 minutes seems to be a nice little slice of time?
Anyway, The Window explores ways to move around with no interface or controller (not totally solved). It has a few ways to interact with things: a phone, a door, a button and an electric guitar. Things change, but will you notice before it's too late? A little environmental in message, and quite dark… in a Stanley Parable kind of way…
Coming soon on Apple's App Store… not free, but "virtually free", ha ha.
For editing the OSC codes for the TouchOSC interface with the Missing Link box.
Here's the official
But just in case, I'll put this here:
After successfully burning the software to the ARM chip on the Clouds module, I thought I'd better collect all the instructions in one place. I still haven't gotten Braids to work, but I'm not deterred; I'd still like to make Frames, Grids, Edges and… er, that new morphing modulator one… forgotten…
Anyway here it is…
Go to the MI git repository at https://github.com/pichenettes/eurorack
and click DOWNLOAD ZIP (unless you know what you are doing with GitHub, which I don't)
This folder is your MASTER directory for MI building. It is called:
In it you will see a loose file called
In order to run Vagrant, your computer must be able to see that file, which means navigating to this folder via Terminal.
Starting Vagrant (these instructions copied from https://github.com/pichenettes/mutable-dev-environment)
It's hard to describe how to do this without knowing where you have saved your folder, but you need some knowledge of command-line navigation. Either:
you know it
you go to https://www.codecademy.com/ and take the Learn the Command Line course (recommended)
you wing it… here's some hints:
Command line 101
ls to show a list of the files and/or directories in the current directory
cd to change directory, so
cd Documents will put you in the Documents folder, then you can run another
ls and see what the contents are. Hopefully you saved the mutable stuff there, and all you need to do is
cd mutable-dev-environment-master and you are in the right place
cd .. to go up one level, just in case you go into the wrong folder
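Putting those hints together, a typical session might look like this. This is just a sketch: the scratch setup at the top only exists so the commands can be tried safely anywhere, and the folder locations are assumptions about where you saved things.

```shell
# Scratch setup (hypothetical layout) so the navigation below can be tried safely
demo="$(mktemp -d)"
mkdir -p "$demo/Documents/mutable-dev-environment-master"

cd "$demo/Documents"                  # change into Documents
ls                                    # lists the contents, including the MI folder
cd mutable-dev-environment-master     # step into the MI master directory
pwd                                   # confirm where you are
cd ..                                 # back up one level if you took a wrong turn
```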
Type the terminal command to start up Vagrant. That is:
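Assuming the stock setup from the mutable-dev-environment repository, the standard Vagrant start-up command, run from inside the master folder, is:

```shell
# Run from inside the folder that contains the Vagrantfile
vagrant up
```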
The first time the VM is started, all the tools will be downloaded and the latest version of the code will be grabbed from GitHub. The process takes about 15 minutes, depending on the speed of your internet connection and computer.
Then log in to Vagrant (again in the terminal, NEVER in the black terminal-lookalike window that VirtualBox throws up; that was a mistake of mine). Type:
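Again assuming the standard Vagrant workflow, the login command is:

```shell
# Opens a shell inside the running virtual machine
vagrant ssh
```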
You then need to change a few lines in one file; this tells it to upload via ST-Link (I think).
So look in the main eurorack-modules folder,
open the stmlib folder
Then try to open makefile.inc
You probably need to use a very simple text editor, like TextEdit. It won't recognise the .inc filetype, but it will open it.
The relevant lines should look like this (ignore those starting with #, they are just comments):
# Supported: arm-usb-ocd, arm-usb-ocd-h, arm-usb-tiny-h, stlink-v2
# PGM_INTERFACE ?= arm-usb-ocd-h
PGM_INTERFACE = stlink-v2
# hla for stlink-v2 ; jtag otherwise
# PGM_INTERFACE_TYPE ?= jtag
PGM_INTERFACE_TYPE = hla
Leave no spaces at the end of lines
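If TextEdit fights you, the same two changes can be made from the command line with sed. This is only a sketch: it assumes the stock makefile.inc still assigns these values with `?=`, and it is demonstrated here on a scratch stand-in rather than the real file (for real, you would run the sed command against stmlib/makefile.inc).

```shell
cd "$(mktemp -d)"   # work in a scratch directory for the demonstration

# A stand-in for the assumed stock makefile.inc lines
cat > makefile.inc <<'EOF'
# Supported: arm-usb-ocd, arm-usb-ocd-h, arm-usb-tiny-h, stlink-v2
PGM_INTERFACE ?= arm-usb-ocd-h
# hla for stlink-v2 ; jtag otherwise
PGM_INTERFACE_TYPE ?= jtag
EOF

# Swap both settings over to the ST-Link programmer; keeps a .bak backup
sed -i.bak \
  -e 's/^PGM_INTERFACE ?=.*/PGM_INTERFACE = stlink-v2/' \
  -e 's/^PGM_INTERFACE_TYPE ?=.*/PGM_INTERFACE_TYPE = hla/' \
  makefile.inc

cat makefile.inc   # check the result, and watch for trailing spaces
```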
Now back in terminal
You can begin to build things
Do these one at a time.
to clean the area reserved for the bootloader
make -f clouds/bootloader/makefile clean
to clean the area reserved for the software
make -f clouds/makefile clean
to build the bootloader
make -f clouds/bootloader/makefile hex
to build the software
make -f clouds/makefile hex
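The four steps above can be strung together in one little script. A sketch only; it assumes you are logged into the Vagrant box and sitting in the directory that contains the clouds folder, with the ARM toolchain already set up:

```shell
#!/bin/sh
set -e   # stop at the first error instead of ploughing on

make -f clouds/bootloader/makefile clean   # clean the bootloader area
make -f clouds/makefile clean              # clean the software area
make -f clouds/bootloader/makefile hex     # build the bootloader
make -f clouds/makefile hex                # build the software
```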
That should be it, but there’s an option to do both at once.. not sure if it worked for me
to upload both in one package to the chip
make -f clouds/makefile upload_combo_jtag
Also found in the makefile…
make -f clouds/makefile upload_combo_jtag_erase_first
Something gbiz suggested when trying to get Braids to work…
openocd -f interface/stlink-v2.cfg -f target/stm32f1x_stlink.cfg -c init -c halt -c "flash write_bank 0 build/braids/braids_bootloader_combo.bin 0x0"
“It’s most of the command that make runs to flash the binary, less a couple of configure options. I doubt it’ll help, but it’s worth a try.”
and I don't think it did…