I have always been interested in Live Visuals, ever since the good old Uncle Ian days, when we projected films and used video monitors with the “singer” on them. We had parties where we covered the walls with Super 8 projections. At one time in Portsmouth I was regularly projecting across the road onto the white house opposite. My end-of-term dissertation at Portsmouth Art College was on synesthesia, the art form that attempts to fuse audio and video into a whole (not the ability to smell Blue and see G-sharp, that’s madness!).
More recently I have played “live” over the internet using the Furthernoise.org tool called Visitors Studio, and experimented with Jitter, Modul8, Resolume and Quartz Composer.
I have the original “Expanded Cinema” book by Gene Youngblood, and it is still my bible in a way, especially the naive optimism that “tuned in and turned on” freaks had about the future.
Anyway, this is just a short introduction to a collection of posts documenting my re-awakened interest in producing Live Visuals, specifically visuals linked to or generated by music.
Today I attended a course on Modul8 with Ilan Katin. He is a bit of a guru with the program, and is paid by the developers as a sort of regular walking advertisement.
Modul8 did not immediately grab me when I did my program comparisons a while back; instead I was drawn to the maybe more professional-looking Resolume Avenue.
This is because I am very interested in Quartz Composer too. And the great thing about Resolume (stupid name, I’m gonna call it Rezz from now on) is that it imports QC files, and by adding an Input Splitter patch in QC for each parameter you want to tweak live, you can get control of your QC patch in Rezz. Which is very nice.
Now, having said that, at the hands of an expert Modul8 was amazing, and the course was great too; he quickly ran through the basics, but got deep pretty quickly. Funny guy: he said right out what things he likes and what things he doesn’t, and whenever he said he didn’t like something, I slightly suspected that Modul8 didn’t DO THAT too well!
I was bowled over (and here come some screenshots) by the ability to make a 3D mesh out of any image, based on a grey-scale height-map (whiter means higher).
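That height-map trick is essentially displacement mapping: pixel brightness becomes vertex height. This is nothing to do with Modul8’s actual implementation, just a minimal sketch of the idea in Python with numpy (the function name and the `scale` parameter are my own inventions):

```python
import numpy as np

def heightmap_to_mesh(gray, scale=1.0):
    """Turn a 2D grey-scale image (values 0-255) into a grid of 3D
    vertices: x/y come from the pixel position, z from brightness
    (whiter means higher)."""
    h, w = gray.shape
    ys, xs = np.mgrid[0:h, 0:w]
    zs = (gray.astype(np.float32) / 255.0) * scale
    return np.stack([xs, ys, zs], axis=-1)  # shape (h, w, 3)

# A tiny 2x2 "image": black pixels stay flat, a white pixel is
# lifted to the full scale height.
img = np.array([[0, 255],
                [128, 0]], dtype=np.uint8)
mesh = heightmap_to_mesh(img, scale=10.0)
```

A real VJ tool would then feed those vertices to the GPU as a triangle strip; the sketch stops at the vertex grid.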
The built-in particle effects and the audio analysis tools impressed me too, especially on the 3D-ified images…
I discovered that Modul8 DOES accept QC files, and one of mine popped in OK, but it didn’t LOOK that good, kind of over-contrasty. I need to check that, it might just be settings. But none of the controls are anywhere to be found. Ilan said, “Well you can, but it’s difficult…” Hmm, maybe it doesn’t work…
Also, he kept saying things like “Oh, that’s a bug, dunno when Jason (made-up name, can’t remember) will fix that”, or “Yeah, it’s always done that with me too”, or “Hmm, that’s just a quirk of the software”…
Well, that’s exactly what I think/thought of Modul8 actually… cool but quirky…
There is a Mad Mapper course tomorrow that I now wish I had gone to. Mad Mapper is a pretty simple program, but it’s not really the software you learn about at these things, it’s seeing what other people do with it… Video mapping is pretty new to me, but it’s certainly intriguing, and if I can get a projector for my birthday (hint hint Hilde!!!) I might have a go at it… I wonder if there are any battery-operated projectors… It would be great to go out and do stuff in the snow!
To research VJ visual mixing programs for use in a new musical project.

Background
I have always been interested in live visuals, both from a psychedelic hippy point of view and from a more art-oriented and academic point of view. I’m very interested in translating sound into picture, and in the visual effects of synchronization and tightly linked audio and video experiences. Indeed, in my texts section, above, there is a long article I wrote at Art College in 1989 about synesthesia.
With this new looping band (all of us are live-loopers, and we will synchronize all our equipment so we each add small amounts to create a rhythmic whole), we would like to link what we do to a looping projection generated live, FROM the music, with little or no user interaction.

Research
I met with my friend Grete, who is a professional VJ called EYEBORG and has worked at Ministry of Sound and Renaissance, to talk about software and techniques. She told me about three packages, and a little about each. They were:
Having checked all three packages, I want to make one thing perfectly clear: all seem great, and all have lots of fun possibilities. I am certain that once you get deep into each of them you will find lots of interesting things. I was only really testing for two things.
1. I really can’t be bothered to learn a really complex program that requires all sorts of online tutorials and takes ages before I get results. I want “out of the box” results: I want to see immediately how it works and how I might use it. I will NOT shy away from many days/hours working on the visuals, but I don’t want to WASTE time in the wrong software.
2. I am ONLY testing the products’ ability to GENERATE visuals, not play back short quirky video clips. This is a biggie: almost all VJs basically do just this. I’m not in the slightest bit interested in Flash-made mandalas, or bouncy texts, or trippy zooms, or, god help me, FRACTALS!!! YIKES!!
No, I need this to analyze the music (volume, frequencies, whatever), take in MIDI clock and maybe MIDI notes, and use them to create visuals on the fly.
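As a rough sketch of the kind of analysis I mean, here is per-frame volume (RMS) plus low/mid/high frequency-band energy in Python with numpy; the function name, band boundaries and sample rate are all just assumptions for illustration, and the resulting numbers are exactly the sort of thing you could map onto visual parameters (size, colour, speed…):

```python
import numpy as np

def analyse_frame(samples, rate=44100, bands=(200, 2000)):
    """Crude per-frame audio analysis: overall volume (RMS) plus the
    spectral energy below, between and above the two band cut-offs."""
    rms = float(np.sqrt(np.mean(samples ** 2)))
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    low = float(spectrum[freqs < bands[0]].sum())
    mid = float(spectrum[(freqs >= bands[0]) & (freqs < bands[1])].sum())
    high = float(spectrum[freqs >= bands[1]].sum())
    return rms, low, mid, high

# 1024 samples of a pure 440 Hz tone: the energy should land in the mid band.
t = np.arange(1024) / 44100.0
rms, low, mid, high = analyse_frame(np.sin(2 * np.pi * 440 * t))
```

MIDI clock sync would come in separately (24 ticks per quarter note), but the audio side really is this small.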
Oh, and I was only testing the demos, available on their sites for free download. Maybe the full versions bring in more cool shit!
I am having lots of fun designing the video projection for my 30-minute performance at Y2K-X in Santa Cruz in 2 weeks’ time (Sunday 17th, 10.30, Pearl Alley Studios). As I have decided that something in the performance will be influenced by psychedelia, I am trying to (dunno what I mean by “trying”… it’s easy…) make a trippy 30-minute film.
I’m working with the strobe effect that I was experimenting with in video back in the ’80s, and re-discovered for the “driving film” Vang Drive (see prev post). The theory I was working on was that if you flash the frames in negative/positive in front of the eyes, the ends of your eye rods don’t have time to acclimatise themselves to being black, for example, before they are given white to process… this makes vision difficult and unstable. THEN, if you introduce a static, NON-strobed element to the composition in front of the flashing, it will kind of float out of the screen IN FRONT… this is because the eye will try to give this element precedence over the instability of the strobing.
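The positive/negative flicker itself is trivial to mock up in code. This is just a minimal sketch of the idea on greyscale frames as numpy arrays (the function and all its names are my own, nothing to do with any particular video tool): every odd frame is inverted, while a masked “static” element is pasted on top of every frame un-inverted.

```python
import numpy as np

def strobe_frames(frame, overlay_mask, overlay, n_frames):
    """Alternate a uint8 greyscale frame between positive and negative
    on every frame, keeping one masked element static (never inverted)
    so it appears to float in front of the flicker."""
    out = []
    for i in range(n_frames):
        f = (frame if i % 2 == 0 else 255 - frame).copy()  # invert odd frames
        f[overlay_mask] = overlay[overlay_mask]  # the non-strobed element
        out.append(f)
    return out

# Tiny demo: a mid-grey frame with a black static square in the middle.
frame = np.full((4, 4), 200, dtype=np.uint8)
overlay = np.zeros((4, 4), dtype=np.uint8)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
frames = strobe_frames(frame, mask, overlay, n_frames=4)
```

The background flips between 200 and 55 from frame to frame, while the central square stays black throughout.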
Anyway, I started out with such lofty ideas… 15 layers of flashing later (I’m using stock footage as source material, mostly a huge collection of grunge film, leader film, scratches etc.), it’s looking LOVELY…
So now I’m trying to render it… except that the fucking thing keeps crashing… damn it! It says I have some frames out of bounds and 6k pixels or something, except I don’t… unless it’s doing something behind the scenes… Grrr.