
Thursday, December 26, 2013

silent night

Our last episode was about bad experiences with iMovie and frustration with the state of licenses for movie soundtracks. I left on a high, though unlicensed, note and told you about my happy times with the Python program 'webkit2png'.

It would have been a fine post with which to end the year if I had included an example. I couldn't figure out how to share the video I made while writing that post and still preserve my anonymity.

I'm back with a new example that pinpoints my location as the northern hemisphere of Earth. I'm comfortable with that.



'Silent Night' was built from 251 ten-second exposures taken at one-minute intervals on Christmas Eve. The images were post-processed and stacked in WebKit with an HTML canvas and getImageData/putImageData.
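
The stacking itself is only a few lines of canvas code. Here is a minimal sketch of the idea, not my actual script; in particular, the blend rule (keep the brighter pixel) is just one reasonable way to combine night exposures, and the names are invented:

    // Stack pre-loaded images (same size as the canvas) onto one canvas,
    // keeping the brightest value seen at each pixel -- a 'lighten' blend.
    function stack(canvas, images) {
      var ctx = canvas.getContext('2d');
      ctx.drawImage(images[0], 0, 0);
      var acc = ctx.getImageData(0, 0, canvas.width, canvas.height);
      for (var i = 1; i < images.length; i++) {
        ctx.drawImage(images[i], 0, 0);
        var next = ctx.getImageData(0, 0, canvas.width, canvas.height);
        for (var p = 0; p < acc.data.length; p++) {
          if (next.data[p] > acc.data[p]) acc.data[p] = next.data[p];
        }
      }
      ctx.putImageData(acc, 0, 0);
    }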

The soundtrack, 'Silent Night', was licensed from friendlymusic.com for $1.99. That fee got me the rights to include the thirty-second track in exactly one personal video uploaded to a 'User Generated Content' web site. My theory is that Google Drive is a 'User Generated Content' web site and that hosting the file there is within the scope of my license.

Google Drive allowed direct access to the video for a few hours last night. That's what I wanted. This morning, Google gives me a fat HTTP 403. Perhaps this has something to do with the copyright police. You ought to be able to get the full video from Google with this direct link. If your browser is a modern one, you can skip the wonky Flash player and just view the file directly below:


Merry Christmas.

Wednesday, December 11, 2013

cinema

Yesterday was a rare whole-family snow day. It was actually that rarer thing -- the nearly snowless snow day. The morning's lack of snow and relative leisure gave me a chance to set up a time lapse rig to catch the expected snowfall.

I shoot most of my time lapse movies with a Canon 40D SLR camera. I love the camera, but my time lapse machinery is a little unsatisfying. The camera is about six years old. My Mac appears ready to do tethered shooting with many camera models, Canon included, but not this one. I think the model was too new at first; now it is too old. Canon probably has a free and unscriptable app for this purpose. I'm sure that grownup photographers have expensive software that makes this all go. I don't use the words 'workflow' or 'post-processing' unironically to describe my modest endeavors. I'm not really interested in a complicated solution.

It is a strange sign of the times that my idea of an uncomplicated solution is an Arduino and a transistor. I bought a cheap wired remote for the camera and hacked it and the transistor onto the Arduino. A tiny sketch runs the project. My original project had no user interface. The delay was embedded in the sketch source. Every new project meant recompiling the shutter sketch. Though this sounds clunky, it is no more complicated than using Apple's own 'Automator' facility for this purpose. Who knows? Perhaps the Arduino IDE itself could be scripted with Automator.

I now use a TFT touch screen shield from Seeed (by way of Radio Shack) as a little user interface. That contraption uses nearly all of the available I/O pins on the poor Arduino. Good news, then, that the actual function of the whole bodge -- connecting two pins from the camera -- requires only one.

This assembly of camera, Arduino, touch screen, and transistor worked well enough to get me 497 pictures of the day's precipitation, with a 30-second interval between shots. The camera's battery was about three-quarters full at the start, and the event wrapped up when the battery gave out. I shot from a Duplo rig.

The shooting took about five minutes to set up and ran unattended for the next four hours. These were, by far, the most pleasant hours of the project. The next four hours boiled away in front of iMovie as I tried to put the frames together. I have used iMovie dozens of times to stitch together little time lapse movies for my kids. It is always frustrating but I usually get a movie out of it. Not so this time.

I hadn't used iMovie as much since it was sent off to a reeducation camp for the redesigned iLife '11 suite. I adjusted my workflow and ambitions downward and kept going. Yesterday was the first time I tried the still newer 2013 version. It cheerfully upgraded my library and then spent much of the rest of the evening crashing.

The basic idea in cinema is that a series of still images, displayed in sequence, can be perceived by an observer as fluid, continuous motion. The difference between a time lapse movie and a real time movie was once maintained by a clockwork camera drive or the evenness of a cameraman's cranking cadence. In the digital world, it exists only in post production. Yesterday's shoot is a 900-fold compression of time: each frame stands for 30 seconds of weather but, at a conventional 30 frames a second, occupies a thirtieth of a second on screen.

iMovie treats still pictures, the most basic and primitive element of movie making, as second-class citizens. For years, you couldn't keep a still picture on screen for less than 0.2 seconds. You can't have a title or effect span more than a single still. Pictures are converted into 'Ken Burns' movie clips by default. The older iMovie '11 let users turn this default off; the newer version has apparently hidden this knob. Both versions let you set the duration of several stills together at one time. The new version seems to have lost the feature that let you crop a bunch of stills to the same box.

iMovie accomplished a lot in the Power Mac G3 days simply by showing that it was possible to do something with digital video on a consumer machine. Job well done. I believed. Even today, a bunch of clips put together in iMovie usually look better than the raw concatenation of the footage.

It is no longer surprising that consumer machines can handle enough data to string together a movie. Apple themselves distribute a version of iMovie for their telephones and tablets. What is the point of iMovie now? What does it demonstrate? I don't know. Many of the signature iMovie effects seem not much better than the HTML5 and CSS demos available from Apple and others.

After four hours in front of iMovie, I decided to find a better way to make time lapses. My only rules were that it be free, simple, and not involve anything called an 'App Store'. I searched for tools designed to let you make movies with HTML5 and CSS3. There may be a good search result out there somewhere; I couldn't find it among the many articles about viewing videos in the browser with HTML5.

A more complicated query found me webkit2png. Webkit2png was described modestly as a command line tool for making screenshots of web pages. It is much cooler than that. It is a simple Python script that uses the Python-to-Objective-C bridge to create a WebKit view offscreen, load a URL into it, and dump the virtual framebuffer to a file. The screenshots have absolutely no browser chrome, just edge-to-edge web goodness.

I wrote a simple HTML file that accepts a frame number as part of the query string and generates a web view filled with that image. Titles are superimposed and styled with CSS. Webkit2png turns each of these rendered frames into a PNG. ffmpeg turns these PNGs into a movie. My titles can fade out over as many frames as I like.
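
Stripped to its essentials, the frame page looks something like this. The parameter name, file names, and fade timing here are stand-ins for illustration, not my actual script:

    <!DOCTYPE html>
    <html>
    <head>
    <style>
      body   { margin: 0; }
      img    { width: 1280px; height: 720px; }
      #title { position: absolute; top: 40px; left: 40px;
               color: white; font: bold 48px sans-serif; }
    </style>
    </head>
    <body>
    <img id="frame">
    <div id="title">silent night</div>
    <script>
      // Pull the frame number out of the query string: frame.html?n=42
      var n = parseInt(location.search.replace(/\D/g, ''), 10) || 0;

      // Each movie frame is one still; pad the number to match the
      // image file names on disk (these names are made up).
      var name = ('0000' + n).slice(-4);
      document.getElementById('frame').src = 'stills/IMG_' + name + '.png';

      // Time exists only as a variable: fade the title out between
      // frames 60 and 90, one second's worth at 30 frames a second.
      var title = document.getElementById('title');
      title.style.opacity = Math.max(0, Math.min(1, (90 - n) / 30));
    </script>
    </body>
    </html>

Because every frame is a pure function of the frame number, a render can be restarted, parallelized, or re-run for a single bad frame.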

There are some rough spots. My movie script is basically a JavaScript program. I use CSS for titles and transitions, but I'm not actually using the browser's animation facilities; each movie frame is rendered by a completely independent WebKit instance, and time exists only as a JavaScript variable. Rendering is slow. ffmpeg turns 720p PNG files into 720p video at about 30 frames a second; I can render only about 2 frames a second with webkit2png. iMovie may have lost its mojo, but it at least has an interface that looks good in the Apple Store. My approach does not, though I think one could be built for Safari with little trouble. Safari has evolving support for 'Web Audio', but this support does not translate directly into a soundtrack story for my movies.
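
A dissolve between two clips, for instance, is not a CSS transition at all; it is arithmetic on the frame number. A sketch, with invented names and timings:

    // A one-second dissolve ending at frame 120, assuming playback at
    // 30 frames a second. No timers and no CSS animation: opacity is a
    // pure function of the frame number n from the query string.
    function dissolve(n) {
      var fade = Math.max(0, Math.min(1, (120 - n) / 30));
      document.getElementById('outgoing').style.opacity = fade;
      document.getElementById('incoming').style.opacity = 1 - fade;
    }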

My shell-based workflow is not an alternative to Apple software. It runs in a terminal window on my Mac. webkit2png runs only on a Mac and depends on the fact that WebKit (also Apple) can be manipulated through a bridge to Objective-C (also Apple). Once ffmpeg generates a video, I open it in Apple's great Safari browser and watch it full-screen. I don't know how to do this as simply with any combination of Linux, Windows, Firefox, or Chrome.

I think Apple could decide to make the MPEG encoder and a few other toys available from Safari and then do iMovie as a web app. iMovie would have a point again. It would demonstrate that it is possible to do something with digital video on a consumer machine. The difference between now and iMovie of the twentieth century is that the leading consumer machine is the browser.

In the meantime, several groups are beavering away to bring you video editing in the cloud. I fear these efforts will be linked irretrievably to a 'platform' like YouTube. Apple could remind us that a local machine actually does a fine job, and they could add some value along the way. They have the cloud facilities to offer automated closed captioning. They have the clout to offer a portal for licensing music for use in videos. Getting synchronization rights to incidental music in a home video is now more difficult than the digital video editing itself.

One firm trying to sort out the soundtrack issue is FriendlyMusic. They offer a catalog of at least tens of thousands of tracks available for your video. They offer 'Mash' licenses, good for non-downloadable videos hosted only at youtube.com, for as little as $0.99. A 'Personal' license, $1.99, lets you host your video on 'any social video website, such as YouTube, Vimeo, Animoto or any other UGC site.' A 'Commercial' license, $25.00, lets you make a video for your small business. They mean small: fewer than ten employees and less than $1 million in annual revenue. I hoped to find 'A Hazy Shade of Winter'. I expected that finding a specific track would be dicey, and I was right. I looked around for alternative ideas and flipped through about a hundred track clips before settling on silence as my alternative soundtrack concept.

Nick Wingfield reported last month that an Apple ][ machine donated by Steve Jobs to SEVA, a non-profit humanitarian organization working in Nepal, had been returned to his estate after 33 years. One point of that piece is that Steve Jobs was actually a philanthropic creature by some measure -- a response to a criticism often leveled at the man. Let me borrow a Steve Jobs quote from that piece:

“I only know how to do one thing well, I think I can help the world by doing this one thing.”

My Mac does at least four things well. It is beautiful and reliable. It has a good operating system. It includes a good web browser. It may be greedy of me to expect more from it or from Apple. For the moment, I am limited by my vision, not by my tools. I'm exceedingly grateful to Paul Hammond for webkit2png. It's cool. It scratched an itch. It reminded me of my joy using an early NeXT workstation. These reminders do not come often.

Wednesday, December 4, 2013

surprised

I have a friend who mines Bitcoin. He wondered aloud today whether an alpaca might make a good animal for the home. Alpaca scarves for all! I was surprised to hear myself say that Bitcoin might be the better investment.