Palin

This is for the record, so that in case it comes true I can say “I nailed it”:

I wouldn’t be entirely surprised if, at some point this week, Sarah Palin’s ethical troubles suddenly blow so wide open that McCain is forced to withdraw her name from candidacy and select someone else. This would be an example of the Miers Maneuver.

Withdrawing Palin’s name might seem politically infeasible since it would show that McCain is grossly incompetent at the most basic of presidential tasks, but then again it already looks that way. He could easily claim, “see, I learn from my mistakes.” Or he could just throw her under the Straight Talk Express and claim she had assured him that there was nothing to the allegations, etc etc.

Update: Ok, so I didn’t nail it. Enjoy your nominee, dudes

I Am Free, and so are you

Jewel Logo

Category: Accessories
Released Aug 9, 2008
Seller: N/A
(c) 2008 Owen Williams
Version: 1.0
24K

The red icon on your Nokia N800 or N810 always reminds you (and others when you show it to them) that you appreciate freedom. It’s a work of art with no hidden function at all.

You are free to install it via the Application Manager, from the Extras repository.

You are free to install it with one-click download:

Chinook / Diablo (OS2008)

You are free to download it directly from my website:
i-am-free_1.0-1_armel.deb

You are free to download the source code:
i-am-free_1.0.tar.gz

You are free to open it up, see how it works, make it better, email it to your friends, and do whatever you want.

What else are you free to do?

Grab bag

It’s grab bag time! First up, I went to LA in early May to get training on a Lustre color grading system. I also hung out with Merry, who I knew in college and who had been drawn to the bright lights of Hollywood to be an assistant director. At the time I visited she was working on CSI:New York, and she let me hang out on the set for a day. Luckily the crew was really cool and had no problem with me being there — in fact I got mistaken for working there at one point :). Merry said there are other shows that are strictly locked down and would never have allowed outsiders to sit in video village with the director and screenwriter.

We also went to see Iron Man at the Arclight, which features seat reservations. Yes! Why don’t more theaters do this? I would gladly pay a premium every time I went to the theater if I didn’t have to show up 45 minutes before showtime just to guarantee myself a decent seat.

“Actual suit worn by Robert Downey, Jr. in Iron Man”

Merry and me at the Geisha House

On the set of CSI:NY

Here are some other selections from the past few months. I keep forgetting my camera, or leaving it at home when I go places. This must change!

Char and I went on a bike ride with Dad. You may be able to see him in the reflection of our glasses.

When we go for a bike ride on the Cape we often stop at this marsh

Berry picking in Ipswich

We made sure not to over-pick, unlike last year

Avid, LEAVE MY LEVELS ALOOOOOOOOOONNNNEE!

Avid has posted a little tutorial on how to export and import QuickTime files “correctly.” After reading it several times, I noticed that the author doesn’t cover what I think is the most important use case: I would like to export video and not touch the levels at all. And then I would like to import it, and not touch the levels at all. Leave my levels alone! Don’t clip them, don’t make them colorsafe. My job as the online editor is to make the show broadcast-safe, and I don’t need the “help”. When Avid screws with my levels, it makes it impossible to roundtrip between Avid and Shake so I can do vfx work.

Thanks to this article, however, I think I finally understand what the Color Levels options in Avid mean:

  • “RGB”: This material I am importing or exporting is EVIL RGB, and needs to be fixed to proper broadcast safety. Please Avid, I am incapable of using a color corrector, won’t you squish (RGB option) or clip (RGB source) my levels for me?
  • “601/709”: LEAVE MY LEVELS ALOOOOONNNEEE. I’ll do my own correction, thanks!

If you select 601/709 everywhere you can, Avid won’t touch your levels and will preserve the full range of the image. I have confirmed this by exporting a dozen files with all sorts of settings. I was able to make the process work two ways:

  • Avid codecs using Format Settings / Video Settings / Options / Color Input: 601/709, and Color Levels: 601/709, then importing 601/709
  • Animation codec with Color Levels: 601/709 and importing 601/709

With the Avid codecs, selecting Color Input: RGB clips data off the top and bottom, and selecting Color Levels: RGB squishes the levels to broadcast safe without clipping.
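As a rough numeric illustration of the difference (this is my own sketch of the two behaviors, not Avid’s actual code), treating values as 8-bit with video range 16–235:

```python
# Hypothetical illustration: "Color Levels: RGB" squishes full-range 0-255
# into video range 16-235 (levels compressed, nothing clipped), while
# "Color Input: RGB" clips data off the top and bottom, and "601/709"
# is hands-off.

def squish(v):
    """Scale full-range 0-255 into 16-235 without clipping."""
    return round(16 + v * (235 - 16) / 255)

def clip(v):
    """Chop values off the top and bottom of the video range."""
    return min(max(v, 16), 235)

def leave_alone(v):
    """The 601/709 option: the full range of the image survives."""
    return v
```

A superwhite 255 survives leave_alone() untouched; both of the others return 235, but squish() gets there by rescaling every value while clip() simply discards the extremes.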

I have been exporting and importing files incorrectly for years. Importing and exporting in Avid is insanely complex, and the backwards alpha channels don’t help. This needs to be fixed. Here is how these options should read:

  • Color Levels: Maintain color levels
  • Color Levels: Import as broadcast-safe

3 for 3.0, and Project Sandbox

There has been a lot of discussion recently about the future of GNOME and the stasis that the project seems to have reached ((I dislike the term “decadence,” because it seems to imply abundant wealth going to waste. GNOME will never be the wealthy-person’s OS of choice. That would be OS X)). I think that stasis is not a horrible place to be. As others have said, having a desktop that is stable, useful, and predictable is a good thing. Developers have worked hard to get the desktop where it is, so let’s give them all a pat on the back before we discount all that’s been achieved.

But, as an application developer, I see some shortcomings in GNOME that are preventing further progress in user interface design. There are technical limitations, but there are also social limitations working against progress. On the technical side, GTK needs some changes to allow more developers to take user interaction to the next level. On the community side, there needs to be an official GNOME-sponsored forum in which to experiment free from criticism.

To enable developers to try new things, GTK may need to break ABI stability and move to version 3. I call it “3 for 3.0” — the three features GTK needs to move forward:

  1. Multi-input from the ground up: Right now GTK reacts to one event at a time. It’s possible to make an application look like the user is doing many things at once, but true multi-touch and multi-user interaction is not really possible — or at least it’s too difficult for moderately-skilled developers (like me) to achieve.
  2. First-class Animation: GTK needs to perform animation by default. There are ways to make widgets spin, slide, and fade, but they are all hacks. I should be able to fire off an animation and perform other functions while it is animating. I should get a signal when the animation is complete. Built-in state management would be a key feature. There should be a standard library of basic transitions and special effects that anyone can use with minimal code (like fade, push, wipe).
  3. 3D-awareness: GTK doesn’t need to be 3D itself, but it should understand and be ready for 3D. 2D apps will never disappear, but there will be a need for a bridge between 2D and 3D. This could mean that GTK would support a z-buffer, or perhaps it would have a blessed 3d-canvas like Clutter. Or perhaps it could have access to OpenGL to provide various compositing and shader effects.

Perhaps some of these effects seem only useful for pointless flourishes that will slow down interactivity and increase processor overhead. I would argue that what GNOME needs right now is some stupid slow-ass eye-candy. Look at Compiz Fusion. It has dozens of gaudy effects, half of which are useless and most of which have way too many settings. But I love playing with it. It’s been a fertile sandbox for developers to go in and see what works. Maybe the “fire” transition is a waste of time, but there are a few Compiz features that are genuinely useful and I use all the time. The “enhanced zoom” feature, for instance, is a perfect way to blow up a youtube video without making it full-screen.

Every now and then I see a screencast from a GNOME developer working on a little pet project, and some of those demos have been amazing. Whenever I go to the GNOME Summit in Boston, there’s always some guy with his laptop, and he says, “take a look at this –” and proceeds to blow everyone away with some awesome thing he’s been working on. It’s rare, though, for those hacks to escape from that single laptop onto anyone else’s.

GNOME needs a Project Sandbox — an official, gory-edge (it’s past “bleeding”), parallel installable set of libraries and programs (“Toys,” perhaps? ((The exact terminology isn’t important, I’m just keeping the Sandbox metaphor going))) with all of the crazy hacks developers have been trying. It should be housed in a distributed SCM, so developers can push and pull from each other, mashing features and screwing around with GTK “3”, Clutter, Pyro, and whatever other toys people come up with.

The Sandbox should have two rules:

  1. No Kicking Sand ((alternate title: “No Pooping in the Sandbox”)) — ie, no stop energy. Anything goes, no matter how hacky. Developers should be able to prototype ideas quickly, no matter how cracked up they may be. The good ideas will stick and can be re-written cleanly. Nobody, not Apple, not Microsoft, not Nokia, knows what will really be useful in the future with multi-tap and 3D. Apple has a head start, but that doesn’t mean they have all the answers.
  2. Anyone Can Play: If I pull from your tree, I should be able to build and install what you’ve made. It does no good to have a toy if it only works in your corner of the sandbox. There might be some cases where a feature requires a certain video card, but developers should make a good faith effort to make code build and install on systems other than their own. Setting up a development environment is Hard, but I wouldn’t care if I needed 4 hacked copies of GTK each with different .so-names. Disk space is cheap! Computers are fast!

GNOME has done a good job of reining in developer craziness and promoting consistency and uniformity across the desktop. That was good, and was necessary while the desktop was maturing. Now it’s mature, and those reins need to be loosened. The stable desktop can plod forward steadily, but developers need a place to relax, rip off every feature from the iPhone and Vista, and more importantly make that code public without fear of attack. What’s worse than a flame on Planet GNOME in response to a crazy feature? The feature that doesn’t get written for fear of being flamed.

Converting Varicam 48fps to 24p

Warning, technical video post production post ahead.

A few weeks back we needed to shoot some greenscreen for a show that is being delivered at 23.98fps (aka “24p”). I’d had problems pulling a key in the past with motion blur at that slow framerate (I prefer 30 for TV work), so I suggested we increase the shutter speed in the camera. The DP seemed more comfortable adjusting the framerate, so he suggested we shoot it at 48 and only use every other frame of the footage. I figured I could write a script to do the conversion later.

We shot the footage, and the next week I sat down to write a Python program to convert the material to 24fps. This could probably have been done with Final Cut or something, but I don’t know that program, so I did it the easy way: play around with the frames themselves using an image sequence (ie, a big folder of numbered tif files, each of which represents a frame of footage).

Normally this would be easy: just take every odd-numbered image. But in this case, although we shot at 48fps, the footage is at 60fps. Why? Varicam works like this: the front of the camera (the shutter and CCD) shoots whatever framerate you choose, but the tape recorder in the camera always records 60fps. Even if you shoot 6fps, the varicam writes 60 frames to the tape — 10 of frame 1, 10 of frame 2, etc. So when we shoot 48fps, there are 60 frames per second on the tape, 48 of which are unique and 12 of which are duplicates.

If I am going to convert the footage to 24p, I need to first remove the duplicate frames (60 -> 48 fps), then remove every other frame (48 -> 24). By analyzing the footage frame by frame, I determined that when the varicam shoots 48fps, it uses the following pattern:

0000100001000010001

Where 0 represents a unique frame, and 1 represents a duplicated frame (to fill out to 60fps). This pattern is repeated over and over again throughout the footage. (I only just noticed that the pattern is 19 frames long, not 20 like I’d expect, but looking at the footage that’s what it is.)

My Python program goes through an image sequence, finds the pattern of duplicate frames, and then copies every other non-duplicate file to a new folder as a new image sequence. It makes the following assumptions: the files are .tif files, and “duplicate frame” means “exactly the same file size” (not a bad assumption with digital media and tifs). It’s a little hacky, but looking at the resulting 24fps image sequences I don’t see any stutter or dropped frames.
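The heart of that approach can be sketched in a few lines. (This is a reconstruction from the description above, not the linked 48-to-24.py itself; the function name, output naming, and the size-based duplicate test are my assumptions.)

```python
import os
import shutil

def convert_48_to_24(src_dir, dst_dir):
    """Drop Varicam pad frames (60 -> 48 fps), then every other frame (48 -> 24)."""
    os.makedirs(dst_dir, exist_ok=True)
    frames = sorted(f for f in os.listdir(src_dir) if f.lower().endswith(".tif"))
    prev_size = None
    kept = unique = 0
    for name in frames:
        path = os.path.join(src_dir, name)
        size = os.path.getsize(path)
        if size == prev_size:
            continue  # same byte count as the previous frame: a pad frame, skip it
        prev_size = size
        if unique % 2 == 0:  # keep every other unique frame: 48 -> 24
            shutil.copy(path, os.path.join(dst_dir, "frame_%06d.tif" % kept))
            kept += 1
        unique += 1
    return kept
```

Note that the size test only works if the tifs are compressed, so that two genuinely different adjacent frames are very unlikely to land on identical byte counts.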

There are some basic checks in the code so it hopefully won’t overwrite precious data, but I make no guarantees.

Code: 48-to-24.py

Grading a short film


I had the good fortune of being able to grade (color-correct) a graduate student’s thesis project last weekend. It’s called Mel’s Hole, dir. Kenji Miwa. It was my first narrative project, and my first using Apple’s Color. I usually do documentary work, where the highest priority is to make the footage look “good” and consistent. Also, I’m used to the Avid color corrector, which is not very good for matted secondary color corrections (“brighten his face here”) so it would have been hard to do the sort of aggressive grading that the director wanted.

He’s given me permission to show some before-and-after shots from the film, showing off some of the more fun corrections I got to do. Mouse over the images to see the uncorrected versions. (Shot on a Panasonic HVX200 with a lens adapter for low depth-of-field.)

(Note, these images sometimes appear much too bright on a mac. Set your monitor’s gamma to PC / video standard (2.2) to see the night-time shots correctly.)



The above shot represents the basic look for the film, which is a desaturated “bleach bypass” look. It’s high contrast, with substantial crushing of the blacks and whites. In this shot, we had to knock down the colors of the blanket, which was still too saturated even after we applied the look.



In this scene, the character walks into the woods, which were supposed to be dark and foreboding. By really crushing the blacks we were able to make the woods look deeper and more mysterious. This darkening caused the character to be somewhat lost in the busy-ness of the image, so I put a small tracked oval (the shot pans up) over the character to draw attention to him.



This scene takes place in the middle of the night, and I was instructed to make it very very dark, with a silvery-blue cast. Although the lefthand venetian blind did not have any light behind it, I was able to put one in, which serves to illuminate the character’s face (even though a light back there would really just silhouette him). There are still some bright highlights visible in the blinds, but I wasn’t able to get rid of them.



This shot was actually a last-minute idea. It is paired with another night-time shot, so we decided this shot should also be in night-time. I was able to do a good day-for-night, including drawing in the spill of the light at the bottom of the stairs.

I had a lot of fun doing this grade, and I really liked Apple Color — which makes sense: I doubt they would have bought a company that made a bad color correction program. I do have to say that the keyframing in that program absolutely blows, and the tracker isn’t great either. It also crashed immediately after finishing a render once. But on the whole, it was good at disappearing and letting me work.

The director and DP were great too. We hadn’t really done a lot with aggressive grading before, but once they saw what was possible they were able to direct me better and make requests that were creative but also doable.

World premiere is on May 2nd. The details are on Facebook.

(ps, just shoot me an email if you want me to grade your film — the first job is free!)

Starting a user-controlled streaming radio station

The dream:
I want a streaming radio station for my company that itunes can tune in to, where anyone at work can upload tracks and add them to the playlist.

The ingredients:

  • MusicPD (mpd), with the Pitchfork web client
  • JACK
  • Darkice
  • Icecast
  • a Mac server running macports

Explanation:
MusicPD is an awesome music server that I’ve talked about before, and it would be perfect for setting up my little radio station. Users would be able to add files to the playlist from any browser, and mpd now supports shoutcast, which should make turning it into a streaming station easy.

However, right now mpd only supports shoutcast streaming in Ogg Vorbis format, and iTunes can’t play Ogg. Since everyone in the office is on iTunes, I need some way of getting mpd to spit out mp3 to icecast.

Enter JACK! JACK is a “pro-audio” subsystem for connecting applications, kind of like shell pipes for audio. It can be kind of daunting to work with, but in this case it’s best to just think of it as glue connecting one program with another.

For instance, mpd has a JACK output plugin, and there’s another program, Darkice, that has a JACK input and sends audio to icecast. I can link mpd and darkice with jack, and that should solve my mp3 problem.

So, to review:
mpd (links via JACK to) darkice (which encodes mp3s for) icecast (which streams to) iTunes

The Wrinkle:
Make it work on a Mac server. At work we don’t have a recent machine running Linux that’s up all the time and that isn’t mission-critical, so I have to make do with one of the Mac workstations. Luckily macports will do a lot of the heavy lifting.

A Note:
This took a lot of building, installing, and configuration that I don’t have space to cover here. I’m trying to record everything I did that was abnormal or not obvious. If you’re comfortable installing software on unix and you know how to configure a program based on the examples and READMEs, there should be enough information here to reproduce my work.

Getting the software:

  1. Install macports on the mac
  2. Install the following ports: apache2 faac faad2 flac id3lib mad lame libid3tag libvorbis speex
  3. Install this port: php5 +apache2 +macosx +pear
  4. Install the following by source: curl, jack (do not use “jackosx”, build it from source), icecast, darkice, mpd
  5. Apply my patch to mpd/src/audioOutputs/audioOutput_jack.c to make it work on mac osx.

Details:

JACK:
Built with these options: ./configure --prefix=/opt/local --with-default-tmpdir=/tmp
I just start JACK with jackd -d dummy -r 44100

If I don’t specify a default tmpdir, jack crashes complaining it can’t stat /dev/shm. I don’t need to use a real audio device in jack because I’m not outputting the audio to the speakers.

Icecast:
built with ./configure --prefix=/opt/local
No special configuration needed other than what’s specific to my site.

Darkice:
Built with ./configure --prefix=/opt/local/ --with-lame-prefix=/opt/local/ --with-vorbis-prefix=/opt/local/ --with-faac-prefix=/opt/local/ --with-jack-prefix=/opt/local/
I probably didn’t need all those options, but heck, it works.
I set up the configuration file so that the audio device is jack (no quotes).

MusicPD:
built with ./configure --prefix=/opt/local --with-faad=/opt/local --with-libFLAC=/opt/local --with-mad=/opt/local --with-libvorbis=/opt/local/ --with-id3tag=/opt/local
I set up the configuration file to use jack as a device. I need to tell mpd to connect to darkice when it starts up, but darkice picks its port names based on its pid, so it’s always different. What I did is create mpd.conf.in, with this as the device config:

audio_output {
    type "jack"
    name "MPDJack"
    ports "darkice-REPLACEME:left,darkice-REPLACEME:right"
}

Then I run a script to actually start mpd:
#!/bin/bash
# darkice names its JACK ports after its pid, so look the pid up at startup
pid=$(jack_lsp | grep darkice | cut -d '-' -f 2 | cut -d ':' -f 1 | uniq)
sed -e "s/REPLACEME/$pid/g" mpd.conf.in > mpd.conf
mpd ./mpd.conf

Putting it all together:
Once all of those pieces are in place, I can start the radio station by starting the various programs in this order:

  • Jack first,
  • then icecast,
  • then darkice, (-v 10 for debugging)
  • and finally mpd. (--verbose --stdout --no-daemon for debugging)

That’s it. I can control the mpd server and it plays music. If I run jack_lsp -c -p I can see that mpd is connected to darkice. I can tune in to the icecast stream from iTunes and hear it. With simple Mac file sharing, users can connect to the server and drop music into the database. Then mpd can rescan every hour or so via cron, or whenever the user clicks “update db” in pitchfork.
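For the cron option, a single crontab entry would do it (assuming mpc, the standard command-line mpd client, is installed on the server):

```
# rescan the music database at the top of every hour
0 * * * * mpc update
```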

Setting up pitchfork is left as an exercise for the reader.

Linux Tip: How to clear some room

I’ve been running out of disk space on my laptop. Multiple development environments, vmware images, my music collection, and especially my camera RAW originals take a heavy toll. Most data on a computer is already highly-compressed, but the whole idea behind RAW files is that they are not compressed — they represent the original image sensor data. So, here’s a one-liner I used for freeing up some disk space. (I have an Olympus camera, whose raw file format has the ORF extension.)

find ~/Documents/images/photos/raw -name "*.ORF" | xargs -n 1 bzip2

Just using standard bzip compression reduces each file by about 50%. And, since I had 10 gigs of raw files, that translates into 5 gigs of space I’ve freed up.
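For what it’s worth, the same one-liner can be made a little more robust and a little faster. Here’s my variation (assuming GNU or BSD find/xargs): -print0/-0 survives spaces in filenames, and -P runs several bzip2 processes in parallel.

```shell
# Compress every .ORF file under a directory, four bzip2 jobs at a time.
# -print0 / -0 keep filenames with spaces intact (a GNU/BSD extension).
compress_raws() {
    find "$1" -name "*.ORF" -print0 | xargs -0 -n 1 -P 4 bzip2
}
# usage: compress_raws ~/Documents/images/photos/raw
```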

PenguinTV’s third platform: maemo

This weekend I hosted a little maemo hackfest with my friend Mike; my goal was to get PenguinTV to launch in the maemo development environment. This meant building Python support for mozilla embedding, which was hard. I’m used to checking out code and then building it, but the process of building the microb-engine packages was incredibly obtuse and had no documentation. But I tried one thing, then another, then another, and finally it worked.

Once the gtkmozembed python module was finally built, porting the application was relatively easy. GTK is really wonderful because you can extract widgets, move them around, and plug them back in where you need them. I can take a toolbar that normally goes here and plug it in there, I can take a preferences window and throw in a scroll bar, and I can turn off that menu item or that button if it doesn’t apply. Sprinkle “if RUNNING_HILDON:” checks where necessary, and:

PenguinTV on Maemo

I always have to do some extra gymnastics because my program requires a lot of special startup initialization and shutdown finalization, but other than that the port was nice and straightforward.

I’m now working on using all of the maemo-specific features like hardware buttons, network connection, and system messages. It should just be a matter of hooking up to some basic signals to existing functions.

My only big concern is the “state-saving” feature, which allows Hildon to completely shut down a program when the user switches away and restore it when the user switches back. Because of all of my threads, database connections, and caches, my startup and shutdown are very slow. I’m not sure if it’s going to be possible to save fast enough. I also don’t know if it’s possible to save the scroll position of a mozilla widget, because when the user switches back they’ll expect to have everything scrolled the way they left it. Since gtkmozembed is a black box, it’s just going to start scrolled at the top, which is not acceptable.

Along with OLPC and the desktop, PenguinTV now runs on three different platforms using a single code-base.