Disclaimer: I’m not a video expert, so this tutorial may be entirely wrong. It reflects only my understanding of what I’ve learned so far about editing video files.

Any explanation of digital video begins with an understanding of coders and decoders. The proper parlance for this concept is codec. This word is important to learn. Take it to heart.

Digital video takes up a lot of space, much, much more than audio. An uncompressed audio file can be dozens of megabytes. An uncompressed video file can be dozens, even hundreds, of gigabytes. Back in the late ’90s, when I made my first tip-toe into the depths of digital audio, 6GB hard drives felt voluminous, but they were no match for the demands of space-hogging WAV files.
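
If you want to put rough numbers to that claim, here’s a quick back-of-the-envelope calculation in Python. The assumptions are mine (CD-quality stereo audio, 24-bit color, 30 frames per second), so treat the results as ballpark figures, not gospel:

    # Rough math: why uncompressed video dwarfs uncompressed audio.
    # Assumptions: CD-quality stereo WAV, 24-bit color, 30 frames per second.

    def wav_bytes(seconds, sample_rate=44100, channels=2, bytes_per_sample=2):
        return seconds * sample_rate * channels * bytes_per_sample

    def raw_video_bytes(seconds, width, height, fps=30, bytes_per_pixel=3):
        return seconds * width * height * bytes_per_pixel * fps

    MB = 1024 ** 2
    GB = 1024 ** 3

    print(f"4-minute WAV:          {wav_bytes(4 * 60) / MB:6.1f} MB")
    print(f"1 hour raw 640x480:    {raw_video_bytes(3600, 640, 480) / GB:6.1f} GB")
    print(f"1 hour raw 1920x1080:  {raw_video_bytes(3600, 1920, 1080) / GB:6.1f} GB")

A single four-minute song comes out to about 40MB, a rounding error on a modern drive. An hour of raw standard-definition footage, on the other hand, swallows better than 90GB all by itself.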

Today’s late-aught hard drives make those 6GB drives look puny, so a 100MB WAV file doesn’t seem so greedy. But even a 1.5-terabyte drive is no match for hours of uncompressed video files. Till such a day comes when drive space approaches infinity, codecs are a fact of life.

And damn are there a lot of them.

Of course, you could head on over to teh Wikipediaz to get an explanation of what a codec is. For our purposes, think of it as the method by which a video is compressed and then decompressed, with each codec making its own trade-offs that affect the quality of the picture.

Lossy codecs discard information during encoding, similar to how a WAV file loses part of its frequency range when being converted to MP3. Lossless codecs do not discard information during encoding, similar to how a FLAC file compresses a WAV file without loss of sound quality.
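
If you want to see the difference in action, here’s a sketch using ffmpeg, a free command-line tool that isn’t otherwise part of this tutorial (the filenames are placeholders): one pass through a lossy codec, one through a lossless one.

    import subprocess

    # Lossy: H.264 throws away picture detail it bets you won't miss,
    # the way MP3 throws away audio detail. Small file, reduced fidelity.
    subprocess.run(["ffmpeg", "-i", "master.avi",
                    "-c:v", "libx264", "-crf", "23", "lossy.mp4"], check=True)

    # Lossless: FFV1 squeezes the file without discarding anything,
    # the way FLAC squeezes a WAV. Bigger file, identical picture on playback.
    subprocess.run(["ffmpeg", "-i", "master.avi",
                    "-c:v", "ffv1", "lossless.mkv"], check=True)

Play the two results side by side and weigh the file sizes, and the lossy/lossless trade-off stops being abstract.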

Between the rock of hard drive space and the hard place of video file size, lossy codecs are pretty much de rigueur when working with digital video. To the home studio enthusiast, that’s like recording a song with nothing but MP3s and MP4s. Oh, it could be done, but the basic tenet of any media editing is that you want to cut more than boost.

Actually, keep that idea in mind throughout your exploration of any creative endeavor: cut before boost.

So in terms of digital video, you’ll be working with source footage that’s pretty much cut (that is, already compressed) to begin with. I did not realize this during my first steps in dealing with video, so I ended up transcoding a lot of footage. But just as encoding and re-encoding a sound file will eventually screw up the sound quality, repeated transcoding compounds the damage to digital video even more.

Transcoding is a necessity when working with video, but smart transcoding saves time and space.

Let’s make this personal.

My first music video was shot on a Canon Powershot S500 point-and-shoot camera from 2003. It could record 30 seconds of video at 640×480 or 3 minutes at 320×240. Translation: It could record 30 seconds of video viewable on an old television set or 3 minutes viewable on 2005 state-of-the-art YouTube.

I opted to go for the smaller file size.

I eventually wanted to view this video on a DVD, despite its resolution. The concept of resolution requires another tutorial entirely, but for now let’s map some of the numbers being thrown around onto real-life viewing environments (with a quick pixel count after the list):

  • A 320×240 video can be viewed, as mentioned before, on YouTube at its lowest settings. It won’t look good at anything bigger than that.
  • A 640×480 video can be viewed in iTunes or other desktop computer media players such as Windows Media Player and Winamp. You might even watch it on a television, but it won’t have the same picture quality as …
  • A 720×480 video, which is DVD quality and no better. It is certainly not hi-def, which is …
  • A 1920×1080 video, right now the best picture quality around, with 1280×720 still being pretty damned decent.
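
Here’s the pixel count behind each of those numbers, a rough proxy for how much detail a frame can hold and how much data it takes to store (the math is mine; the HD labels are just the standard 720p and 1080p names):

    # Pixels per frame for each resolution above.
    resolutions = [
        ("320x240 (old YouTube)",      320, 240),
        ("640x480 (desktop player)",   640, 480),
        ("720x480 (DVD)",              720, 480),
        ("1280x720 (720p HD)",        1280, 720),
        ("1920x1080 (1080p HD)",      1920, 1080),
    ]
    base = 320 * 240
    for name, w, h in resolutions:
        pixels = w * h
        print(f"{name:28s} {pixels:>9,} pixels ({pixels / base:4.1f}x the 320x240 frame)")

A DVD frame holds roughly four and a half times as many pixels as my little 320×240 clips, and a 1080p frame holds twenty-seven times as many.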

My video, being capped at 320×240, has no chance of looking good at DVD level, but in order to be burned to a DVD, it has to be transcoded to that resolution regardless.

Fine. I figured I may as well transcode the source material to the destination resolution, edit it, and export at that resolution. So I transcoded those files without paying attention to which codec I was using and ended up with 10 files, each nearly 1GB. The original files combined took up 230MB of space. But I have a big drive, so what’s the deal?

First, the codec mattered. I could have saved a lot of hard drive space by choosing the right codec for the job, and I didn’t. I don’t even know how I chose what I chose — probably some default setting — but if I used one designed specifically for DVDs, I could have shaved a few hundred megabytes from those encoded files.
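
If you’re curious what a codec “designed specifically for DVDs” looks like in practice, here’s a sketch using the free command-line tool ffmpeg (not what I used, and the filenames are placeholders). Its ntsc-dvd preset bundles the MPEG-2 codec, DVD resolution, and DVD bitrates into one option:

    import subprocess

    # DVD video is MPEG-2 at 720x480 (NTSC). The "-target ntsc-dvd" preset
    # picks the codec, resolution, and bitrate a DVD expects in one shot,
    # instead of whatever default an export dialog happens to land on.
    subprocess.run(
        ["ffmpeg", "-i", "clip_320x240.avi",   # placeholder input filename
         "-target", "ntsc-dvd",
         "clip_for_dvd.mpg"],
        check=True,
    )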

But in reality, I didn’t need to transcode those files at all. In fact, I could have just worked with the files on hand and let the settings in Sony Vegas Movie Studio take care of the transcoding during export. The picture quality looked the same both ways, but one method saved hard drive space.

So what codec do you use? That’s a question that comes up numerous times, but for beginners, read the manual for your camera. My Canon Powershot S500 stored video in AVI files with the MJPEG codec. My newer camera, a Canon Powershot SD780IS bought to replace the now-dead S500, stores video in MOV files with the H.264 codec.

Your camera will be different.
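
And if the manual is long gone, you can ask the file itself. Here’s a sketch using ffprobe, which ships with the free ffmpeg suite (again, not something this tutorial depends on; the filename is a placeholder):

    import subprocess

    # Ask ffprobe which codec the first video stream in a file uses.
    result = subprocess.run(
        ["ffprobe", "-v", "error",
         "-select_streams", "v:0",
         "-show_entries", "stream=codec_name",
         "-of", "default=noprint_wrappers=1:nokey=1",
         "MVI_0001.AVI"],               # placeholder filename
        capture_output=True, text=True, check=True,
    )
    print(result.stdout.strip())        # prints something like "mjpeg" or "h264"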

If you use a paid video editor such as Vegas Movie Studio (the one I use) or iMovie, it should come with its own set of codecs to create whatever video you want to produce: DVD, Blu-ray, YouTube. If you use freeware such as VirtualDub, you’ll have to hunt for your own codecs.