"Once the video is captured in the file (...), you are past any PAL/NTSC differences..."
While this is true in some respects, it's very wrong with regard to one important aspect: frame rate. Why is frame rate important? Because while you can interpolate in space, there's no good way to interpolate in time.
Let's think about it this way: if your monitor is set to 1024 x 768 and the movie you're watching uses a different resolution, it's relatively easy to scale the movie to fit the entire screen - scaling with various interpolation methods is a well-solved problem. In most cases the video card does the scaling, so it doesn't even affect your computer's playback performance (scaling doesn't work well on interlaced video, which is one of the reasons I think interlaced video sucks, but that's a topic for another post).
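Spatial interpolation is easy precisely because the neighbouring samples already exist. Here's a minimal one-dimensional sketch of the idea (my own illustration, not what any real scaler does): stretching a row of pixel values to a new width with linear interpolation.

```python
# Stretch one row of pixel values to a new width by blending the two
# nearest source pixels (linear interpolation). Real scalers do the same
# thing in two dimensions, often with fancier filters.
def scale_row(row, new_width):
    old_width = len(row)
    out = []
    for x in range(new_width):
        src = x * (old_width - 1) / (new_width - 1)  # output position in source coordinates
        left = int(src)
        right = min(left + 1, old_width - 1)
        frac = src - left
        out.append(row[left] * (1 - frac) + row[right] * frac)
    return out

print(scale_row([0, 100, 200], 7))  # intermediate values blend smoothly
```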
If your monitor is set to 60Hz (so it's effectively showing 60 fps) and you're watching a PAL movie that runs at 25 frames per second, there's no way to interpolate 60 frames out of the existing 25. Instead, frames are repeated during playback so that the overall speed of the movie is maintained.
Here's an easy example (let's assume for simplicity that NTSC runs at exactly 30 fps): if your monitor is set to 60Hz and you play a 30 fps NTSC movie, each frame is shown for two progressive scans of the screen - the frame rate is maintained exactly, and the video is smooth and looks excellent.
But now try to watch a 25 fps PAL movie on the same monitor. You have to fit 25 original frames into 60 display frames - that's 5 original frames into every 12 display frames (1/5 of a second) - and there's no way to spread them evenly. Let's assume an original frame switch happens as soon as the frame is due, but never in the middle of a display frame (otherwise you'd see tearing: a discontinuity at some vertical position whenever there's horizontal motion between frames). That's the way DirectShow works on Windows. In this case, here's what 1/5 of a second looks like, sketched in code below:
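This is a minimal sketch of the model just described - at each display refresh, show the latest source frame that is already due - counting how many refreshes each source frame stays on screen:

```python
# Assumes the model above: refresh k (at time k/refresh_hz) shows the latest
# source frame whose presentation time has already passed.
def cadence(source_fps, refresh_hz):
    # Which source frame is on screen at each refresh during one second.
    shown = [k * source_fps // refresh_hz for k in range(refresh_hz)]
    # How many refreshes each source frame stays on screen.
    return [shown.count(frame) for frame in range(source_fps)]

print(cadence(30, 60)[:5])  # [2, 2, 2, 2, 2] -- perfectly even
print(cadence(25, 60)[:5])  # [3, 2, 3, 2, 2] -- uneven
```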
After this conversion, some frames are shown for 1/30 of a second and some for 1/20 of a second. There's actually a rhythm here: 3,2,3,2,2 display frames per original frame, and it repeats itself 5 times per second throughout the video you're watching. The visual result is called judder, and it's especially common when converting film to NTSC.
How annoying is this judder? Many people wouldn't be able to spot the problem, but the video looks a bit jumpy. It's especially visible in scenes with slow, consistent horizontal camera movement (panning).
Connecting your PC to a TV
What if your display card has a TV-Out option and you hook your PC up to a TV? Will this solve the problem? The TV shows 25 fps and the movie is also 25 fps, but there's no way to skip the 60 fps stage of the display adapter (actually, some display adapters lock the refresh rate to something sensible when TV-Out is turned on, but let's assume you can feed a PAL TV from a display card running at 60 fps).
In this case the result is even worse. The display adapter has to convert the 60 fps back down to 25 fps without knowing that the original material was 25 fps. It would probably go something like the sketch below:
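Here's the same assumed model taken one step further: the 25 fps source goes through the 60Hz frame buffer, and the TV-Out encoder then samples that buffer at 25 fps, taking whatever the latest refresh left on screen (the exact sampling phase is my assumption).

```python
REFRESH_HZ = 60   # the display adapter keeps running at 60 fps
SOURCE_FPS = 25   # the movie
TV_FPS = 25       # what the PAL TV needs

tv_frames = []
for m in range(10):                              # ten TV frames = 2/5 of a second
    refresh = m * REFRESH_HZ // TV_FPS           # latest 60Hz refresh by TV frame m
    source = refresh * SOURCE_FPS // REFRESH_HZ  # source frame that refresh is showing
    tv_frames.append(source)

print(tv_frames)  # [0, 0, 1, 2, 3, 5, 5, 6, 7, 8] -- a repeat and a skip in every 5 frames
```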
What happened here? The first original frame makes it to the TV without a problem, but by the time the TV needs its second frame, the second source frame hasn't yet appeared on the 60Hz output, so the first frame is shown once more. The outcome is that out of every 5 source frames, one is repeated and one is skipped. This is a pretty noticeable judder.
What's the Solution?
In some cases it's quite simple - if you're using your PC to watch video, make sure the display refresh rate is an integer multiple of your video's frame rate. If you watch PAL movies, set your display to 75Hz or 100Hz. For NTSC movies it sounds like you should pick 60Hz (or 90Hz or 120Hz), but most NTSC shows are actually filmed at 24 fps (film rate), so the best refresh rate is a multiple of that - probably 72Hz.
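If you want to check which of your monitor's modes qualify, a trivial helper does it (hypothetical code - the list of supported rates below is made up):

```python
# Keep only the refresh rates that give every source frame a whole number of
# refreshes, i.e. rates that are integer multiples of the content frame rate.
def judder_free_rates(content_fps, supported_hz):
    return [hz for hz in supported_hz if hz % content_fps == 0]

supported = [60, 72, 75, 85, 100, 120]   # an assumed list of monitor modes
print(judder_free_rates(25, supported))  # PAL video: [75, 100]
print(judder_free_rates(24, supported))  # film-rate material: [72, 120]
print(judder_free_rates(30, supported))  # true 30 fps video: [60, 120]
```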
Is that all?
No, of course not. Many people in Europe want to use their computers to watch both shows they recorded with a capture card (captured at 25 fps) and movies or US TV shows (filmed at 24 fps). How do you set your monitor's refresh rate in that case? I'll leave that for the next post.