by Bob » Wed Mar 18, 2015 2:29 pm
OK, Matthew,
Thank you for the screen capture. Looking at it, I don't see anything terribly wrong, although you are doing some things that could reduce the quality. I have to ask, what exactly do you mean by "low res"? I know what I mean by that, but what do you mean?
The source is standard definition 720x480 (Fullscreen, not Widescreen) NTSC interlaced video. Your output is standard definition 720x480 (Widescreen) deinterlaced video. Two things are happening here. First, you are changing the pixel aspect ratio from the fullscreen 0.9091 to the widescreen 1.2121. That will force resampling of the video. You probably won't notice any difference from that, though. Second, you are deinterlacing the video. Interlaced video has two fields -- one with the odd numbered rows of pixels and one with the even numbered. Premiere deinterlaces by throwing away the second (upper) field, then recreating the missing lines from the first field by interpolating/blending the line above and the line below. Essentially you are losing half of your vertical resolution. If there's not much motion, this looks fine. The more (and faster) motion present, the more visible the deinterlacing becomes. Still, the degradation shouldn't be terrible.
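If it helps to see the idea, here's a rough sketch of that single-field deinterlace in Python. Frames are plain lists of pixel rows, and the function name and layout are mine for illustration -- this is the general technique, not Premiere's actual internal code:

```python
# Toy single-field deinterlace: keep one field (the even rows here) and
# rebuild the discarded field's rows by blending the rows above and below.
# Illustrative only; real deinterlacers are considerably smarter.

def deinterlace_single_field(frame):
    """frame: list of rows, each row a list of pixel values."""
    height = len(frame)
    out = [row[:] for row in frame]
    for y in range(1, height, 2):              # rows from the discarded field
        above = frame[y - 1]
        # At the bottom edge there is no row below, so reuse the row above.
        below = frame[y + 1] if y + 1 < height else frame[y - 1]
        out[y] = [(a + b) // 2 for a, b in zip(above, below)]
    return out
```

The thing to notice is that the rebuilt rows carry no new information -- they are just averages of their neighbors, which is exactly the "half your vertical resolution" loss described above.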
Your bitrate settings leave me scratching my head. Your target bitrate is 1.0 Mbps and your maximum bitrate is 1.2 Mbps. That kind of negates the reason for using VBR in the first place. Essentially you are forcing the compressor to output something similar to constant bit rate (CBR). VBR can produce much better results than CBR, but only if it has room to manage the bandwidth utilization. I'd kick that maximum bitrate up. Also, the target bitrate is pretty low. Why the 1.0 Mbps? 2.0 Mbps for web-based viewing is pretty common. The lower the target bitrate, the lower the quality. 1.0 Mbps is pushing the lower acceptable rate.
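For a sense of what those numbers mean in practice, here's some back-of-the-envelope arithmetic -- pure math, no encoder involved, and the 2.0 Mbps figure is just the common web target mentioned above:

```python
# Rough file-size arithmetic for the bitrates being discussed.
# megabits per second * 60 seconds, divided by 8 bits per byte.

def megabytes_per_minute(mbps):
    """Approximate file size per minute of video at a given average bitrate."""
    return mbps * 60 / 8

size_at_target = megabytes_per_minute(1.0)   # 7.5 MB per minute at 1.0 Mbps
size_at_common = megabytes_per_minute(2.0)   # 15.0 MB per minute at 2.0 Mbps

# Headroom the VBR encoder has to work with: max / target.
headroom = 1.2 / 1.0   # only 20% above target -- barely different from CBR
```

So doubling the target to 2.0 Mbps costs about 7.5 MB more per minute of video, and giving the maximum a healthy margin above the target (rather than 20%) is what lets VBR spend bits on the hard scenes.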
The Maximum Render Quality option is unlikely to affect your quality, as you are not scaling the video. Enabling this option allows Premiere to use more sophisticated (but more compute-intensive) algorithms to scale the video. If you were using HD source and outputting SD, for example, this setting could produce better and more accurate scaling.
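To show why the choice of scaling algorithm matters at all, here's a toy comparison on a one-dimensional row of pixels -- a naive "drop every other sample" scaler versus a simple averaging (box) filter. This is a sketch of the general idea, not Premiere's actual resampler:

```python
# Downscale a row of pixels by 2x two different ways.

def downscale_nearest(row):
    """Nearest-neighbor: just drop every other pixel."""
    return row[::2]

def downscale_box(row):
    """Box filter: average each pair of pixels, using all the data."""
    return [(row[i] + row[i + 1]) / 2 for i in range(0, len(row) - 1, 2)]

row = [0, 10, 0, 10, 0, 10, 0, 10]   # fine alternating detail
# nearest-neighbor: [0, 0, 0, 0]          -- the detail aliases away entirely
# box filter:       [5.0, 5.0, 5.0, 5.0]  -- the detail is averaged instead
```

The more sophisticated algorithms that Maximum Render Quality enables are refinements of the second approach; the point is simply that cheap scaling throws information away, and better scaling doesn't.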
The other optional setting, "Render at Maximum Depth," has to do with the color bit depth, not resolution. If you are seeing banding in the output video, you can enable this option to allow Premiere to use a higher bit depth in its calculations to get smoother gradations. You probably won't need this if you aren't adding heavy vignetting or strong color grading. But, it's there if you need it.
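Banding comes from rounding to a limited number of levels at each processing step. Here's a toy illustration -- darkening a frame and then brightening it back in 8-bit integer math, which is roughly what happens without the higher-depth option (the operation is made up; the rounding effect is the point):

```python
# Darken a pixel to 25% and multiply back by 4, quantizing to whole
# 8-bit levels at each step, as low-bit-depth processing would.

def darken_then_restore_8bit(v):
    dark = int(v * 0.25)        # round down to an integer level (0..63)
    return min(255, dark * 4)   # restoring brightness can't recover the loss

# Feed in all 256 input levels and count how many distinct outputs survive.
surviving_levels = {darken_then_restore_8bit(v) for v in range(256)}
# Only 64 of the original 256 levels remain -- that's the banding.
```

Doing the same math in floating point (or at a higher bit depth, which is what Render at Maximum Depth allows) keeps all 256 levels, so the gradient stays smooth.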