Optimal Picture Quality for Console Gaming
Image Clarity, Graphics & Resolution
Before we get into this, there are a few things that need to be clarified. When talking about picture quality, there are three components (and no, we're not talking about the cables yet!) that factor into the end result of what you actually see on screen. The three factors are image-clarity, graphics and resolution. Let's define these terms:
- Image-Clarity - You can think of this as the amount of "dirt" standing between your eyes and the video signal rendered on the display. It's as if you kept all of your previous pairs of eyeglasses and worked through them from oldest to newest: the closer you get to your current prescription, the clearer your vision becomes. That's the concept behind image-clarity. Your console uses cables instead of eyeglasses to display what it 'sees', and better cables mean less 'dirt'.
- Graphics - This is the one people most often misunderstand. Graphics have nothing to do with image-clarity or resolution. Graphics are simply the polygons, anti-aliasing, textures and filters that the hardware renders and pushes out of the console into whatever cables you're using. The graphics of a game will not change no matter which resolution or cables you have configured.
- Resolution - Resolution is the level of detail at which an image is presented. Specifically, it refers to how many pixels make up a single frame. The higher the resolution, the more pixels are displayed on screen and the greater the detail of the objects within the image. Note that you can still have a very clear picture (a.k.a. strong "image-clarity") at a lower resolution (240p or lower).
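To put those pixel counts in perspective, here's a quick back-of-the-envelope calculation in Python (the widths shown are common 4:3 and 16:9 pairings for each vertical resolution; actual console output modes vary):

```python
# Pixels per frame for some common resolutions.
# The higher the count, the more detail a single frame can hold.
resolutions = {
    "240p (320x240)":    (320, 240),
    "480p (640x480)":    (640, 480),
    "720p (1280x720)":   (1280, 720),
    "1080p (1920x1080)": (1920, 1080),
}

for name, (width, height) in resolutions.items():
    print(f"{name}: {width * height:,} pixels per frame")

# A 1080p frame holds 27x the pixels of a 320x240 frame:
print((1920 * 1080) / (320 * 240))  # → 27.0
```

So a jump from 240p to 1080p isn't "about four times sharper", it's over twenty times the pixel count — which is why resolution and image-clarity are worth keeping separate in your head.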
Now that we have sorted that confusion out, the next step is to discuss the various video formats and the cables that correspond to them. Before HDTVs and flat-screen computer monitors were common, we basically had no way of properly displaying our older game consoles. The fat, chunky CRT TVs of the time (in the US, at least) did not support the higher-quality video formats that show what a console is actually supposed to look like. Usually the best you ever got was composite video, or maybe S-video. So let's define some of these formats:
- RF - This is the absolute worst video quality possible, other than simply not displaying an image at all. RF stands for Radio Frequency; consoles used an RF modulator, a device that converts the baseband signal into a radio frequency that a radio or television can tune into. Everything (left/right audio and video) is crammed into one signal to be displayed on a designated station or channel. The result is a very muddy, distorted picture and poor sound quality. You may even experience color bleeding or audio glitches (skipping or static).
- Composite - This is definitely a step up from RF. Here, the left and right audio channels are (in most cases) separated for crisper, stereo-enabled sound. The video has its own cable, but all of the picture information (reds, greens, blues, black levels, contrast, etc.) is still crammed into one signal, which again suffers from a degree of distortion.
- S-Video - This is a common abbreviation for 'Super Video' or 'Separate Video', with the latter probably being the more fitting description. S-Video separates parts of the video signal, resulting in improved image quality with substantially less distortion (especially around the edges of text or character models). This is perhaps the simplest and cheapest way to get semi-optimal picture quality from a large number of older consoles, specifically the systems that predate the Sega Dreamcast.
- RGB/SCART - RGB is actually what a large number of older consoles output natively, but the SCART connector that carries it is limited almost exclusively to the European region. I've never owned or seen a US TV with the inputs to support this format. However, there is a way around this; see 'Converters/Upscalers' below. RGB over SCART provides a truly vivid picture with greatly enhanced colors and contrast, and image-clarity slightly above S-Video.
- VGA - Okay... I have a lot to say on this one. VGA (Video Graphics Array) is a popular video format that is frequently used on computer displays. I believe the industry has moved more towards HDMI or DVI as far as computer monitors go these days, but many schools and businesses are still using the VGA standard. It's relatively inexpensive, easy to use & provides amazing picture quality for a variety of purposes, including gaming.
Probably the most prominent example of a game console featuring this video format is the Sega Dreamcast, a machine capable of displaying (I would say) more than 90% of its library in enhanced 480p resolution. The resolution may not impress most people (especially now), but there's no denying that the image-clarity VGA allows on this console is jaw-dropping compared to older formats. Early Xbox 360 models were also capable of outputting over VGA. It's an extremely good-looking picture, and provides the highest image quality of all the analog formats. It's even capable of supporting "HD" resolutions up to 1080p, though one drawback is image degradation that depends on the quality and/or length of the cables. VGA also uses multiple pins for interfacing, which can be bent or broken easily.
There are other variations of VGA, including XGA (Extended Graphics Array) and a number of designs dubbed 'Super VGA', a loose term for clone hardware produced by manufacturers other than IBM (the company that introduced the format). Strangely enough, there really are no defining characteristics for SVGA other than being produced by a non-IBM company and typically supporting resolutions of 800x600 and above. I haven't compared SVGA to VGA myself, so I can't testify to one being superior to the other. I am aware that there are SVGA monitors and SVGA graphics cards, which I'm assuming could potentially produce image quality surpassing standard VGA. I'm also wondering whether the build quality of SVGA products is superior to standard VGA products. If anybody has specific insights into these questions, feel free to comment below!
There is another format similar to VGA, called DVI (Digital Visual Interface). The connectors even look similar, but I'm not aware of any consoles with official support for this display standard. It's mostly used on computers for delivering high-definition video.
- Component - Component really began to take off during the latter half of the sixth generation of console gaming, allowing gamers to view standard-definition titles with "HD" image-clarity and resolutions. It was extremely common in the seventh generation because the Xbox 360 originally shipped without HDMI support, and the Nintendo Wii lacked any digital video output for its entire life. It's a very popular upgrade for consoles that shipped with standard composite cables, offering a substantial jump in image quality. For the most part, component cables are relatively cheap, although some consoles (the GameCube is a good example) dropped support for component signals in later hardware revisions, so the official cables have become rare and carry hefty price tags.
In my opinion, while component is certainly a great and convenient option for many people wishing to improve image quality, VGA still offers image quality very near component under normal circumstances, and can even surpass it in a few instances (especially on a computer display).
- HDMI - HDMI (High-Definition Multimedia Interface) has become an almost universal display standard in both television and computing. The only other format offering similar picture quality and popularity is DVI (Digital Visual Interface), primarily seen on computer displays. Every HDTV should have at least one HDMI port, along with component and sometimes VGA or S-Video options.
- Converters/Upscalers - Another problem you may run into, especially with older consoles, is your TV's lack of inputs for the better legacy video formats, forcing you to use standard composite and ending up with very poor picture quality. HDTVs are terrible with analog equipment; somehow it looks a lot worse than it would on an old CRT television (presumably due to the lack of scan-lines on most high-definition displays). But there is a solution to this problem. Multiple converters and upscalers on the market let you convert older formats (like S-Video, SCART and VGA) into digital signals your HDTV understands, resulting in far superior picture quality than native composite would allow.
Now, some early or cheaper HDTVs may still support formats like S-Video or VGA, so you may want to consider whether you're willing to spend the time and money on a TV dedicated to your gaming, or whether you'd rather use your existing HDTV. The difference between converting an image and up-scaling an image is pretty basic. If you're only using a converter, all you're doing is allowing your non-compatible format (like S-Video) to talk to your TV in a language it understands. Converters exist purely for convenience, and that's it. You aren't suddenly going to be able to play a Sega Genesis in 1080p, but a converter will typically give you the best version of the format you're using (so with S-Video, you'll get the highest-quality S-Video signal the console and cables are capable of producing).
Some converters also perform other functions (like adding scan lines, up-scaling and offering multiple inputs) that take the image coming through your older cables and enhance it, effectively "tricking" your TV into thinking it's being fed a higher resolution (like 720p) than it really is. This also makes it possible to capture video from an older console, since many recent capture cards and other recording devices will not recognize resolutions below a certain point (typically below 720p). Up-scaling does not actually make the standard-definition signal from an older console 'HD', but it does sharpen and improve clarity compared to the unmodified image, and offers superior results over standard converters without this feature.
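To illustrate why up-scaling can't add real detail, here's a minimal sketch in Python of nearest-neighbour scaling, the simplest up-scaling method (real devices use more sophisticated filtering, but the principle is the same). A frame is modeled as a grid of pixel values, and each source pixel simply becomes a factor-by-factor block:

```python
def upscale_nearest(frame, factor):
    """Nearest-neighbour integer upscale: every source pixel is repeated
    into a factor-by-factor block. The output has more pixels, but no
    new information -- which is why an upscaled signal isn't truly 'HD'."""
    return [
        [row[x // factor] for x in range(len(row) * factor)]
        for row in frame
        for _ in range(factor)  # repeat each row 'factor' times
    ]

# A tiny 2x2 "frame" upscaled 3x becomes 6x6, made of repeated pixels:
frame = [[1, 2],
         [3, 4]]
for row in upscale_nearest(frame, 3):
    print(row)
# first row → [1, 1, 1, 2, 2, 2]
```

A 480p-to-720p upscaler is doing the same kind of trick with better interpolation: the capture card sees a 720p signal, but the underlying detail is still 480p's.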
This is a popular method for allowing RGB/SCART-capable systems to display their native picture quality on a TV lacking RGB/SCART inputs.
Additionally, many HDTVs and computer displays have settings that can help further improve the final result. Make sure your HDTV is set to 'game mode' if it has one. Computer monitors usually offer adjustable contrast and brightness, but also sharpness. You may want to minimize sharpness when using a computer monitor; in my experience, little or no added sharpness improves the overall image-clarity in video games. But play around with it depending on your specific monitor.
Adam Koralik Offers Advice on Optimal Video Performance
Here are links to sources for additional information, some of my information came from these sources as well.
- Understanding VGA Formats
- HDMI Connections - HowStuffWorks
- Video Cables - HowStuffWorks