So, I have this monitor, which I've mentioned before (LaCie Electron 22 Blue IV), and lately I've been using it for Dreamcast, Gamecube, Xbox, Wii, and Model 3 games. For the DC I use the VGA box, and for everything else I've been using a GBS-8220 scaler. For Gamecube and Xbox games, I have it set to output 640x480, in the hopes of keeping the image as close to its native resolution as possible. However, the picture I get, while it looks pretty good, isn't great - nowhere near as good as the Dreamcast, even with deflicker turned off.

The way I see it, there are a few different reasons why this might be the case:

1) The scaler isn't doing a very good job of converting the image.
2) Gamecube and Xbox games use certain types of image processing that obscure the image (like deflicker, but I think there may be others).
3) Maybe the GC and Xbox aren't outputting exactly 640x480, but more like 720x480 or something in that range. If that were the case, the image would be getting downscaled, which would explain why it doesn't look so great.

Some sources I've read indicate that the Gamecube outputs 640x480 in progressive mode, but I'm slightly skeptical. Once, when I connected my Xbox to an LCD, the info panel said it was outputting 720x480, but I'm still skeptical. So can anyone elucidate this matter for me? Exactly what resolution do these older consoles output? What would be the ideal way to play these consoles on a display at their native resolution?

Note: I'm not concerned about the PS2. Only a few PS2 games supported 480p, so I think it's better to play most games in 480i mode, and my PVM does a great job with that. Also, I understand that 640x480 is considered 480p, but it would appear that 480p, as a TV format, is not rigidly defined in terms of its horizontal resolution. In other words, it's always 480 pixels high, but the width may vary.
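One way to see why both 640x480 and 720x480 can legitimately be called "480p" is to look at the pixel aspect ratio each implies for a 4:3 picture. This is only a naive geometric calculation - real NTSC-derived standards quibble over the active width (704 vs 720 samples, etc.), so treat the numbers as a rough illustration:

```python
# Naive pixel-aspect-ratio (PAR) calculation for a 4:3 picture.
# Real NTSC-derived formats use slightly different active widths,
# so these figures are only illustrative.

def pixel_aspect_ratio(width, height, display_aspect=4/3):
    """PAR needed so a width x height grid fills a display_aspect screen."""
    return display_aspect / (width / height)

for w, h in [(640, 480), (720, 480)]:
    print(f"{w}x{h}: PAR ~ {pixel_aspect_ratio(w, h):.3f}")
```

640x480 comes out with square pixels (PAR 1.0), while 720x480 needs pixels slightly narrower than square (~0.89) to fill the same 4:3 screen - which is why a console can output either and still be "480p".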
Well, of course the GBS-8220 isn't a top-of-the-line scaler. An interlaced source will look worse than a progressive one with it, so use 480p when possible. You can try modifying its settings to optimize the image, but the documentation is rather obscure... Check the manual here: http://www.jammaboards.com/arcade_manuals/GBS-8220_CGA_to_VGA_HD-Converter.pdf

I would start by adjusting the three potentiometers on the R, G, and B inputs, and then move on to the settings in the OSD. Check that the consoles are set to output a 4:3 image; on the Xbox you can check in the stock dashboard. I don't think there is a master setting for this on the Gamecube - it's a per-game setting.

Like you said, 480p by definition only ensures you have 480 lines of height. There's no guarantee on the horizontal, but it'll mostly be 640 or 720. If you are comfortable with a soldering iron, I would suggest you mod your Xbox (the 1.6 revision cannot be modded...) to output native VGA - that'll solve at least one problem.
I've not tried the 8220 with a newer console yet. The resolution settings on the GBS-8220 are for the onboard scaler's output resolution, not the input; it's designed to convert CGA/EGA to VGA (640x480, 1024x768, and 13-something x 8-something? from memory). It's not really designed to take a VGA signal and pass it through, e.g. 640x480 > 640x480 or 720x480 > 720x480, if that makes any sense.
Well, I think he's trying to use the GBS-8220 as a VGA box to simply plug in the Gamecube and Xbox, because those two won't output VGA by default. You can mod both to get a standard VGA signal, but that's another story.

From the start, this converter was designed to take 15kHz H-sync signals and convert them to 31.5kHz H-sync. Feeding it a 640x480 input signal isn't all that bad, as there will be less scaling involved since the source is already at a convenient size. Of course there will be some quality degradation, but it'll be better than with a 320x240 input signal.

Interlaced signals are also problematic and can lead to greater quality degradation than progressive, since you need to fill two frame buffers in order to get a complete picture. Depending on the device's timing and sync implementation, there could be further quality degradation if it's poorly balanced between buffer fill-in and the latency allowed between input and output. That latency surely matters more when the scaling kicks in hard (because your source is at a really small resolution). I strongly believe that feeding the GBS-8220 a higher-res picture will result in less latency, because the CPU will take less time to scale, and that'll leave more processing power to manage the frame buffers.

The DC isn't plugged into the GBS-8220; it's plugged directly into the VGA monitor. That's the right thing to do, unless you have a really cheap LCD with a really bad scaler inside it - but you'd have to be pretty cheap to end up with a monitor whose embedded scaler is worse than the GBS-8220.
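The "less scaling work" argument above can be put in concrete terms. The GBS-8220's actual pipeline is undocumented, so these are just the geometric ratios between a hypothetical source and output resolution:

```python
# Rough illustration of the "less scaling work" argument.
# These are just geometric ratios; the GBS-8220's real internal
# pipeline is undocumented, so this is an assumption-laden sketch.

def scale_factors(src, dst):
    """Horizontal and vertical scale factors from src to dst resolution."""
    (sw, sh), (dw, dh) = src, dst
    return dw / sw, dh / sh

output = (640, 480)  # one of the board's VGA output modes
for name, src in [("240p-era console", (320, 240)),
                  ("480p console", (640, 480))]:
    fx, fy = scale_factors(src, output)
    print(f"{name}: {fx:.1f}x horizontal, {fy:.1f}x vertical")
```

A 320x240 source needs 2x interpolation on both axes, while a 640x480 source maps 1:1 onto a 640x480 output mode - in the ideal case, no resampling at all.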
Thanks for the responses, guys. A few notes:

I'm actually using a Wii for Gamecube games. It's also softmodded, so I can force 480p and disable deflicker. I have a Gamecube, but no component cables, unfortunately.

Interlaced is not really an issue for me. The vast majority of the games I want to play support 480p, and in the case of the few games that don't, I can always connect the Wii or Xbox to my PVM instead (they're right next to each other). In the case of the Wii, I can force 480p, so that's really inconsequential.

Part of the reason I posted this is that I'm considering getting a Sony PVM-20L5 (HD video monitor), and I want to know if it will really make a difference for this kind of stuff. Less conversion would be taking place (just component to RGB - no scaling or D-to-A conversion), and it would display at the correct resolution, so you would think the image quality would be better. I'm just wondering how much better. Is it worth it?

Yes, that's correct.
Frankly, I don't know. It's still a CRT, so there's no issue of native screen resolution, and of course fewer conversion stages mean a better picture. Ditching the GBS-8220 is the best move you can make for the Xbox and Gamecube (keep it for the N64 and older generations). Other than that, I must say I'm clueless as to what the best setup for you would be.
Going straight to the PVM would probably be the best option - native resolution, it wouldn't mess with the signals, and there are fewer connections. I do use a GBS-8220, but as of yet only with the Mega Drive and Sega Saturn, tried through a 15" CRT and a 21" widescreen LCD monitor. It works and looks good, especially the Saturn. I'll need to try my Wii through it, but I've seen a video from "retrogametech" on YouTube, and his Wii via component > 8220 > SLG > LCD TV looks nice. I'm considering switching to this method myself (without the SLG3000); I currently use RGB SCART into a CRT TV for both the Wii and the original Xbox. I may do this for the PS2 as well.
I should mention that even though the Xbox and the Wii don't look as good as the DC, they still look significantly better going through the GBS 8220 to my CRT computer monitor than they do when connected to an LCD. The DC image quality is, well, pretty much flawless, so it's a pretty high standard to go by.
Yeah, the DC looks excellent on a CRT monitor. I used to use a 21" Fujitsu for the Dreamcast - looked so nice. The Saturn, I think, looks really nice too through a CRT monitor (via the GBS-8220).
Here's an idea I had a while back: are there devices that pass 480p component video through to VGA without processing the video the way the GBS-8220 does? In other words, a device that does exactly what an HD CRT with a component input does internally, but in an external box.
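For what it's worth, the core of what such a transcoder (or an HD CRT's component input) does is just a fixed color-space conversion, with no scaling or resampling. A minimal sketch of the math, assuming the standard-definition BT.601 coefficients (an HD set may use BT.709 instead), with Y in [0, 1] and Pb/Pr in [-0.5, 0.5]:

```python
# Sketch of the color math a component-to-VGA transcoder performs:
# a fixed YPbPr -> RGB matrix (BT.601 coefficients shown; HD gear
# may use BT.709). No scaling or resampling is involved.

def ypbpr_to_rgb(y, pb, pr):
    """Convert analog-style YPbPr (Y in [0,1], Pb/Pr in [-0.5,0.5]) to RGB."""
    r = y + 1.402 * pr
    g = y - 0.344136 * pb - 0.714136 * pr
    b = y + 1.772 * pb
    # Clamp to the displayable [0, 1] range.
    clamp = lambda v: max(0.0, min(1.0, v))
    return clamp(r), clamp(g), clamp(b)

print(ypbpr_to_rgb(1.0, 0.0, 0.0))  # white -> (1.0, 1.0, 1.0)
print(ypbpr_to_rgb(0.0, 0.0, 0.0))  # black -> (0.0, 0.0, 0.0)
```

In hardware this is just a small analog matrix (plus sync handling), which is why a pure transcoder can be transparent in a way a scaler never is.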
Well, my CRT monitor is fairly small (only 15"), so a straight-up comparison isn't easy, and it's all subjective.