HDMI output via video card.

Discussion in 'Computer Gaming Forum' started by Trenton_net, Jun 11, 2009.

  1. Trenton_net

    Trenton_net AKA SUPERCOM32

    Joined:
    Apr 13, 2007
    Messages:
    2,378
    Likes Received:
    58
    Hey Everyone,

    If your motherboard or video card can output HDMI, what happens if you run a full-screen application that doesn't use a standard resolution? Will it get upscaled/downscaled to a valid resolution for output via HDMI?

    For instance, if you run a 3D game in full screen and you need to drop the resolution so it runs smoothly, how will that affect output via HDMI?
     
  2. Borman

    Borman Digital Games Curator

    Joined:
    Mar 24, 2005
    Messages:
    9,543
    Likes Received:
    1,880
    HDMI is simply DVI + audio (basically), so as long as the monitor itself supports the resolution, it will be displayed fine.
     
  3. 3do

    3do Segata Sanshiro!

    Joined:
    Sep 25, 2006
    Messages:
    1,901
    Likes Received:
    12
    Not owning anything HDMI, I can't give you a certain answer, but I'd assume it would act like a normal monitor: say you set your desktop to 1280x1024 or some other lower-than-native resolution for your screen, it would stretch the desktop to fit the size of the screen but stay at your set resolution.

    Also, any programs or games that run at a different resolution would do the same as above, unless they're set to a resolution high enough to fit your screen without having to be stretched.

    Again, I don't own anything HDMI, so I can't say that's what would actually happen and it may be different, but I'd assume it acts like any monitor would, regardless of what connection you're using.
     
    Last edited: Jun 12, 2009
  4. phate

    phate Enthusiastic Member

    Joined:
    Feb 23, 2008
    Messages:
    540
    Likes Received:
    3
    HDMI is just a carrier for the image signal. It really has no effect on the actual image produced; that has much more to do with what your graphics card supports and what your monitor supports. HDMI doesn't care what resolution or pixel depth the picture is; it's just the package delivery service.

    Let's take your example and examine it:

    Let's say you just got a new game; we'll call it Double Douche 2: More Asshatery. It happens to be the latest and greatest game and sports more lens flares than Star Trek. Normally you play your games at your graphics card's maximum resolution of 1600x1200. While your last-gen RadiFail could push the pixels of the first Double Douche game at that resolution with a smooth 60 frames per second, it just can't keep up with the lens flares and digital bouncing boobs of the sequel. A quick solution is to drop the screen resolution to 1280x1024, about two-thirds the pixels of your normal maximum. This means your RadiFail no longer has to push its maximum number of pixels, so it has an easier time rendering Double Douche 2.
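
    To put rough numbers on the pixel load (using the made-up resolutions from this example; real performance doesn't scale perfectly with pixel count, but it's a decent first approximation):

        # Back-of-the-envelope pixel math for the example above. Fewer
        # pixels per frame means less work for the card on every frame.
        full = 1600 * 1200       # 1,920,000 pixels per frame
        reduced = 1280 * 1024    # 1,310,720 pixels per frame

        print(f"pixels saved per frame: {full - reduced:,}")          # 609,280
        print(f"reduced is {reduced / full:.0%} of the full load")    # 68%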

    Let's say your monitor is an LCD. Most LCDs look their best when run at their native resolution. Let's say its native resolution is 1600x1200 and it's normally connected to your computer via DVI, because that's the port your graphics card and monitor have in common. Had your card and monitor both had HDMI or VGA, you could have connected them with either of those instead; it's just a connector type and doesn't really affect the image going from the graphics card to the LCD (there are some small differences, mostly in how the signal is carried: VGA is analog, while DVI and HDMI are digital, with HDMI also carrying packetized audio alongside the video, but these are mostly minor issues).

    Your monitor, on the other hand, does affect the output of the image. Since you are now running the game below the LCD's native resolution, your LCD has three options. If it supports the resolution, it will, based on user settings, either stretch the image to fill the screen or leave the unused pixels off and create a picture-frame border around the smaller image. If it does not support the resolution, it will simply throw an error up on screen (something akin to "Not Supported" or "Out of Sync"). CRTs do roughly the same thing, but they tend to have finer control over the image presented and usually support a larger range of resolutions.
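
    In pseudo-ish Python, the monitor's decision looks roughly like this (purely illustrative; the mode list and setting names are made up):

        # Illustrative only: roughly how a monitor decides what to do
        # with the mode the graphics card hands it.
        def handle_input(mode, supported_modes, user_setting="stretch"):
            if mode not in supported_modes:
                return "show 'Not Supported' / 'Out of Sync' error"
            if user_setting == "stretch":
                return "scale image to fill the whole panel"
            return "center image 1:1 with a border of unlit pixels"

        print(handle_input((1280, 1024), {(1600, 1200), (1280, 1024)}))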

    Now let's say that everything is fine and dandy and you're happily playing Double Douche 2: More Asshatery at 1280x1024 @ 60 FPS. For giggles, since both your LCD and RadiFail also support HDMI, you connect them over HDMI to see if it will do the same thing as the DVI connection, and to your surprise, it does the exact same thing, with no difference in FPS or resolution.

    This is a tad simplistic; in truth, there are some image quality differences between VGA and DVI/HDMI, but that is mostly because of the switch from analog to digital. There are also some monitors and graphics cards that will only produce or accept certain resolutions on certain connector types, but that is specific to the display devices themselves and has really nothing to do with the connector. A great example is an HDTV that will only allow standard PC 4:3 aspect ratios (640x480, 1024x768) on the VGA connector and HDTV 16:9/16:10 aspect ratios (1280x720, 1920x1200) on the DVI/HDMI connector. Again, that has nothing to do with the connectors and cables; any of them (VGA, DVI/HDMI, DisplayPort) could easily carry any resolution, aspect ratio, or signal. It's the HDTV limiting what can be done with which connector.

    Long-winded reply over. Hope the light reading helped :lol:

    FYI: This was an excerpt (edited a bit for length and for brand neutrality) from a tech doc I wrote for my Dell teams shortly after the release of Halo 2 Vista.
     
    Last edited: Jun 12, 2009
  5. Trenton_net

    Trenton_net AKA SUPERCOM32

    Joined:
    Apr 13, 2007
    Messages:
    2,378
    Likes Received:
    58
    So, if I set my desktop to 1280x1024, but run Doom 10 at 640x480, it simply renders in 640x480, then upscales it to 1280x1024 and shoots that out to my monitor?
     
  6. phate

    phate Enthusiastic Member

    Joined:
    Feb 23, 2008
    Messages:
    540
    Likes Received:
    3
    For the most part yes; however, the upscaling is usually done on the monitor's side of the image shooting.
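
    To put quick numbers on your example (just the ratios involved, no hardware specifics assumed), note that the two axes don't scale by the same factor, which is part of why a plain 640x480-to-1280x1024 stretch can look slightly distorted:

        # Hypothetical numbers from the thread: a 640x480 frame shown
        # on a 1280x1024 panel. A plain stretch scales each axis
        # independently, so the aspect ratio changes a little.
        render_w, render_h = 640, 480     # what the game draws
        panel_w, panel_h = 1280, 1024     # the panel's native grid

        scale_x = panel_w / render_w      # 2.0
        scale_y = panel_h / render_h      # ~2.13

        print(f"horizontal: {scale_x:.2f}x, vertical: {scale_y:.2f}x")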

    Also what sad, little machine can only play Doom 10 at 640x480? My Nokia WinMo iPhone with Wang enhancer can easily push Doom 10 at 4096x2048 and still gets 1200FPS :110:
     
  7. Trenton_net

    Trenton_net AKA SUPERCOM32

    Joined:
    Apr 13, 2007
    Messages:
    2,378
    Likes Received:
    58
    Hrm, and if your monitor can't support 640x480, you're screwed, because it's likely the monitor is the one responsible for accepting the signal and upscaling it? Hrm... I guess if your PC, which is outputting to an HDTV, isn't powerful enough to run all applications at HDTV output resolutions, you're kind of screwed.
     
  8. Jamtex

    Jamtex Adult Orientated Mahjong Connoisseur

    Joined:
    Feb 21, 2007
    Messages:
    5,472
    Likes Received:
    16
    Most LCD monitors and TVs will have a few pages in the manual listing all the resolutions and frequencies that they support. Normally these will be the standard VGA/VESA resolutions (640x480, 800x600, 1024x768 and so on).

    Nearly all monitors, whether they are VGA or DVI/HDMI, will be able to communicate the resolutions and frequencies they support (this is what the EDID data is for), so your resolution slider bar will only show those. My monitor won't support 640x480; the lowest it lists is a little-used 720x400 mode...
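
    If you're curious where that list comes from, it's stored in the monitor's EDID block. Here's a rough sketch of decoding the legacy "established timings" bits from a raw EDID dump (the file path is just an example from a Linux box; yours will differ, and full EDID parsing involves a lot more than this):

        # Minimal sketch: decode the "established timings" bytes of a
        # 128-byte EDID block to see which legacy modes a monitor claims
        # to support. Offsets/bits follow the EDID 1.3 layout.
        ESTABLISHED_TIMINGS = {
            (35, 7): "720x400 @ 70 Hz",
            (35, 5): "640x480 @ 60 Hz",
            (35, 0): "800x600 @ 60 Hz",
            (36, 3): "1024x768 @ 60 Hz",
            (36, 0): "1280x1024 @ 75 Hz",
        }

        def supported_legacy_modes(edid: bytes) -> list[str]:
            assert edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00", "not an EDID block"
            return [mode for (offset, bit), mode in ESTABLISHED_TIMINGS.items()
                    if edid[offset] & (1 << bit)]

        # Example path on Linux; substitute wherever your OS exposes the EDID.
        with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
            print(supported_legacy_modes(f.read(128)))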

    If the graphics card tries to set a resolution that the monitor can't support, then it's more than likely you will get a "resolution not supported" message. Otherwise the monitor will scale it to fit, but most give you the option of side bars (for 4:3 resolutions, so they aren't stretched and the aspect ratio is maintained), stretched (fills the screen, aspect ratio is lost), zoom (uses the whole screen but some areas may be chopped off) or dot by dot (1:1 pixels, which gives you borders but looks nice). :)
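
    Purely as an illustration (made-up example numbers; real scalers differ in the details), here's roughly the rectangle each of those four modes would give a 1280x1024 image on a 1920x1200 panel:

        # Rough sketch of the four common scaler modes. Returns the
        # (width, height) the source image occupies on the panel;
        # anything outside it is border, anything past the panel edge
        # is cropped.
        def scaled_rect(mode, src_w, src_h, panel_w, panel_h):
            if mode == "stretch":        # fill the panel, ignore aspect ratio
                return panel_w, panel_h
            if mode == "dot_by_dot":     # 1:1 pixels, centered with borders
                return src_w, src_h
            if mode == "side_bars":      # fit inside the panel, keep aspect
                scale = min(panel_w / src_w, panel_h / src_h)
            elif mode == "zoom":         # cover the panel, keep aspect, crop
                scale = max(panel_w / src_w, panel_h / src_h)
            else:
                raise ValueError(mode)
            return round(src_w * scale), round(src_h * scale)

        for mode in ("side_bars", "stretch", "zoom", "dot_by_dot"):
            print(mode, scaled_rect(mode, 1280, 1024, 1920, 1200))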

    My notebook does have an HDMI socket, but the chipset means that anything apart from movies or work-related applications will crawl, even at a resolution lower than the built-in screen's.
     
  9. 3do

    3do Segata Sanshiro!

    Joined:
    Sep 25, 2006
    Messages:
    1,901
    Likes Received:
    12
    Most if not all monitors, even those from the last 10 or more years, will support 640x480, and if that's what you have Doom set at, then it will run at that. It's only the higher resolutions that your monitor may not be able to do, so it's a case of finding out what your monitor's maximum resolution is.

    Usually, once you've installed graphics drivers, you'll only be shown the resolutions your monitor supports and won't have any others to select, so you'll know what the maximum resolution is.

    If you can't do HDTV resolutions, it will be because your monitor doesn't support them, and it has nothing to do with your PC. In that case you'd run programs at the highest resolution your monitor can do; you don't need to run applications at an HDTV resolution, because they'll work at any resolution, even 640x480. The exception is games, which may lose performance the higher you set the resolution, unless you have a high-end graphics card or something capable of running games decently at higher resolutions.
     
    Last edited: Jun 12, 2009