
Topic: Not getting 16:9, or am I? What's the deal..

Posts 1 to 20 of 31

Firkraag

I have my Wii set to 16:9 and my TV set to 16:9, yet when I check what output the Wii gives me, it's 4:3 480p 60 Hz.

Does the Wii just output a squeezed 4:3 image and let the TV stretch it, or what? That would seem like a cheap fix.

There he goes, Firkraag. One of God's own prototypes. A high-powered mutant of some kind never even considered for mass production. Too weird to live, and too rare to die. - My VGscore

Firkraag

This is happening in games that should support 16:9, like MPT and RE4.


owen1

Yeah, it seems that's all it does: it just lets the TV stretch the image out. I do it on F-Zero GX all the time on my standard-def TV to make the cars seem thinner.


Zaphod_Beeblebrox

I'm not really qualified to respond because I don't own an HDTV but the read count keeps going up without anyone responding (edit: oops, Owen responded!) so I'll give it a shot....

Are you using the component (YPbPr) cable? It's required for 480p and 16:9.
Did you change both the "Widescreen Settings" and "TV Resolution" settings on the Wii?

I set up my sister's HDTV and it definitely had more of the game visible on the left and right than my SDTV did. So I've seen it work.


Put your analyst on danger money, baby.

Firkraag

Zaphod_Beeblebrox wrote:

I'm not really qualified to respond because I don't own an HDTV but the read count keeps going up without anyone responding (edit: oops, Owen responded!) so I'll give it a shot....

Are you using the component (YPbPr) cable? It's required for 480p and 16:9.
Did you change both the "Widescreen Settings" and "TV Resolution" settings on the Wii?

I set up my sister's HDTV and it definitely had more of the game visible on the left and right than my SDTV did. So I've seen it work.

Component cable, and in the settings it's set to 16:9 and 480p. I am seeing a widescreen image (it might just be stretched, though), but the 'output status' readout shows a 4:3 signal, not a 16:9 one; that's my problem.


Zaphod_Beeblebrox

Yeah, maybe the Wii just squashes the 16:9 image into 4:3 and then the TV stretches it out to 16:9 (like Owen said).


Firkraag

I'm starting to believe you're right; it just seems like an odd solution.


owen1

The reason you still have to tell the Wii it should do a 16:9 image is that the game has to know to draw its sprites narrower than normal. But it's still a 4:3 image, just one meant to be stretched. It's referred to as anamorphic widescreen: http://en.wikipedia.org/wiki/Anamorphic_widescreen


thewiirocks

The Wii outputs an anamorphic widescreen signal. Those familiar with DVDs and HDTVs will probably know that this is basically a 4:3 signal formatted for widescreen. It's a cheap solution, but it works and is supported by the widest range of televisions.
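The arithmetic behind the anamorphic trick is simple enough to sketch. The numbers below assume the Wii's 640×480 frame buffer; this is an illustration of the concept, not Nintendo's actual code:

```python
# Anamorphic widescreen: the console still transmits a 640x480 (4:3) frame,
# but it is meant to be drawn with non-square pixels so the TV stretches it.
STORAGE_W, STORAGE_H = 640, 480          # pixels actually transmitted
DISPLAY_AR = 16 / 9                      # aspect ratio the TV displays

# Pixel aspect ratio: how much wider each stored pixel must be drawn.
pixel_ar = DISPLAY_AR / (STORAGE_W / STORAGE_H)
print(f"pixel aspect ratio: {pixel_ar:.3f}")   # ~1.333: each pixel drawn a third wider

# Effective horizontal resolution if the pixels were square:
print(f"equivalent square-pixel width: {STORAGE_W * pixel_ar:.0f}")   # ~853
```

So the widescreen image carries no extra horizontal detail over the 4:3 one; the same 640 columns are just stretched further.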

Hopefully Nintendo's next console will do true 720p at 16:9.


Zaphod_Beeblebrox

The killer question is whether you are seeing any lag on the HDTV. My sister's HDTV has terrible lag. Excruciating. And no "game mode" to reduce it. I hope yours doesn't (have lag, that is).


Firkraag

Thanks, everyone, for the information about anamorphic widescreen; I was starting to believe it was a fault with my TV. Now I know I'm not missing out.

Zaphod_Beeblebrox wrote:

The killer question is whether you are seeing any lag on the HDTV. My sister's HDTV has terrible lag. Excruciating. And no "game mode" to reduce it. I hope yours doesn't (have lag, that is).

I know you raised that question when I was just about to buy it a week ago. I'm having absolutely no lag at all; there is a game mode, but I don't get lag in cinema or sports mode either. It's a modern panel, though, which might be why it doesn't have the problems your sister's set has.


Zaphod_Beeblebrox

Yay! Party on! After seeing my sister's TV, lag just haunts me. Sorry to be obsessed about it.


warioswoods

The Wii simply looks far better on a standard-def, 4:3 CRT television than it ever will on HD and widescreen setups. Between upscaling the resolution in an artificial way and the issues introduced by anamorphic widescreen, you'll be getting a jagged or lagged image on these new sets, while my trusty flatscreen (not flat-panel) 4:3 CRT produces a perfect image with no jagged edges or stretching, better color fidelity, and zero lag, not only for Wii games but for VC retro perfection as well.

Some HD sets produce better results than others, but you can't beat a high-end CRT for the Wii.

Twitter is a good place to throw your nonsense.
Wii FC: 8378 9716 1696 8633 || "How can mushrooms give you extra life? Get the green ones." -

warioswoods

Firkraag wrote:

I know you raised that question when I was just about to buy it a week ago. I'm having absolutely no lag at all; there is a game mode, but I don't get lag in cinema or sports mode either. It's a modern panel, though, which might be why it doesn't have the problems your sister's set has.

Just to be clear, you might not immediately notice lag in cinema or sports mode anyway. The lag is usually caused by plugging a low-res device (like the Wii) into an HD set and forcing the set to upscale the image to a workable resolution, which requires a fair amount of processing and delay. When doing this, the TV should offset the sound equally, so the only way to perceive the lag is to notice input delay on your game controller. On many HD sets, you can observe this immediately by moving the pointer around on the Wii menu; if the on-screen hand lags even a fraction of a second behind your movements, it's enough delay to make certain games unplayable, including retro titles requiring quick reactions and music games requiring accurate rhythm.

The very existence of a "game" mode on your set indicates that it most likely does have at least a little delay in the other modes, although it's possible that, in this case, the game mode is there to address other non-timing related issues.


Firkraag

That is only if you have a CRT with component input and a flat screen, which mine did not; my new LCD looks miles better. Also, the lag is non-existent. That must have been a problem with older panels, because mine suffers none of it, and I've done my share of Guitar Hero since I got the TV a week ago.

The game mode, from what I can tell, has to do with color/brightness and backlight settings.


thewiirocks

warioswoods wrote:

Between upscaling the resolution in an artificial way

What gives you that idea? The Wii does no upscaling. 480 pixels from the buffer go to 480 pixels in the output.

and the issues introduced by anamorphic widescreen

The anamorphic trick really isn't that bad. The artifacts it introduces are minimal.

you'll be getting a jagged or lagged image on these new sets, while my trusty flatscreen (not flat-panel) 4:3 CRT produces a perfect image with no jagged edges or stretching, better color fidelity, and zero lag, not only for Wii games but for VC retro perfection as well.

Actually, a 16:9 CRT at 480p would kick the arse of your 4:3 set. The jaggies you're worried about are not a problem with the Wii's signal output, but rather with the configuration of the target flat panel. Pixels on CRTs are variable and not fixed to any particular resolution; you lose some sharpness in trade, but it allows the sets to adapt to a number of resolutions. Flat panels, OTOH, have a fixed number of pixels. If your signal is anything except the native resolution, you're potentially going to see scaling issues. Since most sets today are built around 720 or 1080 lines, they produce jaggies when displaying a 480-line signal, as their pixel grid works out to either 1.5 or 2.25 panel rows per signal line.
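Those 1.5 and 2.25 figures fall straight out of the panel resolutions; a quick check, assuming a 480-line signal and the two common panel heights:

```python
# Vertical scale factor when a fixed-pixel panel displays a 480-line signal.
SIGNAL_LINES = 480
for panel_lines in (720, 1080):
    scale = panel_lines / SIGNAL_LINES
    print(f"{panel_lines}p panel: {scale} panel rows per signal line")
# 720p  -> 1.5 rows per line: nearest-neighbour scaling alternates between
#          drawing a source line once and twice, which is where jaggies come from.
# 1080p -> 2.25 rows per line: alternates between 2 and 3 copies.
```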

Never fear, there is a solution! The problem is partly caused by poor scaling algorithms, which are cheaper to implement but produce poor-quality results. If you want to see the absolute BEST signal on any TV, get yourself a quality upscaler that sits between your console and TV. You'll need to read your TV's manual to tune the signal correctly, but if you get it worked out you can get a 1:1 pixel signal out of the upscaler. A quality upscaler uses a sophisticated scaling algorithm that interpolates the image data between pixels, thus providing a degree of anti-aliasing. (It's actually just a bit blurry, but less blurry than a CRT.)
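As a rough illustration of the difference between a cheap scaler and an interpolating one, here's a toy 1-D upscale; the function, sizes, and sample values are made up for the example:

```python
# Toy 1-D illustration: upscale 4 samples to 6 using nearest-neighbour
# (cheap, blocky) vs. linear interpolation (what a better scaler does).
src = [0, 100, 100, 0]

def upscale(samples, out_len, interpolate):
    out = []
    for i in range(out_len):
        # Map the output position back into source coordinates.
        pos = i * (len(samples) - 1) / (out_len - 1)
        lo = int(pos)
        if not interpolate or lo == len(samples) - 1:
            out.append(samples[round(pos)])       # pick the closest sample
        else:
            frac = pos - lo                        # blend the two neighbours
            out.append(round(samples[lo] * (1 - frac) + samples[lo + 1] * frac))
    return out

print(upscale(src, 6, interpolate=False))  # [0, 100, 100, 100, 100, 0] -- blocky
print(upscale(src, 6, interpolate=True))   # [0, 60, 100, 100, 60, 0] -- edges blended
```

The blended edges are the "slightly blurry" anti-aliasing mentioned above; a real scaler does the same thing in two dimensions with fancier kernels.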

So if you care about the absolute best quality image, the solution is NOT a 4:3 480i CRT. That's the worst solution possible. The proper solution is to purchase a flat panel with a good upscaler or purchase a separate upscaler to change the 480p image into one that looks good on your television.


warioswoods

I am certainly glad that the lag issue is disappearing in newer models; I still cringe at the thought of how many lower-end HD sets with heavy lag are sold at a bargain to unsuspecting customers, who only get home to find issues they never even knew to ask about.


warioswoods

@thewiirocks

To your first point, I wasn't talking about the Wii upscaling, I was talking about the HD sets in use.

The rest of your post is incorrect insofar as you skip the fact that nearly all upscalers introduce their own lag, and even a fraction of a second is too severe for certain game genres. Not only that, but you're wrong about CRTs' flexible resolution producing worse results -- the fact is that a good set produces results that look absolutely perfect to the human eye, with no 'pixels' visible whatsoever. I've seen many upscaled images, and nothing in the algorithms comes even close to the natural blending of the image you get with a CRT, which has no edges whatsoever, a different thing entirely from anti-aliasing. Good case in point -- I've played classic lower-res games on very high-end computers, using all the latest filtering and smoothing that emulators offer, and none of it is even in the same league as the smoothness of playing one of these games on a CRT. One is a natural look, the other is forcing one resolution into another, no matter how you look at it.

EDIT: I should also emphasize that there are other issues in play, including color fidelity (never as good on an LCD as a CRT, although newer tech like LED is finally catching up and making this a non-issue -- but for a hefty price for now!) and the size of your set / viewing distance. I'm not one who feels that having a giant TV in your living room is a good thing, so I'll always opt for more moderate sizes, and there's a complex equation that determines the quality of image you'll see based on size and viewing distance; suffice it to say that with a moderately sized set, viewed from several feet away, the color fidelity, smoothness, and lack of lag or ghosting are far more important than the resolution, as your eye can't even distinguish all the detail of a higher resolution unless your set is of a certain size and viewed within a certain number of feet.
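The size/viewing-distance trade-off can be roughed out with a standard back-of-envelope calculation. This sketch assumes roughly 20/20 vision resolving about one arcminute, so treat the numbers as approximate:

```python
import math

# Back-of-envelope version of the size/distance trade-off.
# Assumes ~20/20 vision resolves about 1 arcminute of angle.
ARCMIN = math.radians(1 / 60)

def max_useful_distance_inches(diagonal_in, lines, aspect=(16, 9)):
    """Distance beyond which individual scan lines can no longer be resolved."""
    w, h = aspect
    screen_height = diagonal_in * h / math.hypot(w, h)
    line_height = screen_height / lines
    return line_height / math.tan(ARCMIN)

for lines in (480, 720, 1080):
    d = max_useful_distance_inches(32, lines)
    print(f"32-inch set, {lines} lines: extra detail invisible beyond ~{d / 12:.1f} ft")
```

On a 32-inch set this works out to roughly 9 feet for 480 lines and about 4 feet for 1080, which is the point being made: past a certain couch distance, the extra resolution simply isn't visible.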


thewiirocks

warioswoods wrote:

To your first point, I wasn't talking about the Wii upscaling, I was talking about the HD sets in use.

Fair enough. It wasn't clear from your post, so I thought you were talking about the Wii upscaling.

The rest of your post is incorrect insofar as you skip the fact that nearly all upscalers introduce their own lag, and even a fraction of a second is too severe for certain game genres.

It depends on what you mean by "fraction of a second". More than 100 ms is going to hurt; 10 ms is done before the next frame is even rendered. Even if you end up a frame or two behind, it's not going to be noticeable. I must stress how important it is to get a quality upscaler with specs that will work for gaming.
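To put those numbers in terms of frames at the Wii's 60 Hz output:

```python
# Converting display lag into frames at 60 Hz, to put the
# "fraction of a second" figures into perspective.
FRAME_MS = 1000 / 60          # ~16.7 ms per frame

for lag_ms in (10, 33, 100):
    frames = lag_ms / FRAME_MS
    print(f"{lag_ms:3d} ms lag = {frames:.1f} frames behind")
# 10 ms is under one frame; 100 ms is six full frames, which is very
# noticeable in rhythm games or anything demanding quick reactions.
```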

Also keep in mind that it is CRITICAL that you get the settings right. If you do it wrong, you'll have the delay of the upscaler plus the delay of the television. If you do it right, the television will no longer produce a delay (since the signal is 1:1) and the upscaler's delay will replace the TV's.

Not only that, but you're wrong about CRTs' flexible resolution producing worse results -- the fact is that a good set produces results that look absolutely perfect to the human eye, with no 'pixels' visible whatsoever. I've seen many upscaled images, and nothing in the algorithms comes even close to the natural blending of the image you get with a CRT, which has no edges whatsoever, a different thing entirely from anti-aliasing.

I think you have a core misunderstanding of the issue. Having pixel edges is not a bad thing: it increases the sharpness of the image and thus reduces eye strain. The natural blending of CRTs has a more formal name: blur.

Blur looks like crud to the trained eye. A properly scaled image on a sharp display will ALWAYS look better. Always, always, always. It's just the nature of the beast. A CRT will force you to strain your eyes to make out details. It will look OK if everything is centered and large on the screen (thus the over-reliance on close-ups in television work), but it will hurt if you're trying to make out detail. For an example, try watching the King Kong remake on a CRT vs. an LCD panel. You won't see a damn thing on the CRT. (I know, I tried. The director was having WAY too much fun cramming detail into HD resolutions.)

Good case in point -- I've played classic lower-res games on very high-end computers, using all the latest filtering and smoothing that emulators offer, and none of it is even in the same league as the smoothness of playing one of these games on a CRT. One is a natural look, the other is forcing one resolution into another, no matter how you look at it.

Sorry, but this is confusion on your part. The CRT is not producing a better image. The older game console was hacking the TV.

"Wait. What?" I hear you say.

CRTs are still broken up into pixels of their own. A number of colored phosphor elements on the screen are lit up to make the final color you see. To reduce manufacturing costs, CRTs shared color elements between neighboring pixels. By manipulating the relationship between these elements, it was possible to double the number of apparent pixels or provide a wider range of colors.

This all goes back to the NTSC signal and how it encodes color information. It uses a luma/chroma scheme (YUV in analog NTSC; YCbCr is the digital equivalent) that encodes the luma (Y) along with blue-difference (Cb) and red-difference (Cr) values, with the chroma typically shared between two pixels. Notice something missing? Right: there is no green-difference signal. Green carries most of the luma, so it can be reconstructed from Y, Cb, and Cr, and the eye's lower sensitivity to chroma detail makes the sharing acceptable. Older games exploit this relationship, along with the blurriness of the screen, to give a more colorful and interesting picture than would normally have been possible using the technology of the time. (I should take a screenshot of Ultimate Wizard on an emulator and a CRT sometime to demonstrate the difference. It's quite striking.) When we emulate these games, we cannot replicate the signal precisely, so the end result actually ends up far too sharp and loses the hacks applied to the signal.
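For the curious, the luma/chroma split looks roughly like this. It's a sketch using the standard BT.601 weights (full-range, no offsets, for simplicity), not the exact NTSC encoder math:

```python
# Sketch of the luma/chroma split, using the standard BT.601 weights.
# Note that green carries the largest share of luma, which is why no
# separate green-difference signal is needed: G is recoverable from Y, Cb, Cr.
def rgb_to_ycbcr(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luma: mostly green
    cb = (b - y) / 1.772                    # scaled blue difference
    cr = (r - y) / 1.402                    # scaled red difference
    return y, cb, cr

# 4:2:2 subsampling: one Cb/Cr pair shared between two horizontal pixels.
def subsample_422(pixel_a, pixel_b):
    ya, cba, cra = rgb_to_ycbcr(*pixel_a)
    yb, cbb, crb = rgb_to_ycbcr(*pixel_b)
    # Both lumas survive; the chroma is averaged across the pair.
    return ya, yb, (cba + cbb) / 2, (cra + crb) / 2

print(rgb_to_ycbcr(0, 255, 0))   # pure green: high luma, negative chroma
```

Two adjacent pixels of different color end up sharing one chroma value, which is exactly the kind of blending the old console tricks leaned on.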

That being said, the effect can be reproduced on an LCD. Using the Ultimate Wizard example above, I have run my C64 on both an LCD monitor with a television card and a CRT screen. The output is the same in both cases. The TV card correctly computes the expected pixel values and produces an image that looks more or less exactly like it did back in the day. Blur and all.

So be careful what you point to as an example of superior picture quality. It may not be what you think it is at all.

EDIT: I should also emphasize that there are other issues in play, including color fidelity (never as good on an LCD as a CRT, although newer tech like LED is finally catching up and making this a non-issue

Keep in mind that your suggestion of 480i on a 4:3 screen sucks for color fidelity. Just because it exists in the technology doesn't mean that the underlying signal can provide it. What most people mean when they refer to superior color or audio of analog technology is a certain "warmth" caused by incorrect calibration of the device. e.g. Skin tones may come across as more realistic because the TV is shifted too much into the red spectrum. From a tech perspective, you're not seeing what the signal really contains. You're seeing an altered image that looks good for some things, but bad for others.

Digital displays tend to display what is really in the signal. However, most modern displays have enough settings to allow you to adjust the color to recapture that "warmth" that you're missing.

That being said, it sounds like what you're really looking for is a solid classic gaming experience in addition to the modern experience. What you should do is visit AtariAge.com sometime and tell them you want help setting yourself up for both classic and modern gaming. They'll first suggest that you have two televisions. A CRT for older systems (using the actual system, not emulation) and a flat panel of some sort for your modern consoles. (Since the modern consoles are designed with the sharpness of a flat panel in mind.) They should be able to put you on the right track to get the experience you want out of both your old and your new consoles.


warioswoods

@thewiirocks

You have several valid points, but you're also still oversimplifying in some areas. "Blur looks like crud to the trained eye" / "The natural blending of CRTs has a more formal name: Blur" -- that's misleading. We're discussing standard-definition sources plugged into either a CRT or an LCD, and you're raving about the various configurations for LCDs while ignoring that the sharpness of a CRT is also adjustable. If properly adjusted, you'll have all the detail of your standard-def source visible, but with a smoothing that is not so much blur -- killing details, etc. -- as it is a more natural look, without the pixels being so painfully obvious. Your example is even worse than mine:

For an example, try watching the King Kong remake on a CRT vs. an LCD panel. You won't see a damn thing on the CRT. (I know, I tried. The director was having WAY too much fun cramming detail into HD resolutions.)

Well, duh, that's a film that was shot for HD, while I'm talking about output from a system whose games simply don't have HD details, they are made for standard definition. The example couldn't be less relevant -- I'm not opposed to all higher resolution, just to shoehorning a standard-def system into high def using all manner of algorithms and adjustments that simply never reproduce the look of a high-end CRT.

Your notes about color fidelity are also highly misleading. I know the whole bit about people listening to vinyl for the "warmth", and the controversies over higher fidelity to the source input versus certain effects that merely seem natural, but here you're skipping over the simple fact that LCDs do not have as full a range of displayable color as high-end CRTs. You can read about this just about anywhere; LCDs have certain limitations with regard to color, and you're absolutely wrong that they perfectly replicate the colors in the signal, as they have difficulty with certain ranges, and often with contrast as well.

Most of these issues are either just now getting worked out or soon will be, but only in the most expensive sets, and HD has been sold to customers for years under false premises, with no mention of the loss of many fundamental aspects of quality and an emphasis on mere resolution. Many of the "fixes" in these sets -- such as the horrible, horrible "Motion Plus" in Samsung models -- are so hideous that they far outweigh the boosts in resolution and sharpness.


This topic has been archived, no further posts can be added.