Wednesday, February 28, 2007

Vista is an AMD game

With the launch of the AMD 690 chipset and its integrated Doom 3 capable graphics, all of Intel's IGPs become totally obsolete. Anyone who wishes to get the fancy Vista features must go either AMD or Nvidia.

18 Comments:

Blogger Unknown said...

The ATI Radeon X1200-family IGP features two pixel-pipelines and two vertex-shaders. Unlike some ATI Radeon X1000-series GPUs, the pixel-pipelines have not been decoupled – the ATI Radeon X1200-family only has two pixel-shaders. AMD specifies a 400 MHz GPU-core clock for reference designs.


Sorry, but this is only Doom 3 capable if you consider 15 fps to be "playing" Doom.

11:26 AM, February 28, 2007  
Blogger Wirmish said...

I've played DOOM 3 with my MSI K7N2G-LISR's integrated nForce 2 IGP (~ GeForce 2 MX 400) at 800x600 minimum settings, although I don't recommend it to anyone...

11:43 AM, February 28, 2007  
Blogger Ho Ho said...

How on earth did you get it running on a GF2*? Doom3 needs programmable shaders to work. The register combiners that the GF2 has are not enough.

*) Actually that IGP is a GF4 MX, but the GPU is the same as a GF2.

12:22 PM, February 28, 2007  
Blogger enumae said...

Why do you continue saying Intel's chipsets are not Aero capable?

It has already been proven that your comment is incorrect.

Here is a link to someone using a MacBook running Windows Vista with Aero Glass on an Intel GMA950 IGP.

It shows it works fine. Keep in mind this is not the new X3000 IGP on the 965 chipset; this is the older GMA950 from the 945 chipset.

12:30 PM, February 28, 2007  
Blogger abinstein said...

On the desktop, AMD sort of "attaches" itself to Vista, and it's a business strategy, so maybe the executives feel this is good for "business," I don't know. But I really don't think Vista is a good OS, nor Aero a good part of it. Being able to run Aero with a few higher performance points doesn't make a better CPU/GPU/chipset, either.

1:03 PM, February 28, 2007  
Blogger PENIX said...

abinstein said...
On the desktop, AMD sort of "attaches" itself to Vista, and it's a business strategy, so maybe the executives feel this is good for "business," I don't know. But I really don't think Vista is a good OS, nor Aero a good part of it.


Like it or not, Vista is coming and will replace WinXP in market share in the next few years. I think this is an excellent business move, as it will allow AMD to move into the business market, which relies heavily on integrated graphics.

Personally, I like Aero. I consider it far superior to the Fisher-Price WinXP look. It is nothing revolutionary, but it is an improvement. I also like the sidebar. I had always been fond of WindowMaker dock apps, and this seems to be an improvement upon that. All in all, not bad for a free piece of software.

3:08 PM, February 28, 2007  
Blogger Unknown said...

Here's an interesting piece of news for those like Sharikou who have been touting the gaming abilities of Ati's IGPs vs. Intel's: Ati's IGPs all lack support for Shader Model 3. This means that some newer games, like Rainbow Six: Vegas, won't run at all. And other games, like Far Cry, only appear to run faster than on Nvidia IGPs because they are doing less work. While the Nvidia IGP is displaying all the Shader Model 3 effects, the Ati IGP can't, resulting in lower image quality and higher FPS.

Nvidia 6150 IGP and Intel GMA X3000 are both Shader Model 3 compatible. I wouldn't say that makes them suitable for gaming at all though.

Look at the scores for the IGPs here: http://www.hexus.net/content/item.php?item=7972&page=9

Far Cry at 15 FPS at 1024x768 with low details? No thanks.

AMD's IGPs are completely undesirable due to lack of Shader Model 3 support. More upcoming games need this, making this chipset useless.

7:20 PM, February 28, 2007  
Blogger PENIX said...

giant said...
AMD's IGPs are completely undesirable due to lack of Shader Model 3 support. More upcoming games need this, making this chipset useless.


Does Vista use Shader Model 3? That's the market AMD is going after with this IGP. This is obviously not a top-tier product, just the superior alternative to the Intel equivalent.

8:16 AM, March 01, 2007  
Blogger abinstein said...

Giant :"Nvidia 6150 IGP and Intel GMA X3000 are both Shader mode 3 compatible. I wouldn't say that makes them suitable for gaming at all though.
...
AMD's IGPs are completley undesirable due to lack of shader mode 3 support. More upcoming games need this making this chipset useless."


You have quite the wrong idea about Shader Model 3 vs. Shader Model 2. The biggest "improvement" of the SM3 programming model is not in quality but in performance: longer shader length, better branching, fewer passes/operations, support for displacement mapping, etc. The only conceivable quality difference is the color depth, from 8-bit integer (SM2) to 32-bit FP (SM3).

But first, are the games actually utilizing the higher color depth? Do they have different sets (one 4 times larger than the other) of texture libraries? Of course not! Second, can your eye detect a brightness difference of less than 1/256 within 1/15 sec?

Conclusion: SM3 is meant to improve graphics engine performance, much, much more than rendering quality. If ATi could achieve better performance without utilizing SM3, guess how much better it could do when it implements those new features later!

2:34 PM, March 01, 2007  
Blogger Ho Ho said...

abinstein
"But first, are the games actually utilizing the higher color depth?"

Yes, they are. Every time you hear "HDR" you should know that the game uses floating-point textures, usually 16 bits per channel.


abinstein
"Do they have different sets (one 4 times larger than the other) of texture libraries?"

Mostly not; they just output to an FP texture.


abinstein
"Of course not! Second, can your eye detect a brightness difference of less than 1/256 within 1/15 sec?"

Mine sure can, and really well, especially in games where you have lots of post-processing.

On the other hand, most non-HDR games don't use many post-processing filters, so you can't really compare them with HDR stuff.


abinstein
"SM3 is meant to improve graphics engine performance, much, much more than rendering quality. If ATi could achieve better performance without utilizing SM3, guess how much better it could do when it implements those new features later!"

Either way it would need a whole new IGP.

Btw, can anyone understand why Sharikou talks so much about Doom3? That game sucks; it is only a tech demo for the engine. Half-Life 2, Far Cry or Serious Sam 2 are the games that most people play.

2:59 PM, March 01, 2007  
Blogger abinstein said...

"Yes, there are. Every time you hear "HDR" you should know that game uses floating point textures, usually 16bit per channel."

HDR is great, but not because it offers 16-bit brightness resolution (it doesn't unless your LCD/monitor does), but because it makes nonlinear brightness mapping easier. Under HDR, you increase dynamic range by spending fewer bits on the very bright and very dark parts, similar to how human eyes react to the real world. This requires the use of nonlinear functions, which is difficult without FP calculation. Without SM3, the CPU has to do the HDR calculation, that is, if it wants to. Again, it's a performance improvement.

However, under HDR, you'll lose some visibility on the very bright or very dark parts of the scene, although it also makes the scene more realistic, because human eyes react the same way.
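
To put rough numbers on that (a toy Python sketch of a Reinhard-style tone map, with made-up luminance values, not any real engine's code):

    import numpy as np

    # HDR scene luminances can go far above 1.0, but the 8-bit
    # framebuffer only holds 256 levels.
    scene = np.array([0.001, 0.01, 0.1, 1.0, 10.0, 100.0])

    def tone_map(l):
        return l / (1.0 + l)  # nonlinear: compresses the extremes smoothly

    ldr_clamped = np.clip(scene, 0.0, 1.0)  # non-HDR: everything above 1.0 clips to white
    ldr_mapped = tone_map(scene)            # HDR: highlight detail survives the mapping

    print(np.round(ldr_clamped * 255).astype(int))  # [  0   3  26 255 255 255]
    print(np.round(ldr_mapped * 255).astype(int))   # [  0   3  23 128 232 252]

Without the nonlinear map, the three brightest inputs all collapse to the same white; with it, they stay distinguishable while the mid-range gives up some levels.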

"
abstein
"Do they have different sets (one 4 times larger than the other) of texture libraries?"

Mostly not, they just output to FP texture."


My point was that, HDR or not, the input and output brightness resolutions are the same; only the mapping is different.

"
abstein
"Of course not! Second, can your eye detect the difference of less than 1/256 brightness difference in 1/15 sec?"

Mine sure can and do it really good, especially in games where you have lots of post-processing."


No, your eyes can't detect the brightness resolution itself, only its distribution. If most pixels fall in the linear brightness region, then you won't be able to tell 16-bit from 8-bit processing.

"
abinstein
"SM3 is meant to improve graphics engine performance, much, much more than rendering quality. If ATi could achieve better performance without utilizing SM3, guess how much better it could do when it implements those new features later!"

Either way it would need a whole new IGP."


I'm not saying it won't, and I actually favor nvidia over ati myself. But the fact is that SM3 is more a performance than a quality enhancement. HDR makes scenes look more realistic to the eyes, but not necessarily better quality. For example, if you take a picture of a window frame while facing the sun, you'd likely wish the camera would get rid of the real-world "HDR effect".

What I'm saying is that the "improvement" of HDR over non-HDR is very different from that of, say, bump mapping over non-bump mapping, or ray-tracing over non-ray tracing.

"Btw, can anyone understand why Sharikou talks so much about Doom3? That game sucks, it is only a techdemo for the engine. Half Life 2, Far Cry or Serious Sam 2 are the games that most play."

Doom3 is just a benchmark.

4:59 PM, March 01, 2007  
Blogger Unknown said...

That's classic! HDR isn't much of an improvement when Ati's crappy IGPs don't support it?! LMAO!

Anyone that has used HDR in games like Far Cry knows that it makes the graphics in the game noticeably better.

R600 delayed until the end of June. AMD BK Q2'08 for sure. AMD will BK just as Nehalem comes out.

11:06 PM, March 01, 2007  
Blogger Ho Ho said...

abinstein
"HDR is great, but not because it offers 16-bit brightness resolution (it doesn't unless your LCD/monitor does), but because it makes nonlinear brightness mapping easier."

Basically, what HDR means is that when you have lots of filters over 8-bit colour channels, the colour quality will suffer greatly. With 16-bit FP per channel things are much, much better. It has nothing to do with your monitor. LCDs can't even output 8 bits per channel; most have 6 bits and use dithering. Only newer ones have started to get a few more bits per channel.

abinstein
"Under HDR, you increase dynamic range by spending less bits on the very bright and very dark parts, similar to how human eyes react to the real world."

First, do you know how FP works?
HDR means you can use values that are between 0 and 1/256. Without HDR you have only two options: 0 and 1/256. Same goes for the other end (255/256 and 1).
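
A toy Python illustration of that point (made-up values): everything below one 8-bit step collapses, while even half-precision floats keep the values apart.

    import numpy as np

    # Dark values an 8-bit channel cannot tell apart, but FP16 can.
    dark = np.array([0.0005, 0.001, 0.002, 0.003])

    as_int8 = np.round(dark * 255) / 255  # snap to the 8-bit grid (steps of 1/255)
    as_fp16 = dark.astype(np.float16)     # half floats keep all four distinct

    print(as_int8)  # collapses to just two levels: 0 and 1/255 (~0.0039)
    print(as_fp16)  # all four values survive, approximately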

abinstein
"This requires the usage of nonlinear functions, difficult without FP calculation"

I wouldn't say difficult, just pretty much impossible. 256 different values per channel is not nearly enough to express the most important values at both ends of the range.

abinstein
"Without SM3, CPU has to do the HDR calculation, that is if it wants to"

Say what? That would mean that all the per-pixel calculations have to run on the CPU. Have you ever heard of anything like that? I sure haven't.

abinstein
"However, under HDR, you'll lose some visibility on the very bright orvery dark parts of the scene"

No, you are wrong. Please explain why you think that.

abinstein
"No, your eyes can't detect the difference of brightness resolution, but its distribution. If most pixels fall in the linear brightness region, then you won't be able to discern 16-bit or 8-bit processing"

What if most pixels do not fall into the linear region while running all kinds of shaders over those pixels? That is exactly what HDR is for: to preserve the differences between very close values. The reason why those things are not often seen so far is that most developers avoid such situations.

I remember reading that John Carmack once said he had lots of problems trying to get decent visuals in Doom3 using only 8 bits per channel. As he didn't use HDR, you couldn't tell much difference between the darker shades; they were almost all completely black.

abinstein
"What I'm saying is that the "improvement" of HDR over non-HDR is very different from that of, say, bump mapping over non-bump mapping, or ray-tracing over non-ray tracing."

HDR on its own makes almost no visible image difference, but it allows effects that either were not possible with 8 bits per channel or that caused artefacts.

giant
"Anyone that has used HDR on games like FarCry knows that it makes the graphics in the game noticeably better."

As I said, HDR on its own offers almost nothing; it just allows nice bloom filters and different exposures. Those weren't easily doable without HDR.


Btw, HDR is technically supported under Shader Model 2, which means starting from the NV40 (FX+) series and the ATI R300 series (R9500+). The reason why it wasn't really usable before NV40 was that earlier architectures didn't support FP texture blending. The R500 series on the PC were the first that supported FP texture anti-aliasing, and they were also the first ATI GPUs that supported FP blending. G80 was the first NV chip that supported FP AA.

HDR in the Xbox 360 uses several different colour channel widths. The most common is 12-bit fixed point per channel. Even though it is much worse than 16-bit FP, it is still better than 8-bit integer.

2:43 AM, March 02, 2007  
Blogger abinstein said...

Ho Ho: "Basically what HDR means is that when you have lots of filters over 8bit colour channels then the colour quality will suffer greatily."

Wrong. HDR lets you represent very bright and very dark parts instead of clamping them. The # bits per color is not an issue past 8. At the end of the day, all brightness has to map to the monitor's.

"First, do you know how FP works?
HDR means you can use values that are between 0 and 1/256. Without HDR you have only two options: 0 and 1/256. Same goes for the other end (1/255 and 1)."


FP and HDR are two different things. Graphics cards are FP-optimized because almost all transformations require FP. The "HDR difference" is only in the post-output processing that maps luminance to 8-bit.

"abinstein
"Without SM3, CPU has to do the HDR calculation, that is if it wants to"

Say what? That would mean that all the per-pixel calculations have to run on CPU. Have you ever heard of anything like that? I sure haven't."


You can use a D3D API call, probably batched, or a customized function call, or not do HDR at all. The difference is performance, not possibility.

"abinstein
"However, under HDR, you'll lose some visibility on the very bright orvery dark parts of the scene"

No, you are wrong. Please explain why you think that."


If you want to represent the very bright and very dark parts on the screen (which has ~250 luminance levels), you've got to lose some in the middle.

"What if most pixels do not fall into the linear region while running all kinds of shaders over those pixels? That is exactly what HDR is for, to preserve the differences between very close values. Reason why those things are not very well seen so far is because most developers avoid such situations."

HDR does not preserve precision; floating-point representation does. HDR requires post-output FP processing, but not vice versa. You seem to have mixed up shader model, HDR, and FP.

"abinstein
"What I'm saying is that the "improvement" of HDR over non-HDR is very different from that of, say, bump mapping over non-bump mapping, or ray-tracing over non-ray tracing."

HDR on its own makes almost no visible image difference, but it allows effects that either were not possible with 8 bits per channel or that caused artefacts."


Frankly, I don't know what HDR you're talking about. But... never mind.

10:30 AM, March 02, 2007  
Blogger Aguia said...

Giant,

Any feature is important, but only as long as the GPU can sustain enough frame rate to make the game playable.
I'm not sure how many frames the Nvidia 6100 does with Far Cry + HDR; must be near unplayable?
I think features like AA, HDR and SM3 are useless if the IGP can’t perform well with them enabled.

AMD will not go bankrupt, and neither will Intel or Nvidia, ...
I don’t see anyone saying SiS or VIA is going bankrupt. Strange.

10:36 AM, March 02, 2007  
Blogger Ho Ho said...

abinstein
"HDR lets you represent very bright and very dark parts instead of clamping them"

Yes, it can, thanks to the wide range of values FP can hold.

abinstein
"The # bits per color is not an issue past 8."

Say we have a 10-bit integer per channel. That means the three smallest values we can represent are roughly 0, 0.001 and 0.002. Now what if I have to take the average of two pixels with values of 0.001 and 0.002? What would be the result of such a blending operation? Having more bits surely helps to reduce the impact on quality, but as long as you use integers you can't really use HDR.
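
In toy Python (illustrative numbers, not real GPU code), the integer blend has to land back on the grid, while a float keeps the true midpoint:

    # Average two dark pixels on a 10-bit integer grid vs. in float.
    a_int, b_int = 1, 2             # channel values out of 0..1023 (~0.001 and ~0.002)
    avg_int = (a_int + b_int) // 2  # integer blend: 1 -- the half step is simply lost

    a_f, b_f = 1 / 1023, 2 / 1023   # the same two pixels as floats
    avg_f = (a_f + b_f) / 2         # the true midpoint survives

    print(avg_int / 1023, avg_f)    # ~0.000978 vs ~0.001466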

abinstein
"At the end of the day all brightness has to map to that of the monitor's."

Yes, but when the colours are washed out from not having enough precision before they get to the framebuffer, there isn't much difference anyway.

abinstein
"FP and HDR are two things"

Yes, they are, but it seems to me that you don't know how very small and very big numbers are represented in FP, and thus you don't seem to understand how HDR effects work.

abinstein
"The "HDR difference" is only on post-output processing that maps luminance to 8-bit."

Please see the 10-bit per channel example again.

abinstein
"You can use a D3D API call, probably batched, or a customized function call, or not to do HDR at all"

Do I understand correctly that you are suggesting running the programs with the D3D reference rasterizer?

abinstein
"The difference is performance, not possibility."

Yes, and the performance difference between the lowest-end GPU and a CPU is several orders of magnitude. Too great to get any kind of usable performance.

If you don't believe me, then download the DX SDK and try the reference rasterizer on some game.

abinstein
"If you want to represent the very high and dark parts on the screen (which has ~250 luminance levels), you've got to loose some in the middle."

Yes, and no GPU has support for it when using integer colour values. FP doesn't lose anything from the middle; it just gets more imprecise with bigger numbers. As I said, look up how FP works.

abinstein
"HDR does not preserve precision, float-point representation does"

So it is, but you can't really use HDR without floating-point, or at least fixed-point, colour depths.

abinstein
"HDR requires post-output FP processing, but not vice versa."

Correct, but without HDR sources to work with, you can't use many post-processing effects without losing much of the quality. Again, see the 10-bit example.

abinstein
"You seem to have mixed up shader model, HDR, and fp."

Care to elaborate on what made you think so? It probably isn't a very nice thing to say, but I think I'm correct when I say I know quite a bit more about GPUs (and CPUs) than the average computer geek.

abinstein
"Frankly I don't know what is the HDR you're talking about. But... never mind."

I'm talking about regular HDR, more specifically rendering to a floating-point texture. Without using any post-processing effects you simply won't see any difference compared to non-HDR rendering. That's why I said that without additional post-processing effects there is no difference.

aguia
"I'm not sure how much frames the nvidia 6100 do with far cry + HDR, must be near to unplayable? "

Considering my FX5600 Ultra was choking on it at low resolution and minimal details, I don't think any IGP can run it at playable rates. Even my new 6600GT has problems, and that's still without HDR.

1:43 PM, March 02, 2007  
Blogger abinstein said...

Ho Ho, whenever I see someone break up an argument like you did above, I know he's lost the big picture of the discussion but just won't admit it.

I think the problem is that you've never written a rendering procedure yourself. There is no rendering that does not require FP calculation. I have no idea why you tried to hard-link HDR with FP processing. Yet all rendering results on screen are in the end quantized to raster images, 8 bits per channel. HDR-enabled processing preserves fine quantization until the very end, so that an eye-realistic, post-output tone mapping is possible.
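
As a minimal sketch of that pipeline (toy Python with a hypothetical filter and tone map, not any real API): keep the framebuffer in float through the post-processing chain and quantize to 8-bit only once, at the very end.

    import numpy as np

    hdr = np.array([0.002, 0.5, 4.0, 50.0])  # scene luminances

    def bloom(l):    # stand-in for any post-process filter
        return l * 1.2

    def tone_map(l): # nonlinear map into [0, 1)
        return l / (1.0 + l)

    # HDR path: float all the way, quantize once after the final tone map
    out_hdr = np.round(tone_map(bloom(hdr)) * 255).astype(int)

    # non-HDR path: clamp and quantize first, then filter the 8-bit data
    ldr = np.round(np.clip(hdr, 0.0, 1.0) * 255)
    out_ldr = np.clip(np.round(bloom(ldr)), 0, 255).astype(int)

    print(out_hdr)  # [  1  96 211 251] -- the two brightest pixels stay distinct
    print(out_ldr)  # [  1 154 255 255] -- 4.0 and 50.0 already clipped to the same white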

I think the original question is whether an ATi IGP has poorer quality and performs less computation than an nVidia one when running the same game. The answer is clearly no. Shader Model 3 has almost no effect (except HDR) on image quality, and it actually should help performance rather than reduce it. If you try to do HDR on an nVidia IGP, then I'm sure the performance will become much worse.

3:39 PM, March 02, 2007  
Anonymous Anonymous said...

AMD's IGPs are only Shader Model 2 compliant. They are completely useless and unfit for games.

AMD's R600 is delayed again, probably because of its insane power requirements and absurd length. By the time R600 rolls out, Nvidia will frag AMD with the G90.

9:04 PM, March 04, 2007  
