Sunday, May 13, 2007

K10 2.4GHz runs at 1.1V

Core 2 needs 1.3V. AMD's 65nm dual stress liner strained silicon on insulator with memorization and embedded Silicon Germanium (e-SiGe) (more buzzwords) is better than Intel's bulk vanilla dumb 45nm.

AMD is simply smarter.
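
A rough back-of-the-envelope sketch of why the lower core voltage matters, assuming the usual first-order model where dynamic power scales with C*V^2*f. Only the 1.1V and 1.3V figures come from the post; equal capacitance and clock speed are assumptions, so treat the output as illustrative, not as a measurement.

#include <stdio.h>

int main(void)
{
    /* First-order dynamic power model: P ~ C * V^2 * f.
       With capacitance and clock held equal (an assumption), the power
       ratio between two voltages reduces to (V1/V2)^2. */
    const double v_k10   = 1.1;   /* claimed K10 core voltage    */
    const double v_core2 = 1.3;   /* claimed Core 2 core voltage */

    double ratio = (v_k10 * v_k10) / (v_core2 * v_core2);

    printf("Relative dynamic power at 1.1V vs 1.3V: %.2f (~%.0f%% lower)\n",
           ratio, (1.0 - ratio) * 100.0);
    return 0;
}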

33 Comments:

Blogger Evil_Merlin said...

Once again, Ph(ake)d is comparing the next gen AMD chip to the CURRENT gen Intel chip...

6:18 PM, May 13, 2007  
Blogger lex said...

Buzz word..

The buzz word that I like best is profits.. 1.9 billion last I checked.

Oh.. how did AMD do with that buzz word? It's not in their vocabulary.

"Ph"ony "D"octorate, you and AMD can have SOI, dual stress liner and any other invention...

You've got no profits, no benchmarks and falling market share. Psst, those are the important buzz words.. LOL

6:23 PM, May 13, 2007  
Blogger Randy Allen said...

This is the exact same thing he lambasted Intel for doing last year: comparing current CPUs to ones that aren't out yet.

I'll follow Sharikou's lead, and remind him that Intel's next generation Penryn CPUs will also see voltage requirements drop.

The K10 in that image is only clocked at 2.4GHz as well; the fastest parts (2.9GHz?) could well require more voltage.

6:23 PM, May 13, 2007  
Blogger Unknown said...

"Once again, Ph(ake)d is comparing the next gen AMD chip to the CURRENT gen Intel chip..."

As opposed to comparing Core 2 to K8? The point is K10 will be faster than Core 2 clock-for-clock and with a much lower vcore than Core 2 on the same manufacturing process.

6:24 PM, May 13, 2007  
Blogger Unknown said...

Hmm, but according to the same source, Barcelona was running at 1.3V for 1.9GHz....

6:54 PM, May 13, 2007  
Blogger Randy Allen said...

As opposed to comparing Core 2 to K8?

Both are the current CPUs from Intel and AMD.

The point is K10 will be faster than Core 2 clock-for-clock

Proof? AMD hasn't provided much of that at all, only a few vague numbers. Current Core 2 enjoys a 20% IPC advantage over K8, and Penryn will increase that by about 10% (except when SSE4 is used, which provides a large benefit for video encoding etc.), so Penryn will have roughly a 30% IPC advantage over K8.

In other words, a dual core K10 will need a 40% or greater IPC advantage over current dual core K8 processors to hold any meaningful IPC advantage over Penryn.
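
(A quick sketch of that compounding, using just the 20% and 10% figures above; the arithmetic and the break-even framing are mine, for illustration only.)

#include <stdio.h>

int main(void)
{
    /* Compound the claimed IPC gains: Core 2 is +20% over K8, and Penryn
       adds roughly another +10% on top of that (figures from the comment
       above; everything else here is illustrative). */
    double core2_vs_k8  = 1.20;
    double penryn_gain  = 1.10;
    double penryn_vs_k8 = core2_vs_k8 * penryn_gain;   /* = 1.32 */

    printf("Penryn vs K8 IPC: about +%.0f%%\n", (penryn_vs_k8 - 1.0) * 100.0);
    printf("So K10 needs more than +%.0f%% over K8 per clock just to break even with Penryn\n",
           (penryn_vs_k8 - 1.0) * 100.0);
    return 0;
}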

It's also worth noting that only Penryn will have SSE4. AMD grouped a few SSE3 instructions together and called that "SSE4A", clearly not the same thing as SSE4.

7:01 PM, May 13, 2007  
Blogger DaSickNinja said...

The image is sketchy to say the least.

7:44 PM, May 13, 2007  
Blogger Randy Allen said...

Here's a good one for the AMD fanboys:

http://vr-zone.com/?i=4946&s=1

A good, reputable review of the HD2900XT. It is soundly fragged by the GTX; no need even for the Ultra. This means that Nvidia commands 100% of the $400+ video card segment. Lots of people will say that's not a lot of sales, but it doesn't matter: it's basically pure profit, plus good bragging rights.

In summary: the HD2900XT can compete with the GTS OK, but not the GTX. For me, the 8800 GTS 320 still seems to be the best bang for the buck.

8:02 PM, May 13, 2007  
Blogger enumae said...

Sharikou

Considering that the core voltages can be adjusted separately from the IMC, which voltage is CPU-Z reading, the IMC or the cores?

And if it is the cores, can you provide a link to a source?

Thanks

8:28 PM, May 13, 2007  
Blogger Unknown said...

randy allen/giant said..
It's also worth noting that only Penryn will have SSE4. AMD grouped a few SSE3 instructions together and called that "SSE4A", clearly not the same thing as SSE4.



This might answer your argument:

The new core also features 3-way integer execution and address generation, 3-way 128-bit wide floating point execution, Enhanced 3DNow! marchitecture, and MMX, SSE, SSE2, SSE3 & SSE4A single instruction, multiple data (SIMD) instruction extensions.

Further, the CPU can cope with advanced bit manipulation instructions, super forwarding, prefetch into L1 data cache, deep out-of-order integer & floating point execution, 8 additional XMM registers (SSE, SSE2, SSE3 and SSE4A) & 8 additional GPRs in 64-bit mode.

Last but not least is the Enhanced HyperTransport marchitecture. If this is too much for you, don't worry, it is too much for most of us, but we like that the K10 supports SSE4A so it might have a fighting chance in encoding.

8:45 PM, May 13, 2007  
Blogger Randy Allen said...

I was wrong earlier, this much is certain: SSE4A is not just grouped SSE3 instructions. I found this on AnandTech:


AMD also introduced four new SSE extensions: EXTRQ/INSERTQ, MOVNTSD/MOVNTSS. The first two extensions are mask and shift operations combined into a single instruction, while the latter two are scalar streaming stores (streaming stores that can be done on scalar operands). We may see some of these same instructions included in Penryn and other future Intel processors.


So this is SSE4A: two instructions that combine existing mask and shift operations into a single instruction, plus two genuinely new scalar streaming stores.
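
For the curious, here is a minimal sketch of two of those instructions (EXTRQ and MOVNTSS) from C. It assumes GCC's SSE4A intrinsics in ammintrin.h, compiled with -msse4a on an SSE4A-capable chip; the intrinsic names and example values are my assumptions for illustration, not anything taken from the AnandTech quote.

#include <ammintrin.h>   /* SSE4A intrinsics (assumed GCC header)        */
#include <emmintrin.h>   /* SSE2 helpers for moving data into/out of XMM */
#include <stdio.h>

int main(void)
{
    /* EXTRQ via _mm_extracti_si64: pull an 8-bit field starting at bit 4
       out of the low 64 bits of an XMM register. Bits above the field are
       not guaranteed, so mask to 8 bits before printing (expect 0xAB). */
    __m128i src   = _mm_cvtsi64_si128(0x0ABCLL);
    __m128i field = _mm_extracti_si64(src, 8, 4);   /* length = 8, index = 4 */
    printf("EXTRQ field: 0x%llx\n",
           (unsigned long long)_mm_cvtsi128_si64(field) & 0xFF);

    /* MOVNTSS via _mm_stream_ss: a scalar streaming (non-temporal) store,
       i.e. write one float out without pulling the line into the caches. */
    float out;
    _mm_stream_ss(&out, _mm_set_ss(3.14f));
    printf("MOVNTSS stored: %f\n", out);

    return 0;
}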

8:52 PM, May 13, 2007  
Blogger Evil_Merlin said...

OH NOES!

More not so good reviews for AMD's supposed NVidia killer:

"Truth to be told, Nvidia's boards are far less hot and consume less power in 2D mode when compared to AMD's power hog."

9:02 PM, May 13, 2007  
Blogger R said...

The AMD Phenom for gamers will be out at about the same time as Penryn.

9:19 PM, May 13, 2007  
Blogger Randy Allen said...

This little gem is from another R600 review on hardocp.

Radeon HD 2900 XT In Trouble

ATI definitely did not pull off another Radeon 9700 Pro. The Radeon 9700 Pro was a phenomenal card for the time. It introduced a 256-bit bus for the first time and excelled in DirectX 9 shader performance. The Radeon 9700 Pro positively shocked gamers and had a very long life span. Everyone wanted a generational jump such as that with the R600. The ATI Radeon HD 2900 XT however is more akin to NVIDIA’s GeForce FX 5800. It does not seem like this will have a very long life span in comparison. NVIDIA quickly answered the GeForce FX 5800 by introducing the GeForce FX 5900 (NV35). ATI really needs to do something similar in this situation, or they may lose some loyal fans in the enthusiast community and you can bet they are going to continue to lose sales to NVIDIA’s 8000 series products.

Here is what it boils down to. If the Radeon HD 2900 XT performed exactly on par with the GeForce 8800 GTS in every game, it would still be a loser because it draws nearly 100 more watts of power, meaning it is very inefficient. The facts are though that it doesn’t even match the 8800 GTS currently. In every game it slides in underperforming compared to the GeForce 8800 GTS 640 MB, and it does it while drawing a lot more power, as much power as or more than an 8800 GTX. Not only that, but a GeForce 8800 GTS 640 MB based video card can now be had for up to $70 cheaper than the Radeon HD 2900 XT. I don’t know about you, but a video card that is cheaper, runs a lot faster and draws less energy just seems like the better value to me.

This doesn’t even bring into the equation the GeForce 8800 GTX which outclasses the ATI Radeon HD 2900 XT on every front. Yes, it costs about $130 more, but that $130 buys you a lot more performance in games. It is sad that ATI does not have a GPU to compete with the GeForce 8800 GTX. At this point NVIDIA has, dare I say it, a monopoly over the high-end of computer gaming video card market. If you want the best gaming performance, it is still the GeForce 8800 GTX. The GTX has no competition.

No High–End Video Card?

Is ATI going to completely drop out of the high-end segment? If you ask anyone at ATI they will tell you that they actually haven’t because you can CrossFire two of these video cards to achieve higher performance. But do you really want two of these high wattage hot running video cards in your system? There are gamers that actually prefer just using one video card and have not bought into the dual GPU solutions. We asked ATI why there is no higher-end version and they pretty much told us that no one would buy it when you take the CrossFire into consideration. Well, we know that is simply false because there are people that buy $499 and $599 single video cards, case in point the GeForce 8800 GTX. ATI’s answers were a painful copout to not being able to get the job done and it was obvious.

I have a strong suspicion that the R700 will be able to compete in the high-end. The R600’s main problem is that the GPU is inefficient and draws so much power that they have simply hit a wall as to how fast they can make the Radeon HD 2900 XT. There is simply no way they can make it faster with the current ASIC design. There will be a 1 GB version in the future utilizing GDDR4, but from what we hear it will most likely not bring much of a core or memory clock speed increase; it will simply be a 1 GB version of the HD 2900 XT. GDDR4 should help some with the power utilization, but even with that it appears the 2900 XT is simply running into a big brick wall.

It is sad to see less competition in the high-end. We may be seeing the first signs that this is a bad thing for consumers with the GeForce 8800 Ultra that was launched a few weeks ago. That video card only received a modest overclock from the GTX but incurred an incredibly high price penalty. A penalty that was so ridiculous it rendered the video card pointless. We hope this is not a trend starting; we’d hate to see overpriced video cards becoming the norm.

9:46 PM, May 13, 2007  
Blogger Randy Allen said...

The important part of the above quote:

It is sad that ATI does not have a GPU to compete with the GeForce 8800 GTX. At this point NVIDIA has, dare I say it, a monopoly over the high-end of computer gaming video card market. If you want the best gaming performance, it is still the GeForce 8800 GTX. The GTX has no competition.

10:37 PM, May 13, 2007  
Blogger abinstein said...

"This little gem is from another R600 review on hardocp."

I see an Intel fanboy running out of things to say amid the great performance of Barcelona, and starting to scream about a 1% niche market.

2:14 AM, May 14, 2007  
Blogger Randy Allen said...

Abinstein previously claimed that the data was false. Now that it has been verified as genuine, he claims that the high end graphics market is unimportant. This just gets funnier and funnier. What's next? If Barcelona is a flop, claim that quad core server CPUs are an unimportant market?!

We had AMD talk about how great R600 would be, and it was a total flop, slower than G80. No one should believe anything AMD says about Barcelona performance. For all we know it might well be slower than Clovertown.

A niche market? The market might be small, but it's almost pure profit. A lot more so than selling low end GPUs. It's one that's incredibly important to video card manufacturers.

AMD has lost significant market share in CPUs, falling below 20% to a meagre 19%. That puts Intel at 80% market share, with VIA taking the remaining 1%. AMD is building up inventory because no one wants to buy its underpowered desktop CPUs. Expect Intel to gain further market share from AMD throughout the year.

Intel has 45nm and Penryn coming this year, and it will plan the Penryn launch to be as close as possible to the Barcelona launch. This is a perfect way to crash AMD's party and prevent any possibility of AMD taking back market share.

Then next year Intel has a whole new architecture coming, with the graphics on the CPU die a full year before AMD will have Fusion ready. It will also scale up to 8 cores as Pat Gelsinger has stated. 8 core CPUs will be huge for servers.

So, as a recap, DAAMIT is losing money and market share on all fronts; graphics and CPUs. AMD is finished.

AMD BK Q2'08.

2:53 AM, May 14, 2007  
Blogger Ycon said...

It does not run at 1.144V
Notice how everything in the pic is slightly italic, but the voltage number is not... ph@k3... as fake as everything AMD is doing these days.

K10 will likely run at about 1.4V (prolly more) to sustain the higher clock-speed bins.

5:40 AM, May 14, 2007  
Blogger Evil_Merlin said...

abinstein did EXACTLY as a fanboi would.

When backed against the wall with facts, squirm.


To the fanboi, it's perfectly fine for things to matter one day, then mean nothing the next, as long as the fanboi feels safe.

Apparently the AMD fanbois ain't feeling so well about ATi/AMD's not-so-good top-of-the-line video cards/video processors.

6:03 AM, May 14, 2007  
Blogger Unknown said...

It's still far too early to say that AMD/ATI has lost this battle. AMD's returns this Q3 will change everything.. AMD will kick the Evil Intel back into darkness.. AMD will transform the Giant Intel back into the stupid Shrek.. Ekekekeke..

7:48 AM, May 14, 2007  
Blogger Unknown said...

AMD is chopping more jobs; 800 more people are to be cut.

Soon Hector will be one of them.

http://www.dailytech.com/AMD+to+Cut+More+Than+400+Jobs/article7241.htm

7:55 AM, May 14, 2007  
Blogger Evil_Merlin said...

pezal do you have Tourette's or something like that?

Why is it too early to declare the fight lost for AMD/ATi, but perfectly acceptable for you and your not-so-illustrious leader (who lies about his so-called Ph.D.) to declare Intel dead and beat?

Fanboism is bad Mkay?

7:56 AM, May 14, 2007  
Blogger Unknown said...

"We pushed out the launch of the R600 and people thought is must be a silicon or software problem…it's got to be a bug," said Dave Orton, president and chief executiveof ATI. "In fact, our mainstream chips are in 65nm and are coming out extremely fast. Because of that configuration, we have an interesting opportunity to come to market with a broader range of products," he explained.

"Instead of having them separate, we thought, lets line that up, so we delayed for several weeks," Orton continued, referring to the R600 family as a whole, which AMD now says will come out at the same time (a matter of weeks as opposed to months, according to Richard) instead of just the high-end version."


And yet today they paper launched the 2600.

AMD should be convicted of fraud.

8:25 AM, May 14, 2007  
Blogger Unknown said...

AMD's top-of-the-line GPU is seriously broken; it can't render images properly.

http://www.anandtech.com/video/showdoc.aspx?i=2988&p=15

Check the glowing step: it breaks down into a series of unlit bands on the AMD hardware.

9:28 AM, May 14, 2007  
Blogger Unknown said...

Best one yet:

"AMD has built hardware with the performance of an 8800 GTS in a power envelope beyond the 8800 Ultra"

9:31 AM, May 14, 2007  
Blogger PENIX said...

The R600 series is an excellent first release for AMD, but it is not their true first chip. As was stated by many, the R600 was being developed by ATI before the acquisition. While AMD surely had influence, most of the work was already done. This means that AMD's true first GPU has yet to be seen.

AMD's solid history of innovation and technological dominance will surely conquer the GPU market as well. There is no doubt in my mind that AMD will soon dominate both the CPU market and the GPU market.

10:03 AM, May 14, 2007  
Blogger Unknown said...

This is so much fun to watch.

"R600 will show you guys what AMD can do!"

R600 fragged by nVidia. R600 requires small nuclear reactor to power.

"R600 wasn't made by AMD! Muahahaha see!"

10:19 AM, May 14, 2007  
Blogger Unknown said...

R600 is excellent?!?

It's a power-sucking piece of junk according to every single review. I'd welcome you to post a link to a review that calls the R600 "excellent".

10:20 AM, May 14, 2007  
Blogger Evil_Merlin said...

Power hungry, hotter than Hades, and with some serious rendering issues.

Other than that, it performs like a mid-level NVidia card.

Good going AMD!

10:45 AM, May 14, 2007  
Blogger Unknown said...

I see an Intel fanboy running out of things to say amid the great performance of Barcelona, and starting to scream about a 1% niche market.


I see an AMD fanboy.

12:07 PM, May 14, 2007  
Blogger The Dude said...

Egads! 1.1 Volts is pretty amazing, if this info is any good, especially for 65 nm. They obviously massaged the hell out of this core.

At 1.1 vcore for a 2.4 GHz Quadcore, AMD ought to have some serious ammunition for their upcoming server products.

If their upcoming CPUs perform anywhere near the claimed performance, they will clearly have the performance-per-watt advantage. Of course, if Intel manages to actually pull Penryn into Q4 in any volume, this may lessen the AMD offensive. It will be interesting to see what Penryn's power requirements will be.

12:45 PM, May 14, 2007  
Blogger ck said...

did fudo mention this?

http://www.fudzilla.com/index.php?option=com_content&task=view&id=982&Itemid=1

Barcelona B0 2.4 GHz tested
- 50 to 100 per cent faster than QX6700

Yes, he did!!

12:14 AM, May 15, 2007  
Blogger Unknown said...

oops... guess FUDzilla got kicked in the shin...

http://dailytech.com/Barcelona+Benchmarks++Dont+Believe+Em+Just+Yet/article7291.htm

I'm sorry Doc, but there is no 1.1V K10 @ 2.4GHz.

nice try though.

6:02 PM, May 15, 2007  
