Wednesday, August 29, 2007

Intel needs a 6 GHz quad to compete against a 3 GHz Phenom

A 5 GHz Intel quad scores 26K in 3DMark06.

A 3.2 GHz Intel quad scores 16K.

A 3 GHz Phenom scores 30K.

60 Comments:

Blogger Chuckula said...

Sharikou... why the hell did you have to wait so long to regurgitate AMD fanboyism? This is YESTERDAY's lame excuse for news.

I have a question... if the Opteron is so amazingly fast, then why the hell wasn't that 1.6 GHz piece of crap at Computex destroying all the Intel machines? It was clock for clock slower than the existing 2.4 GHz quads!! After all, it would still require an Intel running at 3.2 GHz or more to stop that thing (according to your delusional fanboyism).

There is another option: You are full of shit. That seems a little more likely, and I'm just wondering if you will lie and deny you even posted this crap in about 2 weeks when the truth comes out and it ain't pretty for AMD.

4:22 PM, August 29, 2007  
Blogger Unknown said...

As reported by the same person who made the reverse hyperthreading claims about socket AM2.

There's a reliable source for you.

Or, you could read a little further (on other sites, actually) and learn that 3DMark is GPU dependent, and how overclocked the GPU must have been in order to obtain this score.

And then you could do a little more research here:
http://www.xtremesystems.org/Forums/showthread.php?t=102058
And run the 3DMark06 score calculator and find that even entering a score of 99999 for the CPU will not get you to 30K for a total score.

In other words, the Inq is full of its usual shit, and so is Sharikou.

Can't wait for the AMD fanboys to argue with the facts again.

4:41 PM, August 29, 2007  
Blogger Unknown said...

BTW, here's a link to how 3DMark06 actually calculates the scores.

Go ahead and do the math yourself and figure out how fast a CPU you need to get 30K when running any current GPU.

http://www.tomshardware.com/2006/04/03/3dmark06_under_the_magnifying_glass/page12.html
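To make the math concrete, here is a rough sketch of that weighting in Python (the constants follow the Futuremark whitepaper as summarized in the Tom's article; treat them as approximate, and the sample inputs as made up):

# Rough sketch of the 3DMark06 overall-score weighting (constants as described
# in Futuremark's whitepaper; illustrative, not an official calculator).
def overall_score(sm2_score, sm3_score, cpu_score):
    gs = 0.5 * (sm2_score + sm3_score)         # combined graphics score
    return 2.5 / (1.7 / gs + 0.3 / cpu_score)  # graphics-heavy harmonic weighting

# Even an absurd CPU score barely moves the total; it is capped near
# 2.5 * GS / 1.7, i.e. about 1.47x the graphics score.
print(round(overall_score(8000, 8000, 5000)))    # ~9174
print(round(overall_score(8000, 8000, 99999)))   # ~11601
# Reaching 30K overall therefore needs a combined graphics score of roughly
# 30000 * 1.7 / 2.5 = 20400, no matter how fast the CPU is.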

4:50 PM, August 29, 2007  
Blogger Unknown said...

Hey Doctor, you don't think that comparing a single-GPU system to a multi-GPU system in a synthetic GPU benchmark would have anything to do with the scores, do you?

And since you are a doctor, please relate these synthetic scores to real-world application performance.

4:58 PM, August 29, 2007  
Blogger GutterRat said...

Hey Sharikou,

Great News!

The Wikipedia entry for MORON has been augmented with 'Sharikou Ph.D.' as a synonym.

Again, congratulations!

6:14 PM, August 29, 2007  
Blogger Scott said...

MEMORY BENCHMARKS!

ROFL!!!

Yes, because memory benchmarks are better, as opposed to other REAL WORLD ones that actually matter.

Nice try fanboy.

6:15 PM, August 29, 2007  
Blogger Unknown said...

Tigerton launches next week:

http://news.com.com/8301-13579_3-9767602-37.html

Barcelona is pre-fragged. AMD is finished.

9:30 PM, August 29, 2007  
Blogger anti-Intel guy said...

1. K10 @3 GHz = 30K in 3DM06
2. I spoke with Theo Valich and confirmed that the score is not a lie.

"I got a confirmation from head of AMD EMEA and he confirmed to Mike that I am right".

THIS IS NOT AMD FANBOYISM, this is the TRUTH.

BANG!

10:12 PM, August 29, 2007  
Blogger Unknown said...

AMD 690G is plagued with issues relating to DVD playback. George Ou finds that Intel G965 is a far better choice.

http://blogs.zdnet.com/Ou/?p=632&page=3

10:19 PM, August 29, 2007  
Blogger Unknown said...

Ho Ho explains in the set of comments below this thread that it's impossible to get 30,000 in 3DMark06.

Nice try, AMD fanboy.

I had in my possession, for a brief period of time, two G92 GPUs and a Yorkfield CPU. They were all ESs. I ran 3DMark and got a score of two-hundred thousand! Unfortunately, I took pictures but the flash drive I had saved them on was stolen. I had to return that gear to Nvidia and Intel, respectively.

There you go. A total BS post. Anyone can make one up. I could add credible lines about knowing the right people at Nvidia and Intel and all that other nonsense.

That's exactly what Theo Valich has done.

10:27 PM, August 29, 2007  
Blogger anti-Intel guy said...

@giant: THIS IS THE TRUTH, nothing more, nothing less! You'll see that I'm right after 09/05.

Cheers

10:31 PM, August 29, 2007  
Blogger Ho Ho said...

"You'll see that I'm right after 09/05."

What will I win if you are wrong?

11:25 PM, August 29, 2007  
Blogger anti-Intel guy said...

10 beers... or maybe a bottle of scotch... ;)

11:52 PM, August 29, 2007  
Blogger Unknown said...

Barcelona PRE-FRAGGED by Intel

http://www.xtremesystems.org/forums/showthread.php?t=157136


Game Over Sharikaka. Go home.

12:00 AM, August 30, 2007  
Blogger anti-Intel guy said...

This comment has been removed by the author.

12:10 AM, August 30, 2007  
Blogger anti-Intel guy said...

8 x 2453 MHz vs 8 x 2006.5 MHz

Cinebench 10 - 64 bit

Xeon E5320
1 cpu = 2.844
x cpu = 17.298
Speed = 6.08x

Opteron 2332
1 cpu = 1.896
x cpu = 13.295
Speed = 7.01x

Yeah, nice.
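For what it's worth, those speed-up figures are just the multi-CPU score divided by the single-CPU score; a quick Python sketch with the numbers above (the helper is mine, not from the original post):

# Multi-core speed-up and efficiency from the Cinebench 10 scores quoted above.
def scaling(single, multi, cores=8):
    speedup = multi / single
    return round(speedup, 2), round(100 * speedup / cores)  # speed-up, % of ideal

print(scaling(2.844, 17.298))   # Xeon E5320:   (6.08, 76)
print(scaling(1.896, 13.295))   # Opteron 2332: (7.01, 88)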

12:36 AM, August 30, 2007  
Blogger Ho Ho said...

That is all that native quad-core and massively superior HT gives over a shared FSB? I'm disappointed, to say the least.

Also, per-clock speed is kind of interesting. Scaling K10 linearly up to the Xeon's clock, it should score around 2300 per CPU and ~16250 for 8 cores total. That means at the same clock K10 is quite a bit slower than Xeon, and that Xeon isn't even a 45nm part with SSE4.
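The extrapolation behind those numbers is just a linear clock scale-up of the Opteron scores; a small sketch (it assumes performance scales perfectly with clock, which is optimistic):

# Naive clock-for-clock extrapolation of the 2006.5 MHz Opteron scores
# up to the Xeon's 2453 MHz, assuming performance scales linearly with clock.
clock_ratio = 2453 / 2006.5            # ~1.22
print(round(1.896 * clock_ratio, 3))   # single CPU: ~2.318 (the "~2300" above; Xeon: 2.844)
print(round(13.295 * clock_ratio, 2))  # all cores:  ~16.25 (the "~16250" above; Xeon: 17.298)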

1:09 AM, August 30, 2007  
Blogger Unknown said...



That is all that native quad-core and massively superior HT gives over a shared FSB? I'm disappointed, to say the least.


It's pathetic, to say the least. All that blabbing on about how the FSB can't scale, MCM is the devil, etc., and that's the meager scaling boost they get? Don't forget, Clovertown goes to 3 GHz while Barcelona is stuck at 2 GHz.

AMD, Pre-Fragged products are their specialty!

2:55 AM, August 30, 2007  
Blogger Aguia said...

That means at the same clock K10 is quite a bit slower than Xeon, and that Xeon isn't even a 45nm part with SSE4.

What server application that you know uses SSE4? Or even SSE3?
You talk of SSE4 as if it were some immediate performance-boost technology that all existing software already benefits from…

4:07 AM, August 30, 2007  
Blogger anti-Intel guy said...

Theo from the Inq sent me another mail 10 minutes ago and told me he just finished some benches with K10 @ 2.5 GHz. And you know what? He was right about K10... It scored about 24K in 3DM06. I'll keep you informed as soon as I get other news. THESE ARE NOT LIES. Believe what you want.

PS: XtremeSystems lied about K10.

4:14 AM, August 30, 2007  
Blogger Ho Ho said...

"What server application that you know uses SSE4?"

how about encryption?


"Or even SSE3?"

You seem not to know that SSE3 mainly helps with thread synchronization and other multithreaded things.


"You talk of SSE4 as it’s some immediate performance boost technology that all existing software already benefit from it…"

No, not existing but future products quite likely will.


"I'll keep you informed as soon as i got other news"

Let him do a quick run with superpi and Cinebench also to see what a "real" K10 can do


"PS: XtremeSystem's lied about K10."

About what exactly? Those 2GHz ES samples are all fake?

4:27 AM, August 30, 2007  
Blogger Hornet331 said...

anti-Intel guy: links, pics, or even a validation link?

4:32 AM, August 30, 2007  
Blogger anti-Intel guy said...

K10 @ 2.5 GHz benches will pop up soon enough on the Inq, but not until 09/05.

XS guy cannot be trusted. That's all I can say.

4:38 AM, August 30, 2007  
Blogger Unknown said...



K10 @ 2.5 GHz benches will pop up soon enough on the Inq, but not until 09/05.


Proof or it's BS. Screenshots are good. When you run 3Dmark06 on a computer you have to submit the results online. You then get a link for those results. Where is this link?

BTW, I just finished talking to my sources at Nvidia and Intel. G92 and Nehalem will be released next Tuesday. Both will run at 5GHz. You'll see proof of this when I deem it appropriate. Anyone that says anything to the contrary on this matter is clearly lying.

4:42 AM, August 30, 2007  
Blogger Aguia said...

ho ho,

No, not existing but future products quite likely will.

If that's so, comments like this:

That means at the same clock K10 is quite a bit slower than Xeon, and that Xeon isn't even a 45nm part with SSE4.


are unnecessary. Unless you want me to say:

Wooo, 24000 in 3DMark and that Phenom isn't even 45nm and using the new K10 instructions.
^Comments like this sound biased and absurd.

how about encryption?
Link to an SSE4 encryption test/application and performance comparison, please.

4:47 AM, August 30, 2007  
Blogger Ho Ho said...

My point was that if K10 isn't all that fast compared to current Clovertowns, then against Penryns with ~5% higher IPC it will be even less competitive, more so if the apps use SSE4.

As for SSE4 tests, I can give you a link to the SSE4 instructions; anyone who knows basic math and encryption can see how they can speed it up. There are no benchmarks to show it simply because nobody has released anything yet.

5:04 AM, August 30, 2007  
Blogger Unknown said...

ekkekeke.. Barcelona has a secret weapon.. If Intel cannot clock their CPUs over 6 GHz to match a 2.5 GHz Phenom, then Intel will definitely BK by Q1'08.. ;-)

6:30 AM, August 30, 2007  
Blogger Unknown said...

According to an article on TechARP, the SPEC numbers are already done but are under NDA. An AMD rep says that Barcelona is 20-30% faster on average and wins by as much as 170% on some benchmarks.

http://www.techarp.com/showarticle.aspx?artno=434

6:34 AM, August 30, 2007  
Blogger Ho Ho said...

That seems to be compared to their dual-core K8s.


"More importantly, he says, the Quad-Core Opteron (Barcelona) will offer 45-85% better performance than current dual-core Opteron processors at the same power consumption and thermal dissipation."


What happened to doubling the performance with the same thermals?

6:43 AM, August 30, 2007  
Blogger Unknown said...

the Quad-Core Opteron (Barcelona) will offer 45-85% better performance than current dual-core Opteron processors at the same power consumption and thermal dissipation.

Quad-Core Opteron = 1.9 GHz

Current dual-core Opteron = 3.0 GHz

45-85% better = the 1.9 GHz quad-core Opteron is 45-85% faster than the 3.0 GHz current dual-core Opteron at the same power and thermal dissipation..

7:00 AM, August 30, 2007  
Blogger Unknown said...

John says the Quad-Core Opteron systems had already been submitted to the benchmark organizations for testing. The results are all under NDA and will only be released on launch date, September 10, 2007

He said that the Quad-Core Opterons, the 1.9 GHz and 2.0 GHz parts, have already been submitted under NDA and the results will only be released on the launch date, September 10.

7:09 AM, August 30, 2007  
Blogger Unknown said...

the Quad-Core Opteron (Barcelona) will offer 45-85% better performance than current dual-core Opteron processors at the same power consumption and thermal dissipation.

And.. he also says the Quad-Core Opteron (the 1.9 GHz and 2.0 GHz parts that AMD wants to launch this September 10) will offer 45-85% better performance than the current dual-core Opteron (the 3.0 GHz Opteron) at the same power consumption and thermal dissipation.

7:14 AM, August 30, 2007  
Blogger Unknown said...

"The pics are gone with my stolen laptop, though."

LOL... right...

9:38 AM, August 30, 2007  
Blogger Unknown said...

The article also states:

LAST WEEK in Leipzig my kit was nicked, but before that happened we asked AMD if it would let us run memory benchmark scores on a system there. The reps gave us the company line and declined, so we decided to disclose the benchmark scores of our own K10 benchmarking here and now.

So AMD declined to let him run memory tests on the system. Yet he was still able to run 3Dmark, overclock the CPU and run 3Dmark again?

If the AMD staff wouldn't let him run a memory test, why would they let him run the 3Dmark tests?

10:04 AM, August 30, 2007  
Blogger Aguia said...

What happened to doubling the performance with the same thermals?

You're mixing things up. It's doubling the number of cores within the same thermals.

11:12 AM, August 30, 2007  
Blogger Unknown said...

According to this video (featuring none other than Randy Allen!) a quad core was twice as fast as a dual core at the same frequency:

http://youtube.com/watch?v=VGiv9Dtrc5Q

At the same frequency are the key words. The dual-core Opteron goes to 3.2 GHz. Barcelona doesn't go that high.

11:31 AM, August 30, 2007  
Blogger NT78stonewobble said...

"XS guy cannot be trusted. That's all i can say. "

But the wholly IMPARTIAL "anti-intel guy" can be trusted?

Yes offcourse...

11:51 AM, August 30, 2007  
Blogger Unknown said...

http://www.worlds-fastest.com/wf0000.html

http://www.worlds-fastest.com/wfz991.html

1:46 PM, August 30, 2007  
Blogger Unknown said...

we asked AMD if it would let us run memory benchmark scores on a system there. The reps gave us the company line and declined

Line = benchmarks he was allowed to run, such as 3DMark06.

Declined = benchmarks he was not allowed to run, such as the memory test.

Stolen = if his laptop had not been stolen, the K10 benchmarks would have been revealed and there would be no more surprises for Intel people such as you guys.. ;-)

3:58 PM, August 30, 2007  
Blogger Unknown said...

At the same frequency are the key words. The dual-core Opteron goes to 3.2 GHz. Barcelona doesn't go that high.


No.. the key words are "Barcelona to blow away Intel CPUs"

4:09 PM, August 30, 2007  
Blogger Unknown said...

I know what a 2 GHz Barcelona does in 3DM06; let's just say that must have been one very, very special 3 GHz chip and video card setup to hit 30K.

-Gary Key, Anandtech Editor

4:19 PM, August 30, 2007  
Blogger somebody said...

Hi all,
To those who don't believe that Phenoms are quite fast, I'd like to say the following.
I used to do a lot of benchmarking of different instructions on a Celeron PC. The results were disgusting: a simple jump misprediction could cost 700 clock cycles, hundreds of clocks if you did an XOR before a jump, and so on. What I am trying to say is that existing CPUs are simply primitive and there is still a lot of room for optimization at the architectural level. I dare say a perfect architecture would mean at least 100 times the speed of existing CPUs.
So if Phenom does what the Inq says, well, actually, it's no big deal.

4:39 PM, August 30, 2007  
Blogger oneexpert said...

This comment has been removed by the author.

4:51 PM, August 30, 2007  
Blogger Unknown said...

OneExpert, you tool, how are the ATI and AMD products energy saving?

The ATI 2900XT is the highest power-consuming card.

The AMD 6400+ is the highest power-consuming chip, especially among dual cores.


What are you smoking, you tool?

5:02 PM, August 30, 2007  
Blogger oneexpert said...

This comment has been removed by the author.

7:23 PM, August 30, 2007  
Blogger oneexpert said...

This comment has been removed by the author.

8:28 PM, August 30, 2007  
Blogger Unknown said...

PAID AMD STUDY DOES NOT USE INTEL QUAD CORE PARTS.

What kind of crap is that? Just because AMD can't seem to launch ANYTHING on time, they exclude Intel quad-core results?

I heard from AMD that their quad core parts were never delayed. They've just simply been set to release right alongside Duke Nukem Forever.

8:45 PM, August 30, 2007  
Blogger NT78stonewobble said...

@onexpert

"BUY ATI/AMD superior, super saving, video solutions... "

A more correct conclusion to your post would be:

"You get what you pay for."

As of right now, any AMD/ATI solution is inferior in performance to an Intel/Nvidia solution.

Oh well, let's hope AMD/ATI can soon compete performance-wise.

Lower prices and higher performance, driven by competition, would be in any consumer's interest, right?

10:14 PM, August 30, 2007  
Blogger Aguia said...

OneExpert, you tool, how are the ATI and AMD products energy saving?

The ATI 2900XT is the highest power-consuming card.


Nope:

pcstats

At idle, one Diamond Viper Radeon HD 2900XT videocard draws 178W, about 18W less than the MSI NX8800GTX-T2D768E-HD. Even when a single Radeon HD 2900XT is overclocked, it still draws less than one Geforce 8800GTX.

Mind you, one has 1GB of memory, the other 768MB.

Energy consumption jumps a lot when the videocards are under gaming loads.
The MSI NX8800GTX-T2D768E-HD draws more power (345W) than a Diamond Viper Radeon HD 2900XT (337W). With one Diamond HD 2900XT overclocked, power consumption rises slightly to 346W.

3:27 AM, August 31, 2007  
Blogger Unknown said...

Nvidia cheats in HD HQV test??

Nvidia tweaked its driver to meet the quality standards of the most famous HD HQV video quality benchmark and with the last 163.11 and 163.44 beta drivers it can score some nice numbers with this test. The catch is simple, HQV has a lot of static scenes and Nvidia can tweak the driver to look good on this non motion video.

6:27 AM, August 31, 2007  
Blogger oneexpert said...

This comment has been removed by the author.

1:12 PM, August 31, 2007  
Blogger Unknown said...

Totally incorrect. The study has MANY flaws. I'm no expert but I'll just sit here and point a few out to you:

1) The 5160 is Intel's fastest dual-core CPU. Why didn't they use AMD's fastest dual-core CPU, the 8224 SE? Because its TDP is 30W higher, perhaps?

2) Did they use the new G0 stepping 5160, which cuts the TDP to 65W, down from 80W?

3) They claim it's only fair to compare dual-core CPUs to other dual-core CPUs. This totally negates Intel's quad-core advantage.

4) When using two quad-core CPUs in a 2P server you'll obtain far higher performance numbers, producing much higher performance per watt. Again, they failed to use quad-core CPUs only because AMD has nothing at all to show in this category.

BUY INTEL high performance QUAD CORE CPUs, energy saving cpus, platforms and video solutions.

7:08 PM, August 31, 2007  
Blogger Unknown said...

This comment has been removed by the author.

10:21 PM, August 31, 2007  
Blogger Unknown said...

These scores come from XtremeSystems:

They are a comparison at 2 GHz between Barcelona and Harpertown (both unreleased products):

If you want all the pretty pictures etc. as before the link is: http://www.xtremesystems.org/forums/showthread.php?t=157136

Harpertown results on the left, Barcelona on the right.

23.3s vs 39.7s for SuperPi
20.48 vs 17.13 in "relative speed" for Fritz
2454 vs 1896 in single-CPU Cinebench
15334 vs 13295 in multi-CPU Cinebench
6.25x vs 7.01x Cinebench scaling (78 vs 88 percent)
10.4s vs 10.6s for wPrime 32M
310.9s vs 327.4s for wPrime 1024M

As is plainly clear, Barcelona is behind Harpertown on IPC. The only test Barcelona is better in is Cinebench scaling.

But what's also important to note is that this scaling advantage will shrink with the Harpertown CPUs that feature a 1.6 GHz bus, up from 1.33 GHz today. It's not much, but it should be good for a few percent.
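Turned into percentages (a quick sketch using the figures above; both chips are at 2 GHz, so the single-CPU gap roughly reflects the per-clock gap):

# Percentage gaps from the Harpertown vs. Barcelona Cinebench figures above.
def pct_faster(a, b):
    return round((a / b - 1) * 100)

print(pct_faster(2454, 1896))          # single-CPU: Harpertown ~29% ahead
print(pct_faster(15334, 13295))        # multi-CPU:  Harpertown ~15% ahead
print(round(100 * 15334 / 2454 / 8))   # Harpertown scaling: ~78% of ideal 8x
print(round(100 * 13295 / 1896 / 8))   # Barcelona scaling:  ~88% of ideal 8x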

10:23 PM, August 31, 2007  
Blogger Unknown said...

Where do you get those figures, Giant? Can you provide links pls.. ;-)

2:37 AM, September 01, 2007  
Blogger Unknown said...

Can't AMD users even read now?

If you want all the pretty pictures etc. as before the link is: http://www.xtremesystems.org/forums/showthread.php?t=157136

4:55 AM, September 01, 2007  
Blogger Unknown said...

But it seems none of the benchmarks can be trusted, Giant.. Didn't you notice that something is wrong with the CPU-Z shot? The HT link seems to have been disabled by the tester.. And CPU-Z detects only one CPU instead of the supposed 4 cores.. Besides that, why does CPU-Z report the processor code name Agena?? Wasn't it supposed to be a Quad-Core Opteron? As far as I know, Agena/Phenom is for desktops, not servers.. AMD server processors are still called Opteron.. correct me if I'm wrong.. ;-)

5:53 AM, September 01, 2007  
Blogger terry said...

I stopped buying Intel processors after the P4 2.6 GHz Prescott (the proc was so hot). I bought it because I got taken in by the marketing blurbs of Intel ads and Intel fanbois on forums. But after I bought it, my PC crashed randomly because of heat. I gave up.

I bought an AMD X2 3600+ and am very pleased it gives me what I deserved for the money I had at the time: for 150 bucks I got an Asus mobo with the GF6150 chipset and 1GB of DDR667 RAM, plugged it into the old PC, and it's alive. The proc is so good that I can't notice any lagging frames when playing Counter-Strike 1.6 online while the virus scan kicks in as scheduled; I only find out after playing. Before this, I had lagging frames with my Prescott.

I also observed the same thing with my friend's Intel E6300. Poor him.. he paid twice the price of mine and still suffers lags.

Intel processors sux big time!!!

Intel ads really sux every time!!!

Intel fanbois shitting anywhere, anytime!!!

8:49 AM, September 01, 2007  
Blogger Ho Ho said...

pezal
"The HT link seems like being disabled by the tester"

Without HT the CPU couldn't talk to the rest of the system. Basic 101 on computers.


"And, the CPU-Z detects only one CPUs running instead of supposed to be 4 cores"

So how does task manager show 8 cores?


"Beside that, why the CPU-Z detects the processor code name Agena?? Wasn't it supposed to be Quad-Opteron?"

It is not the first time that CPUz can't find the correct name of the CPU, especially if it is unreleased. You should see what does CPUz show under Linux when run through Wine, IIRC I saw that I had 3GHz Pentium 2 with all the latest SSE versions :)

2:22 AM, September 02, 2007  
Blogger NT78stonewobble said...

@terry

"i stop buying intel proc since p4 2.6ghz prescott ( proc so hot ). i bought it because got eaten by marketing blurbs of intel ads and intel fanbois at forums. bt since bought, my pc crash randomly because of heat. i gave up."

Lots of people get fooled when starting out with pc's. You really have to know what your doing.

Yes the prescott was hot and noisy and though I didn't own it I'd rather have had the previous northwood processor @ say 3.2 ghz.

"i bought amd x2 3600+ and very pleased it can give what i desrved with the money i hv that time, with 150 bucks, i bought asus mobo with gf6150 chipset and 1gb of ddr667 ram. plug into the old pc. and its alive. "

I do believe that the x2 3600+ is a good buy when being on a budget. I'm considering it for a very low budget media center pc and for upgrade for 2 family members pc's.

"the proc so good that i cant notice any lagging frames when playing online games counter-strike1.6 with virus scan kicks in as scheduled. only knows after playing. before this, i had lagging frames with my prscot."

Now now, the prescot wasn't a true dual core processor. And you'll probably allways risk lagging with antivirus unless you are running some insane raid array of 15000 rpm harddrives.

Basically both the antivirus and game WILL try to access the slowest component of your computer. The harddrive.

Now if the game doesn't have to access the harddrive (perhaps only during level loads) you won't notice this as lag, but perhaps longer loading times.

"i also observed the same thing happens as my presot with my friend's intel e6300 proc. poor him..bought twice the price of mine still suffer lags."

That can happen but you really can't blame it on the processor. Theres a lot of other variables. Ram, OS, antivirus software, game software, harddrive type, level of fragmentation and so on...

"intel processors sux big time!!!"

Some do and others don't... especially compaired to the their competition at the same time. I mean the history of processors through the last 20 years...

"intel ads really sux everytime!!!"

AGREED !!!


"intel fanbois shitting anywhere,anytime!!! "

So do AMD fanbois. Actually all fanboys are morons IMHO...

11:30 AM, September 02, 2007  
