
Processors ...?


roseusa (IS-IT--Management)
Can anybody explain the difference between

1. dual core
2. quad core

and if the speed is 2.0 GHz, then what will the actual speed be in a dual and a quad core...?
 
1. dual core
2. quad core

1. Two processor cores on one die.  2. Four processor cores on one die.

and if the speed is 2.0 GHz, then what will the actual speed be in a dual and a quad core...?

The number of cores in a CPU has no direct relationship to clock speed. Whether you see a benefit depends largely on whether the software in question is written to take advantage of multiple cores.

In truth, there are very few pieces of software that see a speed benefit on multi-core/multi-processor computers. So if given the choice between a slower quad-core and a faster dual-core for the same money, go with the faster dual-core.
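To make the software point concrete, here is a minimal, hypothetical sketch in Python (standard library only; busy_work() is a made-up stand-in for a real workload). The serial loop stays on one core no matter how many the CPU has, while the Pool version can spread the same chunks across all available cores:

[code]
# Minimal sketch: the same CPU-bound job run serially vs. split across
# worker processes. Only the parallel version can use extra cores.
# busy_work() is a stand-in for a real workload.
import os
import time
from multiprocessing import Pool

def busy_work(n):
    # Burn CPU cycles; the result is irrelevant, only the time taken matters.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    chunks = [2_000_000] * 8
    print("Cores reported by the OS:", os.cpu_count())

    start = time.time()
    for c in chunks:                 # serial: one core, no matter how many exist
        busy_work(c)
    print("Serial:   %.2f s" % (time.time() - start))

    start = time.time()
    with Pool() as pool:             # parallel: work spread across available cores
        pool.map(busy_work, chunks)
    print("Parallel: %.2f s" % (time.time() - start))
[/code]

On a single-threaded program the two timings would be essentially identical, which is exactly the "software has to support it" point.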


----------
Measurement is not management.
 
Let's start with your last question first...
When you see a clock speed such as 2.0GHz reported on a multi-core CPU, it means that each core runs at that speed. So a 2.0GHz Core 2 Duo from Intel has each core running at 2.0GHz. That doesn't mean twice the computing power of a single core running at 2.0GHz, however, as the following will help explain.
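A standard rule of thumb for why the power isn't simply doubled (not something mentioned in this thread, just the usual back-of-the-envelope formula) is Amdahl's law: only the fraction of a program that can run in parallel gets any help from extra cores. A quick sketch:

[code]
# Amdahl's law: speedup = 1 / ((1 - p) + p / n)
# p = fraction of the program that can run in parallel, n = number of cores.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# A program that is 50% parallelizable on a 2.0GHz part:
print(speedup(0.5, 2))   # ~1.33x on a dual core, not 2x
print(speedup(0.5, 4))   # ~1.6x on a quad core, not 4x
[/code]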


Start with a quick summary:

This is a bit dated, but it gives you an idea of what the discussion was like just 8 months ago:

Here's an up-to-date article regarding gaming:


As you will learn, it depends on what you plan to do most on your PC. In most situations these days, you'll benefit from having a quad core. On the other hand, if you just plan to surf the web, do e-mail, and play an occasional game here and there along with other low-intensity tasks, then a quad-core will likely be overkill and won't show much improvement over a dual-core in your situation. It is also important to note that some tasks will still perform faster on a dual-core CPU whose cores are clocked higher than those of the quad-core you are comparing it to.

But now that many applications and games are being developed with multi-core systems in mind, it won't be long before we see the dual core begin to disappear from sight. It's never a bad idea to future-proof your system. Just a year ago, I was recommending the dual over the quad. How quickly times have changed!

~cdogg
"Insanity: doing the same thing over and over again and expecting different results." - Einstein
For posting policies, click here.
 
Thanks for the replies, it helps to understand difference.
thanks so much
 
There is also a "quasi-quad core" CPU (two dual-core CPU dies joined at the front side bus) and a "true quad core" (all four cores on the same die). Most being made today are the latter, but check anyway. Four cores on the same die will be faster.

Earlier dual-cores (like the Pentium D) were two single-core dies joined at the FSB, but those are all but gone now...I think. As with the best quad-cores, it's best to have both cores on the same die.

Tony

Users helping Users...
 
Actually, Intel still takes a pair of dual cores and "glues" them together to get to 4 cores. AMD is the only one making "true" quad core x86 CPUs. But Intel has been able to pack enough cache into their quads to make up for the slow FSB bottleneck.

________________________________________
CompTIA A+, Network+, Server+, Security+
MCSE:Security 2003
MCITP:Enterprise Administrator
 
And as a spanner in the works, don't forget your tri-cores!

Only the truly stupid believe they know everything.
Stu.. 2004
 
The good news is that the two processor cores on each dual-core die can still communicate through the fast shared L2 cache. It is only when communication must cross from one die to the other that the FSB becomes an issue - the bad news.

Intel's Kentsfield was the quick, cheap alternative to redesigning a single die for all four cores. Such a solution will eventually come from Intel. The first 8-core processors you'll see publicly available will likely be two true quad-core dies glued together in the same way from Intel.

My guess is that AMD will eventually have to do something similar with HyperTransport to keep up with Intel. Using a "cross-connect" to link each core requires a new die design, which is costly and time-consuming in most cases. It's hard to say though, as AMD is keeping a tighter lip these days on future plans. And of course, the advantage of 8 cores over 4 won't be seen until more of the OS and applications change enough to take advantage of them...
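As a rough, hedged illustration of that die-sharing point: on Linux you can pin a process to specific cores so that cooperating threads stay on cores that share a die and its L2 cache instead of talking across the FSB. Which core numbers share a die varies by CPU, so the 0/1 pair below is only an assumption; check lscpu or /proc/cpuinfo on your own machine.

[code]
# Minimal sketch (Linux only): pin this process to two cores that are assumed
# to share a die/L2 cache, so cooperating threads avoid crossing the FSB.
# Cores 0 and 1 are just an example pairing, not a guarantee for your CPU.
import os

os.sched_setaffinity(0, {0, 1})    # 0 = the current process
print("Now allowed to run on cores:", os.sched_getaffinity(0))
[/code]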


~cdogg
"Insanity: doing the same thing over and over again and expecting different results." - Einstein
For posting policies, click here.
 
Earlier dual-cores (like the Pentium D) were two single-core dies joined at the FSB, but those are all but gone now...I think.
Nope, they're not all gone, but I wish they were! [wink]

I work on a machine with one of those CPUs at work, and our "media pc" at church has one of those stinkin' processors. One of these days, we're gonna banish the church one and put in a new machine. I was thinking of trying to move up to a Core 2 Quad, or at least a Duo, but now I'm thinking of waiting to see how the new Nehalem (or however you spell it) series works out. Tom's Hardware said they couldn't give details, but from the generalizations they made, it sounds purty dad-burn good. [smile]

--

"If to err is human, then I must be some kind of human!" -Me
 
as AMD is keeping a tighter lip these days on future plans.

That depends on who you work for. Other than that, I'd say that your post is on the money. :)

________________________________________
CompTIA A+, Network+, Server+, Security+
MCSE:Security 2003
MCITP:Enterprise Administrator
 
AMD have slated a true 12 core for 2010. They are bypassing 8 core.

Only the truly stupid believe they know everything.
Stu.. 2004
 
Cool. 12 Cores?
Must be a Jewish CPU. [wink]

--

"If to err is human, then I must be some kind of human!" -Me
 
You know, I was thinking that maybe in a year or so we could upgrade the current church media pc to a Core 2 Duo/Quad, but who knows... if we keep holding off (as it is doing what it NEEDS to do, really) until 2010 or 2011, who knows what'll be available? A 12-core CPU, 64GB of DDR4 RAM, multiple SSDs in a RAID config???

I'm crying and drooling all at the same time!

And who knows, maybe by then, [penguin] will be a more viable option even for media related stuff!

[smile]

--

"If to err is human, then I must be some kind of human!" -Me
 
AMD's Quad FX platform is going to be updated later this year to allow the use of two quad-core CPUs and DDR3. Then in 2009, their 45nm chips (that debut at the end of this year) will be released in an 8-core version. In 2010, there is a processor code-named Bulldozer that is supposed to come out supporting 16 cores.

But probably most important, AMD plans on releasing their first Fusion design in late 2009 which integrates the CPU and GPU on the same die. This will bring powerful graphics to the low-cost market, as well as changing the graphics card industry as we know it today.

I had to dig to find it, but that there is some exciting stuff... [thumbsup2]

~cdogg
"Insanity: doing the same thing over and over again and expecting different results." - Einstein
For posting policies, click here.
 
CPU and GPU on the same die. This will bring powerful graphics to the low-cost market, as well as changing the graphics card industry as we know it today.

They have taken a small step already. Look at the reviews for the 780G based motherboard. It's an integrated chipset that goes way beyond what Intel is doing.
They got heavily criticised for the amount they paid for ATI, but I think it's now starting to reap its rewards.

Only the truly stupid believe they know everything.
Stu.. 2004
 
Sympology said:
Look at the reviews for the 780G based motherboard.

I didn't, but bought one anyway when building a computer for my sister. It did impress me though; the motherboard I got came with VGA, DVI, and HDMI ports and gets a video score of around 3 in Vista. That isn't great, but then the computer will never be used for gaming.

cdogg said:
AMD plans on releasing their first Fusion design in late 2009 which integrates the CPU and GPU on the same die. This will bring powerful graphics to the low-cost market,

I am taking all this with about five pounds of salt. Yes, placing the GPU on the same die as the CPU (or placing them onto one "chip" and connecting the two with HT) will bring a performance boost to "onboard" graphics; but you still suffer the penalty of sharing non-graphics optimized system RAM.

Then there is also the question of which GPU will be integrated into the package. I cannot see AMD integrating the latest and greatest as it would then cannibalize sales of their new generation cards; so already we're probably talking about a GPU that is a generation behind.

Add to that the different refresh cycles of CPUs and GPUs, and I really start to wonder just how "powerful" the Fusion solution will be. Just remember how long it took AMD to jump on the DDR2 bandwagon because of the integrated memory controller - we have yet to see a DDR3 capable CPU from AMD.

Many will rightly point out that these drawbacks won't really matter for the target market; but that's the point - I don't see anything to get excited about. All that I see in Fusion's future is slightly better onboard graphics that will improve the Vista experience for the bargain PC buyer. I do not see it providing vastly superior graphics performance to the current crop of onboard graphics or coming close to challenging discrete graphics cards. I could very well be wrong though; AMD may really have a miracle in their labs that will surprise us.
 
vanka said:
...but you still suffer the penalty of sharing non-graphics optimized system RAM

Actually, standalone cards do too. Although a majority of the rendering is processed on the card itself, the CPU and system RAM play key roles as well. Fusion will actually reduce the amount of latency between the GPU, L1, L2, L3, and system RAM. The downside as you've mentioned is that it must share those memory resources with the CPU.

The solution, I suspect, will be to have a standalone card work in conjunction with a Fusion processor. Tasks that require the CPU and GPU to interact can be diverted to Fusion, while raw rendering horsepower that only requires access to GPU memory can be channeled to the standalone card. Imagine the possibilities here that go beyond Crossfire or SLI!

I cannot see AMD integrating the latest and greatest as it would then cannibalize sales of their new generation cards

They have already stated that "Fusion's graphics processor will be based on a graphics card AMD plans to release in the near future." Well, if it's not the latest and greatest, it will sure be close! The article is dated January 2008:

The question as to how powerful Fusion will be remains a good one. I understand where you're coming from. AMD has already made it clear that early designs will be targeted at notebook users, because integrating the CPU and GPU yields a huge performance-per-watt advantage over today's integrated graphics solutions. That is why I said "powerful" and "low-cost". Its overall performance (though better than average) is not expected to compete with high-end standalone cards anytime soon.

But besides being more energy efficient and crushing the "integrated" competition, it seems that AMD is hoping it will lead to moving other non-CPU components like I/O controllers on-die as well. Clearly this model is following the path of the revolutionary Cell processor. Who knows how successful it will be? If it flops in the long run, not much is lost on the attempt.

we have yet to see a DDR3 capable CPU from AMD
And what do we have from Intel? The P35 or X38 chipsets? Tests have shown that DDR3 doesn't have any real effect yet. It shows promise, but today's hardware isn't yet pushing limits that require the jump. AMD might actually be timing it right this time for a change by waiting until late this year into early next. Though they shouldn't have hesitated in the past regarding DDR2, I just don't see how that's an issue this time around.

~cdogg
"Insanity: doing the same thing over and over again and expecting different results." - Einstein
For posting policies, click here.
 
Wow, great discussion. From what I've read, nVidia will be integrating video into ALL their chipsets, with a desktop utility (HybridPower) that can shut off the GFX card when you don't need it...less noise, lower power consumption, longer life, and better thermals all come to mind. And really, unless you're gaming or dealing with 3D rendering, how much do we really use the GFX card anyway?

Tony

Users helping Users...
 
cdogg said:
Actually, standalone cards do too. Although a majority of the rendering is processed on the card itself, the CPU and system RAM play key roles as well.

No argument there, but once the CPU sends the data to the discrete video card, all processing takes place there. Then there's also the onboard memory that has been optimized for graphics (GDDR2, GDDR3, and so on) that no integrated solution can (currently) make use of. True, with a discrete graphics card data still needs to be transferred back and forth between the CPU and GPU; but PCI-E x16 version 1.1 is more than enough, not to mention that PCI-E 2.0 doubles the bandwidth.
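For rough context on those bus speeds (standard per-lane figures, not taken from this thread): PCI-E 1.x moves about 250 MB/s per lane in each direction and PCI-E 2.0 about 500 MB/s, so a x16 slot works out to roughly:

[code]
# Rough theoretical bandwidth of a x16 slot, per direction.
lanes = 16
pcie_1x_per_lane = 250   # MB/s per lane, PCI-E 1.0/1.1
pcie_2x_per_lane = 500   # MB/s per lane, PCI-E 2.0

print(lanes * pcie_1x_per_lane / 1000.0, "GB/s")   # 4.0 GB/s for PCI-E 1.1 x16
print(lanes * pcie_2x_per_lane / 1000.0, "GB/s")   # 8.0 GB/s for PCI-E 2.0 x16
[/code]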

They have already stated that "Fusion's graphics processor will be based on a graphics card AMD plans to release in the near future".

Yes, but the article that you provide a link to states that Fusion is being targeted at laptops - dual-cores available in the second half of 2009 and no timeline for quad cores. Also, according to the article, desktop chips will be released "eventually". This information leads me to believe that the GPU that will be integrated into Fusion and "released in the near future" is probably a mobile GPU. This doesn't mean that it will be horrible, but mobile GPUs are a trade-off between energy efficiency and power.

And what do we have from Intel? The P35 or X38 chipsets? Tests have shown that DDR3 doesn't have any real effect yet.

You're missing the point I was trying to make in bringing up AMD's slow adoption of DDR2 and DDR3. I agree that DDR3 offers only marginal improvement over DDR2, as was also true of DDR2 vs DDR when DDR2 was released; the point is that once you integrate something into the CPU, it will be tied to the CPU's refresh/development cycle.

I'm not implying that AMD made a mistake with integrating the memory controller, but it does have its drawbacks in that the CPU must be redesigned when memory standards change. I foresee a similar issue with the Fusion; while the integrated GPU will probably be decent when it comes out, what about 6 months or a year later? I do not believe that AMD will keep updating the integrated GPU as usually the only way a CPU gets updated during its life-cycle is with higher clock speeds and a die shrink.

If, however, the GPU will not share the same die as the CPU but will instead be sandwiched together on one chip like Intel's quad-cores to provide an easy way to upgrade the GPU, then I really don't see the point of Fusion, as CPU/GPU communication will then go through HT - same as any other integrated video solution.

Again, don't take this as a total denunciation of the Fusion; it will make great sense in the laptop, small form factor, and low-end markets. I just can't help but see it as a glorified integrated GPU.
 
vanka,
Yes, I don't think your opinion is far off from mine. In fact you restated my point about the first Fusion designs being for notebooks. This is where the immediate impact will be felt, I agree.

Where we seem to differ is where it goes from there. Like you, I don't believe it will ever be able to replace the need for a standalone card, for the reasons you and I both have pointed out. But unlike you, I don't see the idea of Fusion as being only about integration of the GPU and CPU. I believe it will lead to more than that. I believe it will be a fascinating extension to the CPU's capabilities that will further enhance 3D and complement other components such as the video card. Guess we'll just have to wait and see!

Anyone following the hijacking of this thread might also find this article interesting:
 
