Poll: In theory, would 100 cores clocked at 100 MHz be better than a single 10 GHz core?
  • 100 cores @ 100 MHz is more effective for today's applications and future application development. (0 votes, 0%)
  • A single core @ 10 GHz is still more effective overall for our current software model, as well as for the future development of software. (0 votes, 0%)


In theory - more cores, or higher clock speed?

#1 - Darth-Apple, May 17th, 2013 at 2:13 PM
Everyone thought that by now, 5 GHz processors would pretty much be the standard. Instead, dual-core processors hit the market. The effectiveness of the second core was initially in question, since few applications at the time were designed to take advantage of it. Over time, manufacturers found that adding more cores was simply cheaper than raising the clock speed, so instead of 5 GHz processors we're seeing as many as twelve cores inside one CPU, at clock speeds of around 2 GHz or so. A similar trend is beginning to hit mobile devices, since it's cheaper to add more cores than to push the clock speed further, and software is slowly adapting. This raises the question of whether a single, very fast core would be more effective than tons of slower cores.

So, the question I'd like to raise is this: in theory, would 100 cores at 100 MHz or a single 10 GHz core be better for today's computing, and how would each model hold up for future computing as technology continues to evolve?

____________________________________________________________________________________________

My personal feeling is that our current software does not make good use of slow cores, so 100 cores clocked at 100 MHz would significantly hurt PC performance today. From a theoretical standpoint, though, 100 slow cores could be put to very good use, depending on the workload. Graphics cards, for example, use a similar model. If software were developed to be heavily multi-threaded and to run across many cores, each core could have its own resources, such as its own cache and registers, and those threads would not have to share a single L1 cache. That could either boost or harm performance, depending on whether a shared cache is more ideal for a multithreaded application, or whether each thread works on entirely different data. Either way, such a CPU setup has the potential to work if software development makes use of it, but as it stands now, most software outside of graphics is not designed to scale to that many cores.
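To make that concrete, here's a minimal sketch (my own toy example, not a real benchmark) of how a CPU-bound job behaves when it's run as one chunk versus split across many worker processes. The work() function and the chunk count are made up purely for illustration; the point is just that the split only pays off when the work divides cleanly into independent pieces.

Code:
# Toy comparison: the same CPU-bound job run as one piece vs. split across workers.
# Everything here (the work() function, N, the chunk count) is illustrative only.
import time
from concurrent.futures import ProcessPoolExecutor

def work(n):
    # Busy loop standing in for a CPU-bound task (compression, rendering, etc.).
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    N = 10_000_000

    # "One fast core": the whole job runs in a single process.
    start = time.time()
    work(N)
    print("single worker:", round(time.time() - start, 2), "s")

    # "Many slower cores": the job split into 100 independent chunks,
    # handed to a pool of worker processes (one per available core).
    chunks = [N // 100] * 100
    start = time.time()
    with ProcessPoolExecutor() as pool:
        list(pool.map(work, chunks))
    print("many workers: ", round(time.time() - start, 2), "s")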

In other words, my feeling is that our current software and our current model of computing simply aren't designed to make use of 100 cores, so most of those cores would go to waste in a system today. I do believe the trend of adding more cores will continue, and clock speeds may drop slightly as cores are added, but I expect desktop CPUs will stay at clock speeds of at least 2 GHz no matter how many cores they gain. What are your thoughts?
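As a back-of-the-envelope addendum (my own numbers, not part of the original question), Amdahl's law makes the same point: even if the 100 cores and the single core have the same total clock throughput, the many-core chip only catches up when nearly all of the work can run in parallel.

Code:
# Back-of-the-envelope Amdahl's law comparison (illustrative numbers only):
# a single 10 GHz core vs. 100 cores at 100 MHz (the same aggregate clock rate).
def relative_runtime(p):
    # p is the fraction of the work that can run in parallel.
    # The serial part is stuck on one 100 MHz core (100x slower than 10 GHz);
    # the parallel part spreads across all 100 cores and matches the fast core.
    return 100 * (1 - p) + p

for p in (0.50, 0.90, 0.99, 1.00):
    print(f"{p:.0%} parallel -> {relative_runtime(p):.2f}x the single-core runtime")

Even at 99% parallel, the 100-core chip is still roughly twice as slow, because the leftover 1% of serial work is stuck on a 100 MHz core.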
