Why can’t I get my program to use more than 50% of the CPU?

Date: October 30, 2006 / year-entry #367
Tags: tips, support
Orig Link: https://blogs.msdn.microsoft.com/oldnewthing/20061030-01/?p=29193
Comments: 22
Summary:This is sort of the reverse of Why is my CPU usage hovering at 50%?, but the answer is the same. When I run a CPU-intensive task, the CPU percentage used by that process never goes above 50%, and the rest is reported as idle. Is there some setting that I set inadvertently which is...

This is sort of the reverse of Why is my CPU usage hovering at 50%?, but the answer is the same.

When I run a CPU-intensive task, the CPU percentage used by that process never goes above 50%, and the rest is reported as idle. Is there some setting that I set inadvertently which is preventing the program from using more than half of the CPU?

My psychic powers tell me that you have a single processor with hyperthreading enabled. (Because if you had a dual processor machine, you probably would have mentioned it in your question.) And my psychic powers tell me furthermore that the program in question is single-threaded, or at least has only one thread that is doing CPU-intensive work. Therefore, that thread is being run by one of the hyperthreading units of the CPU, and the other one isn't doing anything.

That's why you can't get more than 50% CPU usage.


Comments (22)
  1. andy says:

    There is a workaround for this problem available: go to BIOS and disable hyperthreading :)

  2. Gibwar says:

    When I saw the title, I thought the same thing! Your psychic powers must be spreading to other people. Another variant of this is "Why can’t I get it to use more than 25% of the CPU?"… from a customer with a dual-core, hyperthreaded system (2 physical cores × 2 hyperthreads = 4 logical "cores").

  3. ac says:

    That gives me a good opportunity for a digression, to ask all the readers here: I still use VC 6 and it can’t get much past 50% CPU :) What are your experiences? Is any newer VS (e.g. 2005) capable of compiling on both cores to be twice as fast?

  4. Frederik Slijkerman says:

    VS 2005 can compile two different projects at the same time, if they belong to the same solution, but one project is tied to one core.

    Xcode from Apple, on the other hand, is able to hand out individual files to each core, which really speeds up those Universal Binary compiles (each file is compiled separately for PowerPC and Intel).

  5. Mihai says:

    <<VS 2005 can compile two different projects at the same time, if they belong to the same solution, but one project is tied to one core.>>

    Which leads to problems if you never bothered to set the project dependencies correctly, as I saw in a project some time ago :-)

  6. BryanK says:

    > Is there some setting that I set inadvertently which is preventing the program from using more than half of the CPU?

    Yes, the setting is probably called "hyperthreading".  Turn it off in your BIOS, and your single CPU-intensive thread will be able to use 100% of the CPU.

    :-P

    (Unless it actually *is* an SMP or dual-core machine.  Then, turn off the second CPU (or the second core), and you’ll be able to use 100% of the CPU.)

  7. Ralf says:

    During long compiles I use the 50% "unused" CPU to read email, surf, edit more source code. It’s nice not having the machine totally bogged down doing one thing.

  8. ::Wendy:: says:

    Ignore this if it’s obviously way off topic – as a user, the only time I look at my CPU usage is in the task manager when I’m wondering if one program I’m running is hogging resources, and I zap it if it’s seemingly slowing other, more important (email) stuff down. Should I look at the overall CPU usage? Should I care? I hope the answer is no, don’t worry, we’ll sort it, it’s not relevant to you…

  9. Which is why multithreading, producer/consumer models, and distributed algorithms are good ideas.  Now that HT/multicore systems are becoming more and more standard, we need programs with a higher degree of parallelism.

  10. Ulric says:

    Somewhere on a forum last week there was a discussion about how Linux and OS X show 100% for one fully used CPU.  If you have a quad-CPU machine and all four CPUs are running full blast, it shows 400%.

    I’ve been thinking about this and it’s simple, clear, and makes sense; Windows’ Task Manager should do this as well. It’s scalable and easy to understand.

    For ordinary users with simple apps, it doesn’t freak them out.  At worst it makes them feel like their machine is over-performing.

    For servers it is more ‘guessable’ than… oh, what is 1/8 of 100% again?
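The two conventions being compared here differ only by a factor of the logical-CPU count. A small sketch of the conversion (the function names are mine, purely illustrative):

```python
def unix_style(windows_percent: float, logical_cpus: int) -> float:
    """Unix/OS X convention: each logical CPU contributes up to 100%."""
    return windows_percent * logical_cpus

def windows_style(unix_percent: float, logical_cpus: int) -> float:
    """Windows convention: total usage normalized to one 0-100% scale."""
    return unix_percent / logical_cpus

print(unix_style(100.0, 4))     # 400.0 -- a quad machine running full blast
print(windows_style(100.0, 8))  # 12.5  -- one busy CPU out of eight
```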

  11. Ulric:  I agree.  Great idea.

    Programmers:  You guys may be interested in this:

    http://www.xoreax.com/support_faq.htm#q212

    "IncrediBuild can take advantage of multiple CPU/Core machines by allowing each CPU/Core to build a file at the same time."

  12. Archangel says:

    Is VS seriously only single threaded? This hasn’t been fixed in the newest versions?

    gmake is multithreaded and seems to work fine – I always thought VS seemed to be remarkably good, but then I’ve never used it on a SMP machine.

  13. Jonathan says:

    At work, I have a Dual-Xeon with hyperthreading, for a total of 4 logical processors. Our (rather large) product uses a build system that can scale across multiple CPUs, but beyond 2 it would actually lengthen build time. I think once you’re past a certain point, disk access time becomes the limitation, especially during the linking phase. I ended up disabling HT.

  14. Mike Dimmick says:

    Wendy: a thread is only hogging the system if it’s maxed out at 100%. Even then, Windows uses priority boosts to ensure that other applications that are running get CPU time.

    If many programs are doing something very extensive with the disk, though, Windows can struggle – it’s not that great at partitioning disk time. You won’t generally see these programs with high values in the CPU Usage column, because they’ll generally be blocking waiting for data from the disk. Also, large amounts of disk access will often cause Windows to increase the file system cache working set at the cost of process working sets; when a thread that could do work wakes up, it often has to wait for code or data to be paged back in before it can actually do anything useful.

  15. Mike Dimmick says:

    Jonathan: whether compiling is disk- or CPU-bound will depend on how large the source code is and how large your system’s memory is. If you’ve built recently and the source code fits in RAM, it will probably be CPU-bound.

  16. Igor says:

    The proper answer to the above question would be:

    Because no one bothers to write multi-threaded applications yet.

  17. Tim Lesher says:

    Archangel: gmake is not multithreaded.  It can fork multiple copies of itself, but that’s multiple processes, not multiple threads.

    VS is similar: it spawns separate processes to compile, link, etc. It could spawn multiple simultaneous instances of, for example, the compiler, but that’s a technique that works better on Unix-like platforms where per-process overhead is lower.
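The process-per-file model Tim describes can be sketched in a few lines. This is a hedged illustration only: the "compiler" here is a stand-in function and the filenames are hypothetical; real build tools spawn actual compiler processes.

```python
# Parallelism via multiple *processes*, not threads -- the gmake -j model.
from multiprocessing import Pool

def compile_one(source_file: str) -> str:
    # Stand-in for invoking the compiler on one translation unit.
    return source_file.replace(".c", ".o")

def parallel_build(sources, jobs=2):
    # Hand out individual files to worker processes, like `make -j jobs`.
    with Pool(processes=jobs) as pool:
        return pool.map(compile_one, sources)

if __name__ == "__main__":
    print(parallel_build(["a.c", "b.c", "c.c"]))  # ['a.o', 'b.o', 'c.o']
```

Each worker is a separate OS process with its own address space, which is exactly why this technique costs more on platforms where process creation is expensive.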

  18. Cooney says:

    I dunno – something long-running like a compilation would probably be just fine under Windows.

  19. Cody says:

    "it shows 100% for one fully used CPU.  If you have a quad-CPU machine and all four CPUs are running full blast, it shows 400%."

    "For servers it is more ‘guessable’ than .. ho what is 1/8 of 100% again?"

    1.  Wouldn’t that cause questions like:  "Server X is running at 400%, is that the quad or the dual?"
    2.  Those that work with computers should be able to work out any 2^n rather quickly.

    I can see how having n * 100% available would be useful when you have a high enough n that precision becomes an issue.  By then, however, I’d imagine that you’d want to have n different counters so that you can see the difference between 8 processors at 25% and 2 processors at 100%.

  20. BryanK says:
    1.  Wouldn’t that cause questions like:  "Server X is running at 400%, is that the quad or the dual?"

    A dual-processor machine can’t run at 400%, though; its maximum is 200%.  Unless I don’t understand what you’re saying?

    so that you can see the difference between 8 processors at 25% and 2 processors at 100%.

    But you can’t tell that difference with Windows’ setup when CPU usage is at 25% on a quad-CPU machine either.  Is it 100% of one CPU, or is it 25% of all four, or 50% of two of them?  Most of the time, it’s probably 100% of one, but that’s hardly infallible.

    (Especially with HT — oftentimes I’ll pull up a remote screen on one of our dual-CPU hyperthreaded servers (4 virtual CPUs), and see that CPU usage is at "25%".  That’s rarely 100% on one virtual CPU; most of the time it varies between 60% and 80% on one virtual CPU, and 20%-40% on its pair.  Always half of the total for that pair, though.  I have a feeling it’s due to the way HT works, but I’m not quite sure on that.)

  21. ender says:
    1.  Wouldn’t that cause questions like:  "Server X is running at 400%, is that the quad or the dual?"

    It’s the quad – logical cores. Whether that’s 4 CPUs, 2 dual-core CPUs, 1 dual-core HT CPU, or 1 quad-core CPU, you’ll have to look at the hardware information to tell. This also means that the task is using at least 4 threads (the total CPU usage is still reported up to 100% – so you can have 2 tasks each using 150%, while the overall usage is reported as 75%).
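The renormalization in that parenthetical works out as follows (a sketch; the function name is my own):

```python
def overall_windows_percent(per_task_unix_percents, logical_cpus):
    """Sum per-task usage expressed Unix-style (100% = one logical CPU
    fully used) and renormalize to Windows' overall 0-100% scale."""
    return sum(per_task_unix_percents) / logical_cpus

# Two tasks each using 150% on a machine with 4 logical CPUs:
print(overall_windows_percent([150.0, 150.0], 4))  # 75.0
```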

  22. 640k says:

    I would like taskmgr’s graph to compensate for SpeedStep®/Cool’n’Quiet™.

Comments are closed.


*DISCLAIMER: I DO NOT OWN THIS CONTENT. If you are the owner and would like it removed, please contact me. The content herein is an archived reproduction of entries from Raymond Chen's "Old New Thing" Blog (most recent link is here). It may have slight formatting modifications for consistency and to improve readability.

WHY DID I DUPLICATE THIS CONTENT HERE? Let me first say this site has never had anything to sell and has never shown ads of any kind. I have nothing monetarily to gain by duplicating content here. Because I had made my own local copy of this content throughout the years, for ease of using tools like grep, I decided to put it online after I discovered some of the original content previously and publicly available, had disappeared approximately early to mid 2019. At the same time, I present the content in an easily accessible theme-agnostic way.

The information provided by Raymond's blog is, for all practical purposes, more authoritative on Windows Development than Microsoft's own MSDN documentation and should be considered supplemental reading to that documentation. The wealth of missing details provided by this blog that Microsoft could not or did not document about Windows over the years is vital enough, many would agree, that an online "backup" of these details is a necessary endeavor.

<-- Back to Old New Thing Archive Index