Date: | September 13, 2004 / year-entry #335 |
Tags: | other |
Orig Link: | https://blogs.msdn.microsoft.com/oldnewthing/20040913-00/?p=37883 |
Comments: | 42 |
Summary: | Depends which version of Windows. |
It depends which version of Windows you're asking about.

For Windows 95, Windows 98, and Windows Me, the answer is simple: Not at all. These are not multiprocessor operating systems.

For Windows NT and Windows 2000, the answer is "It doesn't even know." These operating systems are not hyperthreading-aware because they were written before hyperthreading was invented. If you enable hyperthreading, then each of your CPUs looks like two separate CPUs to these operating systems. (And will get charged as two separate CPUs for licensing purposes.) Since the scheduler doesn't realize the connection between the virtual CPUs, it can end up doing a worse job than if you had never enabled hyperthreading to begin with.

Consider a dual-hyperthreaded-processor machine. There are two physical processors A and B, each with two virtual hyperthreaded processors; call them A1, A2, B1, and B2.

Suppose you have two CPU-intensive tasks. As far as the Windows NT and Windows 2000 schedulers are concerned, all four processors are equivalent, so it figures it doesn't matter which two it uses. And if you're unlucky, it'll pick A1 and A2, forcing one physical processor to shoulder two heavy loads (each of which will probably run at something between half-speed and three-quarter speed), leaving physical processor B idle, completely unaware that it could have done a better job by putting one task on A1 and the other on B1.

Windows XP and Windows Server 2003 are hyperthreading-aware. When faced with the above scenario, those schedulers know that it is better to put one task on one of the A's and the other on one of the B's.

Note that even with a hyperthreading-aware operating system, you can concoct pathological scenarios where hyperthreading ends up a net loss. (For example, if you have four tasks, two of which rely heavily on the L2 cache and two of which don't, you'd be better off putting each of the L2-intensive tasks on a separate physical processor, since the L2 cache is shared by the two virtual processors. Putting them both on the same processor would result in a lot of L2-cache misses as the two tasks fight over L2 cache slots.)

When you go to the expensive end of the scale (the Datacenter Servers, the Enterprise Servers), things get tricky again. I refer still-interested parties to the Windows Support for Hyper-Threading Technology white paper.

Update 06/2007: The white paper appears to have moved.
Update 10/2016: The white paper moved again.
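To make the scheduling discussion concrete, here is a minimal sketch (in C, against the Win32 API) of what a program running on a non-hyperthreading-aware OS would have to do by hand: pin two CPU-bound threads to what it believes are separate physical packages. The mapping of logical CPUs to packages (0/1 = package A, 2/3 = package B) is an assumption that happens to hold on many two-socket HT machines, not something this program verifies; a hyperthreading-aware scheduler knows the real topology and makes this placement automatically.

    #include <windows.h>
    #include <stdio.h>

    /* A stand-in CPU-bound task. */
    static DWORD WINAPI CrunchNumbers(LPVOID arg)
    {
        volatile unsigned long sum = 0;
        unsigned long i;
        (void)arg;
        for (i = 0; i < 500000000UL; i++) sum += i;   /* busy work */
        return 0;
    }

    int main(void)
    {
        HANDLE threads[2];

        threads[0] = CreateThread(NULL, 0, CrunchNumbers, NULL, 0, NULL);
        threads[1] = CreateThread(NULL, 0, CrunchNumbers, NULL, 0, NULL);

        /* ASSUMPTION: logical CPUs 0 and 1 are the hyperthreads of
           physical package A, and CPUs 2 and 3 belong to package B.
           Pinning one thread to "A1" and the other to "B1" avoids the
           worst case described above, where both heavy tasks land on
           the same package while the other package sits idle. */
        SetThreadAffinityMask(threads[0], 1 << 0);   /* A1 */
        SetThreadAffinityMask(threads[1], 1 << 2);   /* B1 */

        WaitForMultipleObjects(2, threads, TRUE, INFINITE);
        printf("done\n");
        return 0;
    }

On Windows XP and Windows Server 2003 this kind of manual placement is usually unnecessary; on Windows NT and Windows 2000, explicit affinity is about the only way for an application to steer around the A1/A2 worst case.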
Comments (42)
Comments are closed. |
OK, I know it probably isn’t your decision, and it is great that Microsoft is providing lots of documentation and white papers, but who thought an executable, zipped Word document was the best way of distributing something that only contains text? There must be a better way of presenting text on a web site.
This question was raised the last time I posted a link to a whitepaper.
http://weblogs.asp.net/oldnewthing/archive/2004/04/15/113811.aspx#113943
and the answer came a little while later
http://weblogs.asp.net/oldnewthing/archive/2004/04/15/113811.aspx#114321
<pointless-rant>
Wouldn’t a web page served from Microsoft’s web server provide the same level of reassurance that a technical paper was indeed from Microsoft?
There is also a technology called PDF that some people use that is really useful for distributing signed documents with rich content – it doesn’t even require you to run any executable files on your PC to read a document.
</pointless-rant>
There’s a rule that all files available for download must be digitally signed. I didn’t make the rule and I think it’s silly for things like documents, but that’s the rule. The rule was probably created by lawyers. Lawyers are the source of lots of silly rules.
Signing the actual download does mean that you can give the file to somebody offline and they can confirm that the document is signed. Whereas under the "trust the site" method the person has no way of verifying that you didn’t tamper with the file before giving it to them. So there is some merit to the rule.
Also, somebody could launch a DNS attack against microsoft.com and start handing out bogus documents. But they wouldn’t be able to forge the digital signature. A theoretical possibility, but one I’m sure the lawyers worried about.
Does Microsoft plan on releasing a patch for 2000 and/or NT to support hyperthreading?
Thanks for the information on hyperthreading, Raymond. We got bitten by a DB server that had this enabled and it took us a while to figure out what the root cause was, but even after being told to disable hyperthreading we still didn’t really understand why it caused the issue.
Very interesting as usual :)
Phil
As was pointed out in http://weblogs.asp.net/oldnewthing/archive/2004/04/15/113811.aspx#114321 – DOCs can be signed as well, without being put into an EXE. So digital signatures can’t be the reason why documents are put into executables.
Yes and if you read through to the end, you’ll see why it’s not used.
Requiring docs to be signed makes sense because of all of the macro/VBA stuff that we have today. Opening a .doc could mean running code. Do you trust the source enough to allow the code to run?
IE checks signatures of .exe’s that are downloaded. That’s why .doc’s are rolled into .exe’s and signed.
I know Microsoft loves to eat their own dog food, but as someone who’s both interested in what they’re doing and uninterested in owning a Windows PC and MSWord, I’d prefer it if they’d just use an (gasp) open format like PDF.
Obviously MS doesn’t care what one die-hard Linux user thinks, but it’d be nice if they’d show even the slightest amount of consideration.
@Gordon: And why, may I ask, must Coca-Cola take any consideration of Pepsi buyers? Why must MS have any consideration for a Linux user that ain’t using any MS product at all? After all, this is BUSINESS.
I’m on a HT P4 machine now, and in Device Manager the computer is listed as "ACPI Multiprocessor PC". Is that correct? Given that XP recognizes HT CPUs, I would’ve expected XP to show a different string on a HT machine.
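For what it’s worth, the processor count that Windows reports programmatically is the logical count, which is presumably why an HT machine gets the multiprocessor description in the first place. A minimal sketch:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        SYSTEM_INFO si;
        GetSystemInfo(&si);

        /* On a single hyperthreaded P4 this prints 2: the count is of
           logical processors, so as far as the system is concerned an
           HT machine really is a multiprocessor PC. */
        printf("Logical processors: %lu\n", si.dwNumberOfProcessors);
        return 0;
    }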
PDF is not open – talk to Adobe about it. And there are lots of viewers for Word docs – such as OpenOffice ;-)
Brian: I’m not Raymond and you didn’t ask me, but I’ll bet the answer is no. Hyperthreading support is a New Feature, and Microsoft would be foolish to backport it to older releases. Basically it would be a large expense (HT support is surely a major scheduler change, with all the testing that implies) for, in all probability, a loss in sales.
Everyone: As for the way MS distributes white papers, I see how MS’s lawyers have left few options but it’s still a bad idea. That goes for any file format which is executable in whole or in part, not just the ones MS comes up with. Think about that the next time you download an academic paper in PostScript format.
I guess the real question here isn’t why MS distributes papers in EXEs. It’s why there aren’t any widely accepted non-executable file formats which can be signed. What I’d really like to see is a file format which can contain any type of file along with its digital signature. There wouldn’t need to be anything Windows-specific about such a format, nor should there be.
Q. On the original question ("How does Windows exploit HT?"): what benefit is there in HT on a single-processor system?
In this case I’m talking about XP so the OS is fully aware. How can it derive any benefit?
Is there a lower level benefit? Such as in the BIOS?
Can the next person to discuss signed publications start a new thread? I believe this is HT here.
There are many web pages devoted to the advantages of hyperthreading. Pop it into "google" and you’ll have plenty of reading. It is generally a benefit, but as I noted, you can create (arguably pathological) situations where it turns out to have been a net loss, especially when the OS doesn’t support hyperthreading.
One benefit of hyperthreading was surely unintended but has turned out to be valuable. If an out-of-control process wants 100% of the CPU, and if the process’s code is unaware of the possibility of hyperthreading, then it only gets 50% of the CPU.
David the Irked: Calling the PostScript format executable calls into question the definition of "executable." As the language is Turing-complete, I’d agree that it is executable under at least one reasonable definition. Under that same definition, however, wouldn’t the Word-format file in question be an executable whether or not it was wrapped in a native executable? Last I checked (which was a long time ago, admittedly), Word documents supported embedded scripting.
Ideally, programs that deal with "live" "data" would provide a limited API and be free of security holes. This is not the case, of course, but it doesn’t even set a useful lower limit to say that executables, by the Turing-complete definition, are dangerous, because people have found exploitable security holes in GIF and MP3 decoders as well.
What with the upcoming releases of dual-core processors, does anyone know how Windows licensing treats them? Will a dual-core proc be treated as two physical processors, or as two logical processors?
So does this mean I should turn off HT on a dual Xeon with HT machine running 2K Server?
Are future (or current) compilers going to be hyperthreading-aware? I.e., "compile this and optimize for a 2-CPU hyperthreaded system"? Seems like that would help out even in the pathological cases. And if hyperthreading is accepted as viable (and becomes pervasive), it seems a necessary step.
I agree with the .exe complaint, regardless of lawyers. There is a lot of Microsoft content out there in raw HTML that is not digitally signed. What, someone is going to sue MS because they designed something based on a whitepaper and the whitepaper was *tampered* with? Isn’t that the point of disclaimers, or putting source URLs at the bottom?
Then again, I don’t trust any web content that is even vaguely executable, and run in Opera with most things turned off. Even if it were a plain .doc file I would not download it. So I’m not really a "normal" user. (friends get mad when I won’t read their .doc attachments and force them to resend in plaintext when I *know* the content is plaintext to begin with!)
Jim: I don’t see how calling PostScript executable stretches the definition. PostScript is a programming language like any other, albeit a special-purpose one. As far as my previous point is concerned, a PostScript program can write arbitrary data to any file that the user running it has permission to write to. On most systems that’s enough to cause arbitrary code to be executed at some point in the future.
You’ve hit the nail on the head with your comments about Word. I’m not only arguing against embedding Word documents in executables. I’m also arguing that MS Word, like PostScript, is an inappropriate format for document distribution. RTF would be acceptable, since (I think) it doesn’t allow embedded scripts.
You’re right that there is a security risk inherent in any program which handles input from outside sources. However, that doesn’t mean we need to make things worse by using file formats whose very design supports the execution of arbitrary code on the viewing machine.
Can the CLR exploit hyperthreading?
Or is it, like other programs, subject to whatever the OS decides?
If Microsoft is so enthusiastic about digital signatures, why doesn’t it use digital signatures to help users distinguish legitimate traffic from spyware, viruses and other kinds of malware?
If you look at the list of processes in the task manager, you can’t tell right away which processes are part of Windows and which are some kind of malware. Malicious processes often disguise themselves as Windows processes.
It would be nice if every executable in Windows were signed and Task Manager indicated which processes are signed and which are unsigned (for instance, by showing a green icon for signed processes and a red icon for unsigned ones).
I meant "legitimate processes" instead of "legitimate traffic". I should really proofread before posting, sorry:)
I wish these blog comments were threaded. Sorry to keep this off-topic signing conversation going. I’ve tried using a different title for my comments, hoping that anyone here wanting to read about hyperthreading will notice the topic change and skip to the next relevant comment.
David the Irked:
There are about a dozen different kinds of signing used by Microsoft alone. IE uses Authenticode signatures. Authenticode started as an IE feature, in fact. Today Authenticode is a part of PKI, in Windows security, separate from the IE team.
It would be really nice if there were a pluggable architecture to sign/verify, making it easier for other MSFT teams to have the download manager check their sigs and even enabling 3rd parties to write their own verification snap-ins. We have something like that. The ancient, crusty CryptSIP* APIs can be used to install new Subject Interface Packages (someone must have thought that name was meaningful once) to verify different kinds of signatures. Unfortunately, the SIP stuff is kind of brittle and very poorly documented. And even if there were a new SIP installed, I’m pretty sure that IE only uses the Authenticode one.
I test Authenticode at the API level. Our team would like to redesign signing, making new, friendlier APIs and getting all of Microsoft to adopt them. And, of course, document them so that everyone (PGP, maybe?) could benefit. We don’t have room in the schedule at the moment, but I hope it will happen in some release after Longhorn*.
* I would call that "Blackcomb", but according to Zeno’s paradox, there is always another release half-way between here and Blackcomb. ;-)
I really like that idea. If the task manager could show some kind of information on whether the loaded modules were signed, it would make it much easier to locate those non-critical processes that could safely be killed.
I’m sure as developers and IT professionals we’ve all had to help friends out when their PCs get infested with malware, and killing those processes is one of the first things we have to do.
David the Irked: Since when did PostScript programs get access to the file system?
Code signing proves next to nothing, and solves very few problems. Almost no-one looks at the details of certificates. Most of the CAs are just in it for the money and don’t give a damn about checking identity thoroughly (why would they want to turn customers away?). So the bad guys just get a certificate for Bad Guys Inc. (or maybe for Microsoft Corporation if Verisign is really asleep that day), sign their bad code, and no-one’s any the wiser.
Microsoft took the code signing route before with ActiveX, and look what that did for the average computer user. .NET can provide sandboxes like Java does and might serve to replace ActiveX, but I fear that commercial pressure and demand for a "rich user experience" will lead to stretching and breaking of the sandbox for those that can buy code signatures.
Ben: Since Level 2 at least. If you’re curious, have a look at section 3.8 of the PLRM, 3rd ed. You should be able to find a copy online.
It was a bit of a shock to me too.
David the Irked: I’m flying the BS flag. PS can access files, but that access is mediated by the OS. Since the PS interpreter doesn’t run as root, the amount of damage that could be done by a PS file is completely out of proportion to the amount of time it would take to create it.
El Lobo: If they’re publishing a whitepaper that they want to make public, they should use an open format so that it’s actually public. Shoot, they could just use ASCII text with a plain old PKI signed digest.
Dave (presumably un-irked): As soon as MS releases a new Word version, OO.o has to start the process of reverse-engineering the file format all over again. I don’t want to have to depend on them to read anything.
Sorry, Raymond. This has turned into a signing conversation.
Dmitriy: I don’t think I’d claim that all of Microsoft is enthusiastic about digital signatures. I am, though.
Showing processes with only signed (by trusted entities) files loaded is an interesting idea, but I’m not sure how it would work. With some of the changes in Longhorn, this might not be necessary, though. (Not sure what I’m allowed to disclose, so I’ll have to be mysterious.)
Ben: I agree that expecting users to manually check signatures is ridiculous, but programmatically checking them is goodness. Software Restriction Policies are cool! (ok – I’m a geek – I’ve just outed myself) I’m not sure it’s so easy to get a spoofed cert from a reputable CA. Yes, I know that Verisign has issued "Microsoft" certs before, but it’s not the kind of thing that happens often and we have revocation for those times when it does. I’m already taking up too much of Raymond’s blog space, so I’ll skip my rant on "chains of trust".
PS Level 1 supported filesystems. I remember a coworker being baffled as his scripts worked fine on his sparcstation, but failed when he sent them to the printer. His code was pulling in data from the computer’s filesystem, which the LaserWriter didn’t have access to.
If you wanted to really do evil PS, you might be able to write a trojan to get the printer to copy files being printed to your IP address.
Drew and Ben Hutchings:
I didn’t mean the kind of system used for signing ActiveX where third-party "trusted entities" can sign executables. Only Microsoft should be able to sign windows components. Microsoft could have a private key specifically for signing Windows components.
Viruses/trojans/spyware usually install themselves into the Windows or Windows\System directories and choose names that appear to be legitimate Windows system files. If Task Manager showed which files are indeed Windows components, it would be harder for malware to disguise itself as a Windows component.
And back to hyperthreading: I can offer a real-life example of when it hurts performance.
I’m a developer of a fairly large software project. Our compile process is multi-proc aware, and can utilize as many procs as it finds. I have a dual-Xeon machine, which is also HT-capable (to 4 logical procs).
I found that once I enable HT, these 4 compiler processes just consume all of my memory, and thrash the page file, and actually degrade compile time.
Lesson learned: You should think about performance wholistically…
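One way to avoid that kind of oversubscription is to size the worker pool by physical cores rather than logical processors. A sketch, assuming GetLogicalProcessorInformation is available (it postdates most of the OS releases discussed in the article, so older systems would need CPUID-based HT detection instead):

    #define _WIN32_WINNT 0x0501
    #include <windows.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Count physical cores by asking Windows for its processor topology.
       ASSUMPTION: GetLogicalProcessorInformation exists on this system. */
    static DWORD CountPhysicalCores(void)
    {
        SYSTEM_LOGICAL_PROCESSOR_INFORMATION *info = NULL;
        DWORD len = 0, cores = 0, i, count;

        GetLogicalProcessorInformation(NULL, &len);   /* get needed size */
        info = (SYSTEM_LOGICAL_PROCESSOR_INFORMATION *)malloc(len);
        if (!info || !GetLogicalProcessorInformation(info, &len)) {
            free(info);
            return 0;
        }

        count = len / sizeof(*info);
        for (i = 0; i < count; i++) {
            if (info[i].Relationship == RelationProcessorCore)
                cores++;   /* one entry per core, however many HT siblings */
        }
        free(info);
        return cores;
    }

    int main(void)
    {
        SYSTEM_INFO si;
        GetSystemInfo(&si);
        /* Sizing a build's worker count by the second number, not the
           first, avoids doubling memory pressure when HT is enabled. */
        printf("Logical processors: %lu, physical cores: %lu\n",
               si.dwNumberOfProcessors, CountPhysicalCores());
        return 0;
    }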
Dmitriy:
Microsoft does sign all Windows binaries. Most of them are indirectly signed (catalog-signed – that stuff under %windir%\system32\catroot and catroot2). The key is kept in a vault somewhere (I’ve never seen it) and a nice guy named Walt (whom I’ve never met in person but seems friendly in email) signs everything (to the best of my knowledge) that Microsoft ships.
The really geeky tools to check out signatures are signtool/capicom – downloadable SDK tools that will give very detailed info about signatures, even dumping cert and timestamp info in verbose mode. The less geeky way is using sigverif.exe, which ships with Windows. Understated disclaimer: sigverif is *not* my favorite tool. Raymond covered it here:
http://weblogs.asp.net/oldnewthing/archive/2004/06/16/157084.aspx
Maybe I should start my own blog and stop hijacking Raymond’s, eh?
<pointless-rant>
[PDF] doesn’t even require you to run any executable files on your PC to read a document.
</pointless-rant>
Erm, try viewing PDF without Adobe Reader installed.
9/14/2004 10:36 PM Jonathan
> once I enable HT, these 4 compiler processes
> just consume all of my memory, and thrash
> the page file, and actually degrade compile
> time
The shortage of RAM is as much to blame as the presence of HT. The shortage of RAM would be equally apparent if you had 4 real processors as it is with 4 virtual ones.
> You should think about performance
> wholistically
Bingo (except for spelling).
By the way… 9/13/2004 6:43 PM Drew
> I wish these blog comments were threaded.
It’s hyperthreaded, with two threads going on in parallel ^u^
Quote – Jonathan Payne
"There is also a technology called PDF that some people use that is really useful for distributing signed documents with rich content – it doesn’t even require you to run any executable files on your PC to read a document."
I thought that Acrobat Reader was needed :?
Inspired by this post:
http://br.thespoke.net/BlogReader/SingleEntry.aspx?ID=4679
Note that Fabio’s article assumes that no other vendor (AMD, Cyrix, Transmeta, …) will ever implement hyperthreading.