Date: January 27, 2005 / year-entry #25
Tags: tips, support
Orig Link: https://blogs.msdn.microsoft.com/oldnewthing/20050127-00/?p=36583
Comments: 50
Summary: By default, the Background Intelligent Transfer Service (BITS), which is used by Automatic Updates, will use idle network bandwidth for downloading updates. This is normally not a problem. One case where it can be a problem is if you have a large LAN that shares a single DSL connection. BITS doesn't see that that DSL connection...
By default, the Background Intelligent Transfer Service (BITS), which is used by Automatic Updates, will use idle network bandwidth for downloading updates. This is normally not a problem.

One case where it can be a problem is if you have a large LAN that shares a single DSL connection. BITS doesn't see that that DSL connection is shared. Consequently, each computer on the LAN will be using its idle network bandwidth to download updates, and the total of all the LAN computers doing this will oversaturate the DSL connection. [Typo fixed. 31-Jan-05.]

Another example where this can be a problem is if you have a network card that connects to a hardware firewall which in turn uses a dial-up modem to connect to the Internet. (For example, you might connect through a classic Apple AirPort which is in turn connected to a modem.) BITS sees your fast network card and can't see that there is a bottleneck further downstream. As a result, it oversaturates the dial-up connection.

To tweak the BITS settings, you can fire up the Group Policy Editor by typing "gpedit.msc" into the Run dialog. From there, go to Computer Configuration, Administrative Templates, Network, then Background Intelligent Transfer Service. From there you can configure the maximum network bandwidth that BITS will use. You can even specify different BITS download rates based on time of day, so that it downloads more aggressively while you're sleeping, for example.
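For reference, the policy ends up as values under the BITS key in the policies hive, which is one way to apply it on editions without gpedit.msc (a commenter below notes XP Home lacks the policy editor). This is a hedged sketch: the value names are taken from the BITS 2.0 group-policy documentation, and the units and names should be verified against your BITS version before deploying.

```reg
Windows Registry Editor Version 5.00

; Assumed registry mapping of the "maximum network bandwidth" BITS policy.
; Value names and units may differ between BITS versions; verify first.
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\BITS]
; Turn the bandwidth limit on
"EnableBITSMaxBandwidth"=dword:00000001
; Limit during the schedule window (assumed kilobits per second; 0x32 = 50)
"MaxTransferRateOnSchedule"=dword:00000032
; Schedule window in 24-hour clock hours: 8:00 to 17:00 (0x11 = 17)
"MaxBandwidthValidFrom"=dword:00000008
"MaxBandwidthValidTo"=dword:00000011
; Outside the window, use a higher limit (0xfa = 250) instead of all idle bandwidth
"UseSystemMaximum"=dword:00000000
"MaxTransferRateOffSchedule"=dword:000000fa
```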
Comments (50)
Comments are closed.
"..total of all the LAN computers doing this will eversaturate the DSL connection. "
I think you meant Oversaturate Raymond :)
Cool.
You’d think it would be able to work out the effective bandwidth available itself really. Maybe through averaging peak transfer rates, or watching latencies of incoming data over that connection?
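The commenter's guess-the-bottleneck idea can be sketched with an exponentially weighted moving average over observed transfer rates. This is a hypothetical illustration of the approach, not how BITS actually measures idle capacity:

```python
def ewma_bandwidth(samples_kbps, alpha=0.3):
    """Estimate effective bandwidth from observed per-interval transfer
    rates using an exponentially weighted moving average.  Recent samples
    dominate, so the estimate tracks a downstream bottleneck (e.g. a
    shared DSL line) rather than the local NIC's link speed."""
    estimate = samples_kbps[0]
    for rate in samples_kbps[1:]:
        estimate = alpha * rate + (1 - alpha) * estimate
    return estimate

# A 100 Mbps NIC behind a ~250 Kbps DSL line: observed transfer rates
# hover near the bottleneck, so the estimate converges there.
observed = [240, 260, 250, 255, 248, 252, 249, 251]
print(round(ewma_bandwidth(observed)))
```

The point of the smoothing is that a single fast burst (a peer on the local LAN, say) does not fool the estimator into thinking the whole path is fast.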
If I had a bunch of machines on the other side of a DSL line from the internet, I think I'd be looking at ways to have one server download the update and distribute it across the network internally. I think MS even offers some package that does this.
mschaef, I think you're talking about SMS: internal patch management and Windows Update services. I heard it's a pretty killer application.
mschaef,
look at SUS or WUS(=SUS 2.0)
http://www.microsoft.com/windowsserversystem/sus/default.mspx
http://www.microsoft.com/windowsserversystem/wus/default.mspx
I have a love-hate relationship with Automatic Updates (a microcosm of my overall relationship with MS, perhaps ;) )
The "hate" part is that I hate the updates that have EULAs attached, meaning that even if you turn AU on you don’t stay up to date automatically. Sort of defeats the purpose doesn’t it? This means that for every computer I set up for friends and family (I always give them a Limited User and reserve Administrator to myself) I have to go around periodically and agree to all the EULAs on their behalf (doesn’t this mean that the EULAs aren’t actually legally binding on them anyway? The whole EULA concept is broken by design IMHO and should be scrapped entirely). At the very least, EULA-requiring updates should present themselves at boot time and a Limited User should be able to agree to the EULA and kick off the install.
The "love" part, on the other hand, is best shown by an example.
One day I was trying to install SP2 on my father-in-law’s machine. He uses AOL dialup and had just bought a new computer which inexplicably didn’t have SP2 installed (even though they were aware of it – there was a leaflet recommending you get it and install it yourself)
Every time I set it downloading over Windows Update the connection would die after a few minutes, which I thought left me screwed because I couldn’t tell him to download it overnight, which was my plan. Since it was late in the day at that point, I thought I’d come back another day and babysit the WU connection for hours and hours to try to get the thing downloaded. So a week later I went back to do this.
And got the little "updates are ready to install" bubble. Ok, I thought, some piddling little IE fix has downloaded, I’ll do that before I get started on the real work.
Nope. The whole multi-hundred-meg SP2 download had downloaded itself in the background while my father-in-law had been playing on AOL games, and was sitting there on the hard disk ready to install. Fantastic! I was able to have the whole computer fixed up within a couple of hours. I love when software completely exceeds my expectations like this.
FYI. This won’t work for XP Home. It doesn’t come with the policy editor.
Why does Automatic Updates not use peers within a single LAN? E.g. get the list of files and checksums from the Microsoft site each time, then do a network broadcast to see if any other machine on the LAN has any segments of the file. If each machine downloaded the segments in a random order when not already on the LAN, then this would save a lot of bandwidth for small networks without needing any admin at a site to set it up.
Of course you would need to wait a short time before doing the broadcast, so that you can listen in on the results of other computers' requests, so as to prevent too many broadcasts on a large network.
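Ian's wait-then-listen suppression can be sketched in a toy simulation. This is purely hypothetical; Automatic Updates does no such peer discovery:

```python
import random

def simulate_broadcast_round(num_machines, slots=10, seed=1):
    """Each machine picks a random delay slot before broadcasting its
    'who has this update segment?' query.  A machine that hears an
    earlier broadcast suppresses its own, so only machines that drew
    the earliest occupied slot actually transmit."""
    rng = random.Random(seed)
    delays = [rng.randrange(slots) for _ in range(num_machines)]
    first = min(delays)
    transmitted = sum(1 for d in delays if d == first)
    return transmitted, num_machines

tx, total = simulate_broadcast_round(50)
print(f"{tx} of {total} machines actually broadcast")
```

Even with only ten delay slots, most of the fifty machines stay quiet, which is the property Ian is after on a large network.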
Ian Ringrose
ian at ringrose dot name.com (remove .com)
Two thoughts here:
1) Contrary to popular misconception, all computers don't start downloading over AU/BITS at 3am every day all at once. The downloads use a randomized event that evenly spreads out, over a 24-hour period, when a computer checks with the AU site and starts the download. (Actually, after each daily check, it schedules its next check randomly between 17 and 23 hours later.) Once all the bits are downloaded, the installs kick off at 3am if the computer is on.
2) Ian's question about a mini-peer-to-peer to avoid downloads – we've thought about this and it may be something that happens sometime in the future, but the way WU/AU/Update works, it may not provide the benefits you would think. For one thing, we use a "delta compression" delivery where we examine each individual system and only download a delta set of bits relevant to each machine. This delta is frequently fairly small (less than 100K) for many of our updates, so the benefit of peer-to-peer is not that big. Today, most updates require the 350K installer to be downloaded along with the deltas; in the future we're going to be eliminating the 350K for each package, so the total data download will go way down. These are more leveraged solutions than hoping another machine needed the exact same set of file deltas.
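The copy-plus-literals idea behind delta delivery can be sketched with difflib. This is a stand-in for real delta compression (bsdiff-style tools, for example), not Microsoft's actual format:

```python
import difflib

def make_delta(old: bytes, new: bytes):
    """Build a crude delta: 'copy' entries reuse ranges of the old file,
    'data' entries carry only the bytes that actually changed."""
    sm = difflib.SequenceMatcher(None, old, new, autojunk=False)
    delta = []
    for op, i1, i2, j1, j2 in sm.get_opcodes():
        if op == "equal":
            delta.append(("copy", i1, i2 - i1))   # reuse old bytes
        else:
            delta.append(("data", new[j1:j2]))    # ship changed bytes only
    return delta

def apply_delta(old: bytes, delta) -> bytes:
    out = bytearray()
    for entry in delta:
        if entry[0] == "copy":
            _, offset, length = entry
            out += old[offset:offset + length]
        else:
            out += entry[1]
    return bytes(out)

# A small "binary" where only the version string changed: the delta
# ships a handful of literal bytes instead of the whole file.
old = b"MZ" + b"\x90" * 400 + b"version=1.0" + b"\xcc" * 400
new = b"MZ" + b"\x90" * 400 + b"version=1.1" + b"\xcc" * 400
delta = make_delta(old, new)
literal = sum(len(e[1]) for e in delta if e[0] == "data")
assert apply_delta(old, delta) == new
print(f"{literal} literal bytes shipped instead of {len(new)}")
```

This also illustrates John's closing point: the delta only helps a peer whose old file is byte-identical, which is why peer-to-peer sharing of per-machine deltas buys less than it first appears.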
This is great! I was looking for a way to alter BITS bandwidth usage the other day, because Windows Update was remorselessly wiping out my poor Xbox's bandwidth.
AC: you can order a Service Pack 2 CD for free, which will give most of the updates you're after. Additionally, you can create a slipstreamed XP install disc that will install XP as SP2 without separately installing SP2.
The updates after SP2 are still pretty small in size and can be viewed in your installation history on Windows Update.
AC: I used to work on Windows 95 Setup and your suggestion that Setup intentionally wastes time I find insulting.
Installing a series of updates one at a time is an unfair comparison with installing them as a batch. A large chunk of update install time is "overhead" like (1) verifying the digital signature of the install package, (2) inspecting the files on your system so that the update can decide which version of the update you need, (3) backing up the old files and registry data in a manner that permits selective uninstall – only after all that overhead is complete can you actually copy the files. If you do them via Windows Update, not only can all this overhead be consolidated, but the presence of a live network connection means that the update can use feedback from the server to optimize the install (see John Gray’s previous remark about delta compression for one example).
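The consolidation argument above is easy to see with a toy cost model. The numbers here are made up purely for illustration, not measured Windows Update figures:

```python
def install_time(num_updates, overhead_s=40.0, copy_s=5.0, batched=False):
    """Toy model of Raymond's point: signature verification, system
    inspection, and backup are fixed overhead.  Installed one at a time,
    every update pays it; installed as a batch, it is paid once."""
    if batched:
        return overhead_s + num_updates * copy_s   # overhead consolidated
    return num_updates * (overhead_s + copy_s)     # overhead per update

print(install_time(15))                 # fifteen updates, one at a time
print(install_time(15, batched=True))   # the same fifteen as a batch
```

With these illustrative figures the batch is several times faster, even before counting server-side optimizations like delta compression.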
"Setup — it could be possible to make such an installation of Windows XP that would take 5 minutes, and not 35."
The setup always says 35 minutes, but after installing Windows XP on 4 machines in the last 2 weeks I can say that it counts off minutes very quickly on a fast machine. The process still takes a long time, but not really outrageous for what it has to do. Sure, installing a base Linux system (say Debian) is much quicker, but there is a hell of a lot less going on.
On my XP SP1 box, the only thing under Computer Config\Admin Templates\Network is "BITS inactive job timeout". Are the settings described here only available in SP2 or some other update?
Is there any way to tell BITS not to do anything when connected via dialup? I recently rang up a large mobile phone bill when connected via bluetooth because I forgot to turn off Windows Update :(
AC: Are you talking about an upgrade install (which needs to preserve existing data) or a reformat install? Upgrade installs are insanely complicated.
Traditionally Windows Setup has inspected each part of your configuration and installed a minimum set, file-by-file, expanding each file as it goes. This is often slow because the install disc isn’t optimised for the process. (The worst for this was Visual Studio .NET 2002 which was really poorly organised – installing that was a serious stress test for any CD drive!)
I believe the plan for Longhorn, at least for ‘clean’ installs, is to essentially install an image of the OS then allow that to be customised. This will allow the installer to simply copy byte-for-byte, sequentially – much faster than copying one file at a time. Sorry, I can’t remember where I read that now.
AC: For future reference, bad typing and poor spelling and grammar do not help convey your point to your audience.
Mike, oh man, I remember installing VS2k2 on a high-end machine at the time (Athlon 1.4 GHz, plenty of RAM) and it took 2 hours. And no, I didn't have some behemoth like 3DSMax running in the background :)
Anyway, AC, why do you keep claiming this is a matter of politics? And why would it have anything to do with pirated copies? There is no conspiracy, and no order from up high requiring setup to take 30 minutes.
Mike Dunn says
"On my XP SP1 box, the only thing under Computer Config\Admin Templates\Network is "BITS inactive job timeout". Are the settings described here only available in SP2 or some other update?"
The policy can be created/authored only in Windows XP SP2 or later. But it will have an effect wherever BITS 2.0 is installed.
Please see
http://msdn.microsoft.com/library/en-us/bits/bits/group_policies.asp?frame=true
for more info about the policies.
The following link talks about various BITS versions
http://msdn.microsoft.com/library/en-us/bits/bits/what_s_new.asp?frame=true
Almost everything MS does is more politics than good engineering, setups and updates included.
Setup — it could be possible to make such an installation of Windows XP that would take 5 minutes, and not 35. I *really* believe that the setup guys got the request "setup must take at least 30 minutes!". What additionally reassures me is that the setup takes a lot of time no matter how slow or fast the computer is, or how much memory it has.
Updates are the same. Some months ago, I wanted to prepare an "upgrade CD" for my parents, who have a very slow dial-up. I downloaded all the updates.
I made a batch file to start each update one after another, without restarting. Still, if I'd connected to Windows Update over a fast line, all these updates would take two or three minutes (even while automatically keeping backups in the winnt folder). But here, updating from the batch file, invoking each update exe, not keeping backups, took much, much more time.
So it looks to me again like the engineers just got the request "you must make updates that only work well from our servers — who cares about people without fast internet or a WUS server in their network". Shame.
If they want to prove me wrong, let us easily prepare a combined exe from more small updates. Now that would be a real improvement in maintainability.
Ian:
P2P AU is a good suggestion, but I have a concern about whether that'll create another security hole by letting people write fake updates and pretend they're the real thing. It could be abused by viruses and worms if not properly protected.
Maybe that shouldn't be the case, as MS updates are usually digitally signed, but it's something to be aware of.
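The integrity concern can be handled the way the commenters suggest: peers serve the bytes, while trust comes from the publisher. A minimal sketch using a plain SHA-256 digest (real Microsoft updates rely on Authenticode signatures; the manifest scheme here is a hypothetical simplification):

```python
import hashlib

def verify_payload(payload: bytes, trusted_digest_hex: str) -> bool:
    """Check a peer-supplied update payload against a digest obtained
    from the trusted source (e.g. inside a signed manifest).  The peer
    only saves bandwidth; it cannot forge content without producing a
    hash collision."""
    return hashlib.sha256(payload).hexdigest() == trusted_digest_hex

# The small manifest comes from the trusted server; the large payload
# can come from any machine on the LAN.
manifest_digest = hashlib.sha256(b"KB123456 update bits").hexdigest()
print(verify_payload(b"KB123456 update bits", manifest_digest))
print(verify_payload(b"tampered bits", manifest_digest))
```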
Why does Platform SDK Setup move ~20000 files to one directory? This is insanely slow on FAT32 partitions.
After 24 hours of constant hard-disk torture I had to turn it off; then it started to move all those 20000 files back!
Alex: a "slipstreamed XP install disk" is not meant to update a computer which already has the latest *service pack* but has since missed all 15 security updates. Those 15 updates applied offline would take an enormous amount of time compared to the online scenario. Something doesn't add up.
Raymond: Windows 95 was a long time ago, and I believe you personally did a good job. But I'm talking about the Windows XP setup. What's on the hard disk after the 30-minute installation is something that can be packed onto one CD, and that can be unpacked onto another hard disk in 3 minutes on a modern machine (I don't count the pagefile there). Add to that the time really spent on detecting hardware, which is a few minutes, and that's all that should be needed for the setup. What's going on in the rest of the time that really has to be done on the user's machine and can't be done before the CD is made? Now, I know that Windows, just copied, won't work. But the Windows Preinstallation Environment, which successfully runs from a CD, proves that even this is really just a question of politics. I appreciate that Microsoft doesn't like pirate copies, but it doesn't have to slow down the installation so much for everybody.
Second, it's possible to pre-program all the answers to the setup; yet if you don't do that, or you can't prepare ALL the answers (since it's not so trivial, it has to be tested, etc.), you'll get random questions at totally unpredictable times during setup. That makes it necessary to actually sit by during the whole setup, instead of just answering all the questions first and then doing something else. So this must also be a political decision. I know that the "press a key to restart" dialog finally has a timeout now, but that's really the only window of time when you don't have to be around.
Almost Anonymous: "not really outrageous for what it has to do". It doesn't have much more to do than copy the files and detect the hardware. Where's the rest of the time spent? Old setups (e.g. Win 3.11) were really processor dependent. The new ones are not. I'd really like to see a graph of the processor usage during these magical 30 minutes. Think about it: 30 minutes to produce 1000 MB on the hard disk from 500 MB on CD, on a machine which can copy the CD content in not more than 3 minutes, can copy 5.5 GB disk to disk in those same 3 minutes, and ticks 3 billion times per second.
My experience with Windows XP setups is that one or two of the minutes take about eight minutes each regardless of the speed of the machine (especially the 39-minute one), but other minutes go faster on faster machines.
Surely in all Windows version setups there’s a lot of probing of buses, which require time spent waiting for possible replies from devices rather than CPU time.
But I do wonder why such huge amounts of time are needed for some of the stuff like saving settings or registering components or whatever they’re called.
1/27/2005 7:16 PM psdk
> Why does Platform SDK Setup move ~20000
> files to one directory? This is insanely
> slow on fat32 partitions.
Sure, but no slower than Internet Explorer in ordinary daily usage with its tens of thousands of temporary internet files.
(1/27/2005 1:14 PM josh
> AC: For future reference, bad typing and
> poor spelling and grammar do not help convey
> your point to your audience.
I wonder if josh can write AC’s native language half as well as AC writes josh’s native language.)
Longhorn is going to install an *image* of itself? Does this mean that they’re going to have some sort of tweak to the filesystem that will allow "mounting" a compressed disk image, and copying it (much like Mac OS X does installations)? Or, do you mean to say that it will literally do a bit-for-bit copy Altiris-style?
[1] describes how an alpha release of "Longhorn" setup works.
Seems to me that they do whatever is necessary to make it faster. It just takes some time to install :). And in my opinion it depends on hardware. Installing Windows XP in a VPC image on my slow laptop is a lot slower than installing it on some fast machine. If I remember correctly, they even warned you in NT 4.0 if you didn't run smartdrive before the setup, and in Windows 2000 setup smartdrive started together with the setup. I can't see where the politics comes from. Guess I'm really stupid and can't see this, but what can they gain by making the setup slower?
[1] http://www.winsupersite.com/reviews/longhorn_4008.asp
The NT4 thing (which lingered in 2000, XP and 2003) was if you started setup from DOS. Then, winnt.exe (a DOS executable) would use DOS to copy the entire CD to C:, and then restart into the real setup. This copying was indeed much faster with smartdrv.
Norman: During GUI-mode setup, you can press Shift-F10 and get a cmd window. You can then run tasklist/filemon/regmon/etc. and figure out what’s going on.
And Longhorn setup is indeed different – it asks you a few questions in the beginning (product key, machine name, etc), and then proceeds to dump an image on the drive (much like Ghost / DriveImage / Altiris). Then it boots into the new image, does a PnP cycle, boots again and that’s it. Took me 19 min once, on a quite old machine (PIII/600).
Slightly off-topic: The speed of a setup process has a lot to do with the physical medium (the CD/DVD). On my machine, installing the full MSDN Library from a DVD ISO image shared over a 100 Mbps network and mounted as a virtual drive takes about 3-4 minutes. I don’t even know how much it takes to do the same thing from a real DVD, I never was patient enough.
@Cheong: Binaries from Microsoft are *always* signed.
We used to (Win95 era) copy all of the install files to the HDD before ever installing the OS, because it was much, much faster than doing it from the CD-ROM drive, and it would already have the correct paths to the setup files if it needed them for whatever reason.
I bought an SATA hard drive for my last computer. Windows XP installed *much* faster than previously, around half the normal time. The countdown was still 35 minutes or whatever to start, but as someone above mentioned, some of those minutes went by really fast.
> Setup — it could be possible to make such an installation of Windows XP that would take 5 minutes, and not 35.
On a new system with a 10K RPM Western Digital Raptor, I’ve seen Windows XP up and running in under 15 minutes. The hard drive is always the slowest component in a system, unless you have legacy I/O like a *shudder* floppy drive.
I know that there's more for setup to do when "upgrading" over a previous version of Windows. The description of the Longhorn setup has a lot in common with what I claimed. What you get on your hard disk after the whole setup process is really something that, uncompressed, fits on two CDs. Add compression and the copying speed will only increase, since the throughput of a CD (DVD) drive is the bottleneck in the process. And that's it, plus the hardware detection and writing of the results. What else does setup really have to do on the user's machine? The system already recognizes when too much hardware changes, making activation necessary again. Why not base the setup on that?
And yes, "registering components" is really the part I like the least. If I understand what's going on there, there are now a bunch of COM objects, each with its own setup procedure, which is basically just "add these entries to the registry and (maybe, I haven't investigated enough) something to the signatures list". And for each object, something small is done, resulting in a lot of updates to the files, opening them, closing them, etc. So why not have the whole content of the registry and signatures already prepared by MSFT, instead of doing all this on the user's machine? 99% of these objects can't be removed in the setup procedure anyway. The code behind the registry was written by really good engineers and carefully tuned; big parts of the registry can be imported really, really fast. So "registering components" could be done in seconds. But it seems these possibilities are not used in the Windows setup. Again, if the setup were designed more around "what can be done only once, during creation of the setup medium, instead of spending the time of millions of users", I don't think it would look like this.
So where does the politics come in? Maybe with the assumption "the user, no matter how advanced, should actually just buy a new computer with a new version of Windows pre-installed, so he'll never see the setup anyway". If that were correct, we wouldn't be discussing this now.
The subject matter has drifted far from Automatic Updates bandwidth management – I’ve already written up an explanation of the complexity of setup (and why you can’t just memorize a registry) but I’ll post it as a separate entry. (Based on the current article queue, it should appear sometime in early June.)
Back on topic then: I know your personal policy doesn’t allow you to comment on the future, but any ideas on when will Microsoft provide a unified patching API (at least for its own software)?
Downloading patches from Windows Update for Windows, Office Update for Office, getting patches by hand for Exchange, SQL and other stuff is very tedious. Updating other apps is waaay worse.
Ovidiu,
A unified API is already in the pipeline.
Please see
http://www.microsoft.com/windowsserversystem/wus/faqs.mspx
about plans to support patching Windows, Office, Exchange and SQL from one portal.
On the topic of XP install time, I tested the issue tonight. The install started at ’39 minutes remaining’.
Minutes 39-29 were full of user prompts, so I didn’t bother to record how long it took. The last time the XP install waited for user input was at the 29-minute mark. We’ll call this time START.
START: 8:25
ESTIMATED END = START + 0:29 = 8:54
ACTUAL END: 8:28
Yes, my Windows XP install took 3 minutes when estimated at 29. At 8:28 my PC rebooted, at 8:30 it was at the first welcome screen, and at 8:31 the desktop was completely loaded and the install was complete. The reboot time is a bit skewed because I had quick boot turned off in the BIOS and more than 1 minute of that was spent before loading the OS.
Of course, my computer is a beast, so my results are not average. But I thought I’d give some hard numbers for those who are interested. (Athlon XP 3500, 250 GB RAID 0, 1 GB RAM)
Thanks for your response, Anonymous, but that is not what I had in mind. I’ve been using SUS since it appeared, I even wrote a download manager based on BITS for my own use, but there’s no way of integrating your own application with this framework.
For example, it would be very useful if apps would update their code or data (e.g. antivirus definitions) with this kind of tool, especially in LUA scenarios. For instance:
I run as a regular user. In order to get Yahoo! Messenger running, I had to tweak the ACL on the installation folder, since the stupid app creates temporary files in there. When it tries to self-update the same thing happens.
I would like to see a complete framework that allows for applications to update themselves (with delta patches to save bandwidth), even if the files to be updated are under Program Files or some other read-only folder (from a regular user’s point of view). This would be possible by creating a Windows Service to handle downloads and patches (it’s already there and it’s called AU :)) and by allowing installers to register their applications with this service. This way, YM, Acrobat and other apps that have self-update functionality would be able to update via a consistent framework.
Re: unified application servicing:
I wouldn’t hold your breath for much more than what SUS is delivering. The focus is on more reliable mechanisms for the future, not just aggregating / federating all the install technology of the past. There is a certain level of investment going on at that level but fundamentally, for example, there is no way to give a strong transactional guarantee without cooperation with all the installers in the world.
Instead, in Longhorn we're introducing a variation on ClickOnce that's targeted only at the OS (it's a major variation; don't assume too much except that the metadata to describe the state is unified and the same team – mine – did the infrastructure for both).
OS servicing is a funny thing. Most installers have something like a 0.1-1.0% failure rate for various reasons. For most apps, that's a manageable number of support calls.
For downloads from Windows Update, these affect tens or hundreds of millions of people, so a 0.1% failure rate is astronomically bad. We're working with the transactional registry and filesystem teams to give hard transactional guarantees about the changes to system state. However, getting this right has involved a lot of changes to how components think about deploying themselves, and it's been an uphill battle, since traditionally the setup team sinks all the cost of deploying components.
Self-registration is a great example of this. So many people get it just plain wrong. A clean install has a very high chance of succeeding, but servicing and uninstall actually fail often. (I'm not sure the rate is over 50%, but it's hard to measure, because most (yes, over 50%) of the software packages out there don't even write their uninstall keys correctly, so you can't uninstall the app at all!)
Maybe I'll coordinate with Raymond and steal (some of) his thunder about how hard setup is. I'm only really interested in talking about it in the context of Longhorn, not legacy (except in comparing and contrasting), whereas Raymond's fundamental strength is a cross-version/platform perspective.
Thanks for your reply. One final suggestion:
I think it would be great to have PAGs talking about versioning, on the one hand, and about deployment and servicing, on the other hand.
Microsoft has learned a lot of lessons the hard way and sharing this knowledge would make things a lot easier for many developers and end users, and also for Microsoft. Just making developers aware of these issues would be a great first step, especially since many people seem to look at smart clients as the next big thing.
Can anybody help me please?
I can't turn off my Automatic Updates option because it is inactive. I have Win XP Professional with SP1 and SP2 installed. The option is ON, and if I try to change it I can't, because all 4 options are inactive.
Thanks
I would like to see an option to cache updates.
It would be nice to be able to select an option to keep the updates. And then I would like to copy them to a thumb drive and take them to another computer that only has a modem and have the update software be able to look at what it needs to update and get the updates from my thumb drive if they are there.
See John Gray’s comments earlier.
http://weblogs.asp.net/oldnewthing/archive/2005/01/27/361595.aspx#361693
Each update is customized to the machine that is downloading it, so you can’t just take the update and install it on another machine since that other machine may have a different configuration.