Another retired computer: The Alpha Rawhide

Date: February 28, 2007 / year-entry #73
Tags: dead-computers; other
Orig Link: https://blogs.msdn.microsoft.com/oldnewthing/20070228-00/?p=27823
Comments: 22
Summary: This computer didn't die like the previous one; it merely outlived its usefulness. In its prime, the machine was a force to be reckoned with. It was about the size of a small refrigerator and generated about as much noise as a vacuum cleaner. It contained four, count 'em, four Alpha AXP processors, each running at...

This computer didn't die like the previous one; it merely outlived its usefulness.

In its prime, the machine was a force to be reckoned with. It was about the size of a small refrigerator and generated about as much noise as a vacuum cleaner. It contained four, count 'em, four Alpha AXP processors, each running at a mind-boggling 400 MHz. It had one gigabyte of RAM and thirteen gigabytes of hard drive space (striped across more than a dozen fast SCSI drives). Hey, back in the 1990s these were impressive hardware specs.

When it was in active use, the machine ran a batch file that simply grabbed the latest source code to the shell, compiled it (if there were any changes made since the last iteration), and then repeated. It was called the "hourly build machine" since it took about an hour to compile the shell from scratch. And if there were any errors in the build, it sent mail to the shell team saying, "The Alpha AXP build is broken. Go fix it." It ran other tests on the side to verify that, for example, resources didn't change that would either generate compatibility problems or cause the localization team to get upset. Since very few people on the shell team had Alpha AXPs, the "hourly build machine" was the best chance of catching Alpha-specific build issues before the official build lab noticed the following morning.
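
The loop described above can be sketched as a small script. This is a minimal, hypothetical sketch in POSIX shell standing in for the original Windows batch file; the paths, the `build.sh` stand-in, and the skip-if-unchanged check are all illustrative assumptions, not the actual tooling:

```shell
#!/bin/sh
# Sketch of the "hourly build machine" loop. All names here are hypothetical;
# the real machine was a batch file driving internal source-control and build
# tools, and it mailed the shell team when the build broke.

SRC_DIR=./shell-src
STAMP=./.last-build

sync_source() {
    # Stand-in for "grab the latest source code to the shell".
    mkdir -p "$SRC_DIR"
    [ -f "$SRC_DIR/build.sh" ] || printf 'echo built\n' > "$SRC_DIR/build.sh"
}

needs_build() {
    # Compile only if something changed since the last successful iteration.
    [ ! -f "$STAMP" ] && return 0
    find "$SRC_DIR" -type f -newer "$STAMP" | grep -q .
}

build_once() {
    sync_source
    needs_build || return 0          # nothing changed; skip this iteration
    if sh "$SRC_DIR/build.sh" > build.log 2>&1; then
        touch "$STAMP"               # record the last good build
    else
        # The real machine sent mail to the team instead of printing.
        echo "The Alpha AXP build is broken. Go fix it."
        return 1
    fi
}

# The real machine ran this forever:  while :; do build_once; done
build_once
```

The skip-if-unchanged check is what made the loop cheap between checkins: an iteration with no new changes costs only a source sync, so the machine could poll continuously yet still earn the name "hourly" only when a full compile was actually needed.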

When support for the Alpha AXP was officially dropped, the responsibility for producing hourly builds had long since fallen to two other machines, but the Alpha AXP continued to grab the latest source code and index it, making the index available to the entire Windows team. (And since it didn't have to do any compiling, it grabbed the source code to the entire operating system, not just the shell.) Having a search engine running against the entire source code to the operating system you're working on is very handy.

Ultimately, though, the machine was retired. What were once impressive hardware specifications became barely yawn-worthy. The machine sat in my office and served as a table for several years. (I can't even say "an expensive table" since the value of the computer was probably nil by this point.) It travelled with me through several office moves, until I eventually decided to put the machine out to pasture. I wiped the hard drives of all sensitive information and cajoled one of my colleagues who owned a pick-up truck into helping me load it up and take the machine to the archives department, where it now spends its time swapping stories of old times with IBM PC XTs and other hardware from times past.

(The archives department serves an important function beyond merely being a repository of Microsoft history. It occasionally becomes necessary to actually run an operating system from times past for various reasons, be they educational, nostalgic, or legal.)


Comments (22)
  1. Gabe says:

    One of the first uses for the content indexing system was in fact a MIPS machine (this was 1995) indexing the Windows source code. I’m not sure if it was actually running on OFS or just using the indexing engine from it. I am pretty sure that the current indexing engine is a direct relation to that one, though.

  2. Bjorn says:

    Admit it, you guys just use that room to play classic mid-’90s Origin games which never, ever worked under Windows. (Not a compatibility slam; much as I might wish otherwise, Crusader especially used so many low-level hardware tricks it would never run without a program like VMware or DOSBox.)

  3. BarryBo says:

    I used a Rawhide as my dev machine when I worked on Wx86.  Played quite a bit of Diablo and Fury3 on it, in the name of emulator testing.  ;-)

    Barry

  4. sergio says:

    Did it also run on 3-phase power, as described in Gabe’s comment on your "Giving fair warning before plugging in your computer" article?

  5. richard says:

    Man! I would have loved a machine like that. Back in the 90s I had a screaming 486DX running at 100 MHz, later replaced with a Pentium 133.

    I still remember how blazingly fast my new Pentium Pro 200 MHz was. But I still wanted an Alpha machine – they were screamingly fast.

  6. Brian Hoyt says:

    This brings up a question I always wondered about.  How close was the Windows 2000 AXP version to being done?  The decision to end support for it seemed to happen around RC1.  I wonder if there is anyone out there still plugging away with Windows 2000 on an AXP machine.  Fastest Exchange servers one could get at the time.

  7. David Walker says:

    Back in the 80s I had a screaming original IBM PC running at 4.77 MHz!  With 160 KB of memory (non-IBM, since I couldn’t afford that much IBM memory).

  8. Sarah says:

    The Alpha was serving one function there at the end of its life – keeping Raymond’s office warm.  

    That sucker sure put out the heat – so much so that the A/C worked extra hard to keep our part of the building cool.  The offices around Raymond’s (including mine) were extra cold because of this so I got a retired Itanium and plugged it into the wall to fight back.

  9. Slaven says:

    "so I got a retired Itanium and plugged it into the wall to fight back."

    I’ve seen office wars over temperature but that’s taking it to the next level!

  10. GregM says:

    My college was a beta site for DEC.  I remember when we got a rawhide.  We normally only had 1 of each model to start with, so the machines were given the name of the computer model.  It was running OSF/1 or Digital Unix, whatever it was called at the time, not Windows, but I remember it being a very powerful machine for the time.

  11. We continue to ship Alphas! While Microsoft might have aided the demise of the Alpha, by eliminating any hope that Alpha volume would ever grow to that of the x86, it didn’t stop users who needed to run their VMS applications on hardware more modern than a VAX from buying Alphas. Folks like the US and British Navies, whose warships continue to rely on OpenVMS, continue to buy 833 MHz 264DP’s from Microway, and we have enough components to support them for at least another 20 years. Looking for a new Alpha?

  12. "(I can’t even say "an expensive table" since the value of the computer was probably nil by this point.)"

    Oh, man, it must only have been about 18 months ago I bought an AlphaServer on eBay; it now runs VMS, Tru64, Linux, NetBSD and NT 4.0 — lovely piece of kit.  Don’t think I paid more than 100 UKP though.  It’s one of a dozen test machines in my office — we target various flavours of Unix, plus VMS and Windows.

    Don’t get me started on Itaniums — we ended up buying a refurbished HP rackmount for an eyewatering amount of money, to support OpenVMS/I64, HP-UX and Linux.  Couldn’t find a machine being thrown out for love nor money!

    [And another lament that Windows 2000 for Alpha never saw the light of day here; would have been nice to have tried the beta/RC/whichever version did make it out.]

  13. Jonathan says:

    Why do you need an Alpha to compile to Alpha binaries? Surely there are cross-compilers – my 32-bit machine can compile AMD64 binaries just fine.

    Or maybe I’m just reading it wrong?

  14. vince says:

    "Why do you need an Alpha to compile to Alpha binaries? Surely there are cross-compilers – my 32-bit machine can compile AMD64 binaries just fine."

    Well, try it sometime.  Cross-compilers are fragile beasts to start with, and it gets more complicated if you are trying to cross-compile for a different operating system (mainly because you need the header files from that OS, which might not even be legal to copy).

    And after all that, do you trust the cross-compiled code to run bug free if you don’t have an actual machine to test it on?  Especially if it’s mission critical software, you better have native hardware to give it a test on…

  15. Shaun says:

    "Cross-compilers are fragile beasts to start with,…"

    vince, why would they be inherently fragile?

    I remember using the GNU compiler on SunOS; it was a multi-stage process: the first stage was compiled with the native *nix compiler, the second stage was produced by that first-stage compiler, and so on.

    I also remember that the GNU cross compilers tended to be a little flaky and had a shaky reputation.

    However, I don’t think there is any natural reason for this.

  16. Phylyp says:

    The archives department

    Hmm, whatever you might know about this might be a nice topic for the suggestion box.

  17. Tomer Gabel says:

    Any chance of some wartime stories about the archive department’s "other" functions you mentioned? I think it would make for a fascinating (if not educational) read.

  18. Willy Denoyette - MVP says:

    This really hurts. I had Win2000 RC1 running on a 4-way AXP 2100 by the time Compaq came in and took away all the hard work we had done to make this the best platform for the future of Windows.

    I still remember the mail David Cutler, in all his excitement, sent us at Digital when his team first booted a (basic, without the Windows stuff) 64-bit version of W2K; a couple of weeks later came the bad news.

  19. Gabe says:

    Keep in mind that the Alpha was probably the fastest machine to compile on, period. Regardless of what architecture you were compiling to, you probably wanted to run the actual compilation on the Alpha.

  20. Rhys Wilkins says:

    Wow – Alphas!

    I had three of them – I paid about $140 for the lot.  I ran FreeBSD on mine, because I couldn’t find a copy of the Alpha build of Windows 2000.  Mine were hugely fast, and I was very disappointed when I had to replace them with an Intel machine (when the electricity bill arrived).

    A place where I used to work took over another company, and we found their entire AD running on an Alpha server with a late beta of Windows 2000, so it must have got quite a long way.

    Rhys

    p.s.: Raymond, the book is awesome!  I’m about a third of the way through, and it’s definitely well worth reading!  Well done!

  21. Gamma says:

    an Alpha server with a late beta of Windows 2000

    Hey, what about me?

  22. mariush says:

    I’d love to see some pictures of this hardware… and I’m sure others would…

Comments are closed.


*DISCLAIMER: I DO NOT OWN THIS CONTENT. If you are the owner and would like it removed, please contact me. The content herein is an archived reproduction of entries from Raymond Chen's "Old New Thing" Blog (most recent link is here). It may have slight formatting modifications for consistency and to improve readability.

WHY DID I DUPLICATE THIS CONTENT HERE? Let me first say this site has never had anything to sell and has never shown ads of any kind. I have nothing monetarily to gain by duplicating content here. Because I had made my own local copy of this content over the years, for ease of using tools like grep, I decided to put it online after I discovered that some of the original, previously public content had disappeared around early-to-mid 2019. At the same time, I present the content in an easily accessible, theme-agnostic way.

The information provided by Raymond's blog is, for all practical purposes, more authoritative on Windows development than Microsoft's own MSDN documentation and should be considered supplemental reading to it. The wealth of missing details provided by this blog that Microsoft could not or did not document about Windows over the years is vital enough that many would agree an online "backup" of these details is a necessary endeavor.