Date: | February 15, 2007 / year-entry #55 |
Tags: | history |
Orig Link: | https://blogs.msdn.microsoft.com/oldnewthing/20070215-05/?p=28003 |
Comments: | 53 |
Summary: | If you try to set the current directory of a command prompt to a UNC path, you get the error message "CMD does not support UNC paths as current directories." What's going on here? It's MS-DOS backwards compatibility. If the current directory were a UNC, there wouldn't be anything to return to MS-DOS programs when they call function 19h... |
If you try to set the current directory of a command prompt to a UNC path, you get the error message "CMD does not support UNC paths as current directories." What's going on here? It's MS-DOS backwards compatibility. If the current directory were a UNC, there wouldn't be anything to return to MS-DOS programs when they call function 19h (Get current drive). That function has no way to return an error code, so you have to return a drive letter. UNCs don't have a drive letter. You can work around this behavior by using the pushd command, which maps a temporary drive letter to the UNC and changes to it; popd releases the temporary letter.
(Griping seems to happen any time I write about batch files, so I'll address it pre-emptively: Yes, the batch "language" sucks because it wasn't designed; it just evolved. I write this not because I expect you to enjoy writing batch files but because you might find yourself forced to deal with them. If you would rather abandon batch files and use a different command interpreter altogether, then more power to you.)
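For illustration, a minimal transcript of the workaround (the server and share names are hypothetical, and the temporary drive letter cmd picks will vary):

C:\>pushd \\server\share
Z:\>popd
C:\>

Here pushd allocated the highest free drive letter, mapped it to the share, and changed to it; popd released the mapping.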
Comments (53)
Comments are closed. |
Ok a very naive question.
Why not allow UNC current directory paths under cmd.exe, but just have invocation of MS-DOS programs fail when that applies?
I use CMD.EXE every day, but only very rarely do I invoke a bona fide DOS program.
Were it not for the plethora of corporate logon scripts that all map a zillion drive letters (J: seems to be popular for some reason), I’d suggest something like a universal UNC mapping akin to /afs/whatever.
Maybe \\server\share could become N:\server\share for DOS programs?
Was that a backhanded plug for cmd.exe all growed up?
"If you would rather abandon batch files and use a different command interpreter altogether, then more POWER to you"
:-)
Hmm, those same batch files might be messed up by UNCs in environment variables. So my guess is it's to avoid introducing automatic drive mapping into an already bulky cmd.exe – something that should be done completely or not at all.
Bob: Cool idea, but what about programs/batch files that search the root of drive?
(OK, that’s enough from me for today…)
Does this happen on 64-bit Windows where you can’t run DOS programs at all? (I haven’t got a 64-bit PC to hand, so I can’t try it).
You can’t make 19h run pushd because there might not be any drive letters left to create a temporary drive with, and, as explained, 19h has no way to indicate failure. Also, pushd is a cmd.exe builtin, and 19h isn’t.
Novell NetWare (remember that?) used to use _, [, ] etc… as temporary drive letters to get around similar problems, but the world’s moved on now.
John Elliott: I’ve just tried it on Windows 2003 R2 x64, and I got the same error message that Raymond describes.
I’m not being entirely coherent, but surely the int 21h stuff is an ntvdm affair? ntvdm could map the current directory on program startup, so DOS programs are irrelevant.
JohnCKirk: Which suggests that there must be batch files that depend on this behaviour as well as DOS programs.
Mark Steward: The problem is that NTVDM can’t be sure that there’s a spare drive letter available, unless it uses letters that aren’t normally valid; and that would almost certainly break programs as well. And be quite confusing for the user ("What does it mean, drive ]: is full?"). And after the first few symbols you’d be into the lower-case letters which would be even more confusing (according to the Interrupt List, MS-DOS 2 supported 64 drives. You could have had a system with two different drives called C: and c:. Boggle.)
Why not have ‘cd //server/share’ do the pushd if required, and popd when you cd to a different drive or share?
4NT has supported CD’ing to UNC paths for ages, and can even launch MS-DOS applications from UNC paths.
If you’re really terrified of what might happen if an MS-DOS app gets back "C" from function 19h when the current directory is a UNC path, then just block the execution of MS-DOS apps from UNC paths (e.g. detect them using GetBinaryType). That would be more sensible than disallowing UNC paths altogether.
I would suggest creating some sort of flag (of the nature that already exist) that allows setting the current directory to a UNC path. Just make sure that anybody who uses the flag knows that some things may not work right if they attempt to get the current drive letter.
I’d really like an explanation for why some Windows Installers blow up when APPDATA is set to a subst’d drive… that is more irritating to me than this issue.
I realize this goes against everything Microsoft stands for, but why not mention what you need to do in the error message?
“CMD does not support UNC paths as current directories. Try using the pushd command.”
> [If only the people who wrote that error message had a time machine. -Raymond]
If only you could edit error messages in subsequent releases [/snark].
I’ll just insert my standard suggestion here:
ship a bash or zsh interpreter in windows vista SP1/XP SP(last) and bind .sh to it by default. Let those of us who want better batching have a common way to access it and leave cmd.exe alone for the sake of hallowed compatibility.
Hmm.
OK, the explanation makes sense for the NT/2k/XP/Vista implementation of command.com, but not for cmd.exe.
On those operating systems command.com executes (.BAT) batch files, and cmd.exe executes .CMD files. So existing scripts shouldn’t have a problem.
There are already enough differences between these two environments – I hardly think this one would tip the balance. Or is there a deeper story about what happens with an invalid return from that interrupt? But a previous poster suggests not.
Matt: subst drives only exist for the current session. msiexec.exe runs multiple copies of itself (some running as you, some as SYSTEM). The one running as you will see the subst drive, the one running as SYSTEM will not.
So I guess it just depends what happens in the msiexec.exe running as you and what happens in the one running as SYSTEM.
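A quick way to see the per-session nature of subst (a sketch; the drive letter and path are hypothetical): create a subst drive as yourself and list it. A process in another logon session, such as one running as SYSTEM, won't see it at all.

C:\>subst Q: "%APPDATA%"
C:\>subst
Q:\: => C:\Documents and Settings\you\Application Data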
Despite the fact that "batch files suck" only a half-dozen times in the past 6 years have I had to resort to a vbscript to accomplish something that I just couldn’t get done in batch. And though it will strike some as insane, I typically make calls to the vbscript from a batch file rather than just rewriting the whole thing in a vbscript.
I know one day the cmd language and batch files will have to die away, but for now I’m glad they’re there, I’m glad they’re backwards compatible, and I’m glad there are people that still write about them so I can keep learning.
And if everything looks like a nail to a man with a hammer…to me everything looks like something that’d fit nicely inside a for loop.
That’s an interesting data point… any idea what gets returned to a DOS program that makes a DOS function 19h query in that case? (jeez – am I really wasting time asking about DOS API behavior?)
Thank goodness for PowerShell.
Windows PowerShell
Copyright (C) 2006 Microsoft Corporation. All rights reserved.
PS C:\> cd \\onecode\Public
PS Microsoft.PowerShell.Core\FileSystem::\\onecode\Public>
Has anybody ever created VBS versions of the DOS batch functions? So that one could move completely away from batch programming very easily?
Or why not make 19h run pushd when it’s called? I suspect it’s more that an alarming number of corporate batch files do horrible things like extracting drive letters from paths. Batch files are one of the main purposes for cmd, so you couldn’t block them. And if you run pushd whenever 19h is called, you might as well do it on startup.
Not sure why cmd.exe doesn’t just call pushd on startup, though. Perhaps because it’s possible to fail if you have too many drives, and that’s *another* error message and *another* test case.
I’ve actually been looking into batch files again after not really using them since MSDOS6. I’m surprised at all the hackish improvements made for the sake of backwards compatibility. They have allowed me to do a few things MSDOS6 batch files couldn’t do (like get the full path to the batch file regardless of where it’s run from).
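For instance, a tiny sketch of that trick (save it anywhere as, say, whereami.cmd – a hypothetical name – and run it from any directory):

@echo off
rem %~f0 expands to this batch file's own full path;
rem %~dp0 is its drive and directory (with a trailing backslash).
echo Full path: %~f0
echo Directory: %~dp0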
A number of years ago, a hi-tech kind of guy told me "DOS is Dead".
Last time I saw him, he was trying to make a living detailing cars, or some-such.
Batch files are a wonderful way of linking different, relatively small programs into a system that does "wonderful things".
Debugging becomes very, very easy. Being able to look at intermediate data is very helpful in the debugging process.
I’ll stop now. I’ve got to go and load my bicycle into my airplane for a trip south.
I realize all the old versions of Windows have this problem. But with Vista, since the new NTFS supports symlinks to network shares, you could mount the share in the home directory, or a special directory like %windir%\network\server\share.
At least that will solve the problem in the future…
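On Vista that would look something like this, using the new mklink builtin (a sketch; the link and share names are hypothetical):

C:\>mklink /D C:\net\share1 \\server\share1
symbolic link created for C:\net\share1 <<===>> \\server\share1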
"ntfs supports symlinks to network share"
Oh… that’s a great reason to upgrade to Vista! :) (no sarcasm!)
Well pushd is nice, but what if I have multiple subdirectories on the server share and I want to know from which one I am running? If I just run pushd on the server share, I get the server share in a new drive letter – but I lose the information about the subdirectory. They are the subdirectories of the new drive, but which one is the one I’m running from?
Currently, each batch file in subdirectories defines an environment variable with its unc path. This works, because each batch file is run separately, so the variable is only set once. Is there another way to get the current working directory? I’m probably missing something about pushd :-(.
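For what it's worth, pushd with the full UNC – subdirectory included – should keep the subdirectory in the current path, so %CD% (or the prompt itself) still shows where you are. A sketch with hypothetical names; the drive letter will vary:

C:\>pushd \\server\share\tools\build
Z:\tools\build>echo %CD%
Z:\tools\build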
Doogal: VBS versions of DOS batch files? Why not just write batch files? They generally work and the MS cmd team (person?) did a superb job of enhancing the shell without breaking old scripts. Sure the cmd language is a bit odd, but so’s perl and that rules…
As ever, use the right tool for the right job. (And always know what the most fashionable tool is too, so you can talk about it).
re 2:50 PM, Cooney
Cygwin? I haven’t done the .sh file extension thing, but it is very handy having a *nix command line about the house.
Compatibility… a simple word.
After 15 years of any product, you have to start thinking of something else.
Hopefully Microsoft IS thinking of a Windows NE (N-Eleven, after N-Ten) that has no built-in compatibility… virtualization and emulation can keep old things going.
Laura: Raymond posted a while ago about virtualization/emulation not being a panacea (you just swap one set of headaches for another: trying to integrate cut&paste, file system, graphics…); on the other hand, it’s exactly how most modern platforms do it already (OS/2 and the NT derivatives running a modified copy of Win3.1 and Mac OS X running a copy of Mac OS 9 under the name ‘Classic’, for example).
James: No, not a panacea, I agree. The thing is, it can be done (as you mentioned Apple). If you don’t change anything, your current headache gets only worse. Eventually you must cure it. Happens when you reach the cost/benefit point.
Sincerely, I don’t think it is worth it to maintain (MS-)DOS compatibility at all (when was the last time someone bought DOS 6.22? Or a DOS app?). It’s nice to say I can run in 2007 an app developed in ’87, but that’s about it, IMHO. That cannot be a blocking factor for evolution.
And something is happening. Singularity is there somewhere. Just a first pass. I hope.
If 19h can’t return an error, what about allowing CDs to UNC paths and printing an error message if someone tries to run an MS-DOS binary while in one?
All modern programs will still work, people finally get to use UNC at the command prompt, and old MS-DOS programs are protected from undefined behaviour.
The only risk would seem to be that someone could run a batch from a UNC path without realising that part of it relies on an old MS-DOS binary, in which case the error will occur at a later stage (when the binary is run) rather than at the point they try to change directories. Nothing that worked before is broken.
Is that risk the reason this wasn’t done, or is there something I haven’t thought of?
Kudos to Raymond for (once again) pointing out a nugget of hidden good stuff.
I suspect I’ve read "help pushd" several times but never noticed the UNC kludge.
PS Microsoft.PowerShell.Core\FileSystem::\\onecode\Public>
Ouch! That "path" is over half the width of a standard console window.
As you might know, in batch files you can refer to the batch script itself with the special parameter %0. This parameter can be expanded to a "drive letter" and path with %~dp0. This expansion also works for UNC paths!
For example, I use a batch file to install Microsoft security patches directly from a file share without having to mount the share to a drive letter which might be already in use. The batch file looks like this:
@echo off
"%~dp0WindowsXP-KB929969-x86-ENU.exe" /norestart /passive
"%~dp0WindowsXP-KB928255-x86-ENU.exe" /norestart /passive
UNCs are handled just fine everywhere else in cmd. Just not the current directory. If you launch an app using a UNC, then that’s fine, as the current directory (and current drive) don’t change and will still be valid.
(Note: If you cd /d to a network drive, then net use /delete that drive, cmd doesn’t change your current dir or drive, but moans if you try to use it. So you’d get a current drive letter back from 19h that didn’t point to a working drive.)
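In transcript form (hypothetical names; the exact messages, and whether net use prompts before disconnecting, may vary):

C:\>net use Z: \\server\share
The command completed successfully.
C:\>cd /d Z:\
Z:\>net use Z: /delete
Z: was deleted successfully.
Z:\>dir
The system cannot find the path specified.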
Does this make me a .cmd guru, or maybe I should go looking for a life.
Although you can’t CD to a UNC path, you can open cmd.exe with the current directory set to a UNC; I do it all the time (in conjunction with something equivalent to "cmd prompt here") – see KB156276.
It should be noted that once you open such a command prompt, the CD command is flaky, it seems that until you use "CD .." (which brings you up to the UNC root directory, regardless of how many directories down from there you are), no other CD commands seem to work. Some of the other commands are flaky as well; for example, "DIR" works fine, but "DIR *.*" complains that "The filename, directory name, or volume label syntax is incorrect.".
Laura: You’re saying that Vista wasn’t a brave enough leap from backward compatibility? MS have arguably shot themselves in the foot with all the stuff that Just Doesn’t Work anymore. People are scrambling to get new versions of software out, new drivers out, etc. It’s gonna hurt them. They will, for a time, be haemorrhaging users to other OSs.
But, I applaud them. I think it was not only brave, and The Right Thing to do: it was the Only Thing to do, if they wanted to maintain respect as a secure OS for business.
They bent over backwards to keep reverse compatibility where it doesn’t hurt, they throw errors where it might hurt (programs/scripts relying on getting the drive letter from the current path), and they break it if they have to for security. That last was the brave and right bit, for which they will take a lot of flak.
I’m hardly Vista’s target market – I’m a BSD/Linux guy by preference, and for various reasons wild horses couldn’t make me buy Vista, but despite that I say: MS did good. There were hard decisions, but they had to be made, and I feel MS generally chose the right path, of security over compatibility, and compatibility over innovation.
If only open source authors would pay as much attention to backwards compatibility.
Laura: The thing is, it IS done that way now: it isn’t "Windows" which can’t handle the current directory being a UNC path – indeed, on the native level, Windows doesn’t even *HAVE* a current working directory! The whole concept of a CWD gets emulated on a higher level (when your calls to CreateFile get translated into the Native API’s NtCreateFile)
Christian: The %~dp0 thing is very handy indeed; it did take a little more ingenuity to get around Adobe’s Postscript printer driver installer, which appears to insist on looking for configuration files in the *root* of the current drive: infuriating!
On the other hand, we have a new contender for Raymond’s list of really irritating ‘toast’ messages: Vista, popping a message up saying the USB subsystem is hosed, ‘click here for more details’. Of course, clicking there is a little tricky with a USB mouse and keyboard :-(
Laura: "Sincerely, I don’t think it is worth of it to maintain (MS-)DOS compatibility at all (when was the last time someone bought DOS 6.22? Or DOS APP?). It’s nice to say, I can run 2007 an app developed ’87 but that’s about it, IMHO. That cannot be a blocking factor for evolution."
DOS support already *is* implemented through an emulation layer (ntvdm – NT Virtual DOS Machine). It doesn’t get in the way of anyone who isn’t using it, so why deprive those of us who do need it of that feature?
The kernel itself has a pretty clean, straightforward interface to the outside world – pure Unicode, no reliance on drive letters and no DOS function 19h. Everything DOS-related gets added in an emulation layer as needed – just as you suggested.
"They will, for a time, be haemorrhaging users to other OSs."
More likely people bothered by driver/software compatibility will stick with XP rather than go to a different OS where ALL of their drivers/software doesn’t work. Anyway, it’s still early days for Vista but enough stuff works that I get by fine with it as my primary OS. I am missing a couple of things, but I’d be missing a few hundred things if I moved to a different platform entirely.
S:
I guess you haven’t done as much Batch programming as I have – there’s a bunch of stuff you can’t do without jumping through serious hoops. I’d love to convert my favorite Batch files into VBS; it’s just a pain to replicate the coolest functionality of Batch, and I’m sure it’s the sort of thing someone must have done, even just as an intellectual exercise.
PS I don’t want to shell out to cmd either, I’m trying to simplify things.
James: Raymond’s book is enlightening in this matter.
Yes, DOS is "emulated" and many other things are too. But still we need keep around (?) 8.3 filenames, drive letters and other quirks to keep Vista compatible with it’s past.
Laura: There’s no requirement to have 8.3 filenames at all, on NTFS (they are an integral part of FAT, of course) – if you like, it’s the work of a second to switch that feature off. Drive letters are still very much a current part of the Win32 API, rather than a quirk of the past; in theory, it might be possible to construct a system using only UNC paths, but I suspect a *lot* of code (not by any means legacy or obsolete) would break as a result – probably including important bits of Vista itself.
We already have incompatible extensions (e.g. delayed expansion) – why not a "UNC paths" mode where it’s made clear in the help that drive letters won’t be available?
I currently start all my batch files with "pushd %~dp0" to handle running from a UNC path, but it sometimes interferes with other people’s logon SUBST scripts (e.g. if they want to subst Y: or Z:)
Florbo:
You could always follow
pushd %~dp0
with
SET DRIVE=%CD:~0,2%
and then use %DRIVE% everywhere.
Dewi,
“They bent over backwards to keep reverse compatibility where it doesn’t hurt…and they break it if they have to for security.”
How exactly is crippling performance of the open standard OpenGL graphics system by layering it on top of the proprietary DirectX (after originally removing it altogether) and disabling the help of many existing programs by removing support for .hlp files bending over backwards and only breaking when they have to for security?
Sounds more like trying to move people off the open standard and onto their proprietary standard to lock them onto their platform.
Sorry about that, but the topic was about maintaining things for backwards compatibility, so it’s at least somewhat on-topic.
I’ve looked for a rationale for disabling .hlp, and I’ve not found anything that said there were security problems, just that MS didn’t want to maintain it. If you have pointers to where we can find more information on the security issues, that would at least help pacify the masses a little when they complain to us that they can’t read our help files on their new computers.
Greg: There’s a buffer overflow in .hlp parsing, released just under a year ago – although MS’s official line, according to KB917607, is that .hlp support has *not* gone – it’s just taking a while to fix…
"Users who want to view 32-bit .hlp files must download the program from the Microsoft Download Center, and then install it on their computers. The download for Windows Help is still in development and will be available in early 2007."
OTOH, it also says WinHlp32.exe first shipped with Win 3.1, which would be a little surprising…
"I doubt the minutes of the meetings I attended are available online, sorry."
Oh well, thanks anyway.
James, thanks. That KB article used to say it would be available by retail release of Vista, but it was changed last week (or so) to "early 2007", and it will require that the *user* download the support. The support will not be included with the OS, and software companies won’t be able to distribute it (at least not yet).
Hi Raymond
I’m always loving the old new thing!
The pushd trick is nifty — but what if i need to specify a username and password for the new path?
Is there a way to do this in one line?
is there any equivalent that will "Do What I mean" when I type:
C:\>pushd \\machine\path /user:domain1\user1 /password:Passw0rd
While I can do something similar in two lines with:
C:\>net use M: \\machine\path Passw0rd /user:domain1\user1
C:\>M:
I don’t know (ahead of time) what drive letter to use…
cheers
lb
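One two-line approach that sidesteps picking a drive letter (a sketch reusing the hypothetical names above): authenticate the session against the UNC itself with net use – no drive letter – and then let pushd choose the temporary letter:

C:\>net use \\machine\path Passw0rd /user:domain1\user1
C:\>pushd \\machine\path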