Date: | February 26, 2004 / year-entry #76 |
Tags: | history;time |
Orig Link: | https://blogs.msdn.microsoft.com/oldnewthing/20040226-00/?p=40473 |
Comments: | 17 |
Summary: | Floppy disks use the FAT filesystem, as do DOS-based and Windows 95-based operating systems. On the other hand, Windows NT-based systems (Windows 2000, XP, 2003, ...) tend to use the NTFS filesystem. (Although you can format a drive as FAT on Windows NT-based systems, it is not the default option.) The NTFS and FAT filesystems... |
Floppy disks use the FAT filesystem, as do DOS-based and Windows 95-based operating systems. On the other hand, Windows NT-based systems (Windows 2000, XP, 2003, ...) tend to use the NTFS filesystem. (Although you can format a drive as FAT on Windows NT-based systems, it is not the default option.) The NTFS and FAT filesystems store times and dates differently. Note, for example, that FAT records last-write time only to two-second accuracy. So if you copy a file from NTFS to FAT, the last-write time can change by as much as two seconds.

Why is FAT so much lamer than NTFS? Because FAT was invented in 1977, back before people were worried about such piddling things as time zones, much less Unicode. And it was still a major improvement over CP/M, which didn't have timestamps at all.

It is also valuable to read and understand the consequences of FAT storing filetimes in local time, compared to NTFS storing filetimes in UTC. In addition to the Daylight Saving Time problems, you will also notice that the timestamp appears to change if you take a floppy across time zones.

Create a file at, say, 9am Pacific time, on a floppy disk. Now move the floppy disk to Mountain time. The file was created at 10am Mountain time, but if you look at the disk it will still say 9am, which corresponds to 8am Pacific time. The file travelled backwards in time one hour. (In other words, the timestamp failed to change when it should have.)
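That two-second granularity matters in practice for anything that compares last-write times across the two filesystems, such as a backup or sync tool. Below is a minimal sketch of the tolerant comparison in C; the helper names are illustrative, not from any Windows header.

    #include <windows.h>

    /* FILETIME counts 100-nanosecond intervals, so two seconds is
       2 * 10,000,000 ticks. */
    #define TWO_SECONDS_OF_TICKS (2ULL * 10000000ULL)

    static ULONGLONG TicksFromFileTime(const FILETIME *ft)
    {
        ULARGE_INTEGER u;
        u.LowPart  = ft->dwLowDateTime;
        u.HighPart = ft->dwHighDateTime;
        return u.QuadPart;
    }

    /* Treat two last-write times as equal if they are within FAT's
       two-second granularity of each other, so a file copied from
       NTFS to FAT doesn't look modified merely because FAT rounded
       its timestamp. */
    BOOL LastWriteTimesMatch(const FILETIME *a, const FILETIME *b)
    {
        ULONGLONG ta = TicksFromFileTime(a);
        ULONGLONG tb = TicksFromFileTime(b);
        ULONGLONG delta = (ta > tb) ? ta - tb : tb - ta;
        return delta <= TWO_SECONDS_OF_TICKS;
    }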
Comments (17)
Comments are closed.
I can’t remember when I last used a floppy. At least 4 years ago, probably more. Does anyone still use floppies? This seems like a very irrelevant problem today :)
Anon: Almost all flash and USB drives today still use FAT, so the problem is not confined to floppies alone.
Anyone know how CDFS handles timestamps?
According to ISO9660
http://www.alumni.caltech.edu/~pje/iso9660.html
timestamps on CDs have 1 second resolution.
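For reference, the on-disc structure behind that figure: ISO 9660 directory records carry a seven-byte recording date and time, sketched here as a C struct (the field names are invented for this sketch; the layout follows the standard). Note the final byte: unlike FAT, a CD timestamp records which time zone it was written in.

    /* The seven-byte recording date and time of an ISO 9660 directory
       record. The last byte is an offset from GMT in 15-minute units,
       so the local time in the other fields can be mapped back to
       universal time. */
    #pragma pack(push, 1)
    typedef struct {
        unsigned char yearsSince1900;  /* 1900..2155 */
        unsigned char month;           /* 1..12 */
        unsigned char day;             /* 1..31 */
        unsigned char hour;            /* 0..23 */
        unsigned char minute;          /* 0..59 */
        unsigned char second;          /* 0..59: hence 1-second resolution */
        signed char   gmtOffset15min;  /* -48..+52, i.e. -12h..+13h */
    } Iso9660RecordingDateTime;
    #pragma pack(pop)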
What’s a floppy? USB flash is the way to go.
Aren’t flash drives formatted with FAT anyway?
For a change the cynicism in this remark is caused by someone other than Microsoft…
Last I read, Posix timestamps had to be based on GMT, which is UTC without leap seconds. After enough years, when 43,200 leap seconds have occurred, Posix users will have their midnight sun high in the sky, but UTC users will have it at noon.
There is already a difference of some number of seconds between UTC and GMT.
So if a process is running in the Posix subsystem, Windows is required to corrupt every timestamp by some number of seconds. Does Windows obey?
The advantage with floppies is that everyone has them, and you don’t have to get on the floor under the desk to put a floppy in, whereas you do for some people’s USB ports.
Not everybody has these anymore: Dell’s now completely phased out floppy drives in all their production machines, IIRC, and the last three or four machines I’ve built haven’t had a floppy drive in them. Between CD-Rs and bootable CDs, I haven’t ever wished I had it back.
Norman: In Windows, UTC == GMT. The time functions don’t make any adjustments for leap seconds.
I don’t have a floppy any more (I built my own machine, didn’t see a need for one). The only reason I can think of for having one nowadays is for flashing the BIOS, and even now some ‘flash in windows’ utilities are appearing.
Does Windows have a clear concept of leap seconds? I was playing with my GPS a few days ago and I wanted to get accurate times from its track logs. The GPS documentation said it returned seconds since 1st January 1980 (or similar), and neither the Windows nor the GPS documentation was clear about how leap seconds were handled. Does Windows consider all years to have the same number of seconds? If I subtract today’s date from the date at this time twenty years ago, should I expect years_in_seconds * 20, or should the difference include leap seconds? As leap seconds are not known that far in advance, can the time difference between a point in time now and a point in time in the future change? Am I making all this far more complex than it needs to be?
To be honest, I don’t really care what Windows does but I would like its current behavior to be better documented (a summary page in MSDN would be really good).
To Norman and Russell:
UTC is actually defined as being within .9 seconds of GMT.
UTC time is computed with a cesium clock, so the seconds from there are more accurate. Then when UTC approaches a one-second difference from GMT, which tracks the Earth’s actual rotation, a leap second is added or subtracted. Historically, only seconds have been added, about once a year.
GPS time is the seconds counted from 1980 by an atomic clock and is not adjusted, so it is ahead of GMT by 13 seconds.
2/26/2004 9:03 PM Jordan Russell:
> In Windows, UTC == GMT.
UTC does not equal GMT. If Windows says it does, well then what is new, another lie.
I guess this means, though, that when Windows documentation says it’s using UTC, maybe it’s really using GMT. Then what happens when the Windows Time Service picks up UTC from a network time server (some of which are operated by Microsoft)? The received time is not GMT.
2/27/2004 1:42 AM Jonathan Payne:
> The GPS documentation said it returned
> seconds since 1st January 1980
Yes. The client (Earth-based application) has to add or subtract the correct number of leap seconds when coordinating between UTC and GPS timestamps.
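In code that coordination is a constant offset, but the constant is a lookup rather than a formula, because leap seconds are announced only a few months in advance. A sketch under two assumptions: the receiver reports whole seconds since the GPS epoch (officially January 6, 1980, although the documentation quoted above says January 1), and the GPS-to-UTC offset is the 13 seconds cited earlier in this thread, current as of early 2004.

    #include <time.h>

    /* Leap seconds accumulated since the GPS epoch: 13 as of early
       2004. This must be updated by hand whenever a new leap second
       is announced; it cannot be computed in advance. */
    #define GPS_MINUS_UTC_SECONDS 13

    /* Unix time of the GPS epoch, January 6, 1980 00:00:00 UTC. */
    #define GPS_EPOCH_AS_UNIX_TIME 315964800L

    /* GPS time ticks uniformly with no leap seconds, so it runs ahead
       of UTC; subtracting the accumulated leap seconds yields the
       UTC-labeled Unix time. */
    time_t UnixTimeFromGpsSeconds(long gpsSeconds)
    {
        return GPS_EPOCH_AS_UNIX_TIME + gpsSeconds - GPS_MINUS_UTC_SECONDS;
    }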
2/27/2004 11:30 AM David Kafrissen:
> UTC is actually defined as being .9 seconds
> within GMT time.
No. I don’t recall the exact details of when UTC adds a leap second, but 0.9 seconds may well be the maximum that UTC is allowed to deviate from actual time, where “actual” depends on actual geophysical motion.
GMT does not have leap seconds. The number of seconds difference between GMT and UTC is always an integer. A few years ago it was 17.
To clarify the point I was making: Windows knows nothing about leap seconds. If you pass two arbitrary dates to SystemTimeToFileTime and compare the resulting FILETIMEs, you’ll find that no leap seconds are factored in. In addition, GetSystemTime will never return a wSecond value greater than 59.
Because there is no leap second handling, when January 1 rolls around the system clock will potentially be off by one (or more) seconds until it’s reset using an external time source that *is* leap-second-aware.
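A small program that demonstrates the point: convert two dates four years apart with SystemTimeToFileTime and take the difference, and the span comes out to an exact whole number of 86,400-second days, even though three real leap seconds fell inside it.

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        SYSTEMTIME st1 = { 0 }, st2 = { 0 };
        FILETIME ft1, ft2;
        ULARGE_INTEGER u1, u2;
        unsigned __int64 seconds;

        st1.wYear = 1995; st1.wMonth = 1; st1.wDay = 1;  /* Jan 1, 1995 */
        st2.wYear = 1999; st2.wMonth = 1; st2.wDay = 1;  /* Jan 1, 1999 */

        SystemTimeToFileTime(&st1, &ft1);
        SystemTimeToFileTime(&st2, &ft2);

        u1.LowPart = ft1.dwLowDateTime; u1.HighPart = ft1.dwHighDateTime;
        u2.LowPart = ft2.dwLowDateTime; u2.HighPart = ft2.dwHighDateTime;

        /* FILETIME counts 100-nanosecond ticks; divide down to seconds. */
        seconds = (u2.QuadPart - u1.QuadPart) / 10000000ULL;

        /* Prints 126230400, which is exactly 1461 days * 86400. The
           three leap seconds actually inserted in this span (end of
           1995, mid-1997, end of 1998) are nowhere to be seen. */
        printf("%I64u seconds, remainder mod 86400 = %I64u\n",
               seconds, seconds % 86400);
        return 0;
    }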
2/29/2004 9:15 PM Jordan Russell:
> Windows knows nothing about leap seconds.
OK, that is a clarification. Then Windows knows about GMT and time zones that are offset from GMT, but Windows does not know about UTC.
> In addition, GetSystemTime will never return
> a wSecond value greater than 59.
Very interesting! Now what happens if this happens: (1) Windows Time Service obtains an update from a network time server (some of which are operated by Microsoft) at a time which just happens to be a leap second, and then (2) an application calls GetSystemTime before the passage of another second. Does Windows check its knowledge of the current time and fake it so that the preceding system “second” or following system “second” is actually two seconds long? Or blue screen? Or what?
A post from another weblog that explains some of the differences between NTFS and FAT.