Networked File Storage And That Other OS

Users of that other OS and Ed Bott are in anguish over M$ tweaking how “8” deals with synced files to/from the cloud. “I rely on being able to see all the files on my OneDrive through Explorer, whether they are synced locally or not; if this integration is lost there is no advantage to using OneDrive over any other cloud. Please add the option for power users to continue to see all files and use an icon overlay to show which are local & which cloud.” As a long-time user of GNU/Linux (UNIX-like OS) and NFS, I chuckle at this. If you have a huge file-system that won’t fit on the local client computer, just mount the networked file-system on the local file-system and you don’t need to worry about syncing. Just access the files with your normal tools… If you need greater security, use SSHFS.
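For the curious, here is a minimal sketch of what that looks like on a Debian-style system; the server name and paths are placeholders, not a real setup:

# mount an NFS export onto the local file-system
sudo mount -t nfs fileserver:/export/home /mnt/home
# or the same idea tunnelled over SSH for greater security
sshfs pogson@fileserver:/home/pogson /mnt/home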

The one sticking point is loss of network connectivity, but who has that problem these days anyway? The desktop/notebook PC is usually on a copper LAN with a connection to the Internet. The small cheap computer is usually only used when in range of a wifi or wireless network. Further, an NFS share mounted “hard” just keeps retrying after an interruption in connectivity, so a brief outage is no problem at all as long as the machines are not powered down. The “little woman” uses this all the time for our local cloud. See the Linux man page or read about SSHFS.
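For the record, a sketch of the /etc/fstab line that gives a “hard” NFS mount at boot; again, the server name and paths are placeholders:

fileserver:/export/home  /mnt/home  nfs  hard,_netdev  0  0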

I remember M$ “syncing” our files back in the day. I had a roaming profile with ~1 GB of stuff on my desktop… Yep. The stupid system tried to copy every file it could find to my local desktop wherever I roamed. I had to remember which PC I used in the lab or there would be a long wait. That kind of thing is just stupid as the number of files we own increases and the total size of storage reaches hundreds of GB or even TB. Syncing everything makes no sense, yet M$ keeps trying. They just don’t get small cheap mobile computers. They don’t understand their users.

So, you can choose to be jerked around on a chain by M$ or you can use what’s tried and true and unlikely to change any time soon. Use Debian GNU/Linux. It will work for you. There’s a reason the Internet runs on GNU/Linux. It’s a great networked OS that solved these problems decades ago. There’s no need at all to use an OS designed by salesmen. GNU/Linux will automatically cache a local copy of the files you use, so there’s no need for M$ or anyone else to guess which files to sync and clog up your network.
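The kernel does that caching in RAM automatically. If you want the cache to persist on the client’s disc as well, NFS can be paired with FS-Cache; a rough sketch on Debian, with the server name and mount point again being placeholders:

# userspace daemon for FS-Cache; on Debian, also set RUN=yes in /etc/default/cachefilesd
sudo apt-get install cachefilesd
# the fsc option tells the NFS client to use FS-Cache
sudo mount -t nfs -o fsc fileserver:/export/home /mnt/home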

See Testers protest abrupt changes in Windows 10's OneDrive sync.

About Robert Pogson

I am a retired teacher in Canada. I taught in the subject areas where I have worked for almost forty years: maths, physics, chemistry and computers. I love hunting, fishing, picking berries and mushrooms, too.
This entry was posted in technology. Bookmark the permalink.

22 Responses to Networked File Storage And That Other OS

  1. DrLoser says:

    Anyway, enough of oiaohm, the “useful idiot” on this site. Let us return to Robert’s original OP:

    Users of that other OS and Ed Bott are in anguish over M$ tweaking how “8” deals with synced files to/from the cloud.
    As a long-time user of GNU/Linux (UNIX-like OS) and NFS, I chuckle at this. If you have a huge file-system that won’t fit on the local client computer, just mount the networked file-system on the local file-system and you don’t need to worry about syncing.

    Thing is, Robert, NFS v4 doesn’t really provide instantly synched-up access to terabytes of data in Teh Cloudz, does it?

    Rather charmingly, you insist that it works perfectly well with your “local file-system,” and I agree, it does.

    Not really what the post was all about, was it, though?

  2. DrLoser says:

    I’m interested, however, oiaohm.

    Have you officially retired as a “Microsoft Value Added Reseller?”

    And if not, how many units have you shifted in the last year?

  3. DrLoser says:

    A “little hard” to turn off?

    What use would you propose for a Roaming Profile, oiaohm, if it didn’t include HKEY_CURRENT_USER?

    Go on, Walter. You’re the inventive sort of chap I like.

  4. DrLoser says:

Officially ntuser.dat is part of the roaming profile. It’s just the one file that always syncs from the roaming-profiles directory on the server. Also, just to be fun, ntuser.dat never really syncs; a new copy is always sent out. A messed-up ntuser.dat is the largest hazard of roaming profiles; it’s also the bit that is a little hard to turn off. Turning roaming profiles off only partly turns them off.

    Well, since this is about as close as you ever get to sounding like you know what you’re talking about, oiaohm, it’s only fair that I address this head-on and in your terms.

    So, let’s start with the copy of ntuser.dat on my work machine, shall we? I’ve had this machine for 12 months (indeed, as long as I have worked with the company).

    It’s a whopping great 15.5MB.

    Now, this is going to be a bit awkward to transfer between my company’s corporate network and my home machine, given that I only have a 57.6 Kb dial-up modem … no, wait, I don’t. I have access to a 2Gb/5Gb hub, although sadly only at the 2Gb level.

    Now, let’s subtract 50% of the bandwidth for VPN and sundry other considerations. Down to 1Gb. Let’s stipulate that my home computer is having a “bad hair day.” Whoops, there goes another 90%, so we’re down to 100Mb.

    Divide by eight and you get 12.5MB.

    Oh dear. I have to wait a full second and a quarter to get HKEY_CURRENT_USER synched onto my machine.

    I am distraught. Not.

    Now, should you be caught in the wilds, without those small cheap thingies that Robert relies upon like mobile phones, and you seriously need to rely upon a 57.6 Kb connection … well, you have other issues, obviously … but what you actually do is, you remote into the corporate net and get a virtual terminal at the other end which can copy ntuser.dat pretty much instantaneously.

    What with corporate networks no longer relying on Dixie cups and a length of twine.

    In other words, no problem. But nice try, oiaohm.

  5. DrLoser says:

Depends on the edition of Windows and the file system involved. With 9x going to an NT server with shares set to NTFS on the server, every file would copy. The reason: 100 percent sync failure due to a permission mismatch between the FAT that Windows 9x was using and NT. Yes, XP installed on FAT32 could also have the same issue.

    Your gibberish exhausts even my penchant for historical accuracy, oiaohm.

    May we just assume that, for some arbitrary period such as the last ten years, this has never once happened? And that therefore the rest of your googled information is completely worthless, since it depends on that non-happening?

    Note that I am not accepting your premise per se. I’m simply suggesting that there are more important things to worry about than whatever happened on an IT system ten years ago.

ShellShock, HeartBleed, two completely inoperative forks of OpenSSL, the fact that systemd is a declarative language that foxes experts in Pascal, that sort of thing.

  6. ram wrote, “The networked machines want to see a common time server (NTP server) or have to have their time clocks set pretty close or NFS can get “confused” in some instances.”

This is quite important. Fortunately, NTP is capable of synchronizing most PCs to within milliseconds. Occasionally a PC will have a really wonky clock with a rate that is very slow, very fast or even variable (are they using RC-oscillators???) and NTP takes a long time or never catches up. I have one PC that NTP couldn’t hold, so I just set up a cron job to run ntpdate every 10 minutes:
    # m h dom mon dow command
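    # step the clock every ten minutes (at minutes 1, 11, 21, 31, 41 and 51 of each hour)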
    11,21,31,41,51,1 * * * * /usr/sbin/ntpdate-debian

    It was brutal but effective, limiting the maximum deviation. NTP converges iteratively and just doesn’t work if the target is dancing about, no matter how good the peers are.

  7. ram says:

    The only problems I’ve had with Linux NFS are:

    1) The command syntax has changed between versions. The change was unnecessary.

    2) The networked machines want to see a common time server (NTP server) or have to have their time clocks set pretty close or NFS can get “confused” in some instances.

  8. oiaohm says:

    DrLoser
    Number One: Roaming profiles do not blindly copy every last megabyte. Can we agree on this? Roaming profiles sync up. There may be a cost in time, but it ain’t “gigabytes.”
Depends on the edition of Windows and the file system involved. With 9x going to an NT server with shares set to NTFS on the server, every file would copy. The reason: 100 percent sync failure due to a permission mismatch between the FAT that Windows 9x was using and NT. Yes, XP installed on FAT32 could also have the same issue.

Also, let’s take this example: I log into 8 different machines, then go back to the first machine. What happens? When you get back to the first, your profile could have been deleted.

    http://msdn.microsoft.com/en-us/library/cc786749%28v=ws.10%29.aspx
The feature comes in 2003: folder redirection. This enables a lot of the roaming profile to be redirected to a normal network share so that files are only synced to the client machine on usage. With NFS shares for users under Linux, files are likewise only sent to a machine if they are used.

The result is that you now cannot disconnect the Windows computer from the network, due to the same issue NFS suffers from: the user will complain that some of their files are missing.

This behavior is lighter on the network; yes, SMB under Windows supports an off-line cache. Roaming-profile sync in NT was designed before the off-line cache in Windows network file systems existed.

In a lot of ways, with the roaming-profiles idea you are better off using a normal sync program. Why? You get sync failures because a roaming-profile computer was disconnected from the network and the roaming profile mistakes an older file with newer contents for a newer file with older contents and destroys the newer data.

At least using a normal sync program you have control and the means to say, screw it, I don’t need to sync just at the moment.

    DrLoser
    2) It isn’t about ntuser.dat. It’s about the roaming profile.
Officially ntuser.dat is part of the roaming profile. It’s just the one file that always syncs from the roaming-profiles directory on the server. Also, just to be fun, ntuser.dat never really syncs; a new copy is always sent out. A messed-up ntuser.dat is the largest hazard of roaming profiles; it’s also the bit that is a little hard to turn off. Turning roaming profiles off only partly turns them off.

  9. DrLoser says:

Linux does have these behaviors but, due to the lack of a registry and the multi-file method chosen, the exploded size is not as harmful.

    Peachy. Just peachy.

    Any analysis of R-B trees to be cited here, oiaohm?

    I say, what larks!

  10. DrLoser says:

    And once again, there are several million Windows System Administrators out there in the wild.

    Only one man, with the wisdom accrued by a profitable career in Saudi Arabian Cyclotrons, knows better. Those several million might think they know what they’re doing … but the evidence suggests otherwise.

    Or then again, maybe not.

  11. DrLoser says:

    It was. The system would be chugging for 15 minutes before being useful. To minimize the waste of resources I had to sit in the same seat every time, something not always possible if another teacher was already there.

    See, this is the beauty of wandering around the Internet and taking three or four hours to teach yourself the basics, Robert. There is no sensible reason whatsoever why you would have been in that position, even three or four weeks into the job.

    Number One: Roaming profiles do not blindly copy every last megabyte. Can we agree on this? Roaming profiles sync up. There may be a cost in time, but it ain’t “gigabytes.”
    Number Two: Any idiot can choose any given PC and go into the area of the “Roaming Profile” and delete whatever spurious glob of expensive disk space said idiot chooses.

    Naturally, Robert, you are not just “any idiot.” Which means you must have had a specific reason not to do that.

    I’d be interested to hear what that reason was.

  12. DrLoser says:

DrLoser, when you have 400-plus users, running a clean on every profile becomes hard. Please do not underestimate how big ntuser.dat can get.

    1) I believe the argument, such as it is, involves the roaming profile of a single user. That is an argument worth discussing. Suggesting that it is in any way affected by scaling up to some tiny little number like 400 is … not.
    2) It isn’t about ntuser.dat. It’s about the roaming profile.

    Other than that, oiaohm, you have demonstrated your usual mastery of the subject.

    Which is to say, none whatsoever.

  13. oiaohm says:

Linux does have these behaviors but, due to the lack of a registry and the multi-file method chosen, the exploded size is not as harmful.

  14. oiaohm says:

DrLoser, when you have 400-plus users, running a clean on every profile becomes hard. Please do not underestimate how big ntuser.dat can get. You can have roaming profiles off and still have a sync from hell.

    https://social.technet.microsoft.com/Forums/office/en-US/610dc1f2-b171-4b33-959b-d5cd1d833c35/900mb-of-profiles-stored-in-default-registry-hive?forum=outlook
This is the biggest I have ever seen reported: 1.97 GB in just ntuser.dat, not including its backup.

DrLoser, it is not just the roaming profile that is the problem. It’s the odd user whose ntuser.dat goes nuts. Remember, you need all students logged in before you can start teaching some things. So yes, one user with a slow login disrupts everyone else in the class at times.

So over 1 GB of downloads per login may have absolutely nothing to do with roaming-profile folders.

    The issue I have here even happens with Windows 8 and 10.

This is a really good example of A+B+C equals disaster:

Exchange + Outlook + messaging + third-party backup software = ntuser.dat exploded to an insane size, crippling logins. This is why I am not happy when people say we must have Outlook. Yes, Kontact with Kolab might make 2 GB per hard drive disappear, but it does not result in profiles so filled with junk that login takes forever.

Of course there are many other combinations that also make ntuser.dat under Windows explode to an unworkable size; in fact ntuser.dat can grow to a size where the user cannot even log in. A DoS attack against all users in a Windows network is possible just because you added one bit of software without a proper audit.

    Linux does have these behaviors.

  15. DrLoser wrote, “a gigabyte was “copied” every time you logged in to a different machine (it wasn’t, was it?)”

It was. The system would be chugging for 15 minutes before being useful. To minimize the waste of resources I had to sit in the same seat every time, something not always possible if another teacher was already there. I had .iso files and lots of packages on my desktop right out where I could see them. Even if I emptied my desktop or arranged for them to be stored elsewhere, all those other users were having the same problems with their documents, images and multimedia files. This is another example of the “MIPS-eating applications” M$ loved because they encouraged folks to buy new machines bearing new licences.

    DrLoser wrote, “I think that five minutes’ thought might have paid dividends.”

What 5 minutes? The problem was 50 users on 50 PCs, and those files were copied on multiple PCs. I knew the first time I logged into the lab what the problem was, but that was after a couple of months in the school, so every user had racked up tons on their desktops. Further, the syncing was imperfect because users would get tired of waiting and power off… I did delete one desktop unintentionally and “POOF!” one very angry lady was missing 200 digital images. I had my own little lab so I rarely ventured into the main lab or used other users’ PCs. I did not install GNU/Linux on any of the school’s new PCs, just the old Lose ’98 ones that were deprecated. I brought them out of storage and made them useful.

  16. DrLoser says:

    Ah, but back to Ed Bott and the Windows 10 CTP:

    In other words: “Sorry, we’re still working on this feature.”

    Ed Bott’s brutal yet reasonably accurate summary of the issue.

    Awful, isn’t it? If only M$ would listen to their users!

    Oh, wait. That’s precisely the point of a Community Technical Preview, isn’t it?

    And for your next trick, Robert, you will undoubtedly pick some long-forgotten issue with “Lose 98” out of your archives. It will no doubt be just as relevant.

  17. DrLoser says:

    It wasn’t that easy… Users all over the building had stored hundreds of files on their desktops as had I so the genie could not easily be put back in the bottle without individual backup/restore operations.

I still feel sorry for you, Robert, and I appreciate the extra detail. One small caveat: some bloke with a 25m underwater swimming certificate (MCSE, RHCE, whatever) is not a sufficient reason to blame an entire Operating System.

    That said, I still feel that the sort of “scientific enthusiasm and investigation” that your extensive experience leads you to might have been better used in this case.

    Leaving aside the relatively obvious fact that you are obfuscating things just a little when you lead us to believe that a gigabyte was “copied” every time you logged in to a different machine (it wasn’t, was it?), the obvious thing would have been to look at the Roaming Profile directory. Easily found. It’s in the environment variable %userprofile%.

    I just looked at mine on this machine, btw. For some unknown reason, Cygwin chooses to use this to store its download history — a whopping 352MB in my case.

    Is there any reason for me to broadcast this stuff? No, there isn’t. So, if the need arises (not likely on a home network, but the principle is the same), I’ll either delete or move the darned thing.

    See? Five minutes’ informed and self-educated thought, and the problem disappears!

    And had you chosen this path for the entire school (or school district, or whatever), you’d have made dozens, hundreds even, of people instantly happy!

    Instead of which your unilateral decision was to cause havoc in said school by demolishing everything they already had in place and replacing it with some shaky thin client solution.

    Now, that might very well have been the right decision, in the long run.

    In the short run, I think that five minutes’ thought might have paid dividends.

  18. DrLoser wrote, “You must have had a total dimwit set your Windows network up”.

Yep, a paid MCSE did it under contract and his work had to be undone over the year. I was not the guy and I did not have admin rights there until about halfway through the year. That’s what happens in a huge bureaucracy. That was the year XP SP2 was applied without permission from us in the middle of the night and all kinds of stuff stopped working… We had to re-image the whole lab to get it working again. By then I was already using GNU/Linux in my classroom and in other classrooms using the castoff Lose ’98 machines with Debian GNU/Linux.

    DrLoser also wrote, “Just a shame that you didn’t spend the half-hour or so doing your own sysadmin and making sure that your Windows system worked”

    It wasn’t that easy… Users all over the building had stored hundreds of files on their desktops as had I so the genie could not easily be put back in the bottle without individual backup/restore operations. It was a mess that took many days to fix, not just minutes. Then there were incompletely synced file-systems all over the building…

    DrLoser also wrote, “a roaming profile is almost unnoticeably smooth”.

Yeah, right. Here’s a discussion of the woe many people had in those days. At the same time, desktops worked just fine with GNU/Linux and NFS. My first big installation with Ubuntu GNU/Linux, 500 users on 120 seats, had zero problems with users and their file-systems and the system was snappy. Imagine 24 students going to a lab, logging in at different PCs than they used last time, and all those desktops having to be synced again with XP…

  19. DrLoser says:

I remember M$ “syncing” our files back in the day. I had a roaming profile with ~1 GB of stuff on my desktop… Yep. The stupid system tried to copy every file it could find to my local desktop wherever I roamed. I had to remember which PC I used in the lab or there would be a long wait. That kind of thing is just stupid…

    I absolutely agree, Robert: it’s spectacularly stupid. It’s far beyond the realms of stupidity I have ever heard from any other Windows installation, where a roaming profile is almost unnoticeably smooth.

    You must have had a total dimwit set your Windows network up, and I must say I feel sorry for you. Just a shame that you didn’t spend the half-hour or so doing your own sysadmin and making sure that your Windows system worked as efficiently as the billion-plus M$ desktops out there …

  20. Modular sunfish wrote, ” run your file manager, press ctrl-L and enter the URI for your server and account.”

    Thunar should be able to do that but, on Debian Jessie, I get this:
    thunar sftp://pogson@someserver/home/pogson/
    Thunar: Failed to open "sftp://pogson@someserver/home/pogson/": The specified location is not supported

    I’ve checked my dependencies and recommends and could not figure it out. Perhaps it’s just Debian being “careful” of networking protocols… Nope. Thunar depends on stuff that recommends stuff that … Anyway, I installed gvfs-backends and now Thunar works as described. Nice. I’ve been using SSH and SSHFS. It’s good to have options.
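    For anyone hitting the same “location is not supported” error, the whole fix here was one package (stock Debian Jessie repositories assumed):

    sudo apt-get install gvfs-backends

    After that the sftp:// location opens in Thunar just as described.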

I feel compelled to leave the first (well, second) comment.

    My first experience of NFS was a minor disaster. My first experience of GNU/Linux was a cluster of five old PCs in my classroom used as a centre where a small group of students could work more or less on their own while other students and I worked at other centres or at our desks. I installed GNU/Linux because Lose ’95 crashed every few hours just browsing/word-processing. I wanted to use NFS to set up file-sharing between the computers so I could set up assignments on one machine and have students access them from all machines. I never did get it working. I barely understood file-permissions… I ended up moving around between machines with a floppy disc, “sneaker net”.

A couple of years later, I was an actual computer teacher with a lab full of Lose ’98 boxes, one or another of which would crash every hour. It was intolerable but the rest of the school used that other OS so I couldn’t just pave it over without politicking. So, I used my PC as a terminal server and booted the other machines via PXE from my machine. Even though my machine was 1.8 GHz, single core, and 32-bit with ~1 GB of RAM, it worked quite well, amazingly well. The first time I booted up the lab and 30 students fired up the machines as thin clients (PXE loader on a CD), one actually fell off his chair. I was amazed that all of the RAM on my PC was used up but no swapping occurred. It was an impressive performance for GNU/Linux. The clients got their file-system via NFS. It worked perfectly. The clients booted in about the same time as that other OS from the hard drive, but the desktop was immediately usable instead of after ~2 minutes. The file-system was a chroot on my machine set up via LTSP and was a few hundred MB, of which only a few were used. The clients had 64 MB of RAM and did not use it all. We experimented with pulling the plug on the network connection to the switch. Screens froze. Plugging the cable back in brought everything to life. No one lost any data.
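    For anyone wanting to reproduce that kind of setup, the NFS side was just an export of the LTSP chroot; a rough sketch of the /etc/exports line, with the path and subnet as illustrations rather than the actual configuration used:

    # /etc/exports on the terminal server
    /opt/ltsp/i386  192.168.0.0/24(ro,no_root_squash,async,no_subtree_check)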

    Isn’t that what people want? Software that works? Why use stuff that doesn’t work from M$ when you can get the real thing for $0 and a bit of time to install/learn?

  22. Modular sunfish says:

    SSHFS is easy and secure but it gets even easier than that using the same underlying SFTP protocols. File managers now also support SFTP. So run your file manager, press ctrl-L and enter the URI for your server and account. It would be in this style with protocol, user+host and path: sftp://mr@pogson.example.com/home/mrpogson/Documents/
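    The same thing can be scripted from a terminal too; with the gvfs backends installed, something along these lines mounts the location for any GVFS-aware application (the URI is the same example address, not a real host):

    gvfs-mount sftp://mr@pogson.example.com/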
