General Purpose Computing

A recurrent theme seen here in comments and on the web is that the trends in IT are towards less general purpose computing and more specialization: tablets replacing PCs, cloud services replacing local applications, thin clients replacing thick clients, and locked-down systems replacing run-anything hardware.

oldman (one of our long-time commentators) is contemptuous of any PC that is not a fire-breathing monster with tons of RAM, drives and no-expense-spared software.

I recommend the least expensive hardware and software that will do the job, and I love a thin client connected to a powerful server. You have to have worked with a modern server to understand that they are amazing performers, but they have no place in proximity to humans. One of their least endearing characteristics is running a mess of fans at 5000 rpm on boot, as if to shake the dust off… Humans should not have fire-breathing dragons in their workspaces. Humans need quiet to think and to create.

Others see a threat to all that’s Holy in unknown software running on a remote server, as if that matters. If I have the right to run software of my choosing on my PC, the operator of a server has the same right. That’s right and proper, not a threat of any kind. I trust humanity to make the right choices when they load up a server with software. It’s their choice. I can still choose operators of servers and providers of services whose business models are acceptable, just as I choose hardware or software in my own operations. That’s not a threatening situation, just more of the same choices we have been making for decades.

The fact that some, like M$, would limit our choices for their enrichment is no more or less a threat than the lack of retail shelf space given to FLOSS in some places. We, the end users, have to choose services that work for us, not for the monopolist. That’s why I choose Google over M$ for everything. Google uses FLOSS widely and openly, contributes to FLOSS, and has given great service at a great price for a decade. I trust them. I don’t trust M$, which has shown the opposite characteristics for decades.

Using FLOSS on local machines and services from trusted organizations/individuals on the web is a bit more risky but a lot more efficient and should be embraced as an improvement in future IT. Having maintained servers, I know specialists can do a better job of it in many cases, especially in utilities like e-mail. With FLOSS, we have choices of software we can run on local PCs and servers and we have no less choice than anyone else with whom we deal over the network. We don’t know 100% that Joe Brockmeier is a good guy but we can still read his stuff and think about it. We do know that M$ is out to get us and that Google is out to get our business and competes on price/performance.

To view the future evolution of IT as a threat because more computing is non-local is to deny the obvious good the web has done over the last decades. FLOSS, itself, never really flourished until the web gave us ftp/web servers and global distros and repositories of FLOSS. The cloud is just another step in the same direction, better IT using networks. These tools can be used for good or evil. They are not a threat but some of the wielders are. We should be vigilant but still revel in the advancement of IT they represent.

General purpose computing is not going away, either. It’s just becoming a niche in an ever more diverse world of IT. The challenge is to use IT wisely, not to shun it. The vast majority of ordinary users of IT have rather limited needs, and cloud services can meet those needs better than the users could ever manage their own IT. That cloud computing will replace a huge chunk of M$’s monopoly with FLOSS and increased choice is a blessing, not a curse.

A wise woman once told me that I should “judge a tree by its fruit”. We should look at price/performance in particular. Clouds and smart thingies running FLOSS beat the ____ out of M$’s “general purpose computing” when it comes to price/performance, freedom, and all kinds of things that matter to the world. What’s most important, likely, is that M$ has no monopoly on the web, so ordinary people can choose what works best for them and know they have a choice. With a lot of “general purpose” computing today, they have little choice but to buy a PC running M$’s OS and they don’t even know how much it costs. Cloud computing costs about 1/10 as much as local computing, which is probably the most relevant factor for change and wide adoption.

Why should people pay an order of magnitude too much for IT? Paying less for more is not a threat but an opportunity. The only ones this situation should threaten are monopolists. FLOSSies should thrive in the cloud.

About Robert Pogson

I am a retired teacher in Canada. For almost forty years I worked and taught in maths, physics, chemistry and computers. I love hunting, fishing, picking berries and mushrooms, too.
This entry was posted in technology.

19 Responses to General Purpose Computing

  1. oiaohm says:

    Phenom, welcome to a serious channel. Robert Pogson, Kozmcrae and I have all seen NTFS do terrible things.

    Phenom, MS is killing the NTFS filesystem off, by the way. ReFS is going in its place. NTFS has been a lemon; it has been overdue for replacement for far too many years. Disappearing directories are a really fun stunt NTFS pulls, particularly when the victim is windows/system/config.

    Hopefully MS gets the basics right with ReFS. They are cutting features from the file-system to make ReFS.

    Never in Linux history has a newer file-system had fewer features than the old one once it was deployed to production. With Linux getting close, MS is forced to fix the problems they have been sweeping under the rug, hoping they would go away.

    I want to point out how the NTFS disappearing-directory problem works. The NTFS equivalent of the superblock does not contain a checksum. One of the first new features in ReFS is checksums on its equivalent of superblocks. This is a feature that has existed in every other major file-system in use since about 1998, other than NTFS and FAT. This is why the NTFS driver does not know when the MFT, its superblock equivalent, is damaged, and so does not automatically use the backup copy to repair the damage. Here is the bad part: unless the damage stops the boot (as when it hits the config directory holding the registry files), the damaged MFT gets written over the good copy, and the files disappear for good.
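
    To make the principle concrete, here is a minimal sketch (not NTFS’s or ReFS’s actual on-disk format, just the idea) of metadata kept as a checksummed primary plus a backup copy, so a reader detects corruption and falls back instead of trusting garbage:

    ```python
    import struct
    import zlib

    BLOCK_SIZE = 4096

    def write_metadata(disk: bytearray, offset: int, payload: bytes) -> None:
        """Store payload at offset, prefixed with a CRC32 over its contents."""
        record = struct.pack("<I", zlib.crc32(payload)) + payload
        disk[offset:offset + len(record)] = record

    def read_metadata(disk: bytearray, primary: int, backup: int, size: int) -> bytes:
        """Try the primary copy; if its checksum fails, fall back to the backup."""
        for offset in (primary, backup):
            stored_crc, = struct.unpack_from("<I", disk, offset)
            payload = bytes(disk[offset + 4:offset + 4 + size])
            if zlib.crc32(payload) == stored_crc:
                return payload
        raise IOError("both metadata copies corrupt; run a repair tool")

    # Simulate a disk holding primary and backup copies of the metadata.
    disk = bytearray(2 * BLOCK_SIZE)
    meta = b"root-directory-table-v1"
    write_metadata(disk, 0, meta)           # primary copy
    write_metadata(disk, BLOCK_SIZE, meta)  # backup copy

    disk[7] ^= 0xFF  # a power glitch flips a byte inside the primary copy

    # Without the checksum a driver would trust the corrupt primary copy;
    # with it, the backup is used and nothing silently disappears.
    assert read_metadata(disk, 0, BLOCK_SIZE, len(meta)) == meta
    ```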

    Yes, those windows users complaining that they lost their files: in some cases they did not misplace them. The file system ate them, due to poor design plus a power disruption.

    A power disruption should not be able to make old files that are not being written at that moment disappear completely, yet this is what can happen on NTFS. At worst, a power disruption should only nuke the files you have open and the data you have not yet saved. If anything else disappears, the file-system design or driver is defective. By that test, NTFS and FAT are both defective.

    So yes, I will recommend ReFS to windows users if all the alterations MS is promising get done. Better late than never.

    Data security is a stack of simple things, each costing a little CPU time, that keep failures from becoming major. Skip them, as NTFS does, and you risk major failure.

  2. Kozmcrae wrote, “With NTFS it would have meant a complete reinstall and the loss of any data that wasn’t backed up.”

    Amen. Apart from “excessive” (joke) malware, the number one reason to re-install that other OS is a messed-up file-system. NTFS is far superior to FAT but I have seen a lot of NTFS systems messed up as well. In ten years I have only re-installed GNU/Linux due to a messed-up file-system a few times, and most of those were due to my experimenting. One of the first production systems I installed had ext2, and a (student-induced) power-failure did mess things up. Since ext3 and other journalling file-systems came into use in GNU/Linux I cannot recall a single instance apart from hard-drive failure. In an effort to preserve that other OS for my last employer I converted XP SP1 FAT machines to XP SP3 NTFS and saw much-improved file-system durability, but the malware still took the cake. I don’t think many of the XP machines survived malware long enough to demonstrate corruption of the file-system. Multiple points of failure mask the true mechanisms…

  3. Kozmcrae says:

    “Yeah, yeah, Ohio, we know. NTFS is a crappy file system, because it cannot autorecover when your hard-drive catches fire during a vicious thunderstorm.”

    You have to go to extremes like that, Phenom, to compare NTFS to ext3 and ext4. I’m currently using a hard drive that was dropped four feet onto a slate floor. It had damaged sectors, so I used the GParted GUI to fix it. About 6 months later my screen froze solid. I booted with the GParted CD and fixed some more sectors. I rebooted and everything ran fine. The same thing happened a couple of months later. I backed up my data both times. I know I’m playing Russian Roulette with the hard drive, but since those two episodes I haven’t had any problems (over a year now).

    The point of all this is that my hard drive recovered after the loss of a bunch of sectors. With NTFS it would have meant a complete reinstall and the loss of any data that wasn’t backed up. Microsoft Windows is a Royal PITA, to be used at one’s own risk.

  4. Phenom says:

    Yeah, yeah, Ohio, we know. NTFS is a crappy file system, because it cannot autorecover when your hard-drive catches fire during a vicious thunderstorm.

  5. oiaohm says:

    Dr Loser, funny point: btrfs and zfs are both Oracle projects.

    ext2, ext3 and ext4 are basically all the same file-system, just like the many versions of NTFS.

    NTFS still lacks many of the extras in ext2 that resist fragmentation. ext3 adds journalling to resist crash damage. ext4 has background defragging and online defragging that the file-system driver can be ordered to do (“e4defrag – online defragmenter for ext4 filesystem”). On ext2 and ext3 you defrag by exploiting a tool called shake. Shake goes around looking for fragmented files; when it finds one, it rewrites it, causing the file-system driver to fix up the fragmentation. Yes, a truly horrible solution, but it works.

    I have my own version of the shake defragmenter for windows. It’s great for those 70-percent-full drives that the windows defragger says it cannot defrag. The NTFS driver can defrag them if the files are moved, but the windows defrag interface to the filesystem cannot, for some reason. I still have not worked out why; it seems particular patterns of data on disk cause the windows defrag system to fail. Shake a few files around and then it can magically defrag, even though the percentage of usage has not changed one bit. This happens on windows 7, vista, xp and 2000; NT 4 does not seem to suffer from it. Weird, strange windows bugs. So when you have a windows drive that says it cannot defrag, just move a few files and see what happens. A sketch of the shake idea follows.
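
    For the curious, here is a minimal sketch of the shake idea on Linux, assuming the real filefrag tool from e2fsprogs is installed (the fragment threshold and target directory are illustrative):

    ```python
    import os
    import shutil
    import subprocess
    import tempfile

    FRAGMENT_THRESHOLD = 8  # illustrative: "fragmented enough to bother with"

    def extent_count(path: str) -> int:
        """Ask filefrag (from e2fsprogs) how many extents a file occupies."""
        out = subprocess.run(["filefrag", path], capture_output=True,
                             text=True, check=True).stdout
        # filefrag prints lines like: "/path/file: 12 extents found"
        return int(out.rsplit(":", 1)[1].split()[0])

    def shake(path: str) -> None:
        """Rewrite a fragmented file so the driver allocates it afresh."""
        if extent_count(path) < FRAGMENT_THRESHOLD:
            return
        fd, tmp = tempfile.mkstemp(dir=os.path.dirname(os.path.abspath(path)))
        os.close(fd)
        shutil.copy2(path, tmp)  # a fresh copy gets fresh, hopefully contiguous, extents
        os.replace(tmp, path)    # same filesystem, so the swap is atomic

    for root, _dirs, files in os.walk("/home/example"):  # hypothetical target tree
        for name in files:
            shake(os.path.join(root, name))
    ```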

    NT did have a good idea: an API to tell the filesystem to defrag a file. The implementation of that is a little lacking at times.

    NTFS in Windows 7 can still lock out normal IO to the drive while it is defragging through that API. This really does need fixing. The ext defragging API is designed not to block disk IO from applications, so with ext4 I can defrag the full filesystem while the system is under load without anyone noticing. Hopefully MS decides this is worth the effort of fixing in Windows 8.
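
    For comparison, the ext4 route is a single tool run against the live mount; a sketch assuming e2fsprogs’ e4defrag is installed and /home is an ext4 filesystem:

    ```python
    import subprocess

    # Defragment a mounted ext4 filesystem in place. Applications keep doing
    # IO meanwhile, since the underlying ioctl moves extents file by file
    # rather than locking the whole drive.
    subprocess.run(["e4defrag", "-v", "/home"], check=True)
    ```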

    That comes of you not knowing your file systems: most Linux people are running at least ext3, and a lot are running ext4.

    btrfs joins the long list of in-place upgrades: ext2 -> ext3 -> ext4 -> btrfs will be possible with no reformatting of partitions.

    Also, ext4 has not finished its development. ext4 should get snapshotting soon, basically the equal of windows volume shadow copy without needing to use LVM. Then there will be real motivation to move from ext3 to ext4.
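
    Until then, snapshotting on ext3/ext4 means going through LVM. A sketch of that workflow, assuming root privileges and a volume group vg0 holding a logical volume root with spare extents for the snapshot (all names illustrative):

    ```python
    import subprocess

    def run(*cmd: str) -> None:
        """Run one step of the snapshot-backup workflow, failing loudly."""
        subprocess.run(cmd, check=True)

    # Take a copy-on-write snapshot of the live volume.
    run("lvcreate", "--snapshot", "--size", "1G",
        "--name", "root_snap", "/dev/vg0/root")

    # Mount it read-only and back it up while the system keeps running.
    run("mount", "-o", "ro", "/dev/vg0/root_snap", "/mnt/snap")
    run("tar", "-czf", "/backup/root.tar.gz", "-C", "/mnt/snap", ".")

    # Tear the snapshot down once the backup is safely written.
    run("umount", "/mnt/snap")
    run("lvremove", "--force", "/dev/vg0/root_snap")
    ```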

    Dr Loser
    “After that, it’s down to the same advice on surfing the Net that you would give out on any OS.”
    For Linux and Unix you alter that a little: run a HIDS, because any virus hitting Linux will most likely be new and have no anti-virus signature yet. Running a HIDS on a windows network never hurts either, for the same reason: viruses too new to have signatures.
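
    The heart of a HIDS is file-integrity checking, the job tools like AIDE and Tripwire do. A minimal sketch of the idea (the baseline path and watched directories are illustrative, and a real deployment keeps the baseline where an intruder cannot rewrite it):

    ```python
    import hashlib
    import json
    import os
    import sys

    BASELINE = "/var/lib/simple-hids/baseline.json"  # illustrative location
    WATCHED = ["/bin", "/sbin", "/etc"]

    def snapshot() -> dict:
        """Map every watched file to the SHA-256 of its contents."""
        digests = {}
        for top in WATCHED:
            for root, _dirs, files in os.walk(top):
                for name in files:
                    path = os.path.join(root, name)
                    try:
                        with open(path, "rb") as f:
                            digests[path] = hashlib.sha256(f.read()).hexdigest()
                    except OSError:
                        pass  # unreadable or vanished; skip it
        return digests

    if sys.argv[1:] == ["init"]:
        os.makedirs(os.path.dirname(BASELINE), exist_ok=True)
        with open(BASELINE, "w") as f:
            json.dump(snapshot(), f)
    else:
        with open(BASELINE) as f:
            baseline = json.load(f)
        current = snapshot()
        for path in sorted(set(baseline) | set(current)):
            if baseline.get(path) != current.get(path):
                print("CHANGED:", path)  # added, removed, or modified file
    ```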

  6. Dr Loser says:

    @Robert:

    I’d trust you to administer a Windows network ahead of “MCSEs” any day of the week; talk about a worthless piece of paper. If all they did was Ghost images back and forth, then they weren’t much in the way of Certified Engineers, were they?

    Mind you, I wouldn’t trust you (in this particular respect) further than I could throw you. You seem to be extraordinarily accident-prone when it comes to “perfect hardware” running XP.

    Should you ever need to advise somebody with an XP machine on how to keep malware out, btw, it’s trivially easy. (1) Make sure your service packs are up to date (basically, Patch Tuesday takes care of this for you). (2) Make sure your user account is separate from the Administrator account. (3) Run a free anti-virus system (I recommend Avast).
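
    Point (2) can even be scripted using the net commands built into XP; a minimal sketch, wrapped in Python, with the account name and password as placeholders you should change:

    ```python
    import subprocess

    USER = "dailyuser"        # placeholder account name
    PASSWORD = "S3parate!Pw"  # placeholder; choose your own

    def net(*args: str) -> int:
        """Run one of Windows' built-in 'net' account commands."""
        return subprocess.run(["net", *args]).returncode

    # From the Administrator account, create a separate day-to-day user...
    net("user", USER, PASSWORD, "/add")

    # ...and make sure it is a plain User, not an Administrator. These two
    # fail harmlessly if the membership is already the way we want it.
    net("localgroup", "Users", USER, "/add")
    net("localgroup", "Administrators", USER, "/delete")
    ```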

    After that, it’s down to the same advice on surfing the Net that you would give out on any OS.

    This non-robust/fragmenting file system, btw? It hasn’t been the case since NTFS, which is essentially since XP. But Linux on the desktop certainly does a better job of this: ext2, ext3, ext4, ReiserFS, the list goes on and on … all of them perfect, all of them non-fragmenting, and none of them giving you a single reason to move on to the next one. Something of a miracle, isn’t it?

    To be fair, I think the recent crop of Linux file systems has finally blazed through the quality bar.

    I don’t suppose you’d endorse ZFS, what with it being an Oracle trademark and all, but BTRFS seems reasonable, and there are probably one or two more I’ve forgotten (which one does IBM support?).

  7. Dr Loser wrote, “You must not have tried very hard since, say, 2000, then.”

    Certainly I tried my best in several institutions to keep that other OS going. In places where the Internet connection was extremely feeble, it was possible because there was less exposure to malware. In other places half the machines were not running well with XP but ran like rockets with GNU/Linux. Even with clean, deloused images, I could not keep XP going at my last post. Some machines had been re-imaged several times before I gave up and installed GNU/Linux. It ran trouble-free.

    M$ has always had trouble making robust filesystems. They used to fragment, and the OS used to write all over the drive, messing things up on power failures. I have seen hundreds of XP systems that would not boot. Last time I checked, XP was still the lingua franca of IT in schools. I worked at one place that MCSEs visited periodically, and they did not have enough hours in their day to expand our network while keeping drive-images up to date using Ghost. At another place, re-imaging was needed on dozens of machines while I worked there. The hardware was fine but XP just kept failing.

  8. Dr Loser says:

    @Robert:

    “For a while that may appear to be so, but I have never seen that other OS continue to run trouble-free for very long.”

    You must not have tried very hard since, say, 2000, then.

    “Malware and other problems arise like failure to boot.”

    Try setting up a Windows account with an Administrator and a User. (You know, like rwxrwxrwx+, but better.) No malware.

    Failure to boot? Really? Evidence, please. And I suppose a reinstall would be out of the question … unlike the average Linux distro, which requires you to hide the important bits like /home and /sys/usr and so on before a reinstall every six months.

    “MacOS used to be no better until they got BSD underneath.”

    And here we hear the Coming of the Lord …

    “Now it’s more or less decent although I detest the GUI of MacOS.”

    Magical powers of That Other Not Quite Linux-Compliant Unix System, But With Better Video And Audio Support.

    Shame about the lack of Gnome 3.x arguments, though, isn’t it? It’s almost as though people who own Macs have better things to do with their lives.

    “Still, the software needs to be upgraded to prevent malware taking over.”

    And there, oddly enough, and having considered the evidence over at TMR, I am in total agreement with you.

  9. Dr Loser says:

    No, Koz, I don’t.

    To (mis)quote the estimable Lily Tomlin:

    “Loon, meet Reality Check.

    “Reality Check, meet Loon.”

    You have absolutely no sense of proportion or even humour whatsoever, do you?

    Let me be painfully blunt here.

    Not a single purchaser of a PC cares how much a capacitor costs, nor how many the PC contains.

    Nor, and I didn’t mention this, the half-life of the capacitors involved. Which is actually more important than most other hardware-based decisions.

    And as a corollary:

    99% of purchasers of a PC do not care how much the OS costs.

    And the other 1% shop around in any case, and probably get a lousy deal. I’ve done it myself, and I can’t quite see the point of wasting my time again.

  10. Kozmcrae says:

    “I’ll bet you they don’t know how many capacitors there are in their computer.”

    But of course you do, Dr. Loser.

  11. Dr Loser wrote, “if you own a Mac or a Windows PC, you don’t have to do any sort of a job at all. It just sits there and works.”

    For a while that may appear to be so, but I have never seen that other OS continue to run trouble-free for very long. Malware and other problems arise like failure to boot. MacOS used to be no better until they got BSD underneath. Now it’s more or less decent although I detest the GUI of MacOS. Still, the software needs to be upgraded to prevent malware taking over.

  12. Dr Loser wrote, “It costs precisely what they paid for it.”

    Very few retail buyers see the price of that other OS. It’s bundled with the hardware. In some places in the world that other OS competes on identical hardware side by side so the price difference is known but not around where I live.

  13. Dr Loser wrote, “When you type these blog posts up, what device do you use?”

    A Fujitsu keyboard, built like a tank about 7 years ago. It is sometimes connected to an NTAVO thin client, sometimes to a GNU/Linux terminal server and sometimes to various notebook PCs scattered around my home.

  14. Dr Loser says:

    And a totally disinterested question, Robert:

    “A recurrent theme seen here in comments and on the web is that the trends in IT are towards less general purpose computing and more specialization.”

    When you type these blog posts up, what device do you use?

  15. Dr Loser says:

    @Robert:

    ‘A wise woman once told me that I should “judge a tree by its fruit”.’

    Well, mothers say that stuff. They don’t expect you to stay at the mental age of ten forever.

  16. Dr Loser says:

    “With a lot of ‘general purpose’ computing today, they have little choice but to buy a PC running M$’s OS and they don’t even know how much it costs.”

    I suspect you denigrate the unwashed masses unfairly on this point, Robert. They know precisely what it costs. It costs precisely what they paid for it.

    (No hidden subscriptions and bandwidth hikes and so on, as with the Magical Cloud, btw.)

    Oh, you meant how much the OS cost?

    Well, there you would most certainly be correct.

    The poor fools don’t know how much the CPU cost, either. Or the hard drive. Or the motherboard. Or the monitor. Or even the damn case.

    You know what? Even if you asked them nicely, and gave them huge hints, I’ll bet you they don’t know how many capacitors there are in their computer.

    And this is a really bad thing. There might be one, there might be a million, and it is important to know … because you might be getting ripped off on capacitors, you know. Capacitors aren’t free.

    This 1/10th the cost: your usual source of bogus statistics, I assume. Take my anecdotal example. I bought this wretched pile of junk for £99 about five years ago, when my last computer blew up. It came with XP pre-installed.

    On a constant, but non-amortized, basis, therefore, it has cost me roughly £19.80 per year.

    Are you saying that Teh Cloudz will offer me the same service at (an equally constant, but non-amortized basis of) £1.99 per year?

    Do I get free moose jerky with that?

  17. Dr Loser says:

    @Robert:

    “Having maintained servers, I know specialists can do a better job of it in many cases, especially in utilities like e-mail.”

    In general, if you own a Mac or a Windows PC, you don’t have to do any sort of a job at all. It just sits there and works.

    I’d love to know how email compares to any other possible application at all. Remind me, Robert, of the last time anybody proposed running an email server on their own fat PC…

  18. Dr Loser says:

    @Robert:

    “oldman (one of our long-time commentators) is contemptuous of any PC that is not a fire-breathing monster with tons of RAM, drives and no-expense-spared software.”

    Really? I’m loath to suggest that you are just imagining this, Mr Pogson, so I’m going to trust your fertile intuition on this one.

    What is your “Beast,” then? A cuddly little equivalent of a marmoset?

  19. Clarence Moon says:

    You are being “penny-wise but pound foolish”, I think, Mr. Pogson.
