Advice to the Green Party of Canada

Suggestions For A Computing Plank In The Green Party’s Platform

Robert Pogson

2011-10-15

Abstract:

The Green Party already mentions computers in its platform: “Ensure that all schools with even a small inner city population staff an after-hours and weekend computer lab year-round to allow those students without internet access to complete their school assignments and develop their interests on their free time.” That recognizes something of the value of computers in our society, but it is only a small part; much more could be added about improving the quality of life with computers responsibly, taking into account and minimizing environmental and sustainability costs. It is recommended that, because of the energy and materials consumed in the production, distribution, use and disposal of computers, the Green Party recommend

  • use of small inexpensive computers,
  • thin clients,
  • ARM technology, and
  • Free Software.


Any organization adopting most of these recommendations will reduce the “Digital Divide”, cut energy consumption throughout the life-cycle of a computer, and minimize electronic waste. For example, an Android smart phone can be sold for as little as $100, provide most of the features some users need, consume about 1% of the electrical power of a “PC” in use, and contribute only about 1% of the e-waste on disposal. That is possible because of the advances in electronics made in the last decade and the use of Free Software, with no expensive licence and no built-in obsolescence. Even a conventional “PC” can have its life extended by a factor of two using Free Software, cutting e-waste in half. Thin-client technology is old technology brought up to date: a small computer with no fans or drives, or an old conventional “PC”, shows the pictures and sends the clicks to and from a new, powerful PC or server. That way multiple users share the powerful machine while using much less energy and material.


Environmental Costs of Computers

Many people treat computers like small appliances and do not give them any special consideration when it comes to environmental matters. Computers have several issues which make them much more of a burden to the environment than most other small appliances.

  1. Most computers in use in the world run the Microsoft Windows operating system on an Intel or AMD CPU. Microsoft makes most of its money from Windows PCs by selling the manufacturer a licence that costs about $50. That licence is an OEM (Original Equipment Manufacturer) licence which lives and dies with the computer. It is in Microsoft’s interest that a PC slows down and dies far sooner than necessary so that a new one will be bought in as little as 3 years. Microsoft encourages that by constantly supplying “updates” which supposedly make the machine more secure, immune to malware and bug-free. These have the side effect, desirable to Microsoft, that the computer ends up running more and more processes as it is used and becomes slower with use, to the point that the user will try to have it fixed at great expense or discard the old one and buy a new one. Intel and AMD also benefit from this cycle by selling ever more powerful CPUs to make the new PC run faster than the old. One can install GNU/Linux on an old PC and see that it runs twice as fast and does not slow down with use, which shows that this slowing down is quite unnecessary. The result is that a person blindly using Wintel PCs will discard the PC every 3 to 5 years, while a person using GNU/Linux can enjoy his PC for at least 8 years, or until the moving parts need replacement. In addition to this deliberate slowing of a PC, Microsoft also produces an operating system that is very fragile and prone to being invaded by viruses and other malware. These intruders rob the user of computing resources, often making the machines seem slower, and some malware sabotages the computer or steals valuable information, greatly reducing one’s quality of life. The cost of periodically removing malware is unknown to GNU/Linux users because that operating system is designed as a true multiuser system with security built in; Windows was conceived as a single-user system from the beginning and had no intrinsic security until XP SP2. In any event, the solution to the problem of a slow computer is not to discard it but to give it minimal work to do, showing pictures and sending clicks, something that ten-year-old computers can do well. That is how thin clients work.
  2. Beyond these entirely controllable issues, computers are much more complicated than most appliances and contain significant amounts of quite toxic materials, including heavy metals and plastics. While manufacturing a PC can be quite inexpensive, it is very expensive to recycle the intricate parts reliably, so much of each discarded computer becomes landfill.
  3. A computer uses a surprising amount of energy to manufacture and to ship to customers, and then uses electrical power throughout its life. Worse, most of the power a computer consumes is wasted as heat. In winter this can help heat a building, but in summer it may make a living space uncomfortable and require additional air-cooling. The typical desktop PC uses more than 100 watts of power for the case and 30 watts for the monitor. Notebook PCs use less, perhaps 50 watts, because they need to run from a battery. In most cases the CPU in the PC is idling all day long. Even when a person is using a PC, the pace of work is set more by the user and the storage devices than by the CPU, so the CPU idles and runs in short bursts to do tasks. This can be studied using monitoring software or by running multiple users simultaneously on a PC. With Windows that is rarely done, but GNU/Linux can run 30 users comfortably at the same time on a typical desktop PC with enough memory. The faster the CPU, the more power it uses and the more it wastes idling, much like a huge engine in a car. By various tricks one can slow down a CPU when it is not busy, but one still has the larger environmental costs and the purchase price of a larger-than-necessary CPU. The CPU power a user actually needs for most tasks is more like the tiny ARM CPU in a smart phone, costing $30 and using 1 watt, than the hair-dryer of a 64-bit quad-core CPU from Intel using 100 watts and costing $100 (and more for the larger power supply). A modern thin client uses a low-powered CPU like that in a smart phone to show pictures and receive clicks for a faster, more powerful computer. The total energy consumption can be very low. Just counting CPU power, 30 “PC”s might use 30 x 100 watts = 3 kW, enough to heat a small home. 30 thin clients might use 30 x 1 watt for the clients plus 100 watts for the powerful PC, a total of 130 watts, roughly 23 times less power. Thin clients, being fanless, can easily last 10 years, about twice the life of a typical desktop, further multiplying the energy and material savings over the lifetime. A typical PC might cost $300 to buy and a thin client $50, compounding the savings. (A rough worked comparison of these power figures follows this list.)
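
The arithmetic in item 3 can be laid out as a small worked example. The figures below are the round numbers assumed in this article (100 watts per desktop, 1 watt per thin client, one 100-watt shared server, 30 seats), not measurements; here is a minimal sketch in Python:

    # Rough power comparison for 30 seats, using the round figures assumed
    # above (not measurements).
    DESKTOP_WATTS = 100      # typical desktop "PC", excluding the monitor
    THIN_CLIENT_WATTS = 1    # small ARM-based thin client
    SERVER_WATTS = 100       # one shared terminal server
    SEATS = 30

    desktops_total = SEATS * DESKTOP_WATTS                 # 3000 W
    thin_total = SEATS * THIN_CLIENT_WATTS + SERVER_WATTS  # 130 W

    print(f"{SEATS} desktops: {desktops_total} W")
    print(f"{SEATS} thin clients + server: {thin_total} W")
    print(f"about {desktops_total / thin_total:.0f} times less power")  # ~23x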

The standard objections to changing the way computers are being used hinge on falsehoods:

  • small cheap computers are slower – This is not necessarily true. A thin client actually has very little work to do and is idling most of the time. Its server can be a very powerful machine with extra memory and storage, actually faster than a typical PC. In the school where I worked last year, we had some brand-new PCs come in that students felt were terribly slow. The reason? They were used to 8-year-old PCs used as thin clients, which could log them into the server in 5 s and load the word-processor application in less than 2 s. The new PCs took twice as long because they had to read the software from the hard drive for every click, whereas the server had multiple users already running applications in memory and the same software needed only one copy in memory to satisfy many users. It seems a magical improvement in performance until you understand how it works. Any place with multiple users can benefit from this technology, even a home with two users. In schools and offices it is incredibly efficient. The computer person has only one computer to maintain instead of many because the thin clients are as reliable as telephones and just keep humming. If you have GNU/Linux on that server, the fix-it person does not even need to maintain anti-malware software, another thing that makes “PC”s slower.
  • small cheap computers rely on servers to work and that’s a single point of failure – Not necessarily a problem. Google is a single point of failure but is more reliable than many PCs. The reason is that Google runs GNU/Linux on its servers and uses clusters of servers, so that if one fails the others take over instantly. Any school or business can arrange the same automatic fail-over for the cost of just one more machine. The network is often copper wire, which lasts 25 years or so, and network switches that fail can be replaced in minutes. The typical “PC” running Windows has a major failure annually and a bunch of smaller ones; it is usable about 99% of the time. The typical “PC” running GNU/Linux is usable about 99.9% of the time because of the better design of its software. By using a pair of machines as a cluster of terminal servers, one can get 99.9999% reliability with GNU/Linux and 99.99% reliability with Windows (see the sketch after this list). Use GNU/Linux. One of the major failures of Windows is malware. The “bad guys” who make malware produce more than 1,000 new pieces of malware every day, and Microsoft gives them a head start. Every month, Microsoft releases updates to fix serious vulnerabilities in its software, at a time convenient in its own timezone. If the update arrives in the middle of your workday you cannot apply it immediately. The “bad guys” can analyze the changes, design malware to exploit the vulnerability in a few hours and send it to you in an e-mail or on a website that you visit. Then you have no protection, and if the malware has not already done its harm, you can remove it a few days later when the anti-malware industry catches up. The result is that you are almost certain to have malware on your PCs, and you often have to reboot to apply the updates and deal with the malware. That reduces the reliability of Windows.
  • small cheap computers actually cost more – Even Microsoft says that and it’s a lie designed to suppress competing technology.

    see http://www.informationweek.com/news/51201703

    The licence for Free Software usually costs $0; the software itself is a free download. It’s easier to maintain than Windows because a programme called a package manager can update, install or remove all the software on your PC when you run GNU/Linux. Microsoft updates its own software, but the user has to update every non-Microsoft application individually. Schools that have adopted GNU/Linux find they can triple the number of PCs in the system without hiring any additional staff because GNU/Linux is so easy to maintain. Microsoft makes its false claim by comparing its prices with the prices of companies that include maintenance in a subscription fee, confusing the consumer with an apples-versus-oranges comparison. Microsoft’s licence costs a consumer $100 or more while GNU/Linux costs $0. You can find people who will charge you $1000 a year to maintain your PC, but would you do business with them? No. You would find someone more reasonable or do it yourself. I have installed GNU/Linux machines that run for years with no maintenance required. Microsoft has arranged with OEMs to hide the price of its licence in the purchase price of PCs so consumers cannot make this comparison. With a little digging you can plainly see that identical systems cost more with Windows: http://alturl.com/m4iqz
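
The reliability figures in the second bullet above follow from treating the two machines in a fail-over pair as independent: the pair is down only when both machines are down at once. A minimal sketch of that arithmetic in Python, using the single-machine availability figures assumed in the text:

    # Availability of a two-machine fail-over pair, assuming independent
    # failures: the pair is unavailable only when both machines are down.
    def pair_availability(single: float) -> float:
        return 1 - (1 - single) ** 2

    for label, single in [("Windows", 0.99), ("GNU/Linux", 0.999)]:
        print(f"{label}: single {single:.2%}, pair {pair_availability(single):.4%}")
    # Windows: single 99.00%, pair 99.9900%
    # GNU/Linux: single 99.90%, pair 99.9999%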


Real-World Demonstrations of Small Cheap Computers in Action

While it is fairly easy to set up a demonstration of better technology, it can be complicated by resources: space, equipment, test subjects, etc. The web, on the other hand, is full of examples with video:

  • The Home Education Festival in Dorset, UK, in 2005 demonstrated a tent full of old notebook PCs used as thin clients. One machine kept a room full of children happy.

    see http://www.youtube.com/watch?v=kDvqw-I2sjw
  • Setting up any PC to boot thin clients over the network so they don’t even need a hard drive.

    see http://youtu.be/7RimB4qKiz8
  • A new modern small cheap thin client computer with an ARM CPU is demonstrated at


GNU/Linux is an Example of Free Software

Free Software is an important concept in the use of IT in our society. Our society depends upon computers and software, and members of our society are quite capable of developing software that works well. By sharing that software, the world need not be limited by the fiscal plans of corporations making money from restricting how we use IT. Ideally, Free Software carries with it four permissions:

  1. Permission to run the software,
  2. Permission to examine the code of the software to know how it works and what it does in detail,
  3. Permission to modify the software to improve it or customize it for some purpose, and
  4. Permission to distribute the software and, under copyleft licences such as the GPL, the requirement to share any modifications that are distributed.

This structure comes from Richard Stallman, who founded the Free Software movement decades ago. He set out to create a complete set of software to make computers useful with few restrictions. By 1990, the GNU project had a complete environment for a UNIX-type operating system except for the kernel that manages the machine’s resources. Linus Torvalds started the Linux project, which by 1993 provided that last component. Today hundreds of GNU/Linux software distributions meet the needs of a hundred million users. The flexibility of this licensing makes it ideal for preserving the environment and maximizing the benefits of IT to people.

The value proposition for GNU/Linux is amazing. For $0 (the price of a download) anyone can install Free Software on a PC and be free of malware, slowing down, licensing fees and restrictions on use and copying. This seems unnatural in the commercial world but it is not. It is an example of sharing, something that every individual, family and organization does. It’s what makes us human. Some question the stability of a system based on Free Software, but it is very robust. People, particularly young people, love Free Software because it is a great aid in learning to program: one can see the work of experts. Since no licences need to be purchased, Free Software helps schools provide more software to students for all kinds of purposes far beyond word-processing: databases, web servers, file servers, and team-building activities, all of which are expensive add-ons with Windows. The Debian GNU/Linux distro at http://www.debian.org/ is the largest distro in terms of contributors. It has about 25,000 software packages being shared by people and organizations from around the world. This represents billions of dollars’ worth of software, and it is all designed to work optimally for the user, not the supplier.


The Green Party Should Promote Free Software

It’s not just about money or performance; it’s about every aspect of our society involved with IT. Free Software is consistent with the values of the Green Party. The very concept of sovereign government is threatened by a global monopoly in IT. Several governments have embraced Free Software as an important plank in their platforms, reaching for local control and local benefit from IT. Brazil, Venezuela, Paraguay, Russia, China, India, Malaysia, Cuba and others have all embraced Free Software for good reason. Many other countries include Free Software as an important tool: France, Germany, and Spain are prominent. In Canada several school divisions use GNU/Linux with great results. Free Software is ideal for promoting local production of IT rather than mere consumption of it. The Green Party should embrace Free Software.

Not only should the Green Party promote Free Software, it should also push for an end to the Microsoft and Intel monopoly on retail shelves. That monopoly was created by exclusive deals, initially between IBM and Microsoft, then extended to manufacturers to build only Windows computers and finally to retailers to sell only Windows computers. Monopoly is never the best option in commerce; the world can make its own IT better than any single corporation. Ironically, Microsoft pushed IBM out of the PC business by squeezing out value for Microsoft alone, and now IBM is one of the biggest promoters of GNU/Linux in large businesses. They realized they had created a monster and did something about it. The Green Party should see Microsoft for what it is and embrace Free Software in the best interests of Canadians, who should be free to use IT any way they want. In particular, Canadians should be able to buy small cheap computers, and they do when they are offered. Remember the netbook? Retailers could not keep GNU/Linux netbooks in stock until Microsoft pressured manufacturers to install Windows XP on them. Now very few netbooks are sold. Microsoft killed a viable, affordable form of IT suitable for many in our society and made it unavailable. That is unacceptable, and the Green Party should stand up for the freedom to use the best technology for the job. Android/Linux is Free Software used on smart phones and tablets. More than 50% of smart phones use it and it is found on retail shelves. Microsoft’s system is on a few percent of smart phones because consumers don’t want it and the makers of smart phones are mostly not Microsoft’s partners in monopoly. There is no technical reason to shy away from Free Software and many valid reasons to embrace it.

About Robert Pogson

I am a retired teacher in Canada. For almost forty years I worked and taught in my subject areas: maths, physics, chemistry and computers. I love hunting, fishing, and picking berries and mushrooms, too.

52 Responses to Advice to the Green Party of Canada

  1. oiaohm wrote, “I work rural, so I don’t always have the option to order items in to solve a problem that has to be solved today. If this means putting LibreOffice or Linux in place so work can be done until orders are placed, so be it. The most critical thing to me is that the job gets done.”

    There is a huge difference between how “city” folk and “country” folk operate. Non-free software often requires paperwork and payment that is just unproductive. It’s a waste. FLOSS permits one to be much more self-sufficient as long as there is a network. My wife thinks nothing of jumping in the car on the spur of the moment to buy something rather than thinking ahead, using an alternative or improvising, yet it is natural to me to do something myself rather than depend on others. In my recent purchases, I saved about $100 by choosing simpler alternatives and it cost me nothing to make the change. For the motherboard, I accepted fewer memory slots and no ECC support to save at least $50 for Beast, and I built one of the tools I needed for muzzle-loading from wood to save money and to be self-sufficient. My wife just spent $60 per lamp to change lamps that were just a few years old… It takes all kinds.

  2. oiaohm says:

    oldman
    (Network Manager, documentation Manager, System Administrator, Consultant)

    Problem is they are all valid job titles for work I do at different times. Just like fencing contractor is another valid title of mine.

    “You on the other hand seem more than willing to force fit any application request from your users into some form of Open source app (commercial or FOSS) on Linux, regardless of utility, function or feature.”

    Don’t you get it? I work rural, so I don’t always have the option to order items in to solve a problem that has to be solved today. If this means putting LibreOffice or Linux in place so work can be done until orders are placed, so be it. The most critical thing to me is that the job gets done.

    Basically 1 percent functionality is better than 0 percent functionality oldman. I guess you don’t work with the possibility of supplies taking ages to turn up. So you never have to make do.

    “Application performance tuning whether the application in question is running on windows or linux is a black art. My experience says that the registry fragmentation “problem” if it truly exists at all, is at best only a small portion of the problem.”

    The problem here is that performance tuning is not a black art at all. It is not a black art because it has testing: locating problems and curing them.

    There is such a thing as over-optimising, and that is where it becomes a black art. Curing slow registry lookups caused by fragmentation is not black art. The difference between a black-art optimisation and a non-black-art optimisation is that you can test whether the optimisation is required. So registry regeneration is testable: you can tell whether you have done anything that improves performance simply by testing how fast the registry can be read. If it is not faster, you have gained nothing.

    Registry regeneration does not remove a single key, so there is no possibility you have deleted something that will have to be replaced. Functionally, the regenerated hive should be exactly the same as the one it replaces.

    Slowing down with age in Windows comes down to a set of three common problems. They start small and go unnoticed, but they grow and get worse.

    The same goes for going through the services and checking whether they are truly required for the current role and are not leftovers. This is not black art.

    The same goes for sorting out too many log entries.

    Not one of the three core problems I have listed requires a black-art cure.

    Randomly deleting registry keys, as a lot of so-called reg cleaners do, is very much a black art, because there is no way to know whether the keys are still required.

    Now here is the big thing: the reason cleaning out the registry is a black art is that there is no tracking of what requires what. There is no package management.

    Linux is a completely different beast; we can do far more clean-ups because we have better data to work with.

    Oldman, have you ever bothered to take an image of a Windows machine that is running slow and systematically work out what alterations are required to bring it back to health, all of which can be tested for? The answer is no, right?

    The problem, oldman, is that for me the answer is yes. I have done this. I tracked it down to three sortable problems that cover the worst offenders in systems that have been abused by stacks of applications installed, removed and otherwise messed with.

    Would I be able to clean up more if Microsoft provided correct package management that tracked which registry keys belong to which application, so that uninstalling worked? Yes, I would. Is there any major effect from this form of bloat? My testing in fact says no, and the reason I can say that is that I intentionally inserted a stack of junk keys into a clean system.

    The most important thing was the order of keys in the registry, so that they load efficiently. Bloat does cause a minor effect, but at worst it is less than 0.001 percent in performance metrics.

    Fragmentation at its worst, and I do mean at worst, where I intentionally randomised the key order in the hives when I regenerated them, can in fact almost grind the system to a halt. People have no clue how often applications are in fact looking up the registry. So yes, it can for sure be one of the causes of the slowdowns people see.

    Yes, I truly have benchmarked this, including generating the worst possible hive that Windows could ever create, oldman.

    This is the problem: to me you are incompetent, oldman. You have never been taught the correct skill of locating optimisations. If an optimisation is real you should be able to go in both directions just by inverting what you are fixing, and when you invert the fixing method you should really harm the performance. That way you know you have located a real possible optimisation.

    Maybe, oldman, I am a “nasty thuggish knowitall” because I am completely sick of people who say the problem does not exist and lack the skills to perform test cases to prove it one way or the other.

    Claiming optimisation is a black art when it should not be is another thing I get sick of. This shows pure incompetence.

    Voodoo optimisation is a term I use for optimisations that a person cannot prove and cannot provide information to produce a test case for. I have provided you with more than enough information to produce a test case if you used your brains, oldman, so that I can independently test that it works.

    I am about science, oldman. All results have to be replicable. This is why I use configuration management so much.

    The simple fact here is that you are being a nasty “nasty thuggish knowitall”, oldman, who really knows nothing about the topic and who wants to somehow discredit the person who truly does know the topic. You have not tested what I have told you. You are just dismissing it out of hand, you incompetent bit of work.

  3. oldman wrote, “nasty thuggish knowitall.”

    I like that. It describes many on M$’s side.

  4. oldman says:

    Actually the real problem I have with you Mr. oiaohm is much more fundamental than all this geek detail.

    I don’t trust your word.

    You have poisoned the waters of your credibility with your Linux bias and your FOSS bigotry. The fact that you have presented yourself in various roles (Network Manager, documentation Manager, System Administrator, Consultant) makes you look like you are pumping up your importance. You come off as a nasty thuggish knowitall.

    You say that you know windows “too well”. So what!
    I know both linux and windows “too well”. On my less charitable days, I am ready to stuff both linux and windows into a culvert (to quote an old Jerry Pournelle line).

    The difference between us is that I have been trained by experience and circumstance to temper my biases and concentrate on the job at hand. I do not have the luxury, as you apparently do, to exercise my biases. I am more often than not required to design and specify to a predefined set of specifications for a set of applications that can run on windows, linux or a combination of both. So long as those applications conform or can be made to conform to our current institutional standards of security and supportability, I will support them.

    You on the other hand seem more than willing to force fit any application request from your users into some form of Open source app (commercial or FOSS) on Linux, regardless of utility, function or feature.

    Application performance tuning whether the application in question is running on windows or linux is a black art. My experience says that the registry fragmentation “problem” if it truly exists at all, is at best only a small portion of the problem.

  5. oiaohm says:

    Oldman, the issue is that most people just format their machine and start over when it gets slow.

    Basically the fault is in the design and programmer instruction documents for every version of Windows from NT 4.0 on.

    Oldman, you should have heard people complain that their computer is slower after running any of the so-called reg cleaner programs. That should tell you that those programs don’t work. You know they don’t work; did you never bother finding out why?

    The issue is partly program design as well. Say I want to change a registry value. I have two options.
    Option 1:
    1) check whether the key exists.
    2) if the key exists, modify the existing value.
    3) if the key does not exist, create the key.
    Option 2:
    1) delete the key.
    2) create a new key with the new value.
    Option 2 is lovely: exactly what you really don’t want to be doing to a hive file, because it turns the location where the key was into free space and can put the new key at the end of the registry and its index.

    Guess how many programs use option 2 because it is faster in the short term. If you read the Microsoft instructions to programmers on how you are meant to interface with the registry, you are never meant to do option 2. (A sketch of the two patterns follows below.)
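
    For readers who do not program against the registry, here is a minimal sketch of the two patterns using Python’s standard winreg module (Windows only). The key path and value are made up for illustration; only the difference between updating in place and delete-and-recreate matters here.

        # Contrast of the two update styles described above, using Python's
        # standard "winreg" module (Windows only). Key path and value are
        # invented for illustration.
        import winreg

        PATH = r"Software\ExampleApp"     # hypothetical key
        NAME, VALUE = "Setting", "new-value"

        def option1_update_in_place():
            # 1) check whether the key exists, 2) modify the value in place
            # if it does, 3) create it only if it does not; the key keeps
            # its existing slot in the hive.
            try:
                key = winreg.OpenKey(winreg.HKEY_CURRENT_USER, PATH, 0,
                                     winreg.KEY_SET_VALUE)
            except FileNotFoundError:
                key = winreg.CreateKey(winreg.HKEY_CURRENT_USER, PATH)
            winreg.SetValueEx(key, NAME, 0, winreg.REG_SZ, VALUE)
            winreg.CloseKey(key)

        def option2_delete_and_recreate():
            # Delete the whole key and recreate it; the old slot becomes
            # free space and the new key can land elsewhere in the hive,
            # the pattern blamed above for fragmenting hives.
            try:
                winreg.DeleteKey(winreg.HKEY_CURRENT_USER, PATH)
            except FileNotFoundError:
                pass
            key = winreg.CreateKey(winreg.HKEY_CURRENT_USER, PATH)
            winreg.SetValueEx(key, NAME, 0, winreg.REG_SZ, VALUE)
            winreg.CloseKey(key)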

    Part of the reason Windows computers slow down is bad coding.

    ERD-style compaction was added in XP SP3. The reason the registry in XP SP2 and before just expands and expands and expands is not exactly key bloat; it is free space in the registry not being reclaimed. Of course the computer was going to slow down, because the larger the registry file is, the longer it takes to search, but most of the size was free space, not keys. Yes, a lot of pointless regcleaning was going on with XP SP2 and before; it was not really reducing the size of the hives.

    The thing is, a lot of faults have been fixed. There is one left in the registry: keys getting into one hell of a messy order. The effect of this normally takes a few years of operation to start showing itself badly, if you have badly behaving programs, and most people will have at least a few badly behaving programs.

    The simple fact is that in Microsoft’s eyes the problem should not be happening if you are using quality software, oldman. So when you report it, you get told that it is software-related damage and so not their fault.

    All the keys an application is going to use should have been added in the install stage and removed in the uninstall stage, so they would all be in a small section of the registry, not spread out over it.

    But we all live in the real world and we don’t always get the means to use quality software. Also, some of Microsoft’s own coders don’t obey the coding instructions for the Windows registry, so the hives end up a mess.

    It’s not Google you need to be looking at. MS programming and internal design documents tell you clearly about this problem, and repeatedly advise you not to do particular things when interfacing with the registry.

    Think about what a reg cleaner is triggering. More often than not they trigger option 2, which no coder is meant to be doing, because programs are forced to replace the keys the reg cleaner incorrectly removed.

    One more fix and the slowdown from the registry would be fixed completely, oldman: some system to sort the registry after people have run bad applications or stupid registry cleaners that trigger programs to stuff up the registry.

    Also, Windows Update adding new keys can be a factor.

    Microsoft stopped making a regclean program because they found this, so it has been known for a fair while.

    Oldman, the fact that you have found nothing does not disprove its existence. You should have heard enough reports; the problem is you never investigated.

    The simple fact here, Oldman, is that blaming the user happens too much. “They installed crappy programs, they run crappy programs, so to fix the machine I reinstalled.” So no one bar people like me bothers finding out why in hell Windows is slowing down for particular users.

    Why does formatting become required to restore some computers to health when they have not been virus-infected? I went digging. I found. I fixed. I did report it.

    So yes, I do 100 percent back your statement that Windows 7 is better than XP, oldman. I am just honest about the fact that there is a fault left that MS could do better on.

    Yes, it is funny that the whole idea of registry key bloat is mostly a myth caused by free space being left in the registry of XP SP2 and before.

    Oldman, registry cleaners only exist because people find their computer running slow and attempt to use them to fix the performance problem. But anyone like me who monitors what those programs do sees that there are some that claim to sort out the registry when in fact all they are doing is making the hive files worse.

    Yes, in my eyes registry cleaners are the worst con job ever done. They promise to improve your computer’s performance when in fact their operation is ruining your computer’s performance.

    Be truthful, oldman: you have seen the effects of the issue. That would be why you stopped using registry cleaning programs. You called them bad but you never found out why. Let’s just say I like a few questions answered when I have strange problems happening: who, what, where, why and how. And I dig until I can answer all five.

    Are you a coder for Microsoft Windows applications, yes or no, oldman? Remember, I have been. I have done a lot of different jobs, so not just admin, not just coder. So I have read almost the complete cross-section of Microsoft documentation on Windows.

    My hatred of Windows is partly because I know it too well.

  6. oldman says:

    “Oldman, nothing I am talking about with the Windows slowdown issue is a myth. I know exactly where the problems are and, as you can see, given enough time I can remember how to fix them. Now, you being so much of a Windows admin, you should have been poking fun at me for not remembering how to fix it.”

    For the record, my windows system administrator days are behind me. I do however keep up my skills.

    All you’ve given is a 12-year-old article that describes a situation that “could” happen. You then assert the problem is not fixed.

    Again, where are your citations that this is confirmed as a known problem? I’ve been googling for such citations myself and all I find is a lot of self-serving hoo-hah from vendors who want to sell me their registry cleaners.

    Sorry Mr. oiaohm, but you’ve proved nothing more than that you know how to expend a lot of effort “fixing” a “problem” whose existence is still not proven.

  7. Compare that to the simplicity of using the file-system in GNU/Linux to keep track of configuration. What a fragile house of cards that other OS is. Thanks for the info. “Easy to use”, sheesh!

  8. oiaohm says:

    “Unless you can provide credible proof (MS technet postings) that acknowledges the alleged “bugs” that you talk about, I’m inclined to dismiss this as the kind of techno-geek urban legend that all too often passes for fact in PC tech.”

    oldman, there are no technet postings because it is written in the MS documentation about the internals of Windows that is on technet. Why would they duplicate it?

    http://technet.microsoft.com/en-us/library/cc750583.aspx
    “Finally, over time a Registry hive can fragment.”

    An openly admitted fact. I can pull out documentation showing this is also the case on Windows 7. Really, the wording should not be “can fragment” but “does fragment for most users”.

    I had forgotten the exact way to fix it without reinstalling: boot up with a PE disk, dump the hive to a .reg file and create a new hive from the .reg file. None of the commercial reg cleaners do this and you cannot do it from inside a running Windows. Yes, it would be really nice if you could tell Windows, “hey, the hives are a bit messed up, regenerate them on shutdown so I have clean hives for the next boot.”

    The process is horrible for a novice.

    I was remembering the “cannot fix” part while losing the detail that it just cannot be fixed from inside a running Windows, because the hives are in use, and it is past the skill of most novice users to fix.

    The instructions with ERD remove the free space, which improves performance, but the keys are still recorded in the hive in a messy order. So all of an application’s keys that sit in one folder in the registry may be spread all over the registry file, and when the application tries to access all its keys this can force more blocks in memory to be used to hold registry keys than would otherwise have been required.

    Remember, disk operations are the slowest operations you can be performing, so a fragmented registry on average requires more disk operations than a non-fragmented one, causing performance issues.

    The method I just detailed is basically how to do a full defrag of the internals of a registry. The ERD instructions are how to do a partial one.

    But that you, as a Windows tech, know nothing about internal registry fragmentation shows your incompetence, oldman; know your tools’ weaknesses.

    The three major causes of Windows slowdown can be managed. Again, it requires admitting they exist.

    Yes, the article I have linked to is the oldest article reporting it, oldman. It was first reported as an issue in Windows NT 4.0. It has not been fixed.

    Of course, what I have just shown you is that if you admit the Windows slowdown exists, you can find the instructions to repair it.

    A simple set of three. Windows is running slow? Do the following (a rough diagnostic sketch for step 2 follows this list).
    1) Clear out leftover services that are no longer required; they are most likely eating your RAM and CPU time and causing the performance issue you are suffering from.
    2) Clear the event logs, because they could be over-full, making writes to the event log dead slow and causing your performance problem.
    3) Regenerate the hive files to remove fragmentation (this is harder than it should be and too hard for novice users), as fragmentation could be causing your performance issue by making applications require more disk operations when accessing the registry for information.
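
    As a rough illustration of step 2, the sketch below (Python on Windows, driving the built-in wevtutil tool) lists the largest event logs. It assumes wevtutil’s per-log output includes a fileSize field; actually clearing a log is deliberately left to the reader.

        # Diagnostic sketch for step 2: report the event logs that have
        # grown largest, using the built-in wevtutil tool (Windows only).
        import subprocess

        def log_sizes():
            names = subprocess.run(["wevtutil", "el"], capture_output=True,
                                   text=True, check=True).stdout.splitlines()
            sizes = {}
            for name in (n.strip() for n in names if n.strip()):
                info = subprocess.run(["wevtutil", "gli", name],
                                      capture_output=True, text=True)
                for line in info.stdout.splitlines():
                    if line.strip().startswith("fileSize:"):
                        sizes[name] = int(line.split(":", 1)[1])
            return sizes

        for name, size in sorted(log_sizes().items(), key=lambda kv: -kv[1])[:10]:
            print(f"{size / 1e6:8.1f} MB  {name}")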

    After that, when you boot up, the machine will perform almost like a cleanly installed machine again. Of course you can still have some excess registry bloat in there costing a few extra seconds on boot-up.

    Claiming the fault does not exist means you cannot tell people how to fix it. Linux does have some long-term performance issues of its own, though not many: mostly running out of disk space because you never clear the logs and the package manager’s download cache.

    Housekeeping of an OS: every OS in existence requires housekeeping. Don’t perform it and you will pay. The problem here is that Windows users don’t want to admit there is a requirement to do housekeeping on Windows.

    By the way, number 1 of the clean-up process is also number 1 on the Linux housekeeping list when a Linux server is under-performing.

    Number 2 is also on the Linux housekeeping list: just the generic cleaning up of logs.

    There is a common “clear caches” step, which is normally number 4. Most people use BleachBit or something equivalent to perform this, i.e. people do remember to do that bit of housekeeping.

    Oldman, nothing I am talking about with the Windows slowdown issue is a myth. I know exactly where the problems are and, as you can see, given enough time I can remember how to fix them. Now, you being so much of a Windows admin, you should have been poking fun at me for not remembering how to fix it.

    You should have been doing that, not claiming that the fault does not exist.

    The registry index in Windows 7 internally uses roughly the same file format as the hives, with pointers along the file allowing for free space and fragmentation. Hives are like Windows FAT: they do not contain any fragmentation-avoidance code either, even in Windows 7 and 8.

    Registry and file-system issues are closely linked because of their related designs, apart from the lack of simple-to-use tools or automatic tools to deal correctly with the registry problem. Windows 7’s automatic file defrag is an automatic tool dealing with the file-system side of the problem.

  9. Here’s a guy who converted his small business to LTSP. The advantages for him were ease of maintenance, lower cost of operation, better performance, and cooler and quieter operation. He did a Munich-style conversion, migrating one person per month after his boss was satisfied.

    Here’s a healthcare business using LTSP to reduce costs. They have been using LTSP for a decade now and growing with it.

    Here’s a 200 user medical system using LTSP. It works.

  10. oldman says:

    “That’s not so. Many banks, financial institutions, libraries and the like find it quite useful. Those organizations are not into tech so they love that it just works and requires very little maintenance. ”

    Of course it’s so, pog. Again you are looking at the niche (I know LTSP made it into a few libraries); as far as banks are concerned, this is most likely a special case. LTSP is nowhere to be found in the general business computing population.

  11. oldman says:

    “IBM serves general business customers, not just schools and they are happy to provide expertise setting up offices using thin client systems very similar to LTSP.”

    And most of those installs are based on Citrix: technology that either connects to a Windows server running terminal services or, more recently, VDI.

    Powerful systems are not needed to interface with a VDI environment. We have tested them on all sizes of systems – management types especially like the fact that they can access a system running MS Office from their iPads.

    Yes Pog, thin clients are becoming important, but what is being implemented is going into VDI-based systems running on VMware, Citrix XenServer, or Microsoft Hyper-V.

    LTSP is a non-starter for these people.

  12. oldman says:

    “My opinion seems consistent with the market: in w3schools.com, Vista gets 5.6% of hits. In Wikipedia, it’s 11.23%. If it’s so wonderful, why are people not using it much compared to XP or “7″?”

    Irrelevant.

    Vista is now obsolete Pog. It is interesting to note that the “failed” OS has a bigger installed base than Linux on the desktop.

  13. My opinion seems consistent with the market: in w3schools.com, Vista gets 5.6% of hits. In Wikipedia, it’s 11.23%. If it’s so wonderful, why are people not using it much compared to XP or “7”?

  14. In my school, we did side-by-side tests of XP and GNU/Linux on 8-year-old PCs versus Vista on a brand-new 64-bit dual-core machine, and Vista was totally uncompetitive. People sued M$ over Vista, if you recall; they were that ticked off by it. Employees sued their employers over being forced to use it, and employers refused to pay employees for the time it spent booting.

  15. oldman wrote, “LTSP has zero traction outside of the educational niche that it started in”

    That’s not so. Many banks, financial institutions, libraries and the like find it quite useful. Those organizations are not into tech so they love that it just works and requires very little maintenance.

    While Extremadura did involve schools, the government offices got the same treatment. An LTSP server was dropped into each room and ran all the software for the department. They installed 4000 LTSP servers in one weekend and the workers started “cold” on Monday.

    see A Translation

    IBM serves general business customers, not just schools and they are happy to provide expertise setting up offices using thin client systems very similar to LTSP. Thin clients work for a wide range of situations, not just schools. VDI does not save power if powerful clients are involved. Many businesses are concerned about power, heating, cooling and maintenance. LTSP answers those issues very well.

    see http://desktops.cbronline.com/features/cbr_desktop_virtualisation_podcast_ibm_virtual_client_expert

    While VDI is discussed in that interview, simpler thin clients are also discussed.

  16. oldman says:

    “Vista was messing up stuff as simple as file-operations.”

    I ran Vista SP1 on a vintage 2004 Dell portable. File operations were fine. Speed (it had a 1.9 GHz Pentium M processor and 2 GB of RAM) was as good as, if not better than, XP.

    This is fact, which is better than your biased assumptions.

  17. oldman says:

    “Vista was messing up stuff as simple as file-operations.”

    I ran Vista SP1 on a vintage 2004 Dell portable.

    “M$ can sometimes get things right but my default assumption is that they will not”

    You are entitled to your views Pog, but this will not change the fact that, by my and most other people’s estimation, your default assumption says more about your biases than about what modern Microsoft OSes and products are.

  18. I don’t know about Vista or “7” but M$ is known to produce a hell of a lot of bugs and take a lot of time fixing them because the software is so damned complex. Vista was messing up stuff as simple as file-operations. It is not hard to believe they would mess up indexing.

    see The Long Goodbye

    Have you forgotten stuff like that oldman? Last year, I converted my images of XP from FAT32 to NTFS and the indexing did seem to work well. M$ can sometimes get things right but my default assumption is that they will not. At about the same time I defragged the horribly fragmented file-systems, and what should have been a trivial copying operation on the nearly empty 40 GB hard drives became a very long task.

  19. oldman says:

    “To be correct, in Vista and 7 the indexing is buggy, so yes, each change is a roll of the dice as to whether the indexing is going to be good or bad for performance, because the indexing order gets fragmented.”

    Unless you can provide credible proof (MS technet postings) that acknowledges the alleged “bugs” that you talk about, I’m inclined to dismiss this as the kind of techno-geek urban legend that all too often passes for fact in PC tech.

  20. oldman says:

    “These folks are usually forced to install a router and network which is more arcane. Once they have a network, connecting the boxes is trivial. The whole system behaves as one easy to manage GNU/Linux box.”

    What are you talking about Pog? The installation of internet service is usually set up by the cable provider these days. In this situation, the only tasks the home user needs to perform are setting the wireless access key (under the guidance of the installer) and making note of how to sign on to the router to change the wireless key.

    That’s it.

    Even in the case where I upgraded my wireless router, a CD-based installation process was provided that walked me through the installation so smoothly that I would have had no problems even if I wasn’t experienced.

    I’ve seen the canned Edubuntu installs of LTSP. They only make life simpler up to a point, and that point is nowhere near the level your average non-geek/gearhead home user expects.

    And this does not even begin to address the fact that one is stuck in the linux desktop ghetto using FOSS only.

    “LTSP is widely used in schools, offices, libraries, any place a bunch of clients are on the LAN.”

    LTSP has zero traction outside of the educational niche that it started in. And in any case, the possibility of its being more widely used is lessening as organizations move to more flexible VDI-based solutions.

  21. LTSP clients behind a terminal server may show up as a single client in web stats (a single IP address using the same UserAgent string), so those stats may undercount usage. LTSP is widely used in schools, offices, libraries, any place a bunch of clients are on the LAN.

  22. oiaohm wrote, “the ‘not practical for the typical home user’ point is a partly valid argument.”

    The typical home user does struggle with new ideas like IP addresses, but they do install routers, take the default network of 192.168.0.*, and it works for them. Distros like Ubuntu used to make installing an LTSP server a trivial click at installation time, provided the box had a second NIC. If you install LTSP using the package manager on a machine with one NIC it will still work; all you have to do is turn off the DHCP server on the router. It’s doable; it amounts to using the PC as part of the router. I made a video of it. That is more complicated than installing LTSP on a normal thick client, where the default networking would be correct and there is no need to edit dhcpd.conf. It takes half an hour even on a slowish machine.

  23. Contrarian says:

    “The whole system behaves as one easy to manage GNU/Linux box.”

    So many people are doing that today that Linux has risen to account for almost 0.6% of the traffic on popular websites. What’s next?

  24. Nope. These folks are usually forced to install a router and network which is more arcane. Once they have a network, connecting the boxes is trivial. The whole system behaves as one easy to manage GNU/Linux box.

  25. oldman says:

    “Many homes go from one to a few PCs for the growing family. Instead of buying 3 new PCs, buy 1 and get better performance for all for a lower price.”

    So individual users are going to go from multiple easy-to-use systems to becoming sysadmins of a single Linux box running an arcane terminal system. This is geek crap Pog.

    Get real!

  26. oiaohm says:

    Ivan, there is a problem: one LTSP server can handle quite a few clients.

    Some of the old dependable hardware has only 100-watt power supplies in it. Really, those machines don’t draw that much either, because they don’t have heat-generating video cards in them.

    Sometimes one LTSP server with 5 clients built out of the best of the bad old machines draws about as much as two full modern units.

    Of course you would be better off using ARM thin-terminal clients that pull 1 watt each and just crush the crap out of those old clunkers, but that is not always an option.

    If you are worried about the power bill, consider that 1-watt ARM machines are about 100 USD each for the unit, so you can buy about 5 for the price of one standard cheap PC. For the price of two computers you are seating 5 users and you are way ahead on power. (A rough worked example follows below.)
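
    To make that arithmetic concrete, here is a rough sketch in Python using round numbers assumed for illustration (about $100 and 1 watt per ARM client, about $500 and 100 watts for a cheap desktop used either as a seat or as the server); these are not real quotes.

        # Rough cost/power comparison for seating 5 users, using round
        # figures assumed for illustration (not real prices).
        ARM_CLIENT_COST, ARM_CLIENT_WATTS = 100, 1
        CHEAP_PC_COST, CHEAP_PC_WATTS = 500, 100
        SEATS = 5

        # Five ARM thin clients plus one cheap PC acting as the LTSP server.
        ltsp_cost = SEATS * ARM_CLIENT_COST + CHEAP_PC_COST     # $1000, about 2 PCs
        ltsp_watts = SEATS * ARM_CLIENT_WATTS + CHEAP_PC_WATTS  # 105 W

        # One cheap PC per user instead.
        pc_cost = SEATS * CHEAP_PC_COST                         # $2500
        pc_watts = SEATS * CHEAP_PC_WATTS                       # 500 W

        print(f"LTSP: ${ltsp_cost}, {ltsp_watts} W   vs   one PC each: ${pc_cost}, {pc_watts} W")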

    Ivan, the ‘not practical for the typical home user’ point is a partly valid argument.

    The power-usage argument is invalid, sorry Ivan. You should have taken note of what I said: the cooler they run, the better they last. Then think about what kinds of machines those would have been.

    Really, it would be a long debate if we went into the nitty-gritty of what the ideal computer, for power and everything else, would be for a home user.

  27. Many homes go from one to a few PCs for the growing family. Instead of buying 3 new PCs, buy 1 and get better performance for all for a lower price.

  28. Ivan says:

    “His new PC. He adds LTSP-server-standalone to it and he is in business.”

    So instead of recycling the old computer, just connect it to the new one. Increasing the electric bill.

    That is not very practical for the typical user.

  29. oiaohm says:

    oldman, when was that 8-year-old Dell last reformatted?

    PageDefrag defragments the registry files on the disk surface, not the internals of the registry. That kind of fragmentation was a bigger problem than internal registry fragmentation and hit XP, but it does not exist in Vista and 7.

    Note I am not talking about registry bloat. To be correct, running registry cleaners the way the current ones operate plays into the type of fragmentation I am talking about. To be correct, in Vista and 7 the indexing is buggy, so yes, each change is a roll of the dice as to whether the indexing is going to be good or bad for performance, because the indexing order gets fragmented. Also you have to wonder why they had to add indexing: the data in the store is still ending up all over the place inside the store.

    There is no true defrag tool on Windows for the registry hive internals. There really needs to be one; no third party makes one. Basically, that you think registry cleaners are what I am talking about shows your error.

    The fragmentation fault is one clear reason not to use registry cleaners, because registry bloat is less harmful than fragmentation inside the hives and indexes. I.e., if you remove bloat but create fragmentation inside the hives, you have gone backwards.

    “Are we talking about event manager entries here? Assuming that, the only time that I have ever seen a problem is when an application dumps so many entries into the event logs that they fill up. This is fixable via group policy, so is a non issue.”

    Exactly: the logs are not a major issue if people are aware of what they have to do. If your computer is slowing down, check that your event manager logs have not blown out. By default the event logs will grow and grow and grow until your system slows down. This is a change you have to make to Windows to maintain performance. Oldman, you and I do this; a lot of the people who complain about Windows being a slow snail don’t.

    Note I am not talking about disk fragmentation at all. On large hard drives running NTFS and Windows 7, running a defrag tool is usually not required because Windows automatically runs a disk defrag program, dealing with the file-system fragmentation.

  30. His new PC. He adds LTSP-server-standalone to it and he is in business.

  31. oldman says:

    “A few minutes spent reconfiguring a PC to be a thin client compared to the hour or longer it might take to go to a store to buy a new one is very efficient use of time. I recommend it.”

    And what, pray tell, does your average home user connect his thin client to?

  32. oldman says:

    “Is that an admission that XP has those problems?”

    It is an observation that you continue to talk about obsolete versions of Windows, and that if you want to talk, you need to look at what the currently shipping versions of Windows are capable of.

    Better still. My 8 year old 3ghz Dell with 4Gb of RAM shows none of these problems. That is my observation.

  33. Is that an admission that XP has those problems?

  34. oldman says:

    “Time the login to a usable desktop on Day One and a year later and tell us that. I have seen a PC running XP that took five minutes to respond to a click.”

    Talk to me about windows 7 and then we will go somewhere Pog.

  35. oldman wrote, “I have NEVER had to deal with any of the slowdowns on desktops that oiaohm details.”

    Get a watch oldman. Time the login to a usable desktop on Day One and a year later and tell us that. I have seen a PC running XP that took five minutes to respond to a click. Where I last worked there was an XP SP1 machine off-the-LAN that ran many fewer processes than XP SP3. It was 28 if I recall correctly. It was noticeably faster than the updated machines which ran 45-55 processes.

  36. Ivan wrote, “Do you really expect people that are working two jobs to make rent and buy food to take the time to turn an old computer into a thin client?”

    The whole job amounts to:
    1) Hit Del etc. on boot to get into BIOS Setup,
    2) change the boot options to include network booting/PXE as the first choice or default, and
    3) save changes and exit.

    I’ve done a lot of that. With practice it takes a little over a minute and perhaps another minute to test the result. If you do other things on the visit, like setting a BIOS password, it can take a few minutes per PC. Most older PCs can boot PXE in 30-45 s. Another option is to unplug the unused drives to save a bit of power.

    A few minutes spent reconfiguring a PC to be a thin client compared to the hour or longer it might take to go to a store to buy a new one is very efficient use of time. I recommend it.