1935

That’s the title, not an error message. It’s like Orwell’s 1984.

Users of that other OS are getting 1935 as an error message, and it means that other OS broke itself again and must be re-installed. It’s fairly common for that other OS to break itself, although I’ve never seen it with GNU/Linux. I have re-installed GNU/Linux a couple of times when I broke the system doing risky brain surgery.

When an OS breaks itself while users are doing perfectly normal things, you should not rely on that OS for your IT. Use Free/Libre Open Source Software. It works for you, not against you.

About Robert Pogson

I am a retired teacher in Canada. I taught in the subject areas where I have worked for almost forty years: maths, physics, chemistry and computers. I love hunting, fishing, picking berries and mushrooms, too.
This entry was posted in technology. Bookmark the permalink.

54 Responses to 1935

  1. twitter says:

    I may have glossed over the details when talking about backups, but I’m surely aware of how easy it is to move and duplicate gnu/linux images. As Pogson noted, recent hardware-dependent changes have made moving things a little more difficult. The use of UUIDs in grub2 and networking is a pain that has to be worked around with scripts that are easy enough to write. Compensating for that are the tremendous gains in auto-configuration that have been made. As Ohm notes, you can take default installs and move x86 drives with no further effort to just about any x86 computer. Other huge wins for gnu/linux are file system and network efficiency, which make backups both trivial and less necessary. Windows remains a bad joke in all of these areas unless you virtualize it and run images on gnu/linux.

  2. Nonsense. It’s a GUI and lots are using XP or 2003 versions, not even 2007. Experience with either of those will be just as useless for 2007+.

  3. Contrarian says:

    and when two people show up to interview and one is familiar with MS Office and the other is not, due to the lack of conventional experience, the former is hired and the latter remains on the dole.

  4. oldman wrote, “what counts is that in an institution where IT is regularly and permanently shortchanged.”

    The school at Easterville has the IT system other schools envy. It’s fast, resourceful (servers actually do useful work like serving databases and accumulating and indexing information instead of merely tracking passwords), and flexible. They have tons of software available without needing any non-Free stuff. The curriculum spells out the objectives and enriching software businesses is not one of the objectives. Most of the schools in the region can only afford 1 PC per classroom and 1 computer lab. Those guys have a cluster of PCs in every room, an abundance of printers and a lab all for less money per PC than they could have with that other OS. By the numbers they are far ahead. 2 PCs with any OS are better for education than 1 PC with that other OS.

  5. oldman says:

    “My students had the fastest PCs in the building even though they were 8 years old. Beat the hell out of XP on the same machines or that other OS on a new machine.”

    In the end, what counts is that in an institution where IT is regularly and permanently shortchanged, a solution was found that allows learning to take place.

    I would have done it differently, and I definitely would have worked to preserve their option to use commercial software if needed, but in the end I can’t disagree with the results you got.

  6. Nonsense. Students learn a lot in my labs: hardware, software, maintenance, installation, programming, graphics, audio/video,…

    The moves may pinch pennies but also optimize performance. My students had the fastest PCs in the building even though they were 8 years old. Beat the hell out of XP on the same machines or that other OS on a new machine.

  7. Contrarian says:

    “Consider a single computer lab, say with 24 PCs”

    It seems wrong to even suggest that you have a goal to maximize the productivity of the computers in such a lab. Computer labs are created so students can learn to use computers and popular software, at their own pace. Giving them some sort of terminal connected to obscure FOSS applications on a server instead is not what the taxpayer ordered, I think. The students are not equipped to use any such experience in the real world and so you do them all a disservice with such penny pinching moves.

  8. Scripting overcomes that partially, and it is sometimes necessary in GNU/Linux. I imaged a system and found that the system had recorded NIC MACs, so interfaces were renamed on the target machines. The name of the machine also needs to be changed; that can be done with DHCP. Other than that, GNU/Linux gets a big +1 for portability of images. I have moved the same image onto 3 quite different systems with GNU/Linux when I would have needed a separate image for each with that other OS. Normally that is not a problem, but the particular server I was using for images had only 40 GB of storage and I could not retain many backups.
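    The cleanup scripting mentioned above can be sketched as follows, run here against a scratch directory standing in for the cloned image's mounted root. The udev rules path is the Debian-era convention and may differ by distribution.

```shell
set -e
ROOT="$(mktemp -d)"                  # stand-in for the clone's mount point
mkdir -p "$ROOT/etc/udev/rules.d"
printf 'SUBSYSTEM=="net", ATTR{address}=="00:11:22:33:44:55", NAME="eth0"\n' \
    > "$ROOT/etc/udev/rules.d/70-persistent-net.rules"
printf 'oldname\n' > "$ROOT/etc/hostname"

# 1. Drop the recorded NIC MACs so interfaces start again at eth0 on the
#    target machine instead of being renamed to eth2, eth3, ...
rm -f "$ROOT/etc/udev/rules.d/70-persistent-net.rules"

# 2. Blank the baked-in hostname so a DHCP-supplied name can win on boot.
: > "$ROOT/etc/hostname"
```

    In real use the script would run against the image's mount point rather than a scratch tree.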

  9. oiaohm says:

    Contrarian, the big point twitter and others miss is that Linux system image backups, as long as you have not foolishly altered them from the defaults, or have at least left a default Linux kernel in place, can normally be restored on any compatible hardware.

    Windows, on the other hand, bricks itself due to product activation.

    Linux is simpler to back up, simpler to restore, and simpler to mass-image, all because it is missing one feature: product activation.

    The most critical thing to a business is downtime. Because of Linux’s broad hardware tolerance, the downtime to be restored is less, since applications and the like normally don’t have to be reinstalled even if you have replaced everything bar the hard drive.

    The worst case is a little bit of network-setting tweaking and maybe removing a closed-source video driver and installing the right one for the new card. Insanely minor compared to restoring Windows.

  10. twitter says:

    The overwhelming use of gnu/linux in businesses that live or die based on uptime and data reliability is good evidence that there are plenty of “Enterprise” data redundancy tools out there. Besides Yahoo, Google, Twitter and so on and so forth, the vast majority of high performance computing clusters use some kind of gnu/linux. It should be obvious that gnu/linux people mastered the art of automatically configuring and running networked machines more than a decade ago, without expensive tools.

    On the desktop, users have plenty of options. My favorites are rsync and grsync, which provide zero-worry incremental, encrypted, network or local file synchronization. The nifty thing about gnu/linux is that while those backups are cheap and easy to make, I have not needed them in over a decade of gnu/linux use. The hardware I run is remarkable only for how old and crummy it is.

    Windows users alone need to worry about routine software failure sucking away their life and data. Microsoft routinely proves that organizations of any size suffer the same problems. One of the latest examples was the failure of their Danger service. Older examples include Deepwater Horizon and the London Stock Exchange. Keep up the good work in your alternate reality, Contrarian.

  11. IT can be done by individuals and organizations of all kinds. There is no need for a business to be involved to do IT. Information technology is finding, creating, storing and presenting information. Anyone can do that on any scale.

    Consider a single computer lab, say with 24 PCs. Most labs are set up with thick clients, say 5000 MIPS each. I could let them idle running a GUI at 2% utilization, or I could run 24 parallel processes to do some real computing. Or I could run 24 thin clients at 1 GHz and keep a server busier. You bet your sweet bippy I am aware of the reality of IT. I just tend not to waste IT.

  12. Contrarian says:

    “Why should we be?”

    In order to gain an understanding of the reality of IT, #pogson.

  13. Some of us are not involved in “enterprise” whatever that word means to you. Why should we be?

    “To undertake an enterprise, or something hazardous or
    difficult. [R.] –Pope.
    [1913 Webster]”

  14. Contrarian says:

    “With GNU/Linux there are many options for backup….”

    Rather simplistic, #pogson. Does the term “Enterprise” ring any bells with you?

  15. With GNU/Linux there are many options for backup. One that I like is to back up the /home directories and /etc but keep just the package lists for APT to restore. With a local repository that saves many gigabytes of storage over imaging the whole system. I also back up the partition table if it is complicated or RAIDed. Then there’s good old dd and Clonezilla. Another trick you can do with RAID 1 arrays is to pull a drive and put it on the shelf; the RAID can rebuild itself instead of you making a backup. It’s good to have choices. You can use what’s best for the occasion.
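    The package-list trick above can be sketched with Debian-family tools (the /backup path is an example); saving the selection list alongside /home and /etc replaces gigabytes of package archives.

```shell
OUT="$(mktemp)"                                   # e.g. /backup/package-selections
dpkg --get-selections > "$OUT" 2>/dev/null \
    || printf 'bash\t\t\t\t\tinstall\n' > "$OUT"  # fallback so the sketch runs off Debian systems

# To restore on a fresh install, replay the list and let APT fetch
# everything from the local repository:
#   dpkg --set-selections < /backup/package-selections
#   apt-get dselect-upgrade
```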

  16. Contrarian says:

    “Simply put, a well designed OS shouldn’t require one to mess with restore disks on a personal desktop, barring hardware failure. LINUX has met that requirement for a long time”

    Are you an OS designer of some renown, #oe, such that we can take your word at face value?

    What is the backup app for Linux that you refer to? All my experience with those systems is where IT admins use VERITAS backup with Linux, the same as they did with their Unix servers. VERITAS is rather expensive, too, but it was the standard almost everywhere. Even for Windows servers.

  17. oiaohm says:

    rmdir /q /s is in Vista and 7 and is the replacement for deltree, Linux Apostate; that is why deltree is no more.

    Funnily enough, from the command line on Vista and 7 you can format a disk without having UAC turn up and ask you if you want to format the disk. Yes, the old command-line format still works.

    Yes, one of the ways to bypass UAC is to open up a command prompt running as administrator.

    The highest-privilege user in Windows is System. The things you can do at that level are quite fun, like resetting every password in an ADS without a password.

    SELinux restriction rules are not removed even by su or sudo. It’s a role change.

    Linux security modules are the highest privilege in Linux, and you cannot log in as them; you can only provide them with advice.

    The sane and experienced way under Linux is also to avoid dd and other such items unless they are wrapped, or unless extra prevention tech has been added, in most cases.

    Most Windows users don’t know the direct paths, like the raw UNC path for a hard drive or USB key inside Windows. They do exist, and the damage they can do is massive.

    Over the years Microsoft has made it harder and harder to get to the “\\.\” of NT. In early NT it was fairly simple to get to. This is the Windows NT path equal to the Linux/Unix “/”. All your c:, d: and so on are just links to something in the hidden tree under “\\.\”. Yes, Linux and NT design are not 100 percent different; they appear more different due to what is hidden on the Windows side.

    Yes, if you know what you are doing you can alter settings in Windows to allow rmdir /q /s to work from “\\.\”, resulting in bye-bye everything. Also, if you know the hard drive’s device path under Windows, raw access is just opening up a file.

    Basically, Linux has not hidden its internals, Linux Apostate. So yes, it allows you to run with knife in hand. Is it wise to do that? No.

    It does not block you from using safer methods.

    The Linux way allows both safe and unsafe. At times the unsafe is required. Unsafe can be made safe in 2 ways: 1, use a tool; 2, set up the Linux security system to prevent particular operations. Number 2 is telling SELinux that no one is allowed to raw-alter OS drives except kernel file system drivers. With number 2, the dd you did would never have worked.

    The same can be done for rm -rf /. I have a lot of directories off limits to deletion or modification even when I go sudo or su to root. I have an SELinux role for editing the /etc directories and other directories I need to modify. I must switch to the role for the directory I want to change; otherwise SELinux will terminate any attempt.

    A Foot Shotgun on Linux is normally the result of poor security setup.

    Basically I don’t like Ubuntu because the default security is not good enough.

    Linux can be made many times more idiot-safe if the security systems the Linux kernel provides are used.

  18. Linux Apostate says:

    I don’t know about deltree, because it’s not in Vista or 7, but certainly for format and unmount you have to use the equivalent of sudo first. Even logged in as an administrator, you have to do that for a fixed disk. There’s a level of privilege above administrator.

    Like you say, the Windows way is to get a tool that does what you want, rather than trying to do it with a Swiss Army Knife/Foot Shotgun such as dd.

  19. oiaohm says:

    Linux Apostate, Windows hides the I-will-kill-you path.

    A little out of date, Robert Pogson.

    rmdir /q /s is the Windows 7 bye-bye-everything. It’s the mirror command on Windows to rm -rf.

    There is a device path where all hard drives and all devices are located under Windows, Linux Apostate. You do a rmdir /q /s on that and the result is exactly the same as doing a rm -rf / as root on Linux.

    There are graphical tools for copying ISO images to USB keys on Linux. UNetbootin comes to mind, but there are others.

    There are reasons why I set up extra rules in SELinux so that I have to switch to a specialist mode to do particular things. Yes, SELinux can block deletes in the system core and dd over the boot drives.

    Really, I wish distributions would enable some of these protections.

    “But then Windows doesn’t make it easy to copy a raw ISO image onto a USB stick, so swings and roundabouts, I suppose.”

    In fact, if you know where the device directory is under Windows, you can copy a raw ISO image to a USB stick with a copy command; of course you have to unmount the filesystem first. The result is fun: 1, the data is copied to the key; 2, Windows blue-screens after the copy is complete because the drive now looks different.

    Yes, highly creative. Linux is a little more picky and you have to use a special command to do it.

    Linux Apostate, basically when you know what Windows can really do, a lot of the claims against Linux are a laugh. Someone pulled someone’s leg and they fell for it.

  20. format c:?
    cd c:/
    deltree *?

    And then there’s good old regedit…

  21. Linux Apostate says:

    oe, I thought so. I thought I’d never make a serious mistake while Linuxing. “rm -rf /”, no way. Then one day I nuked some poor chap’s partition table by typing the wrong target for dd. /dev/sda (HDD) or /dev/sdb (USB stick): it’s such an easy typo, and yet with such devastating consequences if you get it wrong. You would really have to struggle to make the same error in Windows. But then Windows doesn’t make it easy to copy a raw ISO image onto a USB stick, so swings and roundabouts, I suppose. I learned many lessons that day.
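    The sda/sdb slip described above can be guarded against in software. A minimal sketch: refuse a dd target unless the kernel marks it removable. The sysfs `removable` flag is standard on Linux; the SYS variable is overridable here only so the sketch can run against a mock tree.

```shell
safe_dd_target() {
    dev="$(basename "$1")"                        # /dev/sdb -> sdb
    flag="${SYS:-/sys}/block/$dev/removable"
    [ -f "$flag" ] && [ "$(cat "$flag")" = "1" ]  # 1 = removable media
}

# Usage (hypothetical image and device names):
#   safe_dd_target /dev/sdb && dd if=image.iso of=/dev/sdb bs=4M
```

    A wrapper like this would have refused /dev/sda (a fixed disk) while letting the USB stick through.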

  22. oe says:

    Simply put, a well designed OS shouldn’t require one to mess with restore disks on a personal desktop, barring hardware failure. LINUX has met that requirement for a long time. Not so for Redmond’s product.

  23. Backing up on the single hard drive of a PC is useless if the hard drive dies. They are pretty reliable but there is of the order of 1% per annum failure rate. That may or may not be acceptable depending on how valuable time and data are.

  24. Contrarian says:

    “That works only if the user has gigabytes of storage available.”

    Well, duh! I have this $359 Dell Inspiron that I am using right now. It has some 417GB of empty disk space. You may have been playing with those too-small, too-cheap computers for too long and not noticed that lots of storage is left over these days.

    “Though I understand the successor is able to create a full system image, this option was not obvious,”

    I think that is the default condition, but I may be wrong.

  25. Linux Apostate says:

    I don’t think much of NTBackup or its successor Windows Backup & Restore. Though I understand the successor is able to create a full system image, this option was not obvious, and nor was the capability for incremental backups which are very important. I also don’t know how you restore – presumably you have to reinstall first? Where is the live CD restore option? Has it been deliberately nerfed to avoid an anti-trust lawsuit, or to sell a more complete product, I wonder.

    Whatever backup software you use, you really need to have a removable hard disk to store the data. These are really cheap nowadays and easily attached via USB.

  26. twitter says:

    I love the parade of Microsoft damage control here. Whenever there’s an obvious problem and Windows quits working in a predictable way that’s sure to burn millions of people, the Microsoft boosters blame the user and pretend no one else has the problem. It’s been this way forever, and it’s part of Microsoft’s Technical Evangelism: they always blame someone else for Windows failures. They want you to ignore your experience and that of your peers.

    Sadly but predictably, those failures are common. Every version of Windows has them and everyone has seen them. At times, they have taken down whole networks at Fortune 100 companies. Microsoft boosters pretend they don’t exist or that gnu/linux is just as poor. What a bad joke.

  27. That works only if the user has gigabytes of storage available. Most users of that other OS have no idea what a gigabyte is. Backing up the registry is not sufficient.

  28. Contrarian says:

    FYI, in case you are interested:

    http://en.wikipedia.org/wiki/NTBackup

  29. Contrarian says:

    “M$ provides “free” software but no backup/restore capability.”

    There is a complete backup applet that comes with Windows, #pogson. It was derived from Symantec’s (then Seagate Software) Backup Exec product by Seagate under contract and first appeared in NT4 and Win98. One button backup for everything in the registry and personal data areas unless you customize it for specific directories and/or files. Backup to disk, tape, or even network locations.

  30. Linux Apostate says:

    That’s poor planning by the previous owners of the machines, or by the previous IT guy (if any), and it sucks. I sympathise. You should have been given all of the installation media, or at the very least, the license keys that you would need to reinstall from a new copy. But I guess these are second-hand machines, the budget is tight, and so you have little option other than the one you took. For what it’s worth you clearly made the best of a bad situation.

    Restore discs and reimaging are viable options for typical users though.

  31. Not always. Most of the systems I have seen in schools come with nothing at all. M$ provides “free” software but no backup/restore capability. If schools make no image they have no backup. I tried last year to image that other OS but the malware was already spread. It was futile so I switched to GNU/Linux which worked like a charm.

  32. Linux Apostate says:

    But there’s the thing, a restore disc is provided by the manufacturer.

  33. Linux Apostate wrote, “So the user does not need to reinstall anything. It really is sufficient for the user to just back up their own files.”

    It did not work in this case. Malware can infect/sabotage local backups, so they are not particularly safe. Hardware failures can clobber local backups, so they are not as effective as a remote backup, although more convenient/faster to restore. I do not know any normal users of that other OS who know to back up anything but their personal files. That’s reality. An OS has to be much more rugged than that other OS. It has too many modes of failure: malware, registry corruption, fragmentation, failure to boot, messed-up drivers, and foolish dependence on M$ to maintain systems.

  34. Linux Apostate says:

    Regarding backups, if you buy a preinstalled Windows machine it will almost certainly come with a restore disc or a restore partition which can be used to reimage the system to its factory settings. So the user does not need to reinstall anything. It really is sufficient for the user to just back up their own files.

    Generally, Windows is not painful. Treat your Windows machine as if it were Linux! If you know how to do something in Linux, but not in Windows, then the answer is research. You almost certainly can do it, you just have to figure out how. There’s a huge community for Windows, and lots of information online.

  35. oiaohm wrote, “If you don’t want to feel the guilt, Robert Pogson, improve your security policy design.”

    Hey, I was just a one-man, part-time IT department. I did everything manually or with a few simple scripts. If it were a full-time occupation, I could fancy things up. Unfortunately there’s no way to fix things after I have left a place where the keys were generated. Those weak key generators were active for months.

  36. oldman says:

    “Robert Pogson, it did not bother me too much, since I have the process for key change-overs pretty much automated.

    Every 6 to 18 months we redo the keys. The OpenSSH issue just meant the key change-over process had to happen sooner.”

    This is all fine and good, but it is ultimately irrelevant what your practice is in this case, Mr. oiaohm. The idiot who did this is IMHO emblematic of the lack of change control in a “pure”, i.e. non-commercial, effort like Debian in particular, and as far as I am concerned of open source in general. Nobody should have had the right to commit changes without proper QA’ing, period.

    Perhaps you may have felt this was a non-issue, but this little glitch sh-t-canned a project in which I was working to introduce a particular open source appliance that happened to be based on Debian. The minute my management found out, they nixed it from even being considered. I have since gone on to a similar appliance which, while it is not as capable in function and features, is “good enough”, and more importantly, is not based on Debian (in fact it isn’t even based on Linux at all, but on FreeBSD).

    Perhaps you feel that you can bet the farm on hackers’ crap like Debian, but I’ll stick with Red Hat, thank you.

  37. oiaohm says:

    Robert Pogson, it did not bother me too much, since I have the process for key change-overs pretty much automated.

    Every 6 to 18 months we redo the keys. The OpenSSH issue just meant the key change-over process had to happen sooner.

    The lesson to take from the OpenSSH affair is to have change-over tools on hand.

    I have had cases of cleaning up systems after rogue operators. So yes, you have to presume the rogue has the master keys and will use them.

    .ssh/authorized_keys in all user accounts, as part of the security policy, does not get backed up here. It can also be deleted at a moment’s notice. There are other regenerable items that are likewise forbidden to be backed up.

    It all comes down to the quality of your security policy how much the OpenSSH bug affected you. For parties like me with quite strict and quite good policies, it was nothing more than a minor annoyance: making sure the new package was pushed out to everything, then forcing a key clear. The key clear happens as part of the normal operation of the security policy.

    Yes, the big problem was that the package would generate bad keys, so a single machine with it left installed would slightly undermine the key clear.

    To be sure, we ended up doing 2 key clears: 1 when we thought the fix was deployed everywhere, and 1 when we were 100 percent sure (yes, we had missed a machine). That is 1 extra disruption compared to a general key turn-over.

    I felt no guilt over the issue. If you don’t want to feel the guilt, Robert Pogson, improve your security policy design.

    In the policy I work with, the main keys for the likes of web servers are also backed up independently of the data. This is all about preventing someone from pulling old backups and setting up broken encryption without knowing about it.

  38. To be fair, the OpenSSH vulnerability was particularly insidious because people like me had it embedded in backups and images, and it was a lot of work to hunt the keys down. Debian did make newer packages do some hunting, even identifying weak keys in users’ accounts, but still some came back to haunt us from backups and .iso files kicking around. It was a nightmare, especially if one had a ton of servers/PCs using the weak keys. Not only did the keys need to be replaced, but all the places those keys were copied to had to be hunted down (.ssh/authorized_keys files). It was pretty bad… I have moved around a lot and had no idea how many weak keys I left behind or whether anyone was fixing them. It was a personal burden of guilt. 🙁
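    The hunt through .ssh/authorized_keys files can be sketched as below: walk every account and strip entries whose public-key blob appears in a blocklist. Debian shipped real blocklists and an ssh-vulnkey tool for this job; the scratch layout and key blobs here are stand-ins.

```shell
set -e
HOMES="$(mktemp -d)"                 # stand-in for /home
BAD="$HOMES/blocklist"               # known-weak key blobs, one per line
mkdir -p "$HOMES/alice/.ssh"
printf 'ssh-rsa AAAAWEAKBLOB alice@pc\nssh-rsa AAAAGOODBLOB alice@laptop\n' \
    > "$HOMES/alice/.ssh/authorized_keys"
printf 'AAAAWEAKBLOB\n' > "$BAD"

# For each authorized_keys file, keep only the lines that do NOT carry
# a blocklisted blob; grep exits non-zero when nothing survives, hence
# the || true.
find "$HOMES" -name authorized_keys | while read -r f; do
    grep -v -F -f "$BAD" "$f" > "$f.clean" || true
    mv "$f.clean" "$f"
done
```

    On a real system the blocklist would come from the distribution's weak-key packages, and the sweep would also have to cover backups and .iso images, which is what made the cleanup so laborious.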

  39. oiaohm says:

    oldman, you are foolish. All the high positions in Debian are held by people employed to be there by many hardware makers.

    Even the OpenSSH developers admitted their test cases were not complete enough at that time. Yes, the optimisation setting with current-day OpenSSH source code will not cause bad side effects, since the code will refuse to build.

    oldman, really the OpenSSH error is nowhere near as bad as taking 15 years to properly fix the ping-of-death attack.

    Yes, the hack fix MS did was that you could not be pinged by anyone not in your workgroup. But anyone in your workgroup, up until the last fixes for Windows 7, could crash your computer or take control of it.

    The OpenSSH error was fully fixed quite quickly in comparison. There are a lot of other errors in Windows that have just been masked over and not properly repaired.

  40. IT is like travel by air. It’s the best way to do things but sometimes things go wrong. The OpenSSH thing was a fiasco. A creative solution to a problem brought disaster because of a breakdown in communications. It happens. No software is bug-free but I will take the occasional bug from Debian GNU/Linux instead of the steady stream of critical bug fixes needed to keep that other OS behind the bad guys.

    Either way you place your trust in strangers. I would rather trust strangers who share my ideas about cooperation than folks who admittedly are out to get me. M$ has the stated purpose of making people dependent on them by lock-in. That’s not the right way to do IT.

  41. oldman says:

    “While some claim Ubuntu is the most user-friendly OS, it’s not. For installations Debian GNU/Linux works much more reliably because they release “when it’s ready”. ”

    Yeah, right, just like they did with their “improvements” to OpenSSH. I am sure that you will protest that this was an aberration. Please spare me. As far as I am concerned, this is exactly the kind of crap that happens in a volunteer effort.

    My issues aside, the bottom line is that while it may indeed have improved, Debian remains a distribution of the community, by the community.
    The fact that you found your way around it relatively easily says more about your abilities than about the quality of the distribution for a newbie.

  42. While some claim Ubuntu is the most user-friendly OS, it’s not. For installations Debian GNU/Linux works much more reliably because they release “when it’s ready”. I like that. Folks can read on Distrowatch.com and pick out an OS or read reports or watch the installation on Youtube. Still, it’s a small fraction of humanity that bother to install an OS. Using that other OS that invites malware in or sloshes its registry or is just too complicated to be manageable is a mistake.

  43. I was a total newbie the first time I installed GNU/Linux. It was about a year after I saw a guy do an install from floppy. I was not even paying much attention because it was not of interest. When I did it myself out of necessity, it was easy. In those days I had to edit a configuration file. I can type/edit. It was no problem. Installing is much easier these days. I teach students to do it. I show them once and turn them loose. No student has ever been stumped. If they don’t know the response, they just push enter usually…

  44. That’s a restoration from backup, something that is beyond the casual user. I have never met a user of that other OS who did anything more than copy their personal files to CD or USB drive. The installation process from scratch for that other OS is so beyond what people know that they actually pay people $100 to do it. I have installed that other OS a few times and it was always an adventure in frustration, whether it was wait, please wait, re-re-reboot, drivers, or the thing just failing to boot. GNU/Linux, OTOH, is just copying files. It works.

  45. oldman says:

    “When an OS breaks itself while users are doing perfectly normal things, you should not rely on that OS for your IT. Use Free/Libre Open Source Software. It works for you, not against you.”

    Unless you are an Ubuntu user and some part of your installation breaks on the next upgrade. It seems to me that THAT is more of a works-against-you moment than the one that you post.

    Really, Pog, it is quite lame to think that someone is going to change out all their software based on one problem.

    Office 2010 went on to my system without event.

  46. Linux Apostate says:

    Reinstall of Windows is easy for me. Insert backup software live CD, attach external hard disk, wait 30 minutes. Mind you, I had the exact same arrangement for Linux.

  47. Contrarian says:

    “He was in Catch-22.”

    But he was but one in 10 million. No one else has come close to that situation. Many more people have more trouble with Linux.

  48. HAHA! Clever, but foolish. The re-install of GNU/Linux is far easier than that other OS and this guy did his best to keep that other OS running. He was in Catch-22.

  49. Linux Apostate says:

    The article reminds me of the “We tried installing Linux in our office” article, which is produced by tech journalists every so often. These articles always have two notable characteristics:

    1. The article concludes with “Linux is great, but we’re not ready to switch over just yet”.

    2. Something goes wrong, but a helpful expert from Canonical puts hours of time into fixing it, and then it works. Mostly.

    14-16 hours of effort is not atypical for this sort of article. And nor is spending most of that time talking to an expert engineer whose manager doesn’t want an unfavourable report in the press.

    Really, this chap is just enjoying the Linux install experience without leaving the comfort of his Windows environment. And the remedy is the same – reinstall Windows.

  50. Contrarian says:

    “the best he could come up with was re-installing the whole mess”

    There was probably something wrong with his installation, #pogson. An obscure anomaly at the worst. Out of the hundreds of millions of MS Office installs on Windows 7, a very few have come to such difficulty. That is hardly remarkable.

    I daresay I could go to the Linux forums and cherry-pick hundreds of sad tales of failed Linux installation attempts that drove the user to completely abandon the effort and vow to never return. That is no more indicative of the common experience with Linux, though, than this anecdote.

  51. A guy spent three 8-hour days on the problem and the best he could come up with was re-installing the whole mess. He even phoned M$…

    “Diligently, I addressed these issues one by one. I uninstalled .NET Framework 4, Silverlight, and Visual C++ using the control panel and “FixIt” script tools Microsoft provides to remove vestiges of these services from the registry. I verified that TrustedInstaller.exe was in its expected home (it was). I turned off all non-essential, non-Microsoft services (a “clean” boot). I disabled active monitoring in my security software. I tried first a clean boot, then a boot to safe mode.

    I probably spent 14-16 hours on this “project”, including a wasted weekend day, between digging around online and implementing the suggestions.

    At the end of the day I still returned to the same old thing — the dreaded “Error 1935”. “

  52. Contrarian says:

    The link is incorrect, #pogson. It is not very clear just what you are on about.

  53. JairJy says:

    First, your link is incorrect. Second, the solution is very simple: reinstall the .NET Framework.
