Ho Hum. Yet Another Organization Saves With FLOSS

For years the sycophants of M$ and “partners” have told the world, or at least anyone who would listen, that FLOSS costs more in the long run…“Open source gives the university more features, more flexibility and lower costs. Next year the costs will already be 30 per cent lower and after five or six years, the difference with the proprietary system will be 70 to 75 per cent.

We are being approached by many public administrations from large central government institutions and municipalities. They do the math and they see the enormous financial gains that are possible.”
Meanwhile, in the real world, folks can do the maths and save a huge chunk of the cost of IT: not just licences, but maintenance, flexibility, performance, … Everything is better with FLOSS. In my own experience with desktop and server systems, the schools where I worked broke even on the costs of migration almost instantly, because licensing was a huge fraction of the capital cost, and we used that saving to get almost twice as much IT for the same money. Maintenance dropped by a huge factor because the distributions managed most of our updating/upgrading at hardly any cost beyond deciding to upgrade.

Use FLOSS and GNU/Linux. That’s the right way to do IT.

See Coimbra University to save plenty with open source ERP.

About Robert Pogson

I am a retired teacher in Canada. I taught in the subject areas where I have worked for almost forty years: maths, physics, chemistry and computers. I love hunting, fishing, picking berries and mushrooms, too.
This entry was posted in technology.

11 Responses to Ho Hum. Yet Another Organization Saves With FLOSS

  1. oiaohm says:

Here is something that everyone who says Linux does not have backwards compatibility misses.

    The ELF format that Linux uses was designed by Sun, and in fact it was designed to allow massive backwards compatibility. Is it possible for applications on Linux to ship with their own libc and libraries? The answer is yes.

Notice that the newest version of patchelf allows changing the library names an application depends on, and the RPATH embedded in an executable allows the directories an application looks in to be overridden.

    None of these features are new. People say Linux needs to build an SxS solution. The answer is no, Linux does not. RPATH, designed by Sun into ELF, is SxS. In fact the RPATH list of paths is a replacement for the SxS manifest.
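As a concrete sketch of what that looks like (the binary name `app` and the library names here are hypothetical), these are the patchelf operations being described:

    ```shell
    # Hypothetical: a vendor ships ./app plus its own libraries in ./lib.
    # Embed an RPATH so the loader searches the bundled directory first;
    # $ORIGIN expands at load time to the directory containing the binary.
    patchelf --set-rpath '$ORIGIN/lib' app

    # Change a library name the application depends on
    # (supported by newer patchelf releases):
    patchelf --replace-needed libfoo.so.1 libfoo-compat.so.1 app

    # Inspect the embedded search path.
    patchelf --print-rpath app
    ```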

kurkosdr, if a Windows program was breaking all over the place and it was not using SxS to ask for the libraries it requires, who would you blame, Microsoft or the vendor? The vendor, right? A closed-source software vendor for Linux not using RPATH is the same thing.

    Say a vendor requires a particular version of a runtime, let’s say Microsoft C, but does not ship it with the binary, and the application fails. Who do you blame, Microsoft or the vendor? The vendor again, right? Now remove Microsoft from the example, insert Ubuntu, and you see people blaming Ubuntu. You can swap in the name of any Linux distribution.

    The problem is that in both of these cases on Linux, people blame the distribution for not being compatible enough.

Could closed-source vendors for Linux decide to provide a unified runtime, independent of distributions, with their own loader program? The answer is yes. Could a closed-source vendor decide to have an /opt/vendorname-runtime directory and include that RPATH in all their executables and libraries? The answer again is yes. Would this behaviour be close to identical to what they do on Windows? The answer is yes.

    The reality, annoying as it is, is that you can use a chroot to install older or newer applications, then use patchelf to make the application use the right loader and libraries.

    With every generation patchelf becomes more and more of a relinker. The only thing the latest patchelf cannot change is the version number on symbols.

    The majority of the required backwards-compatibility systems already exist in Linux, including running new binaries on old Linux systems. The missing bit is more the package managers and the distributions’ choice not to support it without resorting to chroots and tools like patchelf.
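A minimal sketch of that chroot-plus-patchelf recipe (the /srv/oldroot tree and the `oldapp` binary are hypothetical):

    ```shell
    # Hypothetical: a tree from an older distribution unpacked under
    # /srv/oldroot supplies the loader and libc an old binary expects.
    OLDROOT=/srv/oldroot

    # Point the binary at the old dynamic loader instead of the system one...
    patchelf --set-interpreter "$OLDROOT/lib64/ld-linux-x86-64.so.2" oldapp

    # ...and make it search the old library tree first.
    patchelf --set-rpath "$OLDROOT/lib64:$OLDROOT/usr/lib64" oldapp

    ./oldapp
    ```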

kurkosdr, the argument about backwards compatibility is normally made by people who have not looked into the topic.

    Complaining that distribution package management lacks workarounds for dependency hell built on the backwards-compatibility systems that exist thanks to the ELF binary format would be valid.

    The correct issue is dependency hell caused by package managers, not backwards-compatibility support. The amount of effort required to support forwards or backwards compatibility on Linux is not that much. The issue is that the tools to do the job mostly lack a GUI; patchelf, for example, has no GUI.

  2. oiaohm says:

https://www.complang.tuwien.ac.at/anton/linux-binary-compatibility.html Very key report to read, kurkosdr.

    kurkosdr, backwards compatibility for binaries made since 1998 is quite workable on Linux. Find a few libraries, mess with ld.so a little, and they run.

    If you want to complain that distributions are an excessive pain in the ass about running old programs, they are.

    If the binary was made after 1998, the backwards compatibility is no worse than taking on shims in Windows.

  3. oiaohm says:

kurkosdr, by the way, it is not the first time you have dropped a “has” from a sentence when it was valid to do so. You only noticed this time because you have been excessively nit-picking.

Backwards compatibility on Linux is about knowing how to manage it. It is not as if Linux lacks the frameworks.

Run the Linux dynamic loader some time, for example /lib64/ld-linux-x86-64.so.2 on 64-bit Debian. It takes:
    --inhibit-cache, so the cached list of known installed libraries is not consulted;
    --library-path PATH, which sets the paths ld.so searches for libraries.

    A chroot mixed with these two options can get you a very long way.
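Putting those two flags together, a sketch of invoking the loader directly (the /srv/oldlibs directory and `oldapp` are made-up names):

    ```shell
    # Launch a binary through the dynamic loader itself (Debian amd64 path),
    # skipping ld.so.cache and searching an alternative library tree first.
    /lib64/ld-linux-x86-64.so.2 \
        --inhibit-cache \
        --library-path /srv/oldlibs:/usr/lib/x86_64-linux-gnu \
        ./oldapp
    ```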

kurkosdr, Linux has very good low-level systems for backwards compatibility, better than the Microsoft shim system. Anyone who has ever had to use Microsoft Windows compatibility modes and been on the wrong side of the shim system knows it is not fun.

    Microsoft does have better GUIs for backwards compatibility.

A slightly smarter package manager, mixed with a system to auto-generate .desktop files and shell-script wrappers, and Linux would be way ahead of Windows for backwards compatibility. SxS functionality is not very hard to emulate using what ld.so provides.
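A minimal sketch of such an auto-generated wrapper, assuming a made-up vendor layout under /opt/vendorname:

    ```shell
    #!/bin/sh
    # Hypothetical wrapper, e.g. installed as /usr/bin/vendorapp and pointed
    # at by an auto-generated .desktop file. The vendor's private runtime
    # plays the role of a Windows SxS assembly.
    RUNTIME=/opt/vendorname-runtime/lib

    # Prepend the bundled runtime so its library versions win, then fall
    # back to the normal system search path.
    export LD_LIBRARY_PATH="$RUNTIME${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"

    exec /opt/vendorname/bin/vendorapp "$@"
    ```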

  4. kurkosdr wrote, “Let’s migrate and pretend all is fine despite the delays.”

    So, a new guy comes in and says things aren’t like he’s used to… The IT guy in the linked article explains it all. Clearly, the new mayor either has some agenda other than the smooth running of IT or he needs to take his pills. Where was this guy years ago when all this was on the front page and debated in City Hall? Asleep?

  5. kurkosdr says:

    I am lucky. This just popped up in Techbroil:


    Let’s migrate and pretend all is fine despite the delays. It’s not like citizens have a choice.

  6. dougman says:

The “closed” source model of doing things is dying; the “open” source way is the future and will apply to every known sector.

    The benefit of open source is that it allows every person to make a contribution to the whole, allowing it to grow and expand further.

    Looking beyond Linux, the idea is set to change the entire planet for the better.

  7. kurkosdr wrote, “Isn’t it strange that the only organizations that save from switching to FLOSS are governments, which don’t lose customers if work is done slower?”

    That’s an illusion. Governments tend to be open/accountable so we know about their migrations. Corporations usually keep such things internal as do individuals. I can tell you the little woman and I have no complaints about having to work more slowly. We click and things happen right away. At installations where I had OpenOffice.org in RAM, users certainly noticed the tremendous speed of startup compared to that other office suite getting shaken out of a hard drive. Meanwhile, the EULA of that other OS forbade such use without paying extra. Performance increase from GNU/Linux thick clients to thin clients was about 5 times, and a bit more for the old setup of M$’s office suite on XP’s hard drive. M$ did a “prefetch” thing which made the desktop unusable for ~30s on some systems while my users were busily editing… Chuckle. The EU documents side-by-side tests of OpenOffice.org v M$’s stuff and there is no loss of productivity on the whole.

    See also, DE: Freiburg: open source office three to four times cheaper.

    The cost of licensing software from which M$ “takes value” is enormous. With FLOSS one can do a lot with that saving and have change left over. My biggest migration saved 40% just on software and invested that money in more/better hardware. The project broke even on Day One (6 servers versus NONE, a bunch of printers, scanners, cameras, RAID, storage and gigabit/s networking compared to just a bunch of PCs with that other OS).

  8. kurkosdr wrote, ” Linux has hidden costs in backwards compatibility, which lead to hidden costs in administration, which leads to work done slower”.

    Huh? These are not hidden costs and they pile up on M$’s side of the ledger as being the cost of escaping lock-in. Many organizations now include the costs of undoing an acquisition in the capital costs of acquisition just as they must account for the cost of disposal of old/hazardous stuff. The fact is that these costs must be paid even if an organization sticks with M$ to keep up a whole archive with the never-ending Wintel treadmill of file-formats. Just look at it this way: A government at some point adopted Wintel, say, back in 1998. Now they have 16 years of archives that have to be “maintained”, that is kept accessible by M$’s moving target of an OS. What happens 20 years from now or 200 years from now? Do they really have to keep a Wintel machine from the 1990s and an OS from the 1990s in use just to access the archives? They do if they want to accept M$ is the only provider and M$ ships bug-ridden stuff not complying with any standard but their own quicksand… That’s an exponentially growing cost. OTOH, a government adopting GNU/Linux and open standards owns its data and its data-formats. They have minimal on-going costs because tweaks to the standards happen every decade or so compared to Wintel’s pace. How many years did it take M$ to almost comply with its own office-document standard? How many years did it take for ODF to settle? QED

    I have migrated a lot of teachers to GNU/Linux. The cost for the bulk of their documents was zilch. The only time I had to help anyone migrate files was a school with 3 report-card forms in “Works”. We just retyped them. Done in an hour or two. Meanwhile, M$ demanded folks learn to use “the ribbon” and learn where the “File” menu was hidden. I was in a room of adults once and we were all confronted with M$’s new office suite. None of us could figure it out so we used OpenOffice.org and got the job done in minutes. Where’s the cost? Munich found they didn’t even need to spend all the money they budgeted for these costs. They don’t exist.

  9. kurkosdr says:

    Linux hidden costs = Linux has hidden costs (sh!t, i accidentally ohioham’ed)

  10. kurkosdr says:

    See, Linux hidden costs in backwards compatibility, which lead to hidden costs in administration, which leads to work done slower, but if you are a government organization and don’t care, everything is peachy.

    Just don’t ask the poor saps waiting on the line.

  11. kurkosdr says:

    Sycophant speaking:

    Isn’t it strange that the only organizations that save from switching to FLOSS are governments, which don’t lose customers if work is done slower?

Even LiMux admitted longer waiting lines from people waiting to get their work done.
