Another City Switching to LibreOffice

After a city moves its office suite from M$ to LibreOffice, what is holding it back from losing that other OS? Nothing.

“Las Palmas also aims to meet the goals promoted by Cenatic, the national resource centre on free and open source. “This is a strategic project of the Spanish government that seeks to promote and disseminate open source in all areas of society.””

From Cenatic’s website (Via Google Translate):

“CENATIC is the National Reference Centre for the Application of Information and Communication Technologies based on Open Source, a strategic project of the Government of Spain to promote understanding and use of free and/or open-source software in all areas of society, with special attention to public administrations, businesses, the technology sector (both providers and users of free technologies), and development communities.

CENATIC is a State Public Foundation, promoted by the Ministry of Industry, Energy and Tourism and the Junta de Extremadura, whose board also includes the regions of Andalusia, Aragon, Asturias, Cantabria, Catalonia, the Balearic Islands, the Basque Country and Galicia, as well as the telephone company.

The creation of CENATIC is the result of a commitment to technology, as endorsed in Spanish legislation, which establishes measures for government bodies to share, reuse and collaborate on technology projects for the benefit of greater efficiency in the use of public resources and the quality and safety of those projects.”

Why is my government not doing the same?

see Spain's Las Palmas moves 1200 PCs to LibreOffice | Joinup.

About Robert Pogson

I am a retired teacher in Canada. I taught in the subject areas where I have worked for almost forty years: maths, physics, chemistry and computers. I love hunting, fishing, picking berries and mushrooms, too.
This entry was posted in technology.

65 Responses to Another City Switching to LibreOffice

  1. Yonah says:

    Oiaohm: “You lose because you have just proven you don’t read.”

    On the contrary, my linguistically challenged friend. My reading and writing skills far exceed your own. SERIOUSLY, I read your garbage then simply ignored it because, once again, the source is YOU and only YOU. I love winning!

  2. oiaohm says:

    Phenom
    “The second statement is correct, but the first is just plain wrong. Have you heard of cache misses? Of branch mispredictions? Heck, forget these, let talk even of very basic stuff like waiting for IO operations to complete.”

    Yes to all of those. It's not completely wrong; it's correct for one case, though I missed a word: it should have said virtual before cores. If you have run multi-process workloads like software builds and documentation conversion, you find out there is an ideal number of threads any system can handle.

    Branch mispredictions are what hyperthreading was designed to limit the processing effect of: it doubles the number of threads relative to the number of physical cores. It also limits the effects of cache misses.

    This is a multi-thread statement for things like a compiler or a spreadsheet, where at the end you must merge the data back into one piece. Both have a related IO pattern: read at the start, spend time processing (merging if threaded), then write at the end. So if you have a lot of Excel documents to process, a single-threaded multi-process approach is going to beat a multi-threaded multi-process approach, because multi-threading requires a merge of the data processed on each thread.

    The big word you missed, which is critical to why multi-threading falls apart doing spreadsheets on a 1000-core system, is NUMA.

    Non-Uniform Memory Access: CPUs have to sync to transfer data between each other because you threaded. This is slow.

    Yet on a NUMA system multi-process does not hurt as badly. Why? No CPU-to-CPU memory-controller transferring of data. Yes, it is fully IO related.

    If you had built from source using GNU make with -j to set the number of processes, you would understand what is going on.

    Multi-process and multi-thread don't magically scale without limit in the compiler or spreadsheet case. These are cases where you can over-thread; you cannot over-divide into processes in the same way.

    Something like Excel is a different beast from an HTTP server running multi-threaded. HTTP multi-threading is not merging data from many threads in most cases.

    I was referring to multi-threading for the case of a spreadsheet. Single-threaded workers in a multi-process arrangement work out better for massive spreadsheet processing. The load shape is predictable, and you will not end up with the first document you opened starving the second document you are opening of CPU time. In an Excel-style case you would then also waste a lot of cycles merging data back.

    For a spreadsheet-style workload, more threads do not always equal better; on large systems they equal a hindrance, due to running into NUMA. In fact, Phenom, if you watch Excel carefully, it is avoiding threading, and the way it avoids threading is the result cache.

    By "the processing system has" I was not referring to the full motherboard. Say you have a system with 4 physical CPU chips and you are running threads on all of them. Each CPU chip is a physical processing system. Once your job extends over many CPUs, crossing NUMA borders, you are in trouble if you are planning to merge that data back into one piece.

    Yes, I should not be using MOSIX cluster terms but NUMA terms. I still think of NUMA in terms of MOSIX clusters; the behaviour is very similar, clusters just hurt worse for thread-to-thread syncing.
    http://www.mosix.cs.huji.ac.il/

    Yes, when you have more threads on a NUMA system than one section of it can handle, the OS normally moves load around, resulting in a bad case of NUMA memory syncing.

    Phenom, yes, you have every right to tell me off for using a clustering term instead of a NUMA term and so misleading you slightly. The problem is that there is no official NUMA term for a NUMA segment that I remember, so my brain has used the next best thing: the equivalent term from cluster computing.

    Yes, I should have stated that the section was about spreadsheet/compiler-style workloads, not written it to appear generic while presuming it would be read in the context of the problem at hand.

    Basically, when you need to process vast numbers of spreadsheets, you need to cap the spreadsheet so its processing does not cause sync problems between NUMA sections. ccNUMA hurts.

    So make Calc multi-threaded without a results cache and it is still going to be slow. Make Calc open too many threads when batch processing on a NUMA system and it will be slow too. Single-threaded per program is the safest on a NUMA system without tweaking, since one thread cannot by mistake end up in two NUMA segments trying to sync data with each other.

    Threading has a scaling problem: on very big systems threading can equal slow. NUMA segments are a problem desktop users forget, but when you have a large Linux thin-client server, NUMA segments do become a problem.

    On Linux it costs about the same to start a new thread as to start a new process. A new process is neater for NUMA, as long as what you are doing can operate that way.

    So in some cases Calc being able to fire off more than one process would be a godsend (for Linux and OS X users). When running on Windows you would not want to do this, because Windows is slow to start processes and fast to start threads.

  3. Phenom says:

    Ohio, I am in a particularly good mood today, so I read the first two paragraphs of your wall of text. Unfortunately, the very first one is absolute crap.

    “Multi-threading is only faster if you don’t end up with more threads than the number of cores the processing system has, and if the threads are not locking against each other attempting to do things. Switching between threads is not free.”
    The second statement is correct, but the first is just plain wrong. Have you heard of cache misses? Of branch mispredictions? Heck, forget these, let talk even of very basic stuff like waiting for IO operations to complete.

    Ohio, do you actually know anything about anything at all? I am starting to doubt that you’re truly aware of the exact whereabouts of your kitchen.

    And please do not bother worrying about what I do understand and what I don’t. You are far from being the one to judge me. 🙂

  4. Phenom says:

    Pogson, language features have nothing to do with instruction sets and versioning. C# and Java have a handful of features which Pascal lacks – closures, lambdas, asynchronous constructions, just to name a few. Other features, such as generics, came a bit too late. Pascal was a fine language (in its Delphi incarnation) until 2000, after which it fell sadly behind. Again, I speak of language features, which have nothing to do with instruction sets and versioning.

  5. oiaohm says:

    Phenom
    “1) Excel scales better on multiple cores than Calc. Period. You may try to degrade this advantage as much as you can, but reality is that people need it.
    2) Buying Excel is cheaper than developing a custom-tailored solution.”

    Your number 1 is not always true. If you are processing many spreadsheets in an automated way, Calc wins. Excel only wins if it is being used by a human, and even then not in all cases.

    Multi-threading is only faster if you don’t end up with more threads than the number of cores the processing system has, and if the threads are not locking against each other attempting to do things. Switching between threads is not free.

    Phenom, this is why I said the talk of performance gains from multi-threading is bogus. Multi-threading in the application has to be compared to multi-processing; what you are doing determines whether multi-threading is any advantage at all or whether it is a hindrance. So the ideal is a spreadsheet program you can tell to be single-threaded for particular usages and multi-threaded for others: single-threaded when in multi-process usage, so that process management can keep the number of threads aligned with the number of CPUs more simply, achieving higher overall performance in that case.
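    A minimal sketch of keeping the worker count aligned with the core count, rather than spawning one thread per task (the chunks of work here are made-up stand-ins):

```python
# Sketch: cap the thread count at the number of (virtual) cores instead of
# spawning one thread per task. The chunks of work are stand-ins.
from concurrent.futures import ThreadPoolExecutor
import os

def recalc(chunk):
    # Stand-in for recalculating one slice of a sheet.
    return sum(chunk)

chunks = [list(range(i, i + 100)) for i in range(0, 1000, 100)]
workers = min(len(chunks), os.cpu_count() or 1)
with ThreadPoolExecutor(max_workers=workers) as pool:
    totals = list(pool.map(recalc, chunks))
print(sum(totals))  # 499500
```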

    Phenom, this is the problem: threading is a double-edged sword. It helps in some cases and hurts in others.

    A result cache helps in almost all cases.

    Phenom, really, you don't understand performance optimisation. Anyone who does knows that threading is not a magical cure-all; in fact, threading in an application that is wrong for how you need to use it will hinder performance.

    The issue here is that neither Excel nor Calc threads ideally for all cases, because neither has the means to change how it threads to suit the case of use.

    This is the problem, Phenom: you don't know this. So you would be a person who would attempt to force Excel as a square peg into a round hole and wonder why it does not work right.

    Your number 2 is not always true either, particularly for somewhere with 1000+ users. If you need multi-user access to the same set of results in real time for best effectiveness, Excel is dead in the water and you should have the same work done in either Oracle or PostgreSQL.

    Total cost of ownership, Phenom. You need to understand where Excel stops and something like PostgreSQL or Oracle starts. Remember, both of those can do every operation Excel does, triggered as data enters them. This is why it is important to know that those two databases can do everything Excel can (harder, yes, and needing more skilled staff, but they can do it). You will have usage cases where you should not be using Excel but should be using one of these highly powerful databases.

    Any cost you save using Excel compared to a custom-tailored solution can be offset by the fact that all users cannot access the information quickly, so each user needs either to wait or to recalculate for themselves. Individual recalculation is processing time taken from other, more productive business operations.

    The most common maths programming language used today is Python. It's not that slow, and the code produced is clean.

    Phenom, also, how many copies of Excel do you need? At 100 dollars a pop times 200 users, that is quite a bit of money, and these governments are 1000+ users at 100 dollars a pop. Excel's cost gets worse as you get larger. It starts off small and cheap, but you do reach a point where paying someone to write a custom solution is cheaper than paying for MS Office to get Excel.

    There is also another problem: when you have 20 users all opening an Excel document, it only takes one to stuff it up. A custom solution is also safer for the data, since you have better control over who modifies what and how it works. Again, this is a large-business thing.

    Phenom, in reality we are talking about somewhere with 1000+ users; in a lot of cases at that scale, Excel should be replaced with database solutions. Up to about 500 users Excel is questionable, but there might be a cost saving in using it. The maximum at which it is right for sure is about 10 users. So somewhere between 10 and 500 users you should migrate to a database solution.

    Phenom, maybe you have never worked with 500+ users or considered how much time is lost because others cannot access, in real time, the information you just entered into the spreadsheet.

  6. Phenom wrote of PASCAL, “it is already far behind Java and C# in terms of functionality”.

    Just as ARMed CPUs don’t need huge instruction sets to do everything, PASCAL does not need a huge repertoire to do everything. PASCAL is a general-purpose language designed to be almost minimal so that students could learn it rapidly. I taught the basics in 3 lessons. It has all the basic concepts: data types, variables, constants, assignments, looping and sub-programmes. I particularly like the strong type-checking which makes coding much easier when going from concept to implementation. The Free Pascal Compiler which I use also uses separate compilation of modules with strong type-checking so a programme of arbitrary complexity can be built. Compare that with the struggles over differing versions of Java or C compilers. I think it’s silly that programmes need to be rewritten when a new version of a compiler comes out.

  7. Phenom says:

    Pogson, I am worried. Is having Ohio around in a forum contagious for the forum host? It seems you have caught his eloquence.

    Your love for Pascal is charming, as it is a beautiful language, but it is already far behind Java and C# in terms of functionality. Anyway, Pascal is head, shoulders, body and feet better than PHP, so you have my congratulations there.

    Back to our topic.
    1) Excel scales better on multiple cores than Calc. Period. You may try to degrade this advantage as much as you can, but reality is that people need it.
    2) Buying Excel is cheaper than developing a custom-tailored solution.

  8. Phenom wrote, “Since when does it cost $0 to hire a programmer to write your custom-tailored application?”

    It does not cost $0 to hire a programmer. Neither does it cost $0 to hire a spreadsheet programmer. There are a lot of bad spreadsheets out there because ordinary users are tempted to create spreadsheets and they have no idea about scalability. I was once involved in a school system where a teacher had contributed a spreadsheet to replace the paper register of attendance. The PHB thought it was wonderful and ordered it to be used system-wide. He wanted a master spreadsheet at HQ that could be copied and pasted from every teacher’s stack. It was an unmitigated disaster because the original was deeply flawed in scalability. For example, it assumed 20 students maximum per class when that was the average class-size in the organization. Teachers scurried to add rows whereupon all kinds of linked cell-references failed. So, instead of using a convenient new tool, every teacher was put to work debugging this thing and producing a local version which could not be merged nicely at HQ… That was the end of that. It probably cost tens of $thousands because no one had the correct vision. I talked to the originator who later moved into my position when I left. He had offered it as a proof-of-concept rather than a working copy but the PHB cast it to the wind. Now compare that experience with what would have happened with the database riding on a server at HQ from the beginning. We all had a fine and redundant WAN. For one tenth the effort a good system could have resulted. Instead they reverted to paper.

    PHP is only one layer in the stack. The big bottleneck with spreadsheets is not storage access or computation speed; it is the scrolling and click-time of the GUI. Rembrandt did not paint masterpieces peeking through a keyhole. PHP is far faster than a human. I would use PASCAL, however.

  9. kozmcrae says:

    Phenom wrote:

    “And since when is programmer’s work free?”

    Do you mean programming a spreadsheet?

    The problem with spreadsheets is that they are so versatile. It’s okay to apply that versatility to small to medium sized spreadsheets. But when those spreadsheets grow into monsters they become an unmanageable mess. That’s when the spreadsheet should be broken down into manageable pieces.

    Unfortunately the person or group that created the monster spreadsheet is often unwilling to let go of their creation. The spreadsheet is their baby that no one else knows how to manage because it’s so complex. That helps to keep their job safe. They hold the keys to the data with their monster spreadsheet. I’ve seen this situation with my own eyes.

  10. oiaohm says:

    Phenom, sorry, but PHP vs ASP.NET depends on the setup. HipHop PHP, used by Facebook, runs rings around ASP.NET.

    PHP is only slow when used in particular ways. PHP, once you count the compilers for it, is one of the fastest languages you can use server-side. Also, HipHop will not build some of the more suspect PHP in existence.

    My other comment, about PostgreSQL and PL/Python and its add-ons for nice, powerful maths processing, is still missing.

    Phenom
    “And since when is programmer’s work free?”
    When the programmer is on staff doing it in time between jobs.

    Yes, that IT officer some businesses have manning the helpdesk can be coding as well.

    A lot of businesses pay people to sit in places doing nothing when jobs that would improve overall effectiveness should be going on.

  11. Phenom says:

    And since when is programmer’s work free? Since when does it cost $0 to hire a programmer to write your custom-tailored application? Or since when does your own time cost $0 to sacrifice production hours to write the app?

    Too many businesses in EU, USA and Canada charge more than $150 per hour. And you want to save $180 for a license… Ridiculous. Hence it never happens in real life.

    Btw, I had a good laugh reading your recommendations to do math calculus in PHP. PHP, which is the slowest among all server-side programming languages. 🙂
    http://stackoverflow.com/questions/2302933/asp-net-vs-php-performance-future-proofing-ease-of-development

  12. oiaohm says:

    Phenom
    “No database can provide the calculus instrumentatium of Excel, and you will now need to use an external application for the purpose. Which you might have trouble finding. And why should you bother, when Excel costs just a hundred bucks?”

    You have this wrong, Phenom; you don't know open-source databases. The one you are looking for is called PostgreSQL. It is a very common database, and it is common for a reason: it is scarily flexible.

    Yes, welcome to the world of PostgreSQL, where you can plug in many different procedural languages.

    http://www.postgresql.org/download/products/4-procedural-languages/

    Native Python in PostgreSQL is very powerful, particularly when you include http://matplotlib.sourceforge.net/ or
    http://sympy.org/en/index.html

    Remember, Python is a standard language option in PostgreSQL.

    There is the issue of the lack of a GUI to glue it into one piece, but if you need real-time processing of data as it comes in, in the same form Excel would produce, PostgreSQL can be made to do it. So yes: updating graphics as data changes in the database, and even storing those graphics back inside the database.

    Everything Excel can do, PostgreSQL using Python with pure-Python add-ons can do, if not exceed. Except for the GUI.
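    As a rough illustration, here is a pure-Python summary of the kind you could wrap in a PL/Python CREATE FUNCTION; the rows and column meanings below are made up, and inside PostgreSQL they would come from a query:

```python
# Pure-Python body of the sort a PL/Python function could run inside
# PostgreSQL (CREATE FUNCTION ... LANGUAGE plpython3u). The rows and
# column meanings are hypothetical stand-ins for a query result.
from statistics import fmean, stdev

def summarise(rows):
    """Spreadsheet-style summary of (label, value) pairs."""
    values = [v for _, v in rows]
    return {
        "count": len(values),
        "sum": sum(values),
        "mean": fmean(values),
        "stdev": stdev(values) if len(values) > 1 else 0.0,
    }

print(summarise([("jan", 100.0), ("feb", 140.0), ("mar", 120.0)]))
```

    In a real setup the function would be created once in the database and called from SQL, so every user shares one computation of the result.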

    I am not kidding about the exceeding. http://en.wikipedia.org/wiki/Sage_%28mathematics_software%29
    Yes, from PostgreSQL PL/Python you can reach all the way out to Sage and other very high-level maths solutions if you need them.
    PostgreSQL is not limited to the normal SQL languages either; you have R and a few other specialist-field languages as well.

    Even so, PL/Python is not an external program; it runs inside the database.

    So basically, when you hit the limit of Excel or Calc processing, you reach for PostgreSQL. The interesting point is that PostgreSQL is FOSS, and the add-ons that make it more powerful than Excel are FOSS. It is just that no one has made a newbie GUI for it.

    There are commercial add-ons to PostgreSQL that use GPU processing, which Excel cannot exploit.

    Yes, you surpass Excel with PostgreSQL and PL/Python; with everything else PostgreSQL can do, Excel is not anywhere near the same ballpark.

    Phenom, yes, I will grant that using PostgreSQL for some cases is overkill. The same goes for pulling in Sage and other open-source mathematics software, since Python alone is able to do a lot.

    Most people stop at MySQL and other lightweight open-source databases.

    If you wanted the commercial path, you would be talking Oracle DB.

    The number of DBs that can do what you are talking about is limited, Phenom, but the number is not zero. Neither Oracle DB nor PostgreSQL needs external programs to do the same things as Excel; the reason for using external programs is faster setup.

    Phenom, a PostgreSQL solution has an advantage: many people can work on the same data set at the same time, extracting the results they are looking for in real time as the data is updated.

    So other than the GUI, there is no advantage Excel has when facing off against the high-end databases.

    Phenom, in fact you can be crippling the use of the data by not using a database to allow real-time sharing and processing. This is an important reason why a serious question needs to be asked when any spreadsheet starts getting huge: how many people need access to the information at once? If the answer is more than one, most likely the information should be moved into a database and processed inside the database.

    Most likely your garbage belief that databases cannot do particular things has caused you to remain in spreadsheets with data that should have been moved to databases long ago.

  13. Phenom wrote, “No database can provide the calculus instrumentatium of Excel, and you will now need to use an external application for the purpose. Which you might have trouble finding. And why should you bother, when Excel costs just a hundred bucks?”

    You should bother if you need the performance. You could also use paper if you wish. 100K “entries” would only cost ~$20…

    Your “external application” could well be a spreadsheet displaying the output/input from/to the database or it could be a few lines of PHP or PASCAL or some similarly easy to use programming language. The largest database I have is a snapshot of Wikipedia from 2005. What would Excel make of that? Another biggy is my local database of MealMaster TM recipes from the web. It took a similar time to write a PASCAL application to deal with that as it would to write a spreadsheet. Rather than search a spreadsheet I might as well search a database or other fast structure.
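    For illustration, here are those "few lines" in Python rather than PHP or PASCAL, querying a small recipe table instead of scrolling a spreadsheet; the table and recipes are made up:

```python
# A few lines replacing a spreadsheet search with a database query.
# Uses an in-memory SQLite database; the table and recipes are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE recipes (title TEXT, category TEXT)")
conn.executemany(
    "INSERT INTO recipes VALUES (?, ?)",
    [("Borscht", "soup"), ("Bannock", "bread"), ("Minestrone", "soup")],
)
# A targeted query beats scrolling through thousands of rows in a GUI.
soups = [t for (t,) in conn.execute(
    "SELECT title FROM recipes WHERE category = ? ORDER BY title", ("soup",)
)]
print(soups)  # ['Borscht', 'Minestrone']
```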

    M$’s price is $179… So, I can get superior performance using PostgreSQL or MySQL for $0, with some programming environment for $0, and I don’t have to create a stupid spreadsheet for stuff that’s way too big to fit on a screen.

    Here’s an example of how spreadsheets don’t scale: “Microsoft Excel does provide the option of assigning a text name to a cell or range. But, if you try to use names systematically for all variables in model (cells and tables), you soon find out why few spreadsheet users do that. Defining a new name for a cell takes 5 steps (mouse clicks and menu selections). Naming does not scale well for handling models with hundreds or thousands of named elements: Excel offers pull-down menus and list boxes to select names from all the defined names in a spreadsheet — so it is difficult to use more than about 40 names in a model, a tiny fraction of the number of variables in most interesting models”

    see What’s Wrong With Spreadsheets

  14. Phenom says:

    Pogson wrote: “I have never seen…”

    It doesn’t matter what you have seen, Pogson. Reality has better eyes, and it has seen spreadsheets with hundreds of thousands of entries.

    You can argue that a database would be a better storage, but you will lose. No database can provide the calculus instrumentatium of Excel, and you will now need to use an external application for the purpose. Which you might have trouble finding. And why should you bother, when Excel costs just a hundred bucks?

  15. oiaohm says:

    Phenom “Excel scales to multiple cores, while Calc does not. Which is a problem for Calc.”

    Mostly, Excel does not need to scale to multiple cores for most operations, because it is not performing that many. Excel's internals were designed for a single core, very tight.

    A results cache cuts your processing way down. The important thing is how the result cache is done, so you don't end up with bogus results, or at least so that a full recalculate really is a full recalculate, removing bogus results.

    http://nabble.documentfoundation.org/GSOC-Using-cached-formula-results-during-ODS-import-td3993563.html

    This is what we have to watch carefully: how this caching of results is done, so we don't end up with another bugbear.

    Phenom, here is the shocker with xlsx in LibreOffice 3.7: when it reads the results cache, it is currently able to open the file in half the time Excel takes.

    The talk about multi-threading is kind of bogus. Most of the performance gains will have nothing to do with multi-threading; most will come from only recalculating what has truly changed. This reduces updates a lot.

    Multi-threading with spreadsheets is quite a minor speed boost.
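    A tiny sketch of the result-cache idea: recalculate a cell only when the values feeding it have changed (the cell names and formulas are made up):

```python
# Minimal result cache: a formula cell is recalculated only when the
# values feeding it have changed. Cell names and formulas are made up.
class Sheet:
    def __init__(self):
        self.values = {}    # plain input cells
        self.formulas = {}  # name -> (function, names of input cells)
        self.cache = {}     # name -> (cached result, snapshot of inputs)
        self.recalcs = 0    # count real recalculations, for demonstration

    def get(self, name):
        if name in self.values:
            return self.values[name]
        func, inputs = self.formulas[name]
        snapshot = tuple(self.get(i) for i in inputs)
        cached = self.cache.get(name)
        if cached is not None and cached[1] == snapshot:
            return cached[0]  # inputs unchanged: reuse the cached result
        self.recalcs += 1
        result = func(*snapshot)
        self.cache[name] = (result, snapshot)
        return result

s = Sheet()
s.values.update({"A1": 2, "A2": 3})
s.formulas["B1"] = (lambda a, b: a + b, ("A1", "A2"))
print(s.get("B1"), s.get("B1"), s.recalcs)  # 5 5 1
```

    The second read reuses the cached result; the hard part, as noted above, is making sure a damaged or stale cache entry can never be mistaken for a fresh one.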

  16. Phenom wrote, “Excel scales to multiple cores, while Calc does not. Which is a problem for Calc.”

    No. It’s no problem for Calc. On spreadsheets of reasonable size, Calc works very well. In education, I have never seen a spreadsheet that required more than one core. If you need multiple cores to do the job, you would be better off with compiled code and/or a cluster working on the problem. You should not be using a spreadsheet but a database.

  17. Phenom says:

    “Spreadsheets don’t scale.”
    Well, in this particular case they do. Excel scales to multiple cores, while Calc does not. Which is a problem for Calc.

    “The GUI slows people down too as soon as a spreadsheet grows to more than a few pages.”

    This reminds me of a story with a client of ours. We developed a tool which enabled them to define filters for their data, and which integrated pretty nicely into their existing software (shipping insurance, no floss stuff out there). The tool was rather capable of creating complex filters, but had a straightforward interface and worked nicely for simple cases too.

    After the tool was delivered, the application administrator received the following ticket from one user:
    “I create a filter on policy year equals 2009 and policy year equals 2010. I receive nothing. Why?”.

    For nitpickers: the tool had options like “any of the following”, but users simply tried to apply their everyday thoughts into the filters, hence these funny attempts.

    Pogson, it is not nice of you to want to take humanity back to the console, where they would have to type arcane queries to retrieve data, when normal people can’t tell the difference between “and” and “or” in SQL.

  18. oiaohm says:

    Yonah
    “oiaohm: “Unfortunately not recorded.”

    Ha ha ha. Indeed. Better luck next time. I win this round. ^_^”

    You lose because you have just proven you don’t read. 10 months comes from the certification process; those documents are pullable. 5 months comes from prepping production, and in those 5 months they really cannot change anything, due to regulations saying that if you modify before product release, your approved certification is voided. Yes, the audits and all that do consume 10 months. Even though it was not recorded, information to confirm what was said is available at other sources, because it is rules and regulations.

    Due to lower complexity, most ARM devices take 1 month to pass certification. The same goes for x86 systems that solder everything to the motherboard, so that no power-supply alterations from firmware are required. Most x86 machines don't fall into that camp.

    Some ARM devices have two firmwares: the core chipset firmware (this does power management) and the OS boot firmware (this is started by the chipset firmware), stored in two independent flash areas. Basically, ARM has a smarter design to get around the regulators, so power management and OS boot don't cross over each other. Yes, the chipset firmware can only start the OS boot-loader firmware, nothing else. x86 really needs to copy ARM. This would bring cases like Microsoft wanting to alter the boot process down from 10 months to 1 month. To do this requires two independent flash memories, so quite a major chipset design change and a revision of how x86 CPUs work to support two properly independent flash chips. Yes, ouch, not simple to fix.

    So on ARM the core chipset firmware has to go through the 10-month nightmare process; the OS loader firmware does not. Any alteration to the OS loader firmware on ARM can happen at any time in the lead-up to production. x86 saving on flash chips equals a major certification headache.

    Yonah
    http://davidwillium.hubpages.com/hub/Recovering-data-post-Excel-Formula-Corruption

    You are so smart: explain why these failures happen. How does Excel end up believing it doesn't need to reprocess a formula?

    The answer is the formula-result caching in the xls or xlsx file. If the revision number of the result is larger than that of the formula's inputs, the formula is not re-run, no matter what option you choose in Excel. Next, the revision number is signed; when it goes past its maximum it goes negative, and any formula past that point on the Excel sheet will not be processed either. This is how Excel cheats: every value has a last-edited revision. If the formula result's last edit is greater than that of the cells feeding into it, the result must be current in Excel's eyes, causing major bugs when that revision number gets damaged so that it is greater than the last write revision number. If it is negative, it is locked from change. The negative lock is a historic hangover from how Excel 95 locked cells; the problem is that any Excel newer than Excel 95 lacks the option to remove the negative cell lock. Yes, MS supporting history comes back and bites us here.

    The handling of cached formula results in Excel, which gives a major performance boost, is botched.

    Calc is now implementing a formula-results cache. Hopefully they get it right; getting it wrong is a complete nightmare.

    People repeatedly refer to it as formula corruption, when it is not the formulas that are corrupt in the Excel document at all. You can print the formulas; there is no problem with them. It is the formula-result caching in the Excel document that has gone corrupt. There is no direct option in Excel to drop the cached results once they are stuffed up. You can use LibreOffice and other third-party tools to dig your way out of this.

    When the revision number in the cached value has gone negative, this also blocks Excel from exporting to other formats, including CSV, because the modern exporters hit the negative number, have no idea what it is, and fail.

    Yes, the older an Excel document is, the closer it is to formula result caching corruption, because the revision numbers just keep counting forwards. With unlimited usage and changes, all Excel documents will eventually fail. Design flaw.

    This is the big problem: people use MS Office having no clue that the file formats they are using are unstable and need some features added to deal with the issues. Really, how hard is it to add a feature that resets all result counters on all formulas, clears their values and recalculates? That feature is missing from Excel. There are a few features that read like they will do it, but under testing they don't work.

  19. ze_jerkface wrote, “Is it silly when it works fine in Excel? “

    Look. A spreadsheet is basically an interpreted programme embedded in a data-structure. A compiled application will beat the crap out of one any day. The fact that folks think Excel is a good substitute for a proper application is part of M$’s lock-in.

    The day someone tells me to run Excel or else is the day I tell them to go to Hell. Spreadsheets don’t scale. The GUI slows people down too as soon as a spreadsheet grows to more than a few pages.

  20. Yonah says:

    Brah, still not telling us exactly what the “cheat” is. We aren’t dumb enough to believe it’s using cached data. Keep trying.

    oiaohm: “Unfortunately not recorded.”

    Ha ha ha. Indeed. Better luck next time. I win this round. ^_^

  21. ze_jerkface says:

    I have read of businesses using enormous spreadsheets. It is Computer Science 101 that those should be done using dedicated programming or database management systems. It’s silly to use a GUI for such huge spreadsheets.

    Is it silly when it works fine in Excel? Anyways, you seem to be unaware of standardized reports. A lot of companies/governments want data in standardized reports so they can run their own custom programs on them. You can't tell them that it's silly to request reports that contain hundreds of pages. You either provide it or they go somewhere else. Excel is so cheap compared to labor that you have to offer something better or it isn't even worth a corp's time to investigate the alternatives. LibreOffice also looks like a dated version of Office, and the typical Western employee will resent having to use an older version if they use something newer at home. It doesn't matter what you say to them, all they hear is "blah blah blah blah we're too cheap to buy Office 2010 blah blah so use this knock-off". Hell, I bet the typical Office user would pay out of pocket to not use Office. Anywhoo, MS is about to blow its own foot off with Windows 8, so why not take a break from M$ obe$e$$ing and garden for a while?

  22. oiaohm says:

    Yonah
    “It is doing a particular cheat that means your Excel document, if it gets damaged writing to or reading from disk, can display false figures and perform maths on those false figures.”
    On this page I already give instructions that show one part of the cache: that Excel is really not running the formulas every time.

    http://davidwillium.hubpages.com/hub/Recovering-data-post-Excel-Formula-Corruption

    How to cope with the results cache going south is written all over the place. Most who don't do data recovery don't know the cause. Minor damage to the files sends Excel nuts, to the point that at times even removing and re-entering the formula will not make Excel work properly; it keeps saying the result is some historic number. You zero the number and that one cell remains zero. The cause is a damaged document. ODS documents don't do this at this stage, and I hope they never do. You don't need specialist software to fix an ODS so it works right.

    Yonah, Excel basically has a major problem that currently appears at random in all versions up to and including 2010. I suspect 2013 still has it as well. I found it when I was generating .xls files using Java: Excel could be made to do very stupid things because it believed the file on disk was correct, so it was not recalculating.

    Funnily enough, one way to get Excel 2010 to attempt to fix itself is to have it export as ODS and then import, since in the process you lose the cache. But that is if it will export the damaged file at all. Yes, it can fail to export when damaged.

    2013 Excel might fix it with strict OOXML. Or at least provide something simple to get rid of the results cache so Excel can start clean again.

    The issue cannot be fixed using just Excel at this stage. Just to be fun, Microsoft's .xls file format documentation does not in fact document the formula results cache. There is some documentation for .xlsx files, which has helped a few third-party tools exist to fix it.

    I had the funny one where all my formula results from a generated .xls file were stuck on zero. 100+100=0? OK, something is wrong. I had malformed the formula results cache.

    The BIOS one is a case of face to face.
    Yonah, at the Australian Linux Conf 2012 there was a face-to-face talk with the heads of Intel, HP, Dell and IBM over the state of motherboard firmware disasters. That was after the biosbits.org talk. Unfortunately not recorded. The result was a statement of the turnaround time from production of a prototype board to final production for the BIOS. Yes, there is a hang time from prototype to production of 15 months. This hang time has improved; it was 18 months three years ago.

    You could call the 15 months 10 months of bureaucracy and audits. This is why you don't see rapid motherboard firmware updates. EFI is an attempt to move some of this stuff to the hard drive, where it can be updated without going through the process from hell. Because the hard drive is not forever welded to the motherboard, and what it loads does not access the SMI where the board's power management lives, whatever is on there does not have to pass the electrical safety tests. Yes, of course, Microsoft did not request an update that could go there.

    The fact that the firmware on an x86 motherboard messes with the board's electrical system causes some major certification headaches. The 15 months has a cause. If we returned to the old jumper configuration of the power system, they could update motherboard firmware without having to jump through 10+ months of hoops again. Yes, Plug and Play is a bastard at times. Then there is the durability testing as well.

    ARM hardware does not have this problem, since it does not have firmware changing power voltages. When you have firmware changing power voltages, it might produce electrical noise, which means you have to go back through certification to show your device is still suitable for usage after messing with the firmware.

    Yonah, basically the 15 months is mostly to get the device certified for market. Then there is a 10-month lag after that to certify any alteration to the motherboard firmware. The regulators are partly the bugger here, because until the device is in production you cannot apply for a firmware update without breaking the certification process.

    If I were willing to dig out the rules and regs on the certification process, I could pull out the sections on electronic power supply control firmware requirements. Even Atom chips that are soldered to the motherboard still have firmware controlling voltages for the RAM. So yes, oops. Yes, for plug-and-play RAM and CPUs it is a good idea to have firmware set the voltages so as not to blow them, but it makes motherboard firmware certification a complete prick. Maybe, who knows, after this last time motherboard makers might split the firmware in two: power management and boot. It would make sense, to avoid the certification process.

    Digging back any deeper is past where I bother reading.

  23. Yonah says:

    Oiaohm: “It is doing a particular cheat that means your Excel document, if it gets damaged writing to or reading from disk, can display false figures and perform maths on those false figures.”

    Ding, Ding, Ding! I like this one and shall make this one Citation Request #4. Oiaohm, you now have 4 pending citation requests that you haven’t answered. Follow the trail from here: http://mrpogson.com/2012/07/16/8-is-neck-deep-in-its-own-mess/#comment-92119

  24. oiaohm says:

    kozmcrae, please read my comment to Tar about multi-threading, number 29.

    Tar's criticism of Calc is valid. But you also have to take into account how Excel gets faster. It is doing a particular cheat that means your Excel document, if it gets damaged writing to or reading from disk, can display false figures and perform maths on those false figures.

    Just because something is faster does not make it better. This is true for Calc vs Excel. Depending on what your requirements are, Calc can be the better solution even if it is slower, because it is doing the maths correctly.

    With tight financial times ahead, you cannot afford a projection error due to an Excel bug.

    Tar most likely was not aware that Excel caches its formula results to disk and has tracking to work out what has and has not been changed, recalculating only what Excel thinks has changed, whereas Calc by default was recalculating the lot every time. So Calc is not just lacking multi-threading; it is producing correct results because it redoes everything every time, so a one-off error does not get baked into the file.
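The contrast can be sketched in a few lines. This is a simplification under the comment's own description, with invented names, not code from either suite:

```python
# Two recalculation strategies, as contrasted above.  Formulas are stored
# as (function, dependency list); evaluation order is assumed topological.

def full_recalc(formulas, values):
    """Calc's historical approach: recompute every formula, every time."""
    for name, (fn, deps) in formulas.items():
        values[name] = fn(*(values[d] for d in deps))

def dirty_recalc(formulas, values, dirty):
    """Excel-style approach: recompute only formulas whose inputs changed."""
    for name, (fn, deps) in formulas.items():
        if any(d in dirty for d in deps):
            values[name] = fn(*(values[d] for d in deps))
            dirty.add(name)   # the change propagates downstream
        # else: trust the cached value -- fast, but wrong forever if that
        # cached value was ever damaged, since nothing recomputes it.

values = {"A1": 2, "B1": 3, "C1": None, "D1": None}
formulas = {
    "C1": (lambda a, b: a + b, ["A1", "B1"]),
    "D1": (lambda c: c * 10, ["C1"]),
}
full_recalc(formulas, values)
print(values["D1"])    # 50

values["A1"] = 5       # user edits one cell...
dirty_recalc(formulas, values, dirty={"A1"})
print(values["D1"])    # 80 -- only the affected chain was re-run
```

The `else` branch is where the speed comes from, and also where a damaged cache survives: nothing downstream of an undamaged-looking cell is ever checked again.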

    Yes, the tracking of what Excel thinks has changed can be screwed up by bad RAM as well. There are many ways Excel can go south on you.

    I hope that in the process of making Calc faster they don't ruin the correct results it provides.

    The reason Excel is so abused in business is a sad one. Most people use Excel as a database instead of using a database, because MS Office Standard did not come with Access, and Excel was the next best thing to a database in a lot of people's eyes.

    Yes, it is a very hard call which one you should be using. For Spain's current mess, Calc might be the best solution. Excel, at high cost and with a high risk of bugs in calculation, may just be too much of a risk for the hard times they have ahead.

    The performance difference between Excel and Calc is shrinking at a decent pace. The call is a lot closer than many would think: speed vs correctness.

  25. Tar says:

    kozmcrae, I don’t use Microsoft. I am an Apple fan and only have Apple and Linux computers.

    My comment did not derail anything; it is a valid criticism of Calc that oiaohm did confirm.

  26. kozmcrae says:

    Chris Weig wrote:

    “For God’s sake, kozmcrae,…”

    So then it was really Tar’s number 13 comment that Robert was referring to in comment 19. Tar was responsible for derailing the thread, not Robert.

    This is just like I predicted a couple of weeks back. One Cult of Microsoft member disappears (Viktor) and is replaced by another one, Tar. Nothing has changed. You all play dumb. It’s 1995, Linux has less than 1% of the desktop and is the OS of hobbyists and basement dwellers. Get a life people.

  27. oiaohm says:

    DrLoser
    “Thing is, the people working on Excel have spent twenty years or more on making sure that O(n log n) works. They didn’t just do multi-threading for the sheer joy of it.”

    That is the problem: O(n log n) is correct. But Excel was never designed for resource sharing in a terminal service/thin client solution. Instead it was designed on the idea that the user was sitting in front of a dedicated machine they don't share with anyone else, and that the machine would not be massive.

    So what Excel does multi-threaded is not designed for huge systems. So LibreOffice will need a different solution to Excel's, because some places it could be running could be truly huge, with 1000+ cores.

    DrLoser
    “Libre Calc will possibly get it right, within the odd rounding error or two.”
    To be scary, you just got your statement wrong. Excel has rounding errors. These rounding errors are replications of historic Excel errors, and some of them are downright funny, like Excel thinking 1900 is a leap year. For a person like me who will at times be looking at historic financials, Excel is dangerous: it gets the number of days in a year wrong, among other things.

    One of the reasons to use LibreOffice Calc is that it does not contain a lot of Excel's errors. Excel maintained errors for backwards compatibility in current-day documents. To be correct, it was not required to do this: .xls and .xlsx contain version numbers, so the defective features could have been deprecated and errors shown to end users when opening a document affected by them. Microsoft with Excel chose the path of sweeping these bugs under the carpet. Some people claim LibreOffice Calc errs when it does not match Excel, but every time I have seen someone claim this, it turns out they were using some buggy part of Excel.

    So if the Spanish want spreadsheets without application bugs in the maths, they will not be using Excel. OK, they might not be using LibreOffice either. The reality of the mess is that Excel is buggy.

  28. Chris Weig says:

    For God’s sake, kozmcrae, missing your prefrontal cortex again? We got here by way of post #19, where Mr. Pogson, trying to rebut the statement that Calc’s calculation engine was slower than Excel’s, claimed that his real-world scenarios were representative of all users. He then dismissed others’ real-world use of Excel by recommending the move to a real database.

  29. kozmcrae says:

    Chris Weig wrote:

    “So, once again: you can write anything about Microsoft’s software. But if that doesn’t come from experience, if you can’t back it up, then it’s worthless.”

    How did we get from “Another CIty Switching to LibreOffice” to Robert Pogson’s personal expertise in spreadsheets? Oh, that’s right, drift away from bad news for Microsoft.

    Microsoft is losing its grip on the desktop office application market, Chris. Why don’t you spit into the wind? You might get better results than attacking those who bring you bad news about Microsoft.

  30. Chris Weig says:

    In that case no one could say anything about M$’s software because they are the only ones that know all about it. That would be a sad state of affairs. Fortunately I can say what I want.

    Fortunately you deliberately misunderstood what I wrote. Always a pleasure.

    So, once again: you can write anything about Microsoft’s software. But if that doesn’t come from experience, if you can’t back it up, then it’s worthless. You make a general claim that Calc is better than Excel. How can you know that, if you only have used spreadsheets in the limited capacity you have described?

  31. Dr Loser wrote “pivot table” as if it meant anything. I don’t use that term, but one of my 14 pages was a summary of the entire project for the suits who don’t like details. I don’t consider that a particular feature. It falls naturally from being able to refer to other cells and sheets.

  32. Chris Weig, wanting to censor the web, wrote, “you can’t make qualified statements about certain aspects of software, if you don’t use said software in such a context where said aspects are used/necessary”.

    In that case no one could say anything about M$’s software because they are the only ones that know all about it. That would be a sad state of affairs. Fortunately I can say what I want.

  33. Chris Weig says:

    Mr. Pogson, you can’t make qualified statements about certain aspects of software if you don’t use said software in a context where said aspects are used/necessary. If your spreadsheets are small rather than large, how can you make general statements about LibreOffice Calc being better than Excel under all circumstances? You can’t. But nonetheless you do.

  34. DrLoser says:

    What more can one do with a spreadsheet besides that and the generation of report-cards, analysis of data and creation of graphs which I do all the time?

    Nothing at all, Robert. Nothing at all. (Actually, there are other things, like importing and exporting obscure formats, and pivot tables, but we’ll leave them be for now.)

    Thing is, the people working on Excel have spent twenty years or more on making sure that O(n log n) works. They didn’t just do multi-threading for the sheer joy of it.

    Your pitiful little school spreadsheet might as well be done on a Texas Instruments calculator … but, if you insist, Libre Calc will possibly get it right, within the odd rounding error or two.

    Excel is misused, because at the top end you should probably be using an RDBMS. But on the other hand, Calc is hardly used at all.

    Chuckle. I wonder why not?

  35. Chris Weig wrote, “without a doubt that you haven’t used spreadsheets in any manner which would make you qualified”.

    I use spreadsheets more than most of the teachers in my schools do. There’s no advantage to using Excel in schools even if Excel were faster, which point I don’t concede.

    The 14 pager was the entire shopping list and budget of work for the school at Easterville. It was used to optimize the cost of the project and then to print the purchase-orders. It was a real-world application of spreadsheets dealing with a huge expenditure for the school. It was reviewed by the Board of Education, the principal and the Director. Sheets were sent to suppliers. What more can one do with a spreadsheet besides that and the generation of report-cards, analysis of data and creation of graphs which I do all the time?

  36. oiaohm says:

    Chris Weig, every time you quote history you miss the key events. The RTF agreements around 1995 were key events for Office. So was MS's presence with almost every maker of BASIC-run computers. A lot of those makers of machines running BASIC are OEM makers today.

    Without BASIC, MS would have been a lot smaller fish. Without BASIC, MS would not have had the money to buy 86-DOS, so MS-DOS would never have happened.

    Yes, even back when MS BASIC and MS-DOS started, under anti-trust law it was not valid to charge for every item someone produced whether or not it used your copyrighted material. Yet Microsoft did exactly that.

    Microsoft's anti-trust actions start before the release of MS-DOS, in the time of MS BASIC.

    This effectively killed competition and got MS the funding to buy DOS and to make Windows. How does this kill competition? Simple. If I am making my own OS and I have to pay for your OS on every machine I ship mine on, where is the profit for me in making my own OS? That's right: there is none. I end up priced out of the market, so I stop making my OS, and MS has less competition. Yes, competition extermination was well under way before the release of MS-DOS.

    Computer history has particular key events, Chris Weig. Those key events are interlinked and got MS where they are today. Most of those key events were not allowed by anti-trust law at the time they were done, let alone under today's stricter anti-trust laws.

    Microsoft is a house of cards, where each card is an anti-trust action holding up their market share. The problem is that key cards are starting to disappear.

    MS's monopolies have been built against the rules. So there will have to be a correction.

    Chris Weig, Microsoft reacting badly to ODF with the rushed job of OOXML was down to the fact that ODF threatened to break their complete market hold the same way MS used RTF to destroy the other office suites. Yes, MS sold RTF to governments as a safe and stable interchange format. From 1993 to about 1998-1999, RTF was a fairly de facto standard, before .doc took over.

    Chris Weig, you pick a point in time and say that at that point MS was not a monopoly, so they could not have done anything wrong. This is in fact incorrect.

    The speed at which MS went from nothing to a monopoly tells you that something was wrong. When you investigate, you find out what it was. Even today, some things MS does I call questionable. Like: if you produce a token Windows mobile phone, we cut the patent fees you have to pay on everything else.

    Exactly why does what I produce have anything to do with how much a patent is worth? This, to me, stinks of anti-competitive use of patents.

    Heck, IE was effectively stolen work. MS really did not care what they did to get ahead. Only in recent years has MS started caring about what the law says is allowed.

  37. oiaohm says:

    Tar
    “Calc not properly multithreaded. My full reply is waiting in moderating queue (not trash can I hope) and should appear soon….”

    I know it is not. But as of LibreOffice 3.5, Calc's performance in particular areas is faster than Excel 2007, 2010 and 2013. 3.6 and 3.7 expand the areas that are faster.

    The interesting point is that Calc is not yet properly multi-threaded and it is already defeating Excel, sometimes by quite a margin, on common tasks. Also, multi-threading is a double-edged sword.

    3.7, the current development branch, can open OOXML Excel files faster than MS Excel 2007, 2010 and the 2013 beta.

    This is an interesting reality. Not all operations in LibreOffice are going to be slower than Excel; some will be faster. The question becomes how often the slow operations of LibreOffice will be required.

    Multi-threading is a double-edged sword: processing has to be synced. For multiple users on a terminal server, being less thoroughly threaded can work out better for load balancing between users, with less CPU time lost in load-balancing maths. Remember, each thread is something extra the system has to track, and that tracking adds up. Then you have the core-to-core syncing.

    So really we want two office suites: one that is very fast with limited threading, and one that threads out wherever it is able. Limited threading for the terminal server solutions. On a terminal server with 1000+ cores, threading might not be a good idea; the sync lag might be worse than if you did not thread at all.
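The overhead being described is easy to see in miniature. A hedged sketch, not LibreOffice code: each chunk must be created, queued, executed and re-joined, and for a workload this small the plain serial sum is typically faster than the threaded one.

```python
# Fan-out/sync structure of a threaded computation, in miniature.
# The scheduling, hand-off and join of every chunk is the per-thread
# "tracking" cost that adds up, as described above.

from concurrent.futures import ThreadPoolExecutor

def threaded_sum(data, workers=4):
    chunk = max(1, len(data) // workers)
    pieces = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # pool.map fans the chunks out, then syncs on every partial result.
        return sum(pool.map(sum, pieces))

data = list(range(10_000))
assert threaded_sum(data) == sum(data)   # same answer, extra machinery
```

Whether the fan-out pays off depends entirely on how much work each chunk carries relative to the bookkeeping, which is the trade-off the comment is pointing at.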

    Tar, so there is a question about the planned Spanish setup and how big a problem this is going to be. In fact, adding multi-threading in the wrong setup may be worse than not multi-threading at all.

    Tar, we are seeing sections of the improved threading. The art is working out exactly what the right level to thread to is.

    Funnily enough, one of the biggest reasons for LibreOffice being slower than Excel is not multi-threading; it is the fact that Excel caches past results in .xls and .xlsx files, and LibreOffice does not use that cache.

    Yes, you can make Excel open a document and display A1=1, B1=1, C1=SUM(A1:B1) with something like 1000 in C1, then have some maths operation performed on C1, so displaying a completely stupid result.

    Only recently has LibreOffice Calc added this. So in the past, every time you opened an ODS it was basically performing a full recalculation.

    Also, Excel does not perform a full recalculation. This becomes very clear if you intentionally damage an Excel document. It recalculates only the formulas where it believes the result has possibly changed. So where you damage a cached result from 1 to 1000, it leaves it wrong. So Excel can tell you 1+1=1000 as true. Yes, optimisations.

    So there are quite major performance boosts Calc can pick up without going multi-threaded, just by copying some of Excel's cheats.

    http://www.tech-recipes.com/rx/2980/excel_2007_how_to_clear_values_keep_formulas/ Do this and watch your Excel recalculation time expand quite massively, Tar.

    So really, LibreOffice Calc has been safer, because when it does a recalculation it really does one.

    A lot of people using Excel forget to clear the values of formulas to be sure Excel is telling them the truth. Some very expensive company stuff-ups happen because of that.

  38. I added a recent comments widget which might show the feeding frenzies…

  39. Chris Weig says:

    “Chris Weig also history idiot as normal.”

    Says the village idiot from the Australian outback.

  40. Chris Weig says:

    “The largest spreadsheet I have ever created was 14 sheets with perhaps 50 lines per sheet. The little bit of adding and multiplying involved took no time at all. The largest spreadsheet I have ever used on that other OS with M$’s stuff was just a tiny bit longer and not perceptibly faster.”

    And that proves without a doubt that you haven’t used spreadsheets in any manner which would make you qualified to talk about the issue of Calc being worse than Excel.

  41. Tar wrote, “dedicated programming or database management systems.

    This is too expensive for a city. Software always cheaper than labor.”

    Nonsense. MySQL or PostgreSQL costs $0. The rest is a similar effort to the task of creating/maintaining a spreadsheet. A spreadsheet is software mixed with data, an inefficient system: e.g. backing up the data means backing up the software, trying a new version of the software with the same/different data is harder, and it's more useless lock-in.

  42. WP does not seem to have that option. I notice that oldest comment is listed at the top of the page even though I have settings of “newest”. I wonder what happens if I change to “oldest”? Achhh! No difference. I am being ignored. Clearly WP has a few bugs…

  43. Tar says:

    Scrolling is fun on touch displays. Seen Minority Report? 🙂

  44. Tar says:

    @oiaohm

    Calc is not properly multithreaded. My full reply is waiting in the moderation queue (not the trash can, I hope) and should appear soon…

  45. Tar says:

    dedicated programming or database management systems.

    This is too expensive for a city. Software is always cheaper than labor.

    It’s silly to use a GUI for such huge spreadsheets

    How else can users edit them? Spreadsheets are a democratic technology. Users can do so much with them, even import data from the web or a database.

    Calc is slower because the recalculating code is single-threaded, e.g. a single database lookup in a cell will grind Calc to a halt. Excel on Macintosh has the same problem; not so Excel on Windows.

  46. Tar says:

    Instead of just:

    Comments 17

    Like this:

    Comments 17, Last comment 21 July at 12:30pm

    Only a little suggestion 🙂

  47. Tar wrote, “Has the recalculation engine been rewritten to allow multithreading? Calc recalculation engine is tonnes slower than Excel.

    It will cost Spanish people much more in employee hours if this bug has not been fixed before deployment. ODF files also load too slowly.”

    Tar has no basis for that claim even if he is correct in his premise that Calc is “slower”. Tar does not know the kinds and sizes of spreadsheets in use. The largest spreadsheet I have ever created was 14 sheets with perhaps 50 lines per sheet. The little bit of adding and multiplying involved took no time at all. The largest spreadsheet I have ever used on that other OS with M$’s stuff was just a tiny bit longer and not perceptibly faster. So, I doubt the vast majority of Spain’s users will consider speed or the lack of it a consideration. I have only ever heard of formatting as a concern from my users.

    I have read of businesses using enormous spreadsheets. It is Computer Science 101 that those should be done using dedicated programming or database management systems. It’s silly to use a GUI for such huge spreadsheets. Users have to search to find things anyway, so there’s no advantage to using a GUI for those. Printing such spreadsheets would be a ridiculous waste of paper in many cases since few will flip pages to find stuff. The choices of sorting order grow huge in number as columns and rows increase so a printed copy will not be optimal for most tasks. The desktop paradigm of a paper document represented by a file breaks down for large spreadsheets. Scrolling is neither fun nor effective use of time.

  48. Tar wrote, “Then infrequent posters can see if they are replying to very old posts or not.”

    There are no very old posts on this blog. I only started it in 2007. 😉

    I see the date of the post at the top of the page for each post, and at http://mrpogson.com/ . It says “Published by Robert Pogson” + date. Then each comment has its posting date and time. I see the same whether or not I am logged in. Not sure what Tar wants.

  49. Tar says:

    Another post deleted; please restore it. Thank you.

  50. Tar says:

    Has anybody studied the correlation between Microsoft dominance and Reagan-Thatcher ideology before?

    Microsoft dominance is maintained through the copyright monopoly granted by the state.

    Hardware manufacturers are laissez-faire: harsh competition and low prices.

    It is state granted monopolies of copyright and patents that give large corps their power.

    So you are exactly wrong in your conclusion.

  51. Tar says:

    A little suggestion: please put the date of the last comment on the front-page post next to the number of comments.

    Then infrequent posters can see if they are replying to very old posts or not. Thank you.

  52. oiaohm says:

    Tar
    http://dbank-libreoffice.blogspot.com.au/
    The load issue is being addressed. A quite major alteration to ODS load times is coming in 3.7; that will be before the Spanish deployment.

    Since Libreoffice 3.5 http://cor4office.blogspot.com.au/2011/12/libreoffice-35-faster-on-windows-than.html

    Yes, interesting, right? In some areas LibreOffice Calc is faster than Excel. Past 3.5 it gets interesting.

    Daniel Bankston is basically one of the coders dedicated to dealing with some of those performance issues. Others I know of to watch are Markus Mohrhard and Kohei Yoshida. All of them, with their alterations, are having quite major effects on Calc performance for the better.

  53. Tar says:

    Has the recalculation engine been rewritten to allow multithreading? Calc recalculation engine is tonnes slower than Excel.

    It will cost Spanish people much more in employee hours if this bug has not been fixed before deployment. ODF files also load too slowly.

  54. oiaohm says:

    Ivan, MS Office 2013 has known interoperability problems with MS Office 2010 and MS Office 2007.

    Same reason why MS Office 2010 has trouble at times with 2007 documents.

    Basically OOXML has been one huge mess.

    ODF has been approved alongside PDF as an archive format for national archives.

    It's not just LibreOffice and OpenOffice that have interoperability problems with MS OOXML. Every office suite using OOXML has big interoperability problems.

    ODF using office suites have quite low interoperability problems.

    Chris Weig is also a history idiot, as usual. MS Office's dominance traces to RTF, in fact. In 1995 MS got all office suites to agree to use a common format called RTF. RTF being an MS-invented format, Microsoft extended it, breaking compatibility with its competitors and allowing it to take domination of the market. This is a documented anti-trust case; Corel won it against Microsoft a long time ago. Please don't try rewriting history as Microsoft somehow making a better product. Microsoft pulled an underhanded stunt to get where they are today in the office market. Microsoft even got governments in many countries to only accept documents in the standard format of RTF.

    Chris Weig
    “When Microsoft stepped onto the scene with MS-DOS, they were just one among many companies doing the same thing.” Again you have missed history.

    Microsoft entered the OS game with BASIC, not MS-DOS; by the time Microsoft entered with MS-DOS they were already a monopoly on BASIC-based platforms. It is easy to forget that MS-DOS was Microsoft's second OS platform. A lot of the Microsoft BASIC sellers moved to MS-DOS, so MS-DOS did not start from a standing start. Also, on anti-trust, MS was done for a lot of deals in breach, where companies had to pay for the cost of MS-DOS on a machine whether or not the machine shipped with MS-DOS.

    History has Microsoft doing one anti-trust action after another to get market share. The damage these anti-trust actions did was massive.

  55. Ivan says:

    “…thereby acknowledging that folks have a stack of archives they want to convert to ODF.”

    Wouldn’t that really signify that they’re not basement-dwelling cretins masturbating over their freedom, and that they have acknowledged that their office suite has interoperability problems?

  56. Chris Weig says:

    @Mats Hagglund:

    Japan didn’t need a free market? Tell me then: where did they export all the goods that made their “economic miracle” possible?

    Ever heard of “soziale Marktwirtschaft” (social market economy)? Its implementations in Austria and Germany were based around — yes! — a free market as the core.

    You have your logic totally reversed. You imply that free markets are bad because of the special case of Microsoft having achieved a natural monopoly in the OS market. Utter BS. When Microsoft stepped onto the scene with MS-DOS, they were just one among many companies doing the same thing. No monopoly. Lotus 1-2-3 and WordPerfect were the dominant applications on MS-DOS. No monopoly. Microsoft building better products than its competitors, and those competitors blowing it: that is the reason for Microsoft’s dominance. Of course, after Windows had taken off, the train was hard to stop, as businesses boarded it and saw Microsoft’s dominance as a very convenient thing.

    If, as Robert Pogson claims, the free market (with small cheap computers and smart thingies) will oust Microsoft from its position of power, then you can’t, in contradiction to that, claim that the free market inevitably produces convergence towards a monopoly in a specific sector.

  57. Mats Hagglund says:

    After the Great Depression and WW2 there was a period when most western societies saw the state as the main coordinator of economic growth. We can call it the Keynesian era, whether you like it or not. “Free markets” didn’t give Germany, Japan, Austria, Canada, Sweden and Finland their good, stable, relatively strong economic growth. These and most other western nations achieved positive growth via “socialism”. Even the USA has a kind of military Keynesian socialism. All in all, nations had a national vision of how to become welfare states.

    If the Keynesian political mainstream still held, western governments would surely see FLOSS and Linux as the natural way to solve IT problems. Instead we now have a very Reagan-Thatcher way of thinking about the world. That’s why our politicians believe in the theory of “no free lunch” and trust that markets will solve our problems. In practice, “markets” means the rather old-fashioned, unwieldy, inefficient Microsoft ecosystem with closed-source/proprietary software.

    Has anybody studied the correlation between Microsoft dominance and Reagan-Thatcher ideology?

  58. Chris wrote, ” They’ve been hired to improve compatibility with OOXML, thereby acknowledging that MS Office is still king.”

    …thereby acknowledging that folks have a stack of archives they want to convert to ODF.

  59. oiaohm says:

    Chris Weig, MS Office 2013 is also the first MS Office to use proper (strict) OOXML. So the timing of adding OOXML support to LibreOffice lines up with when Microsoft comes into alignment with its own standard.

    There will be issues with 2013 opening some 2010 and 2007 documents. So yes, it pays to fix up LibreOffice’s OOXML support now, to open documents that would otherwise break.

  60. oiaohm says:

    Chris Weig
    “LibreOffice … right. Suse and Lanedo will receive 140000 Euros from some cities who use LO. They’ve been hired to improve compatibility with OOXML, thereby acknowleding that MS Office is still king.”

    Funny, no, it does not prove that. By your logic, MS Office 2013 supporting ODF 1.2 would be acknowledging LibreOffice as king. Sorry, Chris Weig, your logic is completely bogus. MS Office and LibreOffice are each supporting the other’s formats.

    I see this more as paying to be rid of the need for MS Office. If MS Office were king in their eyes, they would use it rather than pay to fix up LibreOffice.

    After the release of MS Office 2013, the file differences between LibreOffice and MS Office will shrink.

    Yes, there is a reason why Microsoft will not allow MS Office 2013 to run on Vista or XP: that would speed up Microsoft’s collapse.

    Also, the 140,000 Euros from those cities to improve LibreOffice is still far cheaper than paying for Microsoft licensing.

  61. Chris Weig says:

    “Yes, but what’s best in life, Robert?”

    “To crush Linux users, to see them driven before you, and to hear the lamentations of their mothers.”

    “This is good, this is good.”

    LibreOffice … right. Suse and Lanedo will receive 140,000 Euros from some cities who use LO. They’ve been hired to improve compatibility with OOXML, thereby acknowledging that MS Office is still king.

  62. kozmcrae says:

    Robert Pogson wrote:

    “Why is my government not doing the same?”

    That’s a good question, not just for your government but for other municipalities, school districts and governments as well. I think the answer has not been invented yet. When it is, it will be called something like “Microsoft’s disease”.

  63. oiaohm says:

    Phenom
    “probably because your country is not suffering such an economic crisis?”
    Correct, that is part of it, but it’s not the whole story. Even inside Canada, some areas have higher Linux usage than others.

  64. oiaohm says:

    Robert Pogson, history answers your question:
    “Why is my government not doing the same?”

    Companies did not change from CP/M and other OSes to Microsoft instantly. The same goes for moving from Unix systems to Linux.

    Each change that sticks and does not revert means less market share for Microsoft to sell into, and more software development on FOSS. The first changes were the hardest; each subsequent change gets progressively easier.

    Patience Is A Virtue.

  65. Phenom says:

    Eh, probably because your country is not suffering such an economic crisis? Spain is not only neck-deep in economic trouble, but also merely goes through the motions of cost-saving policies.

Leave a Reply