Examining and Modifying the Code

A few trolls here have derided the utility of being able to examine and to modify code. I am not a C-programmer, but I am not a complete idiot and can and do examine the source code. I have been using a neat little programme, gebc (GNU Exterior Ballistics Computer), but found a couple of rough edges. One of them I could fix…
“On my system, the column heading "Wind Drift" is cut at both ends to "Vind Drif" in the Range Table. I edited the file RangeWindow.cpp to change "Wind Drift" to "Windage" in this (123) line: tbl->add("@b@cRange\t@b@cDrop\t@b@cDrop\t@b@cVelocity\t@b@cEnergy\t@b@cWindage \t@b@cWindage\t@b@cTime",0); There still appear to be a couple of pixels shaved off the "W" but it is more readable this way. The programme gives me accurate results. I would like to see it improved with resizable windows, fonts, etc. It is very usable as is but my eyes are old…”

See GNU Exterior Ballistics Computer | Reviews for GNU Exterior Ballistics Computer at SourceForge.net.

The programme is easily built with
./configure
make
make install

in the directory gebc-1.07 for the version I have. The download came as a .tar.gz file.

The idea that being able to examine or to modify the source code of software is somehow a disadvantage or useless is silly. Clearly it is an advantage and one that ensures bugs can be squashed more rapidly and by a more diverse group of users.
[Screenshot: gebc-1.07 Range Table]

About Robert Pogson

I am a retired teacher in Canada. I taught in the subject areas where I have worked for almost forty years: maths, physics, chemistry and computers. I love hunting, fishing, picking berries and mushrooms, too.
This entry was posted in technology. Bookmark the permalink.

29 Responses to Examining and Modifying the Code

  1. oiaohm says:

    Der Balrog, look up fsnotify. That was not rejected on technical merit, and it is in the Linux kernel as I described; I could not implement it myself. There are also a few things on the cgroups mailing list that are implemented as I described.

    I have not seen one from you at all, Der Balrog, failed or successful. I have had just as many successes as failures.

  2. Der Balrog says:

    “Der Balrog Funny. Apparently you have not noticed from time to time I do comment on Linux kernel faults.”

    Oh, I have noticed. I have noticed! Do you know what else I’ve noticed? That the kernel developers treat you as the joke you are. Show me a comment by you on the LKML which hasn’t been rejected for utter lack of technical merit.

  3. oiaohm says:

    Der Balrog, there is also a history of you talking just smack. If you expect us to do this, where are your bug fixes or suggestions?

  4. oiaohm says:

    Der Balrog, funny. Apparently you have not noticed that from time to time I do comment on Linux kernel faults.

    Just at the moment I do have an issue I can see but cannot come up with anywhere near a usable solution for.

    Basically, how to improve the kmem cgroup controller to be able to take a limit value that is not kernel-build dependent. It's a hard problem to work out a neat solution to.

  5. Der Balrog says:

    Hey, Pogson, which parts of the kernel have you examined so far? The kernel developers need your help. Go find bugs instead of talking smack. The same goes for Mr. Peter Dolding.

  6. oiaohm says:

    Please don’t say UPSs as a solution. The cause of most of these big nasty kill-everything events in server rooms is UPS failures. They are the source of the power spike.

    Gosh-darn inverters can malfunction and do major ouch, like intermittently putting 500+ V AC into a 240 V power supply, wearing down the fuse in the power supply and the motherboard regulators at times.

    When I do put in a replacement, it's both the UPS and the server I replace, until I can test what I have ripped out.

    dw, so yes, your defence against blackouts is what brings you undone.

  7. oiaohm says:

    bw, the server for Windows for Workgroups 3.11 is NT 3.5.

    The server for Windows 95-98 is NT 3.51 and NT 4.0. NT was not the large-volume product you would have been looking at.

    Yes, NT does have workstation modes, but the most common usage was as a server. Sun Unix servers were more expensive than Windows NT server setups.

    OS/2 was not popular where I was. It was Windows NT, Netware and BSD/Linux stuff in the server rooms. Those outnumbered everything else 1993-2000.

    Of all 4 of these OSs, Windows NT, Netware, BSD and Linux, not one had product activation that tied to the hardware. This is why Unix was not exactly as popular where I was: spares were not always dependable.

    bw, even with the Windows 3.11 and 9x lines you could rip the hard drive out of one machine and put it in the next one with only minor alterations to make it work again.

    bw, with Windows XP/2000 you could get volume-licence product keys. These volume-licence product keys also disable the hardware-change check. Other than the driver hell that happened in Windows 2000, on those the hard drive could be ripped out of one machine and placed in the next without issue.

    Windows 2003 on the server side is where hardware locking starts with no key to disable it; Vista, on the desktop.

    So I cannot have a drive with Vista or 2003 sitting on the shelf ready to be inserted into any machine in case of failure.

    A KMS server, you might say. Now this is a modern evil I run into, dw. You know that Key Management Servers only activate clients for 180 days max before they have to reconnect to the server. Next, clients don’t display a message when they have not been able to activate.

    Finally, here is the kicker of kickers. For a KMS server to activate anything it must be able to see 25 KMS client computers. Someone alters the roster around, and now the KMS server is not seeing 25 KMS computers on at the same time. Three months later you find out, when people cannot log into their machines because they are not activated.

    dw, how is this better? You use MAKs? Those are one-shot keys; you have to enter a new key every time the hardware changes enough that Windows does not believe it is the same hardware. A new hard drive and video card can be enough. So volume licensing now sucks and it's land-mined.

    dw, with all the trouble 9x had, some days I would happily go back to it, just to not have to deal with product-activation failures.

    The counts with KMS are the following:

    –25 for any Windows Client Operating System–
    So a KMS unable to see 25 clients will stop activating them.
    –5 for any Windows Server Operating System–
    So a KMS unable to see 5 servers will stop activating them.
    –5 for any Microsoft Office Application/Suite–
    The Office suite is about the only one for which KMS is kind of practical. Yes, it is still possible to end up with random machines in the office with MS Office throwing activation failures, due to network issues talking to the KMS, blocking people from getting their work done.

    bw, now here is the killer: you are only allowed so many KMS server activations. So Murphy has a field day and all your KMS servers fail. Now you are stuck on the phone for a few hours to get Microsoft to fix it.

    This is why I run Linux on servers where I can. When the shit hits the fan, as long as the hard drives have not failed (yes, I install them in RAID pairs), I can quote 2 hours as repair time, even if that means taking a desktop machine and turning it into a server just for the short time until I can get parts.

    I only have to go to my backup images if both hard drives died. That is truly, surprisingly rare. I am more likely to lose a motherboard, CPU or power supply.

    The basic reality with NT 4.0 and before, Linux, BSD, Netware… even Unix, is that as long as you have replacement hardware, the broken server can be pulled out of operation and different server hardware made active using the same hard drives.

    This is handy: it is 1 am, something has failed; do you think I want to be doing hardware diagnostics, or would I just prefer to swap the lot?

    The answer every time is swap the lot. When I am awake and unlikely to screw up is the time to find out what is wrong with the hardware.

    In daylight you have 40 other employees breathing down your neck wanting to know when the server will be back on-line so they can get their jobs done. What number do you want to say: 2 hours or 8 hours?

    Please note you must say larger than what is required, like Scotty in Star Trek. If you say less and don’t meet it, they will kill you. And 8 hours is about the max you can say.

    I can do a Linux hard-drive swap, using a spare image drive from the shelf, in 1 hour. That gives me 1 hour to recover data from backups if I have to. In most cases it is back online in 1 hour.

    Even when running N+1 setups, you still need to get the +1 back as soon as you are able. I have had cascading power-supply failures: yes, a power spike had damaged the power supplies and they nicely fell over at different times.

    Robert Pogson, with Windows having a backup image is not enough. You need a full backup machine, activated and running, to be able to repair quickly.

    The problem is that running machines are exposed to power problems that lead to motherboard and power-supply failures. The big problem with this kind of failure is that it is normally simultaneous: basically everything that was on at the time of the issue is going to stop.

    dw, you have not been in the server room to know how Windows drives us up the wall.

    With an OS/2 server you could rip its hard drive out and put it in the next machine as well. Just, when it got very old, it became impossible to replace its hardware with anything other than virtual machines.

    I have brought a broken OS/2 network back online using qemu and bochs on Linux until a replacement could be made. Of course, they had backups; that network of 14 machines was down for a day. A week down means they either did not have backups or did not have skilled IT officers. Partial functionality should have been restored in 24 hours.

    bw, just think of that golf-course issue, except the one being complained to is the poor IT staff with the CEO on their neck. Then you most likely understand why I hate Windows. Anything that makes my repair time longer, I hate.

  8. oiaohm wrote, “1 hour downtime to swap a motherboard turn into 8+ to do a full reinstall and data recovery.”

    Yes. If one wants to run that other OS with much less trouble, one needs a backup for each machine, whereas GNU/Linux can often use one image for a raft of different kinds of machines. I saw that firsthand in a lab where students could swap mice and freeze that other OS. I had to tie down the mice and still they froze. So, I converted to GNU/Linux and LTSP could run the whole lab reliably. I even dragged 10-year-old machines out of storage and added them to the LAN and they just worked.

  9. bw says:

    You will have to supply any details, since that was so long ago that my memory fails. Even so, I seem to remember that Microsoft was not very important in terms of servers until around ten or so years ago when Windows 2000 began to become popular both for server use and for desktop use. As I remember it, we had Sun Unix servers and Windows 98 clients before that time when a big switch was made.

    The NT stuff was never very popular, at least where I worked at the time.

    That has nothing to do with whether people had any alternative to Windows 95 in 1994 though, which is the topic that no one seems to be able to stay on. I sort of remember that OS/2 was used by a lot of companies and was being pushed by IBM in that era, although I never used it personally. I know that the pro shop at my golf course had an OS/2-based network that ran the cash registers and the tee-time scheduler and even the handicap calculation stuff. I remember because it broke one day and they were almost out of business for a week before it was finally replaced. We had to do handicap updates with pen and paper and that created a lot of arguments.

  10. oiaohm says:

    bw, can you please explain to me why a server death due to motherboard failure with Windows can at times cost you a full reinstall with 2008 and 2012 server, due to activation failures? Yes, when Windows has an activation failure it can make a nice big mess of stacks of settings in later editions of Windows. So you could call the later editions of Windows self-bricking OSs. Yet with NT 3.5 and 4.0 server you could just swap the hard drive, boot safe mode, fix a few drivers, and everything was good, every time. Linux today is even the same as old NT 3.5 and 4.0 for ability to repair.

    –Everybody knows that Windows has gotten consistently better over the years.–

    In reality this is a lie. In some ways Windows might have got better, but in some critical ways, bw, Windows has got way, way worse: ways that can see what should be 1 hour downtime to swap a motherboard turn into 8+ to do a full reinstall and data recovery.

    It is not true for network people. Windows has progressively got worse, with heavy-handed activation solutions and driver makers doing bad things like damaging default video-card drivers.

    bw, Linux might not be the nicest to use. The one thing Linux has been good at is being able to be transferred to new hardware without getting too badly upset.

    20+ employees not being able to work due to server failures is not fun.

    Companies like Red Hat in the year 2000 became bluntly aware they did not have the support of the hardware makers, so they focused on markets where they could control the hardware, and did very well.

    bw, basically at the time Microsoft was focusing mostly on the desktop, Linux was focusing on lots and lots of big servers.

  11. bw says:

    I have no doubt that you were capable of doing that in 1999 and capable of directing students to do the same thing. At that point in time, though, Win95 was pretty old hat and even the wildly successful Win98 was giving way to Win2K which I am sure you will have to admit was orders of magnitude more robust than the early Windows versions, even Win95.

    You are contrasting an OS that a professional used personally, and that students used with professional instruction, with what general audiences faced. Windows is mostly used by working stiffs with little or no help from anyone. Windows installed in large office environments with professional IT staffs will have few if any bad outcomes as well.

  12. bw wrote, “in 1999 it didn’t have the capability of being used beneficially on a PC”

    That’s a lie. In 2000, I used a distro released in 1999 and it rocked. It was solid and did not crash. It thrived on the same PCs on which Lose ’95 could not run more than a few days without crashing. Those PCs running Caldera eDesktop GNU/Linux were very valuable, allowing students to get on the web, word-process with StarOffice and do IT with KDE. Students had no problem with it and it was like having an assistant in the classroom for me.

  13. bw says:

    Win 95 was released in 1994 and the study that found that Linux was not much of a threat was some 5 or 6 years later. Linux was late to the party and even in 1999 it didn’t have the capability of being used beneficially on a PC. It was a kind of Unix and only good for servers if it had any use at all.

  14. bw wrote, “I don’t recall Linux being on anyone’s radar in 1994 when Win95 had a stunningly successful introduction and was the talk of the town.”

    It was on Bill Gates’ radar:
    “The Linux user community itself has been pegged at between 7 and 12 million.” (1999)

    If it weren’t on his radar, what was Muth doing studying it? In those days, M$’s OS was making about $2 billion in revenue per quarter (perhaps a few million licences), so 10 million GNU/Linux users was huge. Red Hat was founded in 1993… It was on their radar, too.

  15. ram says:

    Being able to examine and modify code is a huge advantage to Linux. Although I can personally code some types of things well enough, from time to time I hire serious professional specialists in certain types of media-creation coding (audio, video, UI) to update/modify various company workstations.

    For example, recently my company upgraded some Linux audio workstations from:

    ALSA 1.0.14rc3, jackd 0.102.0.20 fervent-1, qjackctl 0.2.21-1, and HDSPMixer 1.6

    to:

    ALSA 1.0.25, jackd 0.121.3, qjackctl 0.3.2, and HDSPMixer 1.11

    or, in other words, the audio infrastructure from around the year 2006 to 2012, while retaining everything else (i.e. not breaking anything).

    Only in Linux can you perform that kind of ‘brain surgery’ and retain the value of your existing investment in specialized (but legacy) software and hardware.

    In our case, this permitted us to use our old audio workstations to seamlessly drive and exchange multichannel digital audio content for our back-end Linux ‘mini-supercomputing’ cluster.

    This way we get the power of the Linux cluster (which mostly does video rendering) coupled to our existing recording studio equipment and tools.

  16. bw says:

    A rather broad spectrum of claims here. I am not a student of Linux history, but I don’t recall Linux being on anyone’s radar in 1994 when Win95 had a stunningly successful introduction and was the talk of the town. Was Linux as useful in 1994 as Windows 95 was upon its introduction? If so, the Linux proponents missed a golden opportunity.

    But I think that was not the case and the “battle”, if we must have one, was between Win95 and OS/2. Linux was never a contender.

    I think your “So has Linux” claim is indicative of the sort of thinking that goes into it. When some sort of trend starts to develop, the open source developers find a new model to chase and they start to develop in the same direction. That gives slow followers of a trend a great opportunity to get in at a low cost, but it dooms open source to always trailing the trend setter. That is not always Microsoft, but it has been the case often enough, I feel, to keep things in Microsoft’s favor.

  17. bw wrote, “Everybody knows that Windows has gotten consistently better over the years.”

    HAHAHA! So has GNU/Linux. I have seen nothing from M$ even approaching the power, flexibility and low cost of GNU/Linux. The list of malware successfully attacking M$’s OS is huge and growing rapidly. The next update from Redmond will “fix” 57 vulnerabilities next week… How many thousands of bugs were released? We don’t know, but if they fix a few hundred per annum for a decade it’s a hell of a lot more than Debian GNU/Linux.

    So, their OS does not crash just idling any longer as Lose ’95 did. Big deal. They haven’t earned a penny of the money they took for the crap they were selling. I think M$ is one of the biggest frauds ever seen.

  18. bw says:

    All that is just so much hot air. Everybody knows that Windows has gotten consistently better over the years. Anyone who wants to gripe about it can do so, but Microsoft really does not care. They don’t lose any money or any sleep over it. Split all the hairs you want and publish your opinions until the cows come home. You will be ignored as always.

  19. oiaohm says:

    bw, in fact the stability of Windows depends on which one you are talking about.

    The Windows NT line has decreased in stability from the early releases over time, in a lot of ways.

    bw too much backwards compatibility has downsides.

    A lot of early Windows programs use a lot of undocumented features, including bugs and errors. Microsoft has taken the point of view of not breaking or isolating applications. This means defective ABIs remain.

    It is an unfortunate trade-off: as Linux distributions have lowered backwards compatibility, their stability has increased over the same time frame.

    The required answer is nasty. Linux with cgroups plus glick2 is the right path: turn applications into nice isolated bundles. SxS under Windows is flawed.

    The biggest mistake MS made with NT is allowing applications to install DLLs into the system directories. Think about it: NTFS supports hardlinks, so if every application installed every DLL it needs into its own program directory, and then you deduplicated and hardlinked everything, that would be better.

    In fact, doing this under Windows can reduce RAM usage.

    bw, there are basically some design faults MS needs to bite the bullet on. The worst part is MS has taken wrong turns trying to fix them.

  20. bw says:

    You are not being fair here. In the early days things were much more unreliable than they are today, but what I have seen, and what I think most others see as well, is that the stability of Windows has been constantly increasing over time and a corresponding increase in functionality and dependency has occurred over the same time frame.

    People use computers more and more every year. The very need to access the internet for personal communications and other day to day activities is what has given rise to a mobility need and demand for smart phones and iPads.

  21. bw, apologist for M$ writes, “I don’t want anything to be fixed as much as I want it to not be broken. I have used Windows since the original 3.0 days and I have never had any problem that caused me to be unable to use my computer while waiting for anything to be fixed by Microsoft. I imagine that the same is true for Linux. If it didn’t work out of the box, the box would have been returned to the store immediately. That just doesn’t happen and that is why Windows is the most common operating system on PCs as it has been for so long.” and a bunch of other stuff.

    Come out from your cave. It gives you a tunnel-vision view of the world. The only reason I went to GNU/Linux was because Lose ’95 would not work. It crashed at the drop of a hat. I have seen XP fail quite regularly too. The idea that M$ produces good stuff every time is ludicrous. It leaves me breathless in its level of ignorance.

    Here, you can watch Bill Gates waiting around for M$ to get its act together:
    http://www.youtube.com/watch?v=-NsXHPq71Bs

    There’s a reason BSOD is in the public’s vocabulary.

  22. bw says:

    “bw, with Linux and open source, if you don’t want to fix bugs yourself then you should have paid for a support contract”

    I don’t want anything to be fixed as much as I want it to not be broken. I have used Windows since the original 3.0 days and I have never had any problem that caused me to be unable to use my computer while waiting for anything to be fixed by Microsoft. I imagine that the same is true for Linux. If it didn’t work out of the box, the box would have been returned to the store immediately. That just doesn’t happen and that is why Windows is the most common operating system on PCs as it has been for so long.

    I also think that any advantage that Linux might have on a cost basis vanishes when you buy a support contract. That makes the out of pocket cost of Linux much higher than Windows and it is only good for a period of time whereas the Windows license is good forever.

    “I don’t know if you now are a troll or an idiot. Don’t you read the contracts that come with your software?”

    There you go with the insults again. But no, I don’t read the contracts. Why should I? If some company is bent on providing inadequate service and wants to hide behind some clause in the verbose cloud that most EULAs consist of, they will certainly win in court but will be out of business in short order due to a lack of anyone reordering or updating their products. Companies survive based on their good reputation and the accompanying good will of their customers who honor them with repeat business.

    Windows comes with a built-in update function that frequently makes changes automatically to the software on my computers. I can hardly complain about any lack of attention. It is more of a case of too much.

  23. Der Balrog wrote, “Rather prominently in the code there’s an array called colwidths. I guess it was too hard to see? Of course, you then would have to deal with other idiocies, like the window being a fixed size, and so on.

    So you have proven that fixing code is indeed not trivial. Can any user be bothered to do it?”

    Hey! I don’t code in C. The adjacent column had a different title even though it was the same item in different units; I fixed that. Fixing the code is rather trivial. One just has to either change the fixed values to variables or redo the windowing to use the libraries. The high value of the programme is that the ballistics is done very accurately. The UI is secondary to that. I can use the programme as is and make neat presentations of it by exporting the data. I could also redo the I/O in Pascal and link to the number-crunching parts. I could make it into a web application as well.
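    For what it is worth, here is a minimal sketch of the sort of change involved, deriving each column width from its header text instead of hardcoding it. The function name, the pixel figures and the header list below are my own illustration, not gebc's actual code; only the colwidths array and FLTK's column_widths() come from the discussion above.

```cpp
#include <string>
#include <vector>

// Sketch: compute each column width from its header text so that no
// heading gets clipped. The per-character and padding pixel counts are
// illustrative guesses, not values taken from gebc.
std::vector<int> widths_from_headers(const std::vector<std::string>& headers,
                                     int avg_char_px = 7, int padding_px = 10) {
    std::vector<int> w;
    w.reserve(headers.size() + 1);
    for (const std::string& h : headers)
        w.push_back(static_cast<int>(h.size()) * avg_char_px + padding_px);
    w.push_back(0); // FLTK's Fl_Browser::column_widths() expects a 0 terminator
    return w;
}
```

    An array built that way would be handed to the browser widget via column_widths() wherever the fixed colwidths array is set now, so a longer heading like "Wind Drift" would simply get a wider column.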

  24. oiaohm says:

    Der Balrog, sending a hack like that in at least shows the more skilled coder that there is a problem at location X.

    –So you have proven that fixing code is indeed not trivial. Can any user be bothered to do it?–

    Exactly why does it have to be any user? One user might be better at writing documentation than code. Another might be a better tester.

    Note that even you, Der Balrog, have pointed out the location for a better fix, and you are a stuck-up jerk not interested in FOSS. Really, I don’t know how I could find a better example of “any user can help” than you, Der Balrog, in your last comment.

  25. Der Balrog says:

    “A few trolls here have derided the utility of being able to examine and to modify code. I am not a C-programmer but I am not a complete idiot and can and do examine the source code. I have been using a neat little programme, gebc (GNU Exterior Ballistics Computer) but found a couple of rough edges. One of them I could fix…”

    That is not a fix! That’s a hack at best. And it doesn’t even solve the problem.

    Rather prominently in the code there’s an array called colwidths. I guess it was too hard to see? Of course, you then would have to deal with other idiocies, like the window being a fixed size, and so on.

    So you have proven that fixing code is indeed not trivial. Can any user be bothered to do it?

  26. oiaohm says:

    bw, with Linux and open source, if you don’t want to fix bugs yourself then you should have paid for a support contract with IBM, Red Hat…..

    bw
    “That should be part of whatever sort of warranty goes with the program.”
    Then why are you buying OEM MS Windows? It does not come with any warranty other than the legally mandated one.

    With this statement I don’t know if you now are a troll or an idiot. Don’t you read the contracts that come with your software? Legally there is no warranty on the MS Windows or Office that home users buy; it's in the EULA that most people don’t read.

    Businesses pay for support contracts in their volume licences. So for a bug you find as a home user, bw, Microsoft has no reason at all to listen to you or to ever fix it for you.

    The reality here, bw: learn what you have paid for. Yes, with non-volume MS Windows and MS Office you have only paid for the right to use, not to have it repaired for you, nor any right to repair it yourself. You just have to hope that the issue you have gets repaired to fix up someone using a volume licence with Microsoft products, or you are completely screwed.

    With FOSS you have the option to fix it yourself, pay a subscription to a support service, or employ a programmer directly.

    Cost-wise, paying for a support contract does make FOSS as expensive as paying for a volume licence. But unlike MS volume licences, which are only upgrades where the machine must already have a valid licence, FOSS support stuff can go straight onto bare hardware.

    This is why I have such a big problem with people complaining about FOSS developers not listening to them. They don’t pay attention that on OEM-licensed stuff Microsoft doesn’t listen either, and you paid money.

    At least with FOSS, if you have handed over cash they will listen and attempt to do something.

  27. ssorbom says:

    Even a user who never takes any interest in their computer, never files a bug report, never subscribes to a mailing list, and never reads computing news still reaps the benefits alongside those who do. No, participation by all users is not mandatory, but if they ever do happen to raise their heads and take an interest, the barrier to entry is low, at least to start out with.

  28. bw says:

    This sort of thing is unworthy of such philosophic discussion. People who sell software as a product sell it to people who do not want to do it for themselves. The original idea of a “home computer” was so that people who knew how to do their own thing, and who also wanted to do it, could do so.

    If I buy a program and it has bugs, I don’t want to have to fix them myself. That should be part of whatever sort of warranty goes with the program. Besides, anything simple enough for a person who does not have a detailed understanding of the product to understand is not likely to cost very much money to begin with. If you want to do it yourself, the free programs with source code are what you would want. There is room enough for everyone to do things the way they want to do them. I think that the paid-for stuff is going to meet more people’s needs if the free stuff requires anyone to fix it themselves.

  29. ssorbom says:

    My uncle and I once had an argument about this very same thing. I held (and still hold) the position that even if the majority of a program's users have NO ability to fix bugs themselves, they still benefit from community peer review done by those who do have said ability. Peer review is very real; the only issue is that said project has to be large enough to attract community attention.

    Also, I don't know if this still applies, but a critique against Microsoft in the early days was that their official policy on bug reporting was to pretend that bugs didn't exist (see http://artlung.com/smorgasborg/C_R_Y_P_T_O_N_O_M_I_C_O_N.shtml; you may need to scroll down a bit).

    Opening source code makes debugging info easier to find, both because of things like bug trackers, and because debugging symbols can be installed with the program, so that if a segfault happens, info on what the program was doing the moment it crashed can be seen from the command line. You don't need to know anything to paste that into a bug report! But Microsoft can't include this, because to give away debugging symbols is like giving away source code.
