300,000

300K. That’s how many folks have taken the Intro to Linux course offered by The Linux Foundation: “we were able to offer our Intro to Linux course for free to nearly 300,000 people from all over the world. While the United States ranks first in the number of students taking Intro to Linux, it only represents about 30 percent of all class participants. The top geographies include the U.S., India, United Kingdom, Brazil and Spain.” That’s about half the population of Winnipeg, my nearest big city. That’s several times the size of the Canadian armed forces. It’s 50% more than the number of people involved in the invasion of Normandy. I would call it significant.

What are those folks going to do with that knowledge? Install GNU/Linux on their PCs? Set up a computer lab? Migrate some department or organization to GNU/Linux? Perhaps, but one of the second- or third-tier obstacles to wider adoption of GNU/Linux has been the availability of local people with the requisite skills. I remember my first exposure to GNU/Linux: it took days of reading and days of trying stuff to get a working system, and without the web I likely would not have been able to do it at all. The actual installation was trivial once I was armed with just a little knowledge. Now, thanks again to the web, GNU/Linux is mass-producing skilled people. I like it. With my knowledge I was able to install GNU/Linux hundreds of times. 300K people could install GNU/Linux millions of times, or buy and install GNU/Linux systems millions of times. Expect increased growth in adoption of GNU/Linux everywhere.

See Introducing 300,000 People to Linux.

About Robert Pogson

I am a retired teacher in Canada. For almost forty years I taught maths, physics, chemistry and computers. I love hunting, fishing, and picking berries and mushrooms, too.

39 Responses to 300,000

  1. oiaohm says:

    DrLoser, you have not mentioned the OS.

    In 2005, Windows FTP servers did not have an SSL mode. Linux, BSD and Unix FTP servers of 2005 did have an SSL mode.

    The fact is you have done only one security audit; “I’ve only ever been asked to do a security audit once,” as you state here. I have taken part in over 100.

    The policy you were applying may be a company-only policy. Notice that you rebuilt Apache without FTP support. So to obey their requirements you cannot use IIS, because it contains an FTP server that you cannot remove.

    In 2005 not all companies required FTP to be disabled. As Robert showed, other audited companies were running FTP ports. The big thing in 2005 was that anything offering FTP user logins had to have SSL enabled.
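
    A minimal sketch of what such an SSL-protected FTP login looks like from the client side, using Python’s standard ftplib; the host name and credentials are placeholders, not anything from this thread:

        from ftplib import FTP_TLS

        # Explicit FTPS (RFC 2228 / RFC 4217): the control channel is
        # upgraded to TLS before the credentials are sent.
        ftps = FTP_TLS("ftp.example.com")   # placeholder host
        ftps.login("user", "secret")        # login() secures the channel first
        ftps.prot_p()                       # encrypt the data channel too
        print(ftps.nlst())                  # directory listing over TLS
        ftps.quit()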

    DrLoser
    More than that, in fact, as part of the audit, I rebuilt specific versions of Apache 1.3 and 2.2 — don’t ask why we needed both — with all the security features I could find advice on locked down to the max. And then I packaged them.)
    Any clue why I would never, ever do this if I could choose a different product? Are you going to be there to rebuild the installed Apache servers as new CVEs appear? Remember, your own built packages don’t come with upstream-provided updates and auditing. Thank you for setting up a stack of security holes.

    Nginx appeared in 2004, so it might not have met the stability requirements, but from 2003 there was also lighttpd.

    In fact, what you describe doing is itself a breach of security: 1) you told me the party you were working for, not directly, but with enough information to identify them exactly; 2) you have given me a year; 3) you have given me enough about the product versions to guess.

    DrLoser, I would love to have your real-world name so I could have you blacklisted from ever performing an audit again, because you cannot be trusted to keep your mouth shut and so could be the cause of a security breach.

    In 2008 BT suffered a breach affecting about 50 percent of their DSLAMs. Guess who the provider was. I would almost bet I am talking to the jackass responsible for that mess. That would be why you have not had any more audit jobs, DrLoser.

  2. DrLoser says:

    (Substitute RPC for RCP in that last.)

  3. DrLoser says:

    For the avoidance of doubt, oiaohm, yes, I packaged Apache 1.3 and 2.2 with a locked down configuration that took into account every single Apache CVE I could find on Mitre. And not just CVEs, but the “advisories” they used to issue … I’ve forgotten the name for those.

    And all of that was completely secondary to the main audit. Which went as follows:

    1) Don’t leave the RCP ports open. You are an idiot if you leave the RCP ports open.
    2) Don’t leave the FTP ports open. You are an idiot if you leave the FTP ports open.

    And, for reference purposes, that was back in 2005.
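
    For illustration, a crude version of that kind of check, probing a host for the FTP and RPC portmapper ports with Python’s standard socket module; the host name is a placeholder:

        import socket

        def port_open(host, port, timeout=2.0):
            """Return True if a TCP connection to host:port succeeds."""
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    return True
            except OSError:
                return False

        # 21 = FTP control, 111 = RPC portmapper
        for port in (21, 111):
            state = "OPEN" if port_open("audit.example.com", port) else "closed"
            print(f"port {port}: {state}")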

  4. DrLoser says:

    I guess yours was one of the systems where I found a log showing FTP closed while an HTTP server was never closed, even though it was not required remotely because the company used a VPN.

    You guess wrong, oiaohm.

    If you are not running a VPN and don’t need all the access a VPN can provide, a client-key, SSL-based FTP server can be quite secure. Yes, it is a double crack event.

    I imagine that Home Depot and Target are running a VPN, as far as access to their corporate network is concerned. Even these guys aren’t that idiotic. Otherwise I’m pretty sure we’d have heard about it.

    As for FTP via SSL, it’s not exactly a “double crack event.” Given that the phisher already has the FTP credentials, and since these are an integral part of the RAM-scraping and therefore a given, it’s at best a “single crack event.”

    If you’re talking about tunnelling via SSH, you’re SOOL, because for various technical reasons people don’t do this.

    If you’re talking about FTPS … you’re a little closer to the mark. That’s a possible solution, although it appears not to have been used in these cases.

    However, it’s still your typical patchwork kludge of nonsensical bit-part solutions, isn’t it, oiaohm? The fact remains that there is no conceivable use for an outward facing FTP server (I don’t even care what ports you use) from the core of a corporate network. Smearing SSL all over it is fairly pointless, because if there’s an SSL exploit available, you’re still stuck with the neanderthal FTP protocol behind all of this.

    And, trust me, the FTP protocol is about as neanderthal as it gets.

    Nope. Sensible people these days use a VPN.

    DrLoser, did you audit the HTTPS servers to make sure access was only possible with user authentication? More data has been stolen through misconfigured HTTP servers than by anything else.

    Which part of “I’ve only ever been asked to do a security audit once, and even I know better than you, oiaohm,” did you not understand? No, I was not asked by Home Depot to do a security audit. Nor was I asked to do so by Target.

    (And as it happened, the audit I did perform, for a major supplier (50%+ of DSLAMs) to British Telecom, was indeed focussed on configuration of the HTTP ports and servers. More than that, in fact, as part of the audit, I rebuilt specific versions of Apache 1.3 and 2.2 — don’t ask why we needed both — with all the security features I could find advice on locked down to the max. And then I packaged them.)

    You, on the other hand, have never done this even once, have you, oiaohm? And how do I know this?

    Because if you handed in a security report at the end of your first audit, and at any point it said “You’re fine to leave a corporate network FTP server up and running, guys. Swell idea!”

    … you’d be fired from your “auditing” company on the spot. With extreme prejudice.

    Robert can get away with this guff, because he’s never been in a position of responsibility. You, on the other hand, oiaohm?

    Well, I don’t know. Does the name Walter Mitty ring any bells?

  5. oiaohm says:

    DrLoser, are you aware that FTP with SSL on will not pass through a standard Internet router? Because the traffic is encrypted, the firewall cannot snoop the control channel to work out which ports should be forwarded. With all packets encrypted, you will suffer a connection failure as soon as you attempt to log in. This does not change even if the SSL security is breached by some remote attacker.

    So you have FTP on port 21 with mandatory explicit TLS for users. Unless you have specially configured the router, the connection will fail for user logins. You have to forward a particular group of ports to allow users to use FTP in SSL mode remotely.
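
    As a sketch of what that involves on the server side, here is an FTPS server pinned to a fixed passive-port range so the firewall can forward exactly those ports. This assumes the third-party pyftpdlib package; the certificate path, credentials and port range are placeholders:

        from pyftpdlib.authorizers import DummyAuthorizer
        from pyftpdlib.handlers import TLS_FTPHandler
        from pyftpdlib.servers import FTPServer

        authorizer = DummyAuthorizer()
        authorizer.add_user("user", "secret", homedir="/srv/ftp", perm="elr")

        handler = TLS_FTPHandler
        handler.authorizer = authorizer
        handler.certfile = "/etc/ssl/ftpd.pem"             # placeholder cert + key
        handler.tls_control_required = True                # refuse plaintext logins
        handler.tls_data_required = True                   # refuse plaintext data
        handler.passive_ports = list(range(60000, 60100))  # forward only this range

        # Binding port 21 needs privileges; a test could use a high port.
        FTPServer(("0.0.0.0", 21), handler).serve_forever()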

    Even if the IIS and Apache FTP servers are disabled, they should both be configured into FTP SSL mode with anonymous access disabled. That way, if one does get turned on, the router at the edge of the network will limit where it is accessible from. That fairly much kills it dead.

    Blocking the ports is not the answer alone. Configure or remove all your FTP servers, configure all your network routers correctly, and FTP is reduced to a zero hazard.

    FTP is not a major network hazard if configured. Unconfigured FTP servers lying around may bite you badly. Yes, a lot of the unconfigured FTP servers you find are IIS or Apache httpd.

    FTPS is port-hungry; this makes it a very safe option compared to HTTPS, because you have to go out of your way intentionally to make FTPS work.

    There are serious reasons not to use IIS or Apache.

  6. oiaohm says:

    DrLoser, I will tell you something really important that I have found left behind by many auditor idiots. They block ports 20 and 21. Point an FTP client at port 80, where the IIS/Apache httpd is. Guess what: since the FTP configuration is enabled in IIS, it accepts FTP traffic over port 80 by default. With Apache, someone would have had to assign its FTP module to port 80 by mistake, resulting in FTP working.

    Fairly much, were you getting rid of IIS and Apache httpd Internet access as well? I guess not, because most people don’t think of them as FTP servers.

    DrLoser, it is pointless to block ports if you still have servers running that can provide that functionality, since port numbers are just a suggestion. Old rule: if it is possible, it will happen. Nginx for HTTP/HTTPS-only sites is a great thing.
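
    The point that port numbers are only a convention is easy to demonstrate: an FTP client will happily speak FTP to any port where something answers the protocol. A sketch with Python’s standard ftplib; the host and port are placeholders:

        from ftplib import FTP

        # Nothing in the protocol ties FTP to port 21; blocking 20/21 does
        # not block FTP if a server still answers the protocol elsewhere.
        ftp = FTP()
        ftp.connect("example.com", 8021)   # placeholder host and port
        ftp.login("user", "secret")
        ftp.quit()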

  7. oiaohm says:

    DrLoser
    The secure sign-in has already been compromised.
    Funnily enough, this is not the case. It is only some forms. Interestingly, it is one of the FTP secure sign-in forms that has not been compromised. Per-client-key secure sign-in has in fact not been breached when combined with a username and password on a limited-attempt account. Only configurations allowing any arbitrary client key have been breached, and that is a feature that is disabled a lot. The little problem is that if secure sign-in on FTP is broken, then so is secure sign-in on HTTP. If you are worried about secure sign-in issues, HTTP access to the server has to be dropped as well.

    Something interesting: FTP secure sign-in is the same as HTTP secure sign-in except for one thing. FTP mandates a username and password; with HTTP secure sign-in, the username and password are optional. One of the issues with SSH and HTTP servers is that for a long time you could sign in using just the SSL client key. This is a mistake; the idea that “I can encrypt, so I am the correct person” is wrong.
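
    A sketch of that per-client-key sign-in from the client side: a TLS context carrying a client certificate, with the mandatory username and password on top. Paths, host and credentials are placeholders:

        import ssl
        from ftplib import FTP_TLS

        # The server is assumed to demand a known client certificate in
        # addition to the username/password pair.
        ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
        ctx.load_cert_chain(certfile="client.pem", keyfile="client.key")

        ftps = FTP_TLS("ftp.example.com", context=ctx)
        ftps.login("user", "secret")   # the key alone is not enough
        ftps.prot_p()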

    The reality, from a security point of view, is that you should shut off HTTP ports before you shut off FTP ones. Of course, an FTP server in use does need to be configured correctly to prevent data leakage; so does HTTP.

    The HTTP protocol does not define a limit on walking through the OS file system either.

    By 2005 on Linux you already had SELinux wrapping, so the whole idea of a Linux server exposing FTP and having access to user-mounted NFS work is a misconfiguration.

    Was the server you were shutting FTP off on in 2005 Linux or Windows?

    http://httpd.apache.org/mod_ftp/en/ftp/
    The next FTP port usage could be your Apache or IIS. So if your FTP can go wandering through the file system wherever it likes, so can your HTTP accesses in either of those configurations.

    Nginx is getting popular because it does not contain an FTP server, so an FTP server cannot be flicked on just because someone altered the HTTP server configuration.

    So if FTP server access is going wherever it likes, something is wrong with your configuration; someone has used a server they don’t understand. The Apache HTTP server is an FTP server; yes, a lot of HTTP servers turn out to be FTP servers. So if you have disabled remote access by FTP, you should have killed HTTP off as well. And this version of FTP obeys .htaccess files.

    FTP is just a protocol. Every FTP server provides security options, and then the OS provides extra levels of wrapping around that.

    DrLoser, in 2005 FTP was no more insecure than an HTTP server. Lots of company networks have exposed worse things to the Internet, like RDP.

    DrLoser, even with an unsandboxed FTP server, the damage from a breach can be quite small. Of course this depends on the quality of the FTP server: does it support .htaccess or ftpaccess limitation files? The FTP protocol itself does not define limits, but the conventions for building FTP servers define a lot of limitation rules, including maximum path depth. Yes, maximum path depth is a standard feature of every FTP server, so you can set a directory depth of 1 and users cannot cd anywhere.

    I guess yours was one of the systems where I found a log showing FTP closed while an HTTP server was never closed, even though it was not required remotely because the company used a VPN.

    If you are not running a VPN and don’t need all the access a VPN can provide, a client-key, SSL-based FTP server can be quite secure. Yes, it is a double crack event.

    DrLoser, did you audit the HTTPS servers to make sure access was only possible with user authentication? More data has been stolen through misconfigured HTTP servers than by anything else.

  8. DrLoser says:

    I’m suddenly possessed with an urgent need to know this, oiaohm.

    Out of the myriads of subjects over which you possess some sort of command, which would be the top three?

    No googling, please.

  9. DrLoser says:

    Three completely useless posts, oiaohm, and nothing remotely worth saying.

    To be correct, 1998 tells you to kill off Windows support completely, and with FTP to be careful that it is configured correctly. In 1998, FTP could execute random crap; no modern implementation of an FTP server supports the “execute this command” instruction, which closes a huge stack of security holes.

    “Kill off Windows support completely?” Garbage, oiaohm, garbage.

    And all the rest. Garbage, oiaohm, garbage.

    Now, let us for a moment assume that we are in charge of the outward-facing IP ports for the corporate domain of, say, Home Depot or Target.

    Let us assume that we run some sort of rudimentary audit on the things. What do we find?

    Yup, since 2005 at the very earliest, the obvious conclusion is that we should shut down FTP, outward-facing, on the corporate network.

    Because, you see, all that bollocks about “secure sign-in” and “encrypted transfer” and even the “execution of random crap” is completely irrelevant when it comes to exfiltration as evidenced by Home Depot and Target.

    The secure sign-in has already been compromised.

    After that, all you’re left with is the nature of FTP. Which, unless you sandbox it rigorously (Robert will be interested to learn that this is what FTP download sites do), is the best protocol yet devised by Man to upload previously exfiltrated data without any sort of barrier whatsoever.

    Like I say, oiaohm, this would be obvious to a ten year old.

    But not, apparently, to you.

  10. oiaohm says:

    http://wiki.centos.org/TipsAndTricks/SelinuxBooleans
    DrLoser, this is also a good read:
    allow_ftpd_use_nfs
    allow_ftpd_use_cifs
    The same applies to AppArmor and other LSM modules after the year 2000.

    An FTP server cannot magically go off browsing through NFS or CIFS or anything else on the network without the administrator’s approval, if OS security is on with Linux.

    FTP randomly going off wherever it likes is a Windows problem, or a misconfigured Linux server.
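
    For reference, a small sketch that inspects those booleans on an SELinux-enabled host by shelling out to the standard getsebool tool:

        import subprocess

        # Each boolean gates whether the FTP daemon's SELinux domain may
        # touch NFS or CIFS mounts at all, regardless of file permissions.
        for boolean in ("allow_ftpd_use_nfs", "allow_ftpd_use_cifs"):
            result = subprocess.run(["getsebool", boolean],
                                    capture_output=True, text=True)
            print(result.stdout.strip() or result.stderr.strip())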

  11. oiaohm says:

    To be correct, 1998 tells you to kill off Windows support completely, and with FTP to be careful that it is configured correctly. In 1998, FTP could execute random crap; no modern implementation of an FTP server supports the “execute this command” instruction, which closes a huge stack of security holes.

  12. oiaohm says:

    But it’s sod-all use when it comes to an open-ended commitment to let external entities roam at will through a corporate network.
    This is not something the FTP protocol defines one way or the other. FTP does not include relaying.
    https://docs.oracle.com/cd/E37670_01/E36387/html/ol_cj_sec.html

    DrLoser, I think you have mixed SSH up with FTP. FTPS and FTP are fine; SFTP could be a problem.

    Cite a single case where I made such a preposterous suggestion.
    http://mrpogson.com/2014/10/31/more-failures-of-the-wintel-monopoly/#comment-218283
    Right here, DrLoser. Sensitive data being exposed means NFS-mountable directories were allowed inside an FTP server, and that is because the FTP server and NFS were not configured correctly.
    http://www.linuxsecurity.com/docs/SecurityAdminGuide/SecurityAdminGuide-3.html
    A general security guide from 1998. That is right: users cannot mount NFS partitions wherever they like. So if an NFS share is inside an FTP chroot, there is an administration error, because NFS security has not been limited.

    Of course, 1998 is before FTP servers on Linux provided encryption in the form of FTPS.

    Also notice the same guide tells people that they must kill off Samba and FTP completely as well.

    Things have changed, yet DrLoser now says I must explain old stuff that should always have been got right.

    NFS + FTP equalling exposed data means an administrator who did not even follow 1998 instructions for server setup and should be fired on the spot.

  13. DrLoser says:

    Does the number of ultimate users count?

    Again, a question from the world of Sanity.

    Dunno, ram, but I’d have to say it’s a pretty good proxy.

  14. DrLoser says:

    Gosh, DrLoser, you stubbornly refuse to accept that folks might not put sensitive information on an FTP-server.

    Apparently you can do oiaohm a favour here, Robert.

    Cite a single case where I made such a preposterous suggestion.

  15. DrLoser says:

    I have some public facing ftp servers, but all they are serving is information we want to distribute to the public.

    A voice of sanity at last, ram. To avoid further confusion, nobody has any problem whatsoever with this usage of FTP — basically because FTP is a standard, and because it might well be the only protocol you can guarantee to be in wide use at the client end.

    It is therefore good for me. It is good for you. It is good for Robert. It is good for olderman.

    But it’s sod-all use when it comes to an open-ended commitment to let external entities roam at will through a corporate network. Because it has no built-in way to stop a client doing that. Indeed, the whole FTP protocol practically mandates the reverse.

    Stick it on a box on its own, by all means. But for any other purpose — please, no.

  16. DrLoser says:

    DrLoser, really simply, that is what cgroups and other sandboxing of servers are for. Note that to be inside the rules, your credit card processing and your product information are meant to be on different servers, or heavily security-split. If they are not separated, it is a breach of points 3 and 7.
    “3. Protect stored cardholder data”
    “7. Restrict access to cardholder data by business need-to-know”

    Ahem.

    In the cases of Target and Home Depot, rule #3 was clearly breached in the first place, by allowing the Black Hats to RAM-scrape the POS terminals and furthermore to exfiltrate that data to the main corporate network.

    Tell me, oiaohm … how is the Magic Fairy Dust of cgroups and sandboxing supposed to be of any use at all here?

    As for #7, may I gently remind you that I have worked in the credit/debit card business for six years, oiaohm, whereas all you have done is to gaze longingly at a $1 million prize bull and wonder whether you should bolt the gate after you.

    It’s a requirement on personnel, oiaohm, not on whether or not to use cgroups and sandboxes. Thus, “business need-to-know.”

    The single most useful contribution you have made to this conversation, oiaohm, is your recommendation to use a “properly secured outward facing FTP connection,” with bells and whistles and encrypted login/data transfer and what-all.

    And why is this useful? Because it demonstrates even to a clueless idiot corporation, much less one that conforms to the Industry Data Security Standard, that you should never, on any account, be let within a million miles of a properly secured IT system.

    Other than that, worthless gibberish as always.

  17. DrLoser says:

    I have a request of Robert. Since DrLoser likes demanding that I cite everything, let’s see how he likes playing by those exact rules. Every time one of his posts lacks a cite, delete it.

    As an alternative to whining to Mommy, oiaohm, why don’t you just

    a) provide cites where relevant and
    b) point out when I fail to do so?

    One would have thought that you are old enough to take care of these things on your own.

  18. ram says:

    I have some public facing ftp servers, but all they are serving is information we want to distribute to the public.

    As far as installs go, the people I work with may not install all that many sites with Linux, but each cluster consists of many thousands of cores. Is each core an install? Or each box (meaningless) an install? Does the number of ultimate users count?

  19. oiaohm says:

    I have a request of Robert. Since DrLoser likes demanding that I cite everything, let’s see how he likes playing by those exact rules. Every time one of his posts lacks a cite, delete it.

    Maybe then DrLoser will learn that it does not pay to throw stones when you live in a glass house.

  20. oiaohm says:

    How do you configure an outward-facing FTP server in such a way as to stop a Black Hat from retrieving a stash of security-conscious information, e.g. credit card info?

    DrLoser, really simply, that is what cgroups and other sandboxing of servers are for. Note that to be inside the rules, your credit card processing and your product information are meant to be on different servers, or heavily security-split.
    http://en.wikipedia.org/wiki/Payment_Card_Industry_Data_Security_Standard
    If they are not separated, it is a breach of points 3 and 7.
    “3. Protect stored cardholder data”
    “7. Restrict access to cardholder data by business need-to-know”

    Point 7 in particular: any service being allowed access to credit card information must have a need-to-know reason. Remember, you can place the FTP server in a virtual machine, in a cgroup namespace, or in many other containments to prevent access where need-to-know is not met. In fact, I can never think up a case where an FTP server has a need-to-know reason to access credit card information.

    You are still talking about a misconfigured server. A lot of HTTP services are also in breach of point 7; one server processing both insecure and secure data does not pass point 7.

    There has been no credit card data stolen via an FTP server. The Home Depot breach was through the HTTP server. Home Depot started using FTP for uploads after the breach, because FTP is more secure than some broken HTTP website upload that allowed credit card data to be stolen.

    DrLoser, time for you to find one cite. Find a reported data breach carried out via FTP that involved credit card details. Remember, all credit card breaches have to be reported, including how they happened. You will find a lot where the breach method was HTTP or HTTPS, and not one that was FTP.

    DrLoser, your attack example only works on a misconfigured server where the user’s login home and FTP home are identical.

    Those credentials have already been stolen.

    If we are saying the credentials are already stolen, there is still bugger-all difference between FTP and HTTP. RDP and SSH, which you also find running, are worse if credentials are stolen.

    By the way, just having a username and a password may not be enough against an SSL-protected FTP server.
    http://download.pureftpd.org/pub/pure-ftpd/doc/README.TLS

    The maximum required login is in fact a username, a password, and an SSL client key known to the server. Breaking into a maximally secured FTP server is like breaking into a VPN, except the FTP server allows less to be done if it does get breached. A username and password alone are only enough if you are using downgraded FTP security, which is currently forced on people because most FTP clients are crap. By the way, most web browsers don’t support per-site SSL client keys, so HTTPS also ends up running with downgraded security. And restricting logins to known client keys exists in the reference ftpd; it is hard to configure, but it exists.

    A lot of the problems with FTP have nothing to do with the protocol and a lot to do with crap client programs.

    A Giant Clue-Bat is what you need, DrLoser. The Home Depot server breach was through the HTTP server, not the FTP server; they started using an FTP server after they were breached over HTTP.
    http://krebsonsecurity.com/2014/09/home-depot-hit-by-same-malware-as-target/
    This year they were breached because their POS systems got virus-infected.
    Home Depot has always handed out the breach information. It has never been the FTP server. Basically this is another case of DrLoser making stuff up because DrLoser is too lazy to do his homework.

  21. DrLoser wrote, “How do you configure an outward-facing FTP server in such a way as to stop a Black Hat from retrieving a stash of security-conscious information, e.g. credit card info?”

    Gosh, DrLoser, you stubbornly refuse to accept that folks might not put sensitive information on an FTP-server. You stubbornly refuse to accept that the software and data might be read-only, even running from CD, for that matter. Even I, a complete amateur by your standards, can find those and more solutions to the imagined problem in short order. The reasons to use FTP are many, including simplicity, reliability and efficiency, and they may outweigh any insecurity in certain circumstances, customers’ credit card data not being among them. Years ago I applied for a job at a first-rate research corporation that was going to put all its public archives on CD on an FTP server. They weren’t crazy. They weren’t taking any silly risks. They wanted a system as efficient, cheap, simple and reliable as possible, using nothing but FTP.

  22. DrLoser wrote, “Everybody else is clearly living in Fantasy Land.”

    Divide and Conquer, eh?

    So, I would suggest that a good chunk of knowledgeable people would disagree with DrLoser, and that Free Software besides Red Hat works for people.

  23. DrLoser says:

    What I will say, in favour of GNU/Linux and in terms of security, is that this entire discussion has driven me to conclude that (should I be in a position to make such a recommendation) I would forcefully assert that any SME or upwards should under no circumstances even contemplate wasting their time with incompetents who babble about Debian and APT and the LSB and God Knows what else.

    From a security perspective, every last one of these fools needs to be hit upside the head with a Giant Clue-Bat.

    My recommendation? Red Hat. There’s at least a smidgeon of a possibility that they know what they’re talking about.

    Everybody else is clearly living in Fantasy Land.

  24. DrLoser says:

    It’s actually worth replying to oiaohm’s absurd gibberish in this case, simply for informational purposes. Normally I do so because I like a good laugh (it really is like throwing thruppence at the loonies in Bedlam), and I admit that this predilection is … shall we say … slightly sordid.

    But in this particular case, I might be able to teach something useful. So here goes:

    DrLoser, please read here. The last post is Robert’s.

    Lesson #1: There are such things as permalinks. And they are useful. Robert’s assertion?

    FTP servers are nearly perfect for distributing manuals, spec-sheets, brochures, etc. HomeDepot uses an FTP service for suppliers…

    Home Depot apparently also (inadvertently) uses an FTP service to enable Black Hats to collect a stash of scraped credit card details.

    I was inspired by reading your cite, oiaohm. But not in a nice way.

    Thruppence! Ker-Ching!

    There are tones of companies that run outward-facing FTP servers without issues.

    Wrong Standard Unit, oiaohm. And (saving specific FTP download sites, which clearly have “tones” of security behind them), there are only two types of such companies:

    1) Those whose security has already been breached.
    2) Those whose security is about to be breached.

    It’s a ridiculously dangerous way for a corporation to behave in 2014.

    There is nothing wrong with an outward-facing FTP server as long as it is configured correctly.

    How do you configure an outward-facing FTP server in such a way as to stop a Black Hat from retrieving a stash of security-conscious information, e.g. credit card info?

    You can’t, can you?

    Linux security best practice is mostly: don’t use old versions of protocols.

    I hope to God that you have just made this up out of thin air, oiaohm. Fortunately I am absolutely certain that you have done precisely that.

    No cites, I see.

    Using the current version of FTP correctly, usernames, passwords and data are sent encrypted. Yes, it sits on the same FTP port number.

    I’ve been observing this total cluelessness about security on the current thread for quite a while. I’ve been waiting for you, oiaohm, or Robert, or indeed anybody else to twig that:

    Login/Password security credentials are completely irrelevant!

    Honestly, a child of TEN could spot this.

    The question is not “How would I steal credentials over a secure connection to a Linux port.” Those credentials have already been stolen.

    The question is why would you let somebody who has phished the login salcl1 and the password 4l4sk4! have access to an outward-facing FTP connection?

    Only a total incompetent would argue that this is a “good idea.”

    Well, that’s my summary of the pertinent security issues for today, kiddies. Read it and weep!

  25. luvr says:

    “They (i.e., drugs) just happen to have worked for me.”
    “But Linux? Nah, mate.”

    That sounds to me like you’re saying that Windows is your drug (which would make Microsoft your pusher).
    You’re finally beginning to make sense, then… 🙂

  26. oiaohm says:

    http://mrpogson.com/2014/10/28/systemd-debates/#comment-219124
    DrLoser, it is only because that got lost in the noise that you think you know anything.

    DrLoser does not even know the first-edition POSIX requirements.

  27. oiaohm says:

    http://mrpogson.com/2014/10/31/more-failures-of-the-wintel-monopoly/
    DrLoser, please read here. The last post is Robert’s. There are tones of companies that run outward-facing FTP servers without issues.

    There is nothing wrong with an outward-facing FTP server as long as it is configured correctly.

    Linux security best practice is mostly: don’t use old versions of protocols. Using the current version of FTP correctly, usernames, passwords and data are sent encrypted. Yes, it sits on the same FTP port number.

    The bull about FTP mostly comes from Windows admins who have not used Windows Server 2008 or later properly. All Linux FTP servers had SSL support by the year 2000 but ran into trouble because Windows FTP clients did not support it. Yes, some Linux FTP servers were a little slow to implement it.

    http://tools.ietf.org/html/rfc2228 — this is when secure FTP was invented. Yes, 1997. It took Microsoft over ten years to catch up with the FTP standards; it is IIS 7.5 on Windows Server 2008 that finally got around to providing secure FTP.

    Also, IIS 7.5 supports FTP users that are separate from system users, or it can use system users. An outward-facing server really should not use system users.

    DrLoser, even so, the telnet server that ships with Microsoft products is still an out-of-date protocol. There are others, like LDAP on Windows servers, lacking an up-to-date encryption-only option. That one is lethal, as it makes a really nice man-in-the-middle attack against an Active Directory network: when clients are refused encryption on one connection, they just cycle through the list until they connect, without notifying the end user.

  28. DrLoser wrote, “I like the idea of people getting to know Linux sysadmin best practices. Linux networking best practices. Linux security best practices …”

    That’s not the purpose of the course. It’s meant as a first course, an introduction. Once folks have the engine running they can learn best what it can do and how to get things done.

  29. DrLoser says:

    DrLoser, the reason we butted heads is the fact that you have not taken the Linux Foundation or Red Hat internals course covering processes.

    My point of view, that an LWP is a process, is the Linux point of view.

    Aaaah … Precious! You can always depend upon oiaohm to drag any topic over the distant horizon and then make a spurious claim.

    There are two reasons that I do not accept your spurious drivel about Linux Lightweight Processes, oiaohm.

    1) I have proven you to be wrong in every single identifiable detail.
    2) Everybody else who has replied to anything at all that you have said on that subject has also proven you, detail by detail, to be hopelessly wrong.

    I fail to see how wasting fifty hours of my time on a spurious “educational” course designed and facilitated by a numb-nut bunch of marketroids at the Linux Foundation could possibly help me out here.

    Maybe I should just Google for Ultimate Wisdom.

    Like you, oiaohm.

  30. DrLoser says:

    On a marginally more serious note — this purported “course,” sponsored by those well-known “technical experts and not at all a collection of worthless marketing freaks” the Linux Foundation …

    … doesn’t actually teach anybody at all anything useful at all about Linux, does it?

    See, I’ve got no problem with that. I like the idea of people getting to know Linux sysadmin best practices. Linux networking best practices. Linux security best practices … Clue: no outward-facing FTP, please.

    I just think that this is a horribly worthless way of doing it.

    And if 300,000 people have indeed completed the course, then that’s merely 300,000 more worthless monkeys in the employment pool.

    Luckily, I work exclusively on Windows systems right now, so I don’t have to wipe the monkey faeces off the corporate Linux walls.

  31. DrLoser says:

    Hmmm… I expected you would be happy that those who are so brain-damaged as to even think that Linux might be interesting to them can now discover for themselves what useless crap it actually is? For free, at that!

    You have a fine, if mildly disreputable, career ahead of you as the local street corner drug-pusher, luvr.

    “Hey, kid! You know how your teachers tell you to “leave off that stuff?” You know how disappointed your parents would be? You know how The Man lies to you and tells you it’s gonna ruin your life?

    “Yeah, kid, I know what you’re thinking. It couldn’t hurt to try it, could it? I mean, you need to make your own choice … you don’t want authority figures to make that choice for you, do you?

    “And best of all, it’s free!”

    Now, I would be the last person to recommend drugs and violence to anybody. They just happen to have worked for me.

    But Linux? Nah, mate. Tried it a couple of times, and it just made me feel like a cheapskate loser.

  32. luvr says:

    “Nope, sorry. A recipe for churning out monkeys. Nothing more.”

    Hmmm… I expected you would be happy that those who are so brain-damaged as to even think that Linux might be interesting to them can now discover for themselves what useless crap it actually is? For free, at that! Think about it: those 300,000 who took this Linux Introduction will be rushing back to Windows, and never look back!

    WARNING: “This post is not suitable for the humour-impaired.”[emphasis added:rp]

  33. oiaohm says:

    DrLoser, the reason we butted heads is the fact that you have not taken the Linux Foundation or Red Hat internals course covering processes.

    My point of view, that an LWP is a process, is the Linux point of view. An LWP has many tasks/kthreads. An LWP is in fact multi-threaded, even though from userspace you only ever see one.

    The computing definition of a process means an LWP is a process, and a POSIX process is a process on top of processes, not a process on top of threads. It is an important difference when reading Linux man pages: which “process” a man page is referring to is the most critical thing you must answer, or you will read the man page incorrectly and have issues.

    The Linux definition of a process is different.

    File operations covers VFS and some of the nasty issues, like mounting over a directory that is currently being accessed. It is a common cause of “what the hell” moments for users.

    DrLoser, a lot of your trouble comes from not understanding the basics of Linux.

  34. DrLoser wrote, “A recipe for churning out monkeys. Nothing more.”

    Hmmm… It may be true that graduates of this course will not be CIOs immediately, but they will be introduced to GNU/Linux, which is the objective of the course. If I were offering the course and wanted follow-up business, I would make the course reasonably interesting for the target audience and give them a good starting point for continuing education in GNU/Linux. I would expect that a $free on-line course would attract a variety of folks, from computer geeks to small-business people to folks wanting to spice up their careers. There could be a few consumers in there too, wondering if the small cheap computers are any fun. It takes some initiative to start and finish a course, so I expect great things from these folks and the next batch and the next. It would be very interesting if the numbers of folks taking the course and subsequent courses keep increasing. Such courses are probably much more effective than Rute, TLDP and Linux Questions, which were about all I had back in the day.

  35. DrLoser says:

    I’m going to attempt a more intelligible effort. (My pardons.)

    Naturally, you should have linked to the outline. It’s triffically informative.

    A bit concerning that “Finding Linux documentation” is at #6 out of 17 (I’m going to assume that the “Conclusion” is basically a sales pitch — much the same way it would be on, say, an MCSE course).

    Even more concerning that “File operations” come in at #7.

    No doubt oiaohm would benefit from #15, “Processes” but we’re perilously close to the end of our notional three College Credits at this point, and it’s time to turn back to Wikipedia …

    Nope, sorry. A recipe for churning out monkeys. Nothing more.

  36. DrLoser says:

    Thanks for pointing out this deficiency.

    And thanks for correcting that deficiency. Wow, is that a Prospectus or what?

    I’m going for “What???”

  37. DrLoser wrote, “There’s no prospectus”.

    Thanks for pointing out this deficiency. Here’s the description with a link to the outline.

    It’s basic stuff, like how to use GNU/Linux as a user or system administrator. It would have been great if this had been available when I was spreading the gospel in the North.

  38. DrLoser says:

    Now, thanks again to the web, GNU/Linux is mass-producing skilled people.

    Maybe. Neither of us has taken the course, which has a workload of roughly 50 hours (I believe that equates to about two, maybe three credits towards the 120 that one would need for a college degree).

    There’s no prospectus, so it’s impossible to judge the content without trudging through it.

    Here’s two further, equally plausible, possibilities:

    1) Courses like this are mass-producing trained monkeys who are going to find the corporate workplace a bit of a shock.
    2) Courses like this are no more and no less worth pursuing than an entry-level MCSE certificate.

    Given my known antipathy towards MCSEs, you can correctly deduce that I regard both possibilities as very likely true to the mark.
