The Little Woman Returns To Using A Thick Client

For a year or so The Little Woman has been using one of the old thin clients I bought to put on a presentation at a teachers’ conference nearly ten years ago. The thin client was sluggish and annoying at times (a 100 Mbit/s NIC, weak video, and the need to coordinate our reboots and installations), and it took a lot of tweaking to make it work for her. Yesterday we had her old PC repaired. It needed a new motherboard. In the process, we bumped up her RAM, storage, video, networking speed and CPU power. Beast, which has been serving as her GNU/Linux terminal server, is a little short of RAM, so this will help me until I upgrade or replace it. Hardware cost? Zero (spare parts). Loss of Freedom? None.

I booted her renovated PC by PXE (boot from LAN in the BIOS) and loaded Debian’s installer by TFTP. I couldn’t just copy Beast’s OS over to her PC because Beast runs a lot of stuff she doesn’t need, so I will install Debian 8 “Jessie” from scratch. The new system will be snappy, with all the bottlenecks widened. It was a great project for a rainy day with high winds. Fall has blown in.
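
For anyone who wants to try the same trick, here is a minimal sketch of the sort of setup I mean, assuming dnsmasq is already handing out DHCP leases on the LAN; the TFTP root, mirror, release and architecture below are only examples to adjust for your own network:

    # Additions to /etc/dnsmasq.conf: serve boot files by TFTP and point PXE clients at pxelinux
    enable-tftp
    tftp-root=/srv/tftp
    dhcp-boot=pxelinux.0

    # Unpack Debian's netboot image (pxelinux.0 plus the installer kernel and initrd) into the TFTP root
    mkdir -p /srv/tftp
    wget http://ftp.debian.org/debian/dists/jessie/main/installer-amd64/current/images/netboot/netboot.tar.gz
    tar -xzf netboot.tar.gz -C /srv/tftp

With that in place, any box on the LAN set to boot from the network fetches pxelinux.0 over TFTP and lands in the Debian installer, just as the renovated PC did here.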

About Robert Pogson

I am a retired teacher in Canada. For almost forty years I taught in the subject areas where I have worked: maths, physics, chemistry and computers. I love hunting, fishing, and picking berries and mushrooms, too.

12 Responses to The Little Woman Returns To Using A Thick Client

  1. dougman wrote, “ship them back to be reimaged.”

    Yes. This is one of the two main reasons I used GNU/Linux in the North. Shipping cost >$100 each way, so such “repairs” cost more than a replacement PC. Silly. I taught students how to install GNU/Linux and they were free of that nonsense forever. Notebooks are another matter, because swapping parts is often difficult. With ATX boxes, even changing hard drives, motherboards, PSUs, and RAM was usually a simple matter.

  2. dougman says:

    Chromebooks make excellent thin clients. I used one myself for two years, while all the other people I worked with always had problems with their laptops and had to ship them back to be reimaged.

  3. oiaohm says:

    kurkosdr, X11 really did not support GUI elements directly.

    Historic X11 supported a stack of images, font rendering to images, and input scripting to mask lag.

    “It’s a diskless personal computer, because you just put an execution environment in it capable of running non-OS code. Or more accurately, a diskless personal computer with local cache.”
    That description matches X11 and is why X11 is such a security nightmare. Old GLX/DRI X11 also attempts to send OpenGL shaders, models and the like to the local computer.

    “Have you seen how much CPU most ‘web apps’ like Google Maps or your average WebGL app or that Unreal Engine For Browser demo chew?”
    And that is lighter than old X11 running the same thing remotely.
    http://www.virtualgl.org/About/Background
    X11’s design for this problem is horrible.

    VNC, RDP, Spice and VMware’s protocol (whose name I cannot remember now) are your pure thin-client protocols. Part of the reason we need to change to Wayland is that, hopefully, in the process we lose the idea of rendering instructions being executed anywhere other than where the application is running; that means server-side GPU usage. HTML5 is sane because enough logic is client-side that you don’t kill the network with traffic, and VNC, RDP, Spice and the rest are sane because, again, they don’t kill the network with traffic. X11 just needs to die.

  4. kurkosdr wrote, “the economics of having some other computer do all the execution for you don’t work out.”

    That’s utter nonsense. Networks work and thin clients make great and useful nodes. At Easterville we installed six powerful servers that did all the work for the school, and a bunch of thin clients interfaced to them very well. They were small, responsive and perfectly adequate to the task. I did the maths: that was the most IT we could get for the money. The economics of thin clients work very well; that’s why millions are sold every year. The reason hundreds of millions aren’t sold every year is that they last so long. I bought ours around 2006 and they are just now running out of useful life, because everyone has huge monitors now, while in those days 1024×768 was near the peak of its popularity.

  5. kurkosdr says:

    The problem is that the thin-client is a fantasy. A true thin client will receive no data except GUI elements (such as windows, window borders, menus, check boxes, etc.) and text, images, audio and video to draw (decoded by a specialized circuit).

    A.k.a. what X.org tried to accomplish, but failed at because the economics of having some other computer do all the execution for you don’t work out.

    If a “thin-client” has a JavaScript runner (interpreter or recompiler) or a JVM, it’s no longer a thin-client! It’s a diskless personal computer, because you just put an execution environment in it capable of running non-OS code. Or more accurately, a diskless personal computer with local cache.

    Have you seen how much CPU most “web apps” like Google Maps or your average WebGL app or that Unreal Engine For Browser demo chew?

    If the JS-enabled browser of your “thin-client” can execute those apps, it’s not a thin-client.

    This is why Chromebook not-thin-clients need to have the same specs as a low-end Ubuntu box. They are not thin clients.

    What you actually have, sir, is a diskless personal computer, a descendant of those diskless Sun Ultra 2s I once saw in the “tech-graveyard room” when I was in uni. Unless of course you want to argue that those beefy-for-the-era Sun Ultra 2s were thin clients.

  6. kurkosdr wrote, “Why leave all that power un-utilized, and instead tax the network (a resource that’s actually scarce today) to offload somewhere else things that could be done on the local machine?”

    It saves total capital cost, energy and maintenance, and can give improved performance. TLW’s machine was sluggish at video, but it was like a rocket for opening windows because lots of useful files were already in the server’s RAM. It’s just silly to have N copies of everything when one or a few will do. Even hard drives, costing only $40 each in bulk, add up to $40K across a thousand clients. Then there’s the maintenance of all those file-systems. It’s just better to use thin clients. Her old thin client has a 400 MHz processor and could do the job quite well; it just sucked for YouTube and such. 1 GHz would work much better. Her “new” thick client is a 3.2 GHz dual-core.

  7. oiaohm says:

    kurkosdr, there are cases for thin-clients. Thin-clients with ROM-based storage or write-protect switches inside a locked case have a use in places like prisons, where people get direct access to the hardware.

    “Every system today has a basic dual-core 1GHz processor (even if it’s an ARM gigahertz) and 1GB of RAM.” Sorry, not true. ARM dual-core 1 GHz processors with 756 MB of RAM and 1 GB of flash, and smaller, still exist in the thin-client market. It turns out that is what can be packed into a single SoC, and I really do mean a single SoC, where the RAM, CPU and everything else are in one chip. In about four years what you claimed will be the base level.

    “How ’bout offloading only what cannot be done in the local machine? Optimal-body-weight client?”
    Linux terminal services do have that option. Yes, the local machine can be audited while thin-terminal software is displayed to the user, so the user thinks the machine is fully functional while it is in fact off-line. One or two cores out of a four-core system are more than enough to do the thin-client work; the other two cores can be busy checksumming the system. Yes, there are uses.

  8. kurkosdr says:

    (reposting previous comment)

    BTW, today I understood what you FOSSies were ranting about all those years. Today I discovered that my new laptop (a z70-80) installs software in Windows during every boot without permission. The funny thing is that the laptop does not need to root anything or overwrite boot sectors to do this; this mal-feature is provided by Microsoft itself (the Windows Platform Binary Table, they call it).

    Quite frankly, I’ve now completely lost trust in Windows, and I am moving all my computing to Desktop Linux, except for the Win32 apps I need to use. No, I don’t care if Desktop Linux is not perfect.

    I know, you probably couldn’t care less, but I wanted you to know that I do not consider you a loon anymore. And I should probably have started dabbling with Desktop Linux years ago, like every self-respecting geek does. It was a mistake to trust a single company with freaking everything.

    PS: The funny thing is I will probably go to Ubuntu. Sorry, I like the proprietary support it gets. Can’t quit the habit cold turkey, I guess.

  9. kurkosdr says:

    (And before some Deaf Spy, That_Exploited_Monco or DrLoser pops off and blathers that it’s all the fault of that “weedy Chinese OEM” Lenovo, lemme tell you that the primary fault lies with Microsoft, for opening a 3-metre-wide backdoor with admin rights in their OS and not guarding it appropriately. Who knows what other three- or four-letter acronyms reside inside Windows that only MS and a select number of top-tier “partners” know about. That’s the thing about trust: you need to be trustworthy.)

  10. kurkosdr says:

    BTW, today I understood what you FOSSies were ranting about all those years. Today I discovered that my new laptop (a z70-80) installs software in Windows during every boot without permission. The funny thing is that the laptop does not need to root anything or overwrite boot sectors to do this; this mal-feature is provided by Microsoft itself (the Windows Platform Binary Table, they call it).

    Quite frankly, I’ve now completely lost trust in Windows, and I am moving all my computing to Desktop Linux, except for the Win32 apps I need to use. No, I don’t care if Desktop Linux is not perfect.

    I know, you probably couldn’t care less, but I wanted you to know that I do not consider you a loon anymore. And I should probably have started dabbling with Desktop Linux years ago, like every self-respecting geek does. It was a mistake to trust a single company with fucking everything.

    PS: The funny thing is I will probably go to Ubuntu. Sorry, I like the proprietary support it gets. Can’t quit the habit cold turkey, I guess.

  11. kurkosdr says:

    I never got the deal with thin-clients anyway. Every system today has a basic dual-core 1GHz processor (even if it’s an ARM gigahertz) and 1GB of RAM.

    Why leave all that power un-utilized, and instead tax the network (a resource that’s actually scarce today) to offload somewhere else things that could be done on the local machine? And have the thin client be a basic HTML-renderer-with-JS thingie?

    How ’bout offloading only what cannot be done in the local machine? Optimal-body-weight client?

  12. dougman says:

    New motherboard? I have repaired so many Dell motherboards over the years whose electrolytic capacitors had dried out and needed replacing.

    Oftentimes, when doing a reinstall on a board with bad caps, Linux would install just fine when Windows would not.
