Please forgive me if I sound like I'm speaking for anyone but myself, but I feel like I need to write this down before it slips away.
LTSP (The Linux Terminal Server Project – http://www.ltsp.org ) is a very interesting beast when it comes to open source software projects. We are something of a paradoxical entity, representing an entire system as well as many individual systems. Maybe LTSP sits in between the two: we're like an ether that facilitates an interconnectedness of individual systems, yet resides outside of the main system. We tie together multiple open source projects, such as a DHCP server, TFTP, X ( http://www.x.org ) and NFS/NBD (and many, many others), to make something of a meta-system that, in the end, provides a thin client style computing environment.
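To make that meta-system concrete, here is a sketch of how those pieces chain together when a thin client boots – a hypothetical dnsmasq fragment, where the interface, address range and paths are illustrative examples rather than LTSP's actual defaults:

```conf
# Hypothetical sketch of the service chain LTSP ties together.
# 1. DHCP hands the thin client an address and the name of a boot file.
# 2. TFTP serves that bootloader (and in turn the kernel and initrd).
# 3. The client then mounts its root filesystem over NFS or NBD and
#    finally starts an X session against the server.
interface=eth0
dhcp-range=192.168.67.20,192.168.67.250,8h   # address pool for the clients
dhcp-boot=ltsp/pxelinux.0                    # PXE bootloader, fetched via TFTP
enable-tftp
tftp-root=/var/lib/tftpboot
```

The point is less the exact syntax than the shape of the chain: every step is handled by a separate, small, pre-existing project.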
There aren’t many projects like this out there. Most software projects focus on making sure their own bits function. LTSP focuses more on getting everyone else’s bits to function together in a compatible way, so our vision of thin client computing can thrive. Most of the time this works very well, as many of the small core projects we depend on do not change much. However, depending on other projects for our own functionality and stability can sometimes be difficult.
It says something interesting that a project such as LTSP can represent an ideal more than a single piece of software. How do we get everyone to play together nicely? How do we get developers to consider us when developing a new technology? How do we remind everyone that the reason GNU/Linux (all *nixes in general, really) has thrived longer than any other OS is that it has stuck to its roots of building very small, very stable tools and tying them all together? Sed, awk, grep, cat… these are all incredibly important text processing tools that *nix has depended on for decades, and they are still just as important today as they were when they were written. People have come to depend on them for everything imaginable, from a simple shell script that tells you who is logged into your home computer to mission critical applications in medicine and the military. LTSP is no exception in relying on these basic tools to make it what it is.
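In that spirit, here is a tiny hypothetical example of the sort of "who is logged in" script the paragraph mentions – one small classic tool per step of the pipe:

```shell
#!/bin/sh
# Classic small-tools pipeline: awk picks the user-name column out of
# `who`-style session lines, and sort -u collapses duplicate sessions
# so each user appears exactly once.
logged_in_users() {
    awk '{print $1}' | sort -u
}

# In real use this would be fed from `who`; sample session lines here.
printf 'alice tty1\nbob tty2\nalice pts/0\n' | logged_in_users
```

Against a real system the same three-stage pipe applies unchanged: `who | awk '{print $1}' | sort -u`.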
There are also other very elaborate projects, such as the GNOME and KDE environments. These are incredible pieces of software that also rely on many underlying tools to work. They also create their own methods of solving problems and adding functionality, methods that affect a great number of people. Sometimes deploying these new methods breaks compatibility with other pieces of software out there. Not that they mean to break software for their own benefit, but it seems as though the general direction of some larger software projects has become very centric to their own specific goals. That in itself is not a bad thing, as obviously any software project aims to accomplish its own goals. The problem is that, many times, these projects do not realize that their software is being used by a much, much larger number of people in the real world than they imagine. To be specific, I feel that some open source projects have forgotten about the network transparency of X (the nuts and bolts of any graphical window manager/desktop environment in *nix).
Projects such as Compiz ( http://www.compiz.org ), for example, are responsible for a tremendous amount of adoption of the Linux operating system. Eye candy is, no matter who wants to dispute it, one of the biggest factors in the adoption of an operating system these days. If it looks pretty, it is appealing to the user and they are more likely to want to use it. If it looks ugly, clunky, outdated and/or otherwise visually unappealing, people are most likely going to avoid it and gravitate toward something that fills their desire to work with something more aesthetically pleasing. The drive to create a pretty desktop is completely valid.
There is a balance to be found, however. It is important not to lose the forest for the trees in any endeavor in the open source ecosystem. X has been the foundation of almost every major graphical environment and application (save SDL and friends) in *nix for many, many years. X was also built from the ground up to be a “Network Transparent” graphical environment. This means that X applications should function, and feel, the same on a local computer system as when run across the network from a remote system. It is the network transparency of X that has generated an enormous amount of effort and energy toward software projects that otherwise would have never seen the light of day.
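Concretely, that transparency lives in the DISPLAY environment variable: the same unmodified binary draws on a local screen or across the network depending only on where DISPLAY points. A minimal sketch, with a hypothetical host name:

```shell
#!/bin/sh
# The same X application, aimed at different servers purely through the
# DISPLAY environment variable (host name below is hypothetical):
#
#   DISPLAY=:0 xterm                # draw on the local X server
#   DISPLAY=terminal7:0 xterm       # draw on a remote thin client's X server
#
# DISPLAY has the form [host]:display[.screen]; an empty host part means
# a local connection. A tiny parser for the host part:
display_host() {
    case "$1" in
        *:*) printf '%s\n' "${1%%:*}" ;;   # strip everything from the colon on
        *)   printf '\n' ;;                # no colon: no host to report
    esac
}

display_host "terminal7:0"   # prints "terminal7"
display_host ":0"            # prints an empty line (local connection)
```

Nothing in the application has to change between the two cases; that is the whole point of the design.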
LTSP is but one project that has taken the raw power and flexibility of X network transparency to provide a way to use a SINGLE POWERFUL COMPUTER to serve graphical desktop sessions to tens (or even hundreds) of thin client computer terminals. This is not unlike the olden days of mainframes serving hundreds of character-based “dumb terminals” ( http://en.wikipedia.org/wiki/Computer_terminal ) – quite similar, in fact. The ideas, the methodologies and the end result are those of complete centralization of your computing environment. The difference here is that we have gone from the old-school character-based terminals to fully functional graphical terminals, complete with local device support such as USB media and sound. All of this has been made possible through the underlying technologies that we have relied on for many years. People don’t realize that they’re typing away on a computer hidden away in a network closet… they think they’re typing into the computer at their desk.
There is a struggle, however. The desktop computing revolution has taken the world by storm. Most people have a full-blown computer at each desk in their office, at home and at school. There is enough computing power dedicated to each person to easily serve ten or more. This abundance of individual computing power has given software developers the chance to focus their efforts on seemingly less critical aspects of a user’s experience – eye candy, for instance. It is not uncommon for people to watch high definition movies directly on their computers. However, the METHODS used to deliver this quality of media to the computer screen come at a price, withdrawn from the abundance of power at each person’s desktop. Unfortunately, the idea of being able to deliver video to a computer screen across a network is put on the back-burner, since the majority of people do not need to do this. This isn’t to say that it isn’t possible – it is, in fact. However, some software developers pay very little attention to this type of situation. Software such as Adobe Flash takes a tremendous amount of computing power simply to display a single video on your screen from YouTube or a similar site (or even from your local system). It is well known in the LTSP community that Flash is not geared toward networked computing environments: when many client sessions use the software at once, the server computer becomes “clogged”, like too many people flushing their toilets at the same time. Don’t get me wrong, it has become much better over the years – but it is still a major pain. In the meantime, streaming high-definition video through alternative software technologies seems to work perfectly well.
I guess my point is that we seem to be losing our roots when it comes to deciding where to spend our computing power. It is very costly to code inefficiently, yet it is common to do so, as almost everyone has their own bank to draw from. The growing number of entities turning to projects such as LTSP, however, are beginning to see that the direction of software in general has been toward the individual user at their own individual computer. LTSP administrators, developers and users are all on the forefront of discovering the inadequacies of badly written software. We must remind the open source software ecosystem that the root of what makes X so important is its network transparency.
We must push toward greater efficiency in the tradition of *nix – small bits of super-fast code that do what they do extremely well, and are 100% compatible with all other code that follows that tradition. We must put aside the idea that, since processors are getting faster, memory is getting cheaper and hard disks are getting bigger, we can each claim all of that headroom for our own application’s use. Centralized computing is becoming a major force once again – it is energy efficient, and it is a system administrator’s dream of controlling an entire network of computers from a single point. It’s time for software developers to bring this into their “sphere of consciousness” and create robust, efficient software, rather than assume an entire computing system for their single application.