Ontario LinuxFest makes an auspicious debut

Author: David 'cdlu' Graham

The first-ever Ontario LinuxFest, unapologetically modeled on Ohio’s conference of the same name, took place on Saturday at the Toronto Congress Centre near the end of runway 24R at Toronto’s international airport. With only a few sessions and a lot of quality speakers, the organisers kept the signal-to-noise ratio at this conference as good as it gets.

The charismatic Marcel Gagné gave the first talk I attended, on what’s coming in KDE 4.0, which is expected to be released in mid-December. He began by stating that KDE 4.0 is a radical departure from existing desktop environments, including current versions of KDE.

KDE 4’s revamping is based on the premise that user interfaces are not natural or intuitive. We learn to work around the interface instead of designing interfaces that work around us, Gagné said. The best way to evolve desktops going forward is to make them more organic. They should work the way you want them to work.

He then demonstrated KDE 4.0, cautioning that it remains a work in progress. He spent a half hour showing us the various new features already prepared in KDE 4.0. If you like eye candy, KDE’s new desktop will keep you happy.

Ts’o keynote

The mid-day keynote address on Linux’s past, present, and future came from Theodore Ts’o, the first Linux developer in North America, who joined Linus in developing the operating system in 1991.

Noting the operating system’s earliest history, Ts’o jokingly described Linux as a glorified terminal emulator. He asked his audience to name the dates when Debian passed its constitution, when Red Hat released version 3.0.3, when the Qt Public License was released, and when Richard Stallman requested that Linux be renamed Lignux. After fielding several guesses, he gave the answers: 1998, 1996, 1998, and 1996, respectively. Much of this progress, he noted, is already 10 years behind us.

Ts’o gave a brief history of Linux: In July 1991, Linus wrote the very first version. In 1992, X was added and the first distro was created. In 1994, Linux 1.0 was released and for the first time included networking support. A year later 1.2 came out; it was the first kernel to have multiplatform support, with the addition of SPARC and Alpha. In 1996, multi-CPU (SMP) support was added. In 1997, Linux magazines began to show up, and the user base was estimated at around 3.5 million people. In 1998, Linux received its first Fortune article, gaining corporate attention. In 1999, Linux 2.2 came out, and the user base had risen to an estimated 7 to 10 million people. That year also saw the dot-com bubble and the rise of Linux stocks such as Red Hat and VA Linux, the latter of which gained a record 698% the day it began trading. Briefly, Ts’o said, VA had a larger market cap than IBM.

In 2000 the slump began, but lots of cool work was still being done on Linux. By 2001, he said, Linux was used by an estimated 20 million users. In December 2003 Linux 2.6 was released and a new release model was adopted. Linux began to be taken for granted by corporations. 80% of Sun Opterons were running Linux rather than Solaris. And, around then, SCO started its lawsuit.

Today, Ts’o said, we are into our second round of 2.6 kernel-based enterprise Linux distributions. There’s more competition. Vista’s unpopularity has resulted in Microsoft extending Windows XP’s life by an extra six months. Its failure is an opening for Linux. Sun is starting to get open source too, open sourcing Java.

Sun, Ts’o said, is releasing Solaris under a GPL-incompatible license, and 95% of Sun’s code is still developed by Sun. Sun is worried about quality assurance, he said, but so is the free software community. Sun, he said, used to have a policy that if your code commits broke the build three times, you were fired. It’s hard, he commented, to move from that environment to open source.

Ts’o’s employer, IBM, is by contrast a small part of the community, he said, but is happy with that. It takes the attitude that it doesn’t have to do everything itself.

And SCO has declared bankruptcy, Ts’o said to widespread applause in the room. That said, open source software still faces legal issues, particularly with the US Digital Millennium Copyright Act (DMCA), which the US government has been attempting to export, most recently to Australia. Now there is a patent troll suing Novell and Red Hat. Trade secrets are a problem too, given the DMCA’s limits on reverse engineering, he said. Our defence as a community, he warned, is to get involved in the political process.

Next, Ts’o waded into the debate between versions 2 and 3 of the GNU GPL.

GPL version 3, he said, is now three months old, and it is clear that GPL version 2 is not going away. The result of this is that there are now two separate licenses. Kernel developers, he said, are not fond of version 3. He summarised the debate in two words: “embedded applications.” Linux is in data centres and pretty much owns the Web serving market, Ts’o said. Embedded systems and desktops are Linux’s future markets.

Ts’o said he’s on the kernel side of the debate. From the GPLv2 view, he explained, we want embedded developers to use and contribute back to Linux so we can all do the stone soup thing and make it good together. From the GPLv3 view, he went on, the mission is to allow all end users to use embedded appliances with their own changes to the systems. TiVo, for example, uses checksums to ensure that modified software will not run, which GPLv3 proponents consider a violation of users’ freedoms. The rebuttal to this, he continued, is that if we use v3, appliance vendors will simply go elsewhere. We love our contributions, he said, and the developers’ priority is software, not hardware. Open architecture companies will tend to survive better, he said. Most TiVo users won’t make the hacks, but the 0.01% who do enhance value for the rest.
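
To make the TiVo example concrete, here is a minimal, purely hypothetical sketch in Python of how a locked-down appliance can use a checksum to refuse to run modified software. It is not TiVo’s actual mechanism; the file path and vendor digest are made-up placeholders. This is the kind of behaviour GPLv3’s anti-lockdown language targets.

    # Hypothetical illustration only -- not TiVo's real code. A boot loader
    # compares the installed kernel image against a digest recorded by the
    # vendor; a user-modified image fails the check and is refused.
    import hashlib

    # Placeholder digest, stored somewhere the user cannot change.
    VENDOR_APPROVED_SHA256 = "0" * 64

    def image_is_approved(image_path):
        """Return True only if the image's SHA-256 digest matches the vendor's record."""
        digest = hashlib.sha256()
        with open(image_path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest() == VENDOR_APPROVED_SHA256

    if image_is_approved("/boot/kernel.img"):  # hypothetical path
        print("Image matches the vendor digest; booting.")
    else:
        print("Image has been modified; refusing to boot.")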

This argument will not be settled, Ts’o asserted. It’s a values argument, a religious one. GPLv3 adds restrictions, he said, which means that, seen from a GPLv2 perspective, GPLv3 looks like a proprietary license. The FSF, he said, argues that these restrictions are good for you.

GPLv3 code can have v2 code mixed in with permission, he said, but what happens if an LGPL v3 library is linked to a GPL v2 application? Is it legal? Now developers have to worry about GPL compatibility, an effort that undercuts the establishment of new markets rather than helping to create them. It’s kind of a shame, he concluded.

Competition within the existing Linux markets is becoming a problem, Ts’o said. In the early days, Bob Young, founder and one-time CEO of Red Hat, used to hand out CDs of competing distributions at events. Young described this as “growing the pie,” Ts’o said. The bigger the pie, the more Red Hat’s piece of it was worth. The competition right now, though, is causing him some worry, he said. He called it the tragedy of the commons.

Some companies are doing the hard work while others are reaping the reward, he said. As some companies do the research and development and make the results open source, others are picking up the work and selling it in their own products. It’s perfectly legitimate, Ts’o said, but there is a risk of companies ceasing to do the work if they are not the ones benefitting from it. Will there be enough investment in the mainline kernel to sustain it? Not enough people, he said, are doing code review as it is. Who does the grunt work?

Open source is good at fixing bugs and making incremental improvements, Ts’o said. Massive rewrites are more difficult. As an example, he cited the block device layer of the Linux kernel. The need to rewrite it was identified in 1995, but the work was not done until 2003. Kernel 2.6 finally fixed that layer, but doing so required rewriting many other parts of the kernel to accommodate the new system.

Most major open source projects, he said, have paid people at the core. Linux and GNOME developers are mostly funded engineers, he said, although KDE is more of a hobbyist project than the others. At some conferences, such as the Ottawa Linux Symposium and LinuxWorld, it is easy to think that corporate dollars are all there is, while conferences like FOSDEM and Ohio LinuxFest have a very different character. The funding of some open source developers and not others can cause tension, he said, noting Debian’s controversial Dunc-Tank project, which aimed to temporarily pay a few developers full-time to facilitate a faster release. Many people inside Debian did not like it. While some Debian developers are paid by outside companies such as Canonical, many felt that money should not invade the inner sanctum of Debian itself, Ts’o said. How do you work within this division? We need both the hobbyists and the corporates, he said.

Moving on, he warned the audience to be wary of Microsoft. Vista’s failure does not mean Microsoft is sitting on its hands. A few years ago, he pointed out, Sun was seen as dead. Now look at it. Microsoft has a lot of money, he said, and will not always make stupid moves.

Software, he continued, has to be easy for everyone to use. Microsoft spends millions on usability tests. He described a process in which people who are not overly technical are put in front of new software and told to use it. The sessions are recorded, and the developers watch the recordings to gain a better understanding of how actual users will use their applications. He likened it to watching a videotape of your own presentation.

Some software, he said, will always be proprietary. Tax filing software, with its involvement of attorneys, and massively multiplayer games such as World of Warcraft are examples of this. If we want to achieve world domination, he said, we must at least not be actively hostile to independent software vendors. Windows, he said, has a huge installed base. We have equivalents to 80 to 90% of Windows-based applications, but there are a lot of niche apps. The Linux Standard Base, he said, gives ISVs a single, consistent target, helping them produce that software for Linux.

Whither the Linux desktop? The year of the Linux desktop, Ts’o noted, seems to be n+1. But, he said, we are getting better. We are starting to see commercial desktop applications for Linux, such as IBM’s Lotus Notes. Laptops are now selling with Linux installed, and we now have a decent office suite.

So what are we missing, he asked. Do we have bling? He moved his mouse around and spun his desktop like a cube with Compiz. Do we have ease of use? It’s getting better, he said. Raw Linux desktops are pretty much on par with raw Windows desktops for usability, although we have some way to go to catch up to Mac OS X, Ts’o said.

Do we have a good software ecosystem? We’re getting there, he said, but that last 20% is still needed.

We have office compatibility now, too. The format is more important than the operating system, he said. If people are unwilling to try Linux, put OpenOffice.org on their Windows machines. With the advent of OOXML, Microsoft’s proposed document standard, people will need to change formats anyway. We might as well change them to OpenOffice.org and the OpenDocument Format (ODF). If we win the format war, we can switch the desktops later. And getting people to try OpenOffice.org, he noted, does not require people to remove Microsoft Office.
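
As a brief aside on why the OpenDocument Format makes an attractive target, here is a hedged sketch, in Python, showing that an ODF text file is simply a ZIP archive of plain XML and can therefore be inspected with nothing but the standard library; the file name is a made-up example, and this is an illustration rather than anything shown at the conference.

    # Minimal sketch: peek inside an OpenDocument text file using only the
    # Python standard library. "essay.odt" is a hypothetical example file.
    import zipfile

    with zipfile.ZipFile("essay.odt") as odt:
        # Every ODF package carries a plain-text 'mimetype' entry identifying it,
        # e.g. application/vnd.oasis.opendocument.text for a word-processing file.
        print(odt.read("mimetype").decode("ascii"))

        # The document body itself is ordinary, human-readable XML.
        content = odt.read("content.xml").decode("utf-8")
        print(content[:300])  # show the first few hundred characters of markup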

It has been a great 16 years of Linux, Ts’o said. It’s amazing what we’ve done. There is lots more to be done, he concluded, but nothing is insurmountable.

Local issues

Following Ts’o’s keynote address and a lunch break, I attended another interesting session that covered a project from a rural area not far from where I live. Ontario’s BGLUG, in the counties of Bruce and Grey around Owen Sound, has gotten together with United Way to distribute donated low-end Linux machines to underprivileged students through an anonymous system with Children’s Aid. Brad Rodriguez of BGLUG and Francesca Dobbyn of the United Way gave an hour-long talk discussing the details of how their project got going and its early results.

The short form is that a local government office offered Dobbyn’s United Way office a handful of retired but good computers. Some of these were used to replace even older machines in her office, but she contacted the local Linux Users Group about the idea of making the machines available to high school students in need.

Families who are on government assistance must declare the value of all goods they receive, including software, and this value is taken out of their assistance cheques, so it was important, Dobbyn said, to ensure that these machines cost as little as possible and had no sustained costs. The LUG installed Linux on a number of these machines and Dobbyn, through the local Children’s Aid society, distributed these machines anonymously to local students in need. They now accept hardware donations and have been keeping this up as a permanent program.

They used Linux, specifically Ubuntu 6.06, Rodriguez said, because it is free as in beer, and because it is not at risk of immediate compromise the moment it is connected to the Internet without maintained antivirus and other security software. Keeping such software up to date would be impossible anyway, since the people setting up the machines cannot contact the people using them.

Four members of the LUG have volunteered to be contacted at any time by these students for help with the machines, which are not powerful enough to run games and are intended strictly to help students in need complete homework assignments in the age of typed-only essays and PowerPoint class presentations.

One of the issues schools have faced is that when the students take their work to school, they are putting their disks in Windows computers and using Microsoft Office to print out their assignments. There is no technical barrier to installing OpenOffice.org on school machines, but the school board will not allow non-Microsoft software to be installed. Like many school boards, it is underfunded, and Microsoft donates a lot of the computer equipment on the sole condition that non-Microsoft software be barred from those machines.

And the rest

Among the many other topics discussed at Ontario LinuxFest was a completely objective comparison of Microsoft’s OOXML document standard and OpenOffice.org’s ODF document standard by Gnumeric maintainer Jody Goldberg, who has had to wade through both in depth. His summary is that OOXML is not the spawn of Satan, and ODF is not the epitome of perfection. Both have their strengths and weaknesses, and he sees no reason why we could not go forward with both standards in use.

Ultimately, Ontario LinuxFest was one of the best Linux conferences I have attended. Because the organisers kept it to one day with two session tracks, two BOFs, and the ever-present Linux Professional Institute exam room, there were few times when it was difficult to choose which session to attend. With only a handful of sessions on a wide variety of topics, the signal-to-noise ratio of good sessions to filler was high. I did not find any sessions that were a waste of time.

The only sour note of the conference was that it did not break even. Although between 300 and 350 people attended, organisers literally had to pass a bag around at the end asking for contributions to offset their budget shortfall. In spite of this, I believe Ontario LinuxFest is a conference that is here to stay, and I look forward to OLF 2008.
