Monday, July 15, 2013

In Depth: Has Ubuntu lost it?

When Ubuntu burst onto the scene nearly a decade ago, it defined itself with its choice of name. The word 'ubuntu' means 'we are who we are because of other people'.

With that ethos as its guiding principle, you might expect that the Ubuntu distro would focus on community development, sharing credit and trying to foster a sense of togetherness among its users and developers. The communal philosophy offered part ownership in a distribution for anyone who participated.

From the beginning Ubuntu was focused on community. It appointed Benjamin Mako Hill, the former Debian member and contributor, to the role of community manager for Ubuntu. This role later went to Jono Bacon, who writes for us regularly and is serious about the task of bringing people together to create something special.

Ubuntu community

There's a strong tendency in free software projects for the loudest shouters to be heard most clearly. Ubuntu, from the outset, tried to ensure that everyone would be heard. While other distros' forums could be unwelcoming for new users, with newbies unhelpfully told to 'RTFM' (read the f***** manual), the Ubuntu forums encouraged everyone to contribute and share their knowledge with others, rather than just show off. It sounds idealistic, even utopian, and it was - but some felt that certain contributors were more equal than others.

One of the highest-profile complaints came from Greg Kroah-Hartman, maintainer of the kernel's stable branch, who identified a disparity between contributions from Canonical and those from other companies.

In a speech at the Linux Plumbers Conference in 2008 he criticised Ubuntu for not contributing more to the Linux kernel, pointing out that, of the 100,000 kernel patches made in the preceding five years, only 100 came from Canonical - the strange situation whereby the world's most popular Linux distribution contributed just 0.1 per cent of the work needed to keep the kernel going.

Kernel contribs

In the same period, Red Hat contributed 11.9 per cent of the patches, and Novell, makers of SUSE, contributed 7.3 per cent. More recent figures from 2012, compiled since the release of Linux kernel 2.6.36, reveal similar stats. Red Hat, SUSE, Samsung (via its work on Android), IBM, Google, AMD, Nokia, and even Microsoft all contribute more to the Linux kernel than Ubuntu does.

Four years later, in 2012, Kroah-Hartman complained that "Canonical uses me as a gatekeeper of what bugs get fixed in their kernel package and sent to their customers. There's so much wrong with this, I don't know where to start..."

Launchpad

Installer

In 2004, soon after Ubuntu's launch, Canonical announced a development platform to host software projects. It included lots of features important to developers, all rolled up in an attractive and convenient portal. Ubuntu would be just one of thousands of projects that would be stored, updated, tracked, planned, and managed using Launchpad.

Other project hosting sites had their problems, and Launchpad was designed to solve them. However, despite its avowed open ethos, Canonical kept Launchpad's code closed for five years, only migrating to the Affero GPL in July 2009. (The Affero licence closes a loophole in the standard GPL: when code runs remotely via a web server it isn't actually being distributed, so the GPL's source-sharing obligations don't apply. The AGPL fixes this, keeping web apps open.)

Five years - and a lot of criticism - after the service was first unveiled, Launchpad became open source software, but the perception remained that Ubuntu preached one thing and practised another.

Desktop dissent

Ubuntu makes fundamental design changes, but were they necessary?

Ubuntu buttons

Ubuntu grew in popularity on the back of the GNOME desktop. Its fusion of Debian with a streamlined package selection, a predictable release schedule and the most recent stable release of the GNOME desktop was good not only for Ubuntu, but also for GNOME.

In 2004, the year that Ubuntu was released, the Member's Choice Awards at LinuxQuestions.org rated KDE as the most popular desktop environment, with 58.25 per cent of respondents in an open vote, compared with 30.9 per cent for GNOME (the next most popular desktop was Xfce, with 11.23 per cent).

As Ubuntu rose to prominence, GNOME enjoyed a huge growth in popularity. In 2008 the gap had narrowed to 43 per cent in favour of KDE and 40 per cent in favour of GNOME. By 2010 (two years after KDE released its paradigm-breaking version 4.x) GNOME had eclipsed KDE's popularity, with 45 per cent of voters calling it the best desktop environment, compared with 33 per cent for KDE. The future was bright. The future was purple. Ubuntu users had grown to love GNOME.

Arbitrary changes

It was a sign of things to come when users upgrading to version 10.04 rebooted to find that their window buttons had mysteriously migrated from the right-hand side of the window bar to the left.

Responding to the feedback, Canonical founder Mark Shuttleworth said that "Moving everything to the left opens up the space on the right nicely, and I would like to experiment in 10.10 with some innovative options there. The design team is well aware of the controversy, your (polite) comments and more importantly data are very welcome, and will help make the best decision."

Ivanka Majic, one of Ubuntu's designers, explained the decision as "a golden opportunity not only to make our OS as good as the competition, but to make it better", and posed the question: "Are we smoking crack to think that the learning curve for getting used to a new position is ever going to be worth any real or perceived benefit of new positions?"

Unfortunately, the learning curve was too great for most people, and users elected to put the buttons back in their rightful place with gconf-editor. Even today a Google search for 'window buttons on right' brings up a guide to fixing Ubuntu - and this is for a design change made two and a half years ago.
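For the record, the fix amounted to changing a single GConf key - the window-button layout in GNOME 2's Metacity settings. A sketch of the command-line form, using gconftool-2 (the CLI counterpart to the graphical gconf-editor):

```shell
# Restore the traditional button order under GNOME 2 / Metacity.
# Ubuntu 10.04 shipped the layout "close,minimize,maximize:" (buttons on the left);
# this puts minimise/maximise/close back on the right of the title bar.
gconftool-2 --set /apps/metacity/general/button_layout \
  --type string "menu:minimize,maximize,close"
```

The change takes effect immediately, since Metacity watches GConf for updates.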

In contrast to the earlier touchy-feely values espoused by Ubuntu, Shuttleworth clarified Ubuntu's relationship with the majority of its users, telling a mailing list at the time: "We have processes to help make sure we're doing a good job of delegation, but being an open community is not the same as saying everybody has a say in everything. This is not a democracy. Good feedback, good data, are welcome. But we are not voting on design decisions."

Two and a half years on from the beginning of the title bar experiment, and we're yet to see anything innovative in the right-hand side of the bar. But we have had the mother of all baby/bathwater incidents in the shape of Unity.

Unity was not the perfect solution that Canonical claimed it would be; in fact, it's only now becoming stable enough for reliable use. One of Unity's problems was Compiz, the compositing manager on which it was originally built, which was prone to conflicts with OpenGL drivers. It's fair to say that, technical issues aside, Unity has not been popular.

If we look at the figures for the year after Unity was released (2011) on LinuxQuestions.org, KDE was still the favourite desktop environment of 33 per cent of users. Unity, which was supposed to be better than GNOME, scored just 4.6 per cent. Perhaps even more illuminating was the fact that Xfce's popularity had grown to 27 per cent. Having built such a large, helpful community, it's strange that Ubuntu didn't listen to them more.

The design change is one example, but there are more - the Upstart init daemon (much of whose work was duplicated, and ultimately superseded, by systemd) and Project Harmony are both examples of Ubuntu announcing new ideas that failed to gain much traction.

Another idea that never happened was the proposal that Ubuntu switch to a rolling release developmental model either alongside or replacing the current bi-annual full release model. Distros that use rolling releases, such as Arch or Manjaro, tend to be quite popular among more experienced users, and several either are exclusively rolling releases or offer a rolling version alongside periodic full releases.

However, contributors to Ubuntu were concerned. How would they have to rearrange things now? Shuttleworth decided the best thing to do was strengthen the Long Term Support releases, shortening the life cycles of interim releases, and "designating the tip of development as a Rolling Release."

In 2008 Shuttleworth suggested Debian, GNOME, the kernel, X, and all other projects time their major releases to coincide with Ubuntu's development schedule in what he called "cadence." As he put it, "There's no doubt in my mind that the stronger the 'pulse' we are able to create, by coordinating the freezes and releases of major pieces of the free software stack, the stronger our impact on the global software market will be, and the better for all companies - from MySQL to Alfresco, from Zimbra to OBM, from Red Hat to Ubuntu."

Shuttleworth later replied to criticism saying, "We're not trying to get people to shift to accommodate us. We're trying to catalyse a conversation across the whole ecosystem, and let things settle where they do."

Many in the Linux community considered the idea of coordinating schedules to be futile, and rejected the concept.

Space: the final frontier

Next spec

Replacing GNOME with Unity, however, didn't cause anywhere near the level of uproar the community experienced when Mir was announced. Mir is a new display server written to replace the X window system and Compiz compositing manager.

Underlying the decision was Canonical's desire to compete with the Android operating system on phones and tablets. X doesn't scale down to small devices very well, especially with fancy effects, while Mir, according to Canonical, offers the same capabilities in a more efficient, smaller and safer package.

The decision to move to Mir was accompanied by criticism of the competing Wayland project, which annoyed plenty of developers. Software engineer Kristian Høgsberg, Wayland's creator, countered Ubuntu's claim that Wayland wasn't up to the job: "The technical reasons on the Mir page just don't add up… don't go out and tell the whole world how Wayland is broken and has all X's input problems."

Wayland

Daniel Stone, contributor to X.Org (one of the base technologies without which we'd still be staring at blinking green text on a black background), put in: "I'm not worried about Wayland's future at all. I'm just irritated that this means more work for us, more work for upstream developers, more work for toolkits, and more work for hardware vendors."

Another large deposit of code and an untold number of developer hours were now out the door, but Mir was already working with Android display drivers, and was, according to management, the wave of the future. Despite the wailing and gnashing of teeth, Ubuntu is moving forward in a new direction, and the desktop PC isn't it.

In October 2012 Ubuntu announced that it would no longer be sharing as much developmental news or as many developmental releases with the community as in years past. Shuttleworth was tired of the in-fighting and outside criticism, saying: "While we won't talk about [new features] until we think they are ready to celebrate, we're happy to engage with contributing community members that have established credibility in Ubuntu, who want to be part of the action."

Ubuntu on the move

Will Canonical's commitment to Unity pay off in the long run?

Ubuntu smartphones

Ubuntu showed the world what many had been saying for years - that Linux really isn't that difficult. Somehow Ubuntu hit just the right chord of revolutionary spirit, community ownership, and geek status for the masses. It was a remarkable accomplishment, one that others, such as Mandriva, had struggled and failed to achieve.

It wasn't long before Ubuntu began appearing on corporately manufactured machines. In April 2007, three years after Ubuntu's launch, Dell announced that it would be stocking machines pre-loaded with Linux, with CEO Michael Dell telling the world that he used Ubuntu 7.04 at home. The buzz that this created helped propel Ubuntu - and Linux as a whole - into the mainstream.

Linux gained more exposure with the short-lived craze for netbooks - small, low-powered devices with tiny keyboards, which have seen their market niche all but wiped out by the advent of touchscreen devices. Many of these devices came with Linux pre-installed, but the manufacturers crippled them with buggy, badly designed interfaces and poor software choices.

Ubuntu Netbook Remix

Ubuntu rode to the rescue with its excellent Netbook Remix, but by then the damage had been done, with many people convinced that Linux was a cut-price inferior alternative to Windows. That was a shame. As Mark Shuttleworth said in 2012:

"We've known for a long time that free software is beautiful on the inside - efficient, accurate, flexible, modifiable. For the past three years, we've been leading the push to make free software beautiful on the outside too - easy to use, visually pleasing and exciting. That started with the Ubuntu Netbook Remix, and is coming to fruition in 12.04 LTS."

He was right, but Ubuntu preloaded has suffered a couple of false starts. In 2010 Dell quietly dropped its Ubuntu machines from its website, saying at the time: "We've recently made an effort to simplify our offerings online, by focusing on our most popular bundles and configuration options, based on customer feedback for reduced complexity and a simple, easy purchase experience… We're also making some changes to our Ubuntu pages, and as a result, they are currently available through our phone-based sales only."

However, it's not all bad news. Dell's excellent XPS 13 laptop, a rare high-end machine preloaded with Linux, is now available in both the USA and Europe. And Shuttleworth remains optimistic, even bullish, about the future of Ubuntu preloaded.

In May 2012 he told a press conference: "Next year about 18 million PCs, or five per cent of the total market, should ship with Ubuntu preloaded." Canonical currently claims "20 million users and counting" and 1.3 million websites running on Ubuntu servers with 22,000 new sites added every month.

Beyond that, Shuttleworth has announced plans for Ubuntu TVs, Ubuntu smartphones, and Ubuntu for tablets. He's predicted 200 million Ubuntu users by 2015, and he's hoping that gadgets are the way to get there.

A change in direction

Ubuntu tablets

At the launch of Ubuntu for phones Shuttleworth told us that the OS had been designed with broad device compatibility in mind, but we have yet to see a flood of touchscreen devices bearing the Unity interface. We've seen one model planned for release in October 2013, but as yet the release is planned only for Australia. Check out www.ubuntutablet.com.au for the details.

Is Ubuntu for phones dead in the water, or is the take-up simply slower than anticipated? Shuttleworth told us at the launch of the phone OS that Canonical had the backing of a "very large supplier of silicon to the mobile industry," and that it was "in discussions at the highest levels with top-tier carriers in North America, Europe and China to get launch partners for those devices".

In the same conference call, Shuttleworth expressed his vision for the future as universal Ubuntu branding. He wants shoppers to walk into electronics stores and cellphone kiosks and recognise Ubuntu because they already use it on their phones, computers, TVs and other devices. "It solves a lot of problems for us if people go into a store and see Ubuntu branding."

Made for TV

Ubuntu TV

That would be quite an achievement, but is Shuttleworth overreaching given the somewhat disappointing take-up of Ubuntu for phones, and the lack of market penetration achieved by Ubuntu TV so far?

Ubuntu TV was the first non-desktop, gadget-based OS to launch with the Unity interface, and it arrived with a fanfare similar to the one that accompanied Ubuntu for phones. Ubuntu TV was going to revolutionise the way we consumed content, setting us free to watch what we wanted, when we wanted.

Canonical CEO Jane Silber said at the time of Ubuntu TV's launch: "OEMs and ODMs are increasingly wary of the walled garden approach that certainly Apple takes - and increasingly Google, although it's much more open than Apple. We see a lot of demand for a neutral player."

This is exactly the line that Canonical is taking with Ubuntu for phones: that there's a duopoly of Apple and Google controlling smartphones and other small mobile devices, and that the OEMs of the world are crying out for a system that will enable them to break the control of the big two - a "neutral player", as Silber puts it. We hope the phones and tablets succeed, but is it too late for Ubuntu to make an impact on the mobile market?

If it is, and the phone- and tablet-specific features rumoured to be in a forthcoming release of Ubuntu (check out https://wiki.ubuntu.com/Touch/Install to get a copy of the Ubuntu Touch Developer Preview - but read the disclaimers first) go largely ignored, then you have to wonder whether Canonical's push for convergence has been worth the effort.

Over a year on from the launch of Ubuntu TV there are precisely zero manufacturers making the things.

Why Ubuntu's great

Ben Everard doffs his hat to Mark Shuttleworth for taking all the risks

Ubuntu Gnome

Ten years ago, the Linux world was very different from today. Back then, the idea of using Linux if you weren't a hard-core computer geek was laughable. Mark Shuttleworth changed all that. He came along with an idea that Linux could be easy to use, and he made it so.

Earlier, we said that Canonical didn't contribute as many patches to the Linux kernel as other companies. This is because Canonical realised what other people didn't: that desktop users didn't care about the kernel - they wanted a well-integrated, easy-to-use desktop.

That's what Canonical has spent their time doing, and you know what? They've been far more successful at getting new users into the Linux fold than Red Hat, SUSE, or any of the other top kernel contributors.

Canonical's contribution to the Linux ecosystem wasn't new code, but a new mindset. A mindset that said "Hey, Linux is for everyone." And, interestingly, that mindset was what Linux needed then. There were already hundreds of developers working on the kernel, but very few working on making it friendly.

The second big thing that Canonical has done for Linux is provide infrastructure. Take Linux Mint as an example. It's one of the most popular Linux distributions today - if not the most popular - and it's easy to see it as the successor to Ubuntu as Linux for the average Joe. Yet the way the Mint developers have been able to focus so much on creating a great user experience is by not having to focus on the infrastructure. They simply use Ubuntu's repositories, and then, without even trying, they have servers to distribute updates around the world at zero cost.
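To illustrate the point (the release names below are plausible examples from mid-2013, not a claim about any particular Mint image), a Mint system's APT configuration simply lists Ubuntu's archive and security repositories alongside Mint's own:

```
# Illustrative APT sources for a Mint release based on Ubuntu 13.04 'Raring'
deb http://packages.linuxmint.com olivia main upstream import
deb http://archive.ubuntu.com/ubuntu raring main restricted universe multiverse
deb http://security.ubuntu.com/ubuntu raring-security main restricted universe multiverse
```

Everything below the first line - the bulk of the distribution, plus security updates - is served from Canonical's mirrors.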

Ubuntu is also the only remaining major commercial distribution that regular users can actually use. Sure, Red Hat does all this wonderful stuff, it might even be a wonderful distro, but unless you have thousands of pounds to spend, you'll never know. The same is true of SUSE. Yes, they both have community versions, but they keep the best stuff for their enterprise clients only.

The reason for this is that all three of them (Canonical, Red Hat and SUSE) are businesses, not charities. In a perfect world, Canonical would employ vast numbers of kernel developers and push all their changes upstream, and Red Hat Enterprise Linux would be free to download (yes, we know CentOS is free, but that's in spite of Red Hat's efforts, not because of them).

The point we're getting at is that every Linux company has to make compromises. It's almost ten years since Ubuntu launched, and the Linux desktop's been getting better at a slow-but-steady rate. There's nothing wrong with that, but sometimes you need to shake things up to get to a better place, and you can't shake things up with community consensus.

As Henry Ford said: "If I had asked people what they wanted, they would have said faster horses." To get past a certain threshold, you need someone to step in and say how it has to be done. When we have the benefit of hindsight, we either laud these people as visionaries or deride them as fools, but at the time, before the dust settles, they're always treated as troublemakers.

Unity is just such a shake-up. It challenges us to think about what makes a good desktop, and about whether we're using our computers in the best way. Even if it ultimately fails, we as a community will be a bit wiser about how to make good interfaces.

The plurality of desktop options on Linux means Canonical can try this without taking any other choices away from users. After all, it wasn't Canonical that stopped GNOME 2; it was the GNOME project itself. If you don't like Unity, then KDE, Xfce, LXDE and (as of the most recent release) GNOME Shell all have official Ubuntu versions.

Unity brings us nicely along to Mir, Canonical's other controversial graphical project. We're not going to make excuses for Canonical: it was wrong of Canonical to spread misinformation about Wayland when it announced Mir. We hope it was an honest mistake rather than a deliberate attempt to spread fear, uncertainty and doubt about the alternative system, but even if it was a mistake, Canonical must have known that the information would be widely circulated, and should have checked that its facts were completely accurate.

However, just because there was some wrong information in the public announcement doesn't mean that the entire idea behind it is wrong. We weren't privy to the discussions within Canonical about why it felt it needed a new display server.

Now that it's decided it does, it's gone about it in a way that should mean a minimum of duplicated work for applications. It's implementing Qt and GTK bindings, and providing legacy support for X. Hopefully, this means the extra work involved falls on Canonical rather than on application developers.

Beyond the desktop

Apple newton

Apple released a tablet. You may have heard of it - it's called the Newton. To be honest, it didn't set the world on fire, and it gradually faded into obscurity and ceased production in 1998.

Naturally, Apple realised that it couldn't do hand-held devices and didn't try again. If that were true, we'd be living in a very different world today. Maybe we'd all be using Blackberries, maybe Nokia would have finally committed to a Linux-based phone, or maybe we'd all be using feature phones with batteries that lasted two weeks and played Snake 2. We'll never know, because that's not what successful companies do. It's not what successful people do.

If we want Linux to succeed, it shouldn't be what we want Linux developers to do. Ubuntu TV hasn't been a success yet, but that doesn't mean it was a bad idea to try it, and it certainly doesn't mean that Canonical should stop trying new things.

Getting Ubuntu into consumer electronics would be a massive boon for the open source movement. Let me tell you why. Android is built on Linux. We all know that, but what's on top of the Linux kernel is a mish-mash of proprietary software centres (Google Play, etc.) and commercial 'apps'. Fundamentally, it's a commercial system that happens to be based on an open source kernel.

But, more importantly, it doesn't challenge the established players in any market on anything othe r than mobile. It doesn't push people to use Linux in any way other than on their phones.

Now, imagine for a moment that Ubuntu phones took off. Imagine they were seen as the third player behind iPhones and Android phones. Imagine going into a phone shop and seeing a display of Ubuntu phones, and having the sales staff show you how to use them.

In this potential future, it probably won't make any difference to you. After all, the fact that you're reading Linux Format means you're probably already a convert. However, imagine what it'll mean to less technical people. The people who - if they've heard of Ubuntu at all - just think of it as a thing for geeks.

Suddenly, it'll be seen as a user-friendly system, maybe even a cool system. If this happens, Ubuntu desktops will inherit the image and be seen as potential computers for everyone.

Mint Gnome

Ten years ago, Canonical set out to create a mindset of "Hey, Linux can be for everyone", and it did, within the Linux community. By trying to move out into consumer hardware, it's trying to take this mindset to the masses. We don't know whether it'll work, but we applaud them for trying.

If it works, the knock-on effects for other distros could be massive. Not least because the Linux desktop would be seen as a major platform and taken seriously by hardware manufacturers. Computers are changing, and they're changing fast. They used to be on our desks, but now they're in the cloud and in our pockets. Maybe tomorrow they'll be in our glasses and watches as well.

Time and tide wait for no man, and neither does technological change. To play it safe is to surrender the future to innovators, and to get caught up in an endless game of catch-up. From desktop Linux's point of view, to play it safe is to remain forever a niche product for geeks and hobbyists.

To innovate is to get a chance to define the future, but it comes with the risk of alienating your current users. The unique advantage desktop Linux has is that we can do both. Ubuntu can innovate while Mint, OpenSUSE, and all the others play it safe. That's the advantage of the open source ecosystem that we've built up over the years.

But we only have that advantage if people are willing to take risks. Mark Shuttleworth is taking a tremendous gamble. If it pays off, then the whole Linux community benefits.

    

