For some, the words "Lotus Notes" are enough to strike fear into the heart. But for younger readers, that resonance is probably absent, and so the importance of the recent port of the Lotus Notes client to GNU/Linux is probably lost.
In a sense, Lotus Notes for GNU/Linux is noteworthy precisely because the program is the epitome of corporate computing, with all that this implies. Its appearance is further proof that GNU/Linux has arrived. It also removes yet another obstacle to adopting free software in a business context for some 120 million people currently using the program on other platforms - whether willingly or not.
31 July 2006
The US PubMed Central service has become one of the cornerstones of biomedical research, and a major milestone on the way towards full open access to all scientific knowledge.
Just as the world's central genomic database GenBank exists in three global zones - the US, Europe and Japan - so the natural step would be to roll out PubMed Central as an international service. The first move towards that has now been made with the announcement that a consortium of UK institutions has been chosen to set up UK PubMed Central (UKPMC). That's the good news. The bad news - maybe - is that one of them is the British Library.
Why is that bad news, since the British Library is one of the pre-eminent libraries in the world? Well, that may be so, but it is also deeply involved with Microsoft's Open XML, the rival to OpenDocument Format; Microsoft is trying to push Open XML through a standardisation process to match ODF's full ISO status. It is particularly regrettable that the British Library is bolstering this pseudo-standard with its support, rather than wholeheartedly backing ODF, a totally open, vendor-independent standard, and this could be a real problem because of the British Library's role in the UKPMC consortium:
In the initial stages of the UKPMC programme, the British Library will lead on setting up the service, developing the process for handling author submissions and marketing the resource to the research community.
It's the "handling author submissions" that could be bad news: if, for example, the British Library gave any preference to submissions made in Microsoft's XML formats, it would be a huge step back for openness. The US PubMed Central does the Right Thing, and takes submissions in either XML or SGML. Let's hope the UK PubMed Central follows suit and goes for a neutral submissions policy. (Via Open Access News.)
The Wall Street Journal has a nice piece about what it calls the "moguls of new media":
As videos, blogs and Web pages created by amateurs remake the entertainment landscape, unknown directors, writers and producers are being catapulted into positions of enormous influence. Each week, about a half-million people download a comedic video podcast featuring a former paralegal. A video by a 30-year-old comedian from Cleveland has now been watched by almost 30 million people, roughly the audience for an average "American Idol" episode. The most popular contributor to the photo site Flickr.com just got a contract to shoot a Toyota ad campaign.
What I like about this WSJ feature is that it shows clearly the difference between the new media it celebrates and the old media it represents. The WSJ piece is well written, well edited and full of well-researched facts. Rather unlike new media, which tends to be scrappy and light on substance. But then, that's its charm, just as the reason the WSJ will always have a role, even when new media becomes even more pervasive and even more successful, is that it will never be any of these things. (Via Slashdot.)
The news that CNN is now soliciting user-generated stories and content - rather as the BBC has been doing for a while - is important not so much for what will result, but for the message it sends. Even if the user-generated content turns out to be nugatory, the fact that CNN is jumping on this bandwagon gives the latter more impetus, which can't be a bad thing in terms of re-inventing media.
Judging by this article, Brazil's federal government may well be the next to adopt ODF as its official standard. As the news item notes, adopting open source is all very well, but if your documents are still locked into proprietary formats like Microsoft Office, you're only half-done.
The great thing about these announcements is that there's a positive feedback loop: the more that are made, the more other governments feel safe in following suit, which boosts the process even more. (Via Erwin's StarOffice Tango.)
The news that someone is offering their Digg profile on eBay is hardly a surprise in these days when people will try to sell anything there; but it's nonetheless significant. Digg is one of the leading Web 2.0 sites, and a leading exponent of the power of social networks. What can be done with Digg can be applied elsewhere.
This will lead to a de-coupling between the person who creates the online account in these networks and the account itself, which can be sold to and used by others. Which raises the question: wherein lies the value of that account? If the person who created it - and whose social "value" it reflects - moves on, what then of that value? In effect, the account becomes more of a brand, with certain assumed properties that can be lost as easily as they were gained if the new owner fails to maintain them.
30 July 2006
OK, so it's the idea of free computing that Microsoft is trying to patent, rather than just free software. It's still doubly stupid. Stupid, because this is obvious, and hence non-patentable. Stupid, because it's not even novel (and hence non-patentable): it was tried during Dotcom 1.0, where it failed miserably. Make that triply stupid.
So far, the best answer to How can open source companies make money? seems to be that of providing services - typically training, support and general consultancy. There's another approach, involving dual licensing, but this is more problematic in some ways, and there's also evidence that it may only be a transitional approach on the way to a full service model.
Against this background, Irving Wladawsky-Berger has an interesting post on his blog about services, beginning with this observation:
If you look at IBM's business last year, services revenues were roughly 55%, while systems (hardware) and software revenues were around 25% and 20% respectively. But services constituted around one-third of the company's profit, for a very simple reason. Systems and software products leverage technology assets and apply engineering principles to improve quality, scale-up capacity, and achieve higher productivity and profit margins. Services, on the other hand, have historically been significantly more labor-based, less prone to economies of scale, subject to higher quality variations, and generally less productive and profitable.
Services - and analyses of them - will clearly be moving to the foreground in years to come, and not just in open source. The latter will, however, be a trailblazer in this respect as in many others. Another reason for those outside the world of free software to pay close attention to it.
Many in the world of free software are aware that the Spanish region of Extremadura has been installing GNU/Linux in its schools. It has even created its own distribution called GNU/Linex. This project began many years ago - the roots go back to 1998 - and most people probably assumed it had all fizzled out.
Far from it. For the Extremadura government has now announced that it will be going even further:
All the computers of the Junta of Extremadura (a regional government in Spain) will be running free software within a year. This project makes the Regional Government the first Public Administration to adopt standards upheld by international organizations, that favour "technological innovation and the reduction of user dependency".
The councillor for Infrastructures and Technological Development, Luis Millán de Vázquez de Miguel, met the press this Friday to inform about the agreements reached in the last board meeting of the Government held last June 25. In said meeting, it was agreed that all the computers of the Junta of Extremadura would have to be adapted to free software office tools and gnuLinEx (the local flavour of Debian GNU/Linux) within one year.
Thus, from now on, all workers of the public administration must use open document formats (ISO/IEC DIS 26300) in their office applications for information and administrative processes, as well as PDF/A (Portable Document Format, ISO 19005-1:2005) for exchanging documents when guaranteed unalterable visualisation is required.
That is open source, and OpenDocument too.
But what's most interesting about this announcement is that it shows how mighty oaks can grow from small acorns: once organisations have tried free software and discovered how good it is, it becomes much easier to move on to larger-scale implementations. This, in its turn, shows that every open source project, no matter how small, is important.
29 July 2006
It is a truism that if you believe in your product you welcome comparisons with the competition. After all, if you really are the best, you can only gain from that fact being made manifest.
So this delightful tale from PC Pro suggests that even Microsoft has no faith in its own products:
There can be no doubt that Microsoft is afraid of the open-source movement. Last month, as part of the conditions for allowing us to include Office 2007 on the cover disc, Microsoft Corp - in many ways a company distinct from the far cuddlier and more approachable Microsoft UK - wouldn't allow us to put any open-source software onto the same disc.
Microsoft specifically named OpenOffice.org as a program we couldn't include on the cover disc. That's an incredibly short-sighted move. The whole point about Office 2007 is what it offers over and above OpenOffice.org, that it allows you to create more sophisticated documents more quickly. If I was on the board of Microsoft Corp, I'd be demanding that magazines bundled both side by side so that people could make their own comparisons.
Obviously, the powers in Microsoft Towers aren't quite so confident of what the outcome might be. (Via SunMink.)
A couple of weeks back I wrote a piece for Linux Journal suggesting that top coders might start using some of their increasing influence outside computing to start talking directly with those in power in an attempt to change political agendas directly.
Now here's an interesting complementary suggestion: that supporters of free software should become consumer activists, and try to influence things from the bottom up.
In the early days of free software in business - say ten years ago - there was a natural tendency to think of it as a monolithic entity. But rather as chromatography can be used to separate out the constituent parts of an apparently uniform blob, so time gradually teases out the different elements that go to make up the rich and complex world of open source.
Thus we have projects like Apache and GNU/Linux, which are so much a part of the mainstream now that it is probably hard for most people to imagine a time when they were not part of it. Then there are projects like MySQL and JBoss that are fast establishing themselves as second-generation leaders. Finally there is the new wave - the SugarCRMs, the JasperSofts and Alfrescos - that are coming through fast.
I found a nice representation of this evo-devo in a post on Matt Asay's blog, where it is attributed to Robin Vasan. I'm afraid I've never heard of him (I obviously lead a sheltered life), but I see from his bio that he's involved with Alfresco, as Matt is, so this is obviously the connection.
Aside from the graphic - which diverges in detail from my view of things, but is broadly the same - Matt's post contains several other interesting slides (and ideas) from his recent presentation at OSCON 2006. It's well worth taking a look at.
28 July 2006
So Federated Media (John Battelle's new gig) has "launched a parenting “metaweblog” (at www.federatedmedia.net/parenting) that highlights the talented voices of the authors in its new Parenting federation."
The metaweblog - basically an aggregation of blog feeds on a given subject - is an inevitable development for blogs as they enter the mainstream. After all, it is hard work going out and finding all the blogs that interest you on a given subject: much easier to subscribe to one handy metaweblog feed and be done with it.
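At heart, a metaweblog of this kind is just a feed merge: pull in several RSS feeds on one subject and interleave their entries by date. A minimal sketch in Python, using only the standard library (the feed contents here are invented stand-ins for real downloads, since the actual Federated Media feeds aren't specified):

```python
import xml.etree.ElementTree as ET
from email.utils import parsedate_to_datetime

# Hypothetical RSS snippets standing in for two real parenting blogs.
FEED_A = """<rss><channel>
<item><title>Sleep schedules</title>
<pubDate>Mon, 24 Jul 2006 09:00:00 +0000</pubDate></item>
</channel></rss>"""

FEED_B = """<rss><channel>
<item><title>Toddler food</title>
<pubDate>Tue, 25 Jul 2006 09:00:00 +0000</pubDate></item>
</channel></rss>"""

def merge_feeds(*feeds):
    """Collect <item> entries from several RSS documents, newest first."""
    items = []
    for xml_text in feeds:
        root = ET.fromstring(xml_text)
        for item in root.iter("item"):
            title = item.findtext("title")
            when = parsedate_to_datetime(item.findtext("pubDate"))
            items.append((when, title))
    items.sort(reverse=True)  # most recent entry at the top
    return [title for _, title in items]

print(merge_feeds(FEED_A, FEED_B))  # → ['Toddler food', 'Sleep schedules']
```

A real aggregator would fetch the feeds over HTTP and deduplicate entries, but the core of the idea - one merged, date-ordered stream from many sources - is no more than this.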
I mean, I ask you: next thing, they'll be asking us to think about what we read.
The line-up of cross-platform open source apps is pretty impressive, except in one area: Web design. Until now, all we've really had is Nvu, which is certainly very easy to use, but rather limited in terms of more advanced features.
I've an article over on Linux Journal that looks at dual licensing. The issue of how you might make money out of open source software is important, not just to the coders but also to the users. If the former don't feel that they've received enough payback - of whatever kind - they are likely to move on, and the latter will then suffer.
So I was interested to see the news that everybody's favourite blue frog - Azureus - has taken some venture capital money to turn "legit", or at least corporate. There's no reason why this should have any downside for the free version: presumably they aim to produce paid-for, customised versions for content producers etc. In fact, it might even help educate the latter that BitTorrent is not synonymous with evil.
For a nation of rationalists, France's DADVSI law is a seriously irrational piece of legislation that guts not just fair use, but comprehensively clobbers free software. Read this explanation - and then weep. (Via Boing Boing.)
This story about US legislation that will require schools to block MySpace and other social networking sites might not seem to have much to do with this blog. But the Deleting Online Predators Act (DOPA) is symptomatic of a far-wider problem.
The story is basically one of old, ignorant politicians taking the opportunity to grandstand by supporting legislation full of high-sounding principles - "protecting the children" and other such tosh - that is in fact highly pernicious.
Of course, banning access to social networking sites from schools will not "protect the children": on the contrary, it will expose them all the more. The children will simply access these sites from home, or friends' homes, where they will probably be completely unsupervised. At school, by contrast, they could be taught how to recognise trouble, and how to deal with it - educated, in other words.
The knee-jerk reaction to ban any novel technology that can be used for nefarious purposes is also symptomatic of politicians' lack of understanding and fear of the new - call it the Tubular Syndrome. DOPA is clearly block-headed, but it's not, alas, a unique example of political block-headedness.
Google has launched an open source code repository, called, with stunning originality, Google Code. This is particularly good news, because it not only signals Google's continuing efforts to boost open source - probably the best single way to attack Microsoft without seeming to - but also because it provides an alternative to the main code repository in use today, SourceForge.net.
Don't get me wrong, I've nothing against SourceForge. On the contrary, it has played a crucial role in the blossoming of free software, and we owe it a huge debt. But it represents a single point of failure that has been one of open source's greatest weaknesses: just imagine what would happen if SourceForge were to go down for a few days. The existence of an alternative, backed by Google, can only be good news in this respect.
If you need a quick but comprehensive introduction to ODF to give to relations/friends/colleagues/total strangers, you could do worse than this one by Sam Hiser. It concludes by noting:
At one time the main interface for working with information in documents was the software application (an office suite or a text editor of some kind); now, the main interface is the document itself, and it won't matter what application you use. The OpenDocument Format is bringing the world from an application-centric model of computing to a document-centric model of computing. This means that creating new business processes will be as easy as typing a memo on a PC or working with a small connected device.
27 July 2006
One idea gaining currency is that of creating a patchwork of free wireless Internet access points by leaving your WiFi network open for anyone to use. Of course, spoilsports have adduced all sorts of reasons why this might be a bad thing to do, but now, courtesy of those nice people at the RIAA, here's a reason why it might be a good move:
For years, the RIAA has claimed that having the IP address of a computer that has shared unauthorized files is the equivalent of having the evidence of who was actually sharing files. That, of course, is false. The IP address simply can help you know who paid for the internet access, but not who was using what computer on a network. In fact, this even had some people suggesting that, if you want to win a lawsuit from the RIAA, you're best off opening up your WiFi network to neighbors. It seems like this strategy might actually be working.
I'd have expected this news about a new Emotion Incubator Group at the W3C to have been released on the 1st April:
Emotion-oriented (or "affective") computing is gaining importance as interactive technological systems become more sophisticated. Representing the emotional states of a user or the emotional states to be simulated by a user interface requires a suitable representation format. Although several non-standard markup languages containing elements of emotion annotation have been proposed, none of these languages have undergone thorough scrutiny by emotion researchers, nor have they been designed for generality of use in a broad range of application areas.
Well done Andy Updegrove for spotting this: quiet day at the office, Andy?
I have been writing about the Internet since 1994, and about ICANN, the body that oversees the domain name system, since 1999. Unfortunately, in those seven years, ICANN has confirmed the fears of myself and many other observers about the stultifying effect it would have on the functioning of the Internet.
Given that ICANN was backed by the US Government, which is not known for its sensitivity to the views of others, especially mere foreigners, I had largely resigned myself to this sorry state of affairs. But miraculously, something seems to have happened:
In a meeting that will go down in internet history, the United States government last night conceded that it can no longer expect to maintain its position as the ultimate authority over the internet.
Having been the internet's instigator and, since 1998, its voluntary taskmaster, the US government finally agreed to transition its control over not-for-profit internet overseeing organisation ICANN, making the organisation a more international body.
For me, the main thing to come out of this is the hope that ICANN might finally be forced to open up its totally inappropriate secrecy, and as a result that maybe - just maybe - it will start serving the Internet community in the way it was supposed to all those years ago.
Something called "Cultural, Artistic and Scientific knowledge for Preservation, Access and Retrieval (CASPAR)" sounds like a typical money-wasting euro gravy-train. But the central problem it aims to address - "How can digitally encoded information still be understood and used in the future when the software, systems and everyday knowledge will have changed?" - is important, and becoming more so by the day.
Over long periods of time, you cannot hope to keep every wacky proprietary data format alive by storing copies of the relevant software: you'd also need to store old operating systems, software manuals etc. The only practical solution is to use open formats. For these, the information will be accessible long after the programs that created them have gone to the great data repository in the sky.
Alfresco, probably the leading open source enterprise content management company, has announced that it will support ODF. Not earth-shattering in itself, but a useful, incremental, step forward. (Via Bob Sutor's Open Blog.)
I've written about INSPIRE before, and now this depressing piece in The Guardian suggests that the twits in the UK Government are going to scupper it because of their feudal insistence on protecting inefficient and anachronistic "businesses" like the Ordnance Survey. Release the data and let a thousand businesses bloom, people.
The Guardian has a story entitled "The open source approach to education". It's about a report on the same from Futurelab, and while there's nothing really new there, it's nonetheless good to see the usual memes being spread about a bit in new contexts:
It is also possible to conceive of young people or teachers working together as programmers to create new resources and tools that are of relevance to them in supporting their own learning. These approaches go beyond the traditional distinction between 'users' and 'producers' of educational resources, instead, they offer models of innovation in which these communities are intermingled, the notion of ownership is changed and the economical model of cost and reward is reworked. These new hybrid models of innovation that FLOSS exemplifies require us to ask what models of ownership we might need to develop; what mechanisms might need to be put in place to encourage exchange between sectors; what role users of educational resources might play in the creation of resources; and what business models would need developing to allow further exploration in this area.
26 July 2006
Scalix has always been something of a showcase for how GNU/Linux can be used to run enterprise-level software. And yet, ironically, Scalix itself has been proprietary.
Given that the company was successful with its earlier strategy, this move to a dual-licensing approach is an interesting comment on the even greater advantages of being open source, and not just running on it. (Via Linux-Watch.)
Another great post by Rob Weir, with an even better heading (any friend of Modest is a friend of mine). It shows in loving detail how Microsoft would rather exhume one of its old zombie standards than be caught, er, dead supporting an open standard like ODF.
For all its corporate rapaciousness, Microsoft has always been scrupulous in keeping its partners happy: it knows that much of its immense strength derives from the huge Windows ecosystem it has created around itself. Indeed, whatever its manifest misdeeds in terms of abusing its monopoly position, it is arguable - and the company itself has made the argument - that through this vast network it has created far more wealth than any harm it is accused of.
Against this background, two recent moves are pretty astonishing. First, there is Zune, which as many commentators have observed, is unlikely to damage Apple so much as all those who signed up to the horribly-named "PlaysForSure" initiative to provide online music services based around Microsoft technologies.
And now, even more surprising, we have Microsoft's move into offering healthcare software. The actual figures involved are minuscule, but the signal it sends is immense. For it seems to suggest that in its growing desperation at the loss of market share in its traditional sectors - and with the threat of ever-greater losses in the future - the company has decided to break its golden rule of leaving vertical markets to third parties while it supplies (at a handsome profit) all the infrastructural stuff.
I can't help seeing the hand of Ballmer in this, eager to make his own mark on what is still Bill Gates's company. It would be an obvious thing for a hard-nosed salesman to do - to carve up former partners in an attempt to grab slices of new pies. But I predict that the move will go down very badly with Microsoft's erstwhile supporters, already unnerved by the sword of Zune hanging over them, as they begin to wonder which sector will be next on the Microsoft hitlist.
In fact, I expect they're starting to feel as sick as a parrot.
I've been a little dismissive of the One Laptop Per Child (OLPC) project, but having read LWN.net's fascinating interview with Jim Gettys, the Vice President of Software, I have become more impressed with what it is trying to do, and how it is trying to do it.
Unfortunately, it seems that the Indian Ministry of Education does not share my new-found enthusiasm. According to Heise Online:
The Indian Ministry of Human Resource Development (HRD), the country's ministry of education, has rejected the suggestion made by the country's Planning Commission to take part in the "One Laptop per Child" (OLPC) program. In a letter sent to the Planning Commission last month Education Secretary Sudeep Banerjee of the Ministry of Human Resource Development stated that the case for giving a computer to every single child was "pedagogically suspect", according to a report in the Indian daily The Hindu. "It may actually be detrimental to the growth of creative and analytical abilities of the child," the secretary added. Classrooms and teachers were more urgently needed than "fancy tools," the Education Secretary went on to say.
It's a fair point, and a concern that research will need to show is unfounded when the first deliveries of the laptops take place next year (assuming they do).
Further proof, if any were needed, that open source business operates under different rules. Here's a letter from the CEO of Pervasive, a company set up "with the goal of helping accelerate the transition from the traditional, high cost database licensing model to the open-source, high-value model", specifically around PostgreSQL. But something funny happened along the way:
While we always knew that PostgreSQL is a solid product with advanced database capabilities and that it has a very real opportunity to shake up the high-end database market, we underestimated the high level of quality support and expertise already available within the PostgreSQL community. In this environment, we found that the opportunity for Pervasive Software to meaningfully increase adoption of PostgreSQL by providing an alternative source for support and services was quite limited.
In other words, the PostgreSQL community can look after itself, thank you very much. Just as can all other flourishing free software projects. Which is why, ultimately, they will prevail: because they are immune to the fads and fashions of the business world. (Via Matthew Aslett.)
Om Malik has a nice (if brief) piece on the rise and rise of the domestic penguin. I particularly liked this:
The logic is simple – the other two players – Apple and Microsoft will have their own proprietary systems. Apple will not share them with anyone else, of course! Microsoft would as long as you play ball with their DRM system or some other lock-in. (And they are not to be taken lightly.)
This leaves CE makers, and start-ups with one option – Linux.
Time to put some herring in the 'fridge.
This isn't particularly new, but I've only just caught up with it (on Irving Wladawsky-Berger's blog). It's "Democratizing Innovation" by Eric Von Hippel, yet another book that draws inspiration from open source:
The trend toward democratized innovation can be seen in software and information products -- most notably in the free and open-source software movement -- but also in physical products. Von Hippel's many examples of user innovation in action range from surgical equipment to surfboards to software security features. He shows that product and service development is concentrated among "lead users," who are ahead on marketplace trends and whose innovations are often commercially attractive.
I haven't read it yet, but the basic idea sounds good, and Von Hippel is generally very sound - as evidenced by the fact that he has made the book freely available online under a CC licence.
The Semantic Web is a kind of intelligent Web, one that consists not just of contextless numbers and meaningless words but of data that means something:
The Semantic Web is a web of data. There is lots of data we all use every day, and it's not part of the web. I can see my bank statements on the web, and my photographs, and I can see my appointments in a calendar. But can I see my photos in a calendar to see what I was doing when I took them? Can I see bank statement lines in a calendar?
Why not? Because we don't have a web of data. Because data is controlled by applications, and each application keeps it to itself.
The Semantic Web is about two things. It is about common formats for interchange of data, where on the original Web we only had interchange of documents. Also it is about language for recording how the data relates to real world objects. That allows a person, or a machine, to start off in one database, and then move through an unending set of databases which are connected not by wires but by being about the same thing.
Tim thinks it's going to be really big, which is good enough for me.
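That "connected by being about the same thing" idea is easy to sketch without any RDF machinery at all. Here two hypothetical datasets - a photo store and a calendar, with entirely invented identifiers - are each a set of (subject, predicate, object) triples, and the join happens simply because both mention the same date:

```python
# Two hypothetical datasets, each a set of (subject, predicate, object) triples.
photos = {
    ("photo:42", "takenOn", "2006-07-25"),
    ("photo:42", "title", "Beach"),
}
calendar = {
    ("event:7", "heldOn", "2006-07-25"),
    ("event:7", "title", "Day off"),
}

def photos_during_events(photo_triples, event_triples):
    """Link photos to the calendar events that share the same date value."""
    # Index events by the date they were held on.
    dates = {obj: subj for subj, pred, obj in event_triples if pred == "heldOn"}
    # Any photo taken on one of those dates links to the matching event.
    return [
        (subj, dates[obj])
        for subj, pred, obj in photo_triples
        if pred == "takenOn" and obj in dates
    ]

print(photos_during_events(photos, calendar))  # → [('photo:42', 'event:7')]
```

The real Semantic Web does this with URIs and shared vocabularies rather than ad hoc strings, but the answer to "can I see my photos in a calendar?" is exactly this kind of join across databases that were never designed together.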
But to use all this richness, we're going to need tools, so it's good to see some open source ones coming along. Doubly good, because they're from a company, Aduna, that has seen the free software light. (Via Enterprise Open Source Magazine.)
25 July 2006
Why am I not surprised by this?
“It appears as if the venture capital industry is slowly ratcheting up investment levels for the first time in four years, and these increases seem to be directed in a prudent manner,” Mark Heesen, president of the National Venture Capital Association said in a press release. That might have been true, but the recent trend of big money investments shows that prudence might be giving way to old-fashioned avarice.
Om Malik makes clear why the next dotcom delirium is inevitable:
Before they can go out and raise cash for new funds, many of the firms have to invest the remnant money from their circa 2000-2001 funds. Otherwise, Limited Partners might be asking them the difficult question: Why should I give you more cash when you are sitting on a pile already? A good example would be Oak Investment Partners, which raised $2.56 billion to become the largest venture capital fund ever.
In other words, there is simply too much greedy money in the system, some of it left over from the last VC feeding frenzy at the trough, which in the coming days absolutely must be spent, whether or not it makes sense. Did anyone say bubble?
When it comes to virtualisation, I still wonder what the fuss is about, since it all seems so, well, virtual. But even I have to admit this (warning: high-level geekery) is seriously heavy:
Xen, an open source virtual machine (VM) monitor, works with just about any Linux distribution. One useful feature for shops that care about high availability is Xen's ability to migrate virtual machines while they are running.
With Xen and live migration, hardware can be replaced or upgraded without interruptions in service. Applications can be freed from the hardware they run on.
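In terms of the Xen 3.x management tools, a live migration is a single command on the source host (the domain and host names below are invented for illustration, and this assumes xend on both machines has been configured to accept relocation requests):

```shell
# On the source host: move the running domain "webserver" to host2
# without shutting it down. The --live flag keeps the guest running
# while its memory pages are copied across the network.
xm migrate --live webserver host2.example.com

# Afterwards the domain should appear in the listing on the destination:
xm list
```

The running applications inside the guest never see a reboot; at worst there is a sub-second pause while the final memory pages and CPU state are transferred.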
Jay Rosen is the Richard Stallman of open source journalism: he has thought the ideas and pushed for the action. So anything he came up with would be interesting, but I think that his NewAssignment.Net idea is more than that. It is:
In simplest terms, a way to fund high-quality, original reporting, in any medium, through donations to a non-profit called NewAssignment.Net.
The site uses open source methods to develop good assignments and help bring them to completion; it employs professional journalists to carry the project home and set high standards so the work holds up. There are accountability and reputation systems built in that should make the system reliable. The betting is that (some) people will donate to works they can see are going to be great because the open source methods allow for that glimpse ahead.
In this sense it’s not like donating to your local NPR station, because your local NPR station says, “thank you very much, our professionals will take it from here.” And they do that very well. New Assignment says: here’s the story so far. We’ve collected a lot of good information. Add your knowledge and make it better. Add money and make it happen. Work with us if you know things we don’t.
Do read the whole post: it's long, very detailed and very well thought-out.
I hope it works. But I fear it may not, because it sounds terribly similar to schemes during dotcom 1.0 that were designed to do the same for open source. That is, somebody - companies, usually - would put up money to get particular bugs fixed. Coders would then agree to fix the bugs for the money. It was a great idea, but it all fizzled out somehow.
Maybe this will work better, because people will be more engaged by stories, especially those that touch their lives in some way. But in any case, it's worth trying, especially since Craig Newmark, of craigslist, has provided $10K to give it a whirl. (Via Searchblog.)
Mention wikis, and most people think of Wikipedia. But Wikipedia is a one-off, a unique, unrepeatable example of what a wiki can be. And so its well-aired growing pains are also pretty specific to what it is and what it's trying to do. They arise mostly from the lack of a strict organisational hierarchy that allows content to be perused and ultimately policed. Strikingly, just such a hierarchy is a salient feature of all the main open source projects, from Linux down.
As a consequence, this most uncorporate of tools might just flourish best precisely in the context of a company. Why? Because there the hierarchy is already in place - it doesn't even need to be articulated, it can simply be applied in the context of a wiki. Basically, this means that more junior members of the hierarchy have to watch what they say and do more than senior ones.
That doesn't imply that they should refrain from joining in: on the contrary. The wiki is a canvas on which to display their wit and wisdom to even the most senior echelons of the company, so it would be counter-productive to abstain entirely. But it does mean that the kind of puerile activities that some get up to on Wikipedia would be self-censored.
Against this background, it's interesting to see announcements from two companies offering corporate wiki products. JotSpot applies the wiki's collaborative method to traditional tools like documents and spreadsheets. I think that's a mistake, because the wiki isn't so much a way as a thing, contrary to popular wisdom. After all, collaboration is hardly a new idea; what's new is the specific form of the wikispace in which it happens. In that respect, I prefer Socialtext's approach, especially now that it has come out with an open source version.
The Association of Learned and Professional Society Publishers (ALPSP) has released its response to the British Library's Content Strategy. It's a wonderful demonstration that they don't really know what is going to hit them:
We believe that a shift towards the provision of online rather than physical access is appropriate. However, customer expectations of what is possible with online content are limited only by the capabilities of the technology, and not by realistic business considerations; at the extreme, every UK citizen might expect free online access, and unhindered re-use, at home or at work to everything in the Library’s collection, which would obviously destroy the market for publishers.
Saying that online access will "obviously" destroy the market for publishers is akin to saying - as was said - that television will obviously kill the cinema, that the cinema will obviously kill the theatre etc. etc. Those who are the gatekeepers of older technologies always fear new ones that will reduce their powers. But what happens is that new technologies tend to create new opportunities even for the older ones they appear to threaten - at least to those who are open-minded enough.
Two classic examples: MP3s have created a huge demand for songs that are no longer in the catalogues, and P2P networks are full of videos of old television shows. Think how much money the owners of these materials could make if they decided to satisfy this demand themselves, instead of trying to stifle it.
It's the same with books. Providing online versions does not kill the need for books; indeed, it is likely to encourage people to buy more, for one very simple reason. The text that you read online is not the text you read in a book, even though the characters are similar: it lacks the physical experience of bookness. It is that - not the text - that book publishers are ultimately selling.
"Obviously", judging by the comments above, and by many others elsewhere, it's going to take a long hard battle to din this idea into the heads of those in the publishing industry. (Via Open Access News.)
Another reason why the system for registering domain names needs sorting out:
For quite some time numerous European registrars have been complaining that the company Ovidio is registering .eu domain names en masse. Upon looking into the complaints, Eurid's legal department came to the conclusion that Ovidio Ltd., Fausto Ltd. and Gabino Ltd. had been used as a cover by some 400 registrars to "warehouse" .eu domain names.
At least this time there may be some rapped knuckles. Ultimately, though, something more thoroughgoing is required to stop this persistent abuse.
24 July 2006
I don't pretend to know anywhere near enough about the inner dynamics of the WTO to understand what the apparent failure of the "Doha Round" means, but I live in hope that it represents some fatal weakening of the WTO globally. Especially in the area of intellectual monopolies, I can't help feeling that the WTO is a 20th-century mechanism for solving a quintessentially 21st-century problem.
I'm a big fan of Wikipedia; I use it several times a day. But that does not mean I am blind to its manifest (and manifold) faults. It does mean that I want it to get better.
So I was particularly interested to read about the latest kerfuffle involving an entry that was edited inappropriately and the battles to get it rectified, because it involved Bernard Haisch, who is the president of the Digital Universe Foundation. Digital Universe is a major project that, among other things, will offer a test of whether it is possible to use wikis to organise knowledge in such a way as to enjoy all or most of Wikipedia's advantages without its most egregious disadvantages.
Haisch's measured complaint is well worth reading. But perhaps even more interesting are Larry Sanger's comments on the incident. As the co-founder and "chief organiser" of Wikipedia from 2000 to 2002, Sanger is uniquely well-placed to draw lessons from the saga. Now might be a good time for Wikipedia to phone home and take cognisance of Sanger's views.
I'm not keen on the term 'pirates' when applied to people who copy content; it's one of those blatant attempts to pre-empt the debate by adopting a deliberately loaded terminology (rather like the idea of a 'war on terror'). My view is that pirates - the real ones - were a murderous and contemptible crew whose crimes are not even remotely comparable to those who transgress one-sided and disproportionate copyright laws, and therefore the two should never be associated.
But maybe I will need to re-visit my position. Although the reality behind pirates has not changed, the public perception probably has. And that's largely thanks to two films: Pirates of the Caribbean I and II. As a result of Johnny Depp's lovable rogue, equating those who infringe on copyright with pirates might actually make the former seem rather more admirable.
But there is something else interesting going on here. "Pirates", the film, is one of the most successful in recent times; and yet, as these figures show, it is also one of the most copied/'pirated' on the P2P networks. As noted by TechDirt, this goes to show, in the most dramatic way possible, that
despite what movie execs say, their films can "compete with free" -- and do pretty well. Whether it's offering something more convenient, offering moviegoers a better experience, or using free content as a jumping-off point to sell people other stuff, there's lots of ways movie studios and theaters can thrive in the face of file-sharing. But to do that, they've got to own up to the obvious, and quit blaming piracy instead of changing how they do business.
As the open meme sweeps through field after field, there is a tendency to assume that openness on its own is enough. But as this wise post by Pedro Beltrão about open science explains, there's something else you need if you are to get the full benefits of opening up: modularity.
Open source thrives because major tasks are split up into smaller ones, joined by clean interfaces. This enables tasks to be distributed, and sometimes performed in parallel. Competition operates at the level of the small tasks - the best solutions are chosen - rather than at the top level, which is how proprietary software typically works.
But as Beltrão points out, science still encourages competition at the topmost level - at the point when results are published - which leads to teams being scooped and work wasted. It would be far more sensible if the whole were split up into smaller tasks where competition could operate more fruitfully, and he has some practical suggestions about how that might be achieved.
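To see why modular competition is more efficient, consider a miniature model: with clean interfaces you can take the best candidate for each task, whereas a contest between whole pipelines forces you to accept the winner's weak parts along with its strong ones. The teams, tasks and scores below are entirely invented for illustration:

```python
# Modular vs monolithic competition, in miniature: per-task selection
# can only match or beat picking a single best whole pipeline.

pipelines = {                     # team -> score per task (higher is better)
    "team_a": {"collect": 9, "analyse": 4, "publish": 6},
    "team_b": {"collect": 5, "analyse": 8, "publish": 7},
}
tasks = ["collect", "analyse", "publish"]

# Monolithic: the best single team wins, weak parts included.
monolithic = max(sum(p[t] for t in tasks) for p in pipelines.values())

# Modular: each task is won by whoever does it best.
modular = sum(max(p[t] for p in pipelines.values()) for t in tasks)

print(monolithic, modular)  # 20 24 - modular can never score lower
```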
Being of the older generation, I've never really gone wild over YouTube. But I recognise and salute its cultural importance, because it represents yet another instance of people's innate desire to share. But as this post by fellow old fogey John Battelle points out, YouTube has a big problem: the majority of its content is basically illegal according to current copyright laws.
So here we have a young people's phenomenon that is spreading like wildfire, and that is doomed once the Content Commissariat realise what's going on and decide to get their legal clubs out. In other words, it's Napster 2.0.
There is a great irony at the heart of the Internet. Free software and its characteristic distributed development method were made possible by the Internet. Similarly, many of the earliest free software programs - Sendmail, BIND etc. - helped create the Internet. And yet today, the knots of the Net's interconnections - the routers - are generally proprietary (and usually from Cisco).
So here's an idea: how about creating an open source router? Enter Vyatta, which is doing precisely that. It's been working on the idea for a while, and, according to GigaOM, is close to launching its first product.
Assuming it gets it right, I don't see any reason why this shouldn't steadily chip away at Cisco's dominant market share, just as every other open alternative to commoditised products has done. If it does, expect other open source solutions to enter this market soon.
23 July 2006
When will they ever learn?
Unlike traditional forms of identification, the VeriChip can’t be lost, stolen, misplaced, or counterfeited.
That's what the human-implantable RFID VeriChip site says. And this is what happened at the Hackers on Planet Earth (HOPE) 6 conference:
two presenters demonstrated the electronic equivalent of making a copy of an implanted RFID or radio frequency ID chip.
The point was to show just how easy it is to fool a detection device that purports to uniquely identify any individual.
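The underlying weakness is easy to model: a tag that merely broadcasts a static identifier can be cloned by anyone who overhears a single read, because identification is not authentication. A minimal sketch of that logic (all names and IDs below are invented, not real VeriChip behaviour):

```python
# A tag that just emits a static ID can be cloned from one overheard
# read: the reader has no way to distinguish original from copy.

class StaticTag:
    def __init__(self, uid):
        self.uid = uid

    def respond(self):          # what the tag emits when energised
        return self.uid

def reader_accepts(tag, enrolled_ids):
    """The reader 'verifies' identity purely by matching the ID."""
    return tag.respond() in enrolled_ids

victim = StaticTag("0x1234ABCD")
enrolled = {"0x1234ABCD"}

# The attacker needs only one overheard response to build a clone.
clone = StaticTag(victim.respond())
print(reader_accepts(clone, enrolled))  # True
```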
So let's just do a quick recap: which technologies are available for establishing identity unambiguously these days?
Implanted RFID: nope
So, tell me, Mr Blair, how exactly were you going to implement this ID card system so that it can't be spoofed to hell?
Apparently, Benjamin Horst has succeeded in rustling up enough dosh to take out an ad for OpenOffice.org, along the lines of the big Firefox ad campaign. Well done, that man.
This is doubly good news. First, because OpenOffice.org will gain some useful publicity, and secondly, because it shows that the Firefox ad was not just a one-off. In this sense, these campaigns point the way for future open source marketing drives.
22 July 2006
I knew that GoDaddy was evil, but this tie-up between the company and Google suggests that the latter is moving deep into the darkness too.
The scheme goes by the inglorious name of "CashParking": basically, it encourages scammers to register every possible domain name, and then load the home pages up with Google AdSense to generate some money from the poor individuals who wander there by mistake. As a result, it will reduce the overall attractiveness and value of the Web, and encourage even more of this behaviour. Typical parasitism, in other words.
Shame on you, Google: aren't your soaraway profits enough? (Via Searchblog.)
A long and thoughtful post by Matt Asay on why JBoss has been so successful. No secret sauce, but plenty of fascinating figures along the way. This is an area I'm getting into in a big way: my recent LWN.net article was a shot across the bows. You have been warned.
21 July 2006
Here's a useful round-up of free software projects at the United Nations - there's more than you might think. Pity that this section is so mealy-mouthed:
Most of the traditional software industry has its base in the developed world; there is concern that promoting FOSS could hurt this industry. From the developing countries' perspective, however, FOSS is a way to introduce competition in order to lower costs and expand options. The different views of the role of software in development have hindered the UN's ability to create a single coherent strategy for FOSS to apply to all member states.
Free software does not "hurt" the traditional industry, it merely brings a much-needed balance between users and suppliers. Business models may change, but as IBM and others have shown, there's plenty of money to be made under this new regime. To phrase it in terms of "hurt" gives succour to outdated and paternalistic practices that have no place in the new computing landscape.
This stuff is getting too easy.
First, find some ancient remains - Croatian Neanderthal bones are great. Next, sequence lots - at least 20 times coverage. Don't worry if all you're getting are tiny fragments with around 100 DNA letters, and the signal is vastly swamped by bacterial noise. Just bung the results into a computer, and tell it (a) to cancel out all bacterial genome sequences (b) to join up all the rest. Result: one Neanderthal genome.
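The two computational steps - filtering out the bacterial noise, then joining up the rest - can be caricatured in a few lines. Real assemblers use far more sophisticated overlap and error models; this is just a sketch with made-up sequences:

```python
# Cartoon genome assembly: (a) discard reads matching known bacterial
# sequence, (b) greedily join the survivors wherever they overlap.

def assemble(reads, bacterial, min_overlap=3):
    reads = [r for r in reads if r not in bacterial]  # (a) filter noise
    genome = reads[0]
    for read in reads[1:]:                            # (b) join the rest
        # look for the longest suffix/prefix overlap, longest first
        for k in range(min(len(genome), len(read)), min_overlap - 1, -1):
            if genome.endswith(read[:k]):
                genome += read[k:]
                break
    return genome

reads = ["GATTACA", "TTTCCC", "ACATTG", "TTGCAT"]   # "TTTCCC" is bacterial
print(assemble(reads, bacterial={"TTTCCC"}))        # GATTACATTGCAT
```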
There's just one problem:
If the Neanderthal genome were fully recovered, it might in principle be possible to bring the species back from extinction by inserting the Neanderthal genome into a human egg and having volunteers bear Neanderthal infants. There would, however, be great technical and ethical barriers to any such venture.
Understatement of the Year, Number 369.
First cars, then trains, now planes. New Scientist is reporting that it is now possible to create almost an entire plane by "printing" the components:
In rapid prototyping, a three-dimensional design for a part - a wing strut, say - is fed from a computer-aided design (CAD) system to a microwave-oven-sized chamber dubbed a 3D printer. Inside the chamber, a computer steers two finely focussed, powerful laser beams at a polymer or metal powder, sintering it and fusing it layer by layer to form complex, solid 3D shapes.
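The core trick is turning a 3D design into a stack of 2D cross-sections for the laser to fuse one layer at a time. Here is a minimal sketch of that slicing step, using an implicitly defined sphere as a stand-in for the CAD model (all sizes are arbitrary):

```python
# Slice an implicit solid into per-layer voxel cross-sections - the
# stack of 2D shapes a 3D printer would sinter one layer at a time.

def slice_solid(inside, size):
    """inside(x, y, z) -> bool; returns one set of (x, y) cells per layer."""
    return [{(x, y) for x in range(size) for y in range(size)
             if inside(x, y, z)} for z in range(size)]

def sphere(x, y, z, c=3.5, r=3.0):
    return (x - c) ** 2 + (y - c) ** 2 + (z - c) ** 2 <= r ** 2

layers = slice_solid(sphere, size=8)
areas = [len(layer) for layer in layers]
print(areas)  # symmetric, with the widest cross-section in the middle
```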
Two things are interesting here. First, this is precisely what Michael Hart, the founder of Project Gutenberg, has been predicting for years. Indeed, he sees Project Gutenberg, which essentially lets you print your own books, as just the first, quite small step in the next industrial revolution, where physical objects will be printed routinely.
Secondly, note that the parts are printed under the control of a software program. So if the program and the data are open, this means that effectively the physical object will also be open. As usual, openness brings with it all the usual advantages of speed and lack of redundancy - you can re-use parts or parts of parts in other designs to create quickly entirely new objects.
The group behind the Adelphi Charter has apparently proposed that the UK Patent Office be replaced with an Office of Intellectual Property:
The proposed new statutory body, the Office of Intellectual Property, with the suggested acronym OfIP, would cover all kinds of IP, including copyright and patents, and would have to report annually to the British Parliament on its activity, said John Howkins, director of the Creative Economy Forum, which put forward the proposal. The forum is a group of international IP experts - business, academics, non-governmental organisation representatives - interested in the creative economy.
An interesting idea, though I'm not wild about enshrining the term "intellectual property" in the title: how about the Office of Intellectual Monopolies? That's got a nice Orwellian ring about it.
Our bodies are open. That is, unless we are planning some criminal activity, we do not try to hide the basic physical facts about ourselves - our voice, our face, our eyes, our fingerprints. Unfortunately, these are precisely the characteristics that biometric ID schemes depend on for verification. This is tantamount to walking around with a large sign saying "my password is xxxx".
And this isn't just my opinion. Here's what one Bori Toth, biometric research and advisory lead at Deloitte & Touche, no less, has to say on the subject:
Many people are trying to regard biometrics as secret but they aren't. Our faces and irises are visible and our voices are being recorded. Fingerprints and DNA are left everywhere we go and it's been proved that these are real threats.
So the use of precisely these spoofable biometrics is just one more reason to bin the whole idiotic ID card idea, which rather depends on them being foolproof. (Via Slashdot.)
Here's a good analysis and general round-up of why Microsoft's decision to produce an extremely spiffy and completely different interface for Office 2007 was as bad a move as a very bad move can be. For me, the killer quote is:
OpenOffice.org and Microsoft Office Currently Look More Similar Than Office 2003 and Office 2007
Training your users to go from Office 2003, or before, to OpenOffice.org, might be easier than teaching them the whole new Office 2007 system.
Just how easy is Microsoft trying to make the decision to jump to OpenOffice.org?
...is all very well, but will only work properly if attitudes to current intellectual monopolies change. Simply licensing stuff to and from others will prove too sclerotic without a radical simplification in the area of patents.
What is needed is a commons-based approach, where most ideas are available for all to use: then the value is added in the way ideas are combined, not just in coming up with them in the first place.
For younger readers of this blog, the name Andy Tanenbaum may not mean much. But for oldies such as myself, it is highly redolent of those epic days when Linux was but a fledgling kernel, and taunts like "your mother was a hamster" and "Linux is obsolete" were thrown down like gauntlets.
I had the pleasure of interviewing Tanenbaum for my book Rebel Code, and it was fascinating to learn how close he came to creating what we now call GNU/Linux with his Minix. But Tanenbaum failed to do one crucial thing that Linus did almost without thinking: to let go. Understandably, as a professor of computer science Tanenbaum wanted to keep control of his teaching materials. But that one, tiny, reasonable brake was enough to stunt the growth of Minix and lend wings to Linux when it appeared in 1991.
Tanenbaum is still teaching, at the Vrije Universiteit in Amsterdam (another Dutch story, then - must be the Rembrandt Effect), and I was interested to note this piece about some of his recent work on developing an anti-RFID device. Good to see him still moving forward in his work. (Via openspectrum.info.)
A short, but interesting reflection on the changing nature of conversations - naked and not-so-naked. The key point:
The bottom line is that technology ushers in new forms of social organization that escape notice precisely because they are invisible to adherents of the old paradigm.
Although I can't quite claim to go back to the very first commercial domain, I do remember the Wired story about how many major US corporations had neglected to register relevant domains. And I also remember how around $7.5 million was paid for the utterly generic and pointless business.com domain.
So I've seen a thing or two. And yet I can still be disgusted by the depths to which the scammers can sink when it comes to domain names. Try this, for example: a company that seems to be magically reserving domain names shortly after people have entered them as a Whois search - only to dump them if they don't pull in any traffic.
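The "taste then dump" logic is depressingly simple to express. The sketch below assumes the registrar is exploiting a refundable registration grace period, so discarded names cost nothing; the domain names, threshold and traffic figures are all invented:

```python
# Sketch of "domain tasting": register a batch of names, keep only
# those that attract enough visitors to pay their way, and drop the
# rest within the refund window at no cost.

GRACE_PERIOD_DAYS = 5   # assumed refund window for dropped registrations
KEEP_THRESHOLD = 50     # daily visitors needed to be "worth" parking

def taste(candidates, traffic):
    """candidates: domain names; traffic: name -> daily visits."""
    kept, dropped = [], []
    for name in candidates:
        if traffic.get(name, 0) >= KEEP_THRESHOLD:
            kept.append(name)     # park it and load it up with ads
        else:
            dropped.append(name)  # refunded - costs the scammer nothing
    return kept, dropped

traffic = {"cheap-flights.example": 900, "xkqzv.example": 2}
kept, dropped = taste(["cheap-flights.example", "xkqzv.example"], traffic)
print(kept, dropped)  # ['cheap-flights.example'] ['xkqzv.example']
```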
It's this kind of parasitical business model that is pushing the domain name system close to breakdown, and making the Internet far less efficient than it could be.
Just in time for Rembrandt's 400th birthday, here's some good news from Holland:
De gemeente Groningen heeft besloten om een overeenkomst met Microsoft voor de levering van de Office-suite van de softwaregigant te laten verlopen. De noordelijke gemeente heeft namelijk besloten om over te stappen op het opensourcepakket OpenOffice.org. Hiermee is de gemeente Groningen volgens eigen zeggen de grootste gemeente in Nederland die serieus met opensourcesoftware aan de slag gaat.
Which, I think, says (roughly) that the northern Dutch municipality Groningen has decided not to renew its contract with Microsoft for Office, but to go with OpenOffice.org, confirming Groningen's position as the open source leader in Dutch local government.
What's interesting is that it's OpenOffice.org that's driving open source uptake again. Sure, Firefox is more widely used, but it rarely figures as a conscious decision. And it's certainly not one that loses Microsoft any revenue (though its managers probably lose some sleep), as OpenOffice.org will in Groningen, to the tune of 330,000 Euros. (Via LXer.)
20 July 2006
It looks like Bill Gates is one step closer to getting it. According to this press release from his foundation, regarding a major research grant to create a series of research consortia to accelerate HIV vaccine development:
These consortia will be linked to five central laboratories and data analysis facilities, enabling investigators to openly share data and compare results, and allowing the most promising vaccine approaches to be quickly prioritized for further development.
As a condition for receiving funding, the newly-funded vaccine discovery consortia have agreed to use the central facilities to test vaccine candidates, share information with other investigators, and compare results using standardized benchmarks.
In other words, Gates is demanding open data sharing, and maybe open access too (it's not clear yet, as Peter Suber notes).
But this is a slippery slope, Bill: once you accept the inherent efficiency of sharing data "openly", as the press release emphasises, it's only a short conceptual leap before you find yourself accepting and then encouraging the other ways of sharing stuff "openly"....
Comments are the ichor that courses through the blogosphere's veins. A blog with no comments is probably dead, and a blogger that doesn't comment on the blogs of others probably needs to get out more.
But if it's hard enough keeping track of all the interesting things happening so that you can blog about some of them, keeping track of all the comments to your comments has been practically impossible. No longer. As this TechCrunch piece notes, there are now no fewer than three rival services that will help you track comments. Maybe I ought to try one.
I have to admit that I tend to read The Reg more for its entertainment value than for its incisive analysis (with the honorable exception of John Lettice's pieces on ID cards, which always manage to be worth reading on both counts). But there's no doubt that sometimes there's some sharp thinking as well as sharp writing.
Like this piece on Microsoft's snuggle-up with XenSource in the field of server virtualisation:
Knowing that it can't compete in the market in the interim, Microsoft has played the old IBM trick of creating confusion. Don't go with VMware. Go with XenSource. That's who we like. Have a look at what they have to offer.
Blooks are generally blogs turned into books, but I suppose we can stretch the term to include books that are available from blogs. Here's a heartening story about the latter type.
A novelist became fed up waiting for an agent or publisher to deign to acknowledge his existence (don't we all know it?). So he did the obvious thing: bunged it up on his Web site for anyone to download. His reasoning?
I'm putting this full-length novel online and encouraging you to read it, send it to your friends, blog about it, distribute it on your blogs, etc. and we'll all see what happens. Maybe nothing. But maybe ... something. In fact after weighing the pros and cons of doing this, I can't find any actual downside.
I wrote The Agency Delta because I had a story. Now I want people to read it. I think it's a great story. Now tell me what you think.
This is the future, you writer people. (Via Self Publish Blooks.)
I wrote earlier today about the fallacy of assuming that once you start offering money the spirit that informs the world of collaborative efforts like open content evaporates, leaving crass cupidity. It occurred to me afterwards, that we have already been here before.
Back in 1999, the first wave of open source IPOs hit. One of the main beneficiaries of the VA Linux IPO was Eric Raymond. As he wrote at the time:
A few hours ago, I learned that I am now (at least in theory) absurdly rich. ... VA had indeed gone out on NASDAQ -- and I had become worth approximately forty-one million dollars while I wasn't looking.
He then turns away from this typically self-centred story to examine (with characteristic insight) the wider implications of the IPOs that were happening:
Reporters often ask me these days if I think the open-source community will be corrupted by the influx of big money. I tell them what I believe, which is this: commercial demand for programmers has been so intense for so long that anyone who can be seriously distracted by money is already gone. Our community has been self-selected for caring about other things -- accomplishment, pride, artistic passion, and each other.
This is still true. As proof, witness the Season of KDE 2006:
As in 2005, KDE was again a participating organization in this year's Google Summer of Code 2006. Many interesting and much-needed project ideas were submitted and students from all over the world began to apply for them. The KDE project received more than 200 student applications. Sadly, Google's capacities are not limitless and thus only 24 students were selected to participate in Google's Summer of Code under the mentorship of the KDE project.
Driven by the urge not to let many good applications go to waste the KDE project decided to give many of the rejected students a chance to realize their ideas after all in the first Season of KDE. Since KDE does not have Google's financial capacities the students will not get paid for their efforts. Still it is a very good opportunity for students to get involved in KDE development while being mentored by an experienced KDE developer and as a result be an active part of the Free Software Community.
In other words, no Google moolah is flowing, but the aspirant coders are still coding - out of sheer hacker love. Kudos to the students for doing so, and to their mentors for giving their time. That's what this open stuff is all about.
Like many, I've had great fun playing around with Google Trends. The tricky thing is trying to find something sensible to say about what you find there. Luckily, when it comes to GNU/Linux and related matters, Steven Vaughan-Nichols has already done it.
Matthew Aslett usefully flags up in his blog the rise and rise of open source in canonical top ten lists of computing - like the one in his own title, Computer Business Review. Yes, it's all arbitrary of course, but it always has been; so the appearance of open sourcey-ness all over the place is symptomatic, if nothing else.
(Parenthetically, I was pleased to see Angela Eager mentioned in his post: I gave Angela one of her first jobs in tech journalism a couple of geological epochs ago. It's good to see that training stood her in good stead.)
Larry Sanger (who does) explains to Jason Calcanis (who doesn't) what all this open content is really about - and why it isn't going away once companies start waving fistfuls of dosh in the air.
A little while ago I wrote about a slightly provocative list from Business 2.0 that suggested that a certain Linus Torvalds doesn't really matter any more. Joe Barr has followed this up with a hilarious exchange with The Man to find out his feelings on the same. An excerpt:
NewsForge: Have you really made a billion dollars from Linux?
Torvalds: No. Linux was just the cover story. I made all my money smuggling drugs while traveling to international conferences under the guise of talking about "the future of technology" or some such tripe.
It's wit like this that shows most clearly why Linus does matter. (Via fUSION Anomalog.)
As ODF continues its long march to conquer the world, the number of ODF documents that you come across online will increase. This makes an ODF Viewer for Firefox an indispensable tool. And there's one on the way - but be careful, it's alpha code only at the moment. (Via Bob Sutor's Blog.)
19 July 2006
I'm a big fan of open source meshes, with their potential to offer alternative ways of accessing the Internet. I'd not heard of the Champaign-Urbana Community Wireless Network before, but this story on GigaOM about serious NSF backing for work on an open source mesh network looks promising.
An important new collection of essays on open access has been published. It's called Open Access: Key Strategic, Technical and Economic Aspects. Hearteningly, most of the chapters have been self-archived by the authors: kudos to them for doing so, and to Chandos Publishing for being enlightened enough to allow it. (via Open Access News.)
18 July 2006
The World Wide Web Consortium (W3C) is the sticky stuff that holds the Web together; without it, the whole caboodle would slowly come unstuck, fraying into lots of proprietary strands.
So this kind of posting, which seems to indicate problems at the heart of the W3C, is deeply worrying:
I believe for our society to progress it's essential that our culture, our knowledge, and our society itself are as accessible as possible to everyone; web standards are how we choose to achieve this on the World Wide Web, and for us to communicate, especially if we have special needs or novel ideas about information access, it depends on compliance to web standards. With this in mind I became interested in assuring standards compliance on the Web and involved in the development of tools meant to help in this respect at the World Wide Web Consortium seven years ago.
I now have to discontinue my participation in this area at the W3C and would like to explain how the World Wide Web Consortium failed to provide what I think would have been and still is necessary to advance the tools and services to an acceptable level, which will explain why I am leaving now.
An interesting report from the French Ministry of Defence suggests that OpenOffice.org may have neglected some security issues in its headlong rush to achieve parity with Microsoft Office. The problem seems to lie with macros, and frankly, I'm not surprised. I never use them, and I really think that anyone who does is asking for trouble. A word processor is for, you know, processing words; it does not need to pass the Turing Test.
Still, this is the kind of stuff that's easily fixed with the odd huge window marked "Danger: do not run this macro" every five seconds. (Via Ars Technica.)
What makes a film open source? I've written before about Elephant's Dream, which is open source in the sense you can download all the Blender files that go to make it up. And now there's this:
Boy Who Never Slept is a free full-length movie that anyone can watch, share, and even use in their own derivative works (open source). BWNS is about an insomniac writer and his relationship with a teenage girl he meets online. What begins as merely a friendship evolves into an unlikely love wrapped in harsh reality.
There's a more thoughtful explanation, too:
In this sense, an open source movie refers to finished video content released with a derivative license, along with the "source" (the original files used to create the finished video), which are released with a derivative license as well. This allows a lot more ability for new artists to edit, remix, and evolve the original work.
Other items that may or may not be released along with the video content include the script or screenplay, the soundtrack, and the sound effects.
These source files are available here. (Via Enterprise Open Source Magazine.)
One noticeable effect of blogs is that they can bring out the worst in people. In part, this is the email problem of being unable to judge tone writ large. But it also seems to be the case that the sheer ease-of-use of the medium encourages all kinds of loonies to creep out of the woodwork. Religious wars on the relative merits of free software and open source are quite mild compared to the no-holds-barred attitude among the political blogs, which seem to polarise writers and readers alike.
That's why I tend to avoid sites like the one this appeared on, but as you will see from the post in question, something interesting is about to happen: the self-professed "lizardoids" are about to take on the "moonbats" in the Web 2.0 arena. What this means in practice is that there are going to be huge battles for the soul of Digg, with lots of marking up and down.
It might be quite entertaining, but it certainly won't be pretty. (Via BGSL.)
Yesterday, when I was writing about the structures found in DNA, I said
Between the genes lie stretches of the main program that calls the subroutines
This is, of course, a gross over-simplification. One of the most interesting discoveries of recent years is that between your common or garden genes there are other structures that do not code for proteins, but for strings of RNA. It turns out that the latter play crucial roles in many biological processes, for example development. Indeed, they are fast emerging as one of genomics' superstars.
So it is only right that Nature Genetics should devote an entire issue to the subject; even better, it's freely available until August 2006. So get downloading now. Admittedly, microRNAs aren't the lightest of subject-matters, but they're mega-important.
Last night, my Windows 2000 box died. To be fair, it was nothing to do with Windows, but a dodgy hard disc. And yes, of course I have backups...it's just that they're not entirely up-to-date, and missing even a few days' data is a pain. I could re-install Windows and hope that gave me access to my data (stored on a separate partition), but this would take a few hours that I don't have, and might not work. Luckily there's a better way.
Booting up the PC with the Knoppix 5.0 Live DVD inside produced not only a working machine in a couple of minutes, with access to all of my data, but a cool 5000 programs at my beck and call. Including K3b, which meant that I could simply burn copies of the data I was missing. Problem solved.
Thanks, Knoppix: you're a gent.
The Future of Media Report has two main things going for it. First, it comes from an Australian group, which gives it a slightly different perspective on things. Secondly, it is packed full of interesting graphs and charts. Make that three: it's available under a liberal CC licence.
17 July 2006
That old contrarian curmudgeon, Andrew Orlowski, has found a soul-mate in Richard Bennett: "[t]he veteran engineer played a role in the design of the internet we use today, and helped shape Wi-Fi" as Orlowski explains before an interview. In addition,
Bennett argues that the measures proposed to 'save' the internet, which in many cases are sincerely held, could hasten its demise. Network congestion is familiar to anyone who's left a BitTorrent client running at home, and it's the popularity of such new applications that makes better network management an imperative if we expect VoIP to work well. The problem, he says, is that many of the drafts proposed to ensure 'Net Neutrality' would prohibit such network management, and leave VoIP and video struggling.
The conversation that follows is extremely interesting, and certainly hits home. But I have big problems with this part:
They all seem to be worried that ISPs have a secret plan to sell top rank - to pick a search engine that loads faster than anyone else's. But it's not clear that a) anyone has done that; b) it's technically achievable; c) it is necessarily abusive; or d) their customers would stand for it.
These all seem very weak arguments against net neutrality; I'd rather err on the side of hippy edge-to-edge goodness.
I've written approvingly of PLoS ONE before, and it's also good to see that the underlying software platform will be open source. But I was disappointed to read this post calling for some "help to shape the future":
Now is your chance to get very actively involved in the creation of TOPAZ, the new Open Source publishing platform which PLoS is involved in developing and which will be supporting PLoS ONE when it launches. What we need is some people in the San Francisco area who would be willing to be on a focus group to give us some advice on the feel and functionality that you would like to see. It is a great project and we really do want your views.
Surely a global perspective is absolutely critical to what PLoS ONE is trying to achieve? So limiting focus groups to a very particular part of the anglophone world seems foolish, to say the least.
And it's not as if there aren't other ways that this could be done, taking input from all around the world. For example, I've heard this thing called "The Internet" can be quite handy in these circumstances....
Think of GNU/Linux and you probably think of servers, maybe with a smidgeon of desktop thrown in for good measure. In fact, the domain where GNU/Linux utterly dominates is that of high-performance computing.
But it may be that GNU/Linux's finest hour is yet to come - as a mobile phone operating system. After all, it is likely that there will be a mobile phone for most people on this planet one day, but the same cannot be said about conventional PCs.
So news stories like this one, about the doubling of membership of a GNU/Linux phone standards group, are actually rather important.
Is it just me, or has the entire VC industry gone utterly bonkers over Internet video? It seems that every day there's a new YouTube me-too launch on TechCrunch - with the usual "I think it's got potential" that seems to accompany every such story - and now, thanks to IP Democracy, we have a neat little table that suggests that VCs are, indeed, barking:
we take a look at the amount of venture money that has flowed into IP video start-ups over the past year and find that over $600 million has been invested in the YouTubes and Sling Medias and MobiTVs of the world since around this time last year. Our list (see table below) doesn’t even include investments for web sites or technology companies that focus a lot — but not primarily — on IP video efforts.
I realise that the Internet is a series of tubes, but as far as video is concerned, things really seem to be going down them.
Blige, serious goings-on in the EU parliament. In a text recently adopted, it
18. Takes note of the Commission's view that the EU must acquire a cost-effective, legally watertight and user-friendly system of intellectual property protection so as to attract technologically advanced companies; considers that the protection of intellectual property must not interfere with open access to public goods and public knowledge; urges the Commission to promote a socially inclusive knowledge-based society by supporting, for example, free and open source software and licensing concepts like the General Public License (GPL) and the Public Documentation Licence (PDL);
This is a gauntlet thrown down to the European Commission, particularly Microsoft's friend, Mr McCreevy. I wonder what will happen next. (Via Heise Online.)
Virus writers are not known for their morals - or for being fanboys: they will use whatever means necessary to achieve their dubious ends. So when McAfee warns that
Malicious-software writers are increasingly using open-source methodologies when developing their code
you can be pretty sure that this follows an extremely objective evaluation of the various competing development methods.
Maybe not quite the testimonial most free software enthusiasts were looking for, but a testimonial nonetheless.
The genome – the totality of DNA found in practically every cell in our body - is a kind of computer program, stored on 23 pairs of biological DVDs, called chromosomes. Within each chromosome, there are thousands of special subroutines known as genes. Between the genes lie stretches of the main program that calls the subroutines, as well as spacing elements to make the code more legible, and non-functional comments – doubtless deeply cool when they were first written – that have by now lost all their meaning for us.
DNA's digital code – written not in binary, but quaternary (usually represented by the initials of the four chemicals that store it: A, C, G and T) – is run in a wide range of cellular computers, using a central processing unit (known as a ribosome), and with various initial values and time-dependent inputs supplied in a special format, as proteins. The cell computer produces similarly-formatted outputs, which may act on both itself and other cells.
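To make the "quaternary, not binary" point concrete, here is a playful little sketch in Python: since there are exactly four bases, each one fits in two bits, so a strand can be packed four bases to the byte. (This is purely illustrative of the analogy; real bioinformatics storage formats add headers, masking and padding rules on top of this basic idea.)

```python
# Two bits per base: the whole quaternary alphabet in half a nibble.
BASE_TO_BITS = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}
BITS_TO_BASE = {v: k for k, v in BASE_TO_BITS.items()}

def encode(dna: str) -> bytes:
    """Pack a DNA string four bases per byte (last byte padded with 'A')."""
    dna = dna.upper()
    packed = bytearray()
    for i in range(0, len(dna), 4):
        chunk = dna[i:i + 4].ljust(4, "A")
        byte = 0
        for base in chunk:
            byte = (byte << 2) | BASE_TO_BITS[base]
        packed.append(byte)
    return bytes(packed)

def decode(packed: bytes, length: int) -> str:
    """Unpack back to a DNA string of the given original length."""
    bases = []
    for byte in packed:
        for shift in (6, 4, 2, 0):
            bases.append(BITS_TO_BASE[(byte >> shift) & 0b11])
    return "".join(bases[:length])

strand = "GATTACA"
assert decode(encode(strand), len(strand)) == strand
```

Seven bases squeeze into two bytes rather than seven, which is roughly why the three billion base pairs of a human genome fit comfortably on a single one of those "biological DVDs" worth of storage.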
Thanks to a far-sighted agreement known as the Bermuda Principles, the digital code that lies at the heart of life is freely available from three main databases: one each in the US, UK and Japan. As a result, the DNA that was obtained through the Human Genome Project is open source's greatest triumph.
But so far, no human genome can be said to represent any single human being: that of the Human Genome Project is in fact a composite, made up of a couple of dozen anonymous donors. But soon, all that will change; for the first time, the complete genome of a single person will be placed in the public databases for anyone to download and to use, creating in effect the world's first open source man.
His name is Craig Venter, and for nearly two decades he has been simultaneously revered and reviled as one of the most innovative researchers in the world of genomics. He was the person behind the company Celera that sought to sequence the human genome before the public Human Genome Project, with the aim of patenting as much of it as possible. Fortunately, the Human Genome Project managed to stitch together the thousands of DNA fragments it had analysed – not least thanks to some serious hardware running GNU/Linux – and to put its own human genome in the public domain, thus thwarting Celera's plans to make it proprietary.
A nice twist to this story is that it turned out that Celera's DNA sequence was not, as originally claimed, another composite, but came almost entirely from one person: Craig Venter himself. So his latest project is in many ways simply the completion of this earlier attempt to become the first human with a fully-sequenced genome. The difference now, though, is that it will be in the public databases, and hence accessible by anyone.
This will have profound consequences. Aside from placing his DNA fingerprint out in the open – which will certainly be handy for any police forces that wish to investigate Venter – it means that anyone can analyse his DNA for anything. At the very least, scientists will be able to carry out tests for genetic pre-dispositions to all kinds of common and not-so-common diseases.
So it might happen that a laboratory somewhere discovers that Venter is carrying a genetic variant that has potentially serious health implications. Most of us will be able to choose whether to take such tests and hence whether to know the results, which is just as well. In the case of incurable diseases, for example, the knowledge that there is a high probability – perhaps even certainty – that you will succumb at some point in the future is not very useful unless there is a cure or at least a treatment available. Venter no longer has that choice. Whether he wants it or not, others can carry out the test and announce the result; since Venter is a scientific celebrity and a public figure, he is bound to get to hear about it one way or another.
So while his decision to sequence his genome might be seen as the ultimate act of egotism, by choosing to publish the result he will in fact be providing science with a wonderfully rich resource - the complete code of his life - and at some considerable risk, if only psychological, to himself.