30 December 2006

Warning: Taggers at Work (Again)

Over the New Year period I'll finish off tagging older posts to improve navigation on this site. So please excuse the RSS deluge that follows.

29 December 2006

Banking on Benkler

Trust Yochai to give us a way forward - and some hope:

In the mass media environment, there was a general culture of "I saw it in print, therefore it must be true." This culture led to a relative atrophy of critical faculties, and made the public sphere highly manipulable, or simply prone to error. It is not, for example, that well-trained media critics could not point out the dozens of ways in which any given news report or television program were biased or incomplete. They could. But the readers, viewers, and listeners by and large adopted a trusting relationship to their media. We long spoke about the need to teach critical television watching. But that never happened, really. I think as a new generation grows up reading things that never have a clear voice of authority, that have only provisional status as inputs, we will begin to see a more critical, investigative form of reading, as well as listening and viewing. The act of reading will be more like an act of investigation, as one picks up pieces of evidence with variable levels of credibility, triangulates them, and arrives at a conclusion that continues, nonetheless, to be revisable and falsifiable. This is the essence of the scientific method. It is high time that people adopt it more broadly. I embrace this uncertainty, for with it comes critical reading. This trend is then strengthened by the widespread practices of cultural production, what I have characterized as the re-emergence of a new folk culture in the digital environment. People who create know how to be more critical users.

Those who Cannot Remember the Past...

...are condemned to release it again ten years later.

I was interested to read that Sun has launched its Looking Glass interface. Not just because it's yet another 3D-ish approach, with some interesting applications coming through, but also because Sun seems to be blithely unaware of the history of the Looking Glass moniker. As I wrote in Rebel Code:

Caldera was set up in October 1994, and released betas of its first product, the Caldera Network Desktop (CND) in 1995. The final version came out in February 1996, and offered a novel graphical desktop rather like Windows 95. This "Looking Glass" desktop, as it was called, was proprietary, as were several other applications that Caldera bundled with the package.

Caldera, of course, eventually metamorphosed (hello, Kafka) into SCO....

Enter the Metaverse/Matrix/Neuronet

An eagle-eyed Mark Wallace spotted the International Association of Virtual Reality Technologies (IAVRT), a new Web site/organisation, with its intriguing - and possibly redundant - Neuronet:

IAVRT is working with its VR member peers and the global community to create and govern a new real-time virtual reality network, separate and distinct from the Internet, which will be called the Neuronet. The Neuronet will be designed from the ground up as the world's first - and only - network designed specifically for the transmission of virtual reality and next generation gaming data. The Neuronet will organize the virtual reality world and ensure its safety, reliability, and functionality.

The purpose of the Neuronet will be to facilitate cinematic and immersive virtual reality experiences across distances. These will include almost every type of experience imaginable with some of the most obvious being real-time video chat, video streaming, virtual reality travel, history, adventure, gaming, entertainment, sports, hobbies, business, education, medicine and training to name just a few.

The Neuronet will function similarly to the Internet in its ability to connect users in different locations, but instead of the user interface mechanisms associated with the Internet, it will use virtual reality (VR) technologies to facilitate cinematic and immersive virtual reality experiences for end-users.

Sick in the Genome

From the nation that brought you whaleburgers:

The breeder told Mr. Sasaki that he had bred a dog with three generations of offspring — in human terms, first with its daughter, then a granddaughter and then a great-granddaughter — until Keika was born. The other four puppies in the litter were so hideously deformed that they were killed right after birth.

(Via Boing Boing.)

Google's Open Airbag

Google-watchers of the world, arise: there's a new name to add to the list of Googly things - Airbag.

In many cases trying to determine the exact cause of the crash can be as frustrating as the crash itself. Identifying the causality for crashes is a critical aspect of fixing the crash condition and making sure it doesn't re-occur.

That's where Google's open source Airbag project comes in. According to Google, Airbag is a set of client and server components that implements a crash-reporting system.
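
The general shape of such a system is easy to picture even without digging into Airbag's code: a client-side handler catches the crash, packages up whatever context it can, and ships it to a collection server for later analysis. Here's a deliberately minimal Python sketch of that client half - the collector URL, payload fields and report_crash helper are all invented for illustration, and bear no relation to Airbag's actual API, which works at the level of native minidumps rather than language-level tracebacks.

# Hypothetical sketch of the client side of a crash-reporting system.
# The endpoint, payload fields and helper name are made up for
# illustration; this is not Airbag's real interface.
import json
import sys
import traceback
import urllib.request

COLLECTOR_URL = "https://crash-collector.example.com/report"  # made-up endpoint

def report_crash(exc_type, exc_value, exc_tb):
    """Package an unhandled exception and POST it to the collection server."""
    payload = {
        "product": "example-app",      # identifies the crashing program
        "version": "0.1",              # lets the server bucket reports by build
        "signature": repr(exc_value),  # crude stand-in for a real crash signature
        "stack": traceback.format_exception(exc_type, exc_value, exc_tb),
    }
    request = urllib.request.Request(
        COLLECTOR_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    try:
        urllib.request.urlopen(request, timeout=5)
    except OSError:
        pass  # the reporter must never take the application down with it
    sys.__excepthook__(exc_type, exc_value, exc_tb)  # still print the usual traceback

# Install the handler so any uncaught exception is reported before exit.
sys.excepthook = report_crash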

(Via Linux Today.)

Free Software's Rottweiler

I've noted before that the FSF is changing; Bruce Byfield has noticed too, and written a good summary of what the new FSF has done in 2006 - and what lies in store:

Looking ahead to 2007, [executive director of the FSF] Brown sees only more of the same activism for the FSF. Both the BadVista and Defective By Design campaigns will continue, and he suggests that other campaigns in the coming year will probably focus on hardware drivers for GNU/Linux and software patents.

"It's going to be a busy year," Brown predicts. "2006 was great, but 2007 is going to be huge."

I can't wait.

Against Anti-Anti-Copyright

There's an interesting piece in the Reg by the photojournalist Sion Touhig, entitled "How the anti-copyright lobby makes big business richer." It's well worth reading, even though I think its attacks on the "anti-copyright lobby" are misguided.

The main problem, according to the article, seems to be big business taking copyrighted material from the Web, or employing user-generated material without paying for it. Addressing both of these seems a better solution than simply hankering for a past that will never return now that the Web 2.0 genie is out of the bottle.

If photographers can't afford to sue - another problem that needs to be sorted - they can at least name and shame: a central Web site for the purpose would do very nicely. And as for the exploitation of user-generated content, the solution here is education. If people were more aware of the cc licences, and used them, then the situation would be more regulated, if not controlled.

At the end of the day, though, photo-journalists will need to adapt, and find new ways of generating money from their work - maybe quite radical ones - just as we writers have had to adapt over the last decade. Perhaps that's regrettable, but it's also the way things are.

Your (Second) Life Flashes Past You

Although not quite on the level of LWN's comprehensive review of the year, Giff Constable's Second Life timeline is a handy summary of what happened when in what was undoubtedly a crucial year for SL. And for all those getting their knickers in a twist over the issue of SL's user base (you know who you are), he makes an important observation:

There has been a lot of questions and skepticism around the numbers, and retention rates, which Linden Lab estimates around 10% - 15%, but if you just look at concurrent users, a year ago max concurrent users was around 5 thousand. Today it is around 20 thousand, a 300% growth rate, although still a fairly small pond.

I'd underline that this is concurrent users, so the total number active in a given day is some multiple of this: if I had to guess, I'd say around six to eight times. That would give an active user base of somewhere around 150,000, which is broadly consistent with a 10% retention rate on two million signups.
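
For what it's worth, the back-of-the-envelope arithmetic is easy to check; here it is in a few lines of Python (the six-to-eight-times daily multiplier is my guess, as above, not a Linden Lab figure):

# Rough check of the Second Life numbers quoted above; the daily
# multiplier is guesswork, not an official Linden Lab statistic.
concurrent_peak = 20_000   # peak concurrent users cited for late 2006
daily_multiplier = (6, 8)  # guessed ratio of daily actives to peak concurrency
signups = 2_000_000        # total signups cited at the time

low, high = (concurrent_peak * m for m in daily_multiplier)
print(f"Estimated daily actives: {low:,} to {high:,}")                    # 120,000 to 160,000
print(f"Implied retention: {low / signups:.0%} to {high / signups:.0%}")  # 6% to 8%

That lands in the same ballpark as Linden Lab's 10% - 15% retention estimate, which is about as much as a back-of-the-envelope calculation can hope for.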

Let's Go 3D

As perceptive readers of this blog may have noticed, there's been an increasing number of stories about the rise of 3D technologies in computing, particularly in terms of the interface we use. Well, here's another one - a short but well-written piece about the different strategies of Google and Microsoft in this sphere from my favourite news magazine, Der Spiegel. (Via Ogle Earth.)

28 December 2006

From ODF to UOF and Back Again

Since both the ODF and UOF office formats are based on XML, it isn't (in theory, at least) hard to move between them. Nonetheless, it's good to know that someone has actually put together code to do exactly that:

Peking University recently released a program to convert office documents between OpenDocument Format and the Specification for the Chinese office file format based on XML (UOF for short). Both standards are XML office document standards, UOF being a "National Standard of the People's Republic of China". The converter, which took nearly a year to complete, enables users to convert text, spreadsheet and presentation documents between ODF and UOF.
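
None of which is to understate the schema-mapping slog involved, but the basic mechanics - parse one XML vocabulary, walk the tree, emit the other - are straightforward. Here's a toy Python sketch of the idea; the tag names are invented stand-ins rather than real ODF or UOF elements, and the actual converter obviously has to cope with namespaces, styles, spreadsheets and presentations, not just paragraphs of text.

# Toy illustration of mapping one XML office vocabulary onto another.
# The tag names below are invented for the example; the real ODF and
# UOF schemas are far larger and namespace-qualified.
import xml.etree.ElementTree as ET

TAG_MAP = {"document": "uof-document", "para": "p", "span": "run"}  # hypothetical mapping

def convert(element):
    """Recursively rebuild the tree, renaming tags according to TAG_MAP."""
    renamed = ET.Element(TAG_MAP.get(element.tag, element.tag), element.attrib)
    renamed.text, renamed.tail = element.text, element.tail
    for child in element:
        renamed.append(convert(child))
    return renamed

source = ET.fromstring("<document><para>Hello, <span>world</span></para></document>")
print(ET.tostring(convert(source), encoding="unicode"))
# <uof-document><p>Hello, <run>world</run></p></uof-document>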

Why Wireless is Hopeless: Manufacturers are Clueless

Here's a thorough journalistic investigation into why manufacturers of wireless hardware have been less than helpful to the free software world:

Some of the non-responsiveness of manufacturers may just be bad PR work, but the same companies that wouldn't talk to me have also refused to reply to free software programmers who have requested the same information. The impression I got from most of these companies (excepting Intel) was that they were not at all prepared to deal with the issues of firmware redistribution rights and hardware API documentation requests. That they have ignored free software programmers' requests is not necessarily a sign of unwillingness to participate, but perhaps a general sense of confusion as to how they are able to help. No one seems to know whom to talk to at the company, and in some cases the proper documentation may not exist -- or it may belong to yet another company that the hardware manufacturer outsourced the firmware development to.

Open Source Software City: Foundations Laid

I've not mentioned Korea's open source software city before because details seemed rather scarce, and there are, after all, plenty of other cities using open source. Now, thanks to ZDNet Korea, we have something more, er, concrete:

Gwangju was designated as OSS City by the Korea IT Industry Promotion Agency (KIPA) to boost its economy and competitiveness through the IT industry, using open source to build improved infrastructure in the city's key industries such as opto-electronics, automobiles, mobile and semiconductors.

The project, with a total cost of $45.7 million across three phases, will run from 2006 to 2010. The first phase, begun in 2006, has completed the Information Strategy Planning stage, surveying the areas where the city can adopt open-source software as the main operating system for its infrastructure. Based on those findings, open-source solutions were applied to the Gwangju Information & Culture Industry Promotion Agency and Jeonnam Girls' Commercial High School, opening a channel to produce specialists through the education sector.

(Via LXer.)

Thanks a Trillion

The RIAA has done us all a huge service by taking on the doughty AllofMP3.

The December 21 lawsuit argues that 11 million songs were allegedly pirated, and seeks damages totaling $150,000 per violation. That's a $1.65 trillion lawsuit - a value slightly less than the Gross Domestic Product for the United Kingdom in 2005.

Put like that, you realise that the RIAA is now certifiably bonkers: 11 million songs at $150,000 apiece does indeed come to $1.65 trillion, but a tiny Russian company has supposedly inflicted almost as much financial damage as the entire annual output of one of the world's biggest economies - on an industry worth at most a few billion dollars a year? I don't think so.

27 December 2006

Virtually Not Shocking at All

Now, why is it that I am not surprised by this result of a virtual recreation of the famous Milgram experiment?

The main conclusion of our study is that humans tend to respond realistically at subjective, physiological, and behavioural levels in interaction with virtual characters notwithstanding their cognitive certainty that they are not real. The specific conclusion of this study is that within the context of the particular experimental conditions described participants became stressed as a result of giving ‘electric shocks’ to the virtual Learner. It could even be said that many showed care for the well-being of the virtual Learner – demonstrated, for example, by their delay in administering the shocks after her failure to answer towards the end of the experiment. To some extent based on previous evidence this was to be expected. In fact, it has even been taken for granted that virtual humans can substitute for real humans when studying the responses of people to a social situation. For example, this was the strategy used in the fMRI study described in [19], where participants passively observed virtual characters gazing at the participants themselves or at other virtual characters. However, no previous experiments have studied what might happen when participants have to actively engage in behaviours that would have consequences for the virtual humans. The evidence of our experiments suggests that presence is maintained and that people do tend to respond to the situation as if it were real.

And people still dismiss Second Life and its ilk as just a "game"....

(Parenthetically, great to see this published on the new PLoS ONE.)

More than Academic?

I'm always a bit sceptical about academic studies of open source, since they tend to tell you what you already knew, but five years late and dressed up in obfuscatory language. That said, there seems to be some genuine content in this specimen, entitled "Two Case Studies of Open Source Software Development: Apache and Mozilla". Worth a quick gander, at least. (Via AC/OS.)

26 December 2006

The Mainframe is Dead; Long Live the Mainframe

Interesting:

IBM today announced a mainframe milestone as more than 390 IBM business partners now offer nearly 1,000 applications for System z customers running Linux, a 100 percent increase over the last year. IBM recently reported a 30 percent year-to-year growth of mainframe customers running Linux....

This increase in Linux application development for the mainframe is being driven by a number of factors, including the overwhelming acceptance of partitioned Linux virtual servers -- and the associated great price and performance -- which is driving new workloads on System z.

(Via Enterprise Open Source Magazine.)

Open Source Goes Dutch

More good news from Europe (Via Australia, bizarrely):

The City of Amsterdam said Friday it will spend €300,000 (US$400,000) testing open source software in two administrative districts in 2007...

City spokeswoman Marjolijn van Goethem said Amsterdam's housing department and one of its borough offices - Zeeburg - would test a Linux-based operating system on city computers, and open-source document software, replacing Microsoft Windows and Office.

The nub:

"Earlier this year, a study ordered by the (Amsterdam) city council showed that an 'open' software strategy leads to more independence from suppliers," the city said in a statement. "In addition, the use of open software can lead to better exchange and storage of information, without unacceptable financial or logistical risks."

(Via tuxmachines.org.)

Distro Delight

Beautiful.

A Nobel's Noble Words on the Pharmaceutical Commons

Great piece in the BMJ excoriating greed and stupidity in the pharmaceutical industry:

It is hard to see how the patent issued by the US government for the healing properties of turmeric, which had been known for hundreds of years, stimulated research. Had the patent been enforced in India, poor people who wanted to use this compound would have had to pay royalties to the United States.

And:

In 1995 the Uruguay round trade negotiations concluded in the establishment of the World Trade Organization, which imposed US style intellectual property rights around the world. These rights were intended to reduce access to generic medicines and they succeeded. As generic medicines cost a fraction of their brand name counterparts, billions could no longer afford the drugs they needed.

History will not be kind to those behind this disgraceful state of affairs. (Via Slashdot.)

Electric London

Well, London is electric, so it makes sense for local delivery lorries to go electric too. The only questions are (a) what took so long, given that milk floats have been doing it for years (as the article above points out), and (b) why isn't everyone doing this?

Desperately Seeking Search Wikia

It seems appropriate to return to active blogging after hours spent mindlessly tagging old posts (and I hope you lot are grateful) with a mega-story that could well shape the online world next year: Jimmy Wales' planned rival to Google, built on open source technology (Nutch and Lucene), and open source methodology. There's not much more to say at this point, but I predict I (and everyone and his/her dog) will be writing more about it.

23 December 2006

Warning: Taggers at Work

As I've noted before, tagging seems to be something people visiting this site find useful. So I've decided to tag all the older posts on this blog that were written before Blogger got around to adding that facility.

Please, therefore, note that most posts that turn up on RSS feeds over the next few days are not really new: do check the date before you get too excited by what is likely to be very old news.

Squeezing the Espresso Book Machine

Printing-on-demand has long been a dream, and the Espresso Book Machine looks like it's making it a reality, albeit on a small scale. But what's interesting about this - aside from the ability to get public domain books printed on the spot - is that it depends on the existence of a knowledge commons. Pity that recent copyright laws are doing their utmost to squeeze that space. (Via Slashdot.)

Kind of Blue

One of the less well-known benefits of creating a commons is that it allows people to experiment with those resources in an unfettered way. This often means that they come up with new and exciting uses that would never have arisen had the underlying material remained enclosed.

A good example is Flickr. This is a tremendous resource, and people just keep on coming up with new ways of using it. The latest is the wonderful Flickr Color Selectr: just choose a colour, and the site will search through Flickr for cc pictures that match it. Not just useful, but highly therapeutic too, for when you're feeling kind of blue.... (Via Creative Commons.)

22 December 2006

XXX for XML on its Xth Birthday

Back in the good old Web 1.0 days, XML was really hot. Here's a useful reminder that (a) XML is 10 years old (gosh, doesn't time fly when you're having fun?) and (b) it's still hot.

Last month marked ten years since the World Wide Web Consortium (W3C) Standard Generalized Markup Language (SGML) on the Web Editorial Review Board publicly unveiled the first draft of Extensible Markup Language (XML) 1.0 at the SGML 96 conference. In November 1996, in the same hotel, Tim Bray threw the printed 27-page XML spec into the audience from the stage, from whence it fluttered lightly down; then, he said, "If that had been the SGML spec, it would have taken out the first three rows." The point was made. Although SGML remains in production to this day, as a couple of sessions reminded attendees, the markup community rapidly moved on to XML and never looked back.

Two areas stand out in this report on the conference: XQuery and Darwin Information Typing Architecture (DITA). Here's to the next X.

Red Letter Day for Red Hat

Time to throw those hats in the air, methinks:

Red Hat, Inc., the world's leading provider of open source solutions, today announced financial results for its fiscal year 2007 third quarter.

Total revenue for the quarter was $105.8 million, an increase of 45% from the year-ago quarter and 6% from the prior quarter. Subscription revenue was $88.9 million, up 48% year-over-year and 5% sequentially.

Net income for the quarter was $14.6 million or $0.07 per diluted share compared with $11.0 million or $0.05 per diluted share for the prior quarter. Non-GAAP adjusted net income for the quarter was $29.6 million, or $0.14 per diluted share, after adjusting for stock compensation and tax expense as detailed in the tables below. This compares to non-GAAP adjusted net income of $22.7 million, or $0.11 per diluted share in the third quarter of last fiscal year.

These figures are important for a number of reasons (and no, I don't own any shares - never have, never will). They show that Red Hat has been unaffected by all of Larry's Machiavellian machinations; they also indicate the rude health of open source's bellwether. That's good not just for Red Hat, but for the whole free software ecosystem too.

Open Source: Just the Ticket for Librarians

Here's a well-written story about how librarians have undertaken a major open source project with great success:

The system, Evergreen, whose 1.0 release came in November, is an Integrated Library System (ILS): the software that manages, catalogs, and tracks the circulation of library holdings. It's written in C, JavaScript and Perl, is GPLed, runs on Linux with Apache, uses a PostgreSQL database, Jabber for messaging and XUL as client-side software. The system allows easy clustering and is based entirely on open protocols.

21 December 2006

Wengo's Wideo Widget

Wengo, the people behind OpenWengo, an open source VOIP project, are offering a free video widget (to the first 10,000 applicants, at least) that consists of just a few lines of HTML code (but uses Flash). (Via Quoi9.)

Allison Does the Noble Samba

Top Samba man Jeremy Allison, whom I had the pleasure of interviewing many moons ago, has done the decent thing, and cut his ties with (ex-)employer Novell:

I have decided to leave Novell.

This has been a very difficult decision, but one I feel I have no choice but to make.

As many of you will guess, this is due to the Microsoft/Novell patent agreement, which I believe is a mistake and will be damaging to Novell's success in the future. But my main issue with this deal is I believe that even if it does not violate the letter of the licence it violates the intent of the GPL licence the Samba code is released under, which is to treat all recipients of the code equally.

Sad day for Novell. Luckily, Jeremy will soon be snapped up elsewhere. Bravo for taking a stand.

Update: And the lucky winner is...Google - again.

Oh! to Be in Turkmenistan...

...now that Turkmenbashi's not there.

This is an area of the world that has always fascinated me; in the wake of Niyazov's unexpected death, it's about to get even more interesting....

Heading Towards 3D

Once this kind of thing becomes commonplace, there's no stopping the 3D wave. (Via TechCrunch.)

On the Statute Book

Great that we've finally been granted free beer access to our laws; pity that it's not free as in freedom. And, of course, positively treasonable that we don't have access to the original Anglo-Norman texts. (Via Open Knowledge Foundation.)

The Sergeant's (Digital) Song

Well, here's a right rollicum-rorum:

Yesterday, UK telecom regulator Ofcom issued a Consultation paper on future uses of the "Digital Dividend" - the frequencies to be released when TV broadcasters migrate from analog to digital transmission.

At the same time, they released a related set of "preparatory reports" by several teams of consultants.

There is a significant difference of opinion between Ofcom and the consultants on the question of whether to reserve "Digital Dividend" frequencies for license exempt applications.

This difference leads Ofcom to encourage the public to use the just-launched consultation to provide better arguments and new proposals for worthwhile license exempt applications in the UHF band.

Ignoring highly-paid consultants? Whatever next?

Then Little Boney he’ll pounce down,
And march his men on London town!

(Via openspectrum.info.)

Scanning the Big Delta

"Delta Scan" sounds like one of those appalling airport potboilers involving mad scientists, terrorists and implausibly durable secret agents, but it's actually something much more exciting: an attempt to peek into the future of our science and technology. A hopeless task, clearly, but worth attempting if only as a five-neuron exercise.

The results are remarkably rich; considerable credit must go to the UK's Office of Science and Innovation for commissioning the report and - particularly - making it freely available. I was glad to see that there are plenty of links in the documents, which are short and to the point. Great for, er, scanning.

Open Peer Review: Not in Their Nature

One door opens, another door closes: Nature has decided to bin its open peer review experiment:

Despite the significant interest in the trial, only a small proportion of authors opted to participate. There was a significant level of expressed interest in open peer review among those authors who opted to post their manuscripts openly and who responded after the event, in contrast to the views of the editors. A small majority of those authors who did participate received comments, but typically very few, despite significant web traffic. Most comments were not technically substantive. Feedback suggests that there is a marked reluctance among researchers to offer open comments.

Nature and its publishers will continue to explore participative uses of the web. But for now at least, we will not implement open peer review.

I suspect that Nature was the worst possible place to try this experiment. Nature is simply the top spot for scientific publishing: getting a paper published there can make somebody's career. So the last thing most people want is anything that might increase the risk of rejection. Public discussion of submitted papers certainly falls into that category, for both the commenter and the commented-upon (think scientific mafia).

In a way, this is what makes PLoS ONE so important: it's a tabula rasa for this kind of thing, and can redefine what scientific publishing is about. Nature and its contributors are hardly likely to want to do the same. Kudos to the title for trying, but I bet they're relieved it flopped. (Via Techdirt.)

Open Sourcing Second Life

Here's a subject close to my heart: opening up Second Life. And this is what the alpha geek behind it, Cory Ondrejka, had to say on the subject yesterday:


As we’ve talked about, the long term goals for Second Life are to make it a more open platform. Part of that process is learning how projects like libSL can be beneficial to all of Second Life. We should be thrilled that we’ve built an interesting enough set of technologies and communities that people want to tinker and explore. In the long run, this is why we’ve talked about wanting to be able to Open Source eventually. My hope is that in 2007 we’ll be able to get there.

Also of note:

HTML and Firefox . . . ah my two favorite topics of all time. We have an external contractor who has tons of experience working on it right now. Basically we’ve been trying to make sure that we can get Flash working correctly because so many of the interesting parts of the Web are moving to Flash-based players/plugins/etc. Getting the control inputs and updates to work correctly is a bear but they do seem to be making progress, which is very exciting. The order of operations will be to roll a full internal browser first, then supplement the parcel media types with URLs, and then move to full HTML on a prim. Note that HTML on a prim has several pieces, from being able to interpret straight HTML in order to build text, do layout, etc, all the way to having a face of a prim point at a web page. In terms of timeline, the next major Firefox roll out will be in Q1 – ie, more functionality in the existing pages that use it plus a floater that is a browser – followed by the parcel URL in Q2. HTML on a prim will be part of a larger rearchitecture of textures – we need to go to materials per face rather than texture per face – which several of the devs are itching to work on, but will realistically not start until Q2.

Firefox in Second Life: perfect.

PLoS ONE: Plus One for Science

PLoS ONE, the new way of publishing scientific papers, has gone live. As well as fascinating papers on the Syntax and Meaning of Wild Gibbon Songs, to say nothing of populist numbers like Regulated Polyploidy in Halophilic Archaea, you can also find a sandbox for playing around with the new features of this site. It's obviously premature to say whether this experiment in Web 2.0 science publishing will work, but it certainly deserves to.

20 December 2006

The Real Story Behind Red (and White) Hat's Name

Computerworld has a short piece with some background on Red Hat's unusual name. Bob Young is quoted as saying:


"When [Red Hat co-founder] Marc [Ewing] was at university he used to name his software projects red hat -- red hat one, red hat two -- just to differentiate them from his friends. So, of course, when he started his Linux project he just named it Red Hat Linux for lack of a better term," said Young, who left Red Hat in 2005 to focus his energies on another company he founded, online independent publishing marketplace Lulu.com.

Well, nearly right, Bob. This is the real story, as told to me by Marc Ewing himself some years back:

In college I used to wear my grandfather's lacrosse hat, which was red and white striped. It was my favorite hat, and I lost it somewhere in Philadelphia in my last year. I named the company to memorialize the hat. Of course, Red and White Hat Software wasn't very catchy, so I took a little liberty.

So there you have it: the company should really be called Red and White Hat.

The New Richard Stallman

It's hard to tell whether it's because RMS has changed, or whether the world has changed, but something is new here: RMS is starting to engage with politicians at the highest levels.

For example, a few months back I wrote about his meeting with Ségolène Royal - who, as I and many others predicted, is now officially a candidate for the French Presidential elections.

Now here's RMS hobnobbing with Rafael Correa, the new president of Ecuador. I had to laugh at this description:


Stallman spoke for almost 20 minutes, without Correa saying anything, just listening.

So some things remain the same.

In any case, the result seems good:

When Stallman finished speaking, there were a few questions, and a short conversation in which the two of them were in agreement on everything, and Correa asked his advisers if Ecuador should migrate to free software. They said yes, and everyone, including Stallman, left the meeting with a broad smile on their faces.

(Via Linux and Open Source Blog.)

LWN's 2006 Linux and free software timeline

A lot has happened in the last year in the world of free software. That makes it hard (a) to remember who exactly did what and (b) to get the big picture. One invaluable tool for doing both is LWN's 2006 Linux and free software timeline, which offers all the main events with handy links to the original stories. They've also got other timelines going back to 1998, if you want to see an even bigger picture. Great stuff for a trip down free memory lane.

Update 1: And here's C|net's list of top stories in the same field.

Update 2: Meanwhile, here's Matthew Aslett's open source year in quotations.

19 December 2006

Behold: Ajax3D the Great

Something that seems to have everything going for it: Ajax3D. Yup: Ajax meets 3D - or X3D, to be more precise. Here's what a rather useful white paper on the subject by Tony Parisi, one of the pioneers of the by-now antediluvian VRML standard, has to say:

Ajax3D combines the power of X3D, the standard for real-time 3D on the web, with the ease of use and ubiquity of Ajax. Ajax3D employs the X3D Scene Access Interface (SAI)—the X3D equivalent of the DOM— to control 3D worlds via Javascript. With the simple addition of an X3D plugin to today’s web browsers, we can bring the awesome power of video game technology to the everyday web experience.

The initial development has begun. Media Machines has created the first showcase applications and tutorials, and has launched a web site, www.ajax3d.org, as an open industry forum to explore technologies, techniques and best practices. This white paper describes the key technical concepts behind Ajax3D and, via examples, introduces the beginning of a formal programming framework for general use.

(Via Enterprise Open Source Magazine.)

ID'ing Reality

The truth begins to sink in:

The government has abandoned plans for a giant new computer system to run the national identity cards scheme.

Instead of a single multi-billion pound system, information will be held on three existing, separate databases.

Well, that's a start. Just as hopeful is the statement:

Home Secretary John Reid denied this was a "U-turn" saying it would save cash, boost efficiency and cut fraud.

So, presumably cancelling the whole thing would also not be a "U-turn", since it too "would save cash, boost efficiency and cut fraud"....

dmoz RIP?

DMOZ - now called the Open Directory Project - just doesn't have the respect it deserves. That's partly because it's had more names than even Firefox/Firebird/Phoenix.

It started out as GnuHoo, but RMS took exception to that, and it became NewHoo - which Yahoo promptly took exception to. It managed to avoid the horrible ZURL (shouldn't that be Zurg?), before metamorphosing into the Open Directory Project, also known as dmoz (from directory.mozilla.org) to its friends.

But its real importance is not as an open Yahoo: it was the direct inspiration for Nupedia - NewHoo, Nupedia - geddit? - which in turn gave rise to the complementary Wikipedia: need I say more?

So it's sad to hear that dmoz is fizzling. It may not serve much purpose at present, but it's had a glorious past. (Via John Battelle's Searchblog.)

Google Gets Earthier

Google has acquired the mapping company Endoxon:

Endoxon is a developer of internet mapping solutions, mobile services, data processing, cartography, direct marketing and the Trinity software suite. Since 1988, Endoxon and its 75 employees have created ground-breaking solutions for a wide variety of geographic needs. Endoxon is a pioneer in AJAX mapping technologies. Endoxon technologies enable the integration and processing of geo-referenced data and high-resolution aerial and satellite images for dynamic internet and mobile services.

What's interesting about this is that it shows Google pushing forward in the field of mapping, cartography and 3D interfaces - an area that is emerging as increasingly important. (Via Ogle Earth.)

Digital Library of the Commons

OnTheCommons has an interesting post about a new book called Understanding Knowledge as a Commons. This sounds great, but sadly - and ironically - it seems not to be open access (though I'd bet that Peter Suber's contribution to the collection is available somewhere); see Peter Suber's comment below for details on open access to its contents. However, the article did mention something I'd not come across before: the Digital Library of the Commons.

This turns out to be a wonderful resource:

a gateway to the international literature on the commons. This site contains an author-submission portal; an archive of full-text articles, papers, and dissertations; the Comprehensive Bibliography of the Commons; a Keyword Thesaurus, and links to relevant reference sources on the study of the commons.

Among the list of commons areas, there is Information and Knowledge Commons:

anticommons, copyright, indigenous, local, scientific knowledge issues, intellectual property rights, the Internet, libraries, patents, virtual commons, etc.

Strange that free software is not included. But good, nonetheless.

18 December 2006

British Judges - Gawd Bless 'Em

Here's an interesting little tale of two nations sharing a common tongue but divided by patent culture:

In the US the courts found that Smith had infringed the patents in its use of similar designing software and ordered that it remove certain functions from its software.

The English court took a rigorous approach to analysing the patents and found that it did not adequately describe the system it sought to patent. In order to be valid a patent must describe a process so completely that a person who knows that subject area must be able to replicate it using only the contents of the patent.

The High Court found that Halliburton's patent did not do that, and the Court of Appeal has now agreed. Justice Jacob ruled that the patent was missing vital details, contained wrong equations, demanded a higher level of expertise than allowed and that it relied on material external to the patent, and therefore was not a valid patent.

Shot, sir!

A pity they didn't just chuck it on the basis you can't patent software, but at least the bewigged gents "took a rigorous approach."

It's a Small, Small, Small, Small World

And if anyone's wondering why I keep posting stuff about Chinese currencies - virtual or real - try this for a little hint about the interconnectedness of things (which is what this blog is all about), and the deep nature of a commons:


At least one-third of California's fine particulate pollution -- known as aerosol -- has floated across from Asia, says Steve Cliff, an atmospheric scientist at the University of California at Davis. "In May this year, almost all the fine aerosol present at Lake Tahoe [300 km east of San Francisco] came from China," says Tom Cahill, a UC Davis emeritus professor of atmospheric sciences. "So the haze that you see in spring at Crater Lake [Oregon] or other remote areas is in fact Chinese in origin."

...

The irony of finding Chinese mercury in American rivers, of course, is that much of it was emitted to produce goods being consumed in the United States. There's been a growing awareness that importing commodities from the rest of the world displaces pollution from the U.S. onto other countries; this story brings it full circle and demonstrates yet again that in this fishbowl called Earth, pollution can't be displaced "elsewhere" for long.

The World's Economic Centre of Gravity...

...just started to move to the right:


The euro held steady after the Iranian government said it had ordered the central bank to transform the state's dollar-denominated assets held abroad into euros and use the European currency for foreign transactions.

But I've no illusions that this is anything but the start of a shift even further eastward...

Open Public Data: Halfway There

Well, now, here's some progress:

The OFT's market study into the commercial use of public information has found that more competition in public sector information could benefit the UK economy by around £1 billion a year.

Examples of public sector information include weather observations collected by the Met Office, records held by The National Archives used by the public to trace their family history, and mapping data collated by Ordnance Survey. The underlying raw information is vital for businesses wanting to make value-added products and services such as in-car satellite navigation systems.

Public sector information holders (PSIHs) are usually the only source for much of this raw data, and although some make this available to businesses for free, others charge. A number of PSIHs also compete with businesses in turning the raw information into value-added products and services. This means PSIHs may have reason to restrict access to information provided solely by themselves.

The study found that raw information is not as easily available as it should be, licensing arrangements are restrictive, prices are not always linked to costs and PSIHs may be charging higher prices to competing businesses and giving them less attractive terms than their own value-added operations.

It's good news that the Office of Fair Trading has grasped that the public sector trough-scoffers cost taxpayers serious money through their greed; however, I fear it will take a while yet before it realises that making the information freely available - not just for commercial use - would generate far more dosh still. (Via Open Knowledge Foundation.)