Over the New Year period I'll finish off tagging older posts to improve navigation on this site. So please excuse the RSS deluge that follows.
30 December 2006
29 December 2006
Trust Yochai to give us a way forward - and some hope:
In the mass media environment, there was a general culture of "I saw it in print, therefore it must be true." This culture led to a relative atrophy of critical faculties, and made the public sphere highly manipulable, or simply prone to error. It is not, for example, that well-trained media critics could not point out the dozens of ways in which any given news report or television program were biased or incomplete. They could. But the readers, viewers, and listeners by and large adopted a trusting relationship to their media. We long spoke about the need to teach critical television watching. But that never happened, really. I think as a new generation grows up reading things that never have a clear voice of authority, that have only provisional status as inputs, we will begin to see a more critical, investigative form of reading, as well as listening and viewing. The act of reading will be more like an act of investigation, as one picks up pieces of evidence with variable levels of credibility, triangulates them, and arrives at a conclusion that continues, nonetheless, to be revisable and falsifiable. This is the essence of the scientific method. It is high time that people adopt it more broadly. I embrace this uncertainty, for with it comes critical reading. This trend is then strengthened by the widespread practices of cultural production, what I have characterized as the re-emergence of a new folk culture in the digital environment. People who create know how to be more critical users.
...are condemned to release it again ten years later.
I was interested to read that Sun has launched its Looking Glass interface. Not just because it's yet another 3D-ish approach, with some interesting applications coming through. But also because Sun seems to be blithely unaware of the history of the Looking Glass moniker. As I wrote in Rebel Code:
Caldera was set up in October 1994, and released betas of its first product, the Caldera Network Desktop (CND), in 1995. The final version came out in February 1996, and offered a novel graphical desktop rather like Windows 95. This "Looking Glass" desktop, as it was called, was proprietary, as were several other applications that Caldera bundled with the package.
Caldera, of course, eventually metamorphosed (hello, Kafka) into SCO....
An eagle-eyed Mark Wallace spotted the International Association of Virtual Reality Technologies (IAVRT), a new Web site/organisation, with its intriguing - and possibly redundant - Neuronet:
IAVRT is working with its VR member peers and the global community to create and govern a new real-time virtual reality network, separate and distinct from the Internet, which will be called the Neuronet. The Neuronet will be designed from the ground up as the world's first - and only - network designed specifically for the transmission of virtual reality and next generation gaming data. The Neuronet will organize the virtual reality world and ensure its safety, reliability, and functionality.
The purpose of the Neuronet will be to facilitate cinematic and immersive virtual reality experiences across distances. These will include almost every type of experience imaginable, with some of the most obvious being real-time video chat, video streaming, virtual reality travel, history, adventure, gaming, entertainment, sports, hobbies, business, education, medicine and training, to name just a few.
The Neuronet will function similarly to the Internet in its ability to connect users in different locations, but instead of the user interface mechanisms associated with the Internet, it will use virtual reality (VR) technologies to facilitate cinematic and immersive virtual reality experiences for end-users.
From the nation that brought you whaleburgers:
The breeder told Mr. Sasaki that he had bred a dog with three generations of offspring — in human terms, first with its daughter, then a granddaughter and then a great-granddaughter — until Keika was born. The other four puppies in the litter were so hideously deformed that they were killed right after birth.
(Via Boing Boing.)
Google-watchers of the world, arise: a new name to add to the list of Googly things: Airbag.
In many cases trying to determine the exact cause of the crash can be as frustrating as the crash itself. Identifying the causality for crashes is a critical aspect of fixing the crash condition and making sure it doesn't re-occur.
That's where Google's open source Airbag project comes in. According to Google, Airbag is a set of client and server components that implements a crash-reporting system.
(Via Linux Today.)
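Conceptually, such a system has two halves: a client that captures a structured report at the moment of failure, and a server that collects and aggregates those reports. The sketch below is not Airbag's actual API (Airbag works with native minidumps, in C++); it is just an illustration of the client-side idea in Python, with a hypothetical `collect_report()` standing in for the upload to a crash server:

```python
import sys
import traceback

# In-memory stand-in for a crash server's report store; a real system
# would upload each report over the network instead.
REPORTS = []

def collect_report(report):
    """Hypothetical upload step: here it just stores the report locally."""
    REPORTS.append(report)

def crash_handler(exc_type, exc_value, exc_tb):
    """Turn an unhandled exception into a structured crash report."""
    report = {
        "type": exc_type.__name__,
        "message": str(exc_value),
        "stack": traceback.format_exception(exc_type, exc_value, exc_tb),
    }
    collect_report(report)

# Install the handler so any uncaught exception produces a report
# instead of vanishing silently with the process.
sys.excepthook = crash_handler
```

The real value comes on the server side, where thousands of such reports can be grouped by stack signature to show which crashes happen most often.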
I've noted before that the FSF is changing; Bruce Byfield has noticed too, and written a good summary of what the new FSF has done in 2006 - and what lies in store:
Looking ahead to 2007, [executive director of the FSF] Brown sees only more of the same activism for the FSF. Both the BadVista and Defective By Design campaigns will continue, and he suggests that other campaigns in the coming year will probably focus on hardware drivers for GNU/Linux and software patents.
"It's going to be a busy year," Brown predicts. "2006 was great, but 2007 is going to be huge."
I can't wait.
There's an interesting piece in the Reg by the photojournalist Sion Touhig, entitled "How the anti-copyright lobby makes big business richer." It's well worth reading, even though I think its attacks on the "anti-copyright lobby" are misguided.
The main problem, according to the article, seems to be big business taking copyrighted material from the Web, or employing user-generated material without paying for it. Addressing both of these seems a better solution than simply hankering for a past that will never return now that the Web 2.0 genie is out of the bottle.
If photographers can't afford to sue - another problem that needs to be sorted - they can at least name and shame: a central Web site for the purpose would do very nicely. And as for the exploitation of user-generated content, the solution here is education. If people were more aware of the cc licences, and used them, then the situation would be more regulated, if not controlled.
At the end of the day, though, photo-journalists will need to adapt, and find new ways of generating money from their work - maybe quite radical ones - just as we writers have had to adapt over the last decade. Perhaps that's regrettable, but it's also the way things are.
Although not quite on the level of LWN's comprehensive review of the year, Giff Constable's Second Life timeline is a handy summary of what happened when in an undoubtedly crucial year for SL. And for all those getting their knickers in a twist over the issue of SL's user base (you know who you are), he makes an important observation:
There has been a lot of questions and skepticism around the numbers, and retention rates, which Linden Lab estimates around 10% - 15%, but if you just look at concurrent users, a year ago max concurrent users was around 5 thousand. Today it is around 20 thousand, a 300% growth rate, although still a fairly small pond.
I'd underline that this is concurrent users, so the total number in a given day is some multiple of this: if I had to guess, I'd say around six to eight times. This would give an active user base of around 150,000, which is consistent with a 10% retention rate and two million signups.
As perceptive readers of this blog may have noticed, there's been an increasing number of stories about the rise of 3D technologies in computing, particularly in terms of the interface we use. Well, here's another one - a short but well-written piece about the different strategies of Google and Microsoft in this sphere from my favourite news magazine, Der Spiegel. (Via Ogle Earth.)
28 December 2006
Since both of the ODF and UOF office formats are based on XML, it isn't (theoretically) hard to move between them. Nonetheless, it's good to know that someone has actually put together code to do exactly that:
Peking University recently released a program to convert office documents between OpenDocument Format and the Specification for the Chinese office file format based on XML (UOF for short). Both standards are XML office document standards, UOF being a "National Standard of the People's Republic of China". The converter, which took nearly a year to complete, enables users to convert text, spreadsheet and presentation documents between ODF and UOF.
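The mechanics are easy to picture: because both formats are XML, a converter is essentially a structural mapping from one vocabulary onto the other. As a purely illustrative sketch (the element names below are invented, nothing like the real, namespaced ODF or UOF vocabularies), a tag-renaming pass in Python might look like this:

```python
import xml.etree.ElementTree as ET

# Hypothetical mapping from source-format tags to target-format tags.
# Real ODF and UOF conversion also has to remap attributes, styles
# and namespaces, but the principle is the same.
TAG_MAP = {
    "doc": "document",
    "para": "p",
    "bold": "strong",
}

def convert(tree):
    """Rename every mapped element in place, leaving text content intact."""
    for elem in tree.iter():
        if elem.tag in TAG_MAP:
            elem.tag = TAG_MAP[elem.tag]
    return tree

source = ET.ElementTree(ET.fromstring(
    "<doc><para>Hello <bold>world</bold></para></doc>"
))
converted = convert(source)
print(ET.tostring(converted.getroot(), encoding="unicode"))
# → <document><p>Hello <strong>world</strong></p></document>
```

The year of effort, presumably, went into the parts this sketch waves away: the two standards carve up styles, tables and page layout quite differently, and those mismatches are where fidelity is won or lost.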
Here's a thorough journalistic investigation into why manufacturers of wireless hardware have been less than helpful to the free software world:
Some of the non-responsiveness of manufacturers may just be bad PR work, but the same companies that wouldn't talk to me have also refused to reply to free software programmers who have requested the same information. The impression I got from most of these companies (excepting Intel) was that they were not at all prepared to deal with the issues of firmware redistribution rights and hardware API documentation requests. That they have ignored free software programmers' requests is not necessarily a sign of unwillingness to participate, but perhaps a general sense of confusion as to how they are able to help. No one seems to know whom to talk to at the company, and in some cases the proper documentation may not exist -- or it may belong to yet another company that the hardware manufacturer outsourced the firmware development to.
I've not mentioned Korea's open source software city before because details seemed rather scarce, and there are, after all, plenty of other cities using open source. Now, thanks to ZDNet Korea, we have something more, er, concrete:
Gwangju was designated as OSS City by Korea IT Industry Promotion Agency (KIPA) to bring up its economy and competitiveness through IT industry using open source to construct improved infrastructures in city's key industries like opto-electronics, automobile, mobile, and semiconductor.
The project with total cost of $45.7 million in three phases will run from 2006 to 2010. The first phase began in 2006 has completed Information Strategy Planning, surveying applicable open-source areas for the city to install open-source software as a main operating system of their infrastructures. Basing from the findings the open-source solutions were applied to Gwangju Information & Culture Industry Promotion Agency and Jeonnam girl's commercial high school giving channel to produce specialists through education sector.
The RIAA has done a huge service by taking on the doughty AllofMP3 service.
The December 21 lawsuit argues that 11 million songs were allegedly pirated, and seeks damages totaling $150,000 per violation. That's a $1.65 trillion lawsuit - a value slightly less than the Gross Domestic Product for the United Kingdom in 2005.
Put like that, you realise that the RIAA is now certifiably bonkers: a tiny Russian company has caused an industry worth at most a few billion dollars damage almost equal to the entire annual output of one of the world's biggest economies? I don't think so.
27 December 2006
Now, why is it that I am not surprised by this result of a virtual recreation of the famous Milgram experiment?
The main conclusion of our study is that humans tend to respond realistically at subjective, physiological, and behavioural levels in interaction with virtual characters notwithstanding their cognitive certainty that they are not real. The specific conclusion of this study is that within the context of the particular experimental conditions described participants became stressed as a result of giving ‘electric shocks’ to the virtual Learner. It could even be said that many showed care for the well-being of the virtual Learner – demonstrated, for example, by their delay in administering the shocks after her failure to answer towards the end of the experiment. To some extent based on previous evidence this was to be expected. In fact, it has even been taken for granted that virtual humans can substitute for real humans when studying the responses of people to a social situation. For example, this was the strategy used in the fMRI study described in , where participants passively observed virtual characters gazing at the participants themselves or at other virtual characters. However, no previous experiments have studied what might happen when participants have to actively engage in behaviours that would have consequences for the virtual humans. The evidence of our experiments suggests that presence is maintained and that people do tend to respond to the situation as if it were real.
And people still dismiss Second Life and its ilk as just a "game"....
(Parenthetically, great to see this published on the new PLoS ONE.)
I'm always a bit sceptical about academic studies of open source, since they tend to tell you what you already knew, but five years late and dressed up in obfuscatory language. That said, there seems to be some genuine content in this specimen, entitled "Two Case Studies of Open Source Software Development: Apache and Mozilla". Worth a quick gander, at least. (Via AC/OS.)
26 December 2006
IBM today announced a mainframe milestone as more than 390 IBM business partners now offer nearly 1,000 applications for System z customers running Linux, a 100 percent increase over the last year. IBM recently reported a 30 percent year-to-year growth of mainframe customers running Linux....
This increase in Linux application development for the mainframe is being driven by a number of factors, including the overwhelming acceptance of partitioned Linux virtual servers -- and the associated great price and performance -- which is driving new workloads on System z.
(Via Enterprise Open Source Magazine.)
More good news from Europe (Via Australia, bizarrely):
The City of Amsterdam said Friday it will spend euro300,000 (US$400,000) testing open source software in two administrative districts in 2007...
City spokeswoman Marjolijn van Goethem said Amsterdam's housing department and one of its borough offices, Zeeburg, would test a Linux-based operating system on city computers, and open-source document software, replacing Microsoft Windows and Office.
"Earlier this year, a study ordered by the (Amsterdam) city council showed that an 'open' software strategy leads to more independence from suppliers," the city said in a statement. "In addition, the use of open software can lead to better exchange and storage of information, without unacceptable financial or logistical risks."
Great piece in the BMJ excoriating greed and stupidity in the pharmaceutical industry:
It is hard to see how the patent issued by the US government for the healing properties of turmeric, which had been known for hundreds of years, stimulated research. Had the patent been enforced in India, poor people who wanted to use this compound would have had to pay royalties to the United States.
In 1995 the Uruguay round trade negotiations concluded in the establishment of the World Trade Organization, which imposed US style intellectual property rights around the world. These rights were intended to reduce access to generic medicines and they succeeded. As generic medicines cost a fraction of their brand name counterparts, billions could no longer afford the drugs they needed.
History will not be kind to those behind this disgraceful state of affairs. (Via Slashdot.)
Well, London is electric, so it makes sense for local delivery lorries to go electric too. The only questions are (a) what took so long given that milk floats have been doing it for years (as the article above points out)? and (b) why isn't everyone doing this?
It seems appropriate to return to active blogging after hours spent mindlessly tagging old posts (and I hope you lot are grateful) with a mega-story that could well shape the online world next year: Jimmy Wales' planned rival to Google, built on open source technology (Nutch and Lucene), and open source methodology. There's not much more to say at this point, but I predict I (and everyone and his/her dog) will be writing more about it.
23 December 2006
As I've noted before, tagging seems to be something people visiting this site find useful. So I've decided to tag all the older posts on this blog that were written before Blogger got around to adding that facility.
Please, therefore, note that most posts that turn up on RSS feeds over the next few days are not really new: do check the date before you get too excited by what is likely to be very old news.
Printing-on-demand has long been a dream, and the Espresso Book Machine looks like it's making it a reality, albeit on a small scale. But what's interesting about this - aside from the ability to get public domain books printed on the spot - is that it depends on the existence of a knowledge commons. Pity that recent copyright laws are doing their utmost to squeeze that space. (Via Slashdot.)
One of the less well-known benefits of creating a commons is that it allows people to experiment with those resources in an unfettered way. This often means that they come up with new and exciting uses that would never have arisen had the underlying material remained enclosed.
A good example is Flickr. This is a tremendous resource, and people just keep on coming up with new ways of using it. The latest is the wonderful Flickr Color Selectr: just chose a colour, and the site will search through Flickr for cc pictures that match it. Not just useful, but highly therapeutic too, for when you're feeling kind of blue.... (Via Creative Commons.)
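The underlying trick is simple: treat each colour as a point in RGB space and rank images by how close their dominant colour sits to the one you picked. As a minimal sketch (the photo catalogue below is invented; a real implementation would extract dominant colours from Flickr images via its API):

```python
# Hypothetical catalogue mapping photo names to a dominant RGB colour.
PHOTOS = {
    "harbour_at_dusk": (30, 60, 140),
    "poppy_field": (200, 40, 35),
    "forest_walk": (40, 120, 50),
}

def distance_sq(a, b):
    """Squared Euclidean distance between two RGB triples."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def closest_photo(target):
    """Return the photo whose dominant colour best matches target."""
    return min(PHOTOS, key=lambda name: distance_sq(PHOTOS[name], target))

print(closest_photo((0, 0, 255)))  # feeling blue → harbour_at_dusk
```

Plain RGB distance is a crude proxy for perceived similarity (perceptually uniform spaces like CIELAB do better), but it is enough to make the idea work.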
22 December 2006
Back in the good old Web 1.0 days, XML was really hot. Here's a useful reminder that (a) XML is 10 years old (gosh, doesn't time fly when you're having fun?) and (b) it's still hot.
Last month marked ten years since the World Wide Web Consortium (W3C) Standard Generalized Markup Language (SGML) on the Web Editorial Review Board publicly unveiled the first draft of Extensible Markup Language (XML) 1.0 at the SGML 96 conference. In November 1996, in the same hotel, Tim Bray threw the printed 27-page XML spec into the audience from the stage, from whence it fluttered lightly down; then, he said, "If that had been the SGML spec, it would have taken out the first three rows." The point was made. Although SGML remains in production to this day, as a couple of sessions reminded attendees, the markup community rapidly moved on to XML and never looked back.
Two areas stand out in this report on the conference: XQuery and Darwin Information Typing Architecture (DITA). Here's to the next X.
Time to throw those hats in the air, methinks:
Red Hat, Inc., the world's leading provider of open source solutions, today announced financial results for its fiscal year 2007 third quarter.
Total revenue for the quarter was $105.8 million, an increase of 45% from the year-ago quarter and 6% from the prior quarter. Subscription revenue was $88.9 million, up 48% year-over-year and 5% sequentially.
Net income for the quarter was $14.6 million or $0.07 per diluted share compared with $11.0 million or $0.05 per diluted share for the prior quarter. Non-GAAP adjusted net income for the quarter was $29.6 million, or $0.14 per diluted share, after adjusting for stock compensation and tax expense as detailed in the tables below. This compares to non-GAAP adjusted net income of $22.7 million, or $0.11 per diluted share in the third quarter of last fiscal year.
These figures are important for a number of reasons (and no, I don't own any shares - never have, never will). They show that Red Hat has been unaffected by all of Larry's Machiavellian machinations; they also indicate the rude health of open source's bellwether. That's good not just for Red Hat, but for the whole free software ecosystem too.
Here's a well-written story about how librarians have undertaken a major open source project with great success:
21 December 2006
Wengo, the people behind OpenWengo, an open source VOIP project, are offering a free video widget (to the first 10,000 applicants, at least) that consists of just a few lines of HTML code (but uses Flash). (Via Quoi9.)
Top Samba man Jeremy Allison, whom I had the pleasure of interviewing many moons ago, has done the decent thing, and cut his ties with (ex-)employer Novell:
I have decided to leave Novell.
This has been a very difficult decision, but one I feel I have no choice but to make.
As many of you will guess, this is due to the Microsoft/Novell patent agreement, which I believe is a mistake and will be damaging to Novell's success in the future. But my main issue with this deal is I believe that even if it does not violate the letter of the licence it violates the intent of the GPL licence the Samba code is released under, which is to treat all recipients of the code equally.
Sad day for Novell. Luckily, Jeremy will soon be snapped up elsewhere. Bravo for taking a stand.
Update: And the lucky winner is...Google - again.
Great that we've finally been granted free beer access to our laws; pity that it's not free as in freedom. And, of course, positively treasonable that we don't have access to the original Anglo-Norman texts. (Via Open Knowledge Foundation.)
Well, here's a right rollicum-rorum:
Yesterday, UK telecom regulator Ofcom issued a Consultation paper on future uses of the "Digital Dividend" - the frequencies to be released when TV broadcasters migrate from analog to digital transmission.
At the same time, they released a related set of "preparatory reports" by several teams of consultants.
There is a significant difference of opinion between Ofcom and the consultants on the question of whether to reserve "Digital Dividend" frequencies for license exempt applications.
This difference leads Ofcom to encourage the public to use the just-launched consultation to provide better arguments and new proposals for worthwhile license exempt applications in the UHF band.
Ignoring highly-paid consultants? Whatever next?
Then Little Boney he’ll pounce down,
And march his men on London town!
"Delta Scan" sounds like one of those appalling airport potboilers involving mad scientists, terrorists and implausibly durable secret agents, but it's actually something much more exciting: an attempt to peek into the future of our science and technology. A hopeless task, clearly, but worth attempting if only as a five-neuron exercise.
The results are remarkably rich; considerable credit must go to the UK's Office of Science and Innovation for commissioning the report and - particularly - making it freely available. I was glad to see that there are plenty of links in the documents, which are short and to the point. Great for, er, scanning.
One door opens, another door closes: Nature has decided to bin its open peer review experiment:
Despite the significant interest in the trial, only a small proportion of authors opted to participate. There was a significant level of expressed interest in open peer review among those authors who opted to post their manuscripts openly and who responded after the event, in contrast to the views of the editors. A small majority of those authors who did participate received comments, but typically very few, despite significant web traffic. Most comments were not technically substantive. Feedback suggests that there is a marked reluctance among researchers to offer open comments.
Nature and its publishers will continue to explore participative uses of the web. But for now at least, we will not implement open peer review.
I suspect that Nature was probably the worst possible place to try this experiment. Nature is simply the top spot for scientific publishing: getting a paper published there can make somebody's career. So the last thing most people want is anything that might increase the risk of rejection. Public discussion of submitted papers certainly falls into that category, both for the commenter and the commented-upon (think scientific mafia).
In a way, this is what makes PLoS ONE so important: it's a tabula rasa for this kind of thing, and can redefine what scientific publishing is about. Nature and its contributors are hardly likely to want to do the same. Kudos to the title for trying, but I bet they're relieved it flopped. (Via Techdirt.)
Here's a subject close to my heart: opening up Second Life. And this is what the alpha geek behind it, Cory Ondrejka, had to say on the subject yesterday:
As we’ve talked about, the long term goals for Second Life are to make it a more open platform. Part of that process is learning how projects like libSL can be beneficial to all of Second Life. We should be thrilled that we’ve built an interesting enough set of technologies and communities that people want to tinker and explore. In the long run, this is why we’ve talked about wanting to be able to Open Source eventually. My hope is that in 2007 we’ll be able to get there.
Also of note:
HTML and Firefox . . . ah my two favorite topics of all time. We have an external contractor who has tons of experience working on it right now. Basically we’ve been trying to make sure that we can get Flash working correctly because so many of the interesting parts of the Web are moving to Flash-based players/plugins/etc. Getting the control inputs and updates to work correctly is a bear but they do seem to be making progress, which is very exciting. The order of operations will be to roll a full internal browser first, then supplement the parcel media types with URLs, and then move to full HTML on a prim. Note that HTML on a prim has several pieces, from being able to interpret straight HTML in order to build text, do layout, etc, all the way to having a face of a prim point at a web page. In terms of timeline, the next major Firefox roll out will be in Q1 – ie, more functionality in the existing pages that use it plus a floater that is a browser – followed by the parcel URL in Q2. HTML on a prim will be part of a larger rearchitecture of textures – we need to go to materials per face rather than texture per face – which several of the devs are itching to work on, but will realistically not start until Q2.
Firefox in Second Life: perfect.
PLoS ONE, the new way of publishing scientific papers, has gone live. As well as fascinating papers on the Syntax and Meaning of Wild Gibbon Songs, to say nothing of populist numbers like Regulated Polyploidy in Halophilic Archaea, you can also find a sandbox for playing around with the new features of this site. It's obviously premature to say whether this experiment in Web 2.0 science publishing will work, but it certainly deserves to.
20 December 2006
Computerworld has a short piece that has some background on Red Hat's unusual name. Bob Young is quoted as saying:
"When [Red Hat co-founder] Marc [Ewing] was at university he used to name his software projects red hat -- red hat one, red hat two -- just to differentiate them from his friends. So, of course, when he started his Linux project he just named it Red Hat Linux for lack of a better term," said Young, who left Red Hat in 2005 to focus his energies on another company he founded, online independent publishing marketplace Lulu.com.
Well, nearly right, Bob. This is the real story, as told to me by Marc Ewing himself some years back:
In college I used to wear my grandfather's lacrosse hat, which was red and white striped. It was my favorite hat, and I lost it somewhere in Philadelphia in my last year. I named the company to memorialize the hat. Of course, Red and White Hat Software wasn't very catchy, so I took a little liberty.
So there you have it: the company should really be called Red and White Hat.
It's hard to tell whether it's because RMS has changed, or whether the world has changed, but something is new here: RMS is starting to engage with politicians at the highest levels.
For example, a few months back I wrote about his meeting with Ségolène Royal - who, as I and many others predicted, is now officially a candidate for the French Presidential elections.
Now here's RMS hobnobbing with Rafael Correa, the new president of Ecuador. I had to laugh at this description:
Stallman spoke for almost 20 minutes, without Correa saying anything, just listening.
So some things remain the same.
In any case, the result seems good:
When Stallman finished speaking, there were a few questions, and a short conversation where the two of them were in agreement on everything, and Correa asked his advisers if Ecuador should migrate to free software. They said yes, and everyone, including Stallman, left the meeting with a broad smile on their faces.
(Via Linux and Open Source Blog.)
A lot has happened in the last year in the world of free software. That makes it hard (a) to remember who exactly did what and (b) to get the big picture. One invaluable tool for doing both is LWN's 2006 Linux and free software timeline, which offers all the main events with handy links to the original stories. They've also got other timelines going back to 1998, if you want to see an even bigger picture. Great stuff for a trip down free memory lane.
Update 1: And here's C|net's list of top stories in the same field.
Update 2: Meanwhile, here's Matthew Aslett's open source year in quotations.
19 December 2006
Something that seems to have everything going for it: Ajax3D. Yup: Ajax meets 3D - or X3D, to be more precise. Here's what a rather useful white paper on the subject by Tony Parisi, one of the pioneers of the by-now antediluvian VRML standard, has to say:
The initial development has begun. Media Machines has created the first showcase applications and tutorials, and has launched a web site, www.ajax3d.org, as an open industry forum to explore technologies, techniques and best practices. This white paper describes the key technical concepts behind Ajax3D and, via examples, introduces the beginning of a formal programming framework for general use.
(Via Enterprise Open Source Magazine.)
The truth begins to sink in:
The government has abandoned plans for a giant new computer system to run the national identity cards scheme.
Instead of a single multi-billion pound system, information will be held on three existing, separate databases.
Well, that's a start. Just as hopeful is the statement:
Home Secretary John Reid denied this was a "U-turn" saying it would save cash, boost efficiency and cut fraud.
So, presumably cancelling the whole thing would also not be a "U-turn", since it too "would save cash, boost efficiency and cut fraud"....
DMOZ - now called the Open Directory Project - just doesn't have the respect it deserves. That's partly because it's had more names than even Firefox/Firebird/Phoenix.
It started out as GnuHoo, but RMS took exception to that, and it became NewHoo - which Yahoo promptly took exception to. It managed to avoid the horrible ZURL (shouldn't that be Zurg?), before metamorphosing into the Open Directory Project, also known as dmoz (from directory.mozilla.org) to its friends.
But its real importance is not as an open Yahoo: it was the direct inspiration for Nupedia - NewHoo, Nupedia - geddit? - which in turn gave rise to the complementary Wikipedia: need I say more?
So it's sad to hear that dmoz is fizzling. It may not serve much purpose at present, but it's had a glorious past. (Via John Battelle's Searchblog.)
Google has acquired the mapping company Endoxon:
Endoxon is a developer of internet mapping solutions, mobile services, data processing, cartography, direct marketing and the Trinity software suite. Since 1988, Endoxon and its 75 employees have created ground-breaking solutions for a wide variety of geographic needs. Endoxon is a pioneer in AJAX mapping technologies. Endoxon technologies enable the integration and processing of geo-referenced data and high-resolution aerial and satellite images for dynamic internet and mobile services.
What's interesting about this is that it shows Google pushing forward in the field of mapping, cartography and 3D interfaces - an area that is emerging as increasingly important. (Via Ogle Earth.)
OnTheCommons has an interesting post about a new book called Understanding Knowledge as a Commons. This sounds great - see Peter Suber's comment below for details on open access to its contents.
This article did mention something I'd not come across before: the Digital Library of the Commons.
This turns out to be a wonderful resource:
a gateway to the international literature on the commons. This site contains an author-submission portal; an archive of full-text articles, papers, and dissertations; the Comprehensive Bibliography of the Commons; a Keyword Thesaurus, and links to relevant reference sources on the study of the commons.
Among the list of commons areas, there is Information and Knowledge Commons:
anticommons, copyright, indigenous, local, scientific knowledge issues, intellectual property rights, the Internet, libraries, patents, virtual commons, etc.
Strange that free software is not included. But good, nonetheless.
18 December 2006
Here's an interesting little tale of two nations sharing a common tongue but divided by patent culture:
In the US the courts found that Smith had infringed the patents in its use of similar designing software and ordered that it remove certain functions from its software.
The English court took a rigorous approach to analysing the patents and found that it did not adequately describe the system it sought to patent. In order to be valid a patent must describe a process so completely that a person who knows that subject area must be able to replicate it using only the contents of the patent.
The High Court found that Halliburton's patent did not do that, and the Court of Appeal has now agreed. Justice Jacob ruled that the patent was missing vital details, contained wrong equations, demanded a higher level of expertise than allowed and that it relied on material external to the patent, and therefore was not a valid patent.
A pity they didn't just chuck it on the basis you can't patent software, but at least the bewigged gents "took a rigorous approach."
And if anyone's wondering why I keep posting stuff about Chinese currencies - virtual or real - try this for a little hint about the interconnectedness of things (which is what this blog is all about), and the deep nature of a commons:
At least one-third of California's fine particulate pollution -- known as aerosol -- has floated across from Asia, says Steve Cliff, an atmospheric scientist at the University of California at Davis. "In May this year, almost all the fine aerosol present at Lake Tahoe [300 km east of San Francisco] came from China," says Tom Cahill, a UC Davis emeritus professor of atmospheric sciences. "So the haze that you see in spring at Crater Lake [Oregon] or other remote areas is in fact Chinese in origin."
The irony of finding Chinese mercury in American rivers, of course, is that much of it was emitted to produce goods being consumed in the United States. There's been a growing awareness that importing commodities from the rest of the world displaces pollution from the U.S. onto other countries; this story brings it full circle and demonstrates yet again that in this fishbowl called Earth, pollution can't be displaced "elsewhere" for long.
...just started to move to the right:
The euro held steady after the Iranian government said it had ordered the central bank to transform the state's dollar-denominated assets held abroad into euros and use the European currency for foreign transactions.
But I've no illusions that this is anything but the start of a shift even further eastward...
Well, now, here's some progress:
The OFT's market study into the commercial use of public information has found that more competition in public sector information could benefit the UK economy by around £1billion a year.
Download Commercial use of public information (pdf 707 kb).
Examples of public sector information include weather observations collected by the Met Office, records held by The National Archives used by the public to trace their family history, and mapping data collated by Ordnance Survey. The underlying raw information is vital for businesses wanting to make value-added products and services such as in-car satellite navigation systems.
Public sector information holders (PSIHs) are usually the only source for much of this raw data, and although some make this available to businesses for free, others charge. A number of PSIHs also compete with businesses in turning the raw information into value-added products and services. This means PSIHs may have reason to restrict access to information provided solely by themselves.
The study found that raw information is not as easily available as it should be, licensing arrangements are restrictive, prices are not always linked to costs and PSIHs may be charging higher prices to competing businesses and giving them less attractive terms than their own value-added operations.
It's good news that the Office of Fair Trading has grasped that the public sector trough-scoffers cost taxpayers serious money through their greed; however, realising that making the information freely available - not just for commercial use - would generate far more dosh still will take a while, I fear. (Via Open Knowledge Foundation.)
The Great UK DNA Database Lie continues to grow. Despite Government efforts to paint this as a deeply necessary tool to catch all those wicked evil people out there - "if you're innocent, you have nothing to fear" etc. etc. - it is becoming increasingly clear that, in the interests of total control, it is trying to create a DNA database of everyone.
As The Reg explains:
Less than two thirds of people whose profile is stored on the National DNA Database are there for having been cautioned or convicted of a criminal offence, Home Office figures have revealed.
In response to a parliamentary question, John Reid last week responded that 3,457,000 individuals are on the database, but 1,139,445 have no criminal record. The figure is eight times the total of 139,463 reported by the Home Office earlier in March.
That's over 2% of the UK population that shouldn't be on there: only another 95% to go.
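For those who want to check the arithmetic, here's a quick sketch. The population figure is my assumption - roughly 53 million for England and Wales, which the database covers - not one from the Home Office answer:

```python
# Figures from the parliamentary answer quoted above.
total_on_database = 3_457_000
no_criminal_record = 1_139_445

# Assumption: the database covers England and Wales,
# population roughly 53 million at the time.
population = 53_000_000

print(f"No criminal record: {100 * no_criminal_record / population:.1f}% of the population")
print(f"On the database at all: {100 * total_on_database / population:.1f}% of the population")
```

Which gives about 2.1% with no record, and about 6.5% on the database in total - hence the 90-odd per cent still "to go".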
16 December 2006
There aren't many details yet, but Raph Koster, generally regarded as one of the most original minds in the online gaming world, has lifted a corner of the curtain on his new company, with the delightfully erudite - and tricky - name of Areae:
Areae means "many places" in Latin. Depending on who you ask, you pronounce it "Airy-eh" or "Airy-eye" or "Area-ee"… well. It doesn’t matter. What matters is what it means: many places, many worlds.
Areae, Inc. is a company dedicated to taking the tired old virtual world and making it into something fresh and new. Something anyone can jump into. Something where anyone can find something fun to do or a game to play. Something where anyone can build their own place on the virtual frontier.
For me, the real giveaway is the logo, which consists of smaller and larger interconnected blobs: sounds like a system of interconnected, perhaps standalone virtual worlds to me. Watch out Second Life....
15 December 2006
The FSF is undergoing a remarkable change at the moment. From being a deeply worthy, but rather dull organisation, it has started to turn into the Rottweiler of the free software world. First there was Defective by Design, targeting DRM, and now we have the splendidly-named BadVista going for the jugular of Microsoft's new operating system:
"Vista is an upsell masquerading as an upgrade. It is an overall regression when you look at the most important aspect of owning and using a computer: your control over what it does. Obviously MS Windows is already proprietary and very restrictive, and well worth rejecting. But the new 'features' in Vista are a Trojan Horse to smuggle in even more restrictions. We'll be focusing attention on detailing how they work, how to resist them, and why people should care", said FSF program administrator John Sullivan.
Oh, come on John, tell us what you really think.
This is rather impressive in its way:
The TAB is owned by GateHouse Media, a newspaper conglomerate that owns 75 daily and 231 weekly newspapers. And the TAB isn’t the only paper that got a silver CC badge this week. Without fanfare, the company is rolling out Creative Commons licenses covering nearly all of the 121 dailies and weeklies they own in Massachusetts. The CC license now covers 96 of the company’s TownOnline sites, which are grouped within a portal for their many Eastern Massachusetts newspapers.
CC for 96 newspapers? Go, Larry, go. (Via Michael Geist's Blog.)
I came across this story on LWN.net:
The Software Freedom Law Center (SFLC), provider of pro-bono legal services to protect and advance Free and Open Source Software, today filed a brief with the United States Supreme Court arguing against the patenting of software.
In the case Microsoft v. AT&T, the Supreme Court will decide whether U.S. patents can apply to software that is copied and distributed overseas. The Court of Appeals for the Federal Circuit, a specialized patent court known for allowing patents on software and business methods, originally decided in favor of AT&T, expanding the international reach of U.S. software patents.
Well, to coin a phrase, I bloody well hope not.
But aside from the worrying implications of this kind of extra-territoriality, and the fact that the Software Freedom Law Center is supporting Microsoft in this case, I found the following statement from the amicus brief a little odd:
One could not send or receive e-mail, surf the World Wide Web, perform a Google search or take advantage of many of the other benefits offered by the Internet without Free and Open Source Software, which also includes the Linux operating system that is today’s strongest competitor to Petitioner’s Windows operating system.
Er, sorry Eben, that wouldn't be the GNU/Linux operating system, by any chance? You remember, the one that Richard can get a little funny about when he sees it described as the Linux operating system...?
News that the total number of Second Life signups has breached the two million mark comes at a time when such figures are being heavily criticised.
Of course, the two million figure does not reflect the true number of SL users. But think of it as a proxy for the real number - a kind of SL index. The fact that the index has doubled in a couple of months is the real news; even with a retention rate of "only" 10%, it's the rate of growth that matters.
What we are seeing is Second Life and the virtual world idea begin to break through into the mainstream. Even if the two million number were "real" it would be footling compared to the size of the Net; but even 10% of it is not footling compared to zero.
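As a back-of-the-envelope check - the 10% retention figure is the critics' own, and the doubling period comes from the signup numbers - the sketch looks like this:

```python
# The Second Life "index": signups as a proxy for real usage.
signups = 2_000_000
retention = 0.10                  # the critics' pessimistic estimate

active = int(signups * retention)
print(f"Active residents even at 10% retention: {active:,}")

# If signups doubled over two months, the implied monthly growth rate:
monthly_growth = 2 ** (1 / 2) - 1
print(f"Implied monthly growth: {monthly_growth:.0%}")
```

Even on the most pessimistic reading, that's 200,000 active users growing at roughly 40% a month - which is the real story.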
As to criticisms it's VRML all over again - and speaking as someone who wanted to believe in VRML, but never quite could because it was so obviously limited - SL is something else, for a reason that has nothing to do with SL's technology, cool though it is.
Unlike ten years ago, broadband is available and relatively cheap today. I have a line that regularly gives me 4 Mbit/s and over for a very reasonable price; this means that I simply never have to think about bandwidth anymore. Speeds will continue to rise, and virtual worlds will be able to take even more bandwidth for granted, with a resultant improvement in experience.
If Second Life can do this well in today's computing environment, it will do even better in tomorrow's.
Far be it from me to give any publicity to the arch-enemy Disney (hawk, spit), but this story about the latter's plans for a MMORPG spin-off from the "Pirates of the Caribbean" film is of note. It shows clearly how virtual worlds are entering the mainstream alongside traditional, "physical" merchandising.
14 December 2006
"Mii" avatars for the Wii console have been given a license to travel, thanks to a "How To" article published by LiquidIce's Nintendo Wii Hacks. Using the built-in Bluetooth capabilities of the Wiimote control device, a Bluetooth-capable PC, and a program called MiiTransfer, Mii avatar data can be scraped from the Wiimote to the PC, then shared through web sites such as the Mii Transfer Station (see the article for details).
I don't know if this is the first such case of avatar liberation, but it certainly won't be the last.
In case you hadn't noticed, we live in a digital world cordoned off by passwords. Nearly everything online requires them, so you are faced with the classic dilemma: use one hard-to-guess, hard-to-remember password for everything, or use lots of easy-to-remember, easy-to-guess ones - or maybe just one easy-to-guess one.
This fascinating analysis by Bruce Schneier of a clutch of compromised passwords from MySpace is slightly better news than you might have expected:
We used to quip that "password" is the most common password. Now it's "password1." Who said users haven't learned anything about security?
But seriously, passwords are getting better. I'm impressed that less than 4 percent were dictionary words and that the great majority were at least alphanumeric.
The story has some good links to historical studies of passwords, as well as the usual sharp Brucie thoughts. Alas, these include the following:
None of this changes the reality that passwords have outlived their usefulness as a serious security device. Over the years, password crackers have been getting faster and faster. Current commercial products can test tens -- even hundreds -- of millions of passwords per second. At the same time, there's a maximum complexity to the passwords average people are willing to memorize (.pdf). Those lines crossed years ago, and typical real-world passwords are now software-guessable.
"Hundreds of millions of passwords per second"??? Gulp.
...in the nicest possible way, with Google Patent Search:
As part of Google’s mission to organize the world’s information and make it universally accessible and useful, we’re constantly working to expand the diversity of content we make available to our users. With Google Patent Search, you can now search the full text of the U.S. patent corpus and find patents that interest you.
Try explaining to people that silence is a commons that needs to be defended, and you will get some blank looks, I suspect. But put in these terms, and they might begin to understand the issues:
Christmas music in shops is "torture", the "forgotten pollutant" which shop workers must be able to silence for the sake of their sanity, according to activists, trade unions and a peer. The government is being asked to investigate the problem.
I don't normally comment on this kind of in-house spitting match, but I can't help feeling that five years down the line, this will turn out to be quite a pivotal moment.
The Editor of TechCrunch UK, Sam Sethi, about whom I know nothing, has been fired by the TechCrunch god, Michael Arrington. You can get the official TechCrunchy story here; speaking as a UK journalist, ex-publisher and nascent blogger, I have to say it is about as convincing as Tony Blair's explanations of why we need ID cards. But what's really interesting is that it touches so many nerve points.
For example, we have the rather droll sight of Arrington accusing his employee of "crossing the line" on ethical behaviour. Since TechCrunch is one of the most supine blogs in the Web 2.0 world, I can only assume that this means it crossed the line by becoming too critical. Judging by the very restrained comments about Le Web 3, this certainly looks to be the case: as a Brit hack, I have to say this is very mild stuff about what seems to have been a pretty poor conference.
Moreover, as several of the more perceptive comments to Arrington's post point out, there are important cultural, not to say legal, issues here: you can't just fire people on the spot for anything less than gross misconduct, which certainly hasn't taken place as far as I can tell, using all normal definitions of "gross".
The whole affair is particularly amusing because Arrington has managed to break three cardinal rules of blogging. First, you don't delete blog posts. It just isn't done, unless there are legally compelling reasons to do so. Second, you certainly don't try to defend your deletion in another blog post, since this will only amplify the maladroitness of the initial action.
And finally, you must never try to punish a really good blogger by firing them, for the simple reason that they will simply go elsewhere and do it on their own (benefiting from all that lovely free publicity you've just given them), leaving you looking stupid, and them clever.
The fact that Arrington seems not to understand these issues speaks volumes about the exclusive world of venture capital-backed blogs - and the fact that in the near future, some of them are going to come a cropper, as good bloggers find that managing other bloggers ain't so easy. Not so much the beginning of the end, but certainly the end of the beginning.
Update: Michael Arrington has more on his side of the story.
A couple of weeks back, I wrote about attempts to take the MMORPG Ryzom open source; now it seems that these have received a big boost from a surprising quarter:
Free Software Foundation announces that it will officially support the Free Ryzom campaign (www.ryzom.org) with a pledge of $60,000.
The Free Ryzom campaign was established to purchase the online game and universe known as Ryzom, property of the now bankrupt Nevrax company, and release the entire game as free software.
As stated by Peter T. Brown, Executive Director of the Free Software Foundation, the FSF considers the Free Ryzom campaign "a high priority project for the free software movement". The aim of the campaign is to publish the source code to the entire game under the terms of the widely-used GPL, as well as publishing all of the artwork and other content under similar free licenses.
The Free Ryzom campaign represents a unique opportunity for the free software movement and the emerging free gaming field. A fully free MMORPG (massively multiplayer online roleplaying game) engine and client/server architecture would allow the development of a myriad of universes, each one evolving its own philosophy and unique content - but sharing in general technical improvements. If successful, this campaign would allow any user to create their own universe and produce their own content based on the Ryzom/Nevrax architecture.
What's particularly interesting about this move is that it confirms how MMORPGs and virtual worlds are moving into the mainstream: after all, the FSF has only limited resources, and would not choose to spend its hard-earned dosh on anything that it does not perceive as pushing forward its cause in a major way.
13 December 2006
Mark Shuttleworth, he of VeriSign and Ubuntu, has noticed Second Life:
Second Life of course brings a new twist to the idea of immersion, though for now it’s immersion on the virtual side of the looking glass. What interests me are the ways in which there is cross-over between the virtual world and the real world. When I’m walking around town, does my mobile phone alert me to changes in the virtual world? And when I’m working at my PC, how much can I stay focused on work, say, while my PC also keeps me abreast of what’s going on with my avatar?
I think there’s going to be a need for innovation around the ways we blur the lines between real and virtual worlds, and this is again one of those places that I think the free software community could steal a lead on the proprietary world.
Well, yes; or maybe no. (Via tuxmachines.org.)
Maybe you can push a meme too far:
We are publishing how to use the high powered laser system, set up, techniques, business practices and templates. You could start your own laser business, we'll even help you.
This is worrying: politicians have discovered that open source is vaguely, well, groovy:
"I am very impressed with the phenomenon of open source software. In the wiki logic, there is a capacity to share and cooperate on research and economic activity. This is a crucial moment of the history of humanity."
Tuangou 团购 - not sourcing from crowds, but sourcing for crowds:
for tuangou, it's all about shopping, aggregating enough people to be able to get a discount or better terms on everything from cars to home remodelling materials to real estate.
This post also has a good picture of the physical reality of tuangou.
12 December 2006
Here's an interesting piece of research that attempts to get some (relatively) hard facts about residents of Second Life. It's only an indication, of course, but it certainly gives the lie to many misconceptions about the place:
Respondents who claim to feel happy in Second Life, also feel happy in the physical world: this hypothesis does not hold for those who spend more than 30 hours a week in Second Life and feel unhappy in the physical world. The relation is strongest for the 18-30 hrs group: the happier they feel in real life, the happier they feel in Second Life. The relation is weakest for the under 18 hrs group. The hypothesis that the more hours one spends in Second Life, the unhappier one feels in the physical world, does not seem to hold. The socially skilled, who feel comfortable in the real world, also feel most comfortable in Second Life.
Update: And talking of hard facts, here's a new page of Second Life numbers. What's most interesting is that they are all showing the same shape.
How can I resist a pitch like this?
Openserving extends the essence of the open source model — free software and content — to all aspects of web-based computing.
You can set up your own collaborative blogging site.
All articles are sorted democratically.
Here's a bonus: keep 100% of the ad revenue for yourself.
Highly laudable, but it's this bit that worries me:
Wikia founder and chairman Jimmy Wales said, "Social change has accelerated beyond the original Wikipedia concept of six years ago. People are rapidly adopting new conventions for working together to do great things, and Wikia is a major beneficiary of that trend. OpenServing is the next phase of this experiment. We don’t have all the business model answers, but we are confident – as we always have been – that the wisdom of our community will prevail."
I see: so you mean that although you lose money on every user, you make up on volume?
The EU Internal Markets Commissioner has warned that Europe is about to miss out on a chance to forge a pan-European patent disputes forum because of long-standing international bickering on what a system might look like.
Following further fruitless discussions between European countries this week, Charlie McCreevy has warned that the entire plan could stall. “Anything remotely concerning this patent area is fraught with minefields at every turn of the road,” McCreevy told the Financial Times.
I'm such a sucker for clever neologisms:
There's a turf war heating up over a strip of web real estate called "Iconistan."
You won't find this mythical land on a map, as Iconistan exists only at the bottom of blog posts. It's where that little crowd of icons gathers, begging you to post a bit of news to Digg, Reddit, del.icio.us and various other social news and community sites.
One of the key ideas that underpins this blog is that one day all knowledge will be freely available online. Open source is the means, and open content/open genomics/open data etc. will be the result.
Clearly, there is a long way to go, but it is important to keep things in perspective. Ten years ago, Wikipedia did not exist; today, it provides an unparalleled collection of knowledge, for all its faults. Looking just a little further back, say 15 years ago, the Web consisted of a few servers worldwide, and GNU was kernel-less - Linux had only just come into existence. We have gained much in those ten and fifteen years.
And now here's another straw in the wind, a sign from the future: the complete works of Mozart available online, free:
Starting on December 12, 2006 the ISM and the Packard Humanities Institute will make the complete musical texts of the NMA available to everyone for private, scholarly, and educational use as NMA Online. Free access will be provided on the Internet at http://dme.mozarteum.at/. The music pages are linked with the scans of the NMA’s critical reports. Comprehensive search capabilities allow users to easily find, study, and print any of Mozart’s works as PDF files. The NMA Online is the first extensive, up-to-date complete works edition that is available to everybody at no charge.
This is a wonderful resource, if rather slow because of the huge interest it has provoked. But that's a detail: this is the open future, and it's coming.
Today is the first anniversary of opendotdotdot. Since this fact is of little interest to anyone but myself, I thought it might be useful to extract what pearls of wisdom I can from the experience of writing 1,260 posts in that time.
1. Google is your friend
Over half the traffic that arrives on these pages comes from Google. Moreover, the absolute number of visitors directed here by the Google machine just keeps on going up. I can only presume that the more posts you make, the more Googlejuice you generate, and the more you move up the ranks. Indeed, when I take a look at some of the keywords people enter in Google before arriving here, it is gratifying to find this blog pretty highly placed, albeit for some pretty wacky terms (like "wackypedia" and "seed gestapo") as well as a few surprisingly mainstream ones (like "philip rosedale interview").
2. Yahoo and MSN are a waste of time
I'm constantly amazed at how utterly Google dominates the search engine field in terms of the visitors it provides. I had expected MSN to be way behind, but not Yahoo; in fact, I get more visitors from MSN than Yahoo, which barely registers at all. Maybe this says something about the kind of readers/searches that end up at my blog, but it also emphasises the importance of Google. Interestingly, Technorati also generates far more leads than either Yahoo or MSN.
3. People like tags
The first version of the blog had no tags, for the simple reason that Blogger did not offer the facility, and I was too lazy to do it by hand. But when the new Blogger beta came out with tags, I dutifully employed them. I'm glad I did, because people really like using them to search within the blog. In fact, like me, people are lazy: they just can't be bothered entering a search term in the blog search box, but they can summon up the energy to click on one of the tags. One consequence of this is that I intend to go back and tag all of the older posts, since it clearly is something people find useful.
4. People like weird stuff
I am resigned to the fact that I am completely unable to judge which posts will be popular or not. Sometimes I spend ages crafting some witty/profound/novel/hot post, only to have it roundly ignored by almost everybody. Equally, I've often knocked out a trivial/pointless/content-free post only to find everyone and their dog rush to admire its insights. Sigh.
5. It gets easier
In many ways, the most important lesson that I take away from a year's blogging is that the more you do, the easier it gets. This is not just because you learn to type faster, but also because I've found that blogging helps me think faster and maybe even better. It's also a direct consequence of the fact that it's such damn good fun.
11 December 2006
As the author of two books with the word "code" in the title, I naturally gravitate to other tomes that also draw on this word. But Larry Lessig's Code is rather special: it's one of the definitive texts of the Net age. I remember reading version 1.0, when it came out; and now, following a suitably wikified genesis, here's Codev2. (Via Michael Geist's Blog.)
One of the reasons I think copyright issues regarding text, images, music etc. are important is that they are a foretaste of when things will get really interesting, in two quite separate spheres.
The first is that of virtual worlds, where everything is digital, and therefore potentially able to be copied. The CopyBot incident in Second Life was a hint of what is to come there. Perhaps even more disturbing is the second: once three-dimensional fabbers are perfected and become cheap, and you can start copying physical objects, so-called real-world economies are going to have a lot of problems.
But it looks like there's a stage even beyond that. New Scientist reports that a kind of Bio-Fabber is already in the early stages of development:
An inkjet device that prints tiny “bio-ink” patterns has been used to simultaneously grow two different tissues from the stem cells of adult mice. Surgeons could one day use the technology to repair various damaged tissues at the same time, the researchers say.
Now, put together copyright with issues of privacy and identity....
Update: Here's a rather deeper meditation on the same subject, with a rather wittier title: "Bioprinters vs. the Meatrix".
What do you do if your publisher never actually publishes the book you have written about OpenOffice.org for them? Easy: buy back the rights and self-publish:
I had it done in May, but there was much dithering at Prentice Hall. It's not just them -- lots of publishers are facing the fact that publishing is different now, and book contracts have been getting cancelled all over the place. I had the opportunity to get the publication rights back, so I jumped at the chance and have self-published the Guidebook.
The author, Solveig Haugland, probably knows more about using OpenOffice.org than anyone, so the book is highly recommended. She's even made available some sample chapters so you can try before you buy.
After the Summer of Code, now the Winter of Code:
The South Korean government and local tech companies have started an open source student developer contest, similar to Google's Summer of Code.
Dubbed Winter of Code, the competition will begin during Korea's winter recess in January next year. Organized by Korean games publisher NCsoft, local IT firms and the Korea IT Industry Promotion Agency, the contest aims to nurture new developers and promote open source software development in the country.
Rob Weir has a characteristically sharp and original analysis of the recent approval by ECMA of Microsoft's Open Office XML:
Thus the remarkable achievement of Microsoft and Ecma TC45, who not only managed to create a standard an order of magnitude larger than any other markup standard I've seen, but at the same time managed to complete the review/edit/approve cycle faster than any other markup standard I've seen. They have achieved an unprecedented review/edit/approval rate of 18.3 pages/day, 20-times faster than industry practice, a record which will likely stand unchallenged for the ages.
I think we would all like to know how they did it. Special training? Performance enhancing drugs? Time travel? A pact with the Devil? I believe you will all share with me an earnest plea for them to share the secret of their productivity and efficiency with the world and especially with ISO, who will surely need similar performance enhancements in order for them to review this behemoth of a standard within their "fast track" process.
I am optimistic, that once the secret of OOXML's achievement gets out, the way we make standards will never be the same again.
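Weir's 18.3 pages/day figure is easy to sanity-check. The inputs below are my approximations - a roughly 6,000-page specification and about eleven months of committee time - rather than his exact numbers:

```python
# Approximate inputs (my assumptions, not Weir's exact figures).
pages = 6_000          # the OOXML draft ran to roughly 6,000 pages
review_days = 330      # about eleven months between first meeting and approval

rate = pages / review_days
print(f"Review rate: {rate:.1f} pages/day")

# Against a more typical standards pace of about one page per day:
typical = 1.0
print(f"Speed-up over typical practice: {rate / typical:.0f}x")
```

Which lands within a whisker of Weir's 18.3 pages/day, and of his 20-times figure.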
10 December 2006
Ministers were today urged to consider abandoning the multi-billion pound Joint Strike Fighter project unless the United States agrees within weeks to share sensitive technology.
Ministers have previously threatened that the UK could pull out of plans to buy up to 150 of the military planes for the RAF and Navy unless America agreed to transfer secrets about its software that Britain argues are needed in order to operate and maintain them independently.
Talking of the blurring of the distinction between life and games, here's a great rumination on certain aspects of modern China (a subject that interests me greatly). I was particularly struck by these two passages:
In China’s case, I’d say morality is probably 5% instinctual, 20% customs and traditions, and 75% fear of law and loss, with an overall lower bar for morality. It is interesting to observe how this is very similar to how morality evolves in an MMORPG (Massively Multiplayer On Line Role Playing Game). Religion has nothing to say about how your Avatar’s life should be conducted (hah! What Would Arthas Do?), and there is little rule of law on the servers. Thus, if one was to take a walk through SecondLife, one would commonly find copious quantities of sex-related items for sale, and presumably there are many people who will also sell you virtual sex for Linden dollars. Maybe this is a stretch, but I think the underlying moral lessons are not too different from the scene I saw in the Hard Rock Cafe Beijing.
Beijing is in the process of building an enormous Olympic park. They tear down whole neighborhoods and pave roads over them in a matter of weeks. They are building an 11 or 12-route subway system that promises to rival the subway system in Manhattan for connectivity and completeness. Watching this happen reminds me of how I play Sim City. If you’ve ever played the game, you’ve probably remorselessly bulldozed huge sections of Sim Cities that you messed up the planning on, and improved your city’s long-term productivity through doing that. The Beijing government seems to restructure the city with about the same attitude and efficiency...
Fascinating. (Via GridBlog.)
One of the things I have come to appreciate, albeit rather belatedly, is how gamer culture is going mainstream. By that, I don't just mean that it's more acceptable to be a gamer, or that more and more sectors of society are playing games, but that the gaming world-view is starting to seep into other areas of life.
Take Amazon's new Askville, for example:
Askville is a place where you can share and discuss knowledge with other people by asking and answering questions on any topic. It’s a fun place to meet others with similar interests to you and a place where you can share what you know. You can learn something new everyday or help and meet others using your knowledge. Askville even helps you learn by giving you cool tools to help you find information online while you are answering questions. It’s all about sharing—what you know and what you want to know—so go ahead and meet someone new today and Askville!
But most interestingly:
Every time you answer a question on Askville you will earn or potentially lose experience points in the topics that were associated with that question. Askville uses experience points to determine how knowledgeable a user is in a given topic. Experience points are broken up into various levels. To reach a certain level you need to have earned a certain number of experience points in that topic. Go to Experience Points, Levels, and Quest Coins in the FAQs to learn more about experience points and levels.
Which, of course, is precisely how a game works. In other words, Askville is a game. Life is a game. (Via TechCrunch.)
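Amazon doesn't publish Askville's actual mechanics, but the points-and-levels scheme its FAQ describes is straightforward to sketch. A minimal illustration - all names, thresholds and numbers here are invented, not Askville's real values:

```python
# Hypothetical sketch of an Askville-style reputation system: users gain
# or lose experience points per topic, and a level is derived from the
# cumulative points in that topic. Thresholds are invented for illustration.
from collections import defaultdict

# Points needed to reach each level (level 1 starts at 0 points)
LEVEL_THRESHOLDS = [0, 100, 300, 700, 1500]

def level_for(points: int) -> int:
    """Return the highest level whose threshold the points meet."""
    level = 1
    for i, threshold in enumerate(LEVEL_THRESHOLDS, start=1):
        if points >= threshold:
            level = i
    return level

class Player:
    def __init__(self):
        # Experience is tracked per topic, as Askville's FAQ describes
        self.xp = defaultdict(int)

    def record_answer(self, topic: str, delta: int) -> int:
        """Apply an XP gain (or loss) for one answered question;
        points never drop below zero. Returns the resulting level."""
        self.xp[topic] = max(0, self.xp[topic] + delta)
        return level_for(self.xp[topic])

p = Player()
p.record_answer("astronomy", 120)   # a well-received answer
p.record_answer("astronomy", -40)   # a poorly-received one
print(p.xp["astronomy"], level_for(p.xp["astronomy"]))  # prints: 80 1
```

Strip away the "knowledge sharing" language and this is the character-progression loop of any RPG: actions earn or cost points, points map to levels, levels signal status.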
09 December 2006
08 December 2006
Following the recent excitement about Second Life's energy consumption, it seems that people are beginning to realise that it's not the only one with problems:
The nation's biggest technology companies sat down with federal regulators Wednesday to assess the industry's thirst for power amid fears that volatile and expensive energy could hinder the growing sector.
The fierce competitors at the table -- including Google, IBM, Microsoft, Cisco, Sun Microsystems and Hewlett-Packard -- rarely gather to talk strategy. But they were lured by the chance to influence the development of national energy standards.
"I think we may be at the beginning of a potential energy crisis for the IT sector," Victor Varney, a vice president for Silicon Graphics, told the regulators. "It's clearly coming."
In another sign of the ascendancy of Eclipse, Google has joined the club:
Ending months of speculation, Google has officially joined the Eclipse Foundation as an add-in provider, according to officials at both organizations.
Children are increasingly swapping music via mobile phones, often without realising they can be breaking the law.
A survey of almost 1,500 eight to 13-year-olds found almost a third shared music via their mobiles.
Almost a half (45%) of children who said they did not swap music via their phones said they would like to.
Inevitably this story is provoking howls from the music industry. But it teaches us two things.
First, that - just as RMS has always said - sharing is natural, part of the genetic make-up that allows us to function as social beings. And second, that this is an opportunity, not a threat: these children are doing free marketing for the music companies, spreading the word about cool music. Copyright owners should rejoice rather than fret - and move on to finding a way to make money selling more music off the back of all this fantastic viral distribution.
As if Chinese leaders did not have enough of a headache trying to manage the country's rising but still undervalued currency in the testy world of international trade, now the growing popularity of virtual money enters the already complex equation.
The so-called "QQ" coin - issued by Tencent, China's largest instant-messaging service provider - has become so popular that the country's central bank is worried that it could affect the value of the yuan. Li Chao, spokesman and director of the General Office of the People's Bank of China (PBOC), has expressed his concern in the Chinese media and announced that the central bank will draft regulations next year governing virtual transactions.
So tell me again the difference between real and virtual currencies....
I have a great deal of respect for the German nation. More than anyone else, I think, they have come to terms with their recent history - specifically with the Nazi period - and emerged stronger, wiser and more admirable (compare, for example, Turkey's rather sad denial that a genocide of over a million Armenians lies festering in its past). But recently, I've noticed some signs that German society - or at least its politicians, which I concede is not quite the same thing - is really out of touch with reality.
I wrote yesterday about Germany's daft plans to monitor PCs while they are connected to the Internet - blithely ignoring the near-impossibility of the idea. Now we have something else equally stupid: the criminalisation of violent video games. According to Der Spiegel - probably the best news magazine in the world - the Bavarian minister for internal affairs wants to make the "production, sale and purchase of such games punishable by up to one year's imprisonment."
This is so obviously a knee-jerk reaction by frightened old politicians, unable to deal with the technological changes that are happening around them. What makes it particularly sickening is that it concerns itself with virtual violence, and blithely ignores the rather more pressing issue of all the violence present in this world - as practised, for example, by the US Government in its various torture camps around the world. Get real, people.
This has to be the coolest name for a project in recent years: BrainGumbo. No wonder, perhaps, since it's pressing some of the hottest hot buttons in computing today:
The BrainGumbo Project aims to build a virtual movie (machinima) production studio in Second Life, from the work of amateur enthusiasts and free components.
07 December 2006
I'm impressed: IBM has just stuck a dirty great clog in the engine of Microsoft's machinations to get its Office XML format adopted as a formal standard:
IBM voted NO today in ECMA on approval for Microsoft’s Open XML spec.
Heavy stuff: I think we can expect some horse heads to start turning up soon.... (Via C|net.)
Correction: further to Bob's comment, I've gone to the ECMA site and found the press release announcing the approval of the standard. Naively, I thought that somebody voting against it would block it: not so. Apologies for my over-enthusiastic analysis. I suppose IBM's move was therefore more symbolic than anything. Ah well.
In a significant sign that Second Life is beginning to expand beyond its early-user, largely anglophone base, the Internet arm of Brasil Telecom has announced that it is coming out with a localised version of Second Life this month.
The company estimates that there are currently 30 to 40 thousand Brazilians already active in Second Life, a number it hopes to double in the first year of operation. It also aims to help those currently using the English-language version to migrate to the new Brazilian client.
A separate arm of Brasil Telecom is being created purely for this venture; worryingly, the company behind it talks about "selling advertising inside the game". I don't think that's going to go down too well with the (virtual) locals....
At first sight, there's something appropriate about Paul Allen paying for the Allen Brain Atlas:
an interactive, genome-wide image database of gene expression in the mouse brain. A combination of RNA in situ hybridization data, detailed Reference Atlases and informatics analysis tools are integrated to provide a searchable digital atlas of gene expression. Together, these resources present a comprehensive online platform for exploration of the brain at the cellular and molecular level.
After all, he did work on an "electronic brain", as computers were mockingly called back in those dim, dark days of early computing. And it comes as no surprise that the freely-available and rather impressive 3D Brain Explorer - think Google Earth for the mouse brain - is only available for Windows XP and the Macintosh.
But dig a little deeper, and you find something rather telling about the real "brain" behind this brain:
Processing the amount of data produced during the Atlas project (approximately 1 terabyte/day) requires a fully automated data processing and analysis pipeline. A goal of informatics is to provide the infrastructure that will allow scaling of an increase in image data and complexity of image processing. The IDP was designed to be modularized and scalable to support a library of informatics algorithms and to function so that additional incorporation of informatics modules does not interrupt production systems. The system must also have the flexibility to accommodate defining multiple workflows using some or all algorithms and is iterative in its processing of gene image series. Parts of the process are computationally intensive, such as image quality assurance/quality control (QA/QC) and preprocessing, registration, and signal quantification. These tasks are scheduled and run in parallel on the server cluster.
Right. And just as a matter of interest, what might that cluster be running?
The cluster consists of a total of 148 CPUs, 32 HP BL35p blades with dual AMD 2.4Ghz, 4GB RAM and 21 IBM HS20 blades with dual Intel 2.8Ghz Hyperthreaded, 4GB RAM, all running Fedora Linux.
Obviously someone used their brain.