I'm sure you already got it, but I'd like to state it one more time: I love openness. And I'm delighted to see the number of open projects popping up everywhere nowadays to counter copyright, patents and other control mechanisms over ideas.

Open Source Software and beyond

Back in 1998 I was a student in Germany, paying my rent as an independent computer worker (I sold a few bad webpages and the ancestor of po4a, to monitor web site translations). I had been a big fan of computers for much longer (I wrote my first programs when I was about 10 years old), but the only systems I knew were DOS and Windows; I had never heard of anything else. During that summer, I came across the free software movement, and in particular the Debian project. I can still remember the shock I felt when I realized that this technical project was grounded in ethical views. The fact that interactions between members followed a constitution stating that the leader has less power than the members was quite... unexpected. This is the very reason I came to the Debian distribution (and then the quality of the packaging system kept me here). This talk constitutes a very good entry point to Debian's philosophy, history and decision-making process. It's very interesting to see this philosophy currently spreading beyond the Linux community and into real life.

It has been said that its constituting element is that the work is done by the masses, as in the Linux kernel development model, where given enough eyeballs, all bugs are shallow. It even got called crowdsourcing (crowd + outsourcing). Recently, classical business actors have been taking advantage of crowdsourcing for various reasons, as explained by this infographic. Crowdsourcing is a nice idea, related to the philosophy of the hackers, but I don't think that the crowd aspect is the most important point here. I see it at best as a consequence, not a mandatory condition. It has already been reported that the vast majority of projects are developed by only one contributor each. In my mind, this fact explains how the incredible SSH vulnerability could happen in Debian: if that package had been maintained by more than one individual, the chances of detecting the issue before its release would have been higher.

It's also tempting to think that the specificity lies on the decision-making side. Indeed, there are many more community-based groups here than IRL, where business corporations and state-funded agencies occupy almost all the space. These groups are ruled by some sort of do-ocracy (whoever does the work decides, i.e. actions speak louder than words), the very rules that I found so magical when I first discovered Debian. But this is no constitutive element either: first of all, it is merely constitutive of the associative world, not of the hacker movement. And even the purest community-based projects, such as Debian or the FSF, actively collaborate with hierarchical entities such as business corporations and state agencies.

What's really important to me is that everything is made openly. I feel like it's the root of everything. Instead of trying to protect your knowledge, you share what you have so that you receive what others have. Of course this works better when the stuff you have can be duplicated at no cost (like software, which can be copied for free over the Internet), but we shall see how this idea spreads to other contexts. To ensure reciprocity, it is absolutely mandatory that what you are sharing is not only open but also as free as free speech, preventing anyone from taking it for their own. This solves one of the major issues with the public-domain licensing scheme, where someone can take what you've given away under the public domain, integrate it into a closed system, and deny you the benefits of the result.

Openness propels ideas, as long as it is protected by freedom, and in turn enables crowdsourcing by communities of volunteers. Openness also impacts the price, since results are often cheaper, but that's really a side effect, and it's perfectly OK to sell something that is open.

Opening computer-related areas

A lot of computer enthusiasts also like electronics, which explains why many people are working on an Open Hardware movement. Since actual circuits do have a reproduction cost, the final product cannot be shared for free; the complete specification is what gets shared. Projects ensure that every piece of information needed to build and/or modify the product is freely available.

Arduino is an initiative aiming at making open electronics easy. It is of course open itself, and you are perfectly free to build your own Arduino. Yet most people buy it ready to use from its creators (which explains the business plan making it all possible). Experts and hobbyists use it to prototype their creations, while artists, designers and everyone else can use it to learn electronics and do whatever they are up to: Clément and I are working on a stupid game called FukuTruc, a nuclear power plant simulator for less than $10 (not counting the Arduino).
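To give an idea of how little is needed for such a toy, here is a minimal sketch of what the simulation core of a reactor game could look like, hardware left aside. Every name and constant here is invented for illustration; this is not the actual FukuTruc code.

```python
# Toy reactor-simulator loop in the spirit of FukuTruc.
# All names and constants are invented for illustration.
def step(temperature, rods_inserted):
    """One simulation tick: the core heats up unless control rods absorb it.

    rods_inserted is the fraction of control rods in the core, in [0, 1].
    """
    heating = 5.0                   # heat produced by the core per tick
    cooling = 8.0 * rods_inserted   # heat absorbed by the rods per tick
    return temperature + heating - cooling

temp = 100.0
for _ in range(10):
    temp = step(temp, rods_inserted=0.5)  # rods half inserted: slow heating
print(round(temp, 1))  # prints 110.0
```

On the real thing, the loop would read the rod position from a potentiometer and drive LEDs or a buzzer instead of printing, but the game logic itself fits in a handful of lines.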

Arduino gave rise to a very interesting community around open-source extension kits that plug directly onto the Arduino PCB. Individuals or small companies design their own extensions, publish the specifications, and make a living selling the assembled extensions. The individuals involved (such as http://warrantyvoidifremoved.com/ or http://dangerousprototypes.com/) are somewhere between technical experts, artists and activists. Among the Arduino extensions you find modular electronic elements that let learners forget about the soldering iron (and undo their mistakes), gameduino to build your own old-school games, an open genetic sequencer, and open flying drones. Actually, every possible gadget seems to be worked on by someone, somewhere. Playing a wooden labyrinth with a Wii Fit is now possible...

Gizmo is another community, not directly related to Arduino, that crowdsources several open-source hardware projects, such as a PDA, a game console or a GPS tracker.

When it comes to building your own chip, the technical difficulties of reproduction are almost impossible to overcome for now. But that does not mean that nothing is happening here. CERN already pushed the OHL (Open Hardware License) as an equivalent of the GPL, while others proposed the OSHW. FPGAs are an interesting way of specializing generic chips, even if reproducing specialized chips remains beyond hobbyist capacities for now. This approach is demonstrated by milkymist.

The network is also something to be freed, particularly in France where our politicians are passing laws such as Hadopi. Reporters Without Borders recently classified France as Under Surveillance. As a reaction, freedom-protection groups such as April and La Quadrature du Net are fighting for net neutrality. French Data Network is even an associative ISP, crowdsourcing the network infrastructure to ensure that it remains free of any censorship (I happen to be a member of its local incarnation, although I still have a classical closed ISP for now).

Ironically, software itself needs to be freed again with the rise of cloud computing and web applications. Ten years ago, we switched from the closed systems that had the sexier interfaces to a free operating system encumbered by its clumsy text interface. Things have evolved a lot since then, and Linux interfaces are now as ergonomic as Windows' -- freedom and openness come in addition. We now face the same issue with Web 2.0: what is the point of using Linux if you rely on a completely closed system such as Google Mail, Facebook, Doodle or Twitter? Alternatives begin to emerge, such as Zimbra for hosting mail, http://identi.ca to tweet, FramaDate to find the next meeting date, http://selectricity.org/ to organize a poll, etc. No direct competitor of Facebook is usable yet (don't cite Google+ or I'll bite! How is it any more open?), but several are cooking, such as Jappix or Diaspora. For the time being, I'm still doomed to Facebook :( Network effects remain particularly pernicious, and I'm not completely sure how we will manage to deal with them.

Artistic and cultural open scene

Many artists embraced this way of doing things. The easy-to-understand, modular licenses of Creative Commons helped this process a lot. OxyRadio is a French net radio broadcasting only free and open music, which is to say that every artist in their playlist decided to put their work under a license similar to the ones used in open-source software. As a matter of fact, this playlist is still somewhat limited: right now they have 1171 songs from 234 artists. That's not much, but this figure can only increase with time :)

I usually listen to music while working, and la grosse radio propels me most of the time (the rock version; I never listen to the hard-rock or reggae variants). It is a net radio run by fans. It broadcasts mainly "regular" closed songs; the specificity lies more in the way they build the playlists. Their website lets listeners vote for or against songs as they get broadcast, and computers tune the playlist accordingly. Similar features usually target a single user rather than a community, and I very much like that my fellow listeners pick songs for me when I have better things to do than voting myself. The process goes one step further: independent artists can upload their music to the system. Volunteers then listen to it, and it either gets filtered out or added to the playlist, and thus to the global voting system. The goal is to act as a bridge between artists, event organizers and music fans that effectively bypasses the majors. This is how I discovered Bagdad Rodeo, for example.
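The vote-tuned playlist mechanism is simple enough to sketch in a few lines. This is purely my guess at how such a system could work, not la grosse radio's actual code; the names and the weighting scheme are invented.

```python
import random

# Hypothetical sketch of vote-weighted playlist selection: a song's chance
# of airplay grows with its net vote score (upvotes minus downvotes).
# Names and weighting scheme are my own invention, not la grosse radio's code.
def pick_next_song(songs, votes):
    """songs: list of titles; votes: dict mapping a title to its net score."""
    # Shift scores so that even the worst-rated song keeps a small chance.
    lowest = min(votes.get(s, 0) for s in songs)
    weights = [votes.get(s, 0) - lowest + 1 for s in songs]
    return random.choices(songs, weights=weights, k=1)[0]

playlist = ["Song A", "Song B", "Song C"]
scores = {"Song A": 5, "Song B": -2, "Song C": 0}
print(pick_next_song(playlist, scores))  # most often "Song A"
```

The nice property of such a scheme is that unpopular songs fade out gradually instead of being censored outright, which fits the community spirit better than a hard cutoff would.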

Teaching material

Modern technologies enabling distance learning are naturally propelled by free software such as Moodle. More interestingly, the teaching content itself is getting freed too, just like music. Several big universities such as MIT, Berkeley or Notre Dame are releasing the content of their courses under Creative Commons licenses. This is a huge step forward, since it allows people all around the world to benefit from this OpenCourseWare without actually attending the classes. For me, as a university teacher, it changes things quite a bit. My work should be about helping learners acquire the knowledge that awaits them on the Internet, not about defining what is to be learned. That being said, OCW is often rather raw material, such as lectures on video. I'm glad this material exists, but it seems difficult to use: it cannot replace face-to-face teaching, nor can it easily be integrated into a classical curriculum. Moreover, it is not truly open, since I cannot modify it.

I'm much more interested in community efforts to build teaching material, as they seem more hackable to me. A blazing example is Sésamath: a community of math teachers in France who wrote free math textbooks for kids between 10 and 15 years old. They seem to me of professional quality, even if I am not an expert. They are free in the sense that the source is provided (a LibreOffice document), and printed copies are sold by an associated printer. The website also hosts a lot of interactive exercises for learners, as well as specialized material targeting teachers (answer books, specific annotated animations). This project is also notable because it clearly wasn't founded by hardcore players of the open-source software community (they are not using LaTeX, and the animations are in Flash), but by experts of their field. (Side note: I'm dreaming of turning PLM into such a platform for programming and computer science, stay tuned ;)

Open Data

Nowadays, almost every kind of data is being freed. Wikipedia is the open encyclopedia that everybody knows by now. From my personal experience, it's of surprisingly good quality. I really should contribute at some point. OpenStreetMap aims at building a freely usable and editable map and geographical database. It may sound like a crazy idea, but thanks to its 320,000 members, this community has achieved high-quality maps in countries all over the world (although not the complete world yet). The data is even used by Microsoft's Bing now, which demonstrates once again that open systems and corporate systems can peacefully collaborate.

Several governments have committed to publishing their data and launched dedicated websites such as http://www.data.gov/ (see the list on Wikipedia). All sorts of information are made public on these pages, along with some dedicated web applications to help navigate the data. And it's better this way. In France, only some local institutions have decided to go for open data initiatives so far. I hope this will spread further in the future. The excellent French web journal OWNI specializes in open culture and citizenship. It often applies or reports on data journalism methods to interpret the large data sets released this way. See here or here. You really should follow OWNI if you read French (if only they could use identi.ca instead of Twitter, it'd be even better).

The openness movement is very strong in science too. Once upon a time, scientific information was considered a public good, partly because of sad stories where fundamental research was long ignored, or even duplicated, because of the poor diffusion of scientific ideas. But then came the publishers, acting just like the majors with cultural media. They are certainly helping the process, but at the end of the day, articles are reviewed by experts of the field and nobody reads the paper edition anymore. The pricing of most scientific publishers is thus difficult to bear, which explains why Open Access has gained so much attention. I even advocated Open Reviewing recently ;) On the science itself, some people (including me) think that we should shed some more light on the process. Given the technical level of our experiments, free-form descriptions of the methodological settings are not enough anymore. The methods ought to be formalized precisely and uploaded to specialized sites such as http://www.myexperiment.org/ where others can comment on them, share them, reuse them and improve them. Actually, a large part of my research activity pursues this goal: the SimGrid simulation tool aims at making research in our domain more reproducible and easier to share.

DIY and Web²: the revolution is coming IRL

The most interesting change is that this movement is no longer limited to virtual goods such as ideas, data, music or software. It is invading industry. As in open electronics, the idea is to share the construction plans of almost every possible object, and to crowdsource the R&D phase.

Of course, you need a way to build the objects from the open-source schematics. This can still be done manually in open electronics, but for most of the objects around us, you need industrial equipment that goes way beyond a simple soldering iron. The emerging solution is a worldwide network of FabLabs, each providing access to the needed equipment and serving as an information hub. Some machines will be in every home, some in every city, while some more specialized machines may only be available in big cities. The most innovative of these machines is probably the RepRap, a 3D printer capable of producing plastic objects. When it comes to production, this becomes the DIY (Do It Yourself) culture. Of course, RepRaps are built using... RepRaps :) The most environmentally conscious of us are exploring solutions to make their RepRap self-sufficient by building a granule extruder to directly recycle plastics that would otherwise have been trashed.

One could say that some goods simply cannot be produced by hobbyists, no matter how brilliant and motivated they are, but are reserved to mass-production means. Yeah well, that's what they said about software a few years ago, and now only the desktop remains [partially] closed source while Linux is taking every other domain by storm. Plus, some projects are already producing quite unexpected things, such as a race car, a construction set composed of DIY industrial machines, and even an open-sourced GSM infrastructure, hardware and software. The last one is impressive when you know that only a handful of proprietary implementations exist in the world, and that these open solutions were in production, powering the network of a Pacific island for over a year as an experiment. So? Where will it stop? Is there anything that cannot be done open-source, leveraging crowdsourcing effects? The folks of the CCC, on the front line of the hacking movement since its inception, don't seem to think so. The open-source GSM infrastructure began with a talk at CCC'08 exploring the idea. Nowadays, each CCC camp deploys its own GSM infrastructure.

To prove to the world that hackers do not fear challenges, the goal of this year's venue is to... go to the Moon! The most astonishing part is that they actually have good chances of success. I'm confident that they will be able to send satellites soon (they have already sent payloads to near space). Then, they'll send rockets to the Moon (e.g. to deploy a communication infrastructure out of the censors' reach). They may even manage to send a hacker to the Moon within the 23-year time frame announced. This is particularly amusing at a time when both states and corporations seem to be giving up on that goal. Could hackers actually succeed where regular organizations fail? I happen to think so :)