A few weeks ago I started reading Richard Florida's book The Rise of the Creative Class (game-changing? profound? controversial?). In the early part of the book he spends a lot of time tracing how advances in science and technology fueled changes in the way businesses were run and in the kinds of work people were doing, which in turn changed the layout of cities, quality of life, home and entertainment, trade, and society in general.
For example, he talks about enterprising gentlemen who would buy large quantities of raw materials and ship them out to various craftsmen for processing; these craftsmen had no direct interaction with the end users of their products. This process was called "factoring". Later, instead of working with a network of skilled labor spread over a wide geographical area, these men consolidated their talent under one roof and created the "factory". We all know what happened next: big cities got bigger as more people moved off the land for these new kinds of jobs, and the countries and cities with factories were transformed in a very real way.
Web 2.0
Something very similar has been happening over the past four years with the way we write and deploy web applications. Tim O'Reilly noticed the change and popularized the phrase "Web 2.0" to give us a rallying point for the kinds of changes taking place. The Wikipedia entry for Web 2.0 mostly catalogs the kinds of applications and content that exist on the web; next it covers the ways applications interact with people, and finally the ways applications interact with each other. There is very little about how those applications are actually built and deployed.
Still, communications protocols, application categories, fancy business models, and altruistic ethics don't explain how we got here. Out of the entire article, the only line that addresses the manufacturing process, in this case computer programming, barely scratches the surface:
Web 2.0 technologies tend to foster innovation in the assembly of systems and sites composed by pulling together features from distributed, independent developers. (This could be seen as a kind of "open source" or possibly "Agile" development process, consistent with an end to the traditional software adoption cycle, typified by the so-called "perpetual beta".)
If you read The Rise of the Creative Class, you see that agile processes and perpetual beta are not new tools in the hands of enlightened managers. Florida shares stories from the optics factory where his father worked, where most of the managers on the shop floor had been promoted from labor positions. He argues that one of the major things keeping the factory running smoothly was the managers' faith in the experience of the laborers: if a worker had an idea for speeding up production, or improved a process or design, the managers trusted that he knew what he was doing and that his voice should be heard. Once the factory started hiring MBAs and engineers, management stopped listening to what was happening on the floor, and productivity (and innovation) dropped almost to zero.
Research & Development
At the same time, a new class of mega-corporation was emerging, and with mega-corporations came mega research and development budgets. The Bell Labs and Xerox PARCs of the world ruled the roost, and tons of new ideas were hatched; some succeeded, most failed. At some point the mega-corporations stopped spending so much on R&D, and the venture capitalists entered the scene. Instead of large corporations innovating in-house, entrepreneurs and tinkerers created startup companies, and the innovation was externalized. Big corporations could now acquire the technologies that rose to the top of the pickle barrel, inventors included. The VCs get a return on their investment, and the big companies only spend money on projects with a healthy track record.
Back to Bell Labs, though. There are (at least) two incredibly important inventions that need mentioning before we move on. In 1969, Ken Thompson, Dennis Ritchie, and their colleagues at Bell Labs developed an operating system called UNIX. A few years later, to make the code portable to other chipsets and machines, Ritchie developed the C language; UNIX was no longer maintained purely in assembly code, but could now be compiled to run on different architectures.
Right there we have the birth of two technologies that underpin everything that makes Web 2.0 possible. UNIX and C spread through the universities, which were a very important part of the research and development ecosystem and remain to this day a place where massive numbers of patents are filed and startup companies are born. By the 1990s, UNIX and C were standardized, you could run them just about anywhere, and there were open source (freely licensed) versions of both.
The Building Blocks of the Web
Show me one piece of Web 2.0 that is not based in some way on C or UNIX (with the possible exception of Microsoft .NET, which itself sits on top of C and C++). Nearly every web server runs a UNIX or Linux operating system, and the programming languages fueling the web sites and services we know and love are all implemented in C. Take Perl (created by Larry Wall), Python (developed by Guido van Rossum at the CWI research institute in the Netherlands), Ruby (developed by Yukihiro Matsumoto, the head of an R&D department in Japan), and PHP (created by Rasmus Lerdorf).
One thing all of these tools share is that they let the programmer work at a higher level. That was the original promise of C: to work at a level above assembly and machine code. The interpreted languages above go further, removing memory management and a lot of the other noodly semantics from the development process.
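To make that concrete, here's a minimal sketch, in Python and purely illustrative, of a task that would take explicit buffers, pointers, and malloc/free bookkeeping in C:

```python
# Counting word frequencies at a "higher level": no manual memory
# management, no fixed-size buffers. The interpreter (itself written
# in C) allocates and frees memory behind the scenes.
from collections import Counter

def word_counts(text):
    """Return a mapping of each word to how often it appears."""
    return Counter(text.lower().split())

print(word_counts("the quick brown fox jumps over the lazy dog"))
# Counter({'the': 2, 'quick': 1, 'brown': 1, ...})
```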
One thing you'll hear over and over is that all of the technologies used to build Web 2.0 existed before the 2.0 era, and that's true. If you've read Florida's book, you'll remember he points out that fancy machines are not what made factories successful; it was the innovation of the people running those machines, whether the machines were simple or advanced. It was the changes in workflow, letting workers focus on a much smaller part of the manufacturing process, that created specialists, who then innovated on the process further.
It's not as if nobody was developing great ideas on the web before it went 2.0; on the contrary. One of the most successful features of Amazon.com is its user-generated product reviews, a feature that has existed almost since the service launched.
Old Tools + New Process = Explosive Invention
OK, so we've got the same tools. One thing that is different this time around is how freely available they all are, thanks to the open source movement. With software, the smallest innovation can be shared with millions of people through the channels of Free, Libre and Open Source Software (FLOSS). Because of the copyleft provision in licenses like the GPL, when you modify GPL software and want to distribute it, you must license your distributed code under the GPL as well, and everyone can benefit. Not everyone in the world is cool with redistributing their work, but it's nearly impossible to avoid the touch of open source these days.
The web community has also realized that much of the code a programmer writes for one website is duplicated in the next, and that everyone is solving the same kinds of problems over and over. This realization has given rise to several web-specific programming frameworks; a sketch of the core idea follows. Each of the languages mentioned above has at least one framework available: Catalyst (Perl), Django (Python), Ruby on Rails (Ruby), and Zend Framework (PHP).
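Here is a toy Python sketch of the URL routing that every one of those frameworks provides out of the box. It is not the API of any real framework, just an illustration of the duplicated plumbing that sites used to re-implement by hand:

```python
# A hypothetical, minimal router: map URL paths to handler functions,
# the problem every web framework solves so you don't have to.
ROUTES = {}

def route(path):
    """Register a handler function for a URL path."""
    def register(handler):
        ROUTES[path] = handler
        return handler
    return register

@route("/articles")
def list_articles():
    return "200 OK: here are the articles"

def dispatch(path):
    """Find the handler for a path, or fall back to a 404."""
    handler = ROUTES.get(path)
    return handler() if handler else "404 Not Found"

print(dispatch("/articles"))   # 200 OK: here are the articles
print(dispatch("/nonsense"))   # 404 Not Found
```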
The Calculus
In our quest to understand the world, humans have developed a number of ubiquitous methods of communication; arguably the most universal are numbers and mathematical symbols, like those used in calculus. The word calculus actually comes from the Latin word for the small stones once used for counting and calculation. The human race is always in need of tools to communicate, and mathematics achieves that communication with a shared vocabulary of theories, laws, and symbols that are understood instantly and universally.
Mathematics is called "the language of science", and I can think of few places where that is more true than computer science and relational databases. A large part of what low-level languages and operating systems do is simply allocating and recalling bits of data from memory, and performing simple operations or instructions on that data: an incredible feat that can now take place over a thousand trillion times per second in modern supercomputers.
Think of programming frameworks, and the design patterns on which they are based, as the shared language (the calculus and theories) of modern computer science. Instead of lengthy explanations of abstract concepts, experienced programmers can speed up their communication with the vocabulary these frameworks provide. The growing popularity of practices such as Don't Repeat Yourself (DRY) and Convention over Configuration has also helped frameworks gain widespread adoption and taken them out of the realm of trends and fads: programmers can focus on the parts of their job specific to the problem at hand instead of re-inventing the wheel.
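A quick, hypothetical Python sketch of Convention over Configuration: instead of making the programmer declare a database table name for every model, derive it from the class name by convention, the way Rails and Django each do in their own fashion. The class and helper here are invented for illustration:

```python
def table_name(model_class):
    """By convention: class name, lowercased, naively pluralized."""
    return model_class.__name__.lower() + "s"

class Article:
    pass

print(table_name(Article))  # "articles" -- no configuration required
```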
Frameworks let programmers tackle 80% of their problems in 20% of the time, which shortens the gap between invention and realization. Because these frameworks are open source and widely available, communities have sprung up around technology-focused user groups, and hundreds of new Ruby on Rails programmers (i.e. "Web 2.0" developers) enter the ecosystem all the time. Some cities have a higher concentration of these developers than others, and those cities also tend to have more entrepreneurs and inventors, especially the cities that score high on Richard Florida's creativity index.
R&D sans University
What am I getting at? Research and development, particularly on the web, is no longer fueled by big corporations or universities, even though the tools enabling the current advances were all developed inside those "R&D 1.0" institutions. In the past, you needed a big government contract or a CEO who was liberal with his research budget to get the capital to develop new products. These days we've got tons of self-starters: folks going months without pay, working nights and weekends, getting funding from mom and dad or other angel investors, and roping their friends and relations into some crazy scheme... and it's all happening on the Internet.
Then Yahoo, Google, AOL, or Microsoft will come along and snatch up your company and make you a millionaire... that's the dream, at least.
Try to think of the last startup you heard of that was born in a university research lab... Google and Ask (Teoma) came out of universities, but those were founded almost ten years ago. This is the one part of this post I haven't been able to research properly, so I hope people will leave some helpful info in the comments; I'm sure I'll post about this subject again in the future.
Our Opportunity
So here we are, the resourceful, well-educated youth of America. What is stopping us from developing new products, starting our own IPTV station, running for government office, or just releasing some open source projects? Nothing. Find some inspiration, experience the world, travel, meet new and interesting people, try new things, eat food that scares you, speak in front of large groups, run a BarCamp, make some friends, get involved with your community, learn a new skill, and above all, invent something.
A huge thanks is in order to the fine editors and contributors who make Wikipedia such a great resource. You guys rock.