Tacit Knowledge and the Decline of the North American Century
Never leave public policy in the hands of a banker or an accountant. They will end up 'trimming the fat' so much that, if the engine of state were a person or a car, their cost-cutting would leave it without legs or an engine, respectively. Their analysis tends to consider "cost" at the expense of an analysis of "function".
The recent news that 52,000 persons were left unemployed in the US economy is certainly a worrisome sign of dynamics that have been under way for a while, but it is perhaps not the gravest. Far more serious over the long run than the outsourcing of routine computing work to India, the exporting of manufacturing to China, and other cost-cutting measures is the decline of 'tacit knowledge' within the North American environment--that knowledge which is acquired only through hands-on experience, through direct and personal interaction with a material or process. Such interaction leads to a deeper acquaintance with internal workings and, eventually, to the financially important innovations that could never have been foreseen beforehand.
The suggestion can be debated: lower-class car racing culture, Chinese clock tinkering during the colonial period, or the fall of the 'lone inventor' at the turn of the twentieth century (vis-à-vis the rise of industrial labs) might be offered as counterexamples showing that "tinkering" leads only to entertainment rather than to enhanced productivity. Yet this dynamic contributed to continuous innovation in the computer and information (ICT) industries, the central engine of recent US economic history. As a Forbes article recently noted of the 'eternally innovative' Steve Jobs, innovation basically consists in putting together different pieces to create something new that did not exist before. (Before the return of Jobs, Gil Amelio's typical cost-cutting measures were driving the company toward financial collapse.) While this is an obvious comment, we might note that, by definition, if the existing information environment reduces the number of pieces available to future innovators, innovations will eventually 'dry up'; ignorance of the pieces that would have been put together to produce "inventions" (products which are more than the sum of their parts) naturally prevents potential future innovations from emerging. Manuel Castells, for different reasons but to similar effect, argues that the internet and other ICT components did not emerge in the former Soviet Union given the relative absence of 'liberty'.
Over the long run, the ability to "tinker" is key to the acquisition of tacit knowledge and, ultimately, to economic development. This is an important idea that should not be casually dismissed. Few are aware, for example, of how many important scientists became "important" through the mere act of tinkering. The interrelations and interactions between science, technology, and economics run much deeper than one might imagine.
To take a somewhat clichéd example, Newton was a constant tinkerer when young, "spending his free time tinkering with kites, water wheels, sundials, and clocks". Einstein's father, who owned a company that manufactured electrical equipment, gave him a pocket compass. "As he grew, Einstein built models and mechanical devices for fun, and began to show a talent for mathematics." In the United States, Richard Feynman "tinkered with crystal radio sets" in his youth--an experience that stimulated his entry into science and is memorably recorded in his autobiography "Surely You're Joking, Mr. Feynman!". I have known very talented engineers who, recollecting their youth, noted how fixing ordinary machines (lawnmowers, say) with their fathers not only established important father-son bonds, but also profoundly shaped their mechanical world-views. Tinkering ultimately contributes not only to the formation of scientists, but to new scientific ideas as well.
One might suggest that the retirement of Bill Gates, and what appears to be the likely demise of Microsoft and the fallible Vista OS, also marks the end of an era in which the computer had relatively open structures. However problematic that era might have been during its twenty-year run--and it was extremely contentious, given Microsoft's use of mafia-like coercive tactics in the operating-system environment--the computer's hardware components were readily open, accessible, and manipulable: a paradigm established by the formation of the first IBM "PC", which had relied on Gates for the floppy driver and operating system, and on 'off-the-shelf' hardware components. (The Altair and the Apple II were earlier examples of the kind.) This open structure might have created many headaches for computer companies (and in some cases bankruptcy), not only because it made the market more dynamic and chaotic in nature (hence requiring investment to keep up with growth and innovation), but also because it stimulated the industry by giving users a great deal of direct control and power over machines that could easily be opened, dissected, and modified. It gave rise to that much-abused and maligned class of workers in the new economy: the geeks. The tacit knowledge acquired in their formation and development contributed to the open source world at the ephemeral level of software, and to the tremendously innovative ICT economic sector.
It is somewhat striking, and somewhat odd, that Steve Jobs appears to have been en route to 'destroying' the basic paradigms of this computing environment. The creation of sleek, aesthetic computers, beautiful and pleasing as they might be, has also led to the closing off of the "public sphere"--here defined as the computer's internal components--at both the superficial and the functional levels. The magic of aesthetic computers clashes with the functional practicality and learning environment of 'box computers'--machines which were horribly 'ugly' but which could be opened with simple and readily available tools (a screwdriver, etc.).
The recent, magical MacBook "Air" laptop created by Apple--the one Steve Jobs pulled from an envelope at a recent (2008) Expo--is in this sense perhaps the perfect example of the overall trend. The processor cannot be readily purchased online as one would any other processor, but is instead built directly into the computer's board. Its very sleekness prevents anyone from 'peeking' into its innards, or at least makes doing so difficult enough to effectively prohibit it in actual practice--a trend which has been observable for some time in the laptop industry, and which is particularly true of the iPods as well. The iPod (iPod Touch), well on its way to becoming the laptop of the future, is even harder to 'pierce through'. The more aesthetically pleasing the computer, the less the individual can learn from it; computing becomes less of a 'total learning experience', and user interaction operates exclusively via the screen rather than via the 'background' environment (i.e., the boxed computer). What long-term social effect these trends will have is hard to tell; much depends on the degree to which the new 'sleek' computers displace the older 'boxed' versions--as well as on the reaction of the open source software programming community.
The emergence of OS X has, quite sadly, also constituted a 'closing of the commons'. Programming that was arduously developed by countless anonymous programmers--free to be opened and modified 'at will'--has gradually been closed off from the public domain, amounting to what might be called the 'privatization' of the intellectual realm of software. Although OS X has certainly improved as an operating system, key functionalities that previously existed have gradually been trimmed away, reducing the general user's ability to manipulate the operating system and (inversely) enhancing the company's final control over its product. Its sheer complexity, by itself, drastically increased the learning curve of its relative 'mastery'--particularly for newcomers who lacked the experience of simpler operating systems. While it might be pointed out that the openness of older hardware-software environments led to varying levels of financial losses for companies, those environments constituted an overall net gain for society by creating rich learning environments. The stereotypical geek, as human capital, was their most precious return on investment (ROI).
It is somewhat ironic, and indicative of the trend mentioned above, that while Apple has added to its Macintosh computers the ability to run other operating systems--as with the adoption of the Intel processor, which allowed Macintosh computers to run Windows or Vista 'natively' through virtualization (rather than emulation)--it has at the same time been gradually eliminating the user's ability to run previous versions of its own operating system (such as OS 9, the 'Classic' environment). Classic had been around for a very long time and, despite its serious limitations, had created a rich software environment that could be readily opened and tinkered with. Given the speed of the new hardware, there is no reason why this older software could not be emulated on the new computers; obviously, it is not in Apple's corporate interest to do so, and hence it has not been promoted.
With the continued fall in the cost of RAM, and the possible elimination of the hard drive (or its transformation into drastically faster solid-state storage), corporations will likely push a new paradigm that restricts direct user interaction with his or her 'work environment' (i.e., the computer) even further, limiting access to the internal components of the machine, be they software or hardware. Some have recently suggested an eventual return to an older computing environment, akin to mainframe computing, in which the computer acts as a 'dumb' terminal--a model which would essentially overturn the present computing paradigm altogether by 'killing the PC' as we know it today. While this might come as a net gain for Apple and other companies that seek greater control over the user-computer interaction--enhancing their profitability along with their continued stock market rise--such a fundamental change of paradigm will ultimately undermine the most valuable commodity of all during the period: human capital formation.
One might suggest that the closing of the computer commons will eventually lead to a drastic decline of tacit knowledge in the general user community, and ultimately gravely undermine North America's computing "hegemony". It is a trend that is perhaps indicative of broader changes in the US industrial economy, and which suggests that the 'recession', over the long term, will be not just a downward segment of the business cycle but an altogether drastic shift in the 'average center' about which all points of the business cycle move.