
Technology and Pleasure:


by Gisle Hannemyr

"Hackers" are identified as a specific subgroup of computer workers. The history of the hacker com­munity is told. The explicit and implicit ideologies expressed through hacking is analyzed and presented. Computer artifacts of origin both inside and outside the hacker community are compared and the embedded properties of the resulting artifacts are inferred. Hacking is discussed in the context of being a method for system development. Finally, it is argued that this system development method under certain circumstances may yield superior software artifacts.



Whenever computer centers have become established, that is to say, in countless places in the United States, as well as in virtually all other industrial regions of the world, bright young men of disheveled appearance, often with sunken glowing eyes, can be seen sitting at computer consoles, their arms tensed and waiting to fire their fingers, already poised to strike, at the buttons and keys on which their attention seems to be as riveted as a gambler's on the rolling dice. When not so transfixed, they often sit at tables strewn with computer printouts over which they pore like possessed students of a cabalistic text. They work until they nearly drop, twenty, thirty hours at a time. Their food, if they arrange it, is brought to them: coffee, Cokes, sandwiches. If possible, they sleep on cots near the computer. But only for a few hours – then back to the console or the printouts. Their rumpled clothes, their unwashed and unshaven faces, and their uncombed hair all testify that they are oblivious to their bodies and to the world in which they move. They exist, at least when so engaged, only through and for the computers. These are computer bums, compulsive programmers. They are an international phenomenon. (Weizenbaum 1976)

The hacker is a person outside the system who is never excluded by its rules. (Turkle 1984, p. 227)

“Hackers” are computer aficionados who break in to corporate and government computer systems using their home computer and a telephone modem. (Meyer 1989)

The popular image of the computer hacker seems to be part compulsive programmer preferring the company of computers to people, and part criminal mastermind using his or her technical prowess to perpetrate anti-social acts. But this is at best only half the story. A number of people I know who are proud to be called “hackers” are sensitive, sharing, social and honest.

Part of the confusion surrounding the word “hacker” stems from the fact that it has been applied to at least three distinct communities.

The “original” hackers were computer professionals who, in the mid-sixties, adopted the word “hack” as a synonym for computer work, and particularly for computer work executed with a certain level of craftsmanship. They subsequently started to apply the noun “hacker” to particularly skilled computer workers who took pride in their work and found joy in doing so.

Then in the seventies, assorted techno-hippies emerged as the computerized faction of the counterculture of the day. These were grassroots activists who believed that technology was power, and as computers were the supreme manifestation of the power of technology, they believed that computers should be put into the hands of the people. While these activists did not speak of themselves as hackers or identify closely with the master programmers that populated the first wave, the term was thrust upon them in 1984 when they first were celebrated by the publication of Steven Levy's landmark Hackers: Heroes of the Computer Revolution (Levy 1984), and then again by the first Hacker's Conference hosted by the Point Foundation and the editors of the Whole Earth Review. What characterized the second wave hackers was that they desperately wanted computers and computer systems designed to be useful and accessible to citizens, and in the process they pioneered public access terminals, computer conferencing, and personal computers.

Finally, in the second half of the eighties the so-called computer underground emerged, appropriated the terms “hacker” and “hacking” and partly changed their meaning. To the computer underground, “to hack” meant to break into or sabotage a computer system, and a “hacker” was the perpetrator of such activities.

Popular media's fascination with things subversive and spectacular has long since ensured that it is the latter rather than the former definition that reigns supreme. However, the strong association between the word “hacker” and the “computer criminal” has the unfortunate side effect of hiding the “other side” of hacking, the side that involves skilled craftsmen who believe that a computer is more than a means of production – it is, among many other things, an instrument for creation, communication, mastery, artistic expression and political empowerment.

At the outset, however, it should be noted that the three hacking communities are not completely disjoint. The hacker of the sixties was not beyond appreciating lock-picking skills, both those addressing physical locks barring access to computer rooms and those addressing software protection schemes such as password files and encryption, and he also believed that information was born to be free – including the source code he had written and the knowledge he had about the inner workings of various systems. In 1990, when the Electronic Frontier Foundation was set up as a response to Operation Sun Devil (a US Secret Service raid on the computer underground), funding was provided by John Gilmore (of Sun Microsystems), Mitch Kapor (co-creator of Lotus 1-2-3), Steve Wozniak (co-founder of Apple Computer) and other well-to-do second wave hackers. As far as politics go: today's generation-x hackers share with their artisan and activist hacker predecessors a distrust of authority, and a tendency to position themselves outside bourgeois society's norms and values.

Some commentators (Lafayette 1993, Rosteck 1994) consider hackers (of the anarchist variety) to be radical partisans, in much the same manner as the Russian nihilists of the 19th century were considered part of the radical political movement of their time. Others (Kelly 1994) have attempted to co-opt hackers as the avant-garde of neo-laissez-faire economic liberalism.

In this essay, I shall try to put some perspective on these two claims. My main purpose, however, is to instigate a discussion on hacking as a valid method for developing software for information systems.

Tinker, Taylor, Scholar, Hacker

My belief in [a new era of respect for avocations and a future with more active engagement in making, doing, and expressing] comes from watching computer hackers, both young and old. Their programs are like paintings: they have aesthetic qualities and are shown and discussed in terms of their meaning from many perspectives. Their programs include behavior and style that reflect their makers. These people are the forerunners of the new expressionists. (Negroponte 1994)

In the 1950s, people working with computers had much in common with artists, artisans and craftsmen. There was room for creativity and independence. Management methods of control were not yet developed. There was no clear division of labor. Skilled programmers, like all good craftsmen, had intimate knowledge and understanding of the systems they worked with. A humorous account of the state of affairs in those early days is rendered in Ed Nather's The Story of Mel.

This did not last. By the mid-sixties, management wanted to bring computer work in line with other industrial activities, which essentially meant that they wanted programming to be part of a managed and controlled process.

To accomplish this, they turned to a more than fifty-year-old fad called “Scientific Management” (Taylor 1911). Scientific Management was invented by the engineer Frederick Winslow Taylor, and aimed at taking away from workers the control of the actual mode of execution of every work activity, from the simplest to the most complicated. Taylor's argument was that only by doing this could management gain the desired control over productivity and quality.

The methods advocated by Taylor were to increase standardization and specialization of work. In the computer field, this spelled, among other things, the introduction of programming standards, code reviews, structured walkthroughs and miscellaneous programming productivity metrics.

The most profound effect of the application of Taylorist principles to computer work was the introduction of a detailed division of labor in the field. Computer workers found themselves stratified into a strict hierarchy where a “system analyst” was to head a software development team consisting of, in decreasing order of status and seniority, “programmers”, “coders”, “testers” and “maintainers”. Below these on the ladder were a number of new adjunct positions created to serve the software development team: “computer console operators”, “computer room technicians”, “key punch operators”, “tape jockeys” and “stock room attendants”. Putting the different grades of workers in different locations further enforced the division of labor. Most corporations in the sixties and seventies hid their mainframes in locked computer rooms, to which programmers had no access. This isolated programmers from technicians, diminishing their social interaction and cutting off the opportunity for the exchange of ideas. It also prevented programmers from learning very much about the workings of the machines they programmed.

As noted in (Braverman 1974) and (Greenbaum 1976), at the core of this process was dequalification of computer work, the destruction of programming as a craft, and the disintegration of working communities of programmers – all in order to give management more control over computer workers.

The emergence of hackers as an identifiable group coincides closely in time with the introduction of various Taylorist methods in software development. Many of the most skilled programmers resented what was happening to their trade. One of the things that characterized the early hackers was their almost wholesale rejection of Taylorist principles and practices, and their continued insistence that computer work was an art and a craft, and that quality and excellence in computer work had to be rooted in artistic expression and craftsmanship, not in regulations. So, long before the proponents of sociotechnique and the “Scandinavian School” of system development questioned the Taylorist roots of modern software development methods (Budde et al 1992), hackers voted against it with their feet – by migrating to communities where a non-Taylorist stance vis-à-vis computer work was tolerated.

Hacker lore abounds with horror stories about earnest hackers who, due to some misfortune or just some stupid misunderstanding, suddenly find themselves caught in the Taylorist web of some major corporation. The high point of these stories is often the exposure of some Taylorist principle to scorn and ridicule, as corporate stupidity defeats itself and Taylorist productivity measures (such as line counting) prove to be easily subverted. Thus, in the folklore, the hacker emerges triumphant, as the moral as well as actual victor of the skirmish. Many of these stories have since found their way into Scott Adams's comic strip Dilbert, partly based upon Adams's own experiences as “corporate victim assigned to cubicle 4S700R at the headquarters of Pacific Bell” and partly on “true life” email submissions from computer workers out in the field (Adams 1996).

In real life, things are not always so amusing, and sometimes real anger surfaces when hackers voice their feelings about the destruction of their craft – as in this message, posted to an Internet mailing list known as unix-wizards in February 1989:

Programming standards and code review committees attract all the jerks trying to angle their way from the ranks of us hackers into the Vice-Presidency of the Division. While these characters are deceiving themselves into believing they have a career path, they cause everyone else a good deal of trouble. […] Structured Programming is often the buzzword for an attempt to routinize and deskill programming work to reinforce the control of hierarchy over the programming process – separate from and sometimes different from, improving quality.

Dahlbom and Mathiassen (1997) – building on Lévi-Strauss – introduce the terms “tinkering” and “engineering” and discuss briefly how these two approaches relate to software development. They argue that engineers are “modern” and tinkerers “illiterate”, that engineers work “top down” while tinkerers work “bottom up”, and so on. My first reading of Dahlbom and Mathiassen's text left me with the impression that hackers were the “tinkerers” of their terminology, while the “engineers” were those using all the “scientific” and professional methods. Then the following paragraph hit me like a sledgehammer:

Modern societies have engineers, illiterate societies have bricoleurs or tinkerers. As engineers, we organize our thinking in projects, choosing means and tools once the aim of the project has been decided. As tinkerers, we use what we have, letting our means and tools determine what we do. As engineers, we set our goals first, often having to invent tools to be able to reach them. (ibid. p. 173)

Suddenly I understood what the anger surfacing in unix-wizards was all about! The hacker had been forced by his programming-illiterate boss into using some tools and methods that he considered unsuitable or inadequate for the task at hand. The hacker wanted to work as a professional, as an “engineer”, and management had forced him to become a “tinkerer”.

The Hacker Community

I think one general rule of software design is that you should be writing a program that you want to use. Ken and Dennis wanted to use Unix. They did what they needed in order to make it work. We wanted to use sendmail, it wasn't something where we said: “Oh, let's write a mailer and send it out…” Bill Joy didn't come to me and say “Oh, Eric, what we need is this.” We had a problem that needed to be solved. Ken had a problem of sorts: he didn't have an adequate system to do space games so he wrote one. Compare this to X.400, where I'm convinced that people who never actually use mail simply write papers about it. Other proprietary OSes, too, because you assign people to do the jobs. (Eric Allman, quoted in Salus 1997)

As Taylorist-inspired software development methods descended upon the corporate world, the hackers entrenched themselves at the campuses of the large technical universities of the US (in particular MIT, Stanford and Carnegie-Mellon), where their non-conformism was tolerated and their special skills appreciated.

Like other outsider communities [1], the hackers developed a strong skepticism toward the products of mainstream society, and they often preferred to develop their own programming tools and computer systems rather than rely on commercial solutions. For example, MIT hackers developed their own operating system (ITS – the Incompatible Timesharing System [2]) for their DEC PDP-6 and PDP-10 computers, which pioneered advanced concepts such as networking, file-sharing between machines and terminal-independent I/O. The hacking community also tried to develop its own hardware. The MIT Lisp machine, the Stanford University Network (SUN) workstation and the Carnegie-Mellon SPICE computer are examples of efforts in this direction. In the Bay Area, community efforts such as the Homebrew Computer Club designed, built and wrote the software for what became better known as personal computers.

Even today, this tradition continues. The hacker's operating system of choice is Linux, a free [3] Unix-compatible operating system developed as a community effort headed by Linus Torvalds at the University of Helsinki. And most hackers prefer the tools and utilities developed (again as a communal effort) by the GNU [4] project at the Free Software Foundation (in Cambridge, Massachusetts) to their commercial counterparts.

The reason the hackers preferred writing their own software and constructing their own machines was not just rooted in a frontiersman belief in self-reliance. By rejecting the fragmented and controlled world of corporate employment, the hacker community developed its own distinct perspective on computing and computers. This perspective favored open systems, integrated solutions and distributed resources. Whether this perspective emerged from the hackers' work style, or vice versa, is impossible to tell, but the early hackers rapidly adopted a community-oriented style of working. They preferred to work as intimately as possible with technology, and as interactively as possible with each other. This meant that they wanted direct access to the computer, and they also wanted to share files, look at each other's screen images, review and re-use each other's source code, co-author code libraries, and so on. The commercial machines and operating systems of that era, with their batch job submission systems, operator priesthoods and rings of protection, were not suitable for this, and the hackers proceeded to construct the tools that they felt they needed to support their style of working.

The next step in establishing a hacking community was the ARPAnet, funded by the Advanced Research Projects Agency (ARPA) of the US Department of Defense (DoD). The main objective of the ARPAnet program was to link computers at scientific laboratories so that researchers could share computer resources (Hafner and Lyon 1996).

When the ARPAnet emerged, it sported an unorthodox architecture that made it radically different from existing communication infrastructures such as the long-distance telephone network. Instead of establishing a (virtual) circuit between two end points and using this circuit to carry communication, messages were routed through the network by splitting them into small, independent and totally self-contained “packages” which were then left to find their own way to their destination, where they were re-assembled into complete messages again before being delivered [5]. This solution has a number of interesting implications. It means that the entire network is distributed and self-similar – there is no “center” and therefore no means by which an authority can assume “central control”. With a certain degree of redundancy, this also makes the network very robust against failure. If a portion of the Net breaks or is blocked for other reasons, the Net will automatically route packages around it (hence the hacker proverb, originally coined by John Gilmore: “The Net interprets censorship as damage, and routes around it.”)
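The packet-routing idea described above can be sketched in a few lines of Python. This is a toy illustration only – the function names and the dictionary format are invented here, and real ARPAnet packets (like modern IP datagrams) are of course far more elaborate – but it shows the essential point: each “package” is self-contained, so the network may deliver them in any order and the receiver can still reconstruct the message.

```python
import random

def split_into_packets(message, dest, size=8):
    """Split a message into small, self-contained packets, each carrying
    its destination, a sequence number, and the total packet count."""
    chunks = [message[i:i + size] for i in range(0, len(message), size)]
    return [{"dest": dest, "seq": n, "total": len(chunks), "data": chunk}
            for n, chunk in enumerate(chunks)]

def reassemble(packets):
    """Reassemble the original message from packets arriving in any order."""
    packets = sorted(packets, key=lambda p: p["seq"])
    assert len(packets) == packets[0]["total"], "a packet was lost in transit"
    return "".join(p["data"] for p in packets)

packets = split_into_packets("The Net interprets censorship as damage.", "mit-ai")
random.shuffle(packets)   # packets may take different routes and arrive out of order
print(reassemble(packets))
```

Because no packet depends on any other while in transit, there is no circuit to sever and no center to control – the robustness the text describes falls directly out of this design.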

Another characteristic of the ARPAnet was the ease with which new services could be added. The network provides the infrastructure to transport messages between any two points connected to the network. To create a new service, one designs a new set of messages and defines the semantics of those messages by writing one or more computer programs that understand and are able to act upon them. Now, anyone connected to the network with the new programs installed on their computer can take advantage of the new service. As the Net itself provides a most eminent infrastructure for disseminating computer programs (which are just one special type of message), it was easy for all interested parties to bootstrap into new services as they became available.
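This point can be made concrete with a small sketch (all names here are invented for illustration): the “network” below does nothing but hand messages to programs that understand them, so adding a new service means defining a new message type and installing a handler for it – the transport itself never changes.

```python
# Registry mapping message types to the programs that understand them.
handlers = {}

def service(msg_type):
    """Register a program as the handler for one kind of message."""
    def register(fn):
        handlers[msg_type] = fn
        return fn
    return register

def deliver(message):
    """The network's only job: hand each message to whoever understands it."""
    return handlers[message["type"]](message)

@service("echo")
def echo(message):
    return message["payload"]

# A second service requires no change to deliver() or to the network:
@service("shout")
def shout(message):
    return message["payload"].upper()

print(deliver({"type": "shout", "payload": "computer power to the people"}))
```

The same separation – a dumb transport plus programs that give messages their semantics – is what let email, file transfer, and eventually the World Wide Web be added to the Net without anyone's permission.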

Hackers were attracted to the ARPAnet project. Both the engineering challenges involved and the goal (to enable the sharing of resources, tools and technologies) must have held a strong appeal, as well as the distributed architecture and the flexibility and power to create new computer-based services.

Consequently, first wave hackers became some of the most vigorous members of the communities commissioned to develop the ARPAnet. This resulted in hacker sensibilities becoming ingrained in the Net, and – as the ARPAnet became the Internet – it became an intrinsic part of the hacker culture and the preferred place of residence of the first wave hacker community.

While the ARPAnet from the very beginning nurtured an unorthodox community, access to the ARPAnet was controlled. It was only available to affiliates of research institutions and universities with DoD funding. This made the ARPAnet off-limits to members of the grassroots movements that were the breeding ground of second wave hackers.

The second wave hackers therefore set out to create their own communication infrastructure. First they set up stand-alone Bulletin Board Systems (BBSs), which individuals could reach by dialing in with a modem. Then the boards were connected into homebrew global computer networks such as FIDOnet and PeaceNet. In 1985, Stewart Brand, the editor/publisher of The Whole Earth Catalog and Review, established the Whole Earth 'Lectronic Link (The Well), which rapidly emerged as one of the first and most significant of the online communities of the techno-counterculture of the mid-eighties (Rheingold 1993). The Well started out as a BBS-system, and did not hook up to the Internet (the open successor of the ARPAnet) until 1992. But because Brand had elected to build the Well on Unix technology, the effort had sufficient credibility among first wave hackers (who shunned MS-DOS-based BBS-systems) to make them participate in the community and help Brand build and run the system. The Well was therefore influential in bringing together hackers of the first and second waves. Brand (with Kevin Kelly, who later emerged as the executive editor of Wired) also organized one of the first “hacker” conferences in 1984, to which he invited prominent first and second wave hackers as well as assorted counterculture celebrities.

As the Net grew, it also helped the hacker community and culture spread beyond its core areas (the large US technical universities and the Bay Area computer counterculture), to become a worldwide phenomenon. Finally, the recent commercial success of the Internet has made hackers skilled in creating distributed applications an appreciated resource in Internet-savvy companies, where some of them proudly display their roots on their business cards.

[Image: jwz business card]

The Hacker Ethic

[Tim Berners-Lee] didn't patent the [World Wide Web]. He didn't copyright. He made that openly available. And that's what has fuelled a great deal of the network development, and all the innovative ideas. […] There is a continuing ethic in the community to give back to the network what it has given you. (Cerf 1997)

A major impetus for Steven Levy's work on hackers (Levy 1984) was his exploration of the hacker value and belief system, which Levy calls the “hacker ethic”. While Levy may be accused of romanticizing his subject, his book is nevertheless the best published study of this community so far.

Below, I've paraphrased Levy's “hacker ethic” into a set of imperatives that reflect the hacker mode of operation, followed by a set of position statements that reflect the hacker attitude.


Position statements

Another source for a reading of the hacker ethic is Richard M. Stallman's The GNU Manifesto, which outlines the rationale behind the GNU project and Stallman's own resignation from the MIT AI Lab:

I consider that the golden rule requires that if I like a program I must share it with other people who like it. Software sellers want to divide the users and conquer them, making each user agree not to share with others. I refuse to break solidarity with other users in this way. I cannot in good conscience sign a nondisclosure agreement or a software license agreement. For years I worked within the Artificial Intelligence Lab to resist such tendencies and other inhospitalities, but eventually they had gone too far: I could not remain in an institution where such things are done for me against my will.

So that I can continue to use computers without dishonor, I have decided to put together a sufficient body of free software so that I will be able to get along without any software that is not free. I have resigned from the AI Lab to deny MIT any legal excuse to prevent me from giving GNU away. (Stallman 1985)

Ten years earlier, the techno-hippies of the somewhat deceptively named People's Computer Company (it was a magazine, not a corporation), the Berkeley Community Memory project, the Portola Institute (later renamed Point Foundation) and Midpeninsula Free University tried to voice their feelings about technology in society. None were more vocal than Ted Nelson, who put out a self-published pamphlet, Computer Lib / Dream Machines (later re-published by Microsoft Press), which was more or less a call to arms:

I have an axe to grind. I want to see computers useful to individuals, and the sooner the better […] Anyone who agrees with these principles is on my side. And anyone who does not, is not. THIS BOOK IS FOR PERSONAL FREEDOM. AND AGAINST RESTRICTION AND COERCION. A chant you can take to the streets: COMPUTER POWER TO THE PEOPLE! DOWN WITH CYBERCRUD! [6] (Nelson 1974)

Later, Nelson sets down his ideals for designing computer artifacts, which include such objectives as:

The embodiment of Nelson's ideas is Xanadu – a distributed, interlinked hypertext system that Nelson has been working on since 1960. Yet, after nearly 40 years of gestation, Xanadu is still not widely deployed for public use. In fact, it is Tim Berners-Lee's World Wide Web that finally created the docuverse that Nelson envisioned. Xanadu advocates (Pam 1995) have argued that the World Wide Web is technologically inferior to the design proposed by the Xanadu project. That may very well be true, but Nelson's failure to get people to implement Xanadu may nevertheless serve to illustrate the hacker idiom that rhetoric is inferior to practice.

There is, however, another observation to be made from the failure of Xanadu when pitted against the World Wide Web.

Ted Nelson is almost the archetypal techno-hippie-cum-entrepreneur. His rhetoric champions such progressive ideals as democracy, justice, tolerance, self-fulfillment and social liberation, but his beliefs are clearly those of market capitalism. In a recent presentation of Project Xanadu, Nelson describes it thus:

[Xanadu] is a complete business system for electronic publishing based on this ideal with a win-win set of arrangements, contracts and software for the sale of copyrighted material in large and small amounts. It is a planned worldwide publishing network based on this business system. It is optimized for a point-and-click universe, where users jump from document to document, following links and buying small pieces as they go. (Pam 1997)

In fact, if you look beyond the rhetoric, there is very little in Project Xanadu that distinguishes it from other enterprises. The name “Xanadu”, the Xanadu software and the Xanadu group's servicemark, the Flaming-X symbol, are all copyrighted, trademarked and jealously defended by Nelson and his cohorts. And at the very core of the Xanadu system is an incredibly complex scheme for keeping track of ownership of, and extracting royalties for, intellectual property.

Not only did Tim Berners-Lee not bother to copyright or patent the World Wide Web, he made the source code available for others to experiment with and improve. While this gave us such pieces of misengineering as the blink tag, the open source policy of Tim Berners-Lee and subsequent developers of Web technology appears to me to be the distinguishing quality that made the World Wide Web succeed where the plethora of competing hypertext schemes (of which Xanadu probably was the first, best designed and most functional) failed.

The Virtual Class

Recently, with all the glitz and glamour surrounding high technology and the Internet, hackers have found themselves co-opted by technophiles, neo-classical economic liberalists and purveyors of technological hyperbole. Suddenly, hackers were hailed as the avant-garde of the computer revolution.

At the forefront of this campaign has been Wired, whose pages – crammed with psychedelic graphics and barely readable typefaces – regularly describe hackers (of absolutely all kinds, from digital vandals to ace programmers) as larger-than-life heroes. But Wired is not a revolutionary publication. The magazine's first managing editor, John Battelle (as quoted in Leonard 1994), made no secret of the fact that Wired does not nurture much faith in the hacker ethic:

People are going to have to realize that the Net is another medium, and it has to be sponsored commercially and it has to play by the rules of the marketplace. You're still going to have sponsorship, advertising, the rules of the game, because it's just necessary to make commerce work. I think that a lot of what some of the original Net god-utopians were thinking is that there was just going to be this sort of huge anarchist, utopian, bliss medium, where there are no rules and everything is just sort of open. That's a great thought, but it's not going to work. And when the Time Warners get on the Net in a hard fashion it's going to be the people who first create the commerce and the environment, like Wired, that will be the market leaders.

As pointed out by Winner (1995), Wired's political platform is a mixture of social Darwinism, laissez-faire capitalism and technological determinism, combined with an admiration for self-indulgence, profit-seeking and boundless egos. The magazine's political allegiances are also evident through its ties with Newt Gingrich's conservative think tank, the Progress and Freedom Foundation. For the post-modern techno-yuppies who use their ability to innovate and create original products to free themselves from the mores of regular employment and the corporate world, and to gain considerable autonomy over their pace of work and place of employment, Wired has rapidly become the most authoritative guide to vogue and vocabulary, attitude and artifacts.

Dubbed “symbolic analysts” by Robert Reich (1991) and “the virtual class” by Kroker and Weinstein (1994), these individuals may at first glance be indistinguishable from the equally non-conformist, technology-loving artisans of the hacker community. The ideological differences between techno-yuppies and hackers are, however, pronounced.

The techno-yuppies seem to share a fundamental belief in both the glory and the deterministic nature of computer technology. Technology is going to produce social changes “so profound their only parallel is probably the discovery of fire” proclaims Wired publisher Louis Rossetto (1993) in the first issue of the magazine. “Like a force of nature, the digital age cannot be denied or stopped” chimes senior columnist Nicholas Negroponte (1995b), it will “flatten organizations, globalize society, decentralize control, and help harmonize people” (Negroponte 1995a).

In contrast, the hackers' fascination with technology does not stem from a belief that technology will bring about great and revolutionary changes (or make any societal difference whatsoever). Hackers love technology for its own sake. But hackers believe that technology is too good a thing to be proprietary. Therefore, hackers pay considerable attention to the problem of how to make technology available to the public at zero or very little cost. In The GNU Manifesto, Stallman (1985) proposes a number of alternative solutions to this problem. One of them is to have the government impose a “software tax” on the sale of all computers. Revenues generated by this means would be used to fund future software development, with the results made freely available to all citizens.

While the techno-yuppies usually portray the Internet as a laissez-faire utopia created out of thin air, the hackers who actually provided the skill and labor for its gestation are well aware that the construction and initial running costs of the Internet were funded with public money, first through the Advanced Research Projects Agency (ARPA) and later through the National Science Foundation (NSF).

Sometimes the difference in outlook between the techno-yuppies and the hacker community becomes manifest, as when Stewart Brand – organizer of the first hacker conference – was invited in 1995 to the annual meeting of the conservative Progress and Freedom Foundation:

Brand patiently waited out countless denigrations of government, relentless rhetorical slander, until at last he broke out: “What about the G.I. Bill, which paid its way in four years and has been a pure profit ever since? [What about] ARPA and computers, ARPA and the Internet, and ARPA and God knows what else?” He wasn't invited back this year. (Hudson 1996)

Neither do most hackers share the techno-yuppies' view of the Internet as a lawless environment where supervision and control are both undesirable and impossible. Instead, there is growing concern in the hacker community that the increasingly commercial nature of the World Wide Web and the robber-baron capitalism of Internet junk mailers are overgrazing their beloved digital commons. After an interim where technological fixes (e.g. cancel 'bots [7]) were implemented and found to be inadequate, stronger regulation of abuses by both responsible service providers and governments has been called for (Seminerio and Broersma 1998). Other hackers have already abandoned the present-day Internet as a great experiment that has turned sour, and have embarked upon the task of creating “Internet 2”, which, among other things, is being designed with much better mechanisms for stopping abuses and enforcing policy.

In general, hackers strive to free themselves from the capitalist mode of production. Hackers therefore tend to be associated with universities or projects that have no paying client. If necessary, hackers may at times do consulting “for money” just to make ends meet, but their priorities are clear: they earn a living in order to create beautiful technology, not the other way around. When interviewed about this, several hackers of my personal acquaintance cite an essay by Alfie Kohn entitled Studies Find Reward Often No Motivator: Creativity and intrinsic interest diminish if task is done for gain (Kohn 1987) as an explanation of their motivation and lifestyle.

The independence of the typical techno-yuppie is a more elusive thing. As noted by Greenbaum (1995) and Barbrook and Cameron (1996), techno-yuppies are usually well paid and also have considerable autonomy over their pace of work and place of employment. But they are also tied by the terms of the consultancy contracts they enter into, and by the fact that they have no guarantee of continued employment beyond the expiration date of their assignment.

Deconstructing Software

Joseph Weizenbaum, we already have noted, did not like hackers:

To hack is, according to the dictionary, “to cut irregularly, without skill or definite purpose; to mangle by or as if by repeated strokes of a cutting instrument”. I have already said that the compulsive programmer, or hacker as he calls himself, is usually a superb technician. It seems therefore that he is not “without skill” as the definition will have it. But the definition fits in the deeper sense that the hacker is “without definite purpose”: he cannot set before him a clearly defined long-term goal and a plan for achieving it, for he has only technique, not knowledge. He has nothing he can analyze or synthesize; in short, he has nothing to form theories about. His skill is therefore aimless, even disembodied. It is simply not connected with anything other than the instrument on which it may be exercised. His skill is that of a monastic copyist who, though illiterate, is a first rate calligrapher. (Weizenbaum 1976)

As the real target in Weizenbaum's pamphlet “Computer Power and Human Reason” is instrumental rationality (the belief that because a task is technically feasible, it should be performed), it is only fitting that he includes an attack on a community that apparently showed little restraint in embracing technology for technology's sake.

But the hackers that Weizenbaum observed and wrote about (the residents of the 9th floor at MIT Tech Square in the early-to-mid-seventies) were prolific, productive and creative implementers. They designed and constructed computer networks, timesharing operating systems, workstations and computer languages. With that in mind, I find Weizenbaum's statement about the hacker's superior technical skill linked to a lack of purpose and aim intriguing. To what extent is Weizenbaum's observation about hackers being “aimless” correct? Is there no sense or purpose to the software artifacts created by hackers?

Unfortunately, when reviewing a computer artifact, one very seldom has access to authoritative material that asserts the purpose or aim behind its creation. Even for commercial products, where advertising copy sometimes purports to give such information, one finds that it is usually just blurb created by the marketing department.

Nevertheless, one has the computer artifacts themselves. I will attempt to “read” these artifacts in order to deconstruct some original design decisions as well as the aim and purpose of the project. Doing this is not without its pitfalls, as my personal preferences and experiences as a user of such artifacts will doubtless mediate the result, but short of being present as an observer during the entire gestation process, I can think of no better method for uncovering the aims and purposes behind computer artifacts.

First, let us tabulate a list of software artifacts rooted within the hacker community alongside their corresponding mainstream counterparts, to see if some pattern emerges. While the list to some extent may be viewed as arbitrary, my criterion for inclusion on either side is that the artifact should be fairly generic, widely deployed, or popular with its user community.

Software Roots

                          Within the hacker community    Outside the hacker community
  Network Infrastructure  Internet, SMTP, sendmail       ISO/OSI, SNA, X.400
  Programming Languages   Lisp, C, C++, Perl             Cobol, Ada
  Multimedia Formats      HTML, XML                      Acrobat/PDF
  Operating Systems       ITS, Unix, Linux               VMS, MVS, MS Windows/NT
  Window Systems          X.11                           MS Windows
  Text Editors            emacs, vi                      MS Notepad, MS Word
  Typesetting             TeX, LaTeX                     MS Word

Software constructed by hackers seems to favor such properties as flexibility, tailorability, modularity and open-endedness, in order to facilitate on-going experimentation. Software originating in the mainstream is characterized by the promise of control, completeness and immutability.

Most of the artifacts originating outside the hacker community are the product of corporate entities. It is, however, interesting to note that the two most dysfunctional and misengineered (at least, in this author's opinion) – the Cobol programming language and the ISO/OSI network infrastructure – originated from semi-democratic processes with the best intentions and ample room for user participation. The team that created Cobol set out to design a programming language that was intended to be more “user-friendly” than any other computer language – including the quality of being readable by non-programmers. And the ISO/OSI network infrastructure was designed by networking experts and users in concert. Through an elaborate system of meetings, discussions and voting, all concerns were addressed and all opinions were listened to by the committee, before the design decisions were meticulously documented in a series of official documents.

Looking at some of the designs in more detail, it seems clear that the creators of the emacs text editor understood well that the range of problems and environments in which an editor might be used extended far beyond what they themselves could imagine. To deal with these unpredictable situations, the editor has its own embedded production-strength Lisp programming language, which users can employ to modify the editor as necessary. This particular feature has been well exploited by users to create an impressive range of applications needing a powerful text manipulation tool – from software development systems and mail-readers, to desktop publishing. To save needless duplication of effort, literally thousands of ready-made Lisp plug-in modules for emacs, created by communal effort, have been assembled and can be downloaded from the Net. These can be used as-is, or act as starting points for new modifications.

For typesetting complex text and graphics, the hacker community has provided TeX. TeX works seamlessly with emacs (and most other text editors should the user have other preferences), and again provides almost infinite tailorability and flexibility. The built-in rule set (which may be overridden when required) knows a number of important typographical rules and provides output that is pleasing to the eye and adheres to generally accepted typographic conventions.

Contrast emacs/TeX with Microsoft Word, a text editor and typesetter rolled into one package. While the latest offering (Word '97) seems to be crammed with all sorts of “features” [8], I have found it impossible to adapt it to do what I really want it to do. Even very simple adaptations, such as changing key bindings to get a consistent set of interfaces when switching between tools and environments, seem to be impossible. The only possible reading of this design decision is that Microsoft does not want me to switch between tools and environments, but prefers that I remain locked into their proprietary world.

Having purchased and started to use a product I do not like and do not want to own (Microsoft Word) in order to fulfill a client's expectations, I have now also resigned myself to choosing one of two equally unpleasant options: either to rely on the mostly immutable built-in rule-set and deliver sub-standard looking documents – or to spend a disproportionate amount of my time laying out the document manually. Configuring the typesetting portion of Word to automatically and consistently provide professional-looking typography is, in my opinion, beyond the capabilities of the tool.

Microsoft Word is by no means unextendable. It comes with a powerful macro and scripting capability and permits the use of Microsoft's Visual Basic as an extension language. It is reasonably well integrated with the other products in Microsoft's Office family (Excel, PowerPoint, Access and Outlook), and Microsoft's OLE (Object Linking and Embedding) provides a means to integrate Word (and the other Office products) with third-party applications. This extensibility does not, however, necessarily imply tailorability. The property of extensibility may be interpreted as the expression in software of an imperialist strategy of assimilation and conquest [9]. The property of tailorability is one where the artifact is open to yield to user and environmental requirements.

Another interesting comparison may be made between HTML (Hyper Text Markup Language) and the Adobe Acrobat PDF format. Both are essentially vehicles for the presentation of hyperlinked multimedia documents across distributed systems. HTML imposes no restrictions on how the markups are interpreted and presented. There are some conventions and guidelines, but the final decisions are essentially left to the implementers, which means that it is possible for hackers to implement any personal preferences they may have, such as substituting plain text for graphical elements to cater to visually disabled users. By comparison, the Adobe Acrobat PDF format is designed to re-create an excruciatingly exact facsimile of the original document, and allows very few deviations from this. Further, the HTML system makes the source code of the document available [10] to the user, to be studied, admired, copied and modified. Nothing like this is possible with Acrobat/PDF, which touts the immutability of the “original” as a major feature.
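To make the point concrete, here is a minimal sketch (my own, in Python; not taken from any actual browser) of how one implementer might legitimately interpret the same markup: an accessibility-minded renderer that substitutes an image's alt text for the graphic itself.

```python
from html.parser import HTMLParser

class TextRenderer(HTMLParser):
    """Renders HTML as plain text, substituting an image's alt text for
    the graphic itself - one legitimate interpretation of the markup,
    e.g. for visually disabled users."""
    def __init__(self):
        super().__init__()
        self.out = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt", "")
            self.out.append(f"[image: {alt}]")

    def handle_data(self, data):
        self.out.append(data)

r = TextRenderer()
r.feed('<p>See <img src="chart.png" alt="sales chart"> for details.</p>')
print("".join(r.out))   # See [image: sales chart] for details.
```

A graphical browser reading the identical markup would instead fetch and display chart.png; nothing in HTML itself forbids either choice.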

Similar readings may be made for all items in the list above, and for many similar pairs of items originating respectively inside and outside the hacking community [11].

Another observation that emerges from studying the table above concerns the attitude towards the end user (i.e. a user with no inclination towards adapting, extending or developing computer artifacts). None of the artifacts originating within the hacker community has the property commonly referred to as “user-friendliness”. It almost seems as if the hacker attitude towards end users can be summed up as: “Since it was hard to write, it should be hard to use.”

A case in point is emacs: Emacs was originally developed on the ITS operating system running on the 36-bit DEC-10 computer in 1975. More than twenty years later, it is still almost universally loved as the “one true editor” by hackers all over the world (though there is a contingent of ascetics who consider emacs too rich for their taste, and stick to vi). But the apparent attraction of emacs lies in its tailorability and flexibility – not its user interface. It has been adapted to work on every conceivable platform, but even in windowing environments, it more or less presents itself as a command-driven text editor of the glass-TTY era.

Several attempts on my part to turn end users on to emacs, and the subsequent discussions about the merits and failings of the program, made it painfully apparent that the majority of end users despise the emacs editor and environment with the same fervor as I love it. In their opinion, and in particular when compared to Microsoft Word, it is utterly and totally user-hostile. They do not like the “old-fashioned” command-driven interface, the “dull” look & feel, or the rather steep learning curve required to actually attain the tailorability and flexibility I had promised them.

Returning to Microsoft Word, it seems obvious that from the outset, “user-friendliness” must have been one of the most prominent aims of its design team: It was one of the first substantial office applications to provide a graphical user interface – and ousted WordPerfect from its spot as the world's best-selling office application on that strength alone. Now, Word '97 even comes with a company of nine animated cartoon characters who dispense friendly advice and encouragement to novice users. Shown below is “Power Pup”.

[Image: Power Pup]

Incidentally, while emacs does not include cute cartoon characters, it can be tailored to be just as mouse-intensive as Microsoft Word – if this really is what the user wants. One of the reasons it does not behave like that by default is that the text editor is one of the programs that receives the most intensive use in many computing environments. Having a mouse-driven (rather than a command-driven) interface may increase the risk of the user being afflicted by stress-related problems such as carpal tunnel syndrome. What may be perceived as “user-friendly” (a mouse-intensive, slick, direct-manipulation user interface) at one level turns out to be user-hostile at another.

To understand some of the implications, let us again compare hacking to industrial software production:

Commercial software is generally produced by teams of computer workers churning out the various components (look & feel, functions, database administration, low-level interface, etc.) to a pre-set list of specifications. In short, the mode of production is not dissimilar to the piecework performed by the metal workers described in Taylor (1911). As with the metal workers, it is unlikely that the computer workers themselves, or anyone they actually know, will make extensive use of what they create. Even if a worker is a user of the actual product the component is a part of, it may be just as difficult for him or her to grasp the role of the component in the finished product as it is for a car owner who also is a metal worker to appreciate how the cogwheel he has machined fits into his car. As a result, the functional properties and qualities of the finished artifact are of little concern to the worker. His or her only purpose is to satisfy the specifications for the component he or she is commissioned to make. Sometimes the myopia this causes leads to serious errors of judgement [12].

A hacker, on the other hand, does not perform well producing piecework software components based upon pre-set specifications. As we have seen, his preferred working environment is a communal setting very much like that of an artisan or craftsman engaged in pre-industrial production.

Like the workers in programming teams, hackers make use of the components of others. But there is a difference. While the industrial ideal is to treat such components as “black boxes”, of which nothing needs to be known except how to interface with them, hackers generally require access to the artifact's interior and want to understand all parts of the systems they work on, including the components contributed by others.

This does not mean that hackers are oblivious to such staple engineering practices as object-oriented programming and information hiding. These and other sound engineering practices are routinely employed when hackers create complex software systems. But the difference between the hacker's approach and that of the industrial programmer is one of outlook: between an agoric, integrated and holistic attitude towards the creation of artifacts and a proprietary, fragmented and reductionist one.

The Cathedral and the Bazaar

While there exist a number of studies of hackers as a political, sociological and cultural phenomenon, I know of only one that (to some extent) describes the hacker as a programmer. This is the autobiographical essay The Cathedral and the Bazaar by Eric S. Raymond, which was widely circulated as a text file among hackers after first being presented at the Linux Kongress in Würzburg on May 27, 1997. It has since been published as part of a hardcover book with the same name (Raymond 1999).

The topic of the essay is a first-hand account of Raymond's experiences in a project that involved adding support for the IMAP mail retrieval protocol to an existing POP3 client (popclient by Carl Harris). However, its main motivation seems to be to promote a model of software development (referred to by Raymond as “the bazaar”) as an alternative to “the FSF's [i.e. the Free Software Foundation's] cathedral-building development model” (p. 39) [13].

Having a background in software engineering, I was struck by the similarity between the “bazaar” prescribed by Raymond, and the system development methods suggested by a number of European information system developers from the mid-eighties (e.g. Floyd 1989, Mumford 1995) as an alternative to the waterfall model. The basic ideas (rapid prototyping, iterative development, and strong user participation) are similar.

I released early and often (almost never less than every ten days, during periods of intense development, once a day). (Raymond 1999, p. 46)

One interesting measure of fetchmail's success is the sheer size of the project beta list […] At time of writing it has 249 members and is adding two or three a week. (ibid., p. 46)

Users are wonderful things to have, and not just because they demonstrate that you are serving a need, that you've done something right. Properly cultivated, they can become co-developers. […] Given a bit of encouragement, your users will diagnose problems, suggest fixes, and help improve the code far more quickly than you could unaided. (ibid., p. 36)

Given enough eyeballs, all bugs are shallow. […] although debugging requires debuggers to communicate with some coordinating developer, it doesn't require significant coordination between debuggers. Thus [debugging] doesn't fall prey to the same quadratic complexity and management costs that make adding developers problematic. (ibid., p. 43)
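
Raymond's “quadratic complexity” remark is simple arithmetic: among n developers who must all coordinate with one another there are n(n-1)/2 communication channels, while n debuggers who each report only to a coordinating developer add just one channel apiece. A quick illustration (in Python; the figures are purely illustrative, apart from the beta-list size of 249 quoted above):

```python
def communication_paths(n):
    """Pairwise channels among n developers who must all coordinate
    with one another: n * (n - 1) / 2, which grows quadratically."""
    return n * (n - 1) // 2

# Full coordination cost explodes with team size ...
for n in (5, 50, 249):
    print(n, "developers need", communication_paths(n), "channels")
# 5 developers need 10 channels
# 50 developers need 1225 channels
# 249 developers need 30876 channels

# ... whereas 249 debuggers each talking only to one coordinating
# developer need just 249 channels in total.
```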

But, as is evident from the four quotes from Raymond's essay given above, there are also some important differences.

Firstly, what distinguishes Raymond's preferred method for software development from methods such as STEPS and ETHICS is the absence of formalism in the former. This absence may be partly caused by Raymond's lack of scientific training, but it is also obvious that Raymond regards formalism in method as demotivating.

Secondly, leveraging modern tools for automatic system update and the Internet as an infrastructure for user/developer contact, Raymond speeds up his development cycles to a frenzy, co-opts his users as debuggers/developers, and indiscriminately adds everyone who wants to participate to the project beta list. This is different from the carefully metered-out development cycles, the clear division of roles between users and developers, and the representative system for user participation that figure prominently in both the STEPS and ETHICS methods.

Thirdly, both the developers' and the users' desire to participate in the endeavor is more or less taken for granted in STEPS and ETHICS. Raymond acknowledges that securing participation from all parties may pose a problem, and argues that the project coordinator needs to focus some attention on user and developer motivation, take certain steps to ensure it, and possess personal qualities in this area.

Fourthly, STEPS, ETHICS and similar models are presented as universal approaches that can be used regardless of circumstances (I doubt whether this actually is true, but that discussion is beyond the scope of this paper). Careful reading of Raymond's paper makes it fairly clear that hacking as an approach to constructing software artifacts should not be considered universally applicable. The success of the project as described by Raymond seems to depend on at least three pre-conditions being present:

  1. The projected system must fill an unfilled personal need for the developer;
  2. The project needs to secure user participation and maintain continued user support; and,
  3. The project coordinator/leader must have good interpersonal and communication skills.

The second pre-condition implies that hacking is not an applicable method when developing an information system “from scratch”. Since hacking does not involve formal requirements or system specifications, there will – at that point – be little to attract or interest the users. Hence, if the task at hand is to create a new system “from scratch” one should not consider hacking as a viable method for software creation, but rely on more conventional methods for system creation.

However, if some previous system exists that may be used as the starting point for developing a new system that eventually will replace it, or if the system development project has evolved to the point where prototypes of the projected system are sufficiently functional to interest users, hacking may be viewed as an alternative method for system development. Given that the right pre-conditions exist, hacking as a method for system development may result in a better system.

The pre-conditions listed above do not stipulate that hacking only works in a computer underground setting, nor do they limit the applicability of the method to the production of “free” software. Also, hacking is not an all-or-nothing proposition. A project may well start out being developed along any number of traditional methods, and then switch to the hacker approach when prototypes or early versions have evolved to the point where hacking becomes viable.

Looking around, I find that hacker-like approaches to software development are adopted in environments where one would least expect it.

For Microsoft, many customers are becoming debuggers as “beta” versions of new products are distributed in massive quantities (literally tens of thousands of copies) on the Internet. Microsoft has also developed closer communication channels between users and developers by having some of their developers participate in on-going discussions about their products on the Internet [14].

Netscape has gone even further down this route. By making the source code of its Navigator Internet browser open and freely available, Netscape is essentially gambling on hacking as a method to making it a superior product.


So far, hacking as a method for the construction of information systems and software artifacts has been excluded from serious study and consideration. The term itself is also poorly understood, surrounded by much prejudice, folklore and mythology. Part of the confusion stems from attempts to hijack the term by a large number of special interest groups, ranging from digital vandals to neo-classical economic liberalists. It is, however, possible to see through the confusion and excess baggage, and what remains is a community sharing an attitude to, and a method for, the construction of computer artifacts that has been consistent since the 1960s.

Hacking as a method for system development originated as a grass-roots reaction to attempts to impose an industrial mode of production on the development of software. The qualities emphasized by the implicit and explicit ideologies of this community result in the production of artifacts whose quality and usability characteristics are different from those gestated through an industrial mode of production.

This community has successfully created a number of usable and unique software artifacts – ranging from text editors to the Internet. Lately, large corporate entities such as Microsoft and Netscape have started experimenting with hacker-like approaches in areas such as quality assurance and user/developer communication. Still, hacking as a method for creating information systems and software artifacts has received virtually no attention from the scholarly community interested in methods for system development.

My belief is that “hacking” deserves to be put on the map as a viable method for the creation and construction of information systems and software artifacts. It should be studied alongside other system development methods, and practitioners in the field of system development should be aware of its applicability and able to take advantage of its “bag of tricks” when appropriate.


First, thanks to Eline Vedel, for encouraging me to write this piece in the first place, and for being available to discuss it at various points on the way. Also thanks to the participants at the 1997 Oksnøen Symposium on Pleasure and Technology, where an earlier and shorter version of the paper was presented, and to Rick Bryan, Ole Hanseth, Haavard Hegna, Arne Maus and Eric Monteiro, who took the time to read the draft and who provided stimulating criticism, discussion and suggestions.

Any errors in fact or logic that remain, as well as the opinions expressed are, of course, only my own.


[1] Witness, for instance, the politics implicit in The Whole Earth Catalog and the activism of its editor/publisher, Stewart Brand.
[2] The name is a pun on CTSS – the Compatible Time-Sharing System.
[3] The term “free software” in this context denotes four levels of freedom: 1) that it is free of any restrictions that limit its use and application; 2) that it is freely distributable; 3) that it is freely portable between different operating platforms; and, 4) that the source code is available, so users are free to modify and tailor the software.
[4] GNU is a recursive acronym standing for GNU's Not Unix. It designates an ongoing effort, begun in 1985 within the hacker community, to create a body of “free software” (see above) at least as complete and rich as the Unix operating system and its associated utilities.
[5] It has been claimed that this particular design decision was due to the ideas of RAND Corporation scientist Paul Baran (Baran 1964), who had a strong interest in the survivability of communication systems under nuclear attack. Baran argued strongly for distributed networks and the use of packet messaging as means to achieve this. Bob Taylor, who as director of ARPA's Information Processing Techniques Office oversaw the formation of the ARPAnet, has denied that the design of the ARPAnet was motivated by anything related to supporting or surviving war. According to Taylor, the driving force behind the ARPAnet was simply to enable scientific laboratories all over the US to share computing resources (Hafner and Lyon 1996).
[6] “cybercrud /si:'ber-kruhd/ (coined by Ted Nelson) Obfuscatory tech-talk. Verbiage with a high MEGO [My Eyes Glaze Over] factor. The computer equivalent of bureaucratese” (Raymond 1991). The irony of using a jargon word of his own invention (“cybercrud”) to protest obfuscatory tech-talk probably never occurred to Nelson.
[7] A cancel 'bot is an automatic process which monitors messages sent over Usenet and deletes those that meet certain pre-set algorithmic criteria.
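
As a hypothetical sketch of the kind of pre-set criteria such a 'bot might apply (the function, field names and threshold below are invented for illustration, not taken from any real implementation):

```python
# Hypothetical sketch of a cancel 'bot's filtering step; the criteria
# and threshold are illustrative only, not those of any actual 'bot.
def should_cancel(message, seen_bodies, max_crossposts=20):
    """Flag a message for cancellation if it trips a pre-set criterion:
    excessive cross-posting, or a body already posted verbatim."""
    if len(message["newsgroups"]) > max_crossposts:
        return True                      # spam-like excessive cross-post
    if message["body"] in seen_bodies:
        return True                      # verbatim duplicate posting
    seen_bodies.add(message["body"])
    return False

seen = set()
msg = {"newsgroups": ["comp.lang.lisp"], "body": "MAKE MONEY FAST"}
print(should_cancel(msg, seen))   # False: first sighting, one group
print(should_cancel(msg, seen))   # True: identical body seen before
```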
[8] Microsoft Word will automatically change a lowercase “i” to a capital one. This is indeed a nice feature when one is writing English text. It is not so convenient that Word continues to make this correction even after you have changed the language of your document to something else.
[9] The imperialist underpinnings of Microsoft Windows are apparently so strong that it is the norm for Windows programs to behave like conquerors. When you install a new program on a Windows machine, it will invariably claim whatever it can gain access to for itself. For instance, when installing Microsoft Word on a computer that was already running Adobe Framemaker, I found that ownership of all pre-existing Framemaker documents was automatically and expediently transferred to Word, effectively rendering them useless (Word isn't capable of opening them). The message was loud and clear: Thou shalt have no other word processors before Microsoft Word.
[10] Note that when the Java language, rather than HTML, is used for Web presentations, the intrinsic access to the source code goes away. This is one of the reasons the hacking community has been split over whether Java should be considered an acceptable tool or not. Also, the chief architect of Java – James Gosling – has a long and stormy relationship with the hacking community. He was one of the original hackers at Carnegie-Mellon University, and also the main programmer behind the move of the hackers' favorite editor – emacs – from TECO/ITS to C/Unix. As the hacking community migrated from DEC mainframes to Unix, gosmacs, as Gosling's version of emacs became known, became their favorite on that platform. Gosling then enraged the hacking community by retroactively imposing his own copyright on his implementation (originally assumed by fellow hackers to be copyright-free) and selling the exclusive rights to it to a commercial software company.
[11] There are also some items whose status may be disputed. What about the Apple Macintosh and the IBM PC, for instance? Hackers universally despise both machines.
According to Bruce Horn, who was part of the small team that created the Apple Macintosh, most members of the team (with the notable exception of Steve Jobs) were hackers in the original sense of the word. However, it was also stipulated (by Jobs) that the machine should not be extensible and that its inner workings were to be hidden from any user/programmer who had not entered into a contractual relationship with Apple to become an “official” developer. This effectively prevented it from becoming a hackers' favorite. And, of course, Apple's decision to claim copyright on the “look & feel” of the Macintosh (which hackers believed was a mishmash of ideas that were common knowledge and had been part of the industry for years) did not help to endear Apple and the Macintosh computer to the hacker community.
As for the IBM PC: The hardware originated outside the hacker community, in an IBM laboratory located in Boca Raton, Florida. Nevertheless, it sported a number of hacker-favored properties, such as open-endedness, accessibility and flexibility. What the original design lacked, however, was any type of networking support. It was truly a personal computer, and therefore unsuitable for the hackers' communal style of working. The hackers stuck to their DEC mainframes, or to networked workstations of their own design.
In a recent development project, a large back-office computer system built by a large and prestigious consultancy known for its strict belief in “specification” and “method” as the means to superior software quality turned out to be useless for its intended purpose: the coders had elected to use a data type known as “double precision floating point” to represent the decimal fractions the specifications called for.
Now, while “double precision floating point” is the appropriate data type for decimal fractions in contexts such as physics and engineering, it is a disastrous choice for data that at some point are to be processed by transaction systems. Despite its very promising name, “double precision floating point” is a data type that, on binary computers, is unable to hold an exact representation of many two-digit decimal fractions (e.g. 0.30). This leads to minuscule rounding errors that play havoc with the embedded controls of any properly designed transaction system.
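The rounding problem is easy to demonstrate. The following Python sketch (illustrative only; the system described above is unnamed and its code is not available) shows that binary doubles cannot represent 0.30 exactly, and how a true decimal type avoids the error:

```python
from decimal import Decimal

# With binary doubles, 0.1 + 0.2 does not equal 0.3: each value is
# stored as the nearest representable binary fraction, not the exact one.
print(0.1 + 0.2 == 0.3)        # False
print(f"{0.30:.20f}")           # reveals the stored value is not exactly 0.30

# A decimal data type, constructed from strings, keeps the exact
# fractions a transaction system's controls depend on.
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))  # True
```

The lesson matches the one drawn in the text: the defect is not sloppy coding but a category error in data-type choice, invisible to any specification that never says how decimals are to be represented.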
One strange thing about Raymond's essay is that the “cathedral” is widely believed to refer to the development of proprietary software, while the essay is actually an attack on the software development model used by Richard M. Stallman's Free Software Foundation to develop “free software”.
To what extent it is possible to reap the benefits of this approach while still refusing to hand out source code is unclear. In the University of Oslo multimedia lab, where we among other things work on developing protocols and services for the next generation Internet (Internet 2), we found ourselves stymied in our attempts to use Microsoft operating systems and network browsers for this purpose. Without source code available, it was simply impossible to make the necessary modifications to the communication stack and device drivers to make the equipment work in an Internet 2 environment.


Adams, Scott (1996): The Dilbert Principle; HarperBusiness.

Baran, Paul (1964): On Distributed Communications Networks; IEEE Transactions on Communications Systems, vol. 12.

Barbrook, Richard and Andy Cameron (1996): The Californian Ideology; Hypermedia Research Center, University of Westminster, Web document.

Braverman, Harry (1974): Labor and Monopoly Capital; Monthly Review Press.

Budde, Kautz, Kuhlenkamp and Züllighoven (1992): Prototyping; Springer.

Cerf, Vint (1997): Vint Cerf: Father of the Internet, interview conducted by technical editor Leo Laporte; Broadcast by MSNBC: The Site, June 3.

Dahlbom, Bo and Lars Mathiassen (1997): Computers and Design in Context; NCC Blackwell.

Floyd, Christiane (1989): STEPS to Software Development with Users; Proceedings of ESEC 1989, University of Warwick, Coventry, England, 11-15 September.

Greenbaum, Joan (1976): Division of Labor in the Computer Field; Monthly Review 28:3.

Greenbaum, Joan (1995): Windows on the Workplace; Monthly Review Press.

Hafner, Katie & Matthew Lyon (1996): Where Wizards Stay Up Late: The Origins of the Internet; Simon & Schuster.

Hudson, David (1996): Digital Dark Ages; SF Bay Guardian; November 6.

Kelly, Kevin (1994): Out of Control; Addison-Wesley.

Kohn, Alfie (1987): Studies Find Reward Often No Motivator: Creativity and intrinsic interest diminish if task is done for gain; Boston Globe, January 19.

Kroker, Arthur and Michael A. Weinstein (1994): Data Trash: the theory of the virtual class; New World Perspectives.

Lafayette, Lev [p.k.a. Anderson, Anthony Jon Lev] (1993): Technology and Freedom; unpublished, online copy.

Leonard, Andrew (1994): Hot-Wired. Wired Magazine and the Battle for the Soul of the Electronic Frontier; San Francisco Bay Guardian, March 2, pp. 17-20.

Levy, Steven (1984): Hackers: Heroes of the Computer Revolution; Anchor Press/Doubleday.

Meyer, Gordon R. (1989): The Social Organization of the Computer Underground; Northern Illinois University Master Thesis, online copy.

Mumford, Enid (1995): Effective Systems Design and Requirements Analysis. The ETHICS Approach; Macmillan Press, 1995.

Negroponte, Nicholas (1994): Digital Expression; Wired 2:12.

Negroponte, Nicholas (1995a): Being Digital – A Book (p)review; Wired 3:02.

Negroponte, Nicholas (1995b): Being Digital; Hodder & Stoughton.

Nelson, Ted (1974): Computer Lib / Dream Machines; Mindful Press.

Pam, Andrew (1995): Where World Wide Web Went Wrong; Proceedings of the Asia-Pacific World Wide Web '95 Conference.

Pam, Andrew (ed.) (1997): Xanadu FAQ; Web document.

Raymond, Eric (1991): The New Hacker's Dictionary; The MIT Press.

Raymond, Eric S. (1999): The Cathedral and the Bazaar; O'Reilly.

Reich, Robert (1991): The Work of Nations. Preparing Ourselves for 21st-Century Capitalism; Random House.

Rheingold, Howard (1993): The Virtual Community: Homesteading on the Electronic Frontier; Addison-Wesley.

Rossetto, Louis (1993): Why Wired; Wired 1:01, January.

Rosteck, Tanja S. (1994): Computer Hackers: Rebels With a Cause; Concordia University, Montreal, Quebec, Dept. of Sociology and Anthropology, Honours Seminar - Soci 409/3, Submitted April 27.

Salus, Peter H. (1997): How many Bits; Matrix News 7:4, pp. 7-8.

Seminerio, Maria and Matthew Broersma (1998): Usenet junk e-mail could swamp the system Friday; ZDNet Web document, April 3, 1998.

Stallman, Richard M. (1985): The Gnu Manifesto; The Gnu Emacs Manual, online copy.

Taylor, Frederick Winslow (1911): The Principles of Scientific Management; series of articles published in “The American Magazine” March-May.

Turkle, Sherry (1984): The Second Self: Computers and the Human Spirit; Simon and Schuster.

Weizenbaum, Joseph (1976): Computer Power and Human Reason; Freeman and Company.

Winner, Langdon (1995): Peter Pan in Cyberspace: Wired Magazine's Political Vision; Educom Review 30:3, May/June 1995.

Invited paper for the 1997 Symposium on Pleasure and Technology, Sausalito, CA, May 5-9, 1997. A slightly shorter version was published in First Monday, Vol. 4:2, February 1999.
Copyright © 1997, 1999 Gisle Hannemyr. Some rights reserved.
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License.
