Summary of The Whale and the Reactor by Langdon Winner (pp. ix-39, 99-200).
Winner states implicitly that he wishes to add his book to a surprisingly
short list of works that can be
characterized as "philosophy of technology" (which includes Marx and
Heidegger). His book will deal
primarily with the political and social aspects of this philosophy,
pertinent since, as he notes, the world is changing because of tech., no longer composed simply of national entities but increasingly a global economy, etc. In this
context he will also look at language and determine how adequate it
is presently for handling the state-of-the-art, high-tech world. His ultimate and ever-present question, asked throughout the book, is,
"How can we limit modern technology to match our best sense of who
we are and the kind of world we
would like to build?" (xi), since the "basic task for a philosophy
of technology is to examine critically the
nature and significance of artificial aids to human activity" (4).
Winner makes a crucial distinction:
"technologies are not merely aids to human activity, but also powerful
forces acting to reshape that
activity and its meaning" (6). Of course, the social arena is
directly and profoundly influenced by tech.
W cites a recent court case from San Diego (where, as in Los Angeles, virtually everyone travels everywhere by car) concerning "a young man who enjoyed taking long walks at night through the streets of San Diego and was repeatedly arrested by police as a suspicious character."
A criminal court ruled,
however, that "Merely traveling by foot is not yet a crime" (9).
Yet it is important not simply to see tech
as the "cause" of all world "effects." Rather, "as technologies
are being built and put to use, significant
alterations in patterns of human activity and human institutions are
already taking place" (11). All the
same, tech developments are absorbed into the ever mutating process
of human activity so that they
come to be taken for granted and are integrated into our view of what
is natural and/or inherent in the
world--they become "second nature"; as Winner, taking after Wittgenstein, terms it, they become part of our "forms of life" (11).
In this context we can best appreciate certain crossroads, or perhaps better thresholds, that we are
facing, such as genetic engineering and the possibility of founding
human settlements in outer space.
These "call into question what it means to be human and what constitutes
'the human condition'" (13).
How do such developments change the fabric of everyday existence?
Chapter 2: "Do Artifacts Have Politics?"
W asks, can technology "embody specific forms of power and authority" (19)? He reviews the ideas of Kropotkin, Morris, Hayes, Lilienthal, Boorstin, and Mumford on his way to answering this question.
For example, Hayes states that "deployment of nuclear power facilities
must lead society toward
authoritarianism" because of safety concerns (19-20). W believes
"that technical systems of various
kinds are deeply interwoven in the conditions of modern politics [and
further, that the] physical
arrangements of industrial production, warfare, communications, and
the like have fundamentally
changed the exercise of power and the experience of citizenship" (20).
Indeed, "human ends are
powerfully transformed as they are adapted to technical means" (21).
Artifacts "contain political properties" in two ways: 1) via "invention,
design, or arrangement of a
specific technical device or system that becomes a way of settling
an issue in the affairs of a particular
community." 2) via "'inherently political technologies', man-made
systems that appear to require or to
be strongly compatible with particular kinds of political relationships."
Here W means the term politics
to stand for "arrangements of power and authority in human associations
as well as the activities that
take place within those arrangements"; while the term technology is
meant to stand for "all of modern
practical artifices" (22).
Examples cited in this regard include the highway overpasses at Jones Beach, designed by Robert Moses, which are "extraordinarily low" so that poor people cannot gain access to the beach: their only means of transportation there would be the bus, and a bus cannot fit under the overpasses. Moses wanted to "build a particular social effect" (22-23), and in fact he also made sure that a rail line to the beach would not be built.
Similarly, Baron Haussmann engineered broad Parisian thoroughfares to make sure that there could be no street fighting of the sort that occurred in 1848. This was
an early version of the kinds of planning
that informed the construction of college campuses in the 1970s after
the trauma of student rioting that
began in the 1960s.
Another example is McCormick's reaper manufacturing plant in Chicago in the 1880s, where McCormick modernized the plant as a way to "'weed out the bad element among the men', namely, the skilled workers who had organized the union local" (24).
In these examples we "see the importance of technical arrangements that
precede the use of the things
in question. [A] given device might have been designed and built
in such a way that it produces a set of
consequences logically and temporally prior to any of its professed
uses" (25). This perception also
helps us to cope with phenomena like the organized movement of handicapped
people in the 1970s
[which] pointed out the countless ways in which machines, instruments,
and structures of common use .
. . mad it impossible for handicapped persons to move about freely,
a condition that systematically
excluded them from public life" (25). Some technologies, of course,
are politically motivated in the
sense that they are profit driven, such as the mechanical tomato harvester, which instigated the hybridization of tomatoes that are tasteless but not susceptible to being bruised by the machinery. Workers here have been displaced, which means that social relationships have been altered, especially because the machinery requires highly concentrated growing areas; thus the landscape is altered and with it social and political entities. Of course, increased profits in the hands of a few mean that these few are able to wield unusual political influence.
W sums this up when he writes: "The issues that divide or unite people
in society are settled not only in
the institutions and practices of politics proper, but also, and less
obviously, in tangible arrangements of
steel and concrete, wires and transistors, nuts and bolts" (29).
Thus it is perhaps important to devise flexible technologies that will not unalterably choose a form of life for us.
Here then W reviews the history of thinking about this societal dynamic,
citing Engels, Plato, Marx,
Chandler, in that order, especially pointing out how specialized knowledge
of a technological nature
tends to be kept in the hands of a few, hence creating hierarchically
structured societies, and that
"characteristic of societies based on large, complex technological
systems [is the fact] that moral
reasons other than those of practical necessity appear increasingly
obsolete, 'idealistic', and irrelevant"
(36). What then becomes of civil liberties in the high tech world,
when extraordinary dangers such as
the black market sale of plutonium to criminal elements require extraordinary
abrogations of civil rights
in order to prevent those sales?
The nuclear power industry in particular comes under W's attack. He adds to the above idea
this one: "[A]dvocates of the further development of nuclear power
seem to believe that they are
working on a rather flexible technology whose adverse social effects
can be fixed by changing the
design parameters of reactors and nuclear waste disposal systems.
For reasons indicated above, I
believe them to be dead wrong in that faith. Yes, we may be able
to manage some of the 'risks' to
public health and safety that nuclear power brings. But as society
adapts to the more dangerous and
apparently indelible features of nuclear power, what will be the long-range
toll in human freedom?"
(39).
Chapter 6: "Mythinformation"
The "use of computers and advanced communications technologies is producing
a sweeping set of
transformations in every corner of social life" (99). This may
be called a computer "revolution," a term
that has been used in history in a variety of ways. Does this
technology constitute a revolution, truly?
What is a revolution? And what might it have to do with human
freedom? "To mention revolution also
brings to mind the relationships of different social classes.
Will the computer revolution bring about the
victory of one class over another? Will it be the occasion for
a realignment of class loyalties?" (101).
In any case, the computer industry engages in the most sweeping and
utopian pronouncements. For
example: "what water- ;and steam-powered machines were to the industrial
age, the computer will be
to the era now dawning. . . . Because 'knowledge is power', because
electronic information will spread
knowledge into every corner of world society, political influence will
be much more widely shared [and]
rule by centralized authority and social class dominance will gradually
fade away" (103). The grandiose
claims, "[t]aken as a whole, constitute . . . mythinformation: the
almost religious conviction that a
widespread adoption of computers and communications systems along with
easy access to electronic
information will automatically produce a better world for human living.
It is a peculiar form of
enthusiasm that characterizes social fashions of the latter decades
of the twentieth century" (105). And
this is typical of a technological breakthrough, which "provides an
occasion for flights of utopian fancy"
(106). W does also point out, conversely, that all myths contain
elements of truth, and he does not
wish to make light of the importance of computer tech. to human progress,
but does say that ordinary
people, while affected by it, do not benefit from it directly as do,
say, transnational corporations.
"Current developments in the information age suggest an increase in
power by those who already had a
great deal of power" (107). And of course those in power, who
most readily enjoy access to new
technology, use it to consolidate and increase their power--while computer
enthusiasts will insist that the
technology will further democracy. What will happen when (if)
the world is totally wired and
interconnected? Will there not occur a pure(r) form of democracy?
W says that this "idea is entirely
faulty. It mistakes sheer supply of information with an educated
ability to gain knowledge and act
effectively based on that knowledge" (108-09). And, "[a]t times
knowledge brings merely an
enlightened impotence [where] one may know exactly what to do but lack
the wherewithal to act"
(109). Thus knowledge may not be power after all (as some like
Plato and Veblen have recognized).
Part of the problem with this idea in our contemporary society has
specifically to do with the fallacy that
"democracy is first and foremost a matter of distributing information"
(110). Indeed, "[p]ublic
participation in voting has steadily declined as television replaced
the face-to-face politics of precincts
and neighborhoods. Passive monitoring [of events through electronic
media] allows citizens to feel
involved while dampening the desire to take an active part" in political
society (111). Computer
utopianism in the political arena is a reprise of 18th and 19th century
notions that if only the masses
possessed the arms, they would rise up to seize power--yet this has by no means always been the case.
W goes on to point out the danger of computer tech to civil rights and
privacy, because of what may
become a "benign surveillance," which will cause people to opt for
"passivity and compliance [as] the
safest route, avoiding activities that once represented political liberty"
(115). And W does allow that
the "linking of computers and telecommunications [will mean that] basic
structures of political order will
be recast. [U]ntil recently the crucial conditions
created by spatial boundaries of political societies
were never in question" (116-17).
Chapter 7: "The State of Nature Revisited"
The underlying question in this chapter is: what is the relationship of human action, specifically of technology, to nature (and built into this question is another: what is nature)? W writes that no "artist,
no thinker, no political movement, no society has ever rested content
with the simple definition of nature
as 'the totality of all things'" (121). And indeed, "As the writings
of Hobbes, Locke, and Rousseau
clearly demonstrate, discussions of 'natural law', 'human nature',
and 'state of nature' are occasions for
the most extravagant theoretical fictions. [Yet these] conceptions
of nature have an astonishing power
to persuade," and in the last half century terms like "ecology," "ecosystem"
and "environment" embody a
"familiar quest for moral guidance" (122-23). Thinkers from the
past like Descartes, Newton, Hobbes,
Ophuls, White, Leopold, Naess, Emerson, Marsh, and Locke underwrite this quest. In all this, again, we find that the conception of nature is fluid. Empirical science, all the same, has clearly proven that modern technologies--e.g., the use of pesticides--are destructive of the environment.
W notes "how the moral
weight of predicted disaster provides an occasion for adopting a particular
interpretation of
nature--ecological theory--as a framework for understanding and judging
the works of modern society"
(129). W then discusses the differences among "shallow environmentalism," "deep ecology," and the "land ethic" (131 ff.). He tries not so much to resolve the
discrepancies of these theories and
theorists as to move on to yet another point of view that may supersede
them, quoting Lukacs: "Nature
is a social category" (135). W concludes that an "ecosystem is
a utopia of sorts, doing well what
artificial structures do poorly [in] the ecosystem . . . unlike human
organizations, there are no
wrong-headed decisions to foul things up. Ecosystem change has
benign homeostasis as its telos; by
contrast, change in modern society produces chaotic disruptions that
never cease" (136). This
consideration is key to understanding how society may fashion economic
policy. W seems finally
equivocal: "Nature will justify anything" (137).
Ch. 8: "On Not Hitting the Tar-Baby"
Here W attacks the practice of risk assessment, which he sees as stemming
from Hobbes's ideas, as a
way of arriving at "norms to guide the moral aspects of scientific
and technical practice" (138). Arriving
at consensus, in this case it must be narrow, is questionable, in a
highly politicized process. But W
ultimately doubts risk assessment, and in doing so throws open the
question of how political, economic
and scientific language intersect, not always for the better, when he
writes: "Questions that had
previously been talked about in such terms as the 'environmental crisis',
'dangerous side effects', 'health
hazards,' and the like were gradually redefined as questions of 'risk'.
The difference is of no small
importance" (142). What precisely is the notion of risk?
"The use of the concept of 'risk in business
dealings, sports, and gambling reveals how closely it is linked to
the sense of voluntary undertakings"
(145). This voluntarism sanctions the use of psychological complexity
to compound the difficulties at times offered by "scientific uncertainty and the calculations of risk/cost/benefit analysis" (145). Thus
W questions whether such risk assessments are accurate (even after
the option to engage in such
assessment, in the face of scientific evidence, has been exercised),
and risk takers tend to dismiss
people who suffer anxiety over such evidence by calling them phobic
regarding technological
advancement; as a society, "we finally arrive at an unhappy destination--the
realm of invidious
comparison and social scorn" (147). Conversely, "[t]here is a
deep-seated tendency in our culture to
appreciate risk-taking in economic activity as a badge of courage"
(147). W makes clear that he is not
categorically deploring risk assessment, but he says that "certain
kinds of social interests can expect to
lose by the very act of entering [into the risk debate]. . . .
The root of this tendency lies, very simply, in
the way the concept of 'risk' is employed in everyday language," where "risk" is not replaced in the debate by other, more tendentious terms like "danger," "peril," "hazard," and "threat," and in fact
"risk" becomes ever more legitimate as "[s]tandards of scientific certainty
are applied to the available
data to show how little we know about the relationship of cause and
effect as regards particular
industrial practices and their broader consequences" (149-50).
W proposes a middle ground that he
sees as more useful and specific. For instance, a "toxic waste
disposal site placed in your
neighborhood need not be defined as a risk; it might appropriately
be defined as a problem of toxic
waste," and automobile emissions "might still be called by the old-fashioned name, 'pollution' [etc.]"
(151). W does allow, though, that some situations can better be subjected to risk analysis than others, such as cancer and genetic engineering research.
Ch 9: "Brandy, Cigars, and Human Values"
W traces the history of the term value ("to be strong") as it makes
its way into social and political
thought and then into economics in the 18th and 19th centuries (Smith,
Ricardo, Marx). Nietzsche
especially altered its meaning, usually employing it in the plural,
to mean "the sum total of principles,
ideals, desires" (157). Perry extended it to "the full range
of human interests" (157). W notes that the
"shift in the use of this term from an objective to a subjective meaning
is strongly linked to a change in
how we view our situation" [and another] consequence of this way of
talking and thinking is to exclude
much of what was formerly contained in traditional moral and ploitical
language" [which means that
there "is a loss of attention paid to shared reasons for action" (158-59).
In the context of risk
assessment, then, the fluidity of the concept of value means that we
are at a loss for an instrument for
"[weighing] various possibilities and [coming] to an intelligent choice"
unless we see social values as
mere "trade offs" (159).
Ch 10: "The Whale and the Reactor"
Here W acknowledges the dizzying impact of technology on the human psyche,
and weighs it against
the often frightening consequences of technology. (He is therefore calling for rational limits on technological research and development, and for tech to be seen within a broader context. He
writes of the dangers of "technological somnambulism" [169 ff.]; we
often might better hesitate before
employing new technologies, to see better what possible harm they may
cause. He speaks critically of
our "religion of progress" [170 ff.] and of the possible loss of "qualities"
with new tech [172], and how
this is a self-nurturing process: "Technological development proceeds
steadily from what it has already
transformed and used up toward that which is still untouched" [174].
What happens now that we are involved in changing "the gene pool" [174]? Where, now, do we store our abundance of radioactive waste?)
Even when we make decisions to institute new technologies, considering
their relative risks, even when
these decisions are based on the best available scientific evidence,
we may be making terrible
mistakes--as is often borne out by later scientific evidence that causes
revision of the earlier findings.
Unfortunately, what we often do in this circumstance, W says, is begin
to rationalize our original
decisions, rather than tearing down the edifice we have constructed
because it is now deemed to be not
safe; we begin to ask, "just how unsafe is it, then? is it safe
'enough'?" W's point is that unsafe is
unsafe. Period. "More and more the whole language used
to talk about technology and social
policy--the language of 'risks,' 'impacts,' and 'trade-offs'--smacks of
betrayal. . . . The excruciating
subtleties of measurement and modeling mask embarrassing shortcomings
in human judgment. We
have become careful with numbers, callous with everything else.
Our methodological rigor is becoming
spiritual rigor mortis" (176).