Abstract: Often, before they can learn something new, people
have to unlearn what they think they already know. That is, they
may have to discover that they should no longer rely on their
current beliefs and methods. This paper describes eight viewpoints
that can help people to do this.
1. Sticking to one's . . . uh . . . depth charges
Starting in the mid 1970s, the Swedish defense forces pursued
Soviet submarines lurking off the Swedish coast. Time and again,
the Swedes mounted large-scale anti-submarine searches that included
the dropping of grenades and depth charges and the detonating
of remote-control mines.
For example, in May and June 1988, there were nine live-fire engagements between the Anti-Submarine Warfare unit and suspected foreign submarines. The Swedish Defense Ministry reacted by giving commanders on the scene authority to decide when to open fire. On one occasion, "the prowler was detected and trapped in 282 feet of water . . . about 60 miles south of Stockholm," said a Ministry spokesman. The hunters opened up on the submarine with more firepower than Sweden had used previously. But the hunters lost contact with their prey amid the noise of exploding depth charges and underwater grenades, and the submarine apparently slipped away in the turbulence. "When we played back the tapes, we saw that the submarine was exactly where we thought it was," the spokesman explained. He said "it's probable" that the hunters hit and damaged the submarine although a search had failed to produce evidence of damage. "Our anti-submarine activities have continuously improved."
Dozens of such searches occurred every year, always during the
warmer months. Yet none of these searches ended with the capture
of a Soviet submarine. Only once, in October 1981, did the Swedes
actually capture a Soviet submarine, and in this instance, there
had been no hunt by the defense forces. Rather the submarine made
a navigation error and grounded on rocks along Sweden's southern
coast.
Some people theorized that the Soviets might be seeking spots
where they could hide during warfare. Others posited that the
Soviets might be testing their submarines' ability to evade detection.
Still others speculated that the Soviets might be probing the
Swedish antisubmarine defenses. The Soviets consistently denied
that their submarines had been anywhere near Sweden, but these
denials only reinforced the Swedes' suspicions. In 1982, in a
clear reference to the Soviet Union, the chief officer of the
navy, Rear Admiral Per Rudbeck, declared that "a foreign
power is preparing for war against us".
Sweden's ineffectiveness against the Soviet submarines, although
embarrassing, was not surprising. No one really expected David
to defeat Goliath. The Swedes had never intended their navy to
command respect as a military power, whereas the Soviet navy was
renowned for its skill and technology.
Then in February 1995, Sweden's defense chief Owe Wiktorin told
a news conference that the Swedish navy had acquired new hydrophonic
instruments in 1992, and these had shown that minks give off sounds
similar to submarines. Earlier equipment had identified the sounds
as submarines, he said, but there may never have been Soviet submarines
lurking off the Swedish coast. "The intruding submarines
were not submarines but minks, or at least most likely minks."
Wiktorin said the Defense Ministry was certain that no foreign
submarines had intruded into Sweden's territorial waters since
1992; defense analysts were checking sound recordings of suspected
submarines from before 1992 to see if the sounds were really just
small animals like minks and otters swimming from island to island.
Eventually, Wiktorin reported: "There is overwhelming evidence
(technical, acoustical, and visual) that there have been five
foreign submarine operations on Swedish territory since 1981,"
including the Soviet submarine that ran aground in 1981.
That the navy might be pursuing animals had been proposed as early as July 1987. After the navy had dropped depth charges and fired anti-submarine grenades unsuccessfully for almost three weeks in search of foreign intruders, Tero Harkonen, a Swedish seal expert, speculated that the anti-submarine hunt had been triggered by the play of young seals. "They can play, gush through the water, and even create foam on the surface," he explained. However, navy officials maintained that the navy had surrounded a foreign mini-submarine or another type of underwater vessel with nets. They declared that there had been many "reliable" sightings of suspected alien submarines and submarine activity in the area over the preceding two months, including air bubbles from a diver. The navy continued searching and dropping depth charges for another ten days before giving up the effort.
The Swedish Defense Ministry's recognition of error was partly
the result of a change in government: A different political party
had control in 1995. The recognition was also partly a result
of the dissolution of the Soviet Union. The Soviet Union had collapsed,
and its no-longer-so-secretive remnants no longer seemed capable
of the remarkable underwater feats that the Swedes had been attributing
to them. Indeed, during their 1995 review of earlier antisubmarine
hunts, the Swedes consulted Russian antisubmarine experts. The
Russian experts, the Swedes reported, agreed that the Swedes had
been pursuing submarines but denied that the submarines had been
Soviet ones.
This story illustrates three points. First, learning often cannot
occur until after there has been unlearning. Unlearning is a process
that shows people they should no longer rely on their current
beliefs and methods. Because current beliefs and methods shape
perceptions, they blind people to some potential interpretations
of evidence. As long as current beliefs and methods seem to produce
reasonable results, people do not discard their current beliefs
and methods (Kuhn, 1962). As Henry Petroski (1992: 180-181) put
it: "Technologists, like scientists, tend to hold onto their
theories until incontrovertible evidence, usually in the form
of failures, convinces them to accept new paradigms." Indeed,
the Swedish navy's experience shows that people may adhere to their current
beliefs and methods despite very poor results. Even after two
decades of abject failure, the leaders of the Swedish navy continued
to construe their organization's failures as the logical result
of an amateurish defense force from a small country competing
against a highly sophisticated defense force from a large country.
Surprisingly perhaps, technical experts may be among the most
resistant to new ideas and to evidence that contradicts their
current beliefs and methods. Their resistance has several bases.
Experts must specialize and their specialized niches can become
evolutionary dead-ends (Beyer, 1981). Because experts' niches
confer high incomes and social statuses, they have much to lose
from social and technical changes. Expertise creates perceptual
filters that keep experts from noticing social and technical changes
(Armstrong, 1985). Even as experts grow more perceptive within
their domains, they may be overlooking relevant events just outside
them.
Second, organizations make it more difficult to learn without
first unlearning. People in organizations find it hard to ignore
their current beliefs and methods because organizations create explicit
justifications for policies and actions. Also, people integrate
their beliefs and methods into coherent, rational structures in
which elements support each other. These coherent structures have
rigidity that arises from their complex interdependence. As a
result, people in organizations find it very difficult to deal
effectively with information that conflicts with their current
beliefs and methods. They do not know how to accommodate dissonant
information and they find it difficult to change a few elements
of their interdependent beliefs and methods. The Swedish sailors
who conducted the searches had been trained to interpret certain
sounds as a submarine and rising bubbles as a diver; they had
not been prepared for the sounds and bubbles made by animals.
A Swedish navy that had just spent three weeks dropping depth
charges and antisubmarine grenades in the belief that it had trapped
an intruder was not ready for the idea that it had been deceived
by playful young seals.
Tushman, Newman, and Romanelli (1986) characterized organizations'
development as long periods of convergent, incremental change
that are interrupted by brief periods of "frame-breaking
change." They said "frame-breaking change occurs in
response to or, better yet, in anticipation of major environmental
changes." However, even if abrupt changes do sometimes "break"
people's old perceptual frameworks, the more common and logical
causal sequence seems to be the opposite one. That is, people
undertake abrupt changes because they have unlearned their old
perceptual frameworks.
Third, unlearning by people in organizations may depend on political
changes. Belief structures link with political structures as specific
people espouse beliefs and methods and advocate policies (Hedberg,
1981). Since people resist information that threatens their reputations
and careers, it may be necessary to change who is processing information
before this information can be processed effectively. Thus, a
change in control of the Swedish government may have been essential
before the Defense Ministry could concede the possibility of errors
in the conduct of antisubmarine hunts. A change in control of
the Soviet Union may have been essential before the Swedes could
allow the possibility of Russian vulnerability or truthfulness.
Top managers' perceptual errors and self-deceptions are especially
potent because senior managers can block actions proposed by their
subordinates. Yet, senior managers are also especially prone to
perceive events erroneously and to overlook bad news. Although
their high statuses often persuade them that they have more expertise
than other people, their expertise tends to be out-of-date. They
have strong vested interests, and they know they will catch the
blame if current policies and actions prove wrong (Starbuck, 1989).
There is, of course, every reason for people to suspect that current
beliefs and methods are wrong. Not only do new discoveries convert
good current beliefs and methods into no-longer-good ones, but there
is normally no reason to trust that current beliefs and methods
ever were good. The QWERTY keyboard provides an on-going reminder
of the persistence of poor methods (Gould, 1986). Although C.
L. Sholes had reasons for placing the keys in particular positions,
he designed QWERTY for a machine that differed considerably from
modern typewriters. The widespread adoption of QWERTY was fostered
by a highly publicized contest between two typists in 1888. Frank
E. McGurrin, the typist who used QWERTY, won by a large margin.
But McGurrin had memorized the keyboard and could type without
looking at his fingers whereas his competitor had to look at his
keyboard in order to find the right keys.
2. How People Can Foster Unlearning
"There is not the slightest indication that [nuclear] energy will ever be obtainable. It would mean that the atom would have to be shattered at will." Albert Einstein, physicist, 1932.
Einstein later wrote to President Roosevelt to urge that the United
States attempt to construct an atomic bomb.
This article suggests ways to facilitate unlearning. Since the
essential requirement for unlearning is doubt, any event or message
that engenders doubt about current beliefs and methods can become
a stimulus for unlearning. There are at least eight viewpoints
that can help people turn events and messages into such stimuli.
People can start from the premises that current beliefs and methods
are "not good enough" or "merely experimental."
They can turn surprises, dissents, and warnings into question
marks. They can listen carefully to the ideas of collaborators
and strangers. They can look for feedback paths and they can try
to synthesize divergent interpretations of phenomena.
"It isn't good enough."
Dissatisfaction is probably the most common reason for doubting
current beliefs and methods. But dissatisfaction can take a very
long time to produce results.
Robert Fulton launched the first commercially successful steamboat
in American waters in 1807 (Petroski, 1996; Ward, 1989). In 1816,
the boiler on a steamboat exploded and injured or killed nearly
all of the boat's crew. Over the next thirty years, boilers exploded
on 230 American steamboats. Thousands died; more were maimed.
Some people said these explosions were "acts of God;"
others attributed them to demons in the boilers; still others
theorized that high temperatures decomposed water into hydrogen
and oxygen, which then recombined explosively. In 1824, the inventors
and mechanics of Philadelphia formed the Franklin Institute and
this Institute sought to study the causes of boiler explosions.
By 1830, boiler explosions had become the Institute's highest
priority, but it lacked the funds to conduct experiments so it
sent out questionnaires. However, a particularly bloody explosion
in 1830 induced Congress to ask the Secretary of the Treasury
to investigate, and he granted funds to the Franklin Institute.
This $1500 was the first research grant awarded by the U.S. government.
The Institute's experiments disproved some theories about boilers
and showed some unexpected effects. It submitted a report on explosions
to Congress in 1836 and a report on boiler materials in 1837.
In April 1838, a steamboat exploded and killed about 200 people,
which motivated Congress to pass the Steamboat Act of 1838. Unfortunately,
the law required inspection of boilers but it did not provide
inspectors and it did not require that a steamboat be removed
from service if it failed an inspection. Many more steamboats
exploded. Finally, in 1852, Congress set up a regulatory agency
with enforcement powers.
But the legislation and the regulatory agency focused solely on
steamboats, although boilers had also been exploding in factories.
Indeed, there were several hundred boiler explosions annually
and they continued into the twentieth century. The worst was a
steamboat explosion that killed 1200 to 1500 people in 1865. However,
by the mid-1880s, there was a general understanding that the explosions
were due to excessive pressures, defective materials, or inadequate
or malfunctioning equipment.
"It's only an experiment."
People who see themselves as experimenting are willing to deviate
temporarily from practices they consider optimal in order to test
their assumptions. When they deviate, they create opportunities
to surprise themselves. They also run experiments in ways that
reduce the losses failures would produce. For instance, they attend
carefully to feedback. They have fewer personal stakes in outcomes
looking successful, so they can evaluate outcomes more objectively.
They find it easier to alter their beliefs and methods to allow
for new insights. They keep on trying for improvements because
they know experiments rarely turn out perfectly.
For example, in 1964, the 3M Corporation began an exploratory research
program to develop new adhesives. Spencer Silver, one of the chemists
working on this project, later explained: "In the course
of this exploration, I tried an experiment to see what would happen
if I put a lot of it into the mixture. Before, we had used amounts
that would correspond to conventional wisdom. . . . If I had sat
down and factored it out beforehand, and thought about it, I wouldn't
have done the experiment. If I had really cracked the books and
gone through the literature, I would have stopped. The literature
was full of examples that said you can't do this"
(Nayak & Ketteringham, 1986: 57). The result was that Silver
found a radically new adhesive: It sticks to surfaces without
bonding tightly so it removes easily without leaving traces. It
was so unusual that Silver and others at 3M had great difficulty
seeing how it could be applied usefully. But it eventually spawned
an important new product line: Post-It note pads.
"Surprises should be question marks."
Events that violate expectations, both unpleasant disruptions
and pleasant surprises, can become opportunities for unlearning.
For instance, the Allies developed the tank during World War I,
and most army officers viewed the tank as lethargic support for
the infantry (Fleming, 1995). However, George S. Patton, the commander
of an American tank unit, had trained as a cavalryman, and he saw
the tank as able to perform the cavalry function of reconnaissance.
At the battle of St.-Mihiel, a wide no-man's land developed, and Patton
ordered a three-tank patrol to advance until it found the enemy
lines. When German cannons fired on the patrol, its commander,
Ted McClure, ordered his tanks to charge, with the result that
they routed the Germans and destroyed the cannons. This provided
a conceptual breakthrough for Patton, and subsequently other army
officers, by showing that tanks could make daring attacks.
Marcie Tyre and Wanda Orlikowski (1994) studied technological
adaptation in production processes. Sixty percent of the adaptation
occurred during the first 2.5 months after the introduction of
new processes, but 23 percent of the adaptation occurred during
a second 2.4-month spurt that started about eleven months after
the introduction of new processes. These later spurts were initiated
by events -- such as new equipment, new production requirements,
or new personnel -- that disrupted routine operations and stimulated
new thinking about the technology and its use. It took disruptions
to induce rethinking because users rapidly came to accept the
deficiencies and inadequacies of new technologies.
Too often, however, the analyses following disruptions extend
only to the immediate causes of the specific disruptions. If disruptions
are to affect unlearning strongly, people need to use them to
reveal weaknesses in their current beliefs and methods as well
as to stimulate improvements. Why didn't the original designs
anticipate the events that caused disruptions? Would organizational
changes or different engineering concepts have fostered more robust
designs?
The North American power grid seems to illustrate ineffective
unlearning. In 1964, the U. S. Federal Power Commission stated
that the North American electric-power grid could deal effectively
with a nuclear attack (Chiles, 1985). On November 9, 1965, one
of Toronto's power stations began having minor mechanical difficulties,
so Toronto began drawing more power from a station at Niagara
Falls. A relay at the Niagara Falls station incorrectly sensed
an overload and disconnected the overloaded transmission line
from the power grid. This switched 375 million watts onto four
other lines that were already near capacity. They too disconnected,
so 1.5 billion watts flowed onto two lines that fed into northern
New York. This power surge disconnected another connection between
the U. S. and Canada, so Ontario was both short of power and unable
to receive it. Circuit breakers clicked open throughout the eastern
U. S. and separated the power grid into subsystems. In a few areas,
the resulting blackouts lasted less than fifteen minutes. In New
York City, the blackout lasted thirteen hours. Although New York
City had enough generating capacity not only to sustain itself
but to supply power northward, the human dispatcher did not push
the right eight buttons quickly enough. Many generators were very
difficult to restart after all power had shut down because starting
them required external electric power.
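The mechanism behind such a cascade, an overloaded line tripping and dumping its load onto neighbors already operating near capacity, is easy to illustrate. The Python sketch below is only a toy model under assumed numbers: the even-split redistribution rule, the line capacities, and the loads are illustrative inventions, not data about the 1965 grid.

```python
# A toy model of cascading overload: a tripped line's load is split
# evenly among the surviving lines. All numbers are illustrative.

def cascade(loads, capacities):
    """Trip any overloaded line and redistribute its load until stable."""
    live = [True] * len(loads)
    changed = True
    while changed:
        changed = False
        for i in range(len(loads)):
            if live[i] and loads[i] > capacities[i]:
                live[i] = False          # the overloaded line disconnects
                changed = True
                survivors = [j for j in range(len(loads)) if live[j]]
                if not survivors:
                    return live          # total blackout
                share = loads[i] / len(survivors)
                for j in survivors:      # its load lands on the others
                    loads[j] += share
                loads[i] = 0.0
    return live

# Six lines, each already near capacity; one false relay trip (modeled
# as a sudden extra load on line 0) brings down the whole system.
loads = [95.0, 90.0, 92.0, 88.0, 91.0, 89.0]
capacities = [100.0] * 6
loads[0] += 20.0
print(cascade(loads, capacities))  # every entry False: a full cascade
```

In the toy model, as in 1965, losing one line would be survivable; what dooms the system is that every surviving line is already running near its limit.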
The blackout evoked controversy. Some argued that an integrated
power grid was inherently faulty; utilities should have weak ties
to prevent disruptions from cascading. Others argued that strong
ties enable utilities to accommodate disruptions, so the ties
should be strengthened and the grid expanded. The advocates of
stronger ties carried the day. Electric power companies organized
into nine "reliability regions", and much control was
transferred from humans to automatic systems.
However, in June 1967, an overloaded transmission line in Pennsylvania
initiated the second biggest blackout, which affected 13 million
people in four states. More procedural improvements followed,
but stronger ties have required more complex control equipment,
which has been more likely to fail or to produce unexpected results.
Wider-scale integration has meant that events can have consequences
thousands of miles away. There were more power outages during
1976 than during any previous year. Then on July 13, 1977, lightning
hit Consolidated Edison's transmission lines several times in
a few minutes, another human operator made another mistake, and
New York City again blacked out. More procedural improvements
followed, but there were more power outages during 1981 than during
any previous year.
"All dissents and warnings have some validity."
It is, of course, not literally true that every dissenter is right
or that every warning should be taken seriously. There are a few
loonies out there. But, for each loony, there are dozens of sensible
people who see things going wrong and try to alert others. Listeners
need to guard against hasty rejections of bad news or unfamiliar
ideas. At a minimum, dissents and warnings can remind people that
diverse viewpoints exist and that their own beliefs and methods
may be wrong.
Organizational hierarchies tend to block dissents and warnings.
Lyman Porter and Karlene Roberts (1976) reviewed studies indicating
that people in hierarchies talk upward and listen upward. They
send more messages upward than downward, they pay more attention
to messages from their supervisors than to ones from their subordinates,
and they try harder to establish rapport with supervisors than
with subordinates. The messages that do get through enhance good
news and suppress bad news (Janis, 1972; Nystrom and Starbuck,
1984). This bias becomes problematic because bad news is much
more likely to motivate people to change than is good news (Hedberg,
1981).
Elting Morison (1966) recounted how the U.S. Navy learned to
shoot much more accurately. In 1899, many gunners on five ships
fired at the hulk of a ship for five minutes and achieved only
two hits. Six years later, a single gunner fired at a small target
for one minute and made fifteen hits.
This improvement came from the efforts of Percy Scott, William
S. Sims, and Theodore Roosevelt. Scott, a British naval officer,
developed aiming techniques, gun sights, and gears that greatly
enhanced gunners' accuracy. Sims, an American naval officer, met
Scott, learned of his improvements, and tried them on his own
ship. Impressed by the results, Sims then began to write reports
to naval bureaus in Washington. First, the naval bureaus ignored
Sims' reports. Then, using incorrect logic and contrived data,
the Bureau of Ordnance rebutted Sims' reports. They proved with
mathematics that Sims' methods could not possibly work even though
he, Scott, and other officers were using them. After two years
of this rejection, Sims wrote to President Theodore Roosevelt.
Roosevelt listened and appointed Sims the Inspector of Target
Practice. In this position, Sims taught the U. S. Navy to shoot.
How can people decide whether to take dissents or warnings seriously?
Four rules seem sensible. First, assume that all dissents and
warnings are at least partially valid. Second, try to find evidence,
apart from the messages' contents, about the odds that messages
might be correct. For instance, do the sources of the messages
act as though they truly believe what they say? Are these sources
speaking of topics with which they have experience? Third, evaluate
the costs or benefits that would accrue if messages turn out to
be correct. Fanciful messages typically entail high costs or high
benefits; realistic messages likely entail low costs or low benefits.
Thus, it is the fanciful messages that most deserve attention.
Fourth, find ways to test the dissents and warnings that might
bring high costs or high benefits. Make probes to confirm, disconfirm,
or modify the ideas.
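As a rough illustration, the four rules can be turned into a simple triage score. The Python sketch below is one possible encoding; the credibility weights, the example warnings, and the priority function are hypothetical assumptions, not part of the argument above.

```python
# A minimal sketch of the four rules for triaging dissents and warnings.
# All names and numbers are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class Warning:
    claim: str
    source_has_experience: bool  # Rule 2: does the source know the topic?
    source_acts_on_belief: bool  # Rule 2: does the source act as if it's true?
    impact_if_true: float        # Rule 3: cost or benefit if correct

def priority(w: Warning) -> float:
    credibility = 0.2                 # Rule 1: assume partial validity
    if w.source_has_experience:
        credibility += 0.3
    if w.source_acts_on_belief:
        credibility += 0.2
    # Rule 3: weight by the stakes; a fanciful, high-stakes message
    # can outrank a plausible, low-stakes one.
    return credibility * w.impact_if_true

warnings = [
    Warning("Minor paperwork delay", True, True, impact_if_true=1.0),
    Warning("Our sensors misclassify animals", False, True, impact_if_true=50.0),
]

# Rule 4: probe the highest-priority warnings first.
for w in sorted(warnings, key=priority, reverse=True):
    print(f"probe: {w.claim} (priority {priority(w):.1f})")
```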
"Collaborators who disagree are both right."
Beliefs held by qualified observers nearly always have foundations
in some sort of truth. The most common problem is not to prove
that one set of beliefs is wrong but to reconcile apparent contradictions
by showing that they are not contradictions at all. These efforts
can lead everyone to new conceptualizations. They can also produce
some strange inversions.
In 1937, Hannes Alfvén wrote a theory about the origin
of cosmic rays (Alfvén, 1985). Showing that cosmic rays
could be caused by electromagnetic effects around double stars,
he pointed out that the known electromagnetic effects are not
strong enough to fill the entire universe with cosmic rays. Thus,
he conjectured, cosmic rays must arise in and be confined to a
single galaxy. When Alfvén's paper was rejected by the
most prominent physics journal, he wondered if this was because
the generally accepted view at that time was that cosmic rays
filled the entire universe. He published the paper in a much less
visible journal.
In 1948, Alfvén attended two lectures in which Edward Teller
argued that cosmic rays must arise in and be confined to one solar
system. Alfvén struck up an argument with Teller, and Teller
responded by inviting Alfvén to present his theory in Chicago.
Alfvén went to Chicago, but by the time he arrived there,
he had decided that Teller was right. Alfvén and Teller
co-authored a paper about the confinement of cosmic rays to one
solar system, and Alfvén went on to publish more articles
and a book about this theory.
After a few years, Teller changed his mind. He and almost everyone
else in astrophysics came around to agreeing with Alfvén's
original theory that cosmic rays must arise in and be confined
to a single galaxy. Alfvén won the 1970 Nobel Prize in
Physics partly for his early work on this topic. But Alfvén
himself did not believe in his single-galaxy theory: He continued
to believe the theory Teller had originated that cosmic rays arise
in and are confined to one solar system.
"What does a stranger think strange?"
It is usually easier to respect the views of collaborators than
those of strangers. Unfamiliar with current methods and unacquainted
with recent efforts, strangers are likely to make suggestions
that seem naïve or ignorant or foolish. Yet, new people often
introduce new perspectives. Although the newcomers may be less
expert than their predecessors, they are also free of some expectations
that their predecessors took for granted. Thus, strangers may
be able to see peculiarities that the indoctrinated cannot see
or they may be able to offer breakthrough suggestions. Indeed,
"reengineering" seems to be designed to exploit this
principle (Hammer and Champy, 1993).
During the 1970s, the Sony Corporation produced a small, portable,
monaural tape recorder (Nayak and Ketteringham, 1986). It was
named the Pressman because Sony expected reporters to use it to
record interviews. In 1978, the engineers who had developed the
Pressman tried to upgrade it to stereo sound. They succeeded in
squeezing the components needed for stereo playback into the Pressman's
chassis. But there was no room left for recording components,
so the engineers were left with a recorder that could not record.
Of course, a stereo Pressman would also have needed a second microphone
and second loudspeaker, presumably on extension cords. Unsure
what to do, the engineers dropped the project and used the unsuccessful
prototype to play background music in their laboratory.
Masaru Ibuka had founded Sony in 1946. Although Ibuka had retired,
he held the title of Honorary Chairman, and he had the habit
of occasionally roaming around the laboratories and factories.
One of these tours took Ibuka into the laboratory where the tape
recorder engineers were playing their unsuccessful prototype.
"And then one day, into our room came Mr. Ibuka, our Honorary
Chairman. He just popped into the room, saw us listening to this,
and thought it was very interesting." Ibuka said he thought
the small box was producing excellent sound. He asked the engineers
whether they had considered producing a machine that had no recording
capability. He also suggested that, if the machine had no speaker,
its batteries would last much longer. He had just
visited another Sony laboratory where someone had developed very
small headphones that might be mated to this non-recording recorder.
Engineers and managers in both the tape recorder division and
the headphone division saw no merit in Ibuka's idea. A tape recorder
that lacked both a speaker and recording capability was no recorder
at all, so no one would buy it. Headphones were merely a supplement
to loudspeakers; if a device had only headphones, only one person
could listen.
Undeterred, Ibuka went to Sony's real Chairman, Akio Morita, and
said: "Let's put together one of these things and try it.
Let's see how it sounds." Morita could hardly refuse
such a small request from his company's founder and his friend
of many years. So a machine was assembled, and both Ibuka and
Morita liked the way it sounded. They began carrying it with them
wherever they went -- on trips, to play sports -- to see how much
they liked it.
Morita decided that Sony should put the Walkman into production.
This made the managers of the tape recorder division quite unhappy
because, as they saw it, they were being ordered to produce an
ineffective device that would almost certainly lose money. With
the new lightweight headphones, it would cost $249. Not only was
this more expensive than tape recorders with speakers that could
record, but the expected teenage consumers could not possibly
spend more than $170. The marketing managers said bluntly, "This
is a dumb idea." Morita declared that the price would be
$165, and he told the tape recorder division to make 60,000 of
them.
The managers of the tape recorder division judged that they were
being commanded to lose $35 per unit sold. "There was no
profit. The more we produced, the more we lost." They secretly
decided to produce only 30,000 units and they allotted marketing
a budget of only $100,000.
Sony sold almost no Walkmans during the first month after the
product's introduction. Then sales picked up, and during the third
month, sales rocketed . . . until Sony ran out of inventory. That
was when Morita found out that the tape recorder division had
produced only 30,000 instead of 60,000. The tape recorder division
quickly corrected its error. Six months after the product's introduction,
Sony was producing and selling 30,000 units per month.
During the fourth month after the Walkman's introduction, Sony
began designing the Walkman II -- much smaller, with better sound
and longer battery life. Sony planned its production for 200,000
units per month.
"All causal arrows have two heads."
People can use thought processes that tend to disclose and challenge
their tacit assumptions. One useful heuristic is to insist that
all causal paths carry influence in both directions: Whenever
one perceives that A affects B, one should also look for ways
in which B feeds back and affects A. There are some causal paths
that do not carry influence in both directions. However, one-directional
causation is rare because systems that can converge toward equilibrium
must incorporate feedback. Searching carefully for these feedback
paths can lead one to see previously overlooked causal paths.
For example, Toyota developed the concept of a Just-In-Time inventory
system by inverting the causal flow. In the traditional view,
production converts raw materials into finished goods. A plant
turns raw materials into components that feed into in-process
inventories, and the plant produces finished products by drawing
components from inventories. The finished products go into finished-goods
inventories, not directly to customers. Customers must buy from
the finished-goods inventories. Thus, analysts view production
as flows of materials through stages of conversion; inventories
uncouple these consecutive stages.
According to Toyota's Taiichi Ohno, "we reversed our thinking
and considered the production process in terms of backward flow"
(Nayak and Ketteringham, 1986: 210). What flows backward is information
about customers' desires. When customers select finished products,
they create vacancies in the finished-goods inventory. As finished
products fill these vacancies, they remove components from the
in-process inventories. The inventory vacancies created by withdrawn
components convey information about the finished products that
customers want. The inventory vacancies cascading through the
production process automatically decompose customers' desires
into components and ultimately raw materials.
Inverting the causal flow led Ohno to see production as the conversion
of customers' preferences into demands for components and raw
materials. In this view, in-process inventories become barriers
that delay the flows of information. To speed this information
flow, Toyota set out to minimize its in-process inventories.
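A small sketch can make this backward flow concrete. In the toy Python model below, a sale creates a vacancy that propagates upstream stage by stage; the stage names, buffer sizes, and instant replenishment are illustrative assumptions, not a description of Toyota's actual system.

```python
# A minimal pull-system sketch: a customer withdrawal creates a vacancy
# that propagates upstream, so each stage produces only what the stage
# below it has just consumed. All names and quantities are hypothetical.

from collections import deque

stages = ["raw materials", "components", "finished goods"]
stock = {s: 3 for s in stages}  # small buffer at every stage

def sell_one():
    """One purchase pulls replenishment back through every stage."""
    demand = deque(["finished goods"])
    while demand:
        stage = demand.popleft()
        stock[stage] -= 1                    # withdrawal creates a vacancy ...
        upstream = stages.index(stage) - 1
        if upstream >= 0:
            stock[stage] += 1                # ... refilled from the stage above,
            demand.append(stages[upstream])  # passing the vacancy upstream

for _ in range(2):
    sell_one()
print(stock)  # {'raw materials': 1, 'components': 3, 'finished goods': 3}
```

Only the most upstream buffer shrinks here because replenishment is instantaneous; the point is that each stage acts on an actual withdrawal below it, not on a forecast.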
"The converse of every proposition is equally valid."
Dialectic reasoning is a generalization of two-directional causation.
Starting from a proposition (A affects B), one states the converse
proposition (B affects A) and then one insists that both the original
proposition and its converse are valid. The philosopher Georg
Hegel, who advocated this mode of reasoning, called the original
proposition the thesis, its converse the antithesis, and their
union, the synthesis. As with causal paths, not every thesis has
a valid antithesis and not every thesis can be synthesized with
its antithesis. But it is possible to apply dialectic reasoning
to almost all situations and the process of applying it helps
one to break free of tacit assumptions.
One can see dialectic reasoning in the work of Gideon Sundback,
who invented the zipper (Friedel, 1994). During the latter part
of the nineteenth century, the most common method of fastening
shoes was hooks and eyes. These were also used to fasten women's
skirts and men's trousers. But fastening them was slow work and
they did not stay fastened very well. The first zipper-like patents,
which emerged in 1893, proposed that a sliding "guide"
could mate hooks and eyes. These devices were rather complex and
they required precise assembly, so around 1904, their inventors
began attaching them to cloth tape that could be sewn into shoes
or clothing. The design, however, did not work well in that the
hooks and eyes tended to separate when the fastener was bent or
twisted.
The company that manufactured these devices hired Gideon Sundback
to improve their design. His first effort, although better than
its predecessors, had similar deficiencies and it was a commercial
failure. Around 1912, after pursuing improvements in the prior
design for six years, Sundback came up with a radically different
design. In it, a slide forced the beaded edge of a cloth tape
between two rows of metal clamps -- somewhat like a Ziploc fastener.
Thus, Sundback had replaced the proposition 'a fastener involves
hooks and eyes' with its antithesis 'a fastener has neither hooks
nor eyes.'
The antithetical design also had serious deficiencies -- the cloth
tape wore out after only a few uses. But optimistic backers formed
a new Hookless Fastener Company, and Sundback continued his experiments.
In 1913, he produced a design very like the modern zipper. In
it, the hooks had shrunk to small protrusions and the eyes had
closed until they were indentations. It synthesized hooks and
eyes with their absence, and it synthesized hooks with eyes. The
two sides of the fastener were composed of identical elements.
Theories of leadership afford an example of dialectic processes
operating on a large scale (Webster & Starbuck, 1988). Early
in the twentieth century, most managers and management theorists
asserted that organizations work best if they have firm superiors
and obedient subordinates. Some fortunate people, it was said,
had inherent traits that made them good leaders whereas the less
fortunate did not.
By the 1930s, this orthodoxy had elicited counter arguments: Barnard
argued that authority is something that subordinates grant rather
than something that superiors impose. Weber pointed out that organizations
may depersonalize leadership and that subordinates may think their
superiors lack legitimacy. The Hawthorne studies presented evidence
that subordinates produce more when they have friendly superiors.
Syntheses emerged during the 1950s. Some psychologists studied
democratic leadership; others documented the sharing of leadership
tasks among members of work groups; and still others analyzed
the distinctive personalities of different kinds of leaders. Bales
distinguished leaders' social roles from their task roles. Then
the Ohio State leadership studies decomposed subordinates' perceptions
of their superiors into two statistically independent dimensions
-- initiating structure and consideration. Initiating structure
embodied the essential properties of the leadership concepts of
1910, and consideration embodied the concepts of the 1930s. Thus,
antithetical views had become distinct dimensions of a complex
phenomenon.
3. Reprise
"I think there is a world market for about five computers." Thomas J. Watson, President, International Business Machines, 1943.
Watson later helped his son lead IBM's expansion in computers.
No one should be confident that their current beliefs and methods
are optimal. Optimality is unlikely. If beliefs seem accurate,
someone else is probably finding other beliefs equally effective.
If methods seem excellent today, better methods will appear tomorrow.
Thus, one is well-advised to remain ever skeptical. "It isn't
good enough" and "It's only an experiment" are
mental frameworks that help one stay constantly alert for opportunities
to improve. "It isn't good enough" reminds one to look
for more accurate beliefs or better methods. "It's only an
experiment" helps one to feel less committed to current beliefs
and methods.
Because current beliefs and methods bias information gathering,
signals from one's environment tend to support these beliefs and
methods. To obtain dissonant signals, one may have to be proactive.
Thus, one should try to turn surprises into question marks, should
respond to dissents and warnings as if they have some validity,
and should act as if collaborators' ideas are as deserving as
one's own.
It may be difficult to respect the views of strangers unversed
in current methods and unfamiliar with recent efforts. But strangers
can see errors or opportunities to which the indoctrinated are
blind.
One wanting to challenge current beliefs and to discover alternative
methods can apply two logical techniques. "All causal arrows
have two heads" helps one to look for neglected feedback
paths. "The converse of every proposition is equally valid"
helps one to reframe current beliefs within more general schemata.
"There is no reason for any individual to have a computer in their home." Ken Olson, President, Digital Equipment Corporation, 1977.
Five years later, DEC began to sell microcomputers.
Note: This article benefits from the insights of Raghu
Garud, John Hedberg, and John Mezias.
References

Alfvén, H. (1985). 'Memoirs of a dissident scientist.' In Y. Sekido and H. Elliot (eds.), Early History of Cosmic Ray Studies. Dordrecht: Reidel, 427-431.
Armstrong, J. S. (1985). Long-Range Forecasting: From Crystal Ball to Computer (2nd edn). New York: Wiley-Interscience.
Beyer, J. M. (1981). 'Ideologies, values, and decision making in organizations.' In P. C. Nystrom and W. H. Starbuck (eds.), Handbook of Organizational Design, Volume 2. Oxford: Oxford University Press, 166-202.
Chiles, J. R. (1985). 'Learning from the big blackouts.' American Heritage of Invention and Technology, 1(2): 27-30.
Fleming, T. (1995). 'Tanks.' American Heritage of Invention and Technology, 10(3): 54-63.
Friedel, R. (1994). 'The history of the zipper.' American Heritage of Invention and Technology, 10(1): 8-16.
Gould, S. J. (1986). 'The panda's thumb of technology.' Natural History, 96(1).
Hammer, M., and Champy, J. (1993). Reengineering the Corporation. New York: HarperBusiness.
Hedberg, B. (1981). 'How organizations learn and unlearn.' In P. C. Nystrom and W. H. Starbuck (eds.), Handbook of Organizational Design, Volume 1. New York: Oxford University Press, 3-27.
Janis, I. L. (1972). Victims of Groupthink. Boston: Houghton Mifflin.
Kuhn, T. S. (1962). The Structure of Scientific Revolutions. Chicago: University of Chicago Press.
Morison, E. (1966). Men, Machines and Modern Times. Cambridge, MA: MIT Press.
Nayak, P. R., and Ketteringham, J. M. (1986). Breakthroughs! New York: Rawson.
Nystrom, P. C., and Starbuck, W. H. (1984). 'To avoid organizational crises, unlearn.' Organizational Dynamics, 12(4): 53-65.
Petroski, H. (1992). To Engineer Is Human. New York: Vintage.
Petroski, H. (1996). 'Harnessing steam.' American Scientist, 84(1): 15-19.
Porter, L. W., and Roberts, K. H. (1976). 'Communication in organizations.' In M. D. Dunnette (ed.), Handbook of Industrial and Organizational Psychology. Chicago: Rand McNally, 1553-1589.
Starbuck, W. H. (1989). 'Why organizations run into crises . . . and sometimes survive them.' In K. C. Laudon and J. Turner (eds.), Information Technology and Management Strategy. Englewood Cliffs, NJ: Prentice-Hall, 11-33.
Tushman, M. L., Newman, W. H., and Romanelli, E. (1986). 'Convergence and upheaval: Managing the unsteady pace of organizational evolution.' California Management Review, 29(1).
Tyre, M. J., and Orlikowski, W. J. (1994). 'Windows of opportunity: Temporal patterns of technological adaptation in organizations.' Organization Science, 5(1): 98-118.
Ward, J. K. (1989). 'The future of an explosion.' American Heritage of Invention and Technology, 5(1): 58-63.
Webster, J., and Starbuck, W. H. (1988). 'Theory building in industrial and organizational psychology.' In C. L. Cooper and I. Robertson (eds.), International Review of Industrial and Organizational Psychology 1988. London: Wiley, 93-138.