Cross-Cultural Issues in
Alaskan Education
Vol. II
CULTURAL CONSIDERATIONS
IN TECHNOLOGICAL INNOVATIONS
James M. Orvik
Center for Cross-Cultural Studies
University of Alaska, Fairbanks
(Ed. note: This paper was originally prepared for presentation at the 40th Annual Meeting of the Society for Applied Anthropology, Denver, March 22, 1980, and has been slightly revised for this volume.)
If Wendell Oswalt is to be taken seriously, this paper would be more appropriate for presentation at the Annual Meeting of the Society for Neglected Anthropology. Oswalt bemoaned the tendency for anthropologists to ignore modern technological influences. He thus speculated:

As the number of aboriginal groups declined, anthropological interest in technology began to lose much of its momentum. Most ethnographers have been reluctant to describe and analyze the technologies of peoples in the process of westernization, if only because imported manufactured goods often have replaced locally handcrafted artifacts within a relatively short span of time (Oswalt, 1976: 33-4).

If we ignore for the moment some of the notable exceptions to the above statement, it probably strikes a chord of recognition in many of us.
What I wish to discuss in this paper, however, has nothing to do with the frequency with which anthropologists study technology. Rather, I discuss the models that seem most often to be used by social scientists, not just anthropologists, to guide the study of technology. These models not only influence the selection of questions raised about technology, but guide the selection of answers as well.
First, what do I mean by technology? A much-maligned term these days, technology can raise a red flag for about half the people. For the other half, preceding “technology” with the adjective “appropriate” evokes a response of similar vigor, especially in Alaska.

The Random House Dictionary of the English Language defines technology as “1. the branch of knowledge that deals with industrial arts, applied science, engineering, etc. 2. the application of knowledge for practical ends, as in a particular field: educational technology,” for example. Funk and Wagnalls (1954) gives precedence to technology as “Theoretical knowledge of industries,” and secondarily, “the application of science to the arts.”
To the extent I understand these definitions, they reflect current popular usage. We are certainly used to equating the suffix -ology with the study of whatever precedes it, treating it as a body of knowable knowledge. On the other hand, I find such definitions awkward because of the infinite logical spiral they initiate. If technology is the knowledge of something people know how to do, so must there be a knowledge of that knowledge, which requires further knowledge, and so on. What is needed is a definition that doesn’t start with knowledge about knowledge, but simply assumes certain abilities to have existed and proceeds to the product. Oswalt (1976), for example, defines technology “as all the ways in which people produce artifacts (p. 33).” One is then freed by this definition to study the “technology” of something or someone. The important thing is that this definition rightly implies all humans are technological. More than that, Oswalt is essentially of the opinion that we are what we make:

My contention is that man is first and foremost a technological animal and that the major distinctions among lifeways should be made on the basis of manufactured forms (Oswalt, 1973: 21).
One needn’t agree with Oswalt to see the distinction between his definition of technology and the connotations prevailing in the socio-political climate of today. Clearly, what most people have in mind is some nonspecified concept of Western technology, usually with the assumption that applied science is its basis.

Ferguson, a historian and “curator of technology,” made a number of observations that help bridge the obvious gap between the latter view, which mixes the sciences with technology to produce Western technology, and the former view of Oswalt’s, which includes all human productive endeavors so broadly as to be nearly useless. Ferguson first states:

This scientific age too readily assumes that whatever knowledge may be incorporated in the artifacts of technology must be derived from science. This assumption is a bit of folklore that ignores the many nonscientific decisions, both large and small, made by technologists as they design the world we inhabit (1977: 827).
The point I wish to make is that the distinction that creeps into thinking based on this folklore is not only a false one, but one that helps explain the apparent neglect of technology by anthropologists. On the other hand, the same false distinction, between Western and anyone else’s technology, because it confounds science with technology, has encouraged an entrepreneurship of technology that people, anthropologists especially, find increasingly odious.
Alaska seems especially attractive to technology entrepreneurs, perhaps because there are problems posed by the natural environment that challenge the modern technologist’s sense of adventure. More likely, because great sums of money are spent annually to solve Alaska’s rural cross-cultural “problems” by technological means, there are more than enough vendors to go around. The example that comes most readily to mind is the rapid, almost revolutionary development of telecommunications in rural Alaska. For over four years the Alaska State Legislature, reflecting for the most part rural interests, has subsidized the delivery of entertainment T.V. to twenty-two small villages throughout the state. Last year the legislature appropriated ten million dollars to expand the program to every village with twenty-five or more people in it. The Governor vetoed all but a modest expansion of the program. Knowing the political impossibility of discontinuing their hold on this tiger’s tail, the legislature has gently asked other state agencies to develop less entertaining uses for the state’s extensive investment in primetime T.V.
The State Department of Education has responded to this gentle pressure with considerable enthusiasm. Under that agency, numerous uses of the state’s satellite telecommunications network have been designed to take advantage of the “free” satellite transponder time already leased by the state under its original program. There are nonsatellite educational technology applications resulting from the momentum as well. Clearly we are in something of a development spiral in which telecommunications technology (the kind some people don’t like) has proliferated under what may well be a self-justifying ritual.
In Alaska, perhaps elsewhere as well, we can identify two main models of thinking used to evaluate these rapid technological developments: the military model and the medical model. The first I call the military model, not because anyone is at war, but because the modern military is the embodiment of the systematic, planned application of science to technology. If Gilbert and Sullivan were alive today, they most certainly would create a “Modern Major General” pattering on about computer-based management systems. He might sing something like:
Toward General Systems Theory
I’m ‘Bang for Buck’ in attitudes.
I’ll plan your life, remove all strife,
With systematic platitudes,
Like ‘Maximize your minimum
for optimum complexity.’
And if you will abide that swill,
I’ll pluck you from perplexity.
Postman’s (1976) term for complete faith in planning is “systemaphilia,” a morbid attraction to certainty. When applied to complex problems such as waging a war or designing a curriculum, systemaphilia “arises from the assumption that human beings are sufficiently clever, knowledgeable, and multiperspectived to design complete and just about perfect systems of human activity.” Planning and evolution by definition contradict one another; they are mutually exclusive processes.
I won’t go so far as to say that planning is “unnatural” in the dark, perverse sense, but it is unnatural in the literal sense. Both planning and evolution refer to possible ways things can get done. One distinction may be that planning is what brings change under intentional human control. My contention is that planning merely reorganizes intentional human control by concentrating it in the hands of fewer humans. As opposed to the planning of technology, the evolution of technology has by its nature been driven by the decentralized, diffused activity of individuals in face-to-face interaction with other individuals. The control is still there, but it is organized in a way that never allows the authority to implement technology to exceed the willingness of the receiver to refuse it. Information is the key, not just ethnographic information but the kind of intelligence that notices what people are trying to do with their everyday lives. Such information is created by direct observation, not through “needs assessments.” The assessment of needs is itself a technology authoritatively implemented without regard to its cultural appropriateness.
To illustrate the point further, in one of its planning documents the Alaska State Department of Education has proposed “A technology-based model for individualized instruction.”

The instructional model described is designed to maximize development of students’ decision-making ability and their role in the instructional design process. The model is predicated upon a systematically integrated system of learning matrices. The matrices are carefully developed in close cooperation with the student and are designed to provide a meaningful, relevant education which is sensitive to the values of both the student and the school (Nelson, 1979:ii).
It is clear that what is desired by the author is a totally managed educational environment in which teaching functions are conceptualized “as analogous to those of a ‘learning manager,’” connoting “greater precision in the design and development of learning matrix networks for individual students.”
If the reader is still not convinced of the faith Nelson (and DOE, by implication) has in systems, the following is offered:

The process concept inherent within the advisor’s management function also strongly supports the development of a systematic, data-based, decision process. This single concept -- systematic, data-based decision-making -- is virtually revolutionary in education and will almost single-handedly ensure the delivery of a meaningful, relevant education (Nelson, 1979:46).
My objective here is not to evaluate the completeness, consistency, or even the desirability of this system. I have only one criticism: the system presupposes the compliance of the student (not to mention the teacher) to participate. By not leaving room to say no to the system, students and teachers have only passive-aggressive means at their disposal by which to resist this particular form of perfection. As such, this is a good example of authority to implement technology which exceeds the willingness of the receiver to refuse it. The authority referred to here is not the authority to coerce but to seduce, by the systematic employment of self-justifying features, the principal such feature being “contingency management.”

One technique which has proven to be highly successful, in motivation management, has been contingency management. This refers to a systematic approach for assuring the availability of various kinds of reward, contingent upon the satisfactory execution of each desired performance (Nelson, 1979:48, emphasis added).

Who could resist such a prospect? That’s the problem.
The second model is the one currently preferred by most anthropologists. I call it the medical model because it treats technology as a contagious disease to which minority cultures are particularly susceptible. The differences between the military model and the medical model are sharp and well focused. I won’t dwell at length on the features of the medical model, because they self-evidently contrast with the features of the military model. Where one seeks planned perfection, the other analyzes “unplanned consequences of planned social change.” Where one seeks to predict and control outcomes, the other prefers to say “I told you so.” Both models, however, share a common feature. They both reflect one of the more persistent residues of modern folklore: that technology is a product of science. Ferguson points out that “this scientific age too readily assumes that whatever knowledge may be incorporated in the artifacts of technology must be derived from science” (Ferguson, 1977:827).
The belief in a mythological relationship between science and technology has important educational implications. According to Ferguson, such a belief de-emphasizes the role of nonverbal visualization:

Pyramids, cathedrals, and rockets exist not because of geometry, theory of structure, or thermodynamics, but because they were first a picture -- literally a vision -- in the minds of those who built them.

He questions the long-term value of the systematic reduction in status of aspects of education related to visualization in engineering design courses in favor of more analytical methods:

Nonverbal thinking, which is a central mechanism in engineering design, involves perceptions, the stock-in-trade of the artist not the scientist. Because perceptive processes are not assumed to entail ‘hard thinking,’ it has been customary in engineering curricula to consider nonverbal thought among the more primitive stages in the development of cognitive processes and inferior to verbal or mathematical thought (p. 834).
The implications of Ferguson’s observations for education, especially for cross-cultural education, are significant. First, there is ample evidence within the dominant culture that the variety of cognitive styles represented in the general population of students is systematically ignored in education curricula (Messick, 1970; Witkin, 1973). If this is true within the dominant culture, it may be even more so in situations where dominant culture education models are transferred to a radically different environment where cognitive differences may be even more pronounced (Kleinfeld, 1973; Witkin, 1967). Witkin and Berry (1975) have done research suggesting that the mode of interaction a culture characteristically has with its environment exerts a powerful influence over how members of that culture process information in order to survive.
It follows that a culture’s technology is also a result of the person-environment interaction. We face the prospect of a rural Alaskan village curriculum based on increasingly verbal information foundations. It should surprise no one that one “needs assessment” after another gives “language deficiency” top priority among rural education problems to be solved. I contend here that any need to enhance nonverbal visualization processes in order to solve community problems never had a fair chance to be revealed. The methods typically used to assess needs -- primarily verbal, primarily system-based -- preclude our ability even to raise such needs as questions. It isn’t a fair fight.
And yet we now have sufficient reason to be concerned that in the name of “systematic planning” we have the potential to exclude the very educational processes needed to foster intellectual self-sufficiency. Intellectual self-sufficiency implies a deep understanding of and adaptability to the immediate context. But intellectual self-sufficiency also implies independence from the constraints of science as folklore. Systemaphilia is science folklore run amok, in social science no less than in “hard” science. Curriculum design that is overly dependent on verbal descriptions and verbal specification leaves untouched those human capacities by which people created the world in which they live.
The dilemma implied by my conceptualization of the military v. the medical model lies in its erosive effect on intellectual self-sufficiency. The proliferation of sophisticated Western technology may be a Faustian bargain, but attacking it by means of the medical model is nothing less than Quixotic. In their extreme forms, both approaches are bound to create more, rather than less, dependency on physical, economic, and intellectual resources outside the isolated environments in which the members of Alaska’s indigenous minority groups have chosen to live. The question is how to make potential technological innovations self-evaluating within the environments they are supposed to improve.
I have stated already my lack of faith in the ability of the military model to plan away the unwanted features of technology. I have little more faith in what I have called the medical model because it relies heavily on a priori assumptions that outside innovations represent cultural discontinuities too great to be of benefit, followed by post hoc evaluations affirming the consequent. Nor will a laissez-faire position be of any help, because unnecessary technology is not self-rejecting. That is, “natural” evolutionary forces will not cause an inappropriate technology to fall under its own weight as long as the entrepreneurs of technology, local or otherwise, continue to find their work profitable.
A necessary part of the answer is to shift the locus of control to local entrepreneurship. However, even though this shift is a necessary condition, further conditions need to be explored before sufficient safeguards against the unwanted effects of technology can be realized.
Does the answer lie in advocating a conservative policy about technology? That is, should we, as social scientists, merely encourage Native constituencies to exercise prudence in the face of the hard sell? Possibly, but I think in the long run such a policy would foster a climate of distrust rather than cooperation. Local entrepreneurship is already on the increase through the provisions of the Alaska Native Claims Settlement Act. An air of across-the-board conservatism would erode the conditions by which to gain the experience to make sound technology investments.
There is considerable evidence that in Native communities the consumption of technology is highly selective (Orvik, 1977:171). Telecommunications demonstrations that have been organized to allow individuals a great deal of choice of when and when not to participate have shown community-specific patterns of consumption. More specifically, it appears that aspects of technology based on the presumption of a cultural deficit, in this case standard English oral language development, are particularly subject to polite rejection by Native consumers.
The latter finding is perhaps the most important clue I can offer an “applied anthropologist” who is searching for a useful alternative to the two models I have presented in this paper. Ethnographic insight, tempered with a fair-minded appreciation for the potential value of technological innovations, may be, in the long run, the most appropriate and realistic preventative anyone could prescribe.
REFERENCES
Random House dictionary of the English language. New York: Random House, 1975.

Ferguson, E. S. “The mind’s eye: Nonverbal thought in technology.” Science 197: 827-36, 1977.

Kleinfeld, J. “Intellectual strengths in culturally different groups: An Eskimo illustration.” Review of educational research 43: 341-59, 1973.

Messick, S. “The criterion problem in the evaluation of instruction: Assessing possible, not just intended, outcomes.” In The evaluation of instruction: Issues and problems, M. C. Wittrock and D. Wiley, eds. New York: Holt, Rinehart & Winston, 1970.

Nelson, F. G. A general description of a technology-based model for individualized instruction. Juneau, Alaska: Alaska Department of Education, Office of Planning and Research. Mimeo, 1979.

Orvik, J. M. “ESCA/Alaska: An educational demonstration.” Journal of Communication 27: 166-72, 1977.

Oswalt, W. H. Habitat and technology: The evolution of hunting. New York: Holt, Rinehart and Winston, 1973.

_______. An anthropological analysis of food-getting technology. New York: John Wiley and Sons, 1976.

Postman, N. Crazy talk, stupid talk. New York: Delacorte Press, 1976.

Standard dictionary of the English language. New York: Funk and Wagnalls, 1954.

Witkin, H. A. “A cognitive style approach to cross-cultural research.” International journal of psychology 2: 233-50, 1967.

_______. “The role of cognitive style in academic performance and in teacher-student relations.” (RB-73-11). Princeton, New Jersey: Educational Testing Service, 1973.

Witkin, H. A. and J. W. Berry. “Psychological differentiation in cross-cultural perspective.” Journal of cross-cultural psychology 6: 4-87, 1975.