Buxton, W. (1982). Human Perspectives on Technology and Learning. Keynote address delivered at the Conference on Telidon and Education, Sydney, N.S. Unpublished manuscript.

Human Perspectives on Technology, Learning and Teaching

This paper is about education and information technologies. It is a treatise based on a sleight-of-hand, in which we turn our unfailing ability to discover new ways of making the same mistake into a reaffirmation of our innate human creativity. This, in turn, forms the basis for an optimistic view of the future.

But for this optimistic vision to be realized, it is argued that we must become more critical and better skeptics. As a result of this combination of skepticism and optimism, we introduce a new discipline: "skeptimism."

Change is nearly always accompanied by mistakes. One of the main agents of recent change, and one which has provided the opportunity for some really inspired mistakes, is the "arrival" of the new information technologies. Many of these mistakes can be traced to our all-too-willing acceptance of the rhetoric which equated the introduction of these technologies with an improvement to society.

In casting a somewhat more skeptical eye, we see something quite different. Yes, the technology gives us access to data that we could not otherwise obtain. Yes, it allows us to have a daily-interest savings account. And yes, it allows much work to be done more efficiently and provides a support mechanism for many aspects of decision making.

But let us examine the support of decision making as an example.

Many studies give compelling evidence that these tools - which were designed to make work easier and more efficient - are often perceived by their users as degrading the quality of working life.

Why?

Because they often reduce that component of work which is perceived to be most "human": the component which exercises the human intellect in finding creative solutions to difficult problems.

Now, in this negative view of technology, there is the seed for some optimism. For what does it say? It says that there is a component of work which people value above all others: the one which engages their intellect and creativity in finding solutions to difficult problems.

But it also says that the technology, as currently applied, tends to reduce that very component. It is this second point which causes concern. But this concern is largely mollified if we recognize that the technology need not play such a negative role. In fact, examples such as computer music, art, and games demonstrate that the technology can serve as a catalyst to thought, creativity, enjoyment and gratification.

But if the technology has the ability to play such a positive role, why has this potential not been more widely realized?

The problem seems to be one of not knowing how best to develop and exploit innate human potential. But within whose sphere does this fall? Not the technologist's (at least not solely). Rather, yours and mine: professionals in areas like education, the library sciences and the arts.

Let us take educators as an example. What role can they play in exploiting the technology as a means to cultivate human creativity and development? Should they become programmers or engineers? The question is rhetorical.

We already have plenty of programmers, and yet the problems persist. No, their role is to develop that which they know best - education and epistemology - and to use the new technologies, when appropriate, to put that knowledge to best use.

Questions usually arise at this point, however, concerning how to determine what is appropriate, and how to put it into practice when one does not have "technological skills." But while some technological skills are required, they are not programming skills. Rather, what is required is the ability to evaluate these systems critically: to ask what they can and cannot do, and whether that fits one's own purposes.

My belief is that these skills are readily accessible. In considering educational applications, there is one key point to make: it is the teacher, rather than the student, who stands to learn the most from the use of computer-based technologies.

By this I do not mean learning about computers, but learning about learning. The reason lies in one of the great clichés of the trade: a computer only does what it is programmed to do.

The implication of this is that, in order to be used for education, the system must be programmed, and that program constitutes an explicit pedagogical theory! Now that theory may be fuzzy, well thought out, ad hoc, or irrational, but it is a theory nevertheless. The import of this observation is that it confronts us with the question "What is my pedagogical stance on the topic that I am programming?" At the same time, it provides us with a tool to actually test and evaluate any theories which we might have.
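To make this point concrete, consider a minimal sketch (hypothetical, written in modern Python purely for illustration; the policy names are invented). Even the smallest tutoring loop must commit to a policy for handling a wrong answer, and that policy is a pedagogical theory made explicit enough for a machine to act on:

    # A minimal tutoring loop. The choice of feedback policy below is a
    # pedagogical theory, stated explicitly enough for a machine to execute.

    def behaviourist_feedback(answer, correct):
        # Theory 1: errors are to be corrected immediately and directly.
        if answer == correct:
            return "Correct!"
        return "Wrong. The answer is " + correct + "."

    def constructivist_feedback(answer, correct):
        # Theory 2: errors are information; prompt the learner to reflect.
        if answer == correct:
            return "Correct! How did you arrive at that?"
        return "Not quite. What could you do to check your answer?"

    def drill(question, correct, feedback_policy):
        answer = input(question + " ").strip()
        print(feedback_policy(answer, correct))

    # Swapping the policy changes what the program "believes" about learning:
    drill("What is 7 x 8?", "56", behaviourist_feedback)

Neither policy is "the" right one; the point is that the programmer cannot avoid choosing, and the choice is laid bare in the code.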

We can now see how this perspective can help us in the practical sense.

First, in evaluating a prospective system, we can ask, "What is the pedagogical stance of this system?", "Is it in keeping with my own approach?", "How does it permit me to evaluate my teaching techniques?" and "Does it challenge me to reevaluate my own view of education?"

Our perspective now includes not only what is being taught, but how it is being presented and how it can be evaluated. There are alternatives, they do make a difference, and these are questions that we should ask of anything or anybody that we involve in the education of our children, employees or ourselves!

Seduced by the computer, we often forget that it is just one of many alternatives, including the blackboard, flash-cards, field trips, books and teachers. Each technology should be questioned as to its strengths and weaknesses and used accordingly.

The problem with information technologies is that the wrong things are often too easy, and the right things too hard. For example, it is easy to program "drill-and-practice" exercises that make rote learning even more boring than usual.

On the other hand, we can use computers to help students relearn something that they lose after about one year of traditional education: how to formulate a hypothesis and test it. Granted, building the simulations and tools that support hypothesis formation and testing is hard. But such tools can help to unleash the innate creativity and curiosity that school too often hammers into the ground.
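As a rough sketch of what this harder alternative might look like (again hypothetical, in Python, with invented rules and names), consider a tiny program that holds a hidden rule and invites the student to conjecture and test rather than recite:

    import random

    # A toy "hypothesis laboratory": the program holds a hidden rule about
    # whole numbers, and the student probes it with test cases before
    # committing to a guess. The activity of conjecture and test is the
    # point, not the arithmetic.

    RULES = {
        "multiples of 3": lambda n: n % 3 == 0,
        "even numbers": lambda n: n % 2 == 0,
        "numbers greater than 10": lambda n: n > 10,
    }

    def run_session():
        name, rule = random.choice(list(RULES.items()))
        print("I am thinking of a rule about whole numbers.")
        print("Type a number to test it, or 'guess' when you are ready.")
        while True:
            entry = input("> ").strip().lower()
            if entry == "guess":
                print("My rule was:", name)
                break
            try:
                n = int(entry)
            except ValueError:
                print("Please enter a whole number or 'guess'.")
                continue
            verdict = "fits" if rule(n) else "does not fit"
            print(entry, verdict, "the rule.")

    run_session()

Even this toy rewards the student for seeking counter-examples rather than confirmations, which is precisely the habit of mind that drill-and-practice leaves untouched.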

In the field of information technologies, what is easy is usually considered cheap, and what is hard, expensive. But if we look at the above example (or comparable examples of how information technologies are used in the workplace), shouldn't we reevaluate our economic analysis? Can our society really afford the "bargain" of uninspired and uninspiring technologies and applications? In contrast, the harder approach seems a bargain that we can't afford to miss or delay.

In conclusion, this has been a paper about activism. I hope it has made the point that it is both important and worthwhile to get involved. It is currently fashionable to talk about the "microelectronic/communications revolution" and to compare it, for example, to the industrial revolution. But there is one big difference between the two.

In the nineteenth century, nobody asked the general public how they wanted things to evolve, or provided them with the forum or funds to make their wishes known. Today, however, we are asked, and the forums exist. We are able and enfranchised to influence the directions that these technologies take. It is now up to us to demonstrate that we have progressed as much since the industrial revolution as the system has.