
Philosophy, activism, and computer and information specialists

Ubiquity, Volume 8, Issue 45 (November 2007) | BY Paul T. Durbin


In the decade before I retired from the University of Delaware Philosophy Department, I got to know a number of computer and information specialists in our Instructional Technology program. They were helping me, among other things, to put online a problem-based-learning (actually case-based) course on Contemporary Moral Problems, an innovative distance-learning venture that was also interactive, involving dialogue among the students and with me as moderator. But at the time I never got into any discussions with them about information technologies and ethics.

However, on at least three occasions before and after that period, I did write about the topic.

In 1992, in my book Social Responsibility in Science, Technology, and Medicine, I included a chapter inviting computer professionals to join colleagues such as those in Computer Professionals for Social Responsibility in dealing with social problems connected to work in the computer and information fields. The types of problems I referred to there had been summarized in a 1985 report of the Office of Technology Assessment (United States Congress), which said the problems had "outstripped the existing statutory framework for balancing interest in civil liberties and law enforcement." Among them were computer usage monitoring, electronic mail monitoring, and cellular phone and satellite interceptions.

Computer Professionals for Social Responsibility publications claimed that its members had lobbied Congress, alerted the media, and watchdogged the efforts of the Federal Bureau of Investigation (FBI), among other activities. The book is an appeal to technical professionals of various sorts, not only computer and information specialists, to get involved in such activism, in greater numbers than had done so up to then, on what were widely perceived as technosocial problems.

Ten years later, I was invited to participate in a workshop at the University of Salamanca in Spain that focused, among other things, on problems of privacy in the information age. In an article based on my talk there, which appeared a few years later (2005), I again addressed social problems associated with computer and information specialists. In addition to the privacy theme of the workshop, I addressed issues connecting computer work to the war on terror; the September 11, 2001, attacks on the World Trade Center in New York and the Pentagon in Washington, D.C., had occurred just months before my talk. On the privacy issue, I could again refer to a number of activist groups: those that Lawrence Lessig, a legal expert who had written a great deal about computer issues, said he had worked with, including the Center for Democracy and Technology and the Electronic Frontier Foundation. On issues such as encryption and the war on terror, I had fewer activist organizations to refer to, but the message was the same: I hoped that more computer experts would get involved in activism than had done so up to that point. Activism, of course, is especially tricky when one is dealing with something like the war on terror; but my message (not very hopeful, I admit) remained the same.

Still more recently (2007), I included a discussion of similar issues in my online book, Philosophy of Technology: In Search of Discourse Synthesis. There, because I was writing a history and not sermonizing to computer professionals, I limited myself to chiding the Society for Philosophy and Technology's leading expert on computer ethics, Deborah Johnson, for mostly restricting herself to recommending that computer ethics be included in the training of computer professionals. However, I had to admit that she went at least one step beyond that: "The bottom line is that all of us will benefit from a world in which computer professionals take responsibility; ideally we would have all computer professionals working to shape computer systems for the good of humanity." In Computers, Ethics, and Social Values (co-edited with Helen Nissenbaum), Johnson had actually gone still another step beyond that vague wish, including in the anthology an essay by a renowned computer activist, Terry Winograd, in which he said, "We need to create an environment in which the consideration of human values in our technological work is not a brave act, but a professional norm." Winograd mentions in his essay a number of his own efforts at activism, including efforts involving (once again) Computer Professionals for Social Responsibility.

In making these calls for more activism in the name of social responsibility, I fall back on my personal philosophy, which I base on the philosophy of the American Pragmatists, especially John Dewey and G. H. Mead. I have written on this topic numerous times, and in recent years I have distilled the message to its basic elements.

First, I believe, with Mead, that ethics is not some abstract system of deontological or utilitarian rules, but the community attempting to solve its problems in as intelligent a way as possible. (Hans Joas, in an intellectual biography of Mead, has pointed out this contrast in Mead's thought between his community-action view and both deontological and utilitarian ethical theories.) Mead wrote, almost a century ago, as follows:

The order of the universe that we live in is the moral order. It has become the moral order by becoming the self-conscious method of the members of a human society. The world that comes to us from the past possesses and controls us. We possess and control the world that we discover and invent. And this is the world of the moral order. It is a splendid adventure if we can rise to it.

In my opinion, society, especially democratic society, expects us to rise to the challenge. Democratic societies, at least, have a right to expect that experts from all parts of academia and all the professions will help them. I would even go so far as to say that there is at least an implicit social contract between professionals and democratic societies, one that gives rise to the expectation that professionals will shoulder their responsibility to improve the societies in which they live and work.

In these terms, some people talk about an obligation to shoulder social responsibilities. But I prefer not to use that terminology. As I have written elsewhere:

That's not what I think is called for here. The problems calling out for action in our troubled technological world are so urgent and so numerous—from global climate change to gang violence, from attacks on democracy to failures in education, from the global level to the local technosocial problems in your community—that it isn't necessary to talk about obligations, even social obligations. No, it's a matter of opportunities that beckon the technically trained—including philosophers and other academics—to work alongside those citizens already at work trying to solve the problems at hand.

I am an optimist, but not a blind optimist. I am happy that there are computer professionals who are activists, joining with others to solve the technosocial problems that vex our society, including the computer and information professions. And I will be happier still if more join their ranks. But I also recognize that many will not, that many will remain satisfied that they are doing their best if they simply do their jobs well.

But the problems remain, and so do the opportunities to do something about them. And I am optimistic enough to hope, like Mead, that some computer and information specialists will rise to the challenge.


REFERENCES

First, things I have written:

Social Responsibility in Science, Technology, and Medicine. Bethlehem, PA: Lehigh University Press, 1992. See chapter 8.

"Ethics and New Technologies." In F. Adams, ed., Ethical Issues for the Twenty-First Century. Charlottesville, VA: Philosophy Documentation Center, 2005. Pp. 37-56.

"Philosophy of Technology: In Search of Discourse Synthesis." Available online from the Society for Philosophy and Technology, in its journal Techne; see spt.org, under journal, 10:2, 2007. See chapter 20.

Chapter 5 in Philosophy of Technology: 5 Questions, ed. Jan-Kyrre Berg Olsen and Evan Selinger. Automatic Press, 2007. Pp. 45-54.


Other references:

Joas, Hans. G. H. Mead: A Contemporary Re-Examination of His Thought. Cambridge, MA: MIT Press, 1985.

Johnson, Deborah, and Helen Nissenbaum, eds. Computers, Ethics, and Social Values. Englewood Cliffs, NJ: Prentice-Hall, 1995.

Lessig, Lawrence. Code and Other Laws of Cyberspace. New York: Basic Books, 1999.

Lessig, Lawrence. The Future of Ideas: The Fate of the Commons in a Connected World. New York: Random House, 2001.

Mead, G. H. "The Moral Philosopher and the Moral Life." In A. Reck, ed., Selected Writings: George Herbert Mead. Indianapolis, IN: Bobbs-Merrill, 1964. Pp. 6-24.

Office of Technology Assessment (United States Congress). Electronic Surveillance and Civil Liberties. Washington, DC: U.S. Congress Office of Technology Assessment, 1985.


About the Author
Dr. Paul T. Durbin is emeritus professor of philosophy and environmental policy, retired from the University of Delaware and currently living in Spain. He has recently published two books online, both available from his University of Delaware website: http://www.udel.edu/Philosophy/sites/pd/writing.html. Details about Professor Durbin are available at http://www.udel.edu/Philosophy/sites/pd/. His latest publications are "Introducing Philosophy Pragmatist Style: An Essay" (2006) and "Philosophy of Technology: In Search of Discourse Synthesis," Techne 10:2 (2007), http://scholar.lib.vt.edu/ejournals/SPT/v10n2/pdf/.




Source: Ubiquity Volume 8, Issue 45 (November 13, 2007 - November 19, 2007)


