
Debating the morality behind software development

IBM's Grady Booch says the days when developers could dash off code without considering the larger implications are--and should be--coming to an end.

Charles Cooper, Former Executive Editor / News
Charles Cooper was an executive editor at CNET News. He has covered technology and business for more than 25 years, working at CBSNews.com, the Associated Press, Computer & Software News, Computer Shopper, PC Week, and ZDNet.
In the long history of software development--let's loosely mark the starting point around the time of the ENIAC--code writers have dealt with myriad technical and business challenges. It's also fair to say they haven't had to confront questions of morality or ethics about how governments later deploy their finished work.

Until now.

Grady Booch, one of the creators of the Unified Modeling Language, says those days of splendid isolation are--and should be--coming to an end.

Booch became the first chief scientist at Rational Software when the company was founded in 1981. He kept the job after IBM bought the company in 2003, when he was also elevated to the rank of IBM Fellow.

Maybe it's the freedom that comes with possessing a small fortune, or perhaps it's just in his DNA to make waves, but Booch relishes the "voice in the wilderness" mantle--both inside and outside the technology world's largest corporation. CNET News.com spoke with Booch about his ideas on software and ethics during a recent swing through the West Coast.

Q: You've gone on record talking about this question of morality in software. I didn't think one could classify software as moral or immoral. What's behind your thinking?
Booch: Even though what we're doing is deeply technical stuff, there are ethical, moral implications about what we do. And it's not just in our sciences--look at the struggles the physicists of the '40s and '50s had dealing with their ability to unlock the secrets of the universe.

You're talking about nuclear power?
Booch: I'm talking about nuclear power and not just nuclear power, but also nuclear weapons and the like. It's our ability to unlock these secrets of the universe for either good or bad.

And so?
Booch: And so there were great discussions then--and even today--to the effect that I may have the ability to do these things, but should I do these things? The same thing is true in software systems.

But those are technical issues, then.
Booch: They're not.

But they don't have anything to do with morality.
Booch: I'm leading up to where the morality issue enters this (development) ladder. It's the point where it's not just a matter of whether we can build or want to build, but whether we should build.

Here's an example. London's installing more video cameras per square mile on the street than anybody else. All right, not a lot of software there. But what happens when they couple that with facial recognition software so I can actually track individuals as they go through the city?

But that's not a question that the software developer gets presented with. That's something for the city of London to consider based upon its needs.
Booch: Yes, but at the ultimate level, the software developer can say, "Do I want to actually build a system that potentially could violate human rights?"

What software developer do you know actually thinks about that when he or she sits down at the keyboard?
Booch: I know many. There is a group called Computer Professionals for Social Responsibility, and many of its members think about that kind of thing. The group was formed to deal with the social issues facing developers. Do I, as a developer fresh out of college, go to work somewhere in Silicon Valley on a benign business application? Or do I work for some defense contractor? That's a moral decision a person has to make about how to use his or her skills.

For the sake of argument, I also could say that while Google is a for-profit corporation, it's been involved in China and some people have raised questions in connection with the company's policies. So when some freshly minted engineering candidate out of Berkeley decides where to apply for a job, does Google then get put on par with the military as far as these moral questions are concerned?
Booch: That's a decision that person has to make. The issues you raise are philosophical ones. Let's say I'm working on some bit of software that enables a social networking kind of thing--connectivity among people--and there's potential for the exposure of lots of information. Do I then add a particular feature, realizing it may have a coolness factor, when at the same time I may have just found a way for pedophiles to get into this network more easily?

Using your logic, wouldn't it also be fair to say that somebody who was instrumental in designing the cell phone would have faced those same issues because a pedophile can use a cell phone for nefarious purposes?
Booch: The question is whether I, as a technologist, add features that potentially eat away at personal privacy but also enable a law enforcement agency to track this person. Which way do I push this? Because, as a technologist, I have the ability to deliver things to people who don't know how to build that technology. Nonetheless, they are the ones who will make the policy that is shaped by what I create.

Isn't this again an issue for the consumer--whether it be an organization or corporation or some place in the public sector--rather than something you lay at the feet of the people responsible at the creative level? I mean, you work for IBM...
Booch: Correct.

So what you do in front of your keyboard is not inherently good or evil. It's what IBM does with that technology which would presumably have an impact. I recall a book from a few years ago describing IBM's interactions with the Third Reich before the war.
Booch: Right.

So how far can you logically push the argument before it becomes, "well yeah, sure, but..."?
Booch: What I love about this discussion is that we're seeing a dialogue here that's starting to open up in the software field. It's already been there in the worlds of physics, chemistry and biology. The very fact that this dialogue is going on (in the computer business) in some ways is a suggestion to me that our industry is beginning to mature because at least these things are on the table.

Do you really think so? Bill Joy's article on the risks of nanotechnology came out and kicked up a fuss. But the morality question you're raising isn't something that gets the time of day in this industry. Look, I've written several columns chastising the powers that be in Silicon Valley for their policies vis-a-vis China.
Booch: Right.

I'm not a China basher, and I know the realities of doing business. But there's a stone wall of apathy about this issue. Most people in Silicon Valley don't give a damn.
Booch: You see, that's where the individual comes into play...and can make some incredible differences. He or she might find ways to penetrate the barriers that these countries put up. It's a moral decision for me to say, "I'm going to actively do that because I believe in the open and free flow of information despite a particular government's policy." An individual can very much make a difference in this regard. The Web is an incredibly subversive agent, and it's the individuals who are going to make the difference, not the policy makers.

Allow me to play devil's advocate for a moment.
Booch: Please.

If computer scientists dig in their heels at even the possibility that their work might later get used by organizations they find politically objectionable, do you risk being called a Luddite? That is, you're willing to innovate up to this point and no more because, peering over the abyss, you don't like what you think you're seeing.
Booch: Well, now you get to a wonderfully deep philosophical issue. Do I hold back? The difficulty is that science has this really sneaky way of oozing through all the pores...Even though I would personally prefer to make the decision to say, "No, I'm not going to do that," I still have the responsibility to educate those who are in a position in the policy-making realm, so that they understand the implications of what they're doing.

In my lectures, I tend to end with this little-bitty sound bite about how it's an incredible privilege and responsibility to be a software developer. We collectively and literally change the world. I can't think of any other industry that has had such an impact on every other business and on the way we as humans and civilizations connect. What a cool business to be in.