You've got the usual things to worry about. You need to set up subsidiaries or joint ventures, hire local management, figure out how to scale your infrastructure, think through tax implications, and so forth.
There's also a more unusual challenge that you may not be able to ignore. If you are in the business of Internet or telecommunications services, you are going to be asked to censor the information you provide to people. And you'll probably be asked to turn over information about your users to the local police, rarely with anything approaching what we'd call "due process" in the United States.
Some such requirements are typical when operating almost anywhere: even liberal democracies identify information to be removed, such as material that infringes copyright or meets some test of obscenity. They require businesses to help identify users at times, and some impose blanket data-retention requirements for these purposes.
But in more authoritarian states, the practices have extra bite. The information the government wants removed can relate to civic dialogue and political freedom. The people it seeks to identify might be political dissidents or religious practitioners. Often, the requirements to redact or block will be stated or implied only generally, without specific requests for individual cases. That means your company will have to be prepared to censor proactively, trying to divine what the regulators have in mind--and to act without explicit orders to do so.
Over the past five years, there has been a marked shift, as our research through the OpenNet Initiative (a consortium of four university teams: the University of Cambridge, Harvard Law School's Berkman Center for Internet & Society, the Oxford Internet Institute, and the Citizen Lab of the University of Toronto) has made plain. In 2002, only a small handful of countries censored the Internet; in 2007, the number is more than two dozen. These countries rely on private enterprise for their control, and some of America's most prominent Internet companies have found trouble trying to follow local law in parts of the Middle East, the former Soviet Union, and East Asia--against a backdrop of international criticism.
Yahoo has been criticized in the press--and subjected to a human rights lawsuit--for turning over information to Chinese authorities about a journalist that allegedly led to his arrest and imprisonment. The problem? The jailing was for no crime that a court in Yahoo's home jurisdiction of California could recognize. Human rights activists won't let the world forget Yahoo's role.
Cisco has been criticized for selling the routers and switches that make censorship and surveillance possible. Microsoft has taken heat for offering a blog service that generates an error rejecting "profanity" when a user includes the word "democracy" in the title of a blog. Google has come under fire for offering a censored search service in China, Google.cn, that omits results compared with what you'd find if you searched from the United States or Western Europe.
Side-by-side searches for "Tiananmen Square" in Google.com and Google.cn show the difference starkly. Anyone who can see both sets of images, the latter lacking any shots of a person staring down a tank in 1989, is forced to consider what it would be like to live behind such a filter.
All of this reflects two opposing pressures on nearly every corporation whose business involves information technologies. While liberal democracies have so far remained remarkably hands-off as the Internet has matured, the desire of more closed regimes to tap the Internet's economic potential while keeping control of the information space pressures companies to limit the freedoms they can offer many users, with rules that can be contradictory from one jurisdiction to another. And as companies accede, a second pressure arises from the perceived betrayal of the values of the company's owners, customers or watchdogs.
What's a corporation to do?
The thorny ethical problem arises when the corporation is asked to do something squarely at odds with the law, norms or ethics of the corporation's home country. Should a search engine agree to censor its search results as a condition of doing business in a new place? Should an e-mail service provider turn over the name of one of its subscribers to the government of a foreign state without knowing what the person is said to have done wrong? Should a blog service provider code its application so as to disallow someone from typing a banned term into a subject line?
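The last scenario--a blog service coded to disallow banned terms in a subject line--can be made concrete with a purely illustrative sketch. The banned-term list, function name and rejection behavior below are hypothetical, not any real provider's implementation:

```python
# Illustrative only: a naive server-side filter of the kind a blog
# platform might be compelled to deploy. The term list and function
# name are hypothetical examples, not a real provider's code.
BANNED_TERMS = {"democracy"}  # hypothetical banned word

def title_allowed(title: str) -> bool:
    """Return True if the post title passes the filter, False if rejected."""
    words = title.lower().split()
    return not any(term in words for term in BANNED_TERMS)

print(title_allowed("My vacation photos"))    # allowed
print(title_allowed("Thoughts on democracy")) # rejected
```

The point of the sketch is how crude such filters are: a single keyword match blocks legitimate speech wholesale, with no notice of why and no avenue of appeal.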
Reasonable people disagree as to the best means of resolving these emerging ethical concerns. One might contend that there is no ethical problem here--or, at least, that the ethical problem is nothing new. If an Internet censorship and surveillance regime is entirely legitimate from the perspective of international law and norms, the argument goes, then a private party required to participate in that regime has a fairly easy choice.
If a company disagrees with what it is being asked to do, then it should simply exercise its business judgment and refuse to compete in those markets. Alternatively, a company could decide to refuse to comply with the demands that it believes put the firm in a position in which its values are compromised--and then accept the consequences, including possibly being forced to leave the market.
But this is not an ideal state of affairs. It means that ethically sensitive companies may squander the chance to engage with these countries and may yield development opportunities to more mercenary firms. And those companies may prove more willing to carry out repressive mandates. Moreover, leaving the market to firms with a tin ear for ethical implications will ratchet up calls for new legal barriers to international commerce.
In the United States, Rep. Christopher Smith (R-N.J.) has proposed the Global Online Freedom Act, which would sharply restrict the business that technology firms can conduct overseas--so much so that opening your new business line in China would probably be a nonstarter. Such legislation could help advance the cause of human rights, and it would have the benefit of applying equally to all companies under its jurisdiction. But it carries costs of its own. The threat of legislation may be more effective in improving behavior than actually passing the law.
The most efficient and thorough way to address this conundrum is for corporations themselves to take the lead. Technology companies, acting as an industry, are best placed to work together to address some of the ethical issues by adopting a code of conduct to govern their activities under authoritarian regimes. This approach could, at a minimum, clarify to citizens within those regimes what they need to know about what companies will and will not do in response to demands from the state.
Google has refused to bring certain services into places like China, so as to avoid having to turn over personal information. Yahoo's chief executive, Jerry Yang, made a bold statement denouncing online censorship and surveillance, no doubt provoking the ire of the Chinese authorities--while not satisfying the demands of human rights critics.
At the same time, Yahoo has established a senior, cross-functional team of Yahoo executives worldwide to coordinate its efforts to address privacy and freedom-of-expression issues moving forward.
The more promising route is for one or more groups of industry members to come up with a common, voluntary code of conduct to guide the activities of individual firms in regimes that carry out online censorship and surveillance. Such a process has begun. Google, Microsoft, Vodafone, Yahoo and TeliaSonera are actively working together on a code. This process includes nongovernment organizations (NGOs)--including Business for Social Responsibility and the Center for Democracy and Technology, which chair the group--and academics, including teams from the Berkman Center for Internet & Society at Harvard Law School, the Oxford Internet Institute, and the University of St. Gallen in Switzerland.
Regulators with relevant expertise and authority have also weighed in on the process, as have investors and leading human rights groups. Just as noteworthy are those who are not yet involved in this process, especially those firms that sell relevant services and products directly to governments, such as Cisco, WebSense and Secure Computing.
The code that this group develops will most likely set out broad, common principles. These principles ought to contain enough detail to inform users about what to expect and to hold the companies to a meaningful standard, but without being so prescriptive as to make the code impossible to implement from company to company and from country to country--especially in a fast-changing technological environment. This ever-changing context means that the code must continually evolve, taking on new challenges to speech and privacy, and ensuring that companies' responses are both dynamic and treated as internally driven organizational priorities. The code should also provide a road map for when a company might refuse to engage in regimes that put it in a position where it cannot comply with both the code and with local laws.
Developing (meaningful) codes of conduct
If the industry itself does not succeed through such an approach, the likelihood increases that an outside group will come up with a set of principles that will gain traction. This approach might place more pressure on companies to act. The Paris-based Reporters Without Borders has drafted such a set of principles, as has a group of academics based at the University of California at Berkeley's School of Law, Boalt Hall--while also participating in the company-centered process. An outsider's code might be something to which companies could be encouraged to subscribe, on the model of the Sullivan Principles in Apartheid-era South Africa, with a governing institution to support the principles and the companies that subscribe to them.
The development of a code of conduct itself solves only a small part of the problem; it is in the successful application of the code that a long-term solution lies. In the context of other instances of corporate codes of ethics implicating human rights, such as the sweatshop issue, those involved say that getting to the code was the easy part.
A critical part of such a voluntary process to establish a code, regardless of its substantive terms and who drafted it, is to develop an institution charged with monitoring (and ideally supporting through best practices) adherence to the code and pointing out shortcomings. One might imagine an institution--perhaps not a new institution, but a pre-existing entity charged with this duty--that might include among its participants representatives of NGOs or other stakeholders without a direct financial stake in the outcome of the proceedings.
The best way to make this approach sustainable would be for the industry consensus to be given the status of law over time. This process would help to address three of the primary shortcomings of the industry self-regulation model. First, self-regulation can amount to the fox guarding the chicken coop. Second, self-regulation permits some actors to opt out of the system and to gain an unfair competitive advantage as a result. Last, the self-regulatory system could collapse or be amended, for the worse, at any time--and may or may not persist in an optimal form, even if such an optimal form could be reached initially.
An industry-led approach would also bring with it the benefit of improved clarity for end users. If the code is well-drafted and well-implemented, users of Internet-based services would know what to expect in terms of what their service provider would do when faced with a censorship or surveillance demand.
The benefit of such an approach could well extend further. By working together on a common code and harnessing the support of their home countries, the NGO community, investors, academics and others, the affected industry might well be able to present a united front that would enable individual companies to resist excessive demands from regimes without having to leave the market as a result of noncompliance.
Industry need not--and ought not--go it alone.