
At AT&T, making the right connections

High-ranking exec Hossein Eslambolchi takes stock of the Google factor in Wi-Fi and the nation's "severe" cybersecurity risk.

Charles Cooper, Former Executive Editor / News

In a field as prone to sharp left turns as the technology industry, divining the future is a thankless job.

Most of the time, the boldest predictions wind up on the cutting-room floor. Still, when AT&T CEO Dave Dorman has to decide which technology direction the telecommunications giant should take, one executive whose advice looms largest is Hossein Eslambolchi.

Besides being president of AT&T's global networking technology services unit, Eslambolchi also serves as AT&T's chief technology officer and as its chief information officer. With that job portfolio, he needs to figure out the next big thing long before it becomes conventional wisdom.

One clear concern for Eslambolchi is cybersecurity. In a wide-ranging conversation with CNET News.com, he said that when he looks toward the horizon, he does not like what he sees.

What do you think about the speculation that Google is going to build a Google Net broadband network, and about Google perhaps getting into the wireless business?
Eslambolchi: I think it's a pretty clever approach. There are two elements of any communications that you have to be very cognizant of. The first one is around the network itself, and there have been discussions, I think, with Google in the industry about building their own kind of fiber-optic network infrastructure.

I think having a core network is kind of necessary, but it's not sufficient. Sufficiency comes in when you connect to the edge of the network--where the battleground is going to be in the 21st century, where technology such as Wi-Fi, WiMax and broadband over power lines...would allow customers over that infrastructure to get access through Google in a data stream and a hosting infrastructure.

Is it feasible for a search engine to become an access provider? It's not been done--at least not in my memory.
Eslambolchi: Well, the question is, what is the core competency? I think Google has demonstrated a very elegant approach on the core competency that they have developed in this industry and globally with their brand name on the search engine. And they're doing a great job doing so.

Moving away from that core competency brings a lot more challenges--including security challenges, IP security, cybersecurity...so from an operational perspective, it's a little bit difficult to imagine what it is going to look like and whether they have the core competency to be able to scale an infrastructure that would ultimately require millions of connections.

Why is Google doing something like this now? Why not AT&T?
Eslambolchi: I can't comment on or speculate about what the intention is, but as I said earlier, the battleground is really at the edge of the network. They want to have access; they want to have control of the access point. Access is where you have the highest level of cost, the least amount of competition, and the best ability to differentiate yourself. So bringing technology such as Wi-Fi...would be a pretty nice way of getting access to those customers who want to use data over the Web and do searching and all the other applications over that Web infrastructure.

But could AT&T, if it wanted to, make an attempt to do something of the sort that we think Google is trying to accomplish?
Eslambolchi: For us, building an insecure infrastructure such as Wi-Fi is really not warranted, because our customers will not accept putting their highly sensitive applications over Wi-Fi with all the security issues that exist...There are a lot of issues that have to be worked out, and I'm not convinced that Wi-Fi infrastructure is going to be secure enough, resilient enough and reliable enough to be able to support all of these mission-critical applications for our large enterprise customers.

Recently, I had a chance to speak with someone who used to work inside the federal government's cybersecurity department. It was a job that originally was held by Richard Clarke, and it's been a revolving door ever since. Do you think the government has been serious enough about giving teeth to a job that has become a revolving door?
Eslambolchi: Well, I can talk about security in general, Charlie.

Let's talk first about the government and security. Do you think Uncle Sam is just paying lip service?
Eslambolchi: If you're asking me from my technical view, I think we have a severe security risk in this country, and also we have it globally. We are not taking security seriously in terms of cybersecurity, and that's why we've seen a significant number of attacks in various networks across not only this country, but also globally.

If we look at the percentage of IT investment on security, it's about 3 percent in this country. If you go to other countries, that number moves up to about 18 percent...It's going to really pay a dividend in the wrong way. Security flaws cost the industry about $13.5 billion in 2003; in 2004, it was about $17 billion, and I expect this year the projection could be over $20 billion to $22 billion in damages.

How much more vulnerable could it be if we're talking about another FEMA-like disaster?
Eslambolchi: If we do not make software infrastructure more secure, we're going to end up with a problem of biblical proportions in this country and globally on the Internet within the next five to seven years.

Specifically what are you suggesting?
Eslambolchi: One is that we need to have the software vendors write better software code...When you write software code, you really have to write it with engineers who really have gone through security training.

But Hossein, that's been true for several years. Has there been no progress since then?
Eslambolchi: I think there has been very little progress. There has been some progress in some of the software infrastructure, but the hackers are a very smart group of guys, and if you think you have solved one end of your problem, these hackers will find a way to hack you through other sophisticated tools that they're developing.

It's a never-ending story, and I don't think you will ever get to a point where you can say that all the software infrastructure is globally secure, because the hackers will find a way to find the flaws in the software. This is going to take a decade, if not two decades, from an education perspective to fix that problem.

There is a school of thought that argues that as network services and Web services become more abundant, the focus of software development will move in that direction: more applications will be written to sit above the Internet layer rather than on a stack provided by a proprietary vendor. How do you see this debate resolving itself?
Eslambolchi: It's very clear that the whole proprietary model is history. We really have to move the industry. So in the sense of whether the application sits on top of the Internet--(that) happens today. Look at voice. Voice is just a service or application that sits on top of the IP layer itself and provides the application-layer functionality. I personally believe (that) within the next 10 to 20 years, the current Internet will have run its course, because we are trying to force 21st century-type applications onto 20th century technology.

We have to move from network-based routing, which we do today, into more of what I call application-layer routing, (where) you don't have to break the packets down and you can have any size packet. You just route it based on the application. That is where the next-generation Internet has to get designed.