Four tall, gray buildings stand among the homesteads and farms on Route 126 outside Prineville, Oregon.
They house a Facebook data center stuffed with servers. Sitting on racks that resemble rows of grain, the servers store the videos of your Ice Bucket Challenge and the photos of your nephew's graduation.
The facility, however, is an outlier. That's because it's also packed with phones.
The social network has chosen rural Oregon to test roughly 2,000 phones -- almost all of them out-of-date models that many of us would hesitate to use in public, dating back to an Apple handset and a Samsung handset, both from 2011. Why is Facebook testing this tired tech? To rope in its next billion users.
"Guys in Brazil and Southeast Asia, they don't have the cool new phone," said Ken Patchett, Facebook's director of western data center operations.
Facebook has already saturated the richer countries, where consumers have the luxury of lining up for the latest gadgets and, more importantly, have the cash, credit or subsidies to buy them.
Top-shelf phones typically cost about $700, a price millions of people around the world are willing to pay. But for someone who makes about $5,000 a year, like the average Chinese worker, that's too pricey. A Galaxy Nexus, by contrast, can be had for $140 or less.
Facebook therefore needs to make sure its app works on those phones too. That's where the data center, 500 miles from the company's Menlo Park, California, headquarters, comes in. Facebook developers have been instructed to send their latest code there to make sure it works well, no matter which phone it runs on.
Focusing on the future
Facebook already attracts more than a billion users every day, but its focus on the next billion can be traced back to initiatives like Internet.org. Launched three years ago, Internet.org was pitched as an effort to "bring affordable internet access to everyone in the world."
"We believe that every person should have access to free basic internet services -- tools for health, education, jobs and basic communication," Facebook CEO Mark Zuckerberg said on the initiative's first anniversary.
But as phones proliferated across the planet, Facebook realized its app -- one of the most used in the world -- didn't run as well on low-end phones. Sometimes updates to the app would perform even worse on older devices.
So last year, the company began a weekly tradition of slowing internet speeds in its offices to simulate the slower internet connections in emerging markets. "To build for a global audience like ours, we know that we need to design features that work seamlessly even on a 2G network," Facebook said at the time.
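Facebook hasn't said what tooling it uses for those slowdowns, but on a Linux machine a 2G-style throttle can be sketched with the standard tc traffic-control utility. This is an illustrative guess, not Facebook's method; the interface name eth0 and the exact rate, delay and loss figures are assumptions.

```shell
# Hypothetical sketch: throttle a network interface to 2G-like conditions.
# Requires root privileges; 'eth0' is a placeholder for the real interface.
# EDGE-era 2G connections top out around 50 kbit/s with high latency.
sudo tc qdisc add dev eth0 root netem rate 50kbit delay 300ms loss 1%

# Remove the throttle when finished testing.
sudo tc qdisc del dev eth0 root
```

The netem queueing discipline shapes outgoing traffic only, which is usually enough to make an app feel like it's on a congested 2G link.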
It also began testing its service on low-end devices, telling employees to ditch their beloved iPhones for low-end Androids instead. Engineers began testing various phones at their desks. Now there are thousands of phones in Prineville, with more to be added in other data centers.
It's an odd way to use a data center, but Facebook isn't your typical company. Most of the technology in this 110,000-square-foot facility -- called Building 4 -- is designed to store, retrieve and transmit everything from cat videos to 360-degree photos of Wimbledon.
Touring the center
When you type Facebook.com into a computer or tap one of the company's apps on your phone, your device connects to an intricately designed network of hundreds of thousands of servers housed in large facilities like the one Facebook opened to journalists on Tuesday in Prineville.
Think of it like the digital equivalent of a warehouse. The floors are gray, there's a hum of fans blowing and the walls are barren, save for the occasional exit sign or art project from a Prineville school.
Prineville was Facebook's first data center, opened in 2011. The company has since built five more in Sweden, Ireland, Iowa, Texas and North Carolina.
The facility doesn't require that much energy to power your "Happy Birthday!" posts, either. Though Facebook wouldn't say how much power it draws, it keeps about 84 megawatts of backup power on site. By comparison, the entire city of San Francisco peaked at 950 megawatts back in 2010.
The facility also employs only about 165 people, including contractors. That's roughly one person for every 10 million Facebook users logging in each month.
So what happens if Prineville suddenly goes offline? Facebook effectively keeps several copies of all our data in its other centers. And to make sure the failover works, the company sometimes takes one of the data centers offline without warning its staff.
It's hard not to notice the chilly air next to these servers, and that's by design. The air in the data center is kept between about 60 and 80 degrees Fahrenheit, the ideal operating temperature for the servers, which Facebook designed itself, along with the wiring, the cooling and even the bins employees use to move parts around the floor. Facebook chose Prineville in part for its cool, dry air, which is mixed with water and computer exhaust until it reaches the right temperature.
"It's like an old Dodge 318 -- an engine doesn't run well when it's cold," said Patchett. If a computer is too hot or too cold, it also isn't efficient.
Facebook isn't done building these data centers. The Prineville location, for example, has been expanding since it opened, with the latest building set to come online in December. Facebook is also still building its Texas data center, which was announced last year.
Updated July 14, 8 a.m.: To indicate there is roughly one data center staffer for every 10 million Facebook accounts, not every 10,000 as originally written.