If you've been reading the headlines about Apple's fight with the FBI, you know it's easy to assume we're all doomed.
Either law enforcement will lose the ability to thwart terrorist plots, or we'll be forced to live in a police state. Neither of those outcomes is exactly what you'd call appealing.
But Apple and the US government say those are exactly the things at stake in a court battle playing out in California. The two sides will meet before a magistrate judge on Tuesday in Riverside to make their arguments on whether Apple should build a new version of its mobile software so the FBI can hack into an iPhone 5C used by one of the San Bernardino shooters.
Apple argues it shouldn't be forced to make its phones less secure. The company, which has the backing of Silicon Valley notables like Google and Facebook, argues that creating software to break into one phone puts all other iPhone users at risk if the technology falls into the wrong hands. The FBI calls its request "modest," says it can't get into the device without Apple's help, and notes that information on the iPhone 5C could reveal more about the terrorists' activities.
Tuesday will be the first chance for both parties to make arguments before Magistrate Judge Sheri Pym. Judge Pym won't make a ruling immediately, and her decision faces appeal, possibly all the way to the Supreme Court, so the case could drag on for years.
"Because technology is moving at warp speed, we don't have two to three years to wait for a solution here in this particular case or in the boatload of cases after it," said Ed McAndrew, a former federal cybercrimes prosecutor and now a lawyer at Ballard Spahr.
While we wait it out, let's look at those worst-case scenarios from each side.
In the eyes of the FBI, this is about keeping Americans safe. If law enforcement can't get access to data on iPhones, criminals can "go dark," the government says. FBI Director James Comey warned a congressional committee earlier this month that offering a place no authorities could touch would create a haven for terrorists and criminals.
"Before these devices came around, there was no closet, basement or drawer in America that could not be entered with a judge's order," he said. Privacy is important, Comey said, but so is stopping murder, violence and pedophilia.
A loss in this case could also hurt the FBI's ability to get info from other tech companies, like Facebook.
Apple counters the government's warning by saying the FBI shouldn't fixate on the data it can't access but should instead recognize that a "mountain" of information is now available because of technology.
"Going dark -- this is a crock," Apple CEO Tim Cook said during an interview with Time. "No one's going dark."
And experts say law enforcement has to find a way to fight crime in a world with strong encryption.
"The cost of maintaining a free society is that sometimes criminals won't be caught," said John Hasnas, a professor of ethics at Georgetown's McDonough School of Business. "Sometimes there are bad things we can't prevent."
Apple says the government is asking for a back door into all iPhones. If the FBI is able to get access to one phone, it'll ask for access to more, the company said. There's also no way to guarantee that the loophole won't fall into the hands of criminals. It would become a top prize for hackers, and Apple undoubtedly would face attacks.
Apple also fears the government's demands won't stop with unlocking iPhones. Next, law enforcement could ask for access to an iPhone's camera and microphone to keep tabs on you, Eddy Cue, Apple's head of Internet software and services, said during a recent Univision interview.
"Where does this stop?" Cue said. "In a divorce case? In an immigration case? In a tax case with the IRS? Someday, someone will be able to turn on a phone's microphone. This should not happen in this country."
Then there are the international implications. No foreign government has a back door into Apple's products, but if the US government is successful, you can bet other countries, such as China, will come knocking too.
At stake are the "very freedoms and liberty our government is meant to protect," Cook said.
The FBI counters by saying that Apple has helped it gain access to devices in the past without causing a loss of privacy and freedoms. Stacey Perino, an FBI electronics engineer, argued in a declaration that even if Apple didn't destroy the new software and criminals got access to it, they couldn't use it to hack all iPhones. That's because the code would run on an iPhone only if it carried Apple's unique digital signature, Perino said.
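Perino's argument rests on code signing: an iPhone refuses to run software unless its signature checks out against Apple's key, and only Apple can produce that signature. A minimal sketch of the verify-before-run idea, with a hypothetical key name and HMAC standing in for Apple's real scheme (actual iOS signing uses asymmetric cryptography and per-device personalization, so devices never hold the signing secret at all):

```python
import hmac
import hashlib

# Hypothetical stand-in for Apple's signing key. In the real system the
# key is asymmetric: Apple keeps the private half, devices hold only the
# public half, so possessing a device reveals nothing that forges signatures.
APPLE_SIGNING_KEY = b"hypothetical-apple-signing-key"

def sign_firmware(firmware: bytes) -> bytes:
    """Produce a signature over a firmware image (only the key holder can)."""
    return hmac.new(APPLE_SIGNING_KEY, firmware, hashlib.sha256).digest()

def device_will_run(firmware: bytes, signature: bytes) -> bool:
    """A device boots an image only if the signature verifies against it."""
    expected = hmac.new(APPLE_SIGNING_KEY, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

official = b"ios-firmware-image"
sig = sign_firmware(official)

print(device_will_run(official, sig))                    # True: valid Apple signature
print(device_will_run(b"tampered-firmware-image", sig))  # False: signature doesn't match
```

On this account, even a leaked copy of the unlocking software would be inert elsewhere: without Apple's signing key, no one can produce a signature that other iPhones will accept.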
Finding middle ground?
Even the American public is split on the issue. According to a poll by CNET sister publication CBS News and The New York Times, half of Americans believe Apple should unlock the phone, while 45 percent think it shouldn't. More than two-thirds of Americans think unlocking the phone will make it at least somewhat likely that other iPhones are more vulnerable to hackers, and 58 percent of Americans remain concerned about losing some privacy in the fight against terrorism.
Some are hoping to find common ground.
Apple has recommended that a commission set the parameters for tech's interactions with the government. Two US lawmakers, Rep. Michael McCaul, a Republican from Texas, and Sen. Mark Warner, a Democrat from Virginia, agree and say they want Congress to form a commission charged with addressing issues on digital security that have put authorities and private companies at odds.
"Both the FBI and Apple are taking absolutist positions and in many ways are talking past each other," Warner said in an interview on Wednesday. "I do believe there are technology solutions that can protect encryption and not lead to back doors."
Warner hopes to have a commission approved within a couple of months and to see the group issue a full report in about a year.
Even if the US comes to some agreement, it won't matter much if other countries have conflicting policies.
In the end, all it could take is one more attack for everything to change.
"We know there will be another catastrophic attack on the homeland," said a former counterterrorism official at the White House who didn't want to be named. "When that happens, the privacy-security pendulum will swing wildly back toward the national security side. The public will forget about encryption and will be very willing to give up some of their privacy for enhanced security."