It's been a year since Apple fought the FBI over data privacy, and we've barely heard a peep from either side on the issue. So everything's fine, right?
The FBI's attempt to force Apple to unlock an iPhone used by a terrorist set up a grand legal battle between security and privacy. On one side, a massive tech company warning that compliance would usher in a future resembling the one in George Orwell's "1984" (which, coincidentally, became a bestseller again after President Donald Trump's inauguration). On the other, the world's most powerful government dangling the threat of a terrorist attack if it can't get access to vital information.
The stakes were sky-high. Cybersecurity experts said the dispute could have far-reaching implications for everything from how private our personal photos are to how tech companies operate in other countries.
Both were poised to head to court, and then a funny thing happened: The FBI suddenly said it didn't need Apple's help, and the whole affair just faded away.
But that doesn't mean everything is hunky-dory.
Because the battle never went to court, we never got an answer on whether security or privacy takes priority. A year later, the only thing that's clear from the public battle is just how hazy everything still is. And the conflict isn't going away anytime soon, especially if there's another terrorist attack.
"This past year was kind of a missed opportunity to work this thing out," said William Snyder, visiting assistant professor of law at the Syracuse University College of Law. "It hasn't gone away. The question is whether you deal with it now when things are calm or later when the stakes are high."
The FBI referred CNET to comments made by FBI Director James Comey in April, when he talked about how the US has always balanced privacy with public safety and how encryption has upset that balance. "The logic of strong encryption means that all of our lives, including law enforcement's life, will soon be affected by strong encryption," he said. "The notion that privacy should be absolute, or that the government should keep their hands off our phones, to me just makes no sense given our history and our values."
Apple CEO Tim Cook, meanwhile, has continued to champion strong encryption and Apple's efforts to protect customer data. Last week at the University of Glasgow in Scotland, he said that "it wasn't that we were being activists; it's that we were being asked to do something that we knew was wrong. And so we had a choice to just blindly do what the institution said to do, or to fight. And we just fought."
What happened again?
Here's a quick refresher: In early 2016, the FBI wanted Apple to create software to unlock an iPhone 5C used by Syed Farook, who weeks earlier had killed 14 people in a terrorist attack in San Bernardino, California.
Apple helped pull data from Farook's iCloud account, but some data was missing. And the FBI couldn't get into the phone itself because it didn't know the passcode.
On February 16, 2016, US Magistrate Judge Sheri Pym ordered Apple to create that software for the FBI. Apple refused, with Cook arguing that the order went too far and would threaten the security of all iPhone users. Bypassing the iPhone's passcode meant creating a "back door" in its iOS mobile software that could then be used to access every other iPhone, he said.
The two sides battled over the following weeks in legal filings and public comments. The fight ended with a whimper on March 21 -- the day before a slated court hearing -- when the FBI found a third party to unlock the phone. It turned out the government didn't need Apple's help after all.
A separate case in Brooklyn, New York, that involved a confessed drug dealer ended in a similar fashion, with the FBI dropping its request for Apple's help after finding another way into the iPhone.
In both instances, the FBI initially said Apple was the only organization that could get into the iPhones. But both times, the bureau ended up accessing the phones with the help of third parties at the 11th hour. The government didn't specify in either instance who helped it get into the iPhones, but reports later named Israeli security firm Cellebrite as the company that helped the FBI in the San Bernardino case. Earlier this year, Cellebrite itself was hacked, the very sort of risk Apple had warned about.
"I would characterize this as the opening volley in what's going to be a very long-term conversation," said Paul Rosenzweig, a former Department of Homeland Security official and founder of cybersecurity company Redbranch Law and Consulting.
The encryption debate
What the fight came down to was the encryption used on Farook's iPhone 5C. The technology scrambles data and requires a passcode before granting access. Even if investigators copied the phone's storage, the data would remain scrambled. And if investigators entered the wrong passcode 10 times, the iPhone would wipe its data.
Tech firms and privacy advocates argue that encryption is essential to secure personal information and communications. The government and law enforcement officials counter that encryption hurts their ability to investigate criminal and terrorist activity.
Apple's battle with the FBI got the average consumer and Congress thinking about the once wonky topic of encryption. It spurred others to act. Facebook-owned messaging app WhatsApp rolled out end-to-end encryption in early April, which means it doesn't have access to those messages and can't be forced to surrender them to the authorities.
Around the same time Apple was battling the FBI, draft legislation leaked for a possible encryption bill from two US senators, Richard Burr, a Republican from North Carolina, and Dianne Feinstein, a Democrat from California. The bill would have given federal judges the authority to order tech companies like Apple to help law enforcement officials access encrypted data. Tech companies essentially would be legally required to build back doors into their products, the very thing Apple fought against.
"The consensus among security and privacy and legal experts was that was a terrible idea," said Larry Downes, project director for the Center for Business and Public Policy at Georgetown University McDonough School of Business. "It would mean the end of any actual privacy protection."
In late May, that bill -- which was never actually introduced to the Senate -- died. No other encryption bill has been proposed since. As of January, more than half of all internet traffic is encrypted, according to Firefox browser maker Mozilla.
While Apple hasn't been back in court over this issue, others have.
Microsoft and Google faced legal battles over giving law enforcement access to data stored in their cloud services, and law enforcement has asked Amazon to send recordings made by its Echo smart speaker that relate to a murder in Arkansas. Microsoft has prevailed in court with its argument that it shouldn't have to hand over data held in an Irish data center until Ireland gives approval. Google, though, wasn't as lucky. Earlier this month, a US judge ruled that Google has to give the FBI emails stored overseas.
"It's a question about information stored in the cloud that isn't encrypted, but the government wants to get through the Stored Communications Act," said David Opderbeck, a professor of law at Seton Hall University Law School. It's an issue that could pop up more, he said.
Apple, meanwhile, continues to beef up the security of its devices. In August, it introduced its first "bug bounty" program for outside researchers to find vulnerabilities in its software. That's long been a common practice for other tech companies, but Apple previously performed its checks internally. It now offers up to $200,000 for any flaws found and reported to the company.
The Trump card
The wild card in all of this is President Trump. He's expected to take an even tougher stance than President Barack Obama when it comes to expanding law enforcement access.
"I can only expect the new administration ... to go even further than the Obama administration did in terms of data collection and the expectation that even private companies are going to be compelled to share private information with the government," said Charley Moore, CEO of online legal technology company Rocket Lawyer.
The White House didn't respond to a request for comment.
During the campaign, Trump repeatedly bashed Apple for not helping the government hack the iPhone. "Who do they think they are?" he asked at one point. On another occasion, Trump called for a boycott of Apple's products until the company helped unlock the device. That never happened.
"They have to open it up," Trump said in mid-February 2016. "I think security -- overall, we have to open it up and we have to use our heads. We have to use common sense."
Some of Trump's early moves have butted up against the technology industry. His ban on immigrants from seven Muslim-majority nations caused an outcry joined by Apple, Microsoft and dozens of other companies. His pick for the head of the Federal Communications Commission, Ajit Pai, will likely dismantle net neutrality, which most of the tech industry favors.
Trump also withdrew from the Trans-Pacific Partnership, which had a provision ensuring that no company from a member state would have to provide access to its software source code as a condition of entering another member state's market. "That provision would have gone a long way toward avoiding the parade of foreign horribles that Apple raised in its opposition to the FBI warrant," said Joshua Rich, a partner at law firm McDonnell Boehnen Hulbert & Berghoff. "But because the US withdrew, other TPP member states will not be required to extend those protections to US companies."
Trump made Hillary Clinton's use of a private email server while secretary of state a key campaign talking point. And in late January, he was slated to sign an executive order on cybersecurity but canceled those plans without explanation.
With Trump in the White House and Congress controlled by the Republican party, pro-law enforcement legislation is likely. The hope among cybersecurity experts is that any legislation on encryption or privacy comes soon, not in the aftermath of an attack, as was the case with the Patriot Act. That post-Sept. 11 law increased surveillance powers and caused major privacy concerns.
"The pendulum swung wildly to the national security side because we suffered through a catastrophic attack," said Daniel Rosenthal, a former counterterrorism official with the National Security Council who currently works as an associate managing director at Kroll, a corporate investigations and risk consulting firm. "If we suffer through another [attack], there's a good chance the pendulum swings again, away from the privacy and data security side."