This story is part of CNET's complete coverage from and about Apple's annual developers conference.
Apple announced a new Safety Check feature to help potential victims of abusive relationships.
Why it matters
This is the latest example of the tech industry taking on tough personal technology issues that don't have clear or easy answers.
Apple is communicating with victim-survivor advocacy organizations to identify other features that can help people in crisis.
Among the long-requested and popular new features Apple plans to bring to the iPhone this fall is one that isn't just a convenience -- using it could mean life or death.
On Monday, Apple announced Safety Check, a setting designed to aid domestic violence victims. The feature, coming this fall with iOS 16, is meant to help someone quickly cut ties with a potential abuser. Safety Check does this by letting a person quickly see with whom they're automatically sharing sensitive info like their location or photos. In an emergency, it also lets them simply and quickly disable access and information sharing from every device other than the one in their hands.
Notably, the feature also includes a prominent button at the top right of the screen, labeled Quick Exit. As the name implies, it's designed to help a potential victim quickly hide that they'd been looking at Safety Check, in case their abuser doesn't allow them privacy. If the abuser reopens the Settings app, where Safety Check is kept, it'll start at the default general settings page, effectively covering up the victim's tracks.
"Many people share passwords and access to their devices with a partner," Katie Skinner, a privacy engineering manager at Apple, said at the company's WWDC event Monday. "However, in abusive relationships, this can threaten personal safety and make it harder for victims to get help."
Safety Check, and the careful way in which it was designed, is part of a larger effort among tech companies to stop their products from being used as tools of abuse. It's also the latest sign of Apple's willingness to wade into building technology to tackle sensitive topics. And though the company says it's earnest in its approach, it's drawn criticism for some of its moves. Last year, the company announced efforts to detect child exploitation imagery on some of its phones, tablets and computers, a move that worried critics.
Still, victim advocates say Apple's one of the few large companies publicly working on these issues. While many tech giants including Microsoft, Facebook, Twitter and Google have built and implemented systems to detect abusive content and behavior on their respective sites, they've struggled to build tools that stop abuse as it's happening.
Unfortunately, the abuse has gotten worse. A survey of domestic violence practitioners conducted in November 2020 found that 99.3% had clients who had experienced "technology-facilitated stalking and abuse," according to the Women's Services Network, which worked on the report with Curtin University in Australia. Moreover, the organizations learned that reports of victims being tracked had jumped more than 244% since they last conducted the survey in 2015.
Amid all this, tech companies like Apple have increasingly worked with victim organizations to understand how their tools can be both misused by a perpetrator and helpful to a potential victim. The result is features, like Safety Check's Quick Exit button, that advocates say are a sign Apple's building these tools in what they call a "trauma-informed" way.
"Most people cannot appreciate the sense of urgency" many victims have, said Renee Williams, executive director of the National Center for Victims of Crime. "Apple's been very receptive."
Some of the tech industry's biggest wins have come from identifying abusers. In 2009, Microsoft helped create image recognition software called PhotoDNA, which is now used by social networks and websites around the world to identify child abuse imagery when it's uploaded to the internet. Similar programs have since been built to help identify other material that large tech companies try to keep off their platforms.
As tech has become more pervasive in our lives, these efforts have taken on increased importance. And unlike adding a new video technology or increasing a computer's performance, these social issues don't always have clear answers.
In 2021, Apple made one of its first public moves into victim-focused technology when it announced new features for its iMessage service designed to analyze messages sent to users marked as children and flag sexually explicit images. If its system suspected an image, it would blur the attachment and warn the person receiving it to make sure they'd wanted to see it. Apple's service would also point children to resources that could help them if they're being victimized through the service.
At the time, Apple said it built the message-scanning technology with privacy in mind. But activists worried Apple's system was also designed to alert an identified parent if their child chose to view the suspected attached image anyway. That, some critics said, could incite abuse from a potentially dangerous parent.
Apple's additional efforts to detect potential child abuse images that might be synchronized to its photo service through iPhones, iPads and Mac computers were criticized by security experts who worried the scanning system could be expanded or misused for surveillance.
Still, victim advocates acknowledged that Apple was one of the few device companies working on tools meant to support victims of potential abuse as it's happening. Microsoft and Google didn't respond to requests for comment about whether they plan to introduce features akin to Safety Check to help victims who might be using Windows and Xbox software for PCs and video game consoles, or Android mobile software for phones and tablets.
Learning, but much to do
The tech industry has been working with victims' organizations for over a decade, seeking ways to adopt safety mindsets within their products. Advocates say that in the past few years in particular, many teams focused on these issues have formed within the tech giants, staffed in some cases with people from the nonprofit world who worked on the issues the tech industry was taking on.
Apple started consulting with some victims rights advocates about Safety Check last year, asking for input and ideas for how to best build the system.
"We are starting to see recognition that there is a corporate or social responsibility to ensure your apps can't be too simply misused," said Karen Bentley, CEO of Wesnet. And she said that's particularly tough because, as technology has evolved to become easier to use, so has the potential for it to be a tool of abuse.
That's part of why she says Apple's Safety Check is "brilliant," because it can quickly and easily separate someone's digital information and communications from their abuser. "If you're experiencing domestic violence you're likely to be experiencing some of that violence in technology," she said.
Though Safety Check has moved from an idea into test software and will be made widely available with the iOS 16 suite of software updates for iPhones and iPads in the fall, Apple said it plans more work on these issues.
Unfortunately, Safety Check doesn't extend to ways abusers might be tracking people using devices they don't own -- such as if someone slips one of Apple's $29 AirTag trackers into their coat pocket or onto their car to stalk them. Safety Check also isn't designed for phones set up under child accounts for people under the age of 13, though the feature's still in testing and could change.
"Unfortunately, abusers are persistent and are constantly updating their tactics," said Erica Olsen, project director for Safety Net, a program from the National Network to End Domestic Violence that trains companies, community groups and governments on how to improve victim safety and privacy. "There will always be more to do in this space."
Apple said it's expanding training for its employees who interact with customers, including salespeople in its stores, so they know how features like Safety Check work and can explain them when appropriate. The company has also created guidelines for its support staff to help identify and assist potential victims.
In one instance, for example, AppleCare teams are being taught to listen for when an iPhone owner calls expressing concern that they don't have control over their own device or their own iCloud account. In another, AppleCare can guide someone on how to remove their Apple ID from a family group.
Apple also updated its personal safety user guide in January to instruct people how to reset and regain control of an iCloud account that might be compromised or being used as a tool for abuse.
Craig Federighi, Apple's head of software engineering, said the company will continue expanding its personal safety features as part of its larger commitment to its customers. "Protecting you and your privacy is, and will always be, the center of what we do," he said.