For years, Apple has touted the security built into its iPhones and iPads. More than a decade ago, it added ways to encrypt information on the iPhone, and in 2013 it introduced the Touch ID fingerprint sensor to help people unlock their phones. Over the years, it's brought those technologies to the Mac too, and now, with its new M1 chips for the Mac mini, MacBook Air and MacBook Pro, it'll be able to supercharge those efforts.
On its website Thursday, Apple updated its "Platform Security" documents, describing how Mac computers now work much more like their iPhone counterparts. The documents dive into the nitty-gritty details of how the various security systems within its computers and phones talk to one another, and how they're designed to protect an Apple user's privacy.
“Secure software requires a foundation of security built into hardware,” Apple said in its security update, which came in at nearly 200 pages long. “That’s why Apple devices—running iOS, iPadOS, macOS, tvOS, or watchOS—have security capabilities designed into silicon.”
It may seem odd for a company to share so much detail about nearly anything. The tech giant is as much known for its marketing as it is for its devices, and while the company does share some technical details about its products on its website, that information is meant for general audiences.
The Platform Security information, though, is different. Apple said it began publishing this information for business customers more than a decade ago, but the company soon learned that the security researchers it works with relied on the documents too. That's part of why you'll find terms like "kernel integrity protection" and "pointer authentication codes," both of which are part of the company's various security systems.
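Pointer authentication is an Arm CPU feature that Apple silicon uses to detect tampered function pointers before they're followed. The sketch below models the idea in Python only as an illustration: the key, tag width, and helper names are invented here, and the real mechanism runs in hardware via dedicated instructions, not a software library.

```python
import hmac
import hashlib

# Illustrative only: a per-boot secret and a short tag squeezed into
# the unused high bits of a 64-bit pointer, mimicking how pointer
# authentication codes (PACs) work conceptually.
KEY = b"per-boot-secret-key"
TAG_BITS = 16
TAG_SHIFT = 64 - TAG_BITS

def sign_pointer(ptr: int, context: int) -> int:
    """Embed a short MAC of (pointer, context) in the pointer's high bits."""
    msg = ptr.to_bytes(8, "little") + context.to_bytes(8, "little")
    tag = int.from_bytes(hmac.new(KEY, msg, hashlib.sha256).digest()[:2], "little")
    return ptr | (tag << TAG_SHIFT)

def auth_pointer(signed: int, context: int) -> int:
    """Strip the tag and verify it; fail if the pointer or context changed."""
    ptr = signed & ((1 << TAG_SHIFT) - 1)
    if sign_pointer(ptr, context) != signed:
        raise ValueError("pointer authentication failed")
    return ptr

signed = sign_pointer(0x1000, context=42)
assert auth_pointer(signed, context=42) == 0x1000
```

An attacker who overwrites the pointer (or replays it under a different context) produces a tag mismatch, so the forged pointer is rejected before it can be used.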
Apple isn't the only company that works with security researchers, of course. Over the past decade, the tech industry at large has instituted "bug bounty" programs that pay outside researchers to help identify vulnerabilities in their products. Companies including Microsoft, Google and Facebook have paid such bounties, and have publicly thanked some researchers for finding flaws before they became widely exploited. Apple itself pays up to $1.5 million for such bounties.
Apple said part of the way it designs security systems is to encourage people to use them, or to have them running in the background so that people don't need to know how they work or what to do to use them.
For example, iMessage has encryption built in; users don't have to turn it on. Apple also built its Touch ID fingerprint sensor to encourage people to use its encryption systems, which are activated when people set a passcode. Before it built Touch ID, Apple said, fewer than 49% of people used passcodes on their phones. After its introduction, 92% of people did.
“This is important because a strong passcode or password forms the foundation for how a user’s iPhone, iPad, Mac, or Apple Watch cryptographically protects that user’s data,” Apple said in its security document.
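The idea that a passcode "forms the foundation" of data protection can be sketched with a standard key-derivation function. Note this is only an illustration of the general technique: Apple's actual design entangles the passcode with a device-unique key inside the Secure Enclave, and the salt size and iteration count below are arbitrary choices, not Apple's parameters.

```python
import hashlib
import os

def derive_key(passcode: str, salt: bytes, iterations: int = 200_000) -> bytes:
    """Stretch a user passcode into a 256-bit encryption key via PBKDF2.

    Slow, salted derivation makes brute-forcing short passcodes costly,
    which is why a stronger passcode yields stronger protection.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)

salt = os.urandom(16)          # stored alongside the encrypted data
key = derive_key("correct horse battery staple", salt)
assert len(key) == 32          # 256-bit key, ready for a cipher like AES
```

The same passcode and salt always reproduce the same key, so the data can be decrypted on unlock, while anyone without the passcode faces an expensive guessing game.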