In brief: Apple is notorious for its walled-garden approach and renowned for promoting security and privacy as headline features of its products. However, security researchers believe this also means that hackers who do manage to breach the wall tend to go undetected far more often than you'd expect.
For years, Apple has touted the privacy and security of its devices, and its marketing has insisted that it values those two qualities more than other tech companies do. Lately, that stance has attracted legal fights with companies like Epic, which are interested in breaking down the walled garden Apple has built around its ecosystem and bringing it in line with what the rest of the industry does.
However, the Cupertino giant may have inadvertently created a bigger problem than the one it set out to solve. By building a digital fortress around its products and services, Apple has given some of the world's top hackers one of the best places to hide. It may be harder to break into an iPhone, but once a bad actor is in, it's also easier for them to conceal their activity for a long time.
A report from the MIT Technology Review takes a deep dive into Apple’s intense drive to bolster product security while touching on the unintended consequences of that approach. The analysis cites Citizen Lab’s senior cybersecurity researcher Bill Marczak, who explains that top-tier hackers have the resources and motivation to develop zero-click exploits that allow them to run their malicious code while users are none the wiser.
It’s not just malicious actors that do this. Companies like Israel-based NSO Group have been at it for years, and while they promise to only provide their tools to legitimate organizations such as law enforcement, there’s always a risk they could be misused. Additionally, companies like Facebook have tried to purchase NSO’s spyware tools specifically to gain the ability to monitor iPhone and iPad users.
Marczak was one of the first to raise awareness of NSO's existence, and he notes that when investigating an Al Jazeera journalist's iPhone last year, he initially found no evidence of hacking on it. As the investigation dragged on, the Citizen Lab team discovered the phone was pinging servers belonging to NSO. Then Apple released iOS 14, which broke the researchers' “jailbreaking” tool and cut off access to certain folders that hackers tend to use to hide their malicious code, making it even harder to dig deeper.
Modern computers have been moving in a similar direction to Apple's lockdown philosophy, albeit with limited success. In the case of Macs, we've already seen the introduction of T-series security chips (whose functions are now integrated into the M1 SoC in Apple Silicon Macs) that handle encrypted storage and secure boot, perform image signal processing and biometric authentication, and can even physically disconnect the microphone to prevent snooping.
Even that implementation isn't perfect: in theory, a skilled hacker could bake a keylogger into the chip and steal credentials while remaining virtually impossible to detect. On the software side, Apple's approach is a similar double-edged sword. On the one hand, any software that runs on a Mac has to pass a notarization check. On the other hand, that check can fail spectacularly, as it did when Apple's servers were overwhelmed by too many people updating to the latest version of macOS at the same time, temporarily leaving users unable to launch apps.
Security researchers are also somewhat hamstrung on the Mac, because Apple doesn't give analysis tools the kind of deep access needed to look for evidence of hacks: they aren't allowed to peek at the memory of other processes. That means one app cannot inspect another app's private space, which is great for protecting end users but a significant limitation for security research. Other companies, like Google, are going down a similar path. Chromebooks, for instance, are locked down so that by default you can't run much of anything outside the browser.
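To give a rough sense of the wall researchers run into, here is a minimal, hypothetical C sketch (not from the report) that asks macOS for another process's Mach task port via task_for_pid, the usual first step toward reading that process's memory. On a stock system the call is denied unless the caller is root with the right entitlements, or System Integrity Protection has been relaxed, which is precisely why third-party analysis tools can't simply scan other apps for implants.

```c
/* Hypothetical sketch: attempt to obtain another process's task port on macOS.
 * On a stock system this is denied for other processes unless the caller is
 * root with the proper entitlements or SIP has been relaxed, which is exactly
 * the barrier security tooling keeps hitting. */
#include <mach/mach.h>
#include <mach/mach_traps.h>
#include <mach/mach_error.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char *argv[]) {
    if (argc < 2) {
        fprintf(stderr, "usage: %s <pid>\n", argv[0]);
        return 1;
    }

    int pid = atoi(argv[1]);
    mach_port_t task = MACH_PORT_NULL;

    /* Ask the kernel for the target's task port, the handle needed to
     * inspect its memory with calls such as mach_vm_read(). */
    kern_return_t kr = task_for_pid(mach_task_self(), pid, &task);
    if (kr != KERN_SUCCESS) {
        /* Expected outcome on a locked-down machine. */
        fprintf(stderr, "task_for_pid(%d) denied: %s (0x%x)\n",
                pid, mach_error_string(kr), kr);
        return 1;
    }

    printf("obtained task port %u for pid %d\n", (unsigned)task, pid);
    mach_port_deallocate(mach_task_self(), task);
    return 0;
}
```

Run against almost any process on an up-to-date Mac, the request fails at the first step, which is the intended behavior for end users and the frustration Marczak and others describe for investigators.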
Apple believes this approach to security is the right one, and that the tradeoffs are a small price to pay for making life very difficult for malicious actors looking to get at the sensitive data on your devices. Security researchers tend to agree, but they also worry that as more people gravitate toward mobile devices built around the walled-garden paradigm, it will become harder to assess whether a device has been compromised. They fear that, more often than not, malicious actors will get away without leaving a trace.
Image credit: Africa Studio