
Apple’s AI Cloud System Makes Big Privacy Promises, but Can It Keep Them?

Apple’s new Apple Intelligence system is designed to embed generative AI into the core of iOS. It brings many new services, including text and image generation, organization and scheduling features, and more. But while the system offers exciting new features, it also brings complications. First, it relies on vast amounts of data from iPhone users, which could pose privacy risks. Second, it requires a massive increase in computing power, forcing Apple to rely increasingly on cloud systems to meet user demand.

Apple has always offered iPhone users unparalleled privacy; it’s a big part of the company’s brand. Part of those privacy guarantees is the option to store mobile data locally rather than in the cloud. While increased reliance on the cloud may raise privacy alarms, Apple appears to have anticipated these concerns, announcing Private Cloud Compute (PCC), a cloud security system designed to protect users’ data from prying eyes while it’s being used to fulfill AI-related requests.

On paper, Apple’s new privacy system looks very impressive. The company claims to have developed “the most advanced security architecture ever deployed for large-scale cloud AI computing.” But what looks like a big achievement on paper could ultimately raise broader questions about user privacy in the future. And, at least for now, it’s unclear whether Apple can deliver on its lofty promises.

How Apple’s Private Cloud Compute works

In many ways, a cloud system is nothing more than a giant database. If a malicious actor gets into that system, they can see the data contained within. But Apple’s Private Cloud Compute (PCC) puts in place a number of unusual safeguards designed to prevent such access.

Apple says it has implemented security at both the software and hardware levels. The company built custom servers to house the new cloud system, and those servers undergo a rigorous screening process during manufacturing to ensure they are secure; Apple says it takes inventory of and performs high-resolution imaging of the components of PCC nodes. The servers are also equipped with physical security mechanisms such as tamper-evident seals. iPhone users’ devices can connect only to servers certified as part of the secured system, and those connections are end-to-end encrypted, so the data sent is nearly impossible to intercept or tamper with in transit.
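
To make the certified-server idea concrete, here is a minimal sketch, in Swift, of what a client-side check like this might look like: the device refuses to release any data to a node whose reported software measurement isn’t on a published allow-list. Every name and type here is a hypothetical illustration, not Apple’s actual API.

```swift
import Foundation

// Hypothetical sketch: before sending a request, the device verifies that the
// target node's attestation (the hash of the software it reports running)
// matches one of the certified, published measurements. Illustrative only.

struct NodeAttestation {
    let nodeID: String
    let softwareMeasurement: Data  // SHA-256 of the software image the node reports
}

struct AttestationVerifier {
    // Measurements of the software images Apple has published for inspection.
    let certifiedMeasurements: Set<Data>

    func isTrusted(_ attestation: NodeAttestation) -> Bool {
        // Only nodes whose reported image matches a published one are trusted.
        certifiedMeasurements.contains(attestation.softwareMeasurement)
    }
}

enum PCCSketchError: Error {
    case uncertifiedNode
}

func sendRequest(_ encryptedPayload: Data,
                 to attestation: NodeAttestation,
                 using verifier: AttestationVerifier) throws {
    guard verifier.isTrusted(attestation) else {
        // Refuse to release any user data to an unverified node.
        throw PCCSketchError.uncertifiedNode
    }
    // In the real system the payload would be end-to-end encrypted to a key
    // bound to this attested node before it ever leaves the device.
    print("Sending \(encryptedPayload.count) encrypted bytes to node \(attestation.nodeID)")
}
```

The key design choice the sketch captures is that trust is anchored in the measured software image, not merely in the server’s identity.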

Once your data reaches Apple’s servers, further protections kick in. The system uses what Apple calls stateless computation: user data is not retained beyond the point at which it is used to fulfill an AI service request. According to Apple, your data won’t be stored in the system for very long. It travels from your phone to the cloud, where it interacts with Apple’s AI models to fulfill whatever question or request you submit (such as “Draw me a picture of the Eiffel Tower on Mars”), after which, again according to Apple, the data is deleted.
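
The following is a minimal Swift sketch of what stateless handling means in practice: the user’s prompt lives only in memory for the duration of one request, and nothing is written to durable storage. This is an illustration of the concept under that assumption, not Apple’s server code.

```swift
import Foundation

// Hypothetical sketch of stateless request handling: the prompt exists only in
// memory for the lifetime of a single request and is never written to durable
// storage. Illustrative only; not Apple's actual implementation.

struct AIRequest {
    let prompt: String  // e.g. "Draw me a picture of the Eiffel Tower on Mars"
}

struct AIResponse {
    let result: Data
}

func handle(_ request: AIRequest, model: (String) -> Data) -> AIResponse {
    // The request is processed entirely within this function's scope.
    let output = model(request.prompt)

    // Deliberately absent: no logging of the prompt, no database insert,
    // no cache write. When this function returns, the request data becomes
    // unreachable and is eligible for deallocation.
    return AIResponse(result: output)
}
```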

Apple also has many other security and privacy measures in place, which you can read more about on the company’s blog. While these defenses vary widely, they all appear to be designed with one goal in mind: preventing intrusions into the company’s new cloud systems.

But is it legit?

Companies make big promises about cybersecurity all the time, and it’s usually impossible to verify whether they’re true. The collapsed cryptocurrency exchange FTX once claimed to store users’ digital assets in isolated servers; a subsequent investigation revealed that claim was complete nonsense. Apple, of course, wants to prove it’s different. In an effort to show outside observers that it is truly securing its cloud, the company announced it would launch what it calls a “transparency log,” which will include full production software images (essentially copies of the code used in the system). It plans to publish these logs periodically so that outside researchers can verify the cloud is operating as Apple describes it.
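
As a rough illustration of how such verification could work, here is a hedged Swift sketch: a researcher hashes a published software image and checks both that the hash appears in the transparency log and that a running node attests to exactly that image. The types and the verification flow are assumptions for illustration, not Apple’s published tooling.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of transparency-log verification: hash the software
// image Apple published and check (1) that the hash appears in the log and
// (2) that a running node attests to exactly that image. Names are assumed.

struct TransparencyLogEntry {
    let releaseTag: String
    let imageHash: Data  // SHA-256 of a published production software image
}

func verifyNode(reportedMeasurement: Data,
                publishedImage: Data,
                log: [TransparencyLogEntry]) -> Bool {
    // 1. The published image must actually appear in the transparency log.
    let computedHash = Data(SHA256.hash(data: publishedImage))
    guard log.contains(where: { $0.imageHash == computedHash }) else {
        return false
    }
    // 2. The node must report running exactly that image.
    return reportedMeasurement == computedHash
}
```

If either check fails, the node cannot claim to be running the audited software, which is the property outside researchers would be looking for.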

What people are saying about PCC

Apple’s new privacy system has divided the tech community. While many have been impressed by the massive effort and unusual transparency behind the project, others are concerned about the broader impact it could have on mobile privacy in general; most notably and vocally, Elon Musk immediately began proclaiming that Apple had failed its customers.

Simon Willison, a web developer and programmer, told Gizmodo that he was impressed by the “ambition” of the new cloud system.

“They’re tackling several incredibly difficult problems in privacy engineering at the same time,” he said. “I think the most impressive part is the auditability: being able to publish images for review in the transparency log, and to verify that devices will communicate only with servers running software that’s publicly available. Apple employs some of the best privacy engineers in the industry, and even by their standards, this is a tall order.”

But not everyone is enthusiastic. Matthew Green, a cryptography professor at Johns Hopkins University, expressed skepticism about Apple’s new system and the promise it brings.

“I don’t like it,” Green said with a sigh. “My biggest concern is that this system will centralize even more user data in data centers, when currently most of it is stored on people’s actual phones.”

Until now, Apple has made local data storage central to its mobile design, as cloud systems are notoriously lacking in privacy.

“Apple has always taken this approach because cloud servers are not secure,” Green said. “The problem is that with all the AI work they’re doing right now, the chips built into Apple’s phones aren’t powerful enough to do the things they want them to do, so they have to send the data to their servers, and they’re trying to build super-secure servers that no one can hack.”

He understands why Apple is making this move, but doesn’t necessarily agree with it because it means greater reliance on the cloud.

According to Green, Apple has also not said whether it will explain to users which data will remain local and which will be shared with the cloud, meaning users may not know what data will be exported from their phones. At the same time, Apple has not said whether iPhone users can opt out of the new PCC system. Forcing users to share a certain percentage of their data with Apple’s cloud could reduce, rather than increase, the average user’s autonomy. Gizmodo has reached out to Apple on these two points and will update this article if the company responds.

To Green, Apple’s new PCC system signals a shift in the mobile phone industry toward a reliance on the cloud, which he says could lead to a less secure privacy environment overall.

“I have very mixed feelings,” Green said. “I think most companies are going to adopt very sophisticated AI, [to the point that] no company is going to want to be left behind. I think companies that don’t have good AI capabilities will probably be severely punished by consumers.”
