Apple’s cloud AI system makes big privacy promises, but can it deliver?

Apple Intelligence, the company’s new AI system, is designed to infuse generative AI into the core of iOS. It offers users a series of new services, including text and image generation, as well as organizational and programming features. But while the system provides impressive new capabilities, it also brings complications. For one thing, it relies on a huge amount of data from iPhone users, which presents potential privacy risks. And the AI system’s substantial appetite for computing power means that Apple will increasingly have to rely on its cloud infrastructure to fulfill user requests.

Historically, Apple has offered iPhone customers unparalleled privacy; it’s a core part of the company’s brand. Part of those privacy guarantees has been the option to choose when mobile data is stored locally and when it is stored in the cloud. While greater reliance on the cloud could raise some privacy alarms, Apple anticipated these concerns and created a new system it calls Private Cloud Compute, or PCC. This is, in effect, a cloud security system designed to keep user data away from prying eyes while that data is being used to fulfill AI-related requests.

On paper, Apple’s new privacy system sounds really impressive. The company claims to have created “the most advanced security architecture ever deployed for cloud AI compute at scale.” But what seems like a huge achievement on paper could ultimately cause broader problems for user privacy down the road. And it’s unclear, at least at this point, whether Apple will be able to deliver on its lofty promises.

How Apple’s Private Cloud Compute is supposed to work

In many ways, cloud systems are simply giant databases. If a bad actor gets into that system, they can look at the data contained within. However, Apple’s Private Cloud Compute (PCC) offers a number of unique safeguards that are designed to prevent that type of access.

Apple says it has implemented its security system at both the software and hardware levels. The company created custom servers to host the new cloud system, and those servers go through a rigorous screening process during manufacturing, which the company says includes taking “high-resolution images of the PCC node components,” to ensure they are secure. The servers are also equipped with physical security mechanisms, such as a tamper-proof seal. iPhone users’ devices can only connect to servers that have been certified as part of the protected system, and those connections are end-to-end encrypted, meaning the data being transmitted is virtually untouchable while in transit.
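Apple’s actual attestation protocol is far more involved (it is rooted in custom silicon and signed release measurements), but the basic device-side idea can be sketched roughly in Swift with CryptoKit: only talk to a server whose software measurement matches a published, trusted release, and encrypt the payload end-to-end before it leaves the phone. The measurement list and function names below are illustrative assumptions, not Apple’s real API.

```swift
import CryptoKit
import Foundation

// Hypothetical allowlist of trusted software measurements (SHA-256 digests
// of published PCC release images). In the real system these would come
// from a signed transparency log, not a hard-coded set.
let trustedMeasurements: Set<String> = [
    "placeholder-digest-for-illustration-only"
]

// Refuse to talk to a server unless the software image it attests to
// matches a published, trusted release.
func serverIsTrusted(attestedImage: Data) -> Bool {
    let digest = SHA256.hash(data: attestedImage)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return trustedMeasurements.contains(hex)
}

// Encrypt the request on-device so it is unreadable in transit; only the
// attested server holding the agreed key can open it.
func sealRequest(_ plaintext: Data, with key: SymmetricKey) throws -> Data {
    try ChaChaPoly.seal(plaintext, using: key).combined
}
```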

Once data reaches Apple’s servers, there are further protections to ensure it remains private. Apple says its cloud takes advantage of stateless computation to create a system in which user data is not retained beyond the point at which it is used to fulfill an AI service request. Therefore, according to Apple, your data will not have a meaningful lifespan on its systems. The data travels from your phone to the cloud, interacts with Apple’s high-octane AI algorithms to fulfill whatever request you’ve sent (“draw me a picture of the Eiffel Tower on Mars”), and then, again according to Apple, the data is deleted.
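To make the “stateless” idea concrete, here is a minimal toy sketch (my own illustration, not Apple’s code): the request data exists only inside the scope of the handling function, and nothing is logged or cached, so it effectively disappears once the response is produced.

```swift
import Foundation

// Toy model of a stateless AI request: the data is used once to produce a
// result and is never written to disk, logged, or stored in a cache.
struct AIRequest {
    let prompt: String
}

struct AIResponse {
    let output: String
}

func handleStatelessly(_ request: AIRequest,
                       generate: (String) -> String) -> AIResponse {
    // Use the request data exactly once...
    let output = generate(request.prompt)
    // ...and return the result. No copy of `request` survives this call.
    return AIResponse(output: output)
}

// Example: a stand-in closure plays the role of the real model.
let response = handleStatelessly(AIRequest(prompt: "draw the Eiffel Tower on Mars")) { prompt in
    "generated image for: \(prompt)"
}
print(response.output)
```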

Apple has instituted a number of other security and privacy protections which you can read about in more detail on the company blog. These defenses, while diverse, all seem designed to do one thing: prevent any breach of the company’s new cloud system.

But is this really legit?

Companies make big cybersecurity promises all the time, and it is usually impossible to verify whether they are telling the truth. FTX, the failed cryptocurrency exchange, once claimed that it kept users’ digital assets on isolated servers; later investigations showed that claim was nonsense. But Apple is different, of course. To demonstrate to outside observers that it is truly securing its cloud, the company says it will publish a “transparency log” that includes full production software images (basically copies of the code used by the system). It plans to publish these logs regularly so outside researchers can verify that the cloud is operating as Apple says it is.
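As a rough illustration of what that verification could look like (the log entry shape here is an assumption, not Apple’s published format), a researcher could hash a released software image themselves and compare it against the digest recorded in the transparency log:

```swift
import CryptoKit
import Foundation

// Hypothetical shape of a transparency-log entry: a release identifier plus
// the SHA-256 digest of the software image the servers claim to run.
struct LogEntry {
    let release: String
    let imageDigestHex: String
}

// An outside researcher hashes the published image independently and checks
// that it matches what the log claims.
func imageMatchesLog(image: Data, entry: LogEntry) -> Bool {
    let hex = SHA256.hash(data: image)
        .map { String(format: "%02x", $0) }
        .joined()
    return hex == entry.imageDigestHex
}
```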

What people say about the PCC

Apple’s new privacy system has notably polarized the technology community. While the considerable effort and unparalleled transparency that characterize the project have impressed many, some are wary of the broader impact it may have on mobile privacy in general. Most notably, Elon Musk immediately began proclaiming that Apple had betrayed its customers.

Simon Willison, a web developer and programmer, told Gizmodo that the “scale of ambition” of the new cloud system impressed him.

“They are tackling multiple extremely difficult problems in the field of privacy engineering, all at once,” he said. “The most impressive part, I think, is the auditability: the part where they will publish images for review in a transparency log that devices can use to ensure that they are only talking to a server running software that has been made public. Apple employs some of the best privacy engineers in the business, but even by its standards, this is a formidable job.”

But not everyone is so enthusiastic. Matthew Green, a cryptography professor at Johns Hopkins University, expressed skepticism about Apple’s new system and the promises that come with it.

“I don’t love it,” Green said with a sigh. “My big concern is that it will centralize a lot more user data in a data center, whereas right now most of it is on people’s actual phones.”

Apple has historically made local data storage a pillar of its mobile design, because cloud systems are known for their privacy shortcomings.

“Cloud servers are not secure, so Apple has always taken this approach,” Green said. “The problem is, with all this AI stuff, the chips inside Apple’s phones aren’t powerful enough to do the things they want them to do. They need to send the data to servers, and they’re trying to build these super-secured servers that no one can hack into.”

He understands why Apple is taking this step, but doesn’t necessarily agree with it since it means greater reliance on the cloud.

Green says Apple has also not made clear whether it will explain to users what data remains local and what data will be shared with the cloud. This means that users may not know what data is exported from their phones. At the same time, Apple has not made clear whether iPhone users will be able to opt out of the new PCC system. If users are forced to share a certain percentage of their data with Apple’s cloud, it may indicate less autonomy for the average user, not more. Gizmodo has reached out to Apple for clarification on both points and we will update this story if the company responds.

For Green, Apple’s new PCC system signals a shift in the phone industry toward a more cloud-dependent stance. This could lead to a less secure privacy environment overall, he says.

“I have mixed feelings about it,” Green said. “I think enough companies are going to deploy very sophisticated AI, to the point where no company will want to be left behind. I think consumers will probably punish companies that don’t have great AI features.”
