This is how AI will be integrated into the iPhone, protecting your data


Apple makes several announcements a month regarding its development of artificial intelligence. Days before presenting its new iPads, the company has released a new set of natural-language models. Given its characteristics, this release reinforces the idea, recently reported, that Apple's AI will run entirely on the device.

To prioritize speed and the security of personal data, Apple is committed to ensuring that its AI does not leave the device and does not run in remote data centers. But this requires adapting the technology to the capabilities of small products such as an iPhone.

This Wednesday, April 24, Apple released OpenELM, a family of four small language models. It has published them on the open-source Hugging Face platform, and their characteristics make them well suited to running on a handheld device.

Four new models

Each model in the series has a different size: 270 million, 450 million, 1.1 billion and 3.0 billion parameters, the last being the largest of the four. The parameter count refers to the number of variables the model draws on when making decisions, learned from the data set on which it was trained.

The larger the model, the more demanding it is to run, so the biggest ones cannot be processed locally on devices such as a mobile phone. Specifically, these open-source models are efficient at generating text, so they could be asked to perform tasks such as writing emails.
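To give a sense of why parameter count matters for on-device use, a rough sketch follows that estimates the weight memory of each OpenELM size at half precision (fp16, 2 bytes per parameter, a common choice for on-device inference). The sizes come from the article; the bytes-per-parameter figure and the function name are illustrative assumptions, not Apple's published figures.

```python
# Rough fp16 memory-footprint estimate for the four OpenELM sizes.
# Parameter counts are those reported in the article; 2 bytes per
# parameter assumes half-precision (fp16) weights.
OPENELM_SIZES = {
    "OpenELM-270M": 270_000_000,
    "OpenELM-450M": 450_000_000,
    "OpenELM-1.1B": 1_100_000_000,
    "OpenELM-3B": 3_000_000_000,
}

BYTES_PER_PARAM_FP16 = 2


def fp16_footprint_gb(num_params: int) -> float:
    """Approximate weight memory in gigabytes at fp16 precision."""
    return num_params * BYTES_PER_PARAM_FP16 / 1e9


for name, params in OPENELM_SIZES.items():
    print(f"{name}: ~{fp16_footprint_gb(params):.2f} GB")
```

By this estimate the 3-billion-parameter model alone needs about 6 GB just for its weights, which illustrates why only the smaller variants are plausible candidates for a phone's limited RAM.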

[Image: photomontage with an iPhone and an AI symbol. Manuel Fernandez, Omicrono / PNGWing]

These four models join others that Apple has released to the developer community in recent months. For now, it has not presented any intended for commercial use, as its competitors have.

Nor has it said how this AI might be used on its devices, although we won't have to wait long to find out: in June, the company is expected to show the changes this technology will bring to its software.

Other announcements

The company has been leaving small clues. In December it launched MLX, a machine learning framework that makes it easy to run AI models on Apple Silicon chips. The Cupertino giant has also created MGIE, an AI model for editing images, while Ferret-UI would help navigate the phone's operating system with the aid of AI.

All of these options will be confirmed or ruled out, in theory, at WWDC in June. Even so, all the work done by Apple's engineers may not be enough to catch up in the industry's frenetic race. For this reason, Apple has reportedly contacted Google and other companies to reach agreements that would let it use their most advanced models in Apple products.
