
iOS 18.2 Beta 2 Introduces API for Developers to Enable Content Awareness Using Siri and Apple Intelligence

With the new API, users will be able to say or type “Hey Siri, what's this document about?” to ask the assistant to provide a summary of a document.


Apple Intelligence is available on the iPhone 15 Pro and newer models

Highlights
  • iOS 18.2 beta 2 was rolled out to developers on Monday
  • The latest update adds support for an API for Apple Intelligence
  • Users will be able to send the content of their screen to Siri for analysis

iOS 18.2 beta 2 was rolled out to developers and testers on the developer beta channel on Monday, as Apple prepares the next version of its smartphone operating system, which is expected to arrive in early December with more Apple Intelligence features. The latest beta release also includes support for a new application programming interface (API) that lets developers give the system access to on-screen content, enabling Siri and Apple Intelligence to send that information to third-party services for processing.

Apple Introduces API for Siri's Onscreen Awareness Feature

On the Apple Developer website, the company has provided documentation (via MacRumors) for the new API, titled 'Making onscreen content available to Siri and Apple Intelligence'. It is designed to give Siri and Apple Intelligence access to an app's onscreen content so they can understand what the user is currently viewing.

If a developer adds support for the onscreen content API, their application will provide the contents of the screen to Siri and Apple Intelligence when a user explicitly requests it, according to the company. The information on a user's screen can then be shared with a third-party service (such as OpenAI's ChatGPT).
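
To give a sense of what adoption might look like, below is a minimal Swift sketch based on Apple's description of the feature. The DocumentEntity type, its properties, and the query stub are hypothetical names used for illustration; the overall pattern, an App Intents entity with a Transferable plain-text representation that Siri and Apple Intelligence can read on request, follows the documentation's stated approach, but the exact details may differ from Apple's own sample code.

```swift
import Foundation
import AppIntents
import CoreTransferable

// Hypothetical entity describing the document currently shown on screen.
// Type and property names are illustrative, not taken from Apple sample code.
struct DocumentEntity: AppEntity, Transferable {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Document")
    static var defaultQuery = DocumentQuery()

    var id: UUID
    var title: String
    var fullText: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }

    // The Transferable representation is what lets Siri and Apple Intelligence
    // read the document's content (here, plain text) when the user asks.
    static var transferRepresentation: some TransferRepresentation {
        ProxyRepresentation(exporting: \.fullText)
    }
}

// Minimal query stub required by AppEntity; a real app would resolve
// identifiers against its own document store.
struct DocumentQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [DocumentEntity] { [] }
}
```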

Apple has also provided an example of Siri accessing onscreen content: while browsing the web, a user can say or type “Hey Siri, what's this document about?” to ask Siri for a summary of the document.
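
In practice, answering a question like that requires the app to tell the system which entity is currently on screen. The SwiftUI sketch below assumes the association is made through the view's user activity, in line with Apple's documentation; the activity type string and the appEntityIdentifier property name are assumptions rather than confirmed API details.

```swift
import SwiftUI
import AppIntents

// Hypothetical reader view that advertises the visible document to the system,
// so an explicit request such as "What's this document about?" can include it.
struct DocumentReaderView: View {
    let document: DocumentEntity   // from the earlier sketch

    var body: some View {
        ScrollView {
            Text(document.fullText)
        }
        // The activity type string is illustrative; associating the on-screen
        // entity through the user activity is an assumption based on the
        // 'Making onscreen content available…' documentation.
        .userActivity("com.example.reader.viewingDocument") { activity in
            activity.appEntityIdentifier = EntityIdentifier(for: document)
        }
    }
}
```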

Developers can also add support for onscreen awareness in browser, document reader, file management, mail, photo, presentation, spreadsheet, and word processing apps. Apple says this list is not exhaustive, so more types of apps should be able to take advantage of the API in the future.
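
For document-centric categories such as PDF readers or presentation apps, the shared content does not have to be plain text. The snippet below is a hypothetical variation on the earlier entity, showing only the Transferable side (a real adoption would also conform to AppEntity as above); it composes a file-based PDF representation with a plain-text fallback so the system can pick whichever form is more useful.

```swift
import Foundation
import CoreTransferable
import UniformTypeIdentifiers

// Hypothetical PDF-backed document; names are illustrative.
struct PDFDocumentEntity: Transferable {
    var fullText: String
    var fileURL: URL

    static var transferRepresentation: some TransferRepresentation {
        // Prefer handing over the original PDF file when one exists…
        FileRepresentation(exportedContentType: .pdf) { document in
            SentTransferredFile(document.fileURL)
        }
        // …and fall back to plain text for quick summarisation.
        ProxyRepresentation(exporting: \.fullText)
    }
}
```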

It's worth noting that iOS 18.2 won't bring support for the revamped Siri, which is expected to offer greatly improved functionality. That upgrade is expected to arrive with iOS 18.4 alongside support for in-app actions; the update is reportedly slated for an April 2025 release, giving developers ample time to integrate support for the API into their apps.


