Published Date: 11/06/2025
If you’ve upgraded to a newer iPhone model recently, you’ve probably noticed that Apple Intelligence is showing up in some of your most-used apps, like Messages, Mail, and Notes. Apple Intelligence (also abbreviated to AI) was introduced into Apple’s ecosystem in October 2024, and it’s here to stay as Apple competes with Google, OpenAI, Anthropic, and others to build the best AI tools.
Cupertino marketing executives have branded Apple Intelligence as “AI for the rest of us.” The platform is designed to leverage the things that generative AI already does well, like text and image generation, to improve existing features. Like other platforms, including ChatGPT and Google Gemini, Apple Intelligence is built on large models trained on enormous amounts of data. These systems use deep learning to form connections, whether in text, images, video, or music.
The text offering, powered by a large language model (LLM), presents itself as Writing Tools. The feature is available across various Apple apps, including Mail, Messages, Pages, and Notifications. It can be used to summarize long text, proofread, and even write messages for you, using content and tone prompts.
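For developers, Writing Tools ride along with the standard system text controls rather than requiring a separate integration. The snippet below is a minimal sketch, assuming the UIKit additions Apple shipped alongside Apple Intelligence in iOS 18 (the writingToolsBehavior property and UIWritingToolsBehavior enum); the note-editor view controller is a hypothetical example, so verify the API names against the current SDK.

```swift
import UIKit

// Minimal sketch: opting a text view into system Writing Tools on iOS 18+.
// Assumes UITextView.writingToolsBehavior and UIWritingToolsBehavior from
// Apple's iOS 18 SDK; check current documentation before relying on them.
final class NoteEditorViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        textView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(textView)

        if #available(iOS 18.0, *) {
            // .complete allows full inline proofreading and rewriting;
            // .limited confines Writing Tools to a panel instead.
            textView.writingToolsBehavior = .complete
        }
    }
}
```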
Image generation has also been integrated, albeit a bit less seamlessly. Users can prompt Apple Intelligence to generate custom emojis (Genmojis) in an Apple house style. Image Playground, meanwhile, is a standalone image generation app that utilizes prompts to create visual content that can be used in Messages, Keynote, or shared via social media.
Apple Intelligence also marks a long-awaited face-lift for Siri. The smart assistant was early to the game but has mostly been neglected for the past several years. Siri is now integrated much more deeply into Apple’s operating systems. Instead of the familiar icon, users will see a glowing light around the edge of their iPhone screen when it’s active.
More importantly, the new Siri works across apps. For example, you can ask Siri to edit a photo and then insert it directly into a text message. It’s a frictionless experience the assistant had previously lacked. Onscreen awareness means Siri uses the context of the content you’re currently engaged with to provide an appropriate answer.
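Apple has pointed developers to its App Intents framework as the way apps describe their actions to this deeper, cross-app Siri. The sketch below is a hypothetical example of what exposing a photo-editing action might look like; the intent name and parameter are invented for illustration, while AppIntent, @Parameter, and IntentResult are part of Apple’s real framework.

```swift
import AppIntents

// Minimal sketch of an App Intent that Siri and Shortcuts can invoke.
// "Apply Photo Filter" is a hypothetical action; a real app would perform
// the actual edit inside perform().
struct ApplyPhotoFilterIntent: AppIntent {
    static var title: LocalizedStringResource = "Apply Photo Filter"

    @Parameter(title: "Filter Name")
    var filterName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Placeholder: apply the named filter to the user's selected photo here.
        return .result(dialog: "Applied the \(filterName) filter.")
    }
}
```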
Leading up to WWDC 2025, many expected that Apple would introduce an even more souped-up version of Siri, but we’ll have to wait a bit longer. “As we’ve shared, we’re continuing our work to deliver the features that make Siri even more personal,” said Apple SVP of Software Engineering Craig Federighi at WWDC 2025. “This work needed more time to reach our high-quality bar, and we look forward to sharing more about it in the coming year.”
At WWDC 2025, Apple also unveiled a new AI feature called Visual Intelligence, which helps you do an image search for things you see as you browse. Apple also introduced a Live Translation feature that can translate conversations in real time in the Messages, FaceTime, and Phone apps. Visual Intelligence and Live Translation are expected to be available later in 2025 when iOS 26 launches to the public.
Apple Intelligence took center stage at WWDC 2024, after months of speculation. The platform was announced in the wake of a torrent of generative AI news from companies like Google and OpenAI, causing concern that Apple had missed the boat on the latest tech craze. Contrary to such speculation, however, Apple had a team in place, working on what proved to be a very Apple approach to artificial intelligence.
Apple Intelligence isn’t a standalone feature. Rather, it’s about integrating into existing offerings. While it is a branding exercise in a very real sense, the LLM-driven technology will operate behind the scenes. As far as the consumer is concerned, the technology will mostly present itself in the form of new features for existing apps.
We learned more during Apple’s iPhone 16 event in September 2024. During the event, Apple touted several AI-powered features coming to its devices, from translation on the Apple Watch Series 10 and visual search on iPhones to a number of tweaks to Siri’s capabilities. The first wave of Apple Intelligence arrived at the end of October as part of iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1.
The features launched first in U.S. English. Apple later added Australian, Canadian, New Zealand, South African, and U.K. English localizations. Support for Chinese, English (India), English (Singapore), French, German, Italian, Japanese, Korean, Portuguese, Spanish, and Vietnamese will arrive in 2025.
Apple Intelligence is limited to Apple’s most recent hardware: the iPhone 15 Pro, iPhone 15 Pro Max, and the full iPhone 16 lineup, along with iPads and Macs powered by M1 or later chips. Older models, such as the iPhone 13 and 14, lack the on-device processing power to run these features, even with the latest software updates.
In summary, Apple Intelligence is a significant step forward in Apple’s AI strategy, offering a wide range of features that enhance the user experience across various apps. From writing tools and image generation to a revamped Siri and upcoming features like Visual Intelligence and Live Translation, Apple is making AI more accessible and user-friendly for everyone.
Q: What is Apple Intelligence?
A: Apple Intelligence is a set of AI features integrated into Apple’s ecosystem, designed to enhance existing apps with text and image generation, and a revamped Siri.
Q: When was Apple Intelligence unveiled?
A: Apple Intelligence was unveiled at WWDC 2024, following months of speculation about Apple’s AI strategy.
Q: What new features does Apple Intelligence bring to Siri?
A: Apple Intelligence brings a more integrated and context-aware Siri that can edit photos, insert content into messages, and provide more personalized assistance across apps.
Q: Which devices support Apple Intelligence?
A: Apple Intelligence is supported on the iPhone 15 Pro, iPhone 15 Pro Max, and all iPhone 16 models, as well as iPads and Macs with M1 or later chips. Older models like the iPhone 13 and 14 are not supported.
Q: What are some upcoming features of Apple Intelligence?
A: Upcoming features include Visual Intelligence for image search and Live Translation for real-time conversation translation, expected to be available with iOS 26 in 2025.