Sitting with hundreds of developers at Apple Park during the Worldwide Developers Conference a few months ago, I got a first look at Apple Intelligence, Cupertino's bold new step into generative artificial intelligence. The debut was highly anticipated, but tech journalists like me, who cover Apple up close, were more interested in how the company would approach artificial intelligence differently, and whether it had a concrete plan to integrate it into the iPhone, Mac, and iPad. This was a crucial point because, until now, no one had convincingly demonstrated how AI could genuinely enhance the user experience; most of the hype has centred on underwhelming use cases.

Apple's approach to AI through Apple Intelligence, however, is more nuanced and well integrated into familiar applications and interfaces, all while prioritising data privacy. It might still be basic and surface-level, but it works. This is exactly what artificial intelligence should do: improve the software and devices we use, rather than address problems that don't exist in the first place. Here's an early look at Apple Intelligence, part of the iOS 18.1 developer beta, which I am running on the iPhone 15 Pro.

Waitlist for early preview of Apple Intelligence

I had to wait for hours to get access to Apple Intelligence, which is currently available only in the US and works with select Apple devices. At the moment, Apple is letting developers try Apple Intelligence, a flavour of AI as I would call it, but access is granted only after one joins a waitlist. Apple advises that before downloading the update, "Both device language and Siri language must be set to U.S. English, and the device region must be set to the United States". The company has also clarified that Apple Intelligence is not available in the EU or China.
It is currently available only on the iPhone 15 Pro and iPhone 15 Pro Max, and I believe Apple Intelligence will be part of all iPhone 16 models. It will also work on iPads with M1 or later chips running the iPadOS 18.1 beta, and on Macs with the same chips running the macOS 15.1 beta. I should also clarify that this is a developer beta, not a public beta; Apple has typically released public betas months before a new version of iOS launches.

If you have a developer account and would like to try Apple Intelligence (to be clear, only select features are live as part of the iOS 18.1 developer beta), make sure the device is updated to iOS 18.1. Once done, you need to request access to Apple Intelligence. Go to Settings; below General, you will notice a new option, Apple Intelligence & Siri. Choose it, and at the top of the next screen tap Join the Apple Intelligence Waitlist.

If you're wondering how I got access, it's quite straightforward. I changed the region and language settings on my iPhone 15 Pro to the United States and U.S. English, respectively, and my developer account allowed me to get early access. Needless to say, being a tech journalist has its perks, including access to software and hardware before most people.

What it is like to use Apple Intelligence

As soon as I got access to Apple Intelligence, the first thing I tried was the new Siri. I am a regular user of Apple's voice assistant, but honestly, I mostly use it to set alarms. I wanted Siri to improve, but over the years the assistant seemed to get worse. With Apple Intelligence, Siri has received a major makeover and a significant upgrade. Once you fire up Siri, a translucent glow appears around the edges of the screen instead of the familiar hovering circle.
This is a refreshing visual change, but what made a bigger impact, at least for me, was Siri's more natural conversational ability. I could engage Siri in a conversation and keep going as if it were a dialogue between two people, and I could even interrupt Siri to ask other questions. For example, I asked Siri what the nearest train station was around London Bridge, and it came up with a list of stations. I then asked it to recommend a few restaurants nearby, chose the first one on the list, and Siri dug into details about it without my having to repeat the location. The good thing about the new Siri is that it remembers the context of the conversation. A Google search could also do the job, but I found Siri a quicker and easier way to get information like this.

Another Apple Intelligence feature that left me impressed is Writing Tools. It offers several options to help you improve the language of a write-up, adjust the tone of an email, or create a summary. I tried enhancing the language of a story I had been working on for a long time. All you need to do is select the text and then slide over to Writing Tools, which appears alongside Copy and Paste. It offers different modes such as proofreading and rewriting, along with quick shortcuts to tones of voice like Friendly, Professional, or Concise. You can also get a summary or key points based on what is written.

While I would still edit a copy myself and may not rely entirely on AI, I expect many people to use this feature to adjust the tone of an email and make it sound more professional. You can use tools like Copilot or ChatGPT to fix the tone of an email or a paragraph, but those programmes aren't integrated as deeply as Apple Intelligence, which is baked into the operating system and its apps. For example, I can use Writing Tools inside iMessage, which is helpful, similar to how Meta AI is integrated into WhatsApp.
It's only been a day since I got my hands on Apple Intelligence, and I haven't had the chance to explore every feature Apple has made available in the iOS 18.1 developer beta. There are, however, features I would use every day. In the Notes app, for example, I can record and transcribe audio, and Apple Intelligence generates a summary for me. This is a must-have tool for someone like me who records interviews and files stories based on those recordings. I don't deny that Otter.ai made me dependent on its audio transcription software, but it doesn't come cheap; after three recordings, it asks for a subscription. Apple Intelligence, by contrast, is free, and the quality of its audio transcription is as good as Otter's.

The call recording feature is also very useful for journalists. Sure, there are call recording apps available, but when it comes to recording sensitive information, I don't trust third-party apps. Apple Intelligence goes further: not only does it record the audio, it automatically saves the recording to the Notes app, and within a minute of the call ending it produces an accurate, clearly labelled transcription. I can then go to the Notes app, replay the audio, and search through the entire transcript. For transparency's sake, the system announces that the call is being recorded while it is in progress.

Lastly, Apple Intelligence is also bringing generative smarts to memories in the Photos app, where you can now create a memory just by typing a prompt. For example, when I input "Trip to Kerala," the system created a storybook using photos and audio. What I did not expect was the song it chose for the background: one from the Malayalam-language movie Thammil Thammil. It suggests Apple Intelligence can infer the language associated with the region in the prompt and pick a song from a popular Malayalam film accordingly.
Early impressions

There's a lot of chatter around artificial intelligence and its use cases, some of which are exaggerated or oversold by companies. Apple seems aware of the extent to which AI can be effectively used, and it has seamlessly integrated Apple Intelligence into the apps and devices I use the most. To be clear, what Cupertino offers with Apple Intelligence may not be groundbreaking, but these features blend well with what we actually want from AI, from adjusting the tone of an email to enhancing photo albums. As I mentioned earlier, Apple Intelligence is still in its early days; many of the features Apple originally showed have not made it into the iOS 18.1 beta. If you ask me what sets Apple apart from Google, Microsoft, and OpenAI, it is that many of the Apple Intelligence features I tried actually work, which I consider a good start. Of course, Apple Intelligence is not perfect and some areas need improvement, but this is an early beta.