At its annual AI summit, Apple outlined to developers and executives how it plans to move forward with artificial intelligence. While specifics are scarce, we do know that the company is heavily invested in large language models and in how Siri could benefit from them.
According to reports, Apple is testing language model concepts on a weekly basis in an attempt to emulate OpenAI’s highly successful endeavour. Apple’s virtual assistant currently lags far behind Google Assistant and Amazon’s Alexa. Siri lacks contextual awareness compared with Google Assistant, and it could benefit greatly from Apple’s research. Siri handles the tasks it was designed for well, but for anything beyond them, its rivals are far superior.
We recently learned that a developer successfully brought ChatGPT to the Apple Watch.
While the addition comes in the form of an app, it suits the wearable well. If Apple incorporates something similar into Siri, it will be a huge step forward for the virtual assistant. Furthermore, Siri on the iPhone, iPad, and even Mac could offer users a level of assistance that is currently not possible.
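For context, a third-party app like this typically just calls OpenAI’s public chat completions API over HTTPS. The Swift sketch below illustrates the general approach; the function name, types, and choice of model (here "gpt-3.5-turbo") are illustrative assumptions, not details taken from the Apple Watch app in question.

```swift
import Foundation

// Minimal sketch of querying OpenAI's chat completions API, the kind of
// call a watchOS or iOS app could make. The endpoint and JSON shapes
// follow OpenAI's documented API; everything else is illustrative.

struct ChatMessage: Codable {
    let role: String
    let content: String
}

struct ChatRequest: Codable {
    let model: String
    let messages: [ChatMessage]
}

struct ChatResponse: Codable {
    struct Choice: Codable {
        let message: ChatMessage
    }
    let choices: [Choice]
}

func askChatGPT(prompt: String, apiKey: String) async throws -> String {
    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    // Model name is an assumption; the actual app may use a different one.
    request.httpBody = try JSONEncoder().encode(
        ChatRequest(model: "gpt-3.5-turbo",
                    messages: [ChatMessage(role: "user", content: prompt)])
    )
    let (data, _) = try await URLSession.shared.data(for: request)
    let response = try JSONDecoder().decode(ChatResponse.self, from: data)
    return response.choices.first?.message.content ?? ""
}
```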
Compared with its predecessor, the recent GPT-4 update adds support for image input and much more, allowing the language model to provide more detailed responses. Its accuracy has also improved. Overall, the platform has opened up a wide range of opportunities for everyone.
It can be used both academically and professionally. How Apple will put its language model concepts into action remains to be seen: while incorporating the technology directly into Siri makes sense, the company could also offer a standalone platform.