“The transition to Apple silicon represents the biggest leap ever for the Mac.”
Apple kicked off the 31st edition of its flagship WWDC conference yesterday. Dubbed the biggest WWDC to date, WWDC20 brings together the global Apple developer community of more than 23 million in an entirely virtual format, from June 22 to 26. On Day 1, Apple made several landmark announcements, from new iOS features to its custom silicon initiative and more. In the next section we list a few key announcements that are relevant to the AI/ML community.
Make Way For Apple Silicon
On a historic day for the Mac, Apple announced that it will transition the Mac to its world-class custom silicon to deliver industry-leading performance and powerful new technologies. Developers can now get started updating their apps to take advantage of the advanced capabilities of Apple silicon in the Mac. This transition will also establish a common architecture across all Apple products, making it far easier for developers to write and optimize their apps for the entire ecosystem.
“From the beginning, the Mac has always embraced big changes to stay at the forefront of personal computing. Today we’re announcing our transition to Apple silicon, making this a historic day for the Mac,” said Tim Cook, Apple’s CEO.
“The A13 Bionic is the fastest CPU ever in a smartphone,” Apple said onstage at one of its conferences last year. Building upon this architecture, Apple is designing a family of SoCs for the Mac. This will give the Mac industry-leading performance per watt and higher-performance GPUs, enabling app developers to write even more powerful pro apps and high-end games. Access to technologies such as the Neural Engine will also make the Mac an amazing platform for developers working with machine learning.
Apple plans to ship the first Mac with Apple silicon by the end of the year and complete the transition in about two years.
ML Features Get Even Better
Last year, at WWDC, Apple announced a number of machine learning updates and demonstrated how developers can benefit from on-device model customisation. This year, Apple announced that its developer features get even better. Some key features:
- ARKit 4 introduces new ways to capture information about the real world using a new Depth API that is designed to work with the LiDAR sensor in iPad Pro, enabling entirely new types of apps in areas such as on-site architecture, design, landscaping, and manufacturing (see the sketch after this list).
- Machine learning development is easier and more extensive with additional tools in Core ML for model deployment and encryption, new templates and training capabilities in Create ML, and more APIs for vision and natural language.
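To make the Depth API concrete, here is a minimal ARKit 4 sketch that opts a session into scene depth and reads the per-frame depth map. It assumes a LiDAR-equipped device and standard ARKit session setup around it; the class name is a placeholder.

```swift
import ARKit

final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth is only available on LiDAR-equipped devices such as iPad Pro.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(configuration)
    }

    // Called every frame; ARDepthData carries a depth map plus a per-pixel confidence map.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let sceneDepth = frame.sceneDepth else { return }
        let depthMap: CVPixelBuffer = sceneDepth.depthMap
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        print("Depth map \(width)x\(height), confidence available: \(sceneDepth.confidenceMap != nil)")
    }
}
```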
Core ML forms the foundation for domain-specific frameworks and functionality, supporting Vision for image analysis and Natural Language for text processing. It can be used together with Create ML to train and deploy custom models. With Create ML, one can build models for object detection, activity and sound classification, and recommendations, and take advantage of word embeddings and transfer learning for text classification.
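As an illustration of that workflow, the sketch below trains a simple Create ML text classifier from a labeled table and exports it as a Core ML model; the CSV path, output path, and the "text"/"label" column names are hypothetical placeholders, and it is intended to run as a macOS playground or command-line tool.

```swift
import CreateML
import Foundation

do {
    // Load a labeled dataset (hypothetical CSV with "text" and "label" columns).
    let data = try MLDataTable(contentsOf: URL(fileURLWithPath: "/path/to/reviews.csv"))
    let (trainingData, testingData) = data.randomSplit(by: 0.8, seed: 42)

    // Train a text classifier on the training split.
    let classifier = try MLTextClassifier(trainingData: trainingData,
                                          textColumn: "text",
                                          labelColumn: "label")

    // Evaluate on the held-out split and report accuracy.
    let evaluation = classifier.evaluation(on: testingData,
                                           textColumn: "text",
                                           labelColumn: "label")
    print("Accuracy: \((1.0 - evaluation.classificationError) * 100)%")

    // Export a .mlmodel that can be bundled with an app and loaded via Core ML.
    try classifier.write(to: URL(fileURLWithPath: "/path/to/TextClassifier.mlmodel"))
} catch {
    print("Training failed: \(error)")
}
```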
With over 100 model layer types now supported in Core ML, the ML team at Apple believes that apps can use state-of-the-art models to deliver experiences that deeply understand vision, natural language and speech like never before.
Stay tuned to AIM for more updates about WWDC.