Written by Kim Shoute
Apple regained its throne as the world's biggest mobile device manufacturer, overtaking its rival Samsung with record-high sales in the last quarter of 2016. For developers like me, this means we should anticipate more iOS devices in the coming year.
Still, competition is fierce, and in order to stay on top of the game Apple has invested more in research and development than ever. From AlphaGo's historic victory against top-tier Go player Lee Sedol to Tesla's self-driving cars, breakthroughs in deep learning have broad applications, and public interest is at its peak. The big five are all motivated to bring leaps in artificial intelligence to consumer applications. Apple joins the race to capitalize on this excitement by promising to deliver a magical integration of technology and intuition.
This year, I was fortunate enough to attend tech’s renowned conference, WWDC, and get an insider’s view of what’s new with Apple and how their evolving culture of software impacts the rest of us.
Many advances have been made in other areas, including game development. For brevity, I'll highlight the areas that will have the highest impact on enterprise development.
The iOS team is determined to take advantage of recent developments in artificial intelligence, integrating deep learning and neural networks to make their in-house apps more personalized and convenient. The ultimate goal, according to Tim Cook, is to cultivate intuition on your device that anticipates your needs.
- The introduction of augmented reality in Camera opens up a world of possibilities. Deciding on furniture? Among other things, this feature will take away the heavy lifting, literally, and enable visual cues for map apps. I highly recommend looking deeper!
- Siri’s improved language comprehension and deeper integration with the OS combined with SiriKit means better custom voice interactive apps. It’s only a matter of time before Siri starts doing my taxes. https://www.apple.com/ca/ios/siri/
- A new type of predictive auto-complete incorporates device activity, such as recently viewed items and searches, to provide smart suggestions outside the scope of a traditional dictionary.
- In the Photos app, the improved Faces and Memories features allow you to find people and navigate through memories seamlessly. Full adoption of the HEVC and HEIF formats retains quality while reducing the size of your images by half, allowing you to save more memories on your device.
- Wallet integration with iMessage and the introduction of Business Chat empower businesses to connect with their users and process transactions more efficiently and interactively than ever, effectively eliminating the dreaded phone wait times and the need to set aside time to reach support.
- Interact in 3D with FlyOver and detailed indoor Maps. https://www.apple.com/ca/ios/maps/
- Brand new App Store. https://www.apple.com/ca/ios/app-store/
- Brand new Files app to consolidate all your cloud folders, such as iCloud Drive, Microsoft's OneDrive, and Google Drive.
Other convenient miscellaneous features include…
- Multitasking (iPad only), drag and drop, and a document scanner.
Learn more: https://www.apple.com/ca/ios/ios-11-preview/
Leveraging the power of machine learning, Apple's in-house team was able to build more intuitive apps. But how can we take advantage of smarter algorithms?
Vision, NLP, and Core ML
Along with the long-standing NLP framework, Apple has introduced Vision and Core ML. Complex language processing is designed to be handled by NLP. Vision takes care of everything related to graphics processing, including image pattern recognition and categorization. For everything else, there's Core ML.
Powered by Accelerate and Metal Performance Shaders (MPS), Core ML allows you to drop a trained model into your project and run predictions on the user's device. It takes care of energy efficiency and on-device performance tuning, so you can focus on delivering the experience. This allows you to do some pretty amazing things like…
- Produce images from their descriptions using generative adversarial text-to-image synthesis.
- Word vector representations trained on question answering (QA) problems, combined with recurrent neural networks for sentiment analysis, can take your automated chat bots to the next level, making them more sympathetic and increasingly more accurate.
- You can also enable search on images by using Vision to identify faces, detect features, and classify scenes in images and video.
- Or you could use an open-source pre-trained model to identify the exact type of an object: what kind of flower is that? What monument is this?
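To make the workflow concrete, here is a minimal sketch of running an image classifier on-device by wrapping a Core ML model with the Vision framework. `FlowerClassifier` is a hypothetical `.mlmodel` you have dragged into your Xcode project (Xcode generates a Swift class for it); the Vision APIs themselves are real iOS 11 APIs.

```swift
import CoreML
import Vision

// Classify a flower photo using a hypothetical Core ML model.
func classifyFlower(in image: CGImage, completion: @escaping (String?) -> Void) {
    // Wrap the Core ML model so Vision can feed it images.
    guard let model = try? VNCoreMLModel(for: FlowerClassifier().model) else {
        completion(nil)
        return
    }
    let request = VNCoreMLRequest(model: model) { request, _ in
        // Results come back sorted by confidence; take the top label.
        let best = (request.results as? [VNClassificationObservation])?.first
        completion(best?.identifier)
    }
    // Vision handles scaling and cropping the image to the model's input size.
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

Note that Vision takes care of converting the image to the model's expected input format, which is a big part of what makes the "drop a trained model into your project" promise practical.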
Traditionally, images were out of scope for on-device processing. Hardware wasn't strong enough; software wasn't intelligent enough. Advances in hardware and AI have overcome these barriers, opening the door for richer and more engaging features.
Now let's take a step back to look at NLP. NLP is not new to iOS, but the additions made to NSLinguisticTagger for iOS 11 may prompt you to revisit this framework. New APIs allow you to identify lexical class (is it a verb or a noun?), tokenize text (is it a word or punctuation?), and perform lemmatization, i.e. identify the base form shared by related words (e.g. "better" reduces to "good"). This lets you deconstruct sentences and provide better suggestions to create an overall more interactive experience.
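The additions above can be sketched in a few lines. This is a minimal example of the iOS 11 unit-based tagging API, printing each word's lexical class; the sentence is just sample text.

```swift
import Foundation

// Tag each word in a sentence with its lexical class (noun, verb, etc.).
let text = "The quick brown fox is running better than ever"
let tagger = NSLinguisticTagger(tagSchemes: [.lexicalClass, .lemma], options: 0)
tagger.string = text

let range = NSRange(location: 0, length: text.utf16.count)
let options: NSLinguisticTagger.Options = [.omitPunctuation, .omitWhitespace]

// The new `unit:` parameter lets you enumerate by word, sentence, or paragraph.
tagger.enumerateTags(in: range, unit: .word, scheme: .lexicalClass, options: options) { tag, tokenRange, _ in
    if let tag = tag {
        let word = (text as NSString).substring(with: tokenRange)
        print("\(word): \(tag.rawValue)")  // e.g. "fox: Noun"
    }
}
```

Swapping the scheme to `.lemma` in the same loop yields the base form of each word, which is what powers the "better"/"good" style matching mentioned above.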
Learn more: https://developer.apple.com/machine-learning/
Advances aren't just limited to AI, however. For developers solving conventional problems, advances in the Swift language and tooling will enable you to write more efficient apps and more quickly pinpoint pain points after deployment.
iTunes Connect & TestFlight
For TestFlight, the results are…
- Shorter wait times for beta and App Store releases.
- Support for A/B testing. Enhanced groups enable you to issue different builds to different users.
- An increased beta testing limit of up to 2,000 users.
- Extended beta testing period to 90 days.
- The ability to continue testing after a build has been released to the store.
For iTunes Connect…
- Phased releases enable a gradual rollout of your app on the App Store. This allows you to cancel a release before it reaches a wider audience. No more campfire App Store update horror stories.
Now, on to language! Last September, the Swift team introduced Swift 3. The result was better syntax, translating to more readable, less error-prone code.
This year Apple introduced its successor, Swift 4, just as we were getting accustomed to the syntax introduced in Swift 3. The good news is that source compatibility with Swift 3 is a top priority in Swift 4; the transition, in theory, should require little effort. If you do decide to take the dive into Swift 4, you can look forward to…
- Native JSON encoding and decoding via the new Codable protocol.
- Multiline string literals.
- Memory exclusivity enforcement, a side effect of which is fewer bugs and more performant collection types.
- Type-safe key paths for key-value coding.
- Up to 3x faster string processing.
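Two of these features pair nicely. Here is a minimal sketch combining the new Codable protocol with multiline string literals; the `Employee` type and its sample values are made up for illustration.

```swift
import Foundation

// Conforming to Codable gives you JSON decoding and encoding for free.
struct Employee: Codable {
    let name: String
    let role: String
}

// Multiline string literals (triple quotes) make inline JSON readable.
let json = """
{
    "name": "Kim",
    "role": "Consultant"
}
"""

// Decode the JSON straight into a strongly typed value.
let employee = try JSONDecoder().decode(Employee.self, from: Data(json.utf8))
print(employee.role)  // prints "Consultant"

// Round-trip back to JSON with JSONEncoder.
let data = try JSONEncoder().encode(employee)
```

Compared with hand-rolling `JSONSerialization` dictionaries, the compiler now synthesizes the mapping for you, which removes a whole class of stringly typed bugs.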
Finally, last but not least: the release of Xcode 9 will be the biggest game changer for Apple platform developers like myself. For those of you whose projects simply don't scale with Xcode's build times, the new Xcode promises…
- Native integration with GitHub.
- To be faster in every way. Think 50x faster project navigation.
- Increased parallelism and caching for faster build times.
- Improved code diagnostics, plus runtime address and thread sanitizers.
- Compatibility with Swift 3.2 and Swift 4.
- The ability to generate missing stubs for protocol requirements.
That means more time writing code, and less time waiting for your code to compile and run.
Transitioning from Xcode 8 to Xcode 9 is meant to be easy; however, a word of warning for those itching to upgrade: Xcode 9 is still very much in beta.
You can download Xcode 9 Beta here: https://developer.apple.com/download/
Written by Kim Shoute, DevFacto Consultant