Apple on Monday, during its Worldwide Developers Conference 2021, announced iOS 15, an update with powerful features that enhance the iPhone experience. In summary, the company says the mobile operating system makes FaceTime calls more natural, introduces SharePlay for shared experiences, helps users focus and be in the moment with new ways to manage notifications, and brings more intelligence to photos and search to quickly access information.
“For many customers, iPhone has become indispensable, and this year we’ve created even more ways it can enhance our daily lives,” said Craig Federighi, Apple’s senior vice president of Software Engineering.
“iOS 15 helps users stay connected while sharing experiences in real time, gives them new tools to help reduce distraction and find focus, uses intelligence to enhance the photos experience, and, with huge upgrades to Maps, brings new ways to explore the world. We can’t wait for customers to experience it,” Federighi added.
What captured the attention of many users is the on-device intelligence that now powers the new Live Text feature, enabling the iPhone to recognize text in a photo and let users take action on it. Sound familiar? It should. This is Apple’s answer to Google Lens, launched by Google in October 2017 and designed to surface relevant information about objects it identifies through neural-network-based visual analysis. Nearly four years later, Apple has launched its own.
The feature will recognize text in photos or through Apple’s Camera app and will read seven languages to start. For instance, users can now search for and locate the picture of a handwritten family recipe, or capture a phone number from a storefront with the option to place a call. With the power of the Apple Neural Engine, the Camera app can quickly recognize and copy text in the moment, such as the Wi-Fi password displayed at a local coffee shop or a quote shared as a photo on WhatsApp. And rather than copying down a conference or classroom whiteboard by hand, users can simply point the camera at the text, then copy and paste it wherever they wish.
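For developers, this kind of on-device OCR has been exposed through Apple’s Vision framework since iOS 13. The Swift sketch below is not Apple’s Live Text implementation, just a minimal illustration of the underlying text-recognition API; how the image is obtained is left out, and the language choice is an assumption:

```swift
import UIKit
import Vision

// A minimal sketch of on-device text recognition with Apple's Vision
// framework -- the same kind of OCR that underpins Live Text.
func recognizeText(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Join the best candidate for each detected line of text.
        let text = observations
            .compactMap { $0.topCandidates(1).first?.string }
            .joined(separator: "\n")
        print(text)
    }
    request.recognitionLevel = .accurate       // favor accuracy over speed
    request.recognitionLanguages = ["en-US"]   // one of the supported languages

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```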
What Apple calls Visual Look Up will let users learn more about popular art and landmarks around the world, plants and flowers, breeds of pets, or even find books. Spotlight now uses intelligence to search photos by location, people, scenes, or objects, and, using Live Text, it can find text and handwriting in photos. It also offers web image search and all-new rich results for actors, musicians, TV shows, and movies, much as Google Lens does. Live Text works across iPhone, iPad, and Mac and supports seven languages: English, Chinese (simplified and traditional), French, Italian, German, Spanish, and Portuguese.
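Visual Look Up itself is a system feature rather than a public API, but Vision’s built-in image classifier gives a feel for how on-device labeling of scenes and objects works. A brief sketch, with the caveat that the 0.3 confidence cutoff is purely an illustrative assumption:

```swift
import Vision

// A sketch of Vision's built-in on-device image classifier, which hints
// at how a Visual Look Up-style feature can label scenes and objects.
// The 0.3 confidence threshold is an illustrative assumption.
func classify(_ cgImage: CGImage) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    let observations = request.results as? [VNClassificationObservation] ?? []
    return observations
        .filter { $0.confidence > 0.3 }
        .map { "\($0.identifier) (\($0.confidence))" }
}
```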