
Is Apple still as visionary as it was in the past?

Every year, Apple's conferences ignite a spark of excitement, captivating IT professionals and media alike, and provoking lively discussions. Our colleague Piotr Jeremicz, Technical Lead (iOS) at ERGO Technology & Services Poland, attended Apple's Worldwide Developers Conference in California this year. We asked him about his impressions.

Mainstage at Apple Worldwide Developers Conference

Would you describe the conference as evolutionary or revolutionary? How will it affect the further development of mobile technologies?

While this year's conference introduced a new product, I would not classify it as a revolutionary breakthrough. Instead, it is another step in Apple's ongoing development cycle within its established platforms. Putting visionOS aside for a moment, the true essence lies in the continuous evolution of existing technologies, aligning with Apple's vision for developers.

In recent years, it has become evident that Apple is directing its focus towards AR, machine learning, object recognition, and object analysis. The culmination of this creative cycle of new software is represented by visionOS, with Apple Vision Pro taking the lead. As a developer, I view it as another year in which I dedicate June, when WWDC happens, to acquiring new knowledge and skills which I subsequently apply in my day-to-day work for the following eleven months. It is a gradual evolution that ultimately leads to revolutionary outcomes.

Could you tell us something about WWDC 2023?

The magic of WWDC is the moment when a developer directly interacts with the company that provides software for them, shedding their anonymity among fellow developers. This is the aspect that defines the conference – its immediacy. Currently, the event spans three days, commencing with a “kick-off” day that precedes the Keynote. It serves as a typical networking occasion, where this year we had the opportunity to visit Apple's former campus in Cupertino, California.

The event boasted a diverse range of employees at various levels – from engineers and communications personnel to directors. A conference of this nature brings us closer to this select group. We have the chance to meet Apple engineers and discuss our concerns, and ultimately, the opportunity to witness the unveiling of new products first-hand.

What has changed from a developer's perspective with the release of Apple Vision Pro?

From our perspective, there is both significant change and continuity. The changes are evident with the introduction of a new system, which brings forth fresh opportunities. Primarily, it enables the creation of innovative end-user applications. The “Pro” suffix in Vision Pro implies a departure from generic apps towards more advanced ones carrying specialised content. Moreover, Vision Pro itself, equipped with numerous sensors and capabilities to perceive the world in the way that humans do, opens the door to comprehending distances, studying shapes, analysing the environment, and recognising objects.

These capabilities go beyond mere app creation for tasks such as ordering food or browsing social media. Instead, they enable integration with the world, such as overlaying digital content on physical objects or creating an AR experience tailored to specific environments. A compelling example would be displaying a life-sized 3D model of a CT scan in the actual operating room, providing valuable insights for surgeons before an operation. It raises anticipation for the possibilities that lie ahead. For instance, architects could automatically scan entire rooms and generate instant 3D schematics, while graphic designers could explore new dimensions to create visually captivating spaces.

On the other hand, certain aspects remain unchanged for developers. They can seamlessly bring an existing project to visionOS by adding it as an additional destination, allowing that project to be readily available on the new platform. This continuity ensures that developers can deliver their products to new users without encountering major obstacles.
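To illustrate the kind of continuity described above, here is a minimal sketch of how a single Swift code base can accommodate an added visionOS destination. It assumes Swift 5.9 or later, where the `os(visionOS)` compilation condition is available; `platformGreeting` is a hypothetical helper for illustration, not an Apple API.

```swift
import Foundation

// Hypothetical helper: existing platform branches keep working
// unchanged, and a visionOS branch is simply added alongside them.
func platformGreeting() -> String {
    #if os(visionOS)
    return "Running on visionOS"
    #elseif os(iOS)
    return "Running on iOS"
    #else
    return "Running on another platform"
    #endif
}

print(platformGreeting())
```

Because the branching happens at compile time, the existing iOS code paths are untouched when the project gains the new destination, which is what makes the transition low-friction.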

The glasses made available on the market so far have not caught on particularly well. Do you think the new Apple Vision Pro will popularise wearable technology? Or is it more a case of users' habits needing to change?

Amidst a sea of diverse phone designs and touchscreen technologies, the initial unveiling of the iPhone exemplified a remarkable convergence, forever altering the landscape of mobile devices. Similarly, the introduction of the iPad revolutionised the concept of a portable entertainment hub, while the Apple Watch swiftly rose to become the reigning champion in the realm of smartwatches. And let us not overlook AirPods, which ingeniously transformed ordinary headphones into an everyday accessory, so effortlessly stowed away in one's pocket that they now seem commonplace.

Now, we have the Vision Pro, a device made of high-quality materials that looks like something out of this world. Seeing these glasses at the conference gave me the impression that this is the future. However, we must now ponder the question: who exactly is the intended recipient of this advancement? Undoubtedly, it is not destined to be embraced by all.

The Vision Pro showcases an ensemble of cameras and sensors, including a LiDAR scanner, around a dozen cameras, iris-based Optic ID authentication, and meticulous eye position tracking. Beyond the remarkable feat of presenting visuals directly in front of the wearer's eyes, the purpose of Apple’s new glasses extends even further, allowing the wearer to use their eyes as a natural means of pointing, much as we effortlessly use a mouse.

Did Apple address the impact on users’ health when presenting the new product? If so, what solutions are we talking about?

One of the key aspects, particularly in low-light conditions, is screen brightness that is intelligently adjusted to the condition of our eyes. We must also consider factors that can potentially strain our eyesight, such as dryness and eye-muscle fatigue when focusing at a fixed distance for long periods. While Apple Vision Pro should be treated as a working tool that necessitates maintaining natural habits, its overall impact, whether positive or negative, remains uncertain.

Furthermore, it is still unknown whether people will start wearing these glasses more widely in public spaces. Possibly users will only put them on when necessary, thereby minimising any potential strain on their vision.


Do you see more potential in AR and VR? If so, what is it?

While the ability to watch movies on a 120-inch screen is undoubtedly impressive, it does not align with my personal goals. Instead, I find value in exploring possibilities that can benefit users, even if they do not physically exist yet.

These glasses have the potential to revolutionise specific industries, particularly those that require specialised approaches. While mass adoption is still a distant goal, they are likely to find greater use in the entertainment realm first. AR and VR technology will bring the greatest benefits in areas where spaces play an important role. Medicine will benefit from the possibility of analysing virtual 3D images, architecture will gain faster, real-time visualisation of room layouts, and training techniques in specialised fields will gain a new tool.

However, I also envision an area where early adopters could be individuals who are blind or visually impaired. The iOS environment already offers excellent accessibility support, and I had the privilege of working on a project in Warsaw that made public transportation information more accessible through sound. Thanks to such an experience, I deeply understand the challenges faced by those who struggle to perceive the predominantly visual nature of reality.

With virtual glasses, people, including white-cane users and those relying on VoiceOver, can receive information such as “There is room number 6 ahead of you” without using their hands. This ensures better accessibility, convenience and comfort of movement, especially in spaces containing a lot of text information.

Although the high price currently categorises it as a “Pro” option and limits its widespread adoption, I firmly believe this technology holds the potential to spark the next revolution in many industries and could provide better accessibility for people with special needs.

Tim Cook, CEO of Apple, and Piotr Jeremicz, Technical Lead (iOS) at ERGO Technology & Services Poland, at WWDC 2023
