Aussie visionOS Day - Sydney
Event description
For our Sydney event there will be two half days:
- Design will be in the morning
- Enterprise will be in the afternoon
You are welcome to attend both, but please only book the sessions you intend to attend. We are expecting there to be a waitlist for both sessions.
There will be a stellar line-up of presenters covering a broad range of experiences for the Apple Vision Pro, including:
DESIGN for visionOS - 9am to 12:30pm
Accessibility on Apple Vision Pro presented by Apple
The best technology works for everyone. At Apple, our products and services are inclusive by design, with built-in accessibility features — right out of the box. As with all Apple products, powerful accessibility features have been built right into visionOS. Apple Vision Pro provides many features to support vision, physical and motor, hearing, and learning needs. In this session, explore how to personalise your device to meet your unique needs, and discover more about these features and how to develop for them.
Rethinking the UX for Apple Vision Pro Oliver Weidlich (Contxtual)
One of the opportunities for spatial computing is to orient digital content and experiences within the physical world. But there are many aspects of the physical world that we can combine with personal information and digital content to make our experiences easier and more engaging. Oliver will talk about how we can use elements of context to design better spatial computing experiences, using the case study of their Apple Vision Pro app, Day Ahead.
iOS to visionOS: Preparing your apps for the future Devin Davies (Creative Interactions) 2024 Apple Design Award Winner
Mobile has been the dominant platform for the last 17 years, and with that come well-established paradigms and feature-rich applications. In this session we’ll look at how we can take those apps and update them not just to run in the spatial computing era, but to excel on this exciting new platform.
Design great creativity and productivity apps for visionOS presented by Apple
This session will cover fundamental principles to consider when designing your app for visionOS. We’ll explore the various immersion styles available in visionOS, showcasing some impressive examples from developers. Furthermore, we’ll discuss collaborative approaches for enhancing productivity and creative workflows, as well as the advantages of building native apps for visionOS.
ENTERPRISE for visionOS - 1:30pm to 5pm
Immersive Disaster Recovery Orchestration (POC) Rod Sampera (Telstra Purple)
Sharing our explorations into a proof-of-concept Apple Vision Pro app, designed to assist Field Intelligence Officers in monitoring, assessing, and orchestrating Disaster Recovery Operations following major weather or disaster events. The app integrates real-time field intelligence sourced from iPhones, iPads, drone scans, and 3D models of impacted sites. Leveraging the immersive capabilities of Apple Vision Pro, this data is brought to life for in-depth analysis and collaboration, enabling the Disaster Recovery Operations team to make informed decisions and efficiently restore network functionality.
Building Thoughtful ML Experiences for visionOS James Dale (Friday Technologies)
Machine learning has become crucial in shaping how we interact with devices and services in recent years. Let's explore machine learning's capabilities in the spatial context of visionOS. Learn more about building great apps thoughtfully with Core ML and the Enterprise APIs available for visionOS 2.0. We will also explore building with privacy and accessibility in mind and what considerations designers and developers should take when running models in the wild to respect user data and maintain trust.
Spatial computing for medical assessments and health care Mel Randall (Labflow)
Labflow currently provides software solutions within the health care sector and is embracing the potential that Vision Pro may bring to its customers. In medical environments, professionals are often ‘on-the-go’, moving between wards, workstations and testing labs. With Apple Vision Pro, the ability to access data, hands free and untethered to a traditional workstation is potentially a game changer for productivity and efficiency.
Enhancing Hearing Experiences with Vision Pro: NALscribe AR and the Future of Immersive Audiology Nicky Chong-White (National Acoustic Laboratories)
This presentation explores several applications of Apple Vision Pro in hearing health. First, our NALscribe AR app enhances live captioning by creating speaker-attributed streams positioned above each speaker and adding visual cues when someone begins speaking, making conversations more accessible for those with hearing loss. We are also working towards introducing a Vision Pro headset in selected hearing clinics, allowing patients to experience enhanced sound clarity and improved speech understanding in immersive, realistic scenarios (e.g. a noisy restaurant, music venue, or shopping mall). Additionally, we are investigating ways to offer researchers a flexible tool to recreate realistic soundscapes anywhere, providing more control and immersion without the need for sophisticated lab setups, enabling faster, more effective hearing research.