Manned-unmanned Operations Trainer Use Case



The Collins Aerospace Applied Research & Technology Centre is seeking to deploy a Mixed Reality (MR) simulator, combining a physical cockpit with synthetically generated window views, on the CHARITY platform to demonstrate that MR applications can be deployed and operated across a continuum of device, edge and cloud resources.

Advances in XR have opened up significant opportunities for training simulators to increase trainees' sense of immersion. Commercial flight simulators, which traditionally require large, expensive screens and significant local computation resources, are looking to leverage advances in XR to reduce the amount of equipment required on site while bringing an elevated sense of realism and immersion to flight trainees.

Our redrafting of the conventional flight simulator model extends beyond XR headsets to a fluid microservice architecture distributed across the edge and cloud. Moving flight simulator deployments from forklifts to backpacks requires rethinking the entire architecture. The maintenance and scalability efficiencies of a cloud-based approach are attractive, but they come with equally significant challenges in latency and orchestration when deploying at scale across multiple users. CHARITY gives us a platform stretching from device to edge to cloud on which to roll out, operate and manage flight simulator instances. It offers monitoring, orchestration and dynamic adaptation, efficient routing, and mobility, and we need all of these to push forward our vision of a cloud-based mixed reality flight simulator.
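One way to picture the device-edge-cloud trade-off described above is as a latency-aware placement decision for each simulator microservice. The sketch below is purely illustrative: the component names, latency figures and placement rule are assumptions for explanation, not CHARITY platform APIs or actual Collins Aerospace components.

```python
# Hypothetical sketch: latency-aware placement of flight-simulator
# microservices across the device/edge/cloud continuum. All names and
# numbers are illustrative assumptions, not CHARITY interfaces.

from dataclasses import dataclass


@dataclass
class Component:
    name: str
    max_latency_ms: float  # round-trip budget the component can tolerate


# Assumed round-trip latency from the trainee's headset to each tier.
TIER_LATENCY_MS = {"device": 1.0, "edge": 15.0, "cloud": 60.0}


def place(component: Component) -> str:
    """Pick the most centralized tier that still meets the latency budget."""
    for tier in ("cloud", "edge", "device"):  # prefer cloud when possible
        if TIER_LATENCY_MS[tier] <= component.max_latency_ms:
            return tier
    raise ValueError(f"No tier satisfies the budget of {component.name}")


# Illustrative pipeline: tight loops stay near the headset, tolerant
# services migrate toward the cloud.
pipeline = [
    Component("head-tracking", 5.0),        # must stay close to the HMD
    Component("window-view-render", 20.0),  # suits an edge GPU
    Component("flight-dynamics", 100.0),    # tolerant of cloud round-trips
]

for c in pipeline:
    print(c.name, "->", place(c))
```

A real orchestrator would of course consider measured rather than assumed latencies, plus GPU availability and cost, and would re-place components dynamically as conditions change, which is the kind of adaptation the CHARITY platform is meant to provide.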

Collins Aerospace is currently working on application adaptation and a walkthrough of the flight simulation use case, refining the flight simulation pipeline and refactoring the functionality of each component. In addition, we are building a mixed reality cockpit with a head-mounted display (HMD).