Symphony AR gives an organisation the ability to share and interact with its location or boundary spatial data in an innovative, intuitive and visual way on a mobile device. Because data is represented visually, the time it takes to identify and interrogate assets or points of interest is reduced. Its filtering capability lets you display only the data you require, while still giving access to more detailed information when you interact with it.
Our Symphony AR app comes with a set of web services that allow you to view your organisation's spatial data on a smartphone (both Android and iOS), tablet or smart glasses while out in the field. This data can be viewed on a traditional 2D map or as augmented reality images overlaid on the real-life view on the screen. The app is adaptable to your organisation's individual needs: you can choose the data you want represented, determine its visual representation, restrict the devices that have access and control security.
Key features
View any location/asset with geographical coordinates as an augmented reality (AR) image
You can either create your own images or choose from our extensive library to represent the categories of your locations/assets. You can also select or create the further information shown when users interact with the image. The size of the AR image is distance sensitive, ie it decreases in size the further away you are from it.
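The distance-sensitive sizing described above can be sketched as follows. This is a minimal illustration assuming a simple inverse-distance rule with clamping; the function name, reference distance and scale limits are illustrative assumptions, not Symphony AR's actual formula.

```python
# Hypothetical sketch of distance-sensitive AR marker scaling.
# Assumes an inverse-distance rule, clamped so markers never become
# unusably small or unrealistically large.

def marker_scale(distance_m: float,
                 reference_distance_m: float = 50.0,
                 min_scale: float = 0.2,
                 max_scale: float = 1.5) -> float:
    """Return a display scale for an AR marker: full size at the
    reference distance, shrinking as the user moves further away."""
    if distance_m <= 0:
        return max_scale
    scale = reference_distance_m / distance_m
    return max(min_scale, min(max_scale, scale))
```

For example, a marker viewed from twice the reference distance is drawn at half size, and very distant markers bottom out at the minimum scale so they remain visible.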
View your own location intelligence data as augmented reality markers
Any location intelligence you hold on specific locations/assets can be visualised through augmented reality markers: eg risk data such as firearms licences, petroleum storage or dangerous dogs; social data such as a vulnerable person, child at risk or hoarder; or information of interest relating to towns, festivals or tourist/heritage spots. The app does this by pulling data directly from your systems, ie from a database, via APIs or via a scheduled upload of files.
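The pull-from-your-systems step can be sketched as a simple mapping from raw records to marker definitions. The record field names (`lat`, `lon`, `category`, `name`) and the `Marker` type here are illustrative assumptions, not Symphony AR's actual interface.

```python
# Hypothetical sketch: turn records pulled from an organisation's
# database or API into AR marker definitions. Field names are
# illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Marker:
    latitude: float
    longitude: float
    category: str   # eg "firearms_licence", "flood_risk"
    label: str

def records_to_markers(records: list[dict]) -> list[Marker]:
    """Map raw records to markers, skipping any record that lacks
    geographical coordinates."""
    markers = []
    for rec in records:
        if rec.get("lat") is None or rec.get("lon") is None:
            continue
        markers.append(Marker(rec["lat"], rec["lon"],
                              rec.get("category", "uncategorised"),
                              rec.get("name", "")))
    return markers
```

In practice this mapping would run on the server side, whether the data arrives via a live API call or a scheduled file upload.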
Filter data by required range or category
This means those out in the field can filter to see only what's relevant or of interest to them, eg all the risks within a half-mile radius, or all the trees with a preservation order.
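Range-and-category filtering of this kind can be sketched with a great-circle distance check. This assumes markers carry latitude/longitude and a category string; the haversine formula is a standard way to compute the distance, and the function and field names here are illustrative.

```python
# A minimal sketch of range-and-category filtering over AR markers,
# assuming each marker is a dict with "lat", "lon" and "category".

from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MILES = 3958.8

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

def filter_markers(markers, user_lat, user_lon, radius_miles, category=None):
    """Keep markers within the radius; optionally only one category."""
    return [m for m in markers
            if haversine_miles(user_lat, user_lon, m["lat"], m["lon"]) <= radius_miles
            and (category is None or m["category"] == category)]
```

So "all the risks in a half-mile radius" would be `filter_markers(markers, lat, lon, 0.5, category="risk")` under these assumed names.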
Control what data is seen when interacting with the images
You can control what information is shown according to the user, tailoring pertinent data to the individual.
Ability to show boundaries as well as locations
The boundary awareness view provides an overview of a defined area rather than a specific point, eg a county, ward or parish. It will also show environmental boundaries such as flood risk areas or protected woodlands.
Image recognition feature that triggers relevant augmented reality images/animations
Selected images can be pre-loaded; when the smart device is aimed at one of these images in the field, a pre-determined animation, text box, table or video is overlaid on the screen. This is a great way to engage with the public on nature trails or at heritage sites, as well as for presenting data on conservation or local hazards.
Capability to display interactive 3D models the user can manipulate
If emergency services had interactive 3D models of large leisure and sports facilities, for example, this could help speed up emergency planning for an incident at such a venue. With the ability to zoom in and pan round the 3D model, responders get a virtual recce of the building.
Multi-level tracking
Each device can be identified and therefore tracked when inside a building. This is particularly beneficial for fire and rescue services, who need to know the location of crew members both during and after an incident.
Device manager enabling control over access to the data
You can take full control of who sees what, ensuring relevance and efficiency, as well as control over sensitive data.
Capability to show proximity based banner ads or messages
This feature could be used to generate income for a local authority, or to display warning messages relating to nearby risks.