Case Study: Barclays
Client proposal & concept exploration UX design
UI & UX design
Allow key stakeholders to experience the final design of Barclays' new Buchanan Wharf office buildings, in situ, as construction progresses.
- Raise awareness in the press of the new jobs Barclays will be bringing to Glasgow
- Gain interest from investors in the available commercial spaces
- Excite employees about their new workspaces
From initial client meetings and requirements gathering, we learned that the real driving factor behind this app was simply to excite people visiting the construction site throughout the build. Augmenting the final building design in the correct place, and at the correct scale, over the skeletal building work would be a great way to build interest and excitement among visitors to the site.
I produced an initial set of wireframes to demonstrate to the client the flow a user goes through when encountering permission requests and tutorials for the augmented reality functionality.
Because the whole app relies on the camera for AR, we needed to ask the user's permission to access it. On Android, if the user declines a permission (whether deliberately or accidentally), you can simply show the native dialog again as many times as you need. On iOS, however, you can only ask the user for a particular permission once. To get around this (if the user accidentally declines the permission), we can direct them straight to the Settings page, where they can manually grant the app camera access. The best solution, of course, is to mitigate the problem in the first place with clear, succinct screens explaining why we need the permission at all.
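The permission logic above can be sketched as a small decision function. This is a hypothetical illustration, not the app's actual code: `CameraAccess` mirrors the cases a real app would read from AVFoundation's `AVCaptureDevice.authorizationStatus(for: .video)`, and the names are my own.

```swift
import Foundation

// Mirrors the camera authorisation states a real iOS app would query
// from AVFoundation (assumed names, for illustration only).
enum CameraAccess {
    case notDetermined  // never asked yet
    case denied         // asked once and refused; iOS won't prompt again
    case authorized     // camera available
}

enum NextStep: Equatable {
    case showExplainerThenPrompt  // succinct explainer, then the one-shot system dialog
    case deepLinkToSettings       // user must grant access manually in Settings
    case startARSession
}

// Because iOS only shows the native dialog once, a denied status routes
// straight to the Settings page instead of re-prompting.
func nextStep(for status: CameraAccess) -> NextStep {
    switch status {
    case .notDetermined: return .showExplainerThenPrompt
    case .denied:        return .deepLinkToSettings
    case .authorized:    return .startARSession
    }
}
```

In the real app, the `deepLinkToSettings` branch would open the system Settings page for the app so the user can flip the camera switch themselves.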
Once we were happy with the flow of the app and satisfied there were no snags from user testing, I proceeded with incorporating Barclays' branding into the app, working from their brand guidelines.
During user testing at the wireframe stage, we found people were a little confused by the AR instructions. To explain: our AR method mixes Apple's ARKit (which scans the surrounding environment for surfaces, or planes) with Vuforia image tracking (which recognises a non-repeating pattern, such as an image or QR code). ARKit on its own requires the user to place the AR object into the scene themselves, in a position and orientation of their choosing. Because we wanted to augment an entire building in the correct orientation and position relative to the building site, we had to decide all of that for the user, and that is where the image tracker (a sign) comes in. The user scans the sign first, which tells the app where they are standing and which way they are facing; the app then places the AR model in the environment using ARKit, so the user can look away from the sign without losing tracking. I was confident I could explain the three steps required of the user more clearly in the mockups with simple illustrations to accompany the text, and further user testing at this stage showed this worked very well.
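The anchoring idea described above can be sketched in plan view: the sign gives the app a known position and heading, and the building model is then placed at a fixed offset from the sign, so its position, scale, and orientation are decided for the user. This is a simplified, hypothetical sketch (2D poses instead of the full transforms ARKit and Vuforia actually provide, and an assumed surveyed offset).

```swift
import Foundation

// A plan-view pose: position in metres plus a heading in radians.
// (Real ARKit anchors carry full 4x4 transforms; this is a 2D sketch.)
struct Pose {
    var x: Double
    var z: Double
    var yaw: Double
}

// The offset from the sign to the building's origin is measured on site
// and expressed in the sign's own frame, so it must be rotated by the
// sign's detected heading before being added to the sign's position.
func buildingPose(signPose: Pose, offsetX: Double, offsetZ: Double) -> Pose {
    let c = cos(signPose.yaw), s = sin(signPose.yaw)
    return Pose(
        x: signPose.x + c * offsetX - s * offsetZ,
        z: signPose.z + s * offsetX + c * offsetZ,
        yaw: signPose.yaw
    )
}
```

Once this pose is computed from the scanned sign, it can be handed to the world-tracking session as a fixed anchor, which is why the user is free to look away from the sign afterwards.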
Testing and Reception
Our developers worked on-site to fine-tune the position of the sign and the augmented building until the experience was seamless for the user.
The reception to the app has been very positive; the First Minister even came down to try it recently. Overall, Barclays have been so pleased with the reception that we are now working on a second phase, specifically for Barclays employees to view their workstations and tour the building and nearby local amenities.