National Australia Bank: Yourself
People find it hard to know what they really want in life. For National Australia Bank, that is a problem: if customers don't know what they want, it's hard to help them and offer relevant services.
Combining the latest technology for facial analysis and speech recognition, NAB Yourself is an interactive tool designed for self-reflection, helping people start thinking about what they really want in life.
Entering the experience, people scanned their face and started speaking on different topics designed to trigger a conversation. As they answered out loud, the web application analysed what they said and extracted key themes. By the end, each person had a one-of-a-kind artistic representation of themselves.
The end result, once people finished talking to themselves, was a unique visualisation of that person's real ambitions, transformed into art through technology.
Liaised with both internal and external stakeholders on creative, scoping, prototyping and client presentations.
Designed and prototyped around feasibility and creative execution, pushing the boundaries of what was technically achievable entirely within the browser.
Led a team of three developers and one tester to build, test and deploy the project under a very tight production timeline.
Built the foundations for the core biometric analysis, speech recognition analysis and real-time generation of artistic illustrations.
Customers spent an average of 4.25 minutes talking to themselves, with 4 out of 5 users interviewed saying the experience had a positive impact on thinking about what they really wanted in life.
41,343 data points generated, allowing National Australia Bank to connect customers with products tailored to them.
Biometric analysis was built upon Amazon Rekognition and a set of custom algorithms that extract key face metrics, balance the results based on facial expressions, and transform property ratios to align with the creative direction.
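A minimal sketch of this balancing step, assuming a Rekognition-style `FaceDetail` input; the field names mirror Rekognition's response shape, but the weighting and ratio choices here are purely illustrative, not the production algorithms:

```javascript
// Hypothetical sketch: turn raw Rekognition-style face metrics into
// normalised ratios the creative layer can consume.
function faceMetricsToRatios(faceDetail) {
  const box = faceDetail.BoundingBox; // { Width, Height } as 0..1 fractions of frame
  const smile = faceDetail.Smile;     // { Value, Confidence } as returned by Rekognition

  // Raw proportion of the detected face within the frame.
  const faceRatio = box.Width / box.Height;

  // Dampen expressive metrics by detection confidence, so uncertain
  // readings pull towards neutral instead of driving the art to extremes.
  const confidence = smile.Confidence / 100;
  const smileWeight = smile.Value ? confidence : 1 - confidence;

  return {
    faceRatio: Number(faceRatio.toFixed(3)),
    smileWeight: Number(smileWeight.toFixed(3)),
  };
}
```

The balancing in production combined many more metrics, but the principle is the same: weight each raw reading by its confidence before handing it to the creative direction.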
Speech analysis used both the native Web Speech API and IBM Watson Cloud services to convert people's voices into transcripts. We then analysed the text in real time using natural language processing to extract the main themes and metrics, transforming what each person said into animated illustrations. The symbols, shapes, positions and colours of the patterns that fill up the different parts of the illustrated faces were all driven by voice data.
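The production build relied on Watson's NLP, but the core idea of mapping transcript themes to visual properties can be sketched with a toy keyword matcher; the theme lists and colours below are hypothetical stand-ins:

```javascript
// Illustrative theme extraction: count keyword hits per theme and map
// each detected theme to a visual property (here, a fill colour).
const THEMES = {
  family: { keywords: ['family', 'kids', 'parents', 'home'], colour: '#e05c5c' },
  travel: { keywords: ['travel', 'world', 'adventure', 'abroad'], colour: '#5c9be0' },
  career: { keywords: ['work', 'career', 'business', 'study'], colour: '#5ce08c' },
};

function extractThemes(transcript) {
  const words = transcript.toLowerCase().match(/[a-z']+/g) || [];
  const scores = {};
  for (const [name, theme] of Object.entries(THEMES)) {
    scores[name] = words.filter((w) => theme.keywords.includes(w)).length;
  }
  // Keep only themes that were actually mentioned, strongest first.
  return Object.entries(scores)
    .filter(([, hits]) => hits > 0)
    .sort((a, b) => b[1] - a[1])
    .map(([name, hits]) => ({ theme: name, hits, colour: THEMES[name].colour }));
}
```

In the real experience this ran continuously as the transcript streamed in, so the illustration evolved while the person was still speaking.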
Working with a national bank means high standards around security, accessibility and browser support. To make the entire experience fully AA accessible and support Internet Explorer 11, we had to get creative when solving certain issues. For example, we developed a bespoke feature fallback (based on the Adobe Flash Player) for browsers that didn't support the MediaDevices interface.
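The detection side of that fallback can be sketched as a small runtime check; the function takes a navigator-like object so it stays testable, and the strategy names are hypothetical:

```javascript
// Feature-detect camera capture support at runtime: use getUserMedia
// where the MediaDevices interface exists, otherwise hand off to the
// bespoke Flash Player capture shim (the IE11 path).
function pickCaptureStrategy(nav) {
  if (nav.mediaDevices && typeof nav.mediaDevices.getUserMedia === 'function') {
    return 'mediaDevices';
  }
  return 'flashFallback';
}
```

In the browser this would be called as `pickCaptureStrategy(window.navigator)` during initialisation, before any camera UI is shown.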