For the past decade, mobile apps have powered disruption. When designed well, they’ve created magical experiences and transformed industries as we know them. As with most emerging technologies that create opportunities for disruption and innovation, they also pose complex business problems – like making the delivery of an insurance quote in seconds, or on-demand transportation, seem effortless. Unbeknownst to the user, behind every one of these “effortless” apps is impeccable code and a dynamic technology architecture that balances hardware and software, purposefully engineered to deliver a frictionless user experience.
Create experiences that are inclusive, accessible, and respect users’ desire for privacy and control, because good design is good for business. This approach can lead to high rates of adoption and retention, driving real business value.
Creating differentiated mobile experiences
When exploring ideas and strategies for building a new mobile app and experience for your users and business, there are a lot of considerations. In today’s competition for mind share and screen time, your app needs to integrate 1) the needs of your users, 2) the possibilities of technology to deliver a unique experience that is delightful and intuitive, and 3) the requirements for business success. Once you’ve managed to solve that Rubik’s cube of requirements, there are dozens of critical technical considerations such as scalability, availability, redundancy, real-time access to data, and flexibility to adapt to new capabilities or new systems in the future. Another key consideration – use what’s already available!
Here at Rocket Wagon, we work with forward-thinking companies to stand out by taking advantage of the capabilities that exist within a smartphone and integrating those capabilities into the design and functionality of their app. By leveraging the existing technology, you’re presenting users with a seamless and differentiated mobile experience. It utilizes elements and features that people are already familiar with, reducing the cognitive load of using the app and allowing them to intuitively accomplish tasks and actions within the context of your brand.
So, what device capabilities should you and your team consider leveraging when building your new app? There are numerous options available today and new sensors being created every year, but we’ve chosen a few of our favorite use cases that highlight the possibilities of connecting a great brand story with an immersive and differentiated experience.
Vision / Camera
One of the most common hardware elements we see integrated into mobile apps, and one that is evolving every year, is the camera.
When a user needs to accomplish a task that requires taking a picture or video, making the device’s camera and the camera within the app one and the same creates an optimal experience for the user. Companies have harnessed the capabilities of device cameras in fairly creative ways. From augmented reality and computer vision to image recognition and virtual reality, a phone’s camera can unleash timely utility and immersive experiences.
Take Vivino for instance. Utilizing a phone’s camera and machine vision, Vivino users can scan any bottle of wine or wine list and instantly see user ratings, reviews, winery info, prices and where to buy it. They’ll also receive recommendations for other wines based on what users like.
By helping people across the globe easily find a wealth of information about wine, Vivino has developed a database containing more than 11.1 million wines and attracted over 38 million users (including a few of us here at RW) who contribute 100,000+ ratings a day.
Other examples include Sephora’s Virtual Artist that lets users test out different products and styles through their camera before making the decision to buy, or Amazon’s AR View enabling users to digitally implant products into their home using their tablet or phone’s camera and augmented reality to visualize how an item would look in specific areas or rooms.
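The pattern behind apps like these – recognize something in the camera frame, then look it up – boils down to a two-step pipeline. Here is a minimal sketch of the lookup step, assuming an OCR or vision stage has already extracted text from the label photo; the catalog entries and the `lookup_wine` helper are purely illustrative, not Vivino’s actual API.

```python
from difflib import get_close_matches

# Hypothetical catalog – in a real app this would be a backend database.
WINE_CATALOG = {
    "Chateau Margaux 2015": {"rating": 4.6, "price": 650.0},
    "Cloudy Bay Sauvignon Blanc": {"rating": 4.1, "price": 29.0},
    "La Crema Pinot Noir": {"rating": 3.9, "price": 22.0},
}

def lookup_wine(ocr_text):
    """Fuzzy-match text extracted from a label photo against the catalog."""
    matches = get_close_matches(ocr_text, WINE_CATALOG.keys(), n=1, cutoff=0.5)
    return WINE_CATALOG[matches[0]] if matches else None

# OCR output is noisy; fuzzy matching absorbs small recognition errors.
info = lookup_wine("Cloudy Bay Sauvignon Blnc")
```

Fuzzy matching is the key design choice here: camera-derived text is rarely perfect, so exact string lookups would fail on precisely the inputs the app exists to handle.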
Location / GPS
Apps can use a phone’s location data to provide content relevant to the user’s location or to simplify certain operations. For example, if you’re designing an app for food delivery, instead of asking the user to provide an address for delivery, you can auto-detect their current location and ask the user to confirm that they want to receive a delivery to that location.
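A minimal sketch of that confirmation flow, assuming the platform’s location service has already supplied a latitude/longitude fix: pick the nearest saved address, and pre-fill it only if the user is actually close to it. The saved addresses and the 200 m threshold are invented for illustration.

```python
import math

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 6371.0 * 2 * math.asin(math.sqrt(h))

def suggest_delivery_address(current_fix, saved_addresses, max_km=0.2):
    """Pre-fill the nearest saved address if the user is within max_km of it;
    otherwise return None so the app prompts for a new address."""
    nearest = min(saved_addresses, key=lambda addr: haversine_km(current_fix, addr["coords"]))
    if haversine_km(current_fix, nearest["coords"]) <= max_km:
        return nearest["label"]
    return None

addresses = [
    {"label": "Home", "coords": (41.8781, -87.6298)},
    {"label": "Office", "coords": (41.8838, -87.6278)},
]
```

The fallback matters: silently assuming the GPS fix is the delivery point would be wrong often enough (ordering for someone else, poor indoor accuracy) that confirmation should always stay one tap away.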
A recent example of GPS being applied in a unique way is Uber’s new RideCheck, a feature that can help drivers and passengers get help in case of a crash or other accident. Uber looks for potential crashes and unusual trips that go off course by using GPS and data from the other sensors in the driver’s and passenger’s smartphones.
The Experience, Story, and Value Rule.
If you provide great value and create a great experience (or product), you will improve your story (brand). Or, if you have a great story and you provide a great experience, you amplify your value.
Once RideCheck is initiated, Uber can proactively send users a notification asking if everything is OK. The passenger or driver can then check the notification and bring up a panel that gives them instant access to the app’s emergency button to call 911, call Uber’s Safety Line to report a crash, or take other actions like add or change their destination, or to share their trip with a friend.
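Uber hasn’t published RideCheck’s internals, but the core signal – a sudden, sustained stop mid-trip – can be sketched from GPS speed samples alone. The thresholds below (8 m/s² deceleration, 30 seconds of standstill) are made-up values for illustration, not Uber’s.

```python
def looks_like_crash(samples, decel_threshold=8.0, stop_seconds=30):
    """samples: time-ordered list of (timestamp_s, speed_m_per_s).
    Flags a deceleration sharper than decel_threshold m/s^2 that is
    followed by at least stop_seconds of standstill."""
    for i in range(1, len(samples)):
        (t0, v0), (t1, v1) = samples[i - 1], samples[i]
        dt = t1 - t0
        if dt <= 0:
            continue
        decel = (v0 - v1) / dt
        if decel >= decel_threshold and v1 < 1.0:
            # Sharp stop – confirm the vehicle stays stopped afterwards,
            # so ordinary hard braking at a light doesn't trigger an alert.
            stopped = [v for t, v in samples[i:] if t <= t1 + stop_seconds]
            if stopped and all(v < 1.0 for v in stopped):
                return True
    return False
```

Requiring both the sharp deceleration and the prolonged stop is what keeps false positives down, which is why the real feature asks “is everything OK?” rather than dialing 911 automatically.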
Movement / Accelerometer
A phone’s accelerometer detects the tilt and movement of the device. This sensor is what determines which way the screen is being held, and it is what causes your screen to conveniently flip when you lie down or change the orientation of the device. Many fitness brands are integrating this hardware into the design of their mobile apps to help users track activity and live a healthier life. Another industry is also interested in how you move and your (driving) health: insurance.
Root Insurance has potential policyholders take a Test Drive, during which the app gathers and analyzes data from the accelerometer in the user’s smartphone to evaluate driving behaviors such as how they brake, how fast they take turns, and their overall consistency as a driver.
When the user’s Test Drive is over, they receive an overall rating on their scorecard, which is used in the underwriting process to give the policyholder a custom insurance rate.
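Root hasn’t published its scoring model, but a toy version of the idea – penalize harsh events detected in accelerometer data – might look like this. The g-force thresholds and point weights are invented for illustration.

```python
def driving_score(events, hard_brake_g=0.4, sharp_turn_g=0.35):
    """events: list of (longitudinal_g, lateral_g) accelerometer readings.
    Longitudinal g is negative under braking; lateral g grows in turns.
    Returns a 0-100 score where every harsh event costs points."""
    hard_brakes = sum(1 for lon, lat in events if lon <= -hard_brake_g)
    sharp_turns = sum(1 for lon, lat in events if abs(lat) >= sharp_turn_g)
    score = 100 - 5 * hard_brakes - 3 * sharp_turns
    return max(0, score)

# One hard brake and one sharp turn in a trip:
trip_score = driving_score([(-0.5, 0.1), (0.1, 0.4), (-0.1, 0.0)])
```

A production model would of course fuse GPS, gyroscope, and time-of-day data and be calibrated against claims history; the sketch only shows how raw sensor events become a single comparable number.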
Sound / Microphone
The microphone in smartphones and voice technology are making a big impact across our daily lives – from the ways we compose searches on Google and order simple household items to how we understand our health insurance policies. This emerging mode of digital communication is changing our most fundamental perceptions of human-computer interaction (HCI).
While most organizations are looking at how to drive a better understanding of their consumers’ habits and purchasing paths, Duolingo has focused on an educational use of the microphone to help users learn foreign languages. It takes a gamification approach to education, attempting to make study and practice more engaging than a game of Candy Crush. Some lessons require users to recite sentences and words into the microphone while the app analyzes the audio to make sure they are pronouncing the words correctly. This presents individuals with an easy and informative experience that they can enjoy anywhere and at their own pace.
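Pipelines like this typically run speech recognition on the microphone audio first; once a recognizer has produced text, the comparison against the target sentence is straightforward. A sketch of that final step, assuming a hypothetical speech-to-text stage has already produced `heard` (the pass threshold is illustrative):

```python
from difflib import SequenceMatcher

def pronunciation_feedback(target, heard, pass_ratio=0.8):
    """Compare the words a speech recognizer heard against the target
    sentence; return (passed, similarity) for word-level feedback."""
    ratio = SequenceMatcher(None, target.lower().split(), heard.lower().split()).ratio()
    return ratio >= pass_ratio, round(ratio, 2)

# Recognizer heard the sentence correctly:
ok, score = pronunciation_feedback("el gato bebe leche", "el gato bebe leche")
```

Comparing word sequences rather than raw characters keeps the feedback aligned with what a learner can act on – which word to repeat – instead of opaque audio distances.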
As with all emerging technologies, look before you leap.
Before spending time on building a strategy, spend time on vision; if you use the wrong lens, the results can be uninspiring and stop innovation in its tracks.
Leading technology teams are perpetually working on new sensors and chips to integrate into our smartphones.
Google/Android: One that will be seen this year in Google’s new Pixel 4 phone is Project Soli. The new radar chip can sense precisely how you’re moving your hand – whether you’re making a twisting motion as if you were turning a volume knob up or down, or tapping your thumb and index finger together as if you were pressing a button – and then perform an action on your device that’s mapped to that specific movement.
The applications of this new sensor are yet to be seen, but you can assume that developers are dreaming up new and exciting ways to integrate it into future apps. The Soli team has pointed out that the idea was not to replace existing forms of interaction but to provide an additional option on top of the methods we all already know and use today.
“It offers a third dimension of interaction which complements and enhances other interaction modalities such as touch-screen and voice input,” says Ivan Poupyrev, Director of Engineering, Advanced Technology and Projects (ATAP) at Google. “We don’t fight with them. We work together.”
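That “additional option” framing maps naturally onto a dispatch-table design: each recognized radar gesture triggers the same action an existing touch control already does, so neither input method owns the feature. The gesture labels below are hypothetical, not Soli’s actual output vocabulary.

```python
# Hypothetical gesture labels a radar model might emit; each maps to the
# same action a touch control already triggers in a media app.
ACTIONS = {
    "dial_twist_cw": lambda state: {**state, "volume": min(10, state["volume"] + 1)},
    "dial_twist_ccw": lambda state: {**state, "volume": max(0, state["volume"] - 1)},
    "finger_tap": lambda state: {**state, "playing": not state["playing"]},
}

def handle_gesture(state, gesture):
    """Apply the action mapped to a gesture; ignore unknown gestures
    rather than guessing, so touch remains the reliable fallback."""
    return ACTIONS.get(gesture, lambda s: s)(state)

state = {"volume": 5, "playing": False}
state = handle_gesture(state, "dial_twist_cw")  # volume up
state = handle_gesture(state, "finger_tap")     # toggle playback
```

Because the handlers are pure functions over app state, a touch button and a radar gesture can share one code path, which is exactly the “work together” model Poupyrev describes.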
Apple/iOS: Another exciting technology making its way into our phones is Apple’s U1 chip, which just launched with the iPhone 11 (they haven’t made an official presentation on its capabilities…yet). The U1 chip utilizes ultra-wideband (UWB) radio frequencies to sense the spatial position of other U1-enabled devices or objects. The use case Apple gives on their website is an improved version of AirDrop: point your phone at another user’s iPhone (11 or later), and their device shows up first on the sharing list.
Given that UWB can measure proximity to around 20 cm by timing how long a radio signal takes to pass between two connected devices, it will be useful not only for indoor mapping systems – a much more accurate GPS for indoors – but also critical to augmented reality, virtual reality, and mixed reality environments, where it can fine-tune spatial coordinates within a given area. UWB will also be exceedingly useful in automobiles, drones, and robotic systems as they continue to mature and become part of our everyday lives.
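That proximity figure falls straight out of time-of-flight arithmetic: radio propagates at the speed of light, so distance is just measured flight time multiplied by c. A quick illustration of why centimeter-level ranging demands sub-nanosecond timing:

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def uwb_distance_m(time_of_flight_s):
    """One-way distance implied by a measured signal propagation time."""
    return SPEED_OF_LIGHT * time_of_flight_s

# A 1-nanosecond flight time corresponds to roughly 30 cm of distance,
# so resolving ~20 cm requires timing well below a nanosecond.
d = uwb_distance_m(1e-9)
```

In practice UWB systems measure a round trip and divide by two (two-way ranging), which cancels the need for clock synchronization between the devices, but the distance-per-nanosecond intuition is the same.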
All Platforms: Lastly, one of the most promising pieces of technology to make its way onto newer flagship phones is the dedicated machine learning (ML) chip. Machine learning is providing, and will continue to provide, sweeping changes to mobile apps, because on-device ML enables hyper-personalized and contextual user experiences that reinforce the purpose and value of the app.
This technology enables powerful functions such as instantly detecting skin rashes, forecasting and preventing headaches, or providing recommendations based on a user’s previous actions and location.
Machine learning has spread quickly into mobile applications in part because classic ML models have already been trained on many of the common problems involved – a starter kit of sorts. In the future, mobile applications will require faster processing speeds and lower latency, which newer, faster phones and the emergence of 5G networks will provide.
Each of these examples provides value to its users through tools and an experience they can’t access from a desktop or laptop. These features are what make the mobile app unique, and they allow companies to stay on top of the expanding capabilities of the ever-changing digital world.
It’s always a stimulating exercise to think of ways to integrate existing and emerging technologies into mobile app designs in our pursuit to provide value through technology-enabled brand experiences. However, given that change is our only constant these days, leading brands are already preparing for the day that screens disappear, creating an entirely new paradigm and boundless opportunities for differentiated experiences.
RW is a digital product development firm that exists to deliver on the promise of emerging technologies and bring dynamic, creative and impactful consumer engagement ideas to life.
Give us your hard problems
Build a reputation for innovation, and embrace the full power of your future with Rocket Wagon.

Let’s Talk