Why contextual awareness is about to change the way you use tech
In Depth: Technology is about to make your life even easier
Fed up with constantly staring at your smartphone to accomplish menial tasks? Apps for this, apps for that, all needing to be downloaded, found and then launched at the appropriate time. Siri and Google Now may think they know what you want, but they have no idea where you are or what you're doing – yet.
"Slyce uses proprietary developed image recognition to detect products within user-generated mobile pictures," says Mark Elfenbein, President at the Canada-based Slyce. "Slyce assesses the attributes in these images and matches them against the closest comparable match from a specific retail brand's catalogue."
This white-labelled tech will likely be embedded in future apps from retailers. You'll be able to use a smartphone to snap a photo of a pair of shoes in the store and it will take you to the item's website, send you a discount coupon, or show you a demo video.
Step forward contextual awareness: a new breed of app coming to a smartphone or wearable near you. It promises to use the data it collects to second-guess, and automate, many of your daily tasks.
Soon, apps with some degree of contextual awareness will begin to fire up automatically. "While Google Now and Siri may be useful in spotting how the weather may change or how far you are from home, systems where the smartphone instantly enters relevant modes will ultimately prove far more useful," says Jay Karsandas, Digital Manager at Mobiles.co.uk. "This could include GPS in the car, calorie counters when jogging, or distance trackers when on a bike."
When you sit down on your daily commuter train your most-used app will load - be it music, a game or your email inbox. And when you sit down in front of the TV, your second screen will be ready with Facebook, Twitter or eBay without you having to find the app manually.
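As a rough sketch of how that kind of context-triggered behaviour might be wired up – the activity labels, rule table and function names below are illustrative assumptions, not any vendor's actual API – the logic can be as simple as a lookup from a detected context to an app or mode:

    # Illustrative sketch: map a detected context to an app or mode to launch.
    # The activity labels and the rule table are hypothetical examples,
    # not a real smartphone API.

    CONTEXT_RULES = {
        "driving":     "satnav",
        "jogging":     "calorie_counter",
        "cycling":     "distance_tracker",
        "commuting":   "email_inbox",
        "watching_tv": "second_screen_app",
    }

    def pick_app(activity: str, default: str = "home_screen") -> str:
        """Return the app to launch for the detected activity."""
        return CONTEXT_RULES.get(activity, default)

    if __name__ == "__main__":
        for activity in ["driving", "jogging", "unknown"]:
            print(activity, "->", pick_app(activity))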
What will be contextually aware?
Smartphones and tablets will be contextually aware, for sure, but this will benefit wearables especially. Independent research company Smartwatch Group analysed the 20 most relevant application areas for smart watches in 2020, with personal assistance the clear winner.
"Highly efficient, context-aware management of your calendar, tasks, and information needs" is what we want from a smart watch, apparently, and the example given does sound enticing: your watch tells you when to leave for your next meeting, based on real-time traffic information.
On the back of such contextual awareness, the Smartwatch Group expects half of the estimated 1.6 billion watches sold in 2020 to be connected to the internet.
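The arithmetic behind that meeting example is simple enough to sketch: the watch (or the service behind it) only needs the meeting time, a live travel-time estimate and a safety buffer. The figures and function below are illustrative assumptions rather than any real traffic service:

    # Illustrative sketch: when should the watch tell you to leave?
    # travel_minutes would come from a live traffic service in practice;
    # here it is just a hard-coded assumption.
    from datetime import datetime, timedelta

    def departure_time(meeting_start: datetime,
                       travel_minutes: float,
                       buffer_minutes: float = 10) -> datetime:
        """Leave early enough to cover current travel time plus a buffer."""
        return meeting_start - timedelta(minutes=travel_minutes + buffer_minutes)

    meeting = datetime(2015, 6, 1, 14, 0)                   # 2pm meeting
    leave_at = departure_time(meeting, travel_minutes=35)   # traffic says 35 min
    print("Leave by", leave_at.strftime("%H:%M"))           # -> Leave by 13:15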
Why wearables?
Wearable tech will be one of the most important sources of data for contextual awareness platforms. "The sensors in wearable tech can go beyond what sensors fitted in a smartphone can detect, especially when it comes to health and physical data about the individual using wearable technology," says Henrik Torstensson, CEO of health and fitness platform Lifesum. "Data accumulates continuously, making services more and more accurate and complex, based on the patterns and idiosyncrasies of human behaviour."
He believes the scientifically based means of assessing whether we need to go to the gym more, change our sleep patterns or add an extra green vegetable to our diet will prove irresistible.
How will contextually aware apps help us?
Contextually aware apps could become integral parts of our culture. "The perception will be that they significantly improve quality of life, so that more and more people use them, without a second thought," says Torstensson.
What is contextualised search?
Contextualised search goes beyond the barcode scan and the "search by image" feature on Google Images. It's best demonstrated by the Slyce visual search platform.
"Slyce uses proprietary developed image recognition to detect products within user-generated mobile pictures," says Mark Elfenbein, President at Canada-based Slyce. "Slyce assesses the attributes in these images and matches them against the closest comparable match from a specific retail brand's catalogue."
This white-labelled tech will likely be embedded in future apps from retailers. You'll be able to use a smartphone to snap a photo of a pair of shoes in the store and it will take you to the item's website, send you a discount coupon, or show you a demo video.
However, Elfenbein suggests other uses for Slyce. "Snap a photo of a specific hair-style to receive product information on how to 'get that look', or snap a photo of a home fix-it project to get the necessary tools and supplies to complete the job," he suggests.
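Slyce's own pipeline is proprietary, but a generic sketch of the idea – matching a feature vector extracted from the shopper's photo against precomputed vectors for catalogue items and returning the closest one – might look like this (the toy vectors and product names are purely illustrative):

    # Generic sketch of catalogue matching by visual similarity: compare a
    # feature vector extracted from the shopper's photo against precomputed
    # vectors for catalogue items and return the closest match.
    # The numbers here are toy values; a real system would use a trained
    # image-recognition model to produce them.
    import math

    CATALOGUE = {
        "red court shoe": [0.9, 0.1, 0.3],
        "black trainer":  [0.2, 0.8, 0.5],
        "brown brogue":   [0.6, 0.4, 0.7],
    }

    def cosine(a, b):
        """Cosine similarity between two feature vectors."""
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = math.sqrt(sum(x * x for x in a))
        norm_b = math.sqrt(sum(y * y for y in b))
        return dot / (norm_a * norm_b)

    def closest_product(photo_features):
        """Return the catalogue item most similar to the photo."""
        return max(CATALOGUE, key=lambda name: cosine(photo_features, CATALOGUE[name]))

    print(closest_product([0.85, 0.15, 0.35]))   # -> red court shoe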
So what's the future of contextualised visual search? "Slyce believes the future of contextual visual search is allowing a user to snap a photo of any item and derive all useful data out of that specific item, from cost, availability, comparables, related items, coupons, product demonstrations," says Elfenbein.
Contextual awareness is about personalisation
The age of big data means that marketeers now have access to a huge range of customer information, but they're not using it. "Until now, personalisation has usually involved using a combination of known profile information and historical data," says John Fleming, Director of Marketing, EMEA & Australia at Webtrends, which works with companies like Lloyds Bank, Barclays, HSBC and Lastminute.com.
"What's been missing is the ability to combine these factors with real-time information such as 'in the moment' browsing data, the device a customer is using, their specific location and their stage in the purchase cycle. Bringing all of these factors together is defined as contextual personalisation."
This is about giving people what they want, when and where they want it, but it could also be the catalyst for innovations that marry the online world with the real world. Imagine if a shop knew what you wanted before you even entered – everything from your shopping habits, likes, dislikes and previous purchases.
As an example, Fleming describes how contextual awareness could help Jane while she shops for shoes online and in store. "Modern technology allows us to align what she has looked at previously [on the website] with her current online behaviour and external data such as geo-location and weather," he says. "You can then tailor her experience by providing pages that reflect the sunny weather where she is in Brighton and recommend summer sandals. Likewise if she was in Manchester, where it's raining, you could highlight some of the latest styles in wellies."
Fleming points out that even if Jane doesn't put anything in her basket, the retailer can use the historical and real-time data it has collated to email her within minutes with a deal based on the products she has looked at. The same could apply to cinemas and coffee shops.
"This is what contextual personalisation, combined with new technologies such as Apple's iBeacon, enables," says Fleming. "It uses consumers' known online behaviour data to drive offline sales."
Will contextually aware apps make things easier?
To some extent apps have already made life easier, with the likes of Uber (which connects you with a driver) and Venmo (which lets you make and share payments) simplifying everyday tasks. But we're in for more intricate apps that gather information on your context – and, therefore, calculate your needs.
Tell a future app that you're in the market for a new car, and it will schedule test drives for cars it knows you'll like – from your browsing history – at garages in your area at times it knows you can make. Some apps are already getting close to this: US-only app MyTime lets you instantly book appointments for a haircut, an MOT or a dentist without picking up the phone. In future, apps will be smart enough to know what you need (a year has passed since you last saw an optician) and when you should go (you're driving nearby and the optician just had a cancellation).
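The logic behind that optician example is easy to sketch: flag a service as due once enough time has passed, then look for an open slot nearby. Everything below – dates, intervals, slots – is made up for illustration:

    # Sketch of the "book it for me" logic described above: flag a service as
    # due when enough time has passed, then suggest an open slot near the user.
    # All data here is invented for illustration.
    from datetime import date

    def is_due(last_visit: date, interval_days: int = 365, today: date = None) -> bool:
        """True if the interval since the last visit has elapsed."""
        today = today or date.today()
        return (today - last_visit).days >= interval_days

    def suggest_slot(due: bool, nearby_slots: list) -> str:
        """Offer the first nearby slot if the service is due."""
        if due and nearby_slots:
            return f"Optician is due - free slot at {nearby_slots[0]}"
        return "Nothing to book right now"

    last_eye_test = date(2014, 3, 10)
    print(suggest_slot(is_due(last_eye_test, today=date(2015, 6, 1)),
                       ["16:30 today, 0.4 miles away"]))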
Google gets into contextual awareness
Although big data has a huge role to play in contextual awareness, personal assistants like Google Now will need much more accurate sensors in our phones if they're to become more aware of the user's interactions as well as their geo-location. Google's Project Tango concentrates on making devices much more aware of their surroundings.
"Project Tango strives to give mobile devices a human-like understanding of space and motion through advanced sensor fusion and computer vision, enabling new and enhanced types of user experiences – including 3D scanning, indoor navigation and immersive gaming," says Johnny Chung Lee, Technical Program Lead of Google's ATAP (Advanced Technology and Projects) group, one of the technology partners in Project Tango.
Sensor chips with greater dynamic range and 3D camera modules that measure depth ever more accurately are still in the lab, but the Tango tech – and contextual awareness – could hit the mainstream sooner than anyone thought.