Mobile shopping today represents 30% of all US e-commerce sales, and with 60% of consumers using their smartphones while in a store, the opportunity to scale is impressive. Unlike more patient desktop users, mobile consumers want fast answers and a minimum of clicks to get them. Using built-in cameras and image-matching technology, companies like Slyce make it possible to easily snap and shop any item (or a similar one) that crosses the path of an avid shopper. But that is only the beginning of the mobile visual search experience. We spoke with Mark Elfenbein, President and CEO of Slyce, to learn more.
A little about you: what is your background?
I’ve spent most of my career in technology start-ups, most recently with Mood Media (MM-TSX). We provided in-store music and visuals to over 560,000 retail locations, and I worked with companies such as Macy’s, Levi’s, Abercrombie & Fitch, Yahoo, Shazam, Shopkick, and TripAdvisor. Before that, I co-founded SkillJam.com, working with Fun Technologies to build a skill-games business with over 25 million registered users; there we worked with AOL, MSN, and eBay. My introduction to the consumer products industry came through my family: we founded K-tel International, which was best known for its work in the music industry.
Explain Slyce. What does it solve?
How does it work?
Do people really use their smartphone to shop?
You are not the only one in the visual search shopping space. How does Slyce differ from its competition?
Slyce stands out from the competition in several ways. For one, we have raised close to $40MM to date, which has funded several complementary technology acquisitions. As a result, we are the only company offering a product suite robust enough to meet almost every customer need as the industry evolves and use cases multiply.
You are using image matching to connect photos with inventory. Do you plan to add deep learning and A.I. to enhance the user experience?
We can’t comment on specific product and technology roadmaps, but our goal remains the same: in the snap of a photo, connect users with meaningful product results in the most intuitive way.
Slyce can recognize clothes and food. What else is on your roadmap?
We think the best way to predict future use cases is to return to the initial problem visual search is trying to solve: snapping a photo of a product that is hard to describe in a search engine. Naturally this leads to fashion and home decor, since the average person may not know how to accurately describe unique wedges or a love seat. Then there are use cases driven by convenience, such as grocery shopping or wish-list creation, and the pool grows bigger. From there you get into scenarios built around home repairs, such as providing product suggestions, or brands that want to activate print advertisements to drive aspirational content and immediate mobile commerce. The opportunities with visual search really are endless…
You just launched the Craves app for consumers. How has the reaction been?
What would you love to see Slyce offer that technology cannot yet deliver?
We see the technology progressing towards one-to-one exact matching. Today, if I snap a photo of a handbag, it will show visually relevant results for that bag (which sometimes include the exact match). We want the world to reach a point where every product is recognizable, and where users taking a photo get an exact product match as well as visually similar results if they so desire.
Slyce is traded publicly on the TSX Venture Exchange.
Author: Paul Melcher
Paul Melcher is the founder of Kaptur and Managing Director of Melcher System, a consultancy for visual-technology firms. He is an entrepreneur, advisor, and consultant with a rich background in visual tech, content licensing, business strategy, and technology, with more than 20 years of experience developing world-renowned photo-based companies, including two successful exits.