A Brief History of Visual Search

Visual search, in the simplest terms, is searching for something using its visual appearance, whether done by human beings with their eyes or by a machine, i.e. computer vision. Search on the internet has come a long way: from text search to voice-based search, then image search, and finally visual search.

In text search, results are based purely on the keywords the user types. To find an image, one must describe its characteristics in words, which in some cases is a difficult task. Voice search was introduced to make searching easier by eliminating tedious keyboard typing and allowing users to search the web with spoken commands.

Google Image Search was introduced on July 12, 2001, driven by a surge in demand for pictures of Jennifer Lopez's green Versace dress that regular Google search couldn't handle.

To avoid confusion, let's define image search and visual search. Image search is the search for an image using text or words as the query, whereas visual search finds images using an image as the input query. The results depend on the searched image and its contents: metadata, objects, patterns, distribution of color, shape, and so on.
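The color-distribution idea above can be illustrated with a toy example. The sketch below (an illustrative assumption, not how any production visual search engine actually works; real systems typically use learned neural-network embeddings) represents each image as a normalized color histogram and ranks similarity with cosine distance:

```python
import numpy as np

def color_histogram(image, bins=8):
    """Turn an H x W x 3 RGB image into a normalized color-histogram feature vector."""
    features = []
    for channel in range(3):
        hist, _ = np.histogram(image[:, :, channel], bins=bins, range=(0, 256))
        features.append(hist)
    vec = np.concatenate(features).astype(float)
    return vec / vec.sum()  # normalize so image size doesn't matter

def similarity(a, b):
    """Cosine similarity between two feature vectors (1.0 means identical histograms)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Two synthetic "product photos": one mostly red, one mostly blue
rng = np.random.default_rng(0)
red = rng.integers(0, 60, (64, 64, 3), dtype=np.uint8)
red[:, :, 0] += 180   # boost the red channel
blue = rng.integers(0, 60, (64, 64, 3), dtype=np.uint8)
blue[:, :, 2] += 180  # boost the blue channel

query = color_histogram(red)
print(similarity(query, color_histogram(red)))   # close to 1.0 (same image)
print(similarity(query, color_histogram(blue)))  # noticeably lower score
```

A real engine would index millions of such feature vectors and return the catalog items whose vectors are closest to the query image's, which is why visually similar products surface even when no keywords match.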

Although visual search has an immense number of applications across sectors, it has a direct relationship with the fashion and lifestyle industry, where products are difficult to put into words and visual search comes in handy. Almost all the big brands in online and offline-to-online retail have realized the potential of visual search and incorporated it into their shopping platforms, among them Macy's, Ikea, Tommy Hilfiger, JCPenney, Target, Forever 21 and ASOS. Gartner predicts that by 2021, early adopter brands that redesign their websites to support visual and voice search will increase digital commerce revenue by 30%.

Visual search has undergone significant changes on the path to what we see and experience today. Many companies have been developing their own visual search engines over the years, each contributing to the technology as a whole. Let's walk chronologically through the big steps taken by various companies that led to the current state of visual search:

In 2008, TinEye claimed to be the first company to launch a reverse image search engine, one that used image identification technology rather than keywords, watermarks or metadata.

In 2010, Google introduced Google Goggles, a tool for searching for information about an object using the phone's camera, initially designed for Android smartphones. Unfortunately, it did not become very popular due to technical limitations, and Google eventually decided to shut it down.

Amazon.com announced a fresh visual search capability in its men's and women's shoe stores, allowing customers to search and browse for shoes based on how they look.

In 2011, Google introduced Reverse Image Search for web-based image search. Upload a photograph from your desktop to Google Images and it will almost instantly show related images used on other websites, as well as different sizes of the same photo. Amazon also launched its first augmented reality app, Flow, which provided both object and barcode recognition.

In 2013, Image Searcher Inc. launched CamFind, a visual search and image recognition mobile app. It allows users to identify any item by taking a picture with their smartphone.

In 2014, Pinterest allowed its users to highlight specific sections of pins and find pins visually similar to the selected section.

In 2015, India's largest online retailer, Flipkart, added a visual search feature to its mobile app. Within a few months of launch, the company removed the functionality due to low customer search numbers.

In 2016, Microsoft Bing introduced similar highlight-specific-sections functionality in its mobile image search.

eBay acquired Corrigon, an Israel-based startup specializing in computer vision and visual search technology, to build out its search and discovery technology for customers.

India-based Turing Analytics launched its visual search solution for fashion, furniture and lifestyle retailers.

In 2017, Pinterest launched Pinterest Lens, a visual search feature on its platform. H&M's Image Search first launched in May in the UK, the Netherlands, Denmark and Finland.

Google announced its image recognition mobile app, Lens, at the I/O conference in May 2017.

Tommy Hilfiger launched visual search in its app at a fashion show, where audience members could buy products by photographing models on the ramp and uploading the pictures to the app.

In 2018, Snapchat added visual product search powered by Amazon, letting users identify products just by capturing them with their phone camera.

Microsoft Bing launched new intelligent Visual Search capabilities that build upon the visual technology already in Bing.

Visual searches per month reached the 1 billion mark, according to eMarketer's 2018 Visual Search report.

Visual search still has miles to go in drilling down to identify the most minuscule objects, and parts of objects, in any image, among other data-related challenges. It is a continuous process of data training and improvement, and it will only get better with time.

Turing Analytics is a computer vision company providing visual intelligence solutions that enable the world's top retailers to offer a superior shopping experience to their customers. Visual search is one of its prime offerings. Its complete solution stack includes:

  1. Visual Search – Enable shoppers to search for items using photos of products they are inspired to buy.
  2. Visual Recommendations – Suggest products on e-commerce platforms based on visual similarity.
  3. Visual Recognition – Identify and tag objects in images and videos.
  4. Visual Cart – Faster visual search implemented on-device. It enables queue-less checkout (like Amazon Go) for retailers without any infrastructure investment.