Google Lens And The Future Of Visual Search


The way forward for AI and retail is Visual Search

Machine learning is taking online search to new levels, and it’s not just voice search that’s on the rise. While many have been focused on the marketing potential of capturing conversational queries barked at Alexa over breakfast, big brands have been quietly developing a stronger competitor in interactive SEO: visual search.

For decades, we’ve searched for information and products online using a text search bar. The introduction of voice search has since made waves in local SEO, with users searching for opening times, directions, weather reports and other of-the-minute information – but it has left many online retailers feeling stumped. Not every trend fits with every business model, and for businesses who rely heavily on visuals to drive conversions, the opportunity brought by voice search has often felt limited.


Humans process visuals 60,000 times faster than text, and according to research by Kissmetrics 93% of consumers consider visuals to be the key deciding factor in a purchasing decision – which is why over the last few years, ecommerce sites have been bulking up photo galleries and adding 360° videos in an effort to increase conversions. Now, thanks to innovations like Google Lens and Pinterest’s Shop the Look, the payoff for that work seems set to increase.

What is visual search?

When you’ve seen an item – or an image of an item – that’s caught your eye but are unsure of the brand, the model, or the name of that style, that’s where visual search comes in.

Unlike an image search, where an ordinary text search pulls possible relevant images using structured data, visual search is the process of fulfilling searches by using machine learning to analyze components within a submitted photo, and finding results that replicate or relate to those visual cues. Think of the way that Facebook now recognizes the faces of friends you’ve tagged in past images – it’s this genre of technology that is now being used to develop wider visual search.
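Under the hood, this kind of matching is typically done by reducing each image to a numeric feature vector and ranking the catalogue by similarity to the query vector. The sketch below illustrates the idea with tiny hand-made vectors standing in for real ML embeddings; the product names and values are hypothetical.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def rank_matches(query_vec, catalogue):
    """Rank catalogue items by visual similarity to the query image."""
    return sorted(
        catalogue,
        key=lambda name: cosine_similarity(query_vec, catalogue[name]),
        reverse=True,
    )

# Hypothetical catalogue: each product mapped to a tiny feature vector.
catalogue = {
    "red leather boot": [0.9, 0.1, 0.8],
    "blue canvas shoe": [0.1, 0.9, 0.2],
    "brown suede boot": [0.7, 0.3, 0.5],
}

# A query vector close to "red leather boot" ranks it first.
print(rank_matches([0.85, 0.15, 0.75], catalogue))
```

Real systems use deep-learning embeddings with thousands of dimensions and approximate nearest-neighbour indexes, but the ranking principle is the same.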

Just as you can look across a room and see a variety of objects, read labels and observe features, so can visual search AI. Google Lens, for example, can see an image of a landmark and offer you details on its history or where to buy tickets to go inside. It can look at your photo of a book and offer recent reviews for that title, places to buy it online and alternative books by the same author.

In the same manner, Pinterest’s visual search functions allow you to select different parts of a photo – a pair of shoes, a lamp, a paint colour – and find similar products to purchase. It can also offer outfits or room décor suggestions that include other items to pair your selection with.

The current state of visual search

eBay announced in 2023 that they were gearing up to launch Find It On eBay, adding image search functionality to their app and mobile website and enabling users to snap a photo and instantly find anything on eBay that looked like it. They’ve also cashed in outside of their own domains, recently announcing a collaboration with Mashable in the US where users can shop for eBay products that bear resemblance to clothes and items in Mashable images – all without leaving Mashable’s site.

The potential for ecommerce sites to capitalize on this style of search is huge. Keyword-generated image search can be frustrating when faced with vast inventories and products that may have been incorrectly or poorly tagged, or described using terms we haven’t thought of. Users who have found themselves typing in every variation of a colour name or style description in the hope of finding what they’re looking for are more than ready for a simple and effective visual-match search.

People can find cheaper alternatives to items they’ve seen in a shop window or print magazine, or identify a variety of plant they’d like to add to their garden. As well as offering increased convenience from a user perspective, retailers who optimize their sites effectively should find an increase in relevant traffic that’s ready to convert.

Optimizing for visual search

Though visual search will have uses in a range of industries, it seems fair to assume that this intuitive tactic will be retail-dominated. Sites should still ensure that their images are optimized using structured data and other traditional SEO tactics, but going forward there’s also a need for imagery to be clutter-free and easy for visual search tools to process while the technology is still developing. And of course, there needs to be plenty of imagery to digest in the first place.

How to optimize for visual search:

Offer a range of clear images for each product

Optimize image titles with target keywords

Submit image sitemaps

Set up image badges

Optimize image sizes and file types

Run structured data tests
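As a small illustration of the "submit image sitemaps" step above, the sketch below generates a minimal image sitemap using the standard sitemap namespace and Google's image-extension namespace; the page and image URLs are placeholders.

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace and Google's image-extension namespace.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"

def build_image_sitemap(pages):
    """pages: iterable of (page_url, [image_urls]) pairs."""
    ET.register_namespace("", SITEMAP_NS)
    ET.register_namespace("image", IMAGE_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for page_url, image_urls in pages:
        url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = page_url
        for image_url in image_urls:
            image = ET.SubElement(url, f"{{{IMAGE_NS}}}image")
            ET.SubElement(image, f"{{{IMAGE_NS}}}loc").text = image_url
    return ET.tostring(urlset, encoding="unicode")

print(build_image_sitemap([
    ("https://example.com/product/red-boot",
     ["https://example.com/images/red-boot-front.jpg"]),
]))
```

The resulting XML can be saved and submitted through Search Console so crawlers can discover product images that aren't directly linked in the page markup.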

Generally speaking, the more steps there are between the start of the purchase funnel and the checkout, the higher the cart abandonment rate. The Baymard Institute says that on average, nearly 70% of online carts are abandoned before checkout, and a lengthy process to reach the payment screen can cause around a third of users to ditch a site and shop elsewhere.

In essence, traditional image optimisation is still half the battle – the other is ensuring that you’ve put enough time and thought into your product photography in the first place.

Visual search is set to dramatically improve the online shopping experience in the coming years, and with retail ecommerce sales in the UK predicted to reach a value of almost £94 billion in 2023, there couldn’t be a better time to cash in.


Google Search Gets Stories, Video Results, Google Lens In Images

Google Search is now 20 years old, and to celebrate the landmark, Google has announced a host of changes aimed at making search more visually interactive as well as convenient.

More Stories

Stories in search results will look like WhatsApp Status, but the major difference is that Google’s take on Stories will also contain relevant text-based information about a famous personality arranged in a sequential order, helping users get more information in a visually engaging fashion.

The Stories will leverage Google’s AMP platform and will be curated by an AI algorithm to show relevant information. AMP Stories arrived on the scene this year, and these new Search Stories are an extension of the feature. Moreover, each Stories card will have an embedded link that will take users to the source if they want to do more in-depth research. Stories will begin appearing in Google Search starting today.

Video Previews

Another notable change is video previews in search results, which will play a featured clip that Google determines has the most relevant content related to your search query.

For example, if you search for a term like the Alps, Google will show you a video about a trip to the Alps, with suggestions about places of interest and landmarks.

Google Lens Integration in Images

Google Lens has proved to be a highly useful tool so far, allowing users to instantly find relevant information such as product details, or translate text on the go. Google has announced that Google Lens will now be available in image search as well, making it even more convenient for users to find more details about what the photo is about. Users will also be able to draw on a specific part of the image to get information. Google Lens integration in Google Images will be rolled out in the next few weeks.

Activity Cards

Another very useful feature announced by Google is the activity cards, which allows users to retrace their search history for the same query. For example, if you search for Michael Jordan and visit 4 websites, Google will show you activity cards containing your past browsing history the next time you enter the same keyword, along with suggested pages with more relevant information.

Activity cards are an opt-in feature, as users can choose to disable it whenever they want, and also delete their search history to prevent others from retracing their search history.


Collections

The Collections feature in Search allows users to organize relevant images and content they read, so that they can easily access these when they search for the same topic again. Collections has also received a couple of upgrades, such as allowing users to import content from activity cards directly to Collections, as well as suggestions from other users’ Collections.

In-Depth Search Results

Another notable change announced by Google is that search results will now surface sub-topics related to a search query, so that users can easily discover relevant information without having to do in-depth research of their own.

For example, if users search for a particular breed of cat, they will also see separate tabs for related topics such as traits of that breed, grooming tips, etc. The updated search format is already live and will soon encompass more topics in the foreseeable future.

More Contextual Information in Images

In order to provide more information related to an image in search, each photo result will now be accompanied with additional contextual information such as related search terms, the title of the publisher and more.

All these changes will start trickling in soon, as Google rolls out the features to mark its 20th year of existence. It’s Google’s birthday, but we are getting the gifts.

Third Of Consumers Have Used Visual Search

Chart of the Week: 36% of people have used visual search and over half say that visual information is more important than text when shopping online

Over a third of people have used or performed a visual search, according to new research from Intent Lab. A total of 36% of survey respondents said they have used visual search, which is the same rate as those who have performed voice searches, showing that search is changing.

As well as the rise of visual search, 59% also said that visual information is more important than text, showing that you should be incorporating visuals into all aspects of your marketing strategy.

Mobile is more visual

When it comes to visual search, mobile is the most popular device, with 53% saying their smartphone is the main device they use for this type of search. This is largely unsurprising given the visual nature of social media, the portability of smartphones, and their built-in cameras.

This suggests mobile optimization, as well as your social media strategy, should largely be image led. Providing eye-catching and memorable images helps to put across information quickly and easily, as well as being ideal for visual search results.

Images are key when online shopping

The majority of consumers believe visual information to be more important than text when it comes to online shopping.

In fact, the only three product types where text is seen as being more important than images are household, electronics and wine and spirits. This is likely due to the fact that specifications and descriptions are required in order to ensure the suitability of a product.

Comparatively, a huge 86% of people believe that images are more important than text when it comes to buying clothes and 85% agree when purchasing furniture. Cars, groceries and vacations were all slightly more even with 58%, 56% and 55%, respectively, saying images were more important when purchasing these things.

While it’s hard to deny that text is still an important aspect of product listings, both in terms of SEO and informing the customer, the current trend shows that a range of images – such as simple product shots and lifestyle photos – could be more beneficial and influence sales.

Images aid with comparison

One of the things that visual information helps with the most when it comes to online shopping is comparison. Around 41% of survey respondents said that they find images helpful when comparing products.

Visual information allows shoppers to look at and compare aspects that they see as being important but that retailers may not necessarily include in their text descriptions. For example, when it comes to clothing, shoppers can compare the fit, quality, colour and matching items, making purchase decisions easier.

In terms of the exploring stage when online shopping, respondents said they find visual information helpful when it comes to learning about different options and ideas, as well as browsing category and product information.


When evaluating products, as well as aiding with comparisons, visuals help shoppers create a list of brands to choose from and develop opinions on the different available options. The better the images, the better products look, helping shoppers keep your brand in mind.

The final part of the process is making the decision to buy a product. The majority of respondents said that visual information is very helpful at this stage also, suggesting that failing to use images could affect conversion.

Lack of trust affects visual search

Despite the rise in visual search, trust is still a big blocker when it comes to more people adopting it. A total of 37% of respondents said that trust is the primary factor stopping them from relying on visual search, while a further 26% said that privacy was an issue.

The main trust issue for consumers is that promoted items show up first, even though they may not be the best match. Some 24% of respondents said this is the main reason they do not fully trust visual search.

Respondents also do not trust the fact that many visual search results are created by social influencers who are paid by brands. This means that their posts and reviews can be biased and so do not offer an accurate representation to those searching for products. Similarly, 15% of respondents said that search results are created by brands, also making them biased.

Final thoughts

While visual search is far from the most popular type of search, it is on the rise and consumers are already viewing imagery as being more important than text. Ensuring that your images, as well as your written content, are optimized can help improve results no matter how people are searching.

Taking a more visual approach to your marketing, especially on social media and when it comes to product pages, can also help to improve brand awareness and conversion. Offering imagery throughout the buyer journey can help ensure your customers have all the information they need to make a decision. You can also incorporate user-generated content to balance out your more “biased” content.

What The Google Pixel Means For The Future Of Android

One of Android’s biggest strengths is also its biggest weakness. Its openness helped drive adoption, experimentation, innovation, and diversity. It also created the ugly fragmentation mess we see today. To some extent, the Nexus attempted to address that issue by providing a model where, like Apple, Google could push updates as soon and as often as it wants. The Pixel could be an even stronger push in that direction, with Google in control of everything. That control, however, might be at the expense of openness. The Pixel didn’t just kill off the Nexus in terms of product availability and market; it may have also killed its spirit.

The Nexus has always been the most open Android device on the market. Short of actually providing official rooting methods, Google gave developers everything they needed to hack the device. While still sold and marketed as a consumer product, the Nexus, in truth, was practically a developer device. It valued openness highly and invited anyone who could to dive inside it. There is nothing about the Google Pixel that hints at that characteristic. In fact, everything about the Pixel reeks of proprietary features.

A changelog leaked earlier showed which features in Android 7.1 are going to be exclusive to the Pixel. Unsurprisingly, those are the very features that make the Pixel worthwhile. If those features ever became common, the Pixel would lose its commercial edge and become yet another Nexus, just with a different name. And no doubt, that has already caused a stir among Android’s staunchest and most vocal allies: open source developers and fans.

There was no small amount of outcry back when Google started replacing AOSP (Android Open Source Project) apps with its own proprietary suite in Nexus and Google Play Edition devices. Those up in arms felt it was an affront to the open source friendliness and simplicity of the Nexus spirit. That has since then died down and people have come to accept Google’s decision, simply replacing the default Nexus firmware with a custom ROM of their choice. After all, Google made it easy to do so.

The Pixel, however, goes above and beyond that. In addition to the staple Google apps, it puts Google Assistant, Duo, Allo, Photos, and Cloud right at the center of the Pixel experience. Google Assistant and some of the cloud backup options are completely unique to the Pixel; no other Android smartphone will have them, at least for now. Maybe Google will eventually bring them to the whole Android world, but it would then lose its bargaining chip for future Pixel smartphones. Chances are, it isn’t in a hurry to do so.

There is also no indication of how hackable the Pixel smartphones will be: no hint of factory images, unlockable bootloaders, or the other puzzle pieces needed to easily develop custom ROMs for the device. Given the proprietary bits that are critical to the Pixel’s success, it is unlikely that Google will make it as easy as a Nexus. And even if it does, some exclusive features might not make it to those third-party Android ROMs. They might be better off getting other smartphones for hacking.

This could very well be Google’s vision of what it takes to make Android compete with the iPhone head on, and it’s not a very open vision at that.

To Find Focal Length Of Concave Lens Using Convex Lens

This tutorial focuses on finding the focal length of a concave lens, for which a convex lens is used. Some materials are essential for this experiment, chief among them an optical bench and two needles. Note that if the concave lens were removed and the object kept at the same distance, the convex lens alone would form a real image.


Aim

The main aim of this experiment is to find the focal length of a concave lens with the help of a convex lens.

Required Materials

Several materials are required for this experiment: an optical bench with four uprights, a concave lens of unknown focal length (Learncbse, 2023), a convex lens, and two lens holders. A knitting needle, one thin optical needle, and one thick optical needle are also needed, along with a half-metre scale.


Theory

The theory rests on a single working formula, used to calculate the focal length of the concave lens: f = uv / (u − v).

Here f is the focal length of the concave lens L2, u is the distance of image I from the optical centre of lens L2, and v is the distance of image I′ from the optical centre of lens L2.

Figure 1: Focal Length of Concave and Convex lens

Ray Diagram and Procedure

The lab procedure follows certain steps. First, keep a convex lens of known focal length in contact with the concave lens whose focal length is to be determined; this forms the lens combination (Olabs, 2023). Next, place the combination between the illuminated wire gauze and the screen at a certain distance. Adjust the position of the screen to obtain a clear image of the wire gauze on the screen.

Figure 2: Ray Diagram

After that, the distances between the lenses and the screen have to be measured. The procedure thus passes through several stages (Zhou et al. 2023). First, the rough focal length of the convex lens is determined. Next, the convex lens is set up, the image is adjusted, and further observations are taken to obtain an accurate result.

Observation Table

Sl. no. | Position of uprights (cm) | Observed u (cm) | Observed v (cm) | Corrected u (cm) | Corrected v (cm) | Focal length f (cm) | ∆f (cm)

Table 1: Determination of the focal length of the concave lens


Calculation

The observed u is found from the difference between the positions of the two uprights, and the concave lens is observed at a variety of positions (Song et al. 2023). The corrected values of u and v are obtained by applying the index correction. The focal length is then calculated as f = uv / (u − v), and the mean is taken as f = (f1 + f2 + f3) / 3.
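The formula f = uv/(u − v) and the mean can be checked numerically. The sketch below uses hypothetical readings (in cm); since v > u in this arrangement, each focal length comes out negative, as expected for a diverging (concave) lens.

```python
def focal_length(u, v):
    """f = uv / (u - v), with u and v the distances of images I and I'
    from the optical centre of the concave lens (in cm)."""
    return (u * v) / (u - v)

# Hypothetical readings (cm); v > u, so each f is negative,
# reflecting the diverging nature of a concave lens.
readings = [(20.0, 30.0), (25.0, 41.5), (18.0, 26.0)]
focal_lengths = [focal_length(u, v) for u, v in readings]
mean_f = sum(focal_lengths) / len(focal_lengths)
print(focal_lengths, mean_f)
```

For example, u = 20 cm and v = 30 cm give f = (20 × 30)/(20 − 30) = −60 cm.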


Result

The focal length of the given concave lens, determined by the above calculation, is expressed in cm.


Precautions

Like every experiment, this one requires some precautions. The focal length of the convex lens must be less than that of the concave lens, so that the combination behaves as a convex lens. The lenses used should be clean. The needle used in the experiment should be placed far enough away to form a real image.

Sources of Error

A few sources of error are present: the uprights may not be perfectly vertical, and the removal of parallax may not be perfect.


Conclusion

The distance between the optical centre of a lens and its focal point is known as the focal length, and it can be either positive or negative. A lens is a piece of transparent glass that bends rays of light. This section discussed the process of finding the focal length of a concave lens using a convex lens.


FAQs

Q1. What is a concave lens?

A concave lens has at least one surface curved inwards. This inward-rounded shape causes light rays to diverge, and such lenses are commonly used to correct myopia.

Q2. What is a convex lens?

It is an optical lens made up of two spherical surfaces. The surfaces of the lens bend outwards, which is why it is known as a convex lens.

Q3. Which factors affect the power of a lens?

Several factors affect a lens’ power: the thickness of the lens, the wavelength of light, the radius of curvature, a change in the surrounding medium, and the refractive index.

Q4. How is focal length defined?

The focal length of a lens is the distance between its focusing point and its optical centre. This length plays an important role in experiments with concave and convex lenses.

Google Search Patents 2023: The Mega-Post

Hello and welcome to our great, big, mega-post round-up of Google Search patents from 2023.

Throughout the year, I spend time collecting and writing about Google Search patents that are of interest to the SEO community. I also write about the interesting ones every now and again here on Search Engine Journal.

Now that we’ve made it to 2023 (thank goodness), I wanted to share the entire 2023 collection with you.

They’re grouped to make things easier and so that if you’d like to do some research in a particular area, you can dig in.

It’s all about trying to learn more about various aspects of information retrieval and Google’s algorithm. I hope you find it useful.

Google Assistant, Authorship, Conversational Search / Voice, Discover, Entities, Image Search, Indexing, Knowledge Graph, Local, Links, Machine Learning, Mobile, NLP, Neural Networks, Personalization, Predictive, Query Classification, Ranking / Scoring, Recommendation Engine, Searching, Semantic, Social Networking, Spam, Temporal, Vectors, Videos

And there we have it.

Whew, what a workout!

Next up, I’ll be sharing some of the odd, non-search related patents Google was awarded in 2023. Just for fun.

Then we’re back on with the new Google Search Patents for 2023.

Stay geeky out there!

