You are reading the article Google Search Patents 2023: The Mega, updated in December 2023 on Katfastfood.com.
Hello and welcome to our great, big, mega-post round-up of Google Search patents from 2023.
Throughout the year, I spend time collecting and writing about Google Search patents that are of interest to the SEO community. I also write about the interesting ones every now and again here on Search Engine Journal.
Now that we’ve made it to 2023 (thank goodness), I wanted to share the entire 2023 collection with you.
They’re grouped to make things easier and so that if you’d like to do some research in a particular area, you can dig in.
It’s all about trying to learn more about various aspects of information retrieval and Google’s algorithm. I hope you find it useful.

The patents are grouped into the following categories:

Google Assistant
Authorship
Conversational Search / Voice
Discover
Entities
Image Search
Indexing
Knowledge Graph
Local
Links
Machine Learning
Mobile
NLP
Neural Networks
Personalization
Predictive
Query Classification
Ranking / Scoring
Recommendation Engine
Searching
Semantic
Social Networking
Spam
Temporal
Vectors
Videos
And there we have it.
Whew, what a workout!
Next up, I’ll be sharing some of the odd, non-search related patents Google was awarded in 2023. Just for fun.
Then we’re back on with the new Google Search Patents for 2023.
Stay geeky out there!
Featured image by author, January 2023
The way forward for AI and retail is Visual Search
Machine learning is taking online search to new levels, and it’s not just voice search that’s on the rise. While many have been focused on the marketing potential of capturing conversational queries barked at Alexa over breakfast, big brands have been quietly developing a stronger competitor in interactive SEO: visual search.
For decades, we’ve searched for information and products online using a text search bar. The introduction of voice search has since made waves in local SEO, with users searching for opening times, directions, weather reports and other of-the-minute information – but it has left many online retailers feeling stumped. Not every trend fits with every business model, and for businesses who rely heavily on visuals to drive conversions, the opportunity brought by voice search has often felt limited.
Humans process visuals 60,000 times faster than text, and according to research by Kissmetrics, 93% of consumers consider visuals to be the key deciding factor in a purchasing decision – which is why, over the last few years, ecommerce sites have been bulking up photo galleries and adding 360° videos in an effort to increase conversions. Now, thanks to innovations like Google Lens and Pinterest’s Shop the Look, the payoff for that work seems set to increase.

What is visual search?
When you’ve seen an item – or an image of an item – that’s caught your eye but you’re unsure of the brand, the model, or the name of that style, that’s where visual search comes in.
Unlike an image search, where an ordinary text search pulls possibly relevant images using structured data, visual search is the process of fulfilling searches by using machine learning to analyze components within a submitted photo and finding results that replicate or relate to those visual cues. Think of the way that Facebook now recognizes the faces of friends you’ve tagged in past images – it’s this genre of technology that is now being used to develop wider visual search.
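At its core, this kind of matching can be sketched as nearest-neighbour search over image feature vectors. The snippet below is a minimal, illustrative sketch only: the catalogue items and tiny three-dimensional vectors are invented for the example, whereas real visual search systems use deep-learning embeddings with hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two feature vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def find_similar(query_vec, catalog):
    # Rank catalogue items by visual similarity to the query image
    scored = [(name, cosine_similarity(query_vec, vec))
              for name, vec in catalog.items()]
    return sorted(scored, key=lambda item: item[1], reverse=True)

# Hypothetical catalogue: each product has a feature vector
catalog = {
    "red leather boot": [0.9, 0.1, 0.8],
    "blue canvas shoe": [0.1, 0.9, 0.2],
    "red suede heel":   [0.7, 0.3, 0.6],
}
query = [0.85, 0.15, 0.75]  # vector extracted from the shopper's photo
ranking = find_similar(query, catalog)
print(ranking[0][0])  # most visually similar item
```

The same idea scales up with approximate nearest-neighbour indexes once the catalogue grows beyond a few thousand items.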
Just as you can look across a room and see a variety of objects, read labels and observe features, so can visual search AI. Google Lens, for example, can see an image of a landmark and offer you details on its history or where to buy tickets to go inside. It can look at your photo of a book and offer recent reviews for that title, places to buy it online and alternative books by the same author.
In the same manner, Pinterest’s visual search functions allow you to select different parts of a photo – a pair of shoes, a lamp, a paint colour – and find similar products to purchase. It can also offer outfits or room décor suggestions that include other items to pair your selection with.

The current state of visual search
eBay announced in 2023 that they were gearing up to launch Find It On eBay, adding image search functionality to their app and mobile website and enabling users to snap a photo and instantly find anything on eBay that looked like it. They’ve also cashed in outside of their own domains, recently announcing a collaboration with Mashable in the US where users can shop for eBay products that bear resemblance to clothes and items in Mashable images – all without leaving Mashable’s site.
The potential for ecommerce sites to capitalize on this style of search is huge. Keyword-generated image search can be frustrating when faced with vast inventories and products that may have been incorrectly or poorly tagged, or described using terms we haven’t thought of. Users who have found themselves typing in every variation of a colour name or style description in the hope of finding what they’re looking for are more than ready for a simple and effective visual-match search.
People can find cheaper alternatives to items they’ve seen in a shop window or print magazine, or identify a variety of plant they’d like to add to their garden. As well as offering increased convenience from a user perspective, retailers who optimize their sites effectively should find an increase in relevant traffic that’s ready to convert.

Optimizing for visual search
Though visual search will have uses in a range of industries, it seems fair to assume that this intuitive tactic will be retail-dominated. Sites should still ensure that their images are optimized using structured data and other traditional SEO tactics, but going forward there’s also a need for imagery to be clutter-free and easy for visual search tools to process while the technology is still developing. And of course, there needs to be plenty of imagery to digest in the first place.

How to optimize for visual search:
Offer a range of clear images for each product
Optimize image titles with target keywords
Submit image sitemaps
Set up image badges
Optimize image sizes and file types
Run structured data tests
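One of the steps above, submitting image sitemaps, is easy to automate. The sketch below generates an image sitemap using Google's sitemap-image XML extension; the example.com URLs are placeholders invented for the example.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"

def build_image_sitemap(pages):
    # pages: mapping of {page_url: [image_url, ...]}
    ET.register_namespace("", SITEMAP_NS)
    ET.register_namespace("image", IMAGE_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for page_url, image_urls in pages.items():
        url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = page_url
        for img in image_urls:
            image = ET.SubElement(url, f"{{{IMAGE_NS}}}image")
            ET.SubElement(image, f"{{{IMAGE_NS}}}loc").text = img
    return ET.tostring(urlset, encoding="unicode")

xml = build_image_sitemap({
    "https://example.com/product/red-boot": [
        "https://example.com/images/red-boot-front.jpg",
        "https://example.com/images/red-boot-side.jpg",
    ],
})
print(xml)
```

The resulting file would be referenced from robots.txt or submitted via Search Console like any other sitemap.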
Generally speaking, the more steps there are between the start of the purchase funnel and the checkout, the higher the cart abandonment rate. The Baymard Institute say that on average, nearly 70% of online carts are abandoned before checkout, and a lengthy process to get to the payment screen can cause around a third of users to ditch a site and shop elsewhere.
In essence, traditional image optimisation is still half the battle – the other is ensuring that you’ve put enough time and thought into your product photography in the first place.
Visual search is set to dramatically improve the online shopping experience in the coming years, and with retail ecommerce sales in the UK predicted to reach a value of almost £94 billion in 2023, there couldn’t be a better time to cash in.
Google Search is now 20 years old, and to celebrate the landmark, Google has announced a host of changes aimed at making search more visually interactive as well as convenient.

More Stories
Stories in search results will look like WhatsApp Status, but the major difference is that Google’s take on Stories will also contain relevant text-based information about a famous personality, arranged in sequential order, helping users get more information in a visually engaging fashion.
The Stories will leverage Google’s AMP platform and will be curated by an AI algorithm to show relevant information. AMP Stories arrived on the scene this year, and these new Search Stories are an extension of the feature. Moreover, each Stories card will have an embedded link that will take users to the source if they want to do in-depth research. Stories will begin appearing in Google Search starting today.

Video Previews
Another notable change is video previews in search results, which will play a featured clip that Google determines has the most relevant content related to your search query.
For example, if you search for a term like the Alps, Google will show you a video about a trip to the Alps, with suggestions about places of interest and landmarks.

Google Lens Integration in Images
Google Lens has proved to be a highly useful tool so far, allowing users to instantly find relevant information such as product details, or translate text on the go. Google has announced that Google Lens will now be available in image search as well, making it even more convenient for users to find out more about what a photo shows. Users will also be able to draw on a specific part of the image to get information. Google Lens integration in Google Images will be rolled out in the next few weeks.

Activity Cards
Another very useful feature announced by Google is activity cards, which allow users to retrace their search history for the same query. For example, if you search for Michael Jordan and visit four websites, Google will show you activity cards containing your past browsing history the next time you enter the same keyword, along with suggested pages with more relevant information.
Activity cards are an opt-in feature: users can disable them whenever they want, and also delete their search history to prevent others from retracing it.

Collections
The Collections feature in Search allows users to organize relevant images and content they read, so that they can easily access these when they search for the same topic again. Collections has also received a couple of upgrades, such as allowing users to import content from activity cards directly to Collections, as well as suggestions from other users’ Collections.

In-Depth Search Results
Another notable change announced by Google is that search results will now surface sub-topics related to a search query, so that users can easily discover relevant information without having to do in-depth research.
For example, if users search for a particular breed of cat, they will also see separate tabs for related topics such as traits of that breed, grooming tips, etc. The updated search format is already live and will encompass more topics in the foreseeable future.

More Contextual Information in Images
In order to provide more information related to an image in search, each photo result will now be accompanied by additional contextual information such as related search terms, the title of the publisher and more.
All these changes will start trickling in soon, as Google rolls out the features to mark its 20th year of existence. It’s Google’s birthday, but we are getting the gifts.
Google is simplifying Search Console reports so you can focus more on issues that affect how your website appears in search results.
The upcoming changes will affect the ‘warning’ label for URLs and items. There’s confusion around whether this status means a page or item can’t appear in Google.
To alleviate that confusion, top-level items will be labeled as either valid or invalid.
‘Valid’ refers to pages or items without critical issues, while ‘invalid’ refers to pages or items with critical issues.
In a blog post, Google sums up how this will benefit Search Console users:
“We think this new grouping will make it easier to see quickly which issues affect your site’s appearance on Google, in order to help you prioritize your fixes.”
You’ll also see ‘valid’ and ‘invalid’ labels when looking at reports rendered by Google’s URL inspection tool.
Individual issues are still classified as error, warning, or good, which is communicated through use of color and icon rather than a text label.
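The regrouping can be illustrated with a few lines of code. This is only a sketch of the reporting logic as described – not Google's actual implementation – showing that warnings no longer produce their own top-level status.

```python
def top_level_status(issues):
    # issues: list of per-item severities: "error", "warning", or "good".
    # Under the new grouping, only critical issues ("error") make an
    # item invalid; warnings no longer get a separate top-level label.
    return "invalid" if "error" in issues else "valid"

print(top_level_status(["good", "warning"]))   # valid
print(top_level_status(["warning", "error"]))  # invalid
```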
The following reports are affected by this update:
Core Web Vitals: Poor/Need improvement/Good categories are retained, while pages are grouped into good and not-good tables.
Mobile Usability: Categories are labeled as either ‘Not usable’ or ‘Usable.’
AMP report: Warnings are replaced with ‘valid’ and ‘invalid’ labels.
Rich result reports: New labels will apply to Events, Fact checks, Logos, and other types of report.
URL Inspection: The top level verdict for a URL will be either:
URL is on Google
URL is on Google, but has issues
URL is not on Google
You may not see any changes today, as this update is rolling out gradually over the next few months.
This is only a reporting change in Search Console. There are no changes to how your website is crawled, indexed, or served in search results.
Sources: Google Search Central, Search Console Help
Featured Image: Screenshot from chúng tôi June 2023.
Changes to Google’s image search results over the years, and current best practices, are discussed in a newly uploaded presentation.
The presentation was given by Francois Spies, Product Manager for Google Images, at Google’s Webmaster Conference in November.
Google just published a batch of videos from the conference so, for most people, this is their first time seeing it.
Here’s a recap of Spies’ presentation on Google Images.

Google Images vs Google Search
The presentation begins with a comparison between how people use Google Images and how they use regular web search.
People come to image search not just to find an image; they use image search to get things done in the real world.
Some of the many different use cases include shopping, interior design ideas, inspiration, learning how to complete tasks, and so on.
Image search is just a tool that helps people’s brains process information faster than reading a web page.
Over the years Google has changed image search to reflect its different use cases.
More on that in the next section.

Evolution of Google Image Search
See below for an example of what Google Images looked like two years ago.
As you can see it was mostly images with no context, which is admittedly hard for users to navigate when they’re looking to get something done.
In the years that have passed, Google has made many changes to image search.
Google started providing more context to images by displaying text snippets, as well as the domain where the image was found.
This additional context allows users to determine which result is most relevant to them depending on their use case.
In the backend, Google went from ranking just the images to ranking images from the best landing pages, depending on what the user wants to accomplish.
Google fundamentally changed how it ranks results to help users find the content behind the image.
Google also introduced structured data to image search for products, recipes, and videos.

Recent Changes to Google Images
The changes discussed in the section above rolled out over the past few years.
Here are some more recent changes to Google image search that launched over the last several months.

Image Search Optimization Best Practices
The last part of Google’s presentation goes over optimization best practices for image search.
Google narrows down its image optimization tips to these three best practices.
Use structured data (especially for products, videos, and recipes).
Use descriptive titles, descriptions, and file names.
Use high-quality & optimized images, placed next to text, on mobile-friendly pages.
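The first of those tips, structured data, is usually supplied as schema.org JSON-LD embedded in the page. Below is a minimal sketch for a product with image data; the product name, URLs and description are invented for the example.

```python
import json

def product_image_markup(name, image_urls, description):
    # Minimal schema.org Product markup emphasizing the image field
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "image": image_urls,
        "description": description,
    }, indent=2)

markup = product_image_markup(
    "Red Leather Boot",
    ["https://example.com/images/red-boot-front.jpg"],
    "Ankle-height red leather boot with a stacked heel.",
)
print(markup)
```

The resulting JSON would be placed in a `<script type="application/ld+json">` tag on the product page, then checked with a structured data test.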
See the full presentation in the video below:
It is not every day that you come across search technology that challenges the established leaders in the search space – and on the grounds of relevance, no less.
It is a fact that there is an explosion of content on the web. Relona’s technology is based on the theory that the keyword space (or query space) has to grow in order to cover the whole swath of content on the web. This means that, on average, users will have to enter more keywords to get more relevant results from the search engine.
On the major search engines, the results returned for long queries are not as relevant as those returned for short queries. This is where Relona’s Intent Based Search algorithm uses statistical models to better map content to search queries by adjusting the emphasis on the words that form the query.
Intent Based Algorithm
The intent-based algorithm does computationally what the user does by habit when the results returned do not meet the relevance criteria – use different words to convey the same meaning. This is where Relona’s technology takes the middle path between natural language processing and pure link analysis. By using statistical models, Relona’s search technology “guesses” users’ intent by adjusting the weight on the different terms used in the search query.
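One simple statistical way to reweight query terms – a sketch in the spirit of what is described here, not Relona's actual proprietary model – is inverse document frequency: terms that appear in almost every document carry little intent signal and are down-weighted, while distinctive terms are emphasized. The toy corpus below is invented for the example.

```python
import math

def idf_weights(query_terms, documents):
    # Weight each query term by a smoothed inverse document frequency:
    # ubiquitous words ("best") get low weight, distinctive words
    # ("hiking") get high weight, approximating the searcher's intent.
    n = len(documents)
    weights = {}
    for term in query_terms:
        df = sum(1 for doc in documents if term in doc.split())
        weights[term] = math.log((n + 1) / (df + 1)) + 1
    return weights

docs = [
    "best hiking boots for winter",
    "best budget laptops reviewed",
    "best waterproof hiking boots guide",
]
weights = idf_weights(["best", "hiking", "boots"], docs)
# "best" appears in every document, so it is weighted below "hiking"
print(weights["best"] < weights["hiking"])  # True
```

A ranking function would then multiply each term's match score by its weight, so results matching the distinctive terms rise to the top.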
Sprucing up relevance as compared to Google
What is really interesting is the enhanced relevance that MSN, Yahoo and chúng tôi achieve (as compared to Google) when combined with Relona’s Intent based algorithm.
MSN-Live – Relevance improves by 20% for two or more keywords.
Yahoo – Relevance improves by 35% for two or more keywords, making it 5% more relevant than Google.
The results are provided at chúng tôi as impressive presentations and were obtained from analysis of AOL’s query logs. It is also worth noting that the analysis was done without any direct integration with the search engines (is this a hint that they are open to acquisition?).
Relona’s technology is all the more alluring because it seeks to build upon the base of the first era of search engines and uses a statistical model that improves with usage. A demonstration of the search engine’s technology can be accessed from here.