What should you focus on in 2022 as part of your search marketing? Our team has put together what you need to implement on your website to rank higher in the SERPs.
Each year Google releases a few updates. The number of ranking factors and requirements for online resources is constantly growing, so there is always food for thought. Each Google innovation is another challenge and a test of your professionalism, requiring a thorough analysis of the situation and a search for the best solutions.
The top algorithms of the year that will still be relevant in 2022:
- Core Web Vitals — the algorithm was launched in the summer of 2021 and continues to be actively developed.
- Page Experience — a new ranking signal, launched in May 2021 and focused on improving user experience.
- Mobile-first Indexing — in 2021, Google began switching all websites from desktop-first to mobile-first indexing and continues to do so.
- Passage Indexing — the algorithm launched in February 2021; it allows the search engine to better understand pages with relevant but poorly structured content — “looking for a needle in a haystack.”
- Updating the Page Quality algorithm to the E-A-T concept.
- Updating the Title display in Google search results.
In 2022, search engine optimization will need to keep an eye on future updates while making sure the web resource matches existing requirements as closely as possible before each release. Here’s how to do it.
Core Web Vitals
In 2020, Google announced a new algorithm for assessing website quality, Core Web Vitals, and launched it in the summer of 2021. If you follow it, you can significantly improve your resource’s position in the search results.
The parameters evaluated by Core Web Vitals:
- LCP (Largest Contentful Paint) — how quickly the main content of the page is loaded;
- FID (First Input Delay) — how quickly the page responds to the user’s first interaction;
- CLS (Cumulative Layout Shift) — how much the layout shifts during loading.
Tips for adapting to new ranking factors:
- Use the Chrome User Experience Report data to see what share of users have a negative experience on the website.
- Use Google Search Console to understand how the website performs on mobile devices and on desktops.
- Analyze groups of pages by LCP, CLS, and FID to identify problematic pages.
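As a sketch of that grouping step, the helper below sorts pages into the standard Core Web Vitals buckets using Google’s published “good”/“poor” thresholds. The metric values would come from CrUX or your own field data; the page list and field names here are illustrative assumptions.

```python
# Classify pages by Core Web Vitals using the published thresholds:
# values at or below the "good" bound are good, values above the
# "poor" bound are poor, and everything in between "needs improvement".

THRESHOLDS = {
    # metric: (good_upper_bound, poor_lower_bound)
    "lcp": (2.5, 4.0),    # Largest Contentful Paint, seconds
    "fid": (100, 300),    # First Input Delay, milliseconds
    "cls": (0.1, 0.25),   # Cumulative Layout Shift, unitless
}

def rate(metric: str, value: float) -> str:
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value > poor:
        return "poor"
    return "needs improvement"

def worst_rating(page: dict) -> str:
    """A page is only as strong as its worst Core Web Vitals metric."""
    order = {"good": 0, "needs improvement": 1, "poor": 2}
    ratings = [rate(m, page[m]) for m in THRESHOLDS]
    return max(ratings, key=order.__getitem__)

# Hypothetical field data for three page groups.
pages = [
    {"url": "/home",    "lcp": 2.1, "fid": 80,  "cls": 0.05},
    {"url": "/catalog", "lcp": 3.2, "fid": 120, "cls": 0.08},
    {"url": "/blog",    "lcp": 4.6, "fid": 310, "cls": 0.30},
]

for p in pages:
    print(p["url"], worst_rating(p))
```

Sorting pages this way makes it easy to start with the “poor” group, where fixes move the needle most.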
The Core Web Vitals parameters help a website meet current requirements and stay competitive in the search results. After all, this official part of Google’s ranking algorithm is essentially an assessment of how users perceive the page.
SEO specialists and developers need to understand precisely how the Core Web Vitals parameters affect optimization effectiveness: over time they will play an increasingly important role in ranking. Other things being equal, websites with better LCP, FID, and CLS will take higher positions. The good news is that there is still time to get ahead of the competition!
Page Quality YMYL projects
Google is concerned about user safety, so it checks YMYL (Your Money or Your Life) websites with special attention. They include resources about medicine, finance, and psychology. Websites of this type are subject to the highest requirements — E-A-T (Expertise, Authoritativeness, Trustworthiness), and these are evaluated by real people, not search robots.
Reasons for E-A-T criteria:
- the increase in the spread of misinformation and fake news;
- extremist behavior on the Internet;
- the impact of online information on elections;
- criticism of the quality of medical information on the Internet;
- the impact of misinformation on the spread of diseases.
Diverse and high-quality content on the website is a big plus. It includes standard product cards, listings, news, reviews, and articles — all valuable information that allows a user to understand the topic and make a purchase decision. It is best if the content author is an expert in the particular field.
Google says: “Understanding who is responsible for a website is an important part of the E-A-T evaluation for most types of websites. In addition, high-quality pages should have clear information about the site so that users feel comfortable and trust it.”
On blogs, for example, it’s essential to list the author under each post. It’s ideal if the signature is a hyperlink that leads to a page with all of the author’s content and brief information about them.
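One common way to make that authorship explicit to search engines is schema.org Article markup with an author property. The snippet below generates such a JSON-LD block; the headline, name, and URL are placeholders, and the field set is a minimal illustration rather than a complete schema.

```python
import json

def article_jsonld(headline: str, author_name: str, author_url: str) -> str:
    """Build a minimal schema.org Article JSON-LD block with author info.

    schema.org defines many more optional properties (datePublished,
    publisher, image, ...); this sketch keeps only the author part.
    """
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {
            "@type": "Person",
            "name": author_name,
            "url": author_url,  # page with the author's bio and other posts
        },
    }
    return '<script type="application/ld+json">\n%s\n</script>' % json.dumps(
        data, indent=2
    )

# Placeholder values for illustration only.
print(article_jsonld("SEO Trends 2022", "Jane Doe",
                     "https://example.com/authors/jane-doe"))
```

The `url` field plays the same role as the hyperlinked signature: it points to the page collecting all of the author’s content.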
Here we are talking about the reputation of the website or organization. The following factors support it:
- A vast amount of quality content.
- Completed sections “Contacts” and “About Us.” If the website involves purchases or financial transactions, there should also be a support service so that users can get help in solving problems.
- Proof of good reputation: prestigious awards, references from well-known representatives, and industry professionals.
To adapt content to E-A-T requirements, check your website against the criteria above.
Page Experience
The algorithm focuses on improving the user experience and has several requirements:
- an optimized mobile version that meets the mobile-first algorithm;
- high loading speed that meets Core Web Vitals;
- open content;
- absence of viruses and malware;
- SSL certificate and HTTPS connection;
- no annoying ads that cover your workspace.
Read more about the algorithm in the official Page Experience help.
Mobile-friendly
This metric shows whether the website is optimized for mobile devices. You can use the Search Console report to check whether your website has problems with its mobile version. It will highlight specific weaknesses and the URLs where they occur.
In addition to this report, you can use the standalone Google Mobile-Friendly Test tool, which allows you to test individual URLs.
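The Mobile-Friendly Test can also be called programmatically. The sketch below assumes the Search Console API’s `mobileFriendlyTest:run` endpoint and a valid API key; it only builds the request, leaving the actual network call commented out.

```python
import json
import urllib.request

# Assumed endpoint of the Mobile-Friendly Test API (Search Console API).
API_URL = ("https://searchconsole.googleapis.com/v1/"
           "urlTestingTools/mobileFriendlyTest:run")

def build_request(page_url: str, api_key: str) -> urllib.request.Request:
    """Prepare (but do not send) a Mobile-Friendly Test API request."""
    body = json.dumps({"url": page_url}).encode("utf-8")
    return urllib.request.Request(
        f"{API_URL}?key={api_key}",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    req = build_request("https://example.com/", "YOUR_API_KEY")  # placeholder key
    # Sending the request needs a real key and network access:
    # with urllib.request.urlopen(req) as resp:
    #     print(json.load(resp))
    print(req.full_url)
```

This is useful for testing many URLs in bulk instead of pasting them into the tool one by one.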
Safe Browsing is a metric that shows whether your website has malware, dangerous downloads, deceptive content (such as phishing), or other similar problems. You can use the report in the Google Search Console or the Safe Browsing site status tool to check for safe browsing issues.
HTTPS
The metric covers the SSL certificate and the website’s performance over an HTTPS connection. This is how Google evaluates whether sites can encrypt the user data entered during registration and payment.
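A practical chore on the HTTPS side is catching certificates before they expire. The sketch below uses Python’s standard ssl module; the date parser assumes the usual OpenSSL-style `notAfter` format returned by `getpeercert()`.

```python
import socket
import ssl
from datetime import datetime, timezone

def days_until_expiry(not_after: str) -> int:
    """Parse an OpenSSL-style 'notAfter' string, e.g. 'Jun  1 12:00:00 2025 GMT'."""
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    expires = expires.replace(tzinfo=timezone.utc)  # 'GMT' is UTC
    return (expires - datetime.now(timezone.utc)).days

def cert_not_after(host: str, port: int = 443) -> str:
    """Fetch the server certificate over TLS and return its 'notAfter' field."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()["notAfter"]

# Live usage (performs a real TLS handshake, so it needs network access):
# print(days_until_expiry(cert_not_after("example.com")))
```

Running a check like this on a schedule avoids the sudden ranking and trust problems an expired certificate causes.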
Pop-up ads on your phone
Pop-up ads make content less accessible, especially on mobile devices, because they cover most of the screen. This degrades the user experience, and Google accordingly “punishes” such sites.
Mobile-First Indexing
Launched last year, Mobile-First Indexing continues to gain momentum. It is an algorithm for indexing and ranking web pages in Google search that evaluates the mobile version of a site. That is, the search engine will show mobile versions of resources first if they are available. This applies to search results on both smartphones and PCs.
How do you know if your site is in the Mobile-first Index
You can determine which type of Googlebot is leading on your website through a report in Google Search Console. You can also track which robot is crawling pages in the “Coverage” tab. If the main robot is Googlebot for smartphones, your resource is already indexed by Mobile-First Indexing.
Passage Indexing
Google has learned to index not only web pages as a whole but also to understand the relevance of their parts.
Does Google index sections or parts of a page independently?
Google still indexes entire pages but will also consider the content and meaning of individual passages when searching for the most relevant information. So even if the answer to a search query is hidden deep inside the page, the system will be able to find that particular part for you.
The search engine does not index individual excerpts. However, it better checks the page content and displays valuable information when ranking. This innovation helps identify pages with one specific section that matches the search request particularly well.
Passage indexing looks at the page content, determines whether parts of the page answer search queries, and delivers those results to the search engine. If a page already ranks well, passage indexing may not affect it.
Previously, the search engine “did not like” long and poorly structured texts. Now, if such a long text contains valuable information, the system will recognize it and, thanks to Passage Indexing, rank the content better.
It does not mean that now all texts should be long. It means that you can stop splitting good informative articles into many small ones to get a better ranking for each piece of material.
Updating the Title tag in Google search results
In August, many SEOs noticed that title tags and other HTML elements were being replaced in Google search results. Some title updates had unfavorable results: new words were inserted, grammar was changed, title structures were distorted, or ellipses were removed from truncated titles. Google modifies titles for several reasons:
- limitation on the number of pixels;
- request-based modification — Google changes the fragment based on the request;
- complete replacement — the search engine completely changes the snippet because the original one didn’t display the content well enough.
As a result of this update, the title tag snippets seem shorter. But that’s not because the pixel limit has been lowered. This is an indirect result of the “readability and relevance algorithm.” It is designed to change the way title tags are presented in a search.
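Since the limit is measured in pixels rather than characters, any character-count check is only a heuristic. The helper below flags titles worth reviewing; the ~60-character threshold is a common rule of thumb I am assuming here, not an official Google number.

```python
def title_risk(title: str, max_chars: int = 60) -> str:
    """Rough check for title tags likely to be truncated or rewritten.

    Google measures pixel width, not characters, so this character-count
    threshold is only an approximation for flagging candidates to review.
    """
    title = " ".join(title.split())  # collapse stray whitespace
    if not title:
        return "missing"
    if len(title) > max_chars:
        return "likely truncated"
    return "ok"

print(title_risk("SEO Trends 2022: Core Web Vitals, E-A-T and Mobile-First Indexing"))
```

A sweep like this over all page titles gives a shortlist to check manually against the actual search snippets.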
But there is no need to worry. Google has confirmed that if a different title is displayed in search results, it will not affect the page’s ranking. The search engine will still use what is in the title tag for ranking.
And to be clear: the update to title snippets has nothing to do with passage indexing.
Constantly test and refine the usability of your website:
1. Keep an eye on the analytics data — they allow you to see the real user experience:
- What’s convenient?
- What gets in the way?
- What information is missing?
- What is too hidden?
2. Draw conclusions and implement improvements.
- Reduce the number of users returning to the SERPs: get them interested in the snippet in search so they come to the site and stay there.
- Monitor performance on Google’s stated parameters through official services. Do this for your project and competitors’ niche sites — for comparison.
SEO trends change every year, adapting to the updates of search engines, user behavior, and search queries. Therefore, you need to be aware of these trends and adjust your sites according to them, constantly improving the usability, content, and technical parts.
Do you want your site to be liked by both search algorithms and users? Let’s discuss how to achieve this.