What's New In SEO 2022

A new year brings a new set of updates, along with the challenge of keeping up with increasing volatility in the Google algorithm and how it impacts both mobile and desktop SERPs. January started the year with a handful of updates and changes to how Google crawls pages. Take a look at what went down, and what you need to know for success moving forward.

Google Algorithm Updates and Changes

New Shops Section for Mobile Search

Google launched a new “Shops” section just for mobile search to start off the year. It’s designed to show three shops by default, and users can then expand it to see up to 10 different merchants or stores near them, depending on availability. The shops shown depend on their organic search ranking. A Google spokesperson told Search Engine Land, “We launched this to help present more seller options to users on Search.”

Google has expanded the shopping section, specifically for organic search, several times over the past few years. Previously, paid search results owned this section of SERP real estate, so it’s nice to see Google changing things up and giving small businesses a chance to rank through organic search results.

Page Experience Coming to Desktop

We knew this was coming. Google made a huge deal out of the Page Experience Update last spring. At launch, it only impacted mobile rankings, but starting in February, the Page Experience ranking signals will apply to desktop SERPs as well. Google made this announcement on January 17th. You’ll also start to see the Page Experience report in Google Search Console for desktop, where it was previously mobile-only. The report is very similar: it shows a graph with the percentage of “good URLs”, meaning pages that use HTTPS and have good Core Web Vitals scores, along with total page impressions. Look for that in the coming weeks. The desktop rollout itself likely won’t create a dramatic change in the actual desktop SERPs; because we knew this was coming, most SEOs have had time to prepare.

New Robots Tag Gives Websites More Control Over Indexing

On January 21st, Google announced a new robots tag, indexifembedded, applied as a robots meta tag or X-Robots-Tag HTTP header rather than in robots.txt. It tells Google you’d like your content to still be indexed when it is embedded elsewhere through iframes or other methods, even if the source content page has the noindex tag. Essentially, this gives you better control over how your content and website pages get indexed.

This mostly impacts media publishers who want their content indexed when it gets embedded on other sites, but might not want their media pages indexed individually. The new tag works in conjunction with the noindex tag, so the content can be indexed when embedded onto another page through an iframe or other HTML tag. This type of tag can get into the weeds and really only impacts a certain kind of website, but if you have questions about whether it applies to your site, reach out to your SEO provider.
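Per Google’s announcement, indexifembedded only has an effect when paired with noindex. As a minimal sketch, a media page that should only be indexed when embedded elsewhere might carry tags like these (the exact combination for your site may differ):

```html
<!-- On the media page you do NOT want indexed on its own: -->
<meta name="googlebot" content="noindex" />
<meta name="googlebot" content="indexifembedded" />
```

The same pair can alternatively be sent as HTTP response headers (`X-Robots-Tag: noindex` and `X-Robots-Tag: indexifembedded`), which is often easier for non-HTML media files.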

SEO Industry News for January

Title Tag Study

Last fall Google changed the way it generates page titles in search results, which ended up re-writing the title tags of a lot of websites. This move upset a lot of SEOs and content managers, because what’s the point of crafting these tags with SEO keywords if they’re just going to be ignored?

Now we have a better understanding of the impact of this change. A study from Zyppy.com reviewed over 80,000 title tags from nearly 2,400 sites to see how they showed up in search. The Zyppy study found Google re-wrote, at least partially, 61.6% of all title tags, and cited the following factors as most likely to get your title tag adjusted:

  • Too short or too long. The ideal length is between 50 and 60 characters.
  • Using brackets or parentheses in your title
  • Using a title separator. Most of the time it was replaced with a dash (–)
  • Using too many keywords
  • Using the same title across multiple pages
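A few of these factors can be checked mechanically. As a rough illustration (the `title_flags` helper below is ours, not from the study, and it skips the checks that need site-wide context, like keyword stuffing and duplicate titles):

```python
def title_flags(title: str, separator: str = "|") -> list[str]:
    """Flag title-tag traits the Zyppy study linked to Google rewrites."""
    flags = []
    # Zyppy's ideal length range is 50-60 characters.
    if not 50 <= len(title) <= 60:
        flags.append("length outside the ideal 50-60 characters")
    # Brackets and parentheses were associated with rewrites.
    if any(ch in title for ch in "[]()"):
        flags.append("contains brackets or parentheses")
    # Separators were often replaced with a dash.
    if separator in title:
        flags.append("uses a title separator")
    return flags

print(title_flags("Best Running Shoes (2022) | Example Store"))
```

Running a script like this across your site’s titles is a quick way to spot pages at risk of a rewrite before Google does it for you.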

Impressions Do Not Equal Search Volume

Someone asked Google’s John Mueller whether the impressions shown for a specific keyword in Google Search Console are essentially the search volume for that keyword. John stressed they are not: not every searcher for that keyword will see the same results, depending on many factors including their location, prior search history, and search habits.

Total Link Number is not a Useful Metric

John stresses that a report showing your site’s total number of links is not a useful metric on its own. His main point is that your total number of links is always changing, and none of the third-party SEO reporting sites have an accurate count. In addition, quantity alone isn’t what matters: authority, quality, industry relevance, and more all play important roles in building your overall link portfolio.

On Targeting Multiple Locations on One Site

John talked about targeting multiple locations, such as a business servicing multiple states, on one website. In this case, static location pages work well as long as it’s not overdone; for example, don’t try to target every single city across multiple states with landing pages. Also make sure your location pages have easy-to-find links so visitors can get to the information relevant to them.

Site Outages Hurt Your SEO

Unexpected (or even expected) site outages will hurt your SEO. John states Google will quickly de-index your pages if your site is down for more than a couple of days. So if you need to take your site, or any pages, down for any reason, make sure it’s handled quickly, then resubmit them for indexing so Google can pick up any fixes and improvements.
