What’s Missing from the New Google Search Console?

September 17, 2019

Update, 9/11/2019: As of 9/9/2019, Google has officially retired the old Search Console.

Unfortunately, not much has changed since we first wrote this post about the differences between the old and new Search Consoles back in February. Google has acknowledged this and has continued to make some legacy tools and reports available in the old interface. A list of these tools and reports, with direct links to them, can be found here.

Google’s blog post says they are going to continue to work on making these insights available in the new Search Console. However, it is likely that some things, such as the specific and useful data that was available in the old “Crawl Errors” report, won’t be coming back. For now, your best bet is to bookmark Google’s list of legacy tools/reports, and hope for further improvements to the new Search Console.

The New Google Search Console

Over the past year or so, you may have noticed Google has been gradually unveiling piece after piece of its new Search Console experience. The old version has continued to be available; however, in some cases, reports have disappeared, replaced by links to the corresponding reports in the new Search Console.

Other reports, like the “Crawl Errors” report, simply include an ominous warning that they will be replaced soon.

These warnings have been appearing for over two months, and Google has now given us some insight into what “soon” means: After John Mueller reportedly stated that Google hopes to discontinue the old version of GSC in March 2019, Google posted on the Webmaster Central blog that a number of changes would be coming to Search Console towards the end of March. This isn’t necessarily the same thing as discontinuing the old version, but either way, it’s clear that some large changes are on the horizon. And while Google’s messaging has largely focused on enhancements in the new Console (16 months of performance data! URL inspection!) and equivalences between old and new reports, they’ve acknowledged that certain features will be dropped. Specifically:

  • HTML Improvements
  • Property Sets
  • Android Apps
  • Blocked Resources

Of these, the “HTML Improvements” report has been around the longest, dating back to the days when GSC was still called Google Webmaster Tools, and is certain to be missed by many. Google says their “algorithms have gotten better at showing and improving titles over the years,” but this disregards the fact that the HTML Improvements report was an effective means of sniffing out serious duplicate content issues, not just a tool for improving title tags.

So, what else is missing, as far as we know?

  • While the “Fetch as Google” tool has effectively been replaced by the “URL Inspection tool,” certain components are missing:
    • In the URL Inspection tool, when you click “View Tested Page” and open the “Screenshot” tab, the only available screenshot is rendered via Googlebot smartphone. There is no option for desktop rendering.
    • The side-by-side comparison of Googlebot’s view of the page versus a user’s view has been removed.
    • The screenshot doesn’t show the entire rendered page, only the portion above the fold.
  • The “Sitemaps” report is not nearly as forthcoming with data as the old report. While the old report provides a clear picture of the number of pages indexed vs. the number of pages submitted in each sitemap, the new report makes users dig for the number of valid and excluded pages, and it doesn’t provide this data for individual sitemaps that are included within a submitted sitemap index file. For one of our sitemaps, for example, the report tells us that 25,000 URLs were discovered (i.e. submitted in the sitemap), but the “See Index Coverage” text is not a clickable link; that option only works for the overall sitemap.xml index file. The more granular, per-sitemap view is absent, making the report much less useful for diagnosing indexation issues (a rough do-it-yourself check is sketched after this list).
  • There are stark differences between the “Crawl Errors” report in the old GSC and the “Index Coverage” report in the new GSC, which is supposed to replace it. Notably:
    • The incredibly useful “Linked from” tab in the old GSC, which provides a list of pages that link to each error URL, is missing from the “Index Coverage” report.
    • Comparing specific lists of errors, such as 404s, in the “Index Coverage” report with the same lists in the “Crawl Errors” report, the new report appears to be significantly under-reporting these errors. I’m seeing many examples of recently-crawled 404 errors in the “Crawl Errors” report that are completely absent from the “Index Coverage” report.
    • The “First Detected” date, which gives users an idea of how long a page has been returning errors, is missing from the “Index Coverage” report.
    • Certain designations, such as “Submitted URL has crawl issue” or “Crawl anomaly,” are less specific and not as helpful as those in the old report.
    • Crawl errors are no longer broken out by “Desktop” and “Smartphone” in the new report.
  • The “Structured Data” report, as we know it, is going away, to be replaced with individual reports for specific kinds of markup. It seems unlikely at this point that these individual reports will cover all the data types found in the old “Structured Data” report. Additionally, the ability to get an overview of all structured data types (and their errors) at a glance will be greatly missed (a rough way to approximate that overview is sketched below).
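
In the meantime, if you need a per-sitemap view of broken URLs, one rough workaround is to crawl your own sitemaps and check status codes yourself. The sketch below is a minimal Python example using the requests library; it assumes a standard sitemap index at a placeholder example.com address, and it only checks HTTP status codes, which is not the same thing as Google’s indexation data, so treat it as a supplement to (not a substitute for) anything in Search Console.

```python
# A rough, do-it-yourself sitemap status check (a sketch, not a replacement
# for Search Console data). Assumes a standard sitemap index at /sitemap.xml
# on a placeholder domain; checks HTTP status codes only, which is not the
# same as Google's indexation status.
import time
import xml.etree.ElementTree as ET

import requests

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}


def child_sitemaps(index_url):
    """Return the <loc> URL of each child sitemap listed in a sitemap index."""
    root = ET.fromstring(requests.get(index_url, timeout=30).content)
    return [loc.text.strip() for loc in root.findall("sm:sitemap/sm:loc", NS)]


def sitemap_urls(sitemap_url):
    """Return the page URLs listed in a single (non-index) sitemap."""
    root = ET.fromstring(requests.get(sitemap_url, timeout=30).content)
    return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", NS)]


def check_sitemap(sitemap_url, delay=0.5):
    """Print a per-sitemap summary of URLs that don't return a 200 response."""
    urls = sitemap_urls(sitemap_url)
    bad = []
    for url in urls:
        status = requests.head(url, allow_redirects=False, timeout=30).status_code
        if status != 200:
            bad.append((url, status))
        time.sleep(delay)  # be gentle with your own server
    print(f"{sitemap_url}: {len(urls)} URLs submitted, {len(bad)} returned non-200")
    for url, status in bad:
        print(f"  {status}  {url}")


if __name__ == "__main__":
    # example.com is a placeholder -- point this at your own sitemap index.
    for child in child_sitemaps("https://www.example.com/sitemap.xml"):
        check_sitemap(child)
```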
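
Likewise, if you rely on the old report’s at-a-glance view of structured data, a crude stopgap is to tally the markup on a handful of key pages yourself. The sketch below only looks at JSON-LD (not microdata or RDFa), pulls script blocks out with a simple regular expression, and uses example.com as a placeholder; it tells you which types are declared, not whether Google considers them valid.

```python
# A crude tally of JSON-LD structured data types on a page (a sketch only).
# It ignores microdata/RDFa, relies on a simple regex to find the script
# blocks, and says nothing about whether Google considers the markup valid.
import json
import re
from collections import Counter

import requests

JSONLD_RE = re.compile(
    r'<script[^>]*type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
    re.DOTALL | re.IGNORECASE,
)


def jsonld_types(url):
    """Count the @type values declared in JSON-LD blocks on a single page."""
    html = requests.get(url, timeout=30).text
    counts = Counter()
    for block in JSONLD_RE.findall(html):
        try:
            data = json.loads(block)
        except ValueError:
            counts["(unparseable JSON-LD block)"] += 1
            continue
        items = data if isinstance(data, list) else [data]
        for item in items:
            if isinstance(item, dict):
                counts[str(item.get("@type", "(no @type)"))] += 1
    return counts


if __name__ == "__main__":
    # example.com is a placeholder -- run this against your own key templates.
    for type_name, count in jsonld_types("https://www.example.com/").most_common():
        print(f"{count:4d}  {type_name}")
```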

There are a number of other reports that fall under the “Currently unsupported features” list on Google’s guide to migrating to the new Search Console, such as “International Targeting,” “Crawl Stats,” “Robots.txt tester,” and more. The guide says they aren’t yet supported, leading one to believe they will eventually be added, but given that HTML Improvements is on the list and Google has since said that report is being dropped, the future of all of these reports is in question. This makes the proposed March discontinuation of the old Search Console particularly worrisome. As that date nears, it becomes less likely that the majority of these reports will make the cut. I would urge Google to continue working towards parity between the new and old versions of Search Console, and to hold off on discontinuing the old version until all valuable features are available in the new one.
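
While its fate is unclear, if the Robots.txt tester is part of your workflow, basic allow/disallow rules can at least be spot-checked locally with Python’s standard library. This is only a sketch: urllib.robotparser does not replicate every detail of Google’s own parser (wildcard handling in particular differs), and the domain and paths below are placeholders.

```python
# Spot-check robots.txt rules locally (a rough stand-in for the old
# Robots.txt tester, not an exact replica of Google's parser).
from urllib.robotparser import RobotFileParser

# example.com and the sample paths are placeholders.
rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

for path in ("/", "/private/report.html", "/search?q=test"):
    url = "https://www.example.com" + path
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "blocked"
    print(f"{verdict:8s} {url}")
```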
