Ep187: Danny Sullivan's Insights on Google's Ranking Misconceptions
Episode 187 contains the Digital Marketing News and Updates from the week of Nov 13-17, 2023.
1. Danny Sullivan's Insights on Google's Ranking Misconceptions - In a recent presentation, Google's Danny Sullivan addressed a common misunderstanding in the SEO community regarding how search rankings work. He emphasized that the SEO industry might be misinterpreting Google's guidance and documentation. A key example Sullivan provided was the widespread adoption of author pages and bylines, based on the belief that Google's algorithm prioritizes these elements. However, Sullivan clarified that Google does not specifically look for author pages.
The main point of Sullivan's presentation was the gap between Google's communication and the SEO community's interpretation. He acknowledged that Google needs to improve how it conveys its expectations for successful content. Sullivan urged SEO professionals to focus on creating 'people-first' content, emphasizing the importance of trustworthiness and reliability in content, rather than specific elements like author pages.
Sullivan also highlighted the challenge in communicating what Google looks for in content. He explained that Google's guidance is often interpreted too literally, leading to misconceptions about the ranking algorithm. For instance, when Google advises assessing a webpage's trustworthiness, it does not mean the algorithm is specifically looking for features like author pages. Instead, it's about evaluating the content broadly for trust factors.
The presentation called for a significant shift in how SEOs approach Google's documentation. Sullivan showed examples of how parts of Google's advice get cherry-picked and misread as direct ranking factors. He encouraged more critical thinking and a closer reading of what is actually being advised, distinguishing opinions from actual Google statements.
Sullivan concluded by noting that much of Google's recent update documentation reiterates decades-old advice: the core message has always been to create helpful, people-first content. What has changed is the technology; advances in AI and machine learning make it more plausible that these long-standing principles are now reflected in Google's ranking systems.
P.S.: For business owners, this insight from Danny Sullivan is crucial. It suggests shifting from chasing specific SEO tactics to prioritizing the overall quality and trustworthiness of your content. Understanding that Google values 'people-first' content can guide a more effective and sustainable SEO strategy, one that ensures your website not only ranks well but also genuinely serves your audience's needs.
2. GoogleSafety - Google's New Crawler - Google has updated its official list of crawlers, adding details about a previously undocumented and somewhat mysterious crawler. This update is particularly relevant for website owners and digital marketers who need to understand how Google interacts with their sites.
Understanding Crawlers: Crawlers, also known as bots or spiders, are tools used by search engines like Google to collect information from websites. This process is crucial for indexing and ranking websites in search results. There are different types of crawlers, each serving a specific purpose.
- Common Crawlers: These are primarily used for indexing various types of content. Some are also employed for search testing tools, internal Google product team use, and AI-related crawling.
- User-Triggered Fetchers: Triggered by user actions, these bots are used for tasks like fetching feeds or site verification.
- Special-Case Crawlers: These are used for unique purposes, such as mobile ads, webpage quality checks, or push notification messages via Google APIs. They ignore the global user agent directives in robots.txt (the rules under the asterisk group); see the robots.txt sketch below.
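To make the asterisk point concrete, here is a minimal robots.txt sketch. The paths are illustrative, and you should check Google's crawler documentation for the exact user agent tokens; the idea is that special-case crawlers skip the blanket group but can be addressed by a group that names them.

```
# Blanket group: regular crawlers follow this, but Google's special-case
# crawlers do not apply these rules to themselves.
User-agent: *
Disallow: /private/

# A group that names a specific crawler (token shown is illustrative) is
# how you address a special-case crawler directly.
User-agent: AdsBot-Google
Disallow: /private/
```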
The GoogleSafety Crawler: The newly documented crawler uses the "GoogleSafety" user agent. It plays a critical role in Google's process of identifying malware and stands out even among special-case crawlers: it ignores robots.txt directives entirely. Its primary function is to crawl for malware in links posted publicly on Google properties. The full user agent string for this crawler is "GoogleSafety."
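If you want to see whether the GoogleSafety crawler has visited your site, one rough option is to scan your server access logs for the agent string. A minimal Python sketch, assuming an Nginx-style access log at a hypothetical path (adjust both for your server):

```python
# Minimal sketch: scan a web server access log for hits from the GoogleSafety
# user agent. The log path and combined-log format are assumptions; the agent
# string "GoogleSafety" comes from Google's crawler documentation.
from pathlib import Path

LOG_FILE = Path("/var/log/nginx/access.log")  # hypothetical path


def googlesafety_hits(log_path: Path):
    """Yield raw log lines whose user-agent field mentions GoogleSafety."""
    with log_path.open(encoding="utf-8", errors="replace") as handle:
        for line in handle:
            if "GoogleSafety" in line:
                yield line.rstrip()


if __name__ == "__main__":
    for hit in googlesafety_hits(LOG_FILE):
        print(hit)
```

Because this crawler ignores robots.txt, log analysis rather than robots.txt rules is one practical way to observe its activity.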
3. Google's Hidden Gems Ranking System - Google has introduced the "Hidden Gems" ranking system, a significant update aimed at promoting authentic content buried within forums, social media, and blog posts. This update is distinct from the Helpful Content Update and has been part of Google's core updates for a few months. The Hidden Gems algorithm is designed to identify content that offers personal insights and experiences, which might have been challenging to find in search results previously.
The Hidden Gems update is not a classification system but rather a method to highlight content perceived as especially helpful. This content often resides in unexpected places, such as comments in forum threads, posts on lesser-known blogs, or articles with unique expertise on a topic. Google's approach with this update is to make these valuable pieces of content more accessible to users.
Initially, there was some confusion about whether this update was live and its relation to the Helpful Content Update. However, Brad Kellett, Senior Director on Google Search Product and Engineering, clarified that Hidden Gems is its own algorithm and ranking system, separate from the Helpful Content Update. This initiative is part of a series of ongoing updates, not just a single change.
4. Google Search Console Introduces New Robots.txt Report - Google has announced a significant update to its Search Console with the introduction of a new robots.txt report. This new feature replaces the older robots.txt tester tool. The report is designed to provide webmasters with detailed insights into the robots.txt files found for the top 20 hosts on their site. It includes information about the last time these files were crawled by Google and highlights any warnings or errors encountered during the process.
This update is particularly important for business owners and digital marketers who rely on Google Search Console to monitor and optimize their website's performance in search results. The robots.txt file plays a crucial role in controlling how search engines crawl and index a website's content. By providing a more comprehensive report, Google aims to make it easier for site owners to identify and fix issues that could affect their site's visibility and ranking in search results.
However, the removal of the older robots.txt tester tool has been met with mixed reactions. While some users appreciate the new report's enhanced capabilities, others miss the simplicity and familiarity of the previous tool. It's worth noting that Bing still offers a robots.txt tester, which might be a useful alternative for those who prefer the older format.
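For a quick local check while adjusting to the new report, a small script using Python's standard-library robots.txt parser can approximate what the old tester did for a single URL and user agent. This is a minimal sketch: the domain and URL below are placeholders, and it applies the standard robots.txt rules rather than reproducing Search Console's report.

```python
# Rough local stand-in for the retired robots.txt tester: fetch the live
# robots.txt and report whether a given URL is allowed for a given user agent.
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"           # replace with your domain
URL_TO_CHECK = f"{SITE}/blog/some-post"    # hypothetical URL to test
USER_AGENT = "Googlebot"

parser = RobotFileParser(f"{SITE}/robots.txt")
parser.read()  # downloads and parses the live robots.txt

allowed = parser.can_fetch(USER_AGENT, URL_TO_CHECK)
print(f"{USER_AGENT} may fetch {URL_TO_CHECK}: {allowed}")
```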
5. Google's $18 Billion Secret: Paying Apple for Safari Search Dominance - In a surprising revelation during the federal antitrust trial, Google CEO Sundar Pichai confirmed that Google pays Apple a significant 36% of its Safari search revenue. This payment, amounting to a staggering $18 billion, is in exchange for Google maintaining its status as the default search engine on all Apple devices. This disclosure came to light when Google's final witness, Kevin Murphy, inadvertently mentioned the ...