Frédéric Dubut and Fili Wiese spoke at SMX Advanced in Seattle in June. The session was so popular that they will team up again to discuss the latest news about Bing and Google penalties and algorithms at SMX East in New York on November 13.
Frédéric Dubut, leader of the spam team at Bing, and I participated in the first-ever joint presentation between Bing and a former Google representative at SMX Advanced. We spoke about how Google and Bing handle webspam, penalties, and algorithms. Since we couldn’t address every question during the Q&A session, we’re following up here. Below are the questions submitted about Google and Bing penalties, along with our responses.
Q: Does the disavow tool work for algorithmic penalties, or is it mainly for manual actions?
A: The disavow tools from Bing and Google are crucial for resolving link-related manual spam actions/penalties. They also help websites with a history of active link building distance themselves from low-quality links that violate the Bing or Google Webmaster Guidelines. While Google doesn’t apply algorithmic penalties as such, both engines use disavow data as a data point when testing various ranking algorithms.
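For context, Google’s disavow file is a plain text upload with one entry per line, a domain: prefix to disavow an entire host, and # for comments. A minimal sketch, with purely illustrative hostnames:

```
# Paid-link network discovered in a backlink audit
domain:spammy-directory.example
domain:paid-links.example

# Individual pages we could not get removed
http://blog.example/low-quality-post-1
http://blog.example/low-quality-post-2
```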
Q: Any thoughts or tips on combating spam users’ posts in user-generated content sections (reviews, forums, etc.)?
A: Vigilance is key when combating user-generated spam, and monitoring your communities is essential for brand protection. Use safeguards like CSRF tokens and batch review of user submissions. Tools like Akismet or reCAPTCHA can limit spammer activity. If you can’t dedicate resources to moderating UGC sections, consider not allowing links at all. Remember, no tool can completely stop human ingenuity, so committing resources, including trained staff, is essential to reducing the risk of user-generated spam.
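If you do allow links in user-generated areas, one common safeguard is to mark them as untrusted so search engines don’t pass trust through them. A minimal TypeScript sketch, assuming the submitted markup has already been sanitized elsewhere (the regex rewrite here is illustrative only, not a substitute for a real HTML sanitizer):

```typescript
// Illustrative sketch: tag links in user-submitted HTML as untrusted.
function markUserLinks(html: string): string {
  return html.replace(/<a\s+([^>]*?)>/gi, (_match, attrs: string) => {
    // Strip any existing rel attribute, then add one that tells search
    // engines the link is user-generated and should not pass trust.
    const cleaned = attrs.replace(/\srel="[^"]*"/gi, "");
    return `<a ${cleaned} rel="ugc nofollow">`;
  });
}

// Example: a forum post submitted by a user.
console.log(markUserLinks('<p>Check <a href="https://example.com">this</a>.</p>'));
// -> <p>Check <a href="https://example.com" rel="ugc nofollow">this</a>.</p>
```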
Q: How can you identify if someone is purchasing links?
A: It’s about intent and trends. Typically, a quick look at backlink data is enough to raise suspicions. Reviewing the backlink profile in detail usually uncovers clear evidence.
Q: Regarding known issues with JavaScript indexing, how are you addressing cloaking, since many server-side rendering (SSR) and dynamic rendering solutions resemble cloaking? Is it hard to distinguish malicious intent?
A: Focus on the intent behind the solution. If it’s designed to deceive search engines by showing different content to bots versus users, it’s cloaking and a serious violation of both Bing and Google Webmaster Guidelines. To avoid misunderstandings by search engine algorithms while offering a better user experience on JavaScript-heavy sites, follow progressive enhancement principles.
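As an illustration of that principle, the TypeScript sketch below assumes a product list that is already present in the server-rendered HTML, so crawlers and users without JavaScript see the same content; the script only layers a client-side filter on top. The element IDs are hypothetical:

```typescript
// Progressive-enhancement sketch: the baseline content works without JS;
// JavaScript only adds a convenience feature (client-side filtering).
document.addEventListener("DOMContentLoaded", () => {
  const list = document.querySelector<HTMLUListElement>("#product-list");
  const filter = document.querySelector<HTMLInputElement>("#product-filter");
  if (!list || !filter) return; // baseline experience still works without these hooks

  filter.hidden = false; // reveal the enhancement only when JS is available
  filter.addEventListener("input", () => {
    const term = filter.value.toLowerCase();
    for (const item of Array.from(list.children) as HTMLElement[]) {
      item.hidden = !item.textContent?.toLowerCase().includes(term);
    }
  });
});
```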
Q: Can a site be verified in Google Search Console or Bing Webmaster Tools while a manual penalty is applied?
A: Absolutely. For Bing Webmaster Tools, create an account to file a reconsideration request. For Google Search Console, verify your site as a domain property to see if any manual actions apply.
Q: Is there a way to report link spammers to Google? We have thousands of toxic backlinks with the anchor text "The Globe," and removing them is costly.
A: Yes, you can report violations like link spamming through Google’s webspam report channel. The Google Webmaster Help forums, which are monitored by Google Search employees, are another way to escalate such issues.
Q: Does opening a link in a new tab (using target="_blank") cause any SEO issues, penalties, or poor quality signals?
A: Opening a link in a new tab has no impact on SEO. Do consider the user experience, though, since forcing links to open in new tabs can be annoying.
Q: Should we proactively disavow scraper sites and other spammy links not part of a black hat campaign?
A: Yes, if they significantly affect your backlink profile. The disavow tool helps you manage backlink risk and distance your site from shady links. It’s a suggestion for search engines, and they decide how to use it.
Q: How is a cloaking penalty treated? At the page level or sitewide? Can it be algorithmically treated or purely manual?
A: Cloaking is a major violation for both Bing and Google and is addressed with algorithms, manual penalties, and more. Deceptive cloaking often results in removal from the index, possibly at the domain level if the violation is egregious.
Q: Can a manual penalty on subdomain pages affect the overall domain?
A: It can, depending on the penalty and its impact on the domain’s overall SEO signals. If a penalty is applied, rankings and site growth may suffer. The best course of action is to fix the issue and request reconsideration from the search engine.
Q: Do Bing and Google penalize out-of-stock inventory pages? We have many soft 404s. Any suggestions for handling this on large e-commerce sites?
A: Neither Google nor Bing penalizes sites simply for having a lot of 404 pages. However, a large number of soft 404s can hurt search visibility. Serving “smart 404” pages that suggest alternatives, while still returning a 404 HTTP status code (or using a noindex directive), helps. Consult an SEO expert to determine the best strategy, considering factors like site size and product availability.
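To make the status-code side concrete, here is a minimal sketch assuming a Node/Express product route; the Product type and the findProduct, renderProductPage, and renderNotFoundPage helpers are hypothetical stubs standing in for your own data and templating layers:

```typescript
// Sketch only, not production code: return real HTTP statuses for removed
// products instead of "soft 404" pages (HTTP 200 pages that merely say "unavailable").
import express, { Request, Response } from "express";

type Product = { id: string; inStock: boolean; discontinued: boolean };

// Hypothetical helpers, stubbed out for illustration.
async function findProduct(id: string): Promise<Product | null> { return null; }
function renderProductPage(p: Product): string { return `<h1>${p.id}</h1>`; }
function renderNotFoundPage(): string { return "<h1>Not available - see alternatives</h1>"; }

const app = express();

app.get("/products/:id", async (req: Request, res: Response) => {
  const product = await findProduct(req.params.id);

  if (!product || product.discontinued) {
    // Permanently gone or never existed: send a real 410 (or 404) along with
    // a helpful "smart 404" page that suggests alternatives.
    res.status(410).send(renderNotFoundPage());
    return;
  }

  // Available or only temporarily out of stock: keep the page live with HTTP 200.
  res.status(200).send(renderProductPage(product));
});

app.listen(3000);
```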
Have more questions?
Join us at SMX East for more updates about Bing and Google penalties and algorithms.