SEO Checking tools give bad feedback #7
Comments
Thanks @thomasklaush for this observation; this is a very relevant question and finding. @iamSahdeep any comments or similar experiences? Also, did you learn anything more about this topic: #1
Seems like a valid issue. It probably takes some time to place all the elements in the DOM, but by then the crawler has already finished its pass. We need to find a way to keep the webpage in a loading state until our process is done; I'm not sure yet how to achieve that and will have to look into it. The thing is, Googlebot still gets all the data from the DOM: the second-to-last result here is from my personal website (the same page as the seobility link above), and it shows all the data in the search result for that page. @rydmike No luck on any response to it; I tried to get some feedback here as well, but no response there either. Do you know someone from Google, or perhaps an expert on this topic, who could shed some light on it?
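A minimal sketch of the loading-state idea, assuming the renderer tags its injected elements with a known CSS class. The class name `seo-text` and the overlay id `loading-overlay` below are hypothetical placeholders for illustration, not the package's actual API: keep a static overlay in web/index.html and remove it only once the injected text nodes are present in the DOM.

```dart
import 'dart:html' as html;

/// Removes a static loading overlay only after the SEO text nodes have
/// been placed in the DOM, so a crawler that waits for the page to
/// finish loading sees the injected content.
Future<void> removeOverlayWhenSeoContentReady() async {
  // Poll for an element the renderer is assumed to inject
  // (the '.seo-text' selector is a placeholder).
  while (html.document.querySelector('.seo-text') == null) {
    await Future.delayed(const Duration(milliseconds: 100));
  }
  // The overlay element is assumed to be defined in web/index.html.
  html.document.getElementById('loading-overlay')?.remove();
}
```

Whether crawlers actually wait for the overlay to disappear would still need to be verified; this only illustrates delaying the "loaded" state until the DOM work is done.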
Google will recognize this as a black-hat SEO technique.
Exactly, I wouldn't use it in production! This can be treated as cloaking by search engines...
So potentially the only solution is always replacing
When pages are checked with and without using the package, no improvement is found: the tools report no h1 text, no text in general, ...
Example check of the example app with seobility.net:
https://freetools.seobility.net/de/seocheck/check?url=https%3A%2F%2Fseo-renderer.netlify.app%2F%23%2F&crawltype=1
Best regards,
Thomas