Do you know which factors Google considers when deciding whether a page has a high or low quality score, and how you can find those pages on your own site? There is a long list of things to examine when determining which pages affect your rankings and which don't, ranging from page load times to searcher behavior and spelling mistakes. Rand covers all of it in a single episode of Whiteboard Friday.
Click on the whiteboard image below to open a high-resolution version in a new tab!
Hello, everyone, and welcome to another edition of Whiteboard Friday. In this edition, we will talk about how Google identifies the quality of a page on a website, and how that should affect the optimization options we consider.
The real challenge here is to identify which pages on your website are actually low quality, and what "low quality" even constitutes.
What constitutes a "quality score" for Google?
Google uses a number of signals to separate low quality from high quality. Some of them are pretty common and we are completely familiar with them; some are more intriguing. So…
- Unique content is always appreciated by Google.
- They check that the content on the page is genuinely unique, not simply rewritten in different words and phrases from another page. Their real purpose is that searchers should get something different. If you have questions about what unique value means, checking out the earlier Whiteboard Friday on that topic is a good idea.
- Google loves to see many external sources linking to a page editorially. It shows that the page has high-quality content and is genuinely worthwhile for readers.
- In addition, they prefer to see not just any sources and domains, but high-quality pages linking to yours. These can be external as well as internal links: if your own high-quality pages link to another page on your site, Google can interpret that as a positive signal too.
- The page should successfully answer the searcher's query.
This is an intriguing one. For example, say someone types "pressure valves" into the Google search box and this page shows up. If the searcher clicks through to that page and stays there, or comes back to Google but performs a different search, moves on to a different task, visits a different website, or goes back to their email, or whatever it may be, that signals to Google that the page solved the query.
On the other hand, if someone conducts a search, clicks a link, and lands on a very low-quality page, chances are they will come back immediately and select a different result. That shows Google that the page did not succeed in answering the query. If this happens again and again, the behavior is called pogo-sticking: visitors don't get an answer to their query, so they bounce between results looking for an adequate solution. There is a great chance this will move your rankings down, as Google will consider the page low quality.
- The page should load fast on all internet connections.
- Google also likes to see high-quality accessibility. Your site should offer a wonderful user experience and design on all types of devices, whether that is a desktop, a laptop, or a tablet.
- They want the content on the page to be well-spelled and grammatically correct. I understand this might surprise some of you, but we have run research and tests showing that bad grammar or poor spelling can remove featured snippets from Google. If you have a featured snippet that is doing a wonderful job in the SERPs, but you make some modifications and mess up the spelling or grammar, Google may decide it no longer qualifies, meaning the content on your page is no longer considered high quality. That is how we learned that Google analyzes pages for this type of information.
- There should be text alternatives for non-text content. This is the main reason Google encourages the use of the alt attribute, and why they love to see transcripts of videos. On our blog there is a transcript below this video, so you can get all the important content without needing to listen to us, whether because you don't want to or because you don't have the ability to.
- Also, they want to see well-organized, easy-to-understand content on the page. This can be interpreted from lots of different signals, and their machine learning systems are capable of picking it up.
- Furthermore, Google loves to see content that links out to additional sources for more details or follow-up tasks. External links from a page can do this job efficiently.
This is not an exhaustive list, but these are some of the things that help Google judge the quality of a page so they can filter accordingly.
How can SEOs and marketers filter pages on their sites to identify high quality versus low quality?
There is a useful process SEOs and marketers can follow to do this. We have no way to access all the components Google uses to measure quality, but we can still look at things that help us figure out what is low quality and what is high quality, and answer the question of whether we should improve, recreate, or even delete a piece of low-quality content from the site.
If you ask me, I suggest you never judge pages on your site by the following metrics on their own:
- Raw bounce rate
- Assisted conversions
- Organic visits
- Raw time on site
Some of you might be wondering why. Because, taken alone, they are misleading signals that can be harmful in the long run.
If someone spends a long time on your website, they may be engaged with your content. But it can also be because the searcher is annoyed and can't find the content they require, in which case they will return to the Google search page and click something different that solves their query better. Or it can be because there are many pop-ups on your site and the searcher has to click through them, struggling to find the x-button and scrolling a long way to reach your content, which frustrates them.
Bounce rates work in a similar fashion
A high bounce rate is not a big concern if the natural next step is to go somewhere else, or if you are answering a simple query. For example, if I need to know how many years of study an MCA degree takes and the answer turns out to be three, I'm fine. There is no need for me to stay on that page for an extended period of time or visit more of your site. Your bounce rate might climb up to 90%, but you still gave me the answer I was looking for, which is exactly what Google wants. So bounce rate by itself is a bad metric to judge by.
The same goes for organic visits
You can own a page with low-quality content that still gets a great amount of organic traffic for one reason or another. It may rank for lots of long-tail queries yet disappoint searchers. Organic visits work a little better over the long run: you get better information by watching the metric over the course of months rather than days. But I still don't like it on its own.
Assisted conversions are a perfect example
There is a good chance a given page won't convert anyone directly. It may instead drop cookies for retargeting, or attract visitors to sign up for your email list, without directly producing goal conversions. That doesn't mean it has low-quality content.
These can be a wonderful start:
Now, I suggest you think about a combination of metrics instead. Whenever you analyze high versus low quality, take a blend-of-metrics approach.
It could be a combination of engagement metrics
Here, we will look at the following things:
- Total visits
- External and internal links
- Pages per visit after landing
Imagine someone comes to your page and then checks out other pages on the website: that's a wonderful sign. If they only browse a handful of pages, that's a worse sign, but it shouldn't be taken by itself; combine it with other metrics like bounce rate, total visits, external links, and time on site.
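As a concrete illustration, the blend-of-metrics idea can be sketched in a few lines of Python. The metric names, weights, and caps below are my own assumptions for illustration only; nothing here is a formula Google or Moz has published.

```python
# A minimal sketch of blending engagement metrics into one rough score.
# Weights and caps are invented assumptions, not published thresholds.

def engagement_score(page):
    """Combine several engagement signals into a rough 0-1 score."""
    # Normalize each signal into the 0-1 range with simple caps (assumed).
    visits = min(page["total_visits"] / 1000, 1.0)        # cap at 1,000 visits
    links = min(page["external_links"] / 50, 1.0)         # cap at 50 links
    pages_per_visit = min(page["pages_per_visit"] / 5, 1.0)
    # A high bounce rate is only mildly penalized, since bouncing can be
    # healthy for pages that answer a query at a glance.
    bounce_penalty = max(page["bounce_rate"] - 0.9, 0) * 2

    score = 0.4 * visits + 0.3 * links + 0.3 * pages_per_visit - bounce_penalty
    return max(score, 0.0)

page = {
    "total_visits": 800,
    "external_links": 10,
    "pages_per_visit": 2.5,
    "bounce_rate": 0.55,
}
print(round(engagement_score(page), 2))
```

The point of the sketch is the structure, not the numbers: no single signal can sink or save a page on its own.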
Combining some offsite metrics
These are things like:
- Number of linking root domains
- External links
- PA (Page Authority) and your social share counts, such as LinkedIn, Twitter, and Facebook shares, can be accounted for here. Something that gets lots of social shares may not match searchers' needs, but it can still have high-quality content.
Search Engine Metrics
Here, you need to look at the following things:
- You can check a page's indexing status by typing its URL directly into the browser bar or the search box.
- You can also check which pages rank for their own title.
- You can open Google Search Console and check click-through rates.
- You can check unique versus duplicate content.
For example, if I search for one of my URLs and several pages from my site show up, or I search for a title I wrote myself and multiple URLs from my website appear, that suggests my site has uniqueness issues.
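One quick way to run that uniqueness check at scale is to group the URLs from a crawl export by their titles and flag any title shared by more than one page. A minimal sketch, with made-up URLs and titles:

```python
# Flag potential duplicate-content issues: any <title> used by more
# than one URL in a crawl export. URLs and titles here are invented.
from collections import defaultdict

pages = [
    ("/valves/pressure", "Pressure Valves | Example Co"),
    ("/valves/pressure-2", "Pressure Valves | Example Co"),
    ("/valves/safety", "Safety Valves | Example Co"),
]

by_title = defaultdict(list)
for url, title in pages:
    by_title[title].append(url)

# Titles that appear on two or more URLs are candidates for review.
duplicates = {t: urls for t, urls in by_title.items() if len(urls) > 1}
print(duplicates)
```

The same grouping works for meta descriptions or H1s pulled from a Screaming Frog or similar export.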
You should also perform a real hand review of a handful of pages
- Pages from subdomains, subsections, or subfolders
If you have them, pause and ask: does this actually assist visitors? Is the content present and up to date? Does it meet the organization's standards?
Making Three Buckets
You can create buckets by utilizing these combinations of metrics. It can be done very easily by exporting all of your URLs with a tool such as Moz's crawler, DeepCrawl, or Screaming Frog, pulling all your pages with these metrics into a single spreadsheet. After that, you can begin sorting and filtering. Develop some type of algorithm, some blend of the metrics, to make the determination, double-check it by hand, and then sort your pages into three buckets.
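A minimal sketch of that bucketing step, assuming you have already exported your pages with a few metrics per URL. The field names, weights, and thresholds are invented for illustration and should be tuned against a hand review of sample pages:

```python
# Sort crawled URLs into three quality buckets using a crude blended
# signal. All weights and cutoffs below are assumptions for illustration.

def bucket(page):
    """Classify a page as 'high', 'needs-work', or 'low' quality."""
    signal = (page["organic_visits"]
              + 10 * page["linking_root_domains"]
              + 100 * page["conversions"])
    if signal >= 500:
        return "high"        # keep as-is
    if signal >= 50:
        return "needs-work"  # improve and republish
    return "low"             # candidate for the removal test

pages = [
    {"url": "/guide", "organic_visits": 900, "linking_root_domains": 12, "conversions": 3},
    {"url": "/old-post", "organic_visits": 60, "linking_root_domains": 1, "conversions": 0},
    {"url": "/thin-page", "organic_visits": 2, "linking_root_domains": 0, "conversions": 0},
]

for p in pages:
    print(p["url"], bucket(p))
```

Whatever blend you choose, spot-check the results by hand before acting on them; the algorithm only has to be good enough to triage, not perfect.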
First One - High Importance
This is high-quality content that you should keep on your site.
Second One - Needs Work
This content requires some modifications, but it can still stay in the search engines, because it is not that bad and doesn't reflect poorly on your brand. It is not content the search engines would call low quality, and you would not be penalized for it.
The problem is that it is not living up to your hopes or expectations. You should improve its quality and republish it.
Third One - Low Quality
This is content that is not meeting searchers' needs in any way. However, you don't need to delete it straight away; what you need is some testing. Take a sample of your worst content from the low bucket, remove it from the website (keeping a copy), and then check whether crawl budget, indexation, rankings, or search traffic improve after those pages are gone. If that works, start being more liberal with the content you cut from the low-quality bucket, and get better results from Google.
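The removal test itself can be as simple as comparing average daily organic traffic to the rest of the site before and after pulling the sample. A toy sketch with invented numbers:

```python
# Compare average daily organic visits before and after removing the
# sample of low-quality pages. All figures below are made up.

def avg(xs):
    return sum(xs) / len(xs)

traffic_before = [1000, 980, 1020, 990]   # daily organic visits, pre-removal
traffic_after = [1050, 1080, 1060, 1090]  # daily organic visits, post-removal

# Relative change in average daily traffic after the removal.
change = (avg(traffic_after) - avg(traffic_before)) / avg(traffic_before)
print(f"{change:+.1%}")
```

In practice you would want a longer window and to account for seasonality before crediting the removal for any lift.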
All right, everyone, we've reached the end. I hope you enjoyed this article, and we'll see you next week.