Monday, April 27, 2020

36 Practical On-Page SEO Techniques That Have Great Worth

Dear visitors, this is not just an article; it is a practical guideline for On-Page SEO. If you are a professional and need an On-Page SEO checklist for any project, this is a complete checklist for you. Just follow all 36 of these ideas and see the difference.

If your On-Page SEO is done, you are almost done. Off-Page SEO has its own worth, but On-Page SEO is very, very important, because it is most directly related to your visitors. Every step here will improve your web pages' worth, and your visitors will stay longer if you adopt these tips and tricks. I hope you will like this effort.

Kindly share your feedback at the end, but first start working on your project. Here is your complete checklist for On-Page SEO.

A Checklist for On-Page SEO: Techniques That Really Work




1. What Is Important to Focus On for On-Page SEO


  • Title Tag or Title Element
  • Meta Description
  • Page URL
  • Heading Tags
  • Body Copy
  • Images
  • Links on the Page


2. Always Add a Title tag 

Remember, you must add a unique page title to each page or post to tell Google what the post is about.

HTML code for the page title:
<title>Your text here</title>

Google likes it best if the title is 55 to 60 characters long, not more than 60.

3. The Importance of Meta Tags in SEO in 2020

Google does not use the keywords meta tag in search rankings, but Yahoo and Bing still read it and assign very little importance to it. Many developers do not use it anymore, and those who do would want to do so with caution. This tag was abused heavily by spammers in the past, which is why search engines no longer attach much importance to it. The format of the keywords meta tag is:

HTML code for the keywords meta tag:
<meta name="keywords" content="list of keywords your page is about">

Many SEO experts now consider this tag a negative signal. In other words, if search engines see that a certain site has a lot of keywords in its keywords meta tag, that site might actually suffer because of it. This is a tag you will want to use with caution: limit the list to one, or maybe two, keywords at the most, and do not cram in too many keywords. It might even be a good idea not to use this tag on your site anymore.

4. The Description Meta Tag

Search for WordPress on Google and the official WordPress page will appear with its slogan lines. Now view the page source and find the description tag: you will find the same lines there that Google showed you in the search results. This is the power of the description meta tag.

Your position in the SERPs will not be affected by the content of your description meta tag. This meta tag matters because users see the snippet in the SERPs: if your description is well written, anyone reading it will want to click on your listing and visit your site. This is what the tag is useful for, gaining click-throughs from the SERPs.

The format of the description meta tag is:

<meta name="description" content="Your description in 155 characters">

Every page should have a unique description; you would not want to use the same description tag on more than one page.

You would also want to avoid using non-alphanumeric characters in your meta description tags.
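Putting the title and description tags together, a typical page head might look like this (the title and description text here are placeholders, not recommendations):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Unique, 55-60 character title for this page -->
  <title>36 Practical On-Page SEO Techniques That Work</title>
  <!-- Unique, roughly 155 character description shown as the SERP snippet -->
  <meta name="description" content="A complete on-page SEO checklist: title tags, meta descriptions, URLs, header tags, images and links, with practical tips for each.">
</head>
<body>
  ...
</body>
</html>
```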

5. The Yoast SEO Plugin for On-Page Optimization

The Yoast SEO plugin works like magic for on-page SEO; you can use it to optimize individual pages. A default installation of WordPress has no option to enter the meta description or keywords meta tags for a page. To enter these meta tags, you need to modify your blog theme's header.php or use a plugin.

Using a plugin is far easier than modifying theme files. It is up to you which method you use; here we will discuss the plugin method.

We'll now see how to specify a unique title and meta description for every page using the Yoast SEO plugin. When you are on the 'Edit Page' screen, the 'Add New Page' screen, or creating a new post in WordPress, scroll down and you will see the Yoast SEO settings section, provided Yoast is installed.

Enter your focus keyword and you'll see the snippet preview. Click on 'Edit Snippet' and you will find three options here:

1. SEO Title
2. Slug
3. Meta Description 

What is the SEO Title in the Yoast Snippet Options?

The SEO title is the title of your page or post as you want your visitors to see it, like the main title you add to your post or page. It is fine to paste the same title here that you wrote in the post or page title field above.

What is the Slug Option in the Yoast Plugin?

The slug is the user-friendly name of the page. Whatever you enter in the slug option becomes part of the page's URL.

What is the Meta Description Option in the Yoast SEO Plugin?

In WordPress you cannot easily add meta tags to every page by hand, so this option lets you add your meta description easily and make your page more productive for search engine optimization. Remember that nowadays the keywords meta tag can be treated as a negative signal, so it is your choice how far you use it.


6. Alt Text and its importance in On Page SEO

Alt text, or alternative text for images, used to be very important, but its importance seems to be declining nowadays. Images play a vital role in your pages, posts, and articles, and have their own importance for any website; however, search engines cannot see what an image is about.

Search engines rely on attributes like the image's alt text, file name, and description to understand what the image is about. Whenever you add an image in WordPress, you have the option to specify the title, caption, alt text, and description. It is very easy to go overboard here and end up over-optimizing.

You would not want to use the same keyword in all of these. While the alt text is still an important parameter, it might not be as important as it once was. You can see the alt text of an image by right-clicking on it and choosing 'Inspect'. We will discuss alt text in more detail in the Advanced SEO section, so stay with us.
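As a sketch, an image tag with descriptive but not keyword-stuffed attributes might look like this; the file name and text here are illustrative only:

```html
<!-- Descriptive file name, alt text, and title, each worded differently -->
<img src="twin-mattress-review.jpg"
     alt="Side view of a twin mattress on a wooden bed frame"
     title="Twin mattress review photo">
```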

7. What Is the Robots.txt File and Its Importance for SEO

Search engines like Google use robots, or bots, to crawl links and access web pages. Robots.txt is a simple text file used to communicate with these bots. The most common purpose of this file is to restrict the bots' access to parts of your website. All the main search engine spiders, or bots, look for this file before they crawl your blog or site; at least the bots of the big search engines like Google, Yahoo, and Bing read this file and follow the instructions you put there.

This file needs to be in the root of your website, where the index.php file resides.

The path of this file is yourdomain.com/robots.txt.

Remember that anyone can read this file by going to the URL yourdomain.com/robots.txt.
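A minimal robots.txt might look like this; the disallowed paths are examples only, so adjust them to your own site:

```text
# Applies to all bots
User-agent: *
# Keep bots out of admin and private areas
Disallow: /wp-admin/
Disallow: /private/
# Optional: point bots to your sitemap
Sitemap: https://yourdomain.com/sitemap.xml
```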

8. What Is the .htaccess File and How It Is Important for SEO

The .htaccess file is another command file that sits in the root of your web server, just like the robots.txt file.

The path of this file will be yourdomain.com/.htaccess.

But unlike robots.txt, not just anyone can view this file. It can also be present in any individual folder: you could say the master .htaccess file sits in the root while other .htaccess files can reside in individual folders.

You would want to note that this file does not have an extension. You can use the .htaccess file for a variety of purposes: to set up redirects, to rewrite URLs, to restrict access to certain files and folders, and so on. One thing more: .htaccess files can be used only on Linux servers, not on Windows servers, and some shared hosts disable several of its features.

If you want to use the full functionality of this file, you would typically need a Linux dedicated server or a cloud server.
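As an illustration of the redirect, rewrite, and access-control uses mentioned above, a small .htaccess file on an Apache server might contain the following; the domain and paths are placeholders:

```apache
# 301-redirect an old page to its new URL
Redirect 301 /old-page.html https://yourdomain.com/new-page/

# Rewrite example: force the www version of the domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yourdomain\.com$ [NC]
RewriteRule ^(.*)$ https://www.yourdomain.com/$1 [R=301,L]

# Block direct access to a sensitive file
<Files "config.php">
  Require all denied
</Files>
```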

9. How Focus Keywords are important for On-Page SEO 

Every page you want to monetize needs to have at least one focus keyword. The focus keyword needs to be used in these areas of your page or post: the page title, the page URL, once in an H1 tag, and so on. The focus keyword is the keyword you want the page to rank high for in the SERPs.

Use the focus keyword once within the first 100 words of the body copy, and two or three more times in the rest of the body copy. Your post should be at least 1,500 to 2,500 words in length.

Use it once in the file name of an image, and once in the meta description tag. The body copy should read naturally; it should not look like you forced the focus keyword into the copy, which is not a good sign. We will discuss this in greater detail in the Advanced SEO section.


10. How Keyword Density is important in On Page SEO 

In the past, spammers would stuff their pages with the keywords they wanted to rank for. They would even use hidden text to influence their rankings. Such techniques do not work anymore: Google is a lot, lot smarter today than it was a few years ago, and overuse of any particular keyword can hurt your rankings.

If you read your copy aloud, it should sound natural. If you are in doubt, get someone else to read it; if your copy reads naturally, it is most likely all right.

There is no magic percentage for keyword density that will catapult a page up the SERPs, and using the same keywords too many times can actually hurt your blog or site.

In a 2,000-word page, you would not want to use your focus keyword more than maybe 3 to 5 times, but again, remember there is no hard and fast rule here.
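To make the guideline concrete, keyword density is simply the number of occurrences divided by the total word count:

```text
3 uses / 2000 words = 0.15% keyword density
5 uses / 2000 words = 0.25% keyword density
```

Either figure is well within the natural range; the point is that no particular percentage is a target.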

11. How URL is Important in On-Page SEO 

The URL is another on-page factor that matters a lot. It is not just the words in the URL that matter; structure matters too. URL structure helps search engines and end users understand context better.

For example, www.abc.com/services/new-york/automobile is a URL that tells you this is a company offering automobile services in New York.

Now see another URL, www.abc.com/automobile. This URL does not give you much information.

You cannot conclude whether the page is an informational article about automobiles, whether it is selling automobile-related equipment, or whether someone is looking for automobile services. This kind of URL does not give enough information about what the page could be about.

Likewise, a search engine can get a lot of information about a page from the structure of the URL and the words contained in it. This is one of the reasons siloed sites are preferred to non-siloed sites: siloed sites have much better URL structures.

You would not want junk characters in the URL, like '?' or '&'. Try your best to avoid such symbols in URLs: they end up confusing search engines and users alike, so such pages do not rank high.

Do not worry, we will discuss URLs in greater detail in the Advanced On-Page SEO section. This is good enough for today; stay tuned for more updates.

12.  How Fresh Content is Important for On Page SEO. 

Fresh content is good, but its importance is sometimes overrated. Some webmasters go to great lengths to keep content looking fresh: they change a few words here and there, add and delete images, just to give the illusion of freshness and keep spiders coming back as often as possible. The ranking advantage of fresh content, however, is largely an SEO myth.

While it is good to have fresh content on your site, and some visitors might love your website if you post new content regularly, fresh content does not have an inherent advantage when it comes to ranking in Google. In fact, a lot of the pages ranking at the top of Google are at least several weeks, or even several months, old.

They are certainly not fresh. Those pieces of content have built up links and authority over time and are now ranking at the top of Google. Fresh content usually does not have many links, and just being fresh does not give it the power to dislodge the content already ranking at the top.

It takes relevance and popularity for a piece of content to rank at the top of Google. It does not really matter whether the content is fresh or old; if it is relevant and very popular, it will rank well regardless.

Google can now tell whether a certain search phrase is a QDF, a 'query that deserves freshness', or not. For instance, if someone is looking up information about an earthquake that happened an hour ago, or about a baseball game in progress, Google can tell it is a query that deserves freshness, and it will give priority to fresh content on the topic.

But if a high school kid is looking up what makes the sky blue, Google can tell it is not a query that deserves freshness, and fresh content gets no priority. You could say fresh content matters for news sites, but not so much for affiliate sites.

13. How HTML Header Tags Play a Role in On-Page SEO

HTML header tags are also important for on-page SEO; in particular, the H1 tag plays a big role here. There are six header tags: <h1>, <h2>, <h3>, and so on down to <h6>.

These header tags make important content stand out and help structure the textual content. The H1 tag tells the search engine what the post or page is about and what its main idea is; H1 tags are also believed to send relevance signals to Google. Google gives the most weight to the H1 tag and far less importance to the rest of the header tags.

For the less important headers, you would want to use the other header tags, H2 through H6. H1, H2, and H3 are very important, while the other three, H4, H5, and H6, have very little importance and are rarely, if ever, used on most web pages. It is still important that you have several headers.

Internet users skim pages, scanning for content they are interested in. Using headers makes browsing a more pleasant experience, and you would want plenty of white space between headers and the rest of the textual content.

You would want to use the focus keyword once in the H1 tag. The H1 tag is typically placed at the top of the page and is usually the same as the page title. Avoid using the focus keyword in the rest of the header tags: using it once in the H1 tag is sufficient, and using it in several headers might be considered spamming. H1 tags also need to be unique, so do not have duplicate H1 tags. That said, you would never want to use more than one H1 tag on a page: if you use too many H1 tags, Google can penalize the site or rank it lower in search.

A few years ago, you could use several H1 tags on your page and that could move your page up the SERPs significantly, but trying that now can hurt your site. Today, search engines look very closely at the semantic relationships between the words contained in header tags and the rest of the text on the page, in addition to the alt text of images, image file names, and the words in the URL of the page. So it is no longer possible to influence rankings just by using multiple H1 tags the way it once was. Some CMSs, or even poorly coded themes, can give your pages multiple H1 tags; you would want to make sure that does not happen on your site.
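A page that follows these rules might structure its headers like this, with a single H1 matching the page title and the focus keyword appearing only in the H1; all the text here is placeholder:

```html
<h1>Best Twin Mattress: 2020 Buyer's Guide</h1>  <!-- one H1, focus keyword here only -->
<h2>How We Tested</h2>
<h3>Comfort and Support</h3>
<h3>Durability</h3>
<h2>Our Top Picks</h2>
```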

14. Top On-Page Factors: On-Page SEO Tips and Tricks

Of all the on-page factors, the following are the most important: the title tag, URL, page content, image file names, alt text, keywords in H1 tags, and outbound links. It is important that every page links to several other closely related pages, both on the same website and on external websites.

Google wants to see pages on the internet link to each other wherever necessary. Google's algorithm is based on how research papers, like the ones published in journals, cite other sources; you will not find a single research paper that does not cite other sources.

A source that gets the most citations from several research papers is perceived as being the most authoritative on the topic, and links are just like these citations. Whenever you link out to an external resource, or to a page on your own website, you are in essence citing it as a reference. Likewise, Google and other search engines expect to see pages link to other sources wherever necessary.

A good example of this is Wikipedia: every Wikipedia page links out to several highly relevant external sources, and any source Wikipedia links to is almost certainly a high-authority website on the topic. We will discuss these factors in more detail in the Advanced SEO course; stay with us for more updates.

15. What Is the Difference Between Inbound Links and Outbound Links?

There are mainly two kinds of links: inbound links and outbound links.

Inbound links are links pointing to your site from other websites. Google sees these links as endorsements, or votes, for your site from other sites, and the number and quality of the websites linking to you is an important ranking factor. Both kinds of links are important.

In fact, the number of websites that link to, or vote for, your website is a very important factor when it comes to ranking in Google. But it is not just the quantity: these sites also need to be relevant to the main topic of your site and need to be seen as authority websites by Google.

Given the huge importance of links, a lot of people resort to black-hat methods to boost their link popularity: they trade links with other websites, buy links, or rent links,

all in the hope of rising in the SERPs. But buying or renting links is a dangerous practice, and if Google finds that a certain website is indulging in such practices, it will penalize the website. In addition to the topical relevance and authority of the sites linking to you,

Google also considers the age of a link: links that are several months old carry more weight than links that are just a day or two old.

Outbound links are links from your site to other websites or to other pages on your own website. You would only want to link to authority websites that are topically relevant to the main topic of your website; if you link to a website that has been penalized by Google, your site may suffer because of it. Google wants to see at least a few outbound links on every page.

Some webmasters mistakenly believe their pages will "leak" authority if they link out. That is not always true: Google expects you to link to other high-quality websites. From every web page on your site, you will want to link out to at least a few other very high-quality pages; after all, it is links that make the web what it is. Linking out to high-quality and topically relevant web pages actually boosts the relevance of your own pages.

16. Importance of Anchor Text in On Page SEO 

Links and anchor text are possibly the most important things in SEO. Anchor text is the clickable portion of a hyperlink. 

The format of anchor text is:
<a href="http://www.google.com">the most popular search engine</a>

Here the phrase 'the most popular search engine' is the anchor text: in this example, we are linking to www.google.com with the anchor text 'the most popular search engine'.

If you get this kind of link from high-authority sites that are relevant to the main topic of your site, with the right anchor text, your pages will rise up the SERPs really fast. On the other hand, if your anchor text is not "right", your results will not be so good; in fact, the wrong anchor text can get a site penalized.

The tricky part is that the definition of the "right" anchor text has been changing over time. What used to work a couple of years ago no longer works, and doing what worked five years ago can now get a site penalized. For example, if you were trying to rank a mattress site for, say, "best twin mattress" in 2020, all you once had to do was get as many links to your site as possible with the anchor text "best twin mattress" (or, for an automobile site, "best automobile services"), and in a short while you would be ranking on the first page. SEOs would set up private blog networks, or PBNs, and use them to link to their own sites with any anchor text they wanted.

They would even rent their PBN sites to other SEOs for a monthly fee. Things were the same for several years, and then Google released the Penguin update in April 2012, which went after such linking schemes. Google started measuring anchor text ratios, and any website with too high a percentage of a certain anchor text would be penalized.

Since the Penguin update, webmasters have been wary about their anchor text ratios. Many SEOs now do not even use exact-match anchors anymore: if they want to rank for "best twin mattress", they ensure the phrase is not used in their anchors.

They will use phrases like "click here", "visit this website", "check this out", "mattress", "buy mattress", "look at this", and so on, ensuring the phrase "best twin mattress" does not occur even once in the anchor text pointing to their sites. But this may be taking things a bit too far: if you never use the anchor text you want to rank for, you will need a very long time to rank.

You could use anchors like these instead: "this is the best twin mattress", "voted the best twin mattress two years in a row", "buy a bed online", "buying a bed", "check this out", "happy shopping", "read this review", "sleep like a baby", "most comfortable bed today", "choose a bed that's good for your back", or the bare URL. In all these ideas, only two of the phrases use "best twin mattress"; the rest do not contain the phrase at all.

While you do not want to overuse the phrase in your anchors, avoiding it completely is not good either. So you would want to limit the percentage of exact-match anchors to between 10 and 20 percent or less, while the rest of the anchors do not contain your target phrase. This is very important: all the other anchor text options above do not contain the phrase, in whole or in part. Using your URL as the anchor text is one of the safest ways to diversify your anchor text, unless your domain name happens to be an EMD (exact-match domain).
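As a rough worked example of the 10-to-20-percent guideline (the numbers are illustrative, not a rule):

```text
100 total backlinks to a page targeting "best twin mattress"
 10-20  exact-match anchors ("best twin mattress")            = 10-20%
 80-90  other anchors: brand name, bare URL, "click here",
        "read this review", partial or unrelated phrases
```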

17. What Are Dofollow and Nofollow Links, and Their Impact on On-Page SEO

Every link to your website is a vote for your website, and every such link may or may not pass some amount of "link juice" to your web page.

Google selects the pages with the most link juice and relevance for a particular topic and ranks them high. Links that pass link juice are dofollow links. For example, a dofollow link from any website would be of the format:

<a href="https://www.xyz.com">check out this mattress site</a>

The above link is a dofollow link. Note that a dofollow link from a two-day-old site with no external links will also pass some link juice, but not much.

Every time you link to an external website, you are voting for that website: you are telling Google you trust it and vouch for the content on that website.

But at the same time, if the website you linked to happens to be untrustworthy in Google's eyes, perhaps because it is infected with a virus or links to some bad websites, your website suffers along with it.

That could hurt your site's rankings. If you want to link to a website but do not trust it and do not want to pass any juice to it from your site, you can use the nofollow attribute. Links that use the nofollow attribute are "nofollow" links; the format is as under:

<a rel="nofollow" href="https://www.xyz.com">check out this mattress site</a>

This is a nofollow link, as it uses the rel="nofollow" attribute.

If you link to a sketchy site you do not trust, nofollow the link. If you link normally to a site that has pornographic or malicious content, you are voting for that site, and that can be bad; if you really must link to it for whatever reason, nofollow the link. Nofollow links are not bad: they should be part of a naturally diverse backlink profile. Links from social media sites are all nofollow, and Google expects a percentage of your links to be nofollow links.

Nofollow links can lead to conversions and sales just like dofollow links do. If none of your links are nofollow and your site has thousands of links, that may be viewed with some suspicion by the search engines; this is why Google expects a percentage of your links to be nofollow. That is just natural. Nofollow links from high-authority sites like Wikipedia may even be seen as a sign of trust by Google; at the least, it is far better to have nofollow links from authoritative, highly relevant sites than not to have them at all.

18. What Are Editorial Links, Outreach Links, and Non-Editorial Links?

There are mainly three kinds of links: editorial links, outreach links, and non-editorial links.

Editorial links: These are the ones Google values the most. Sites with outstanding content get linked to naturally by other webmasters, and the value of an editorial link depends on the strength of the domain it originates from.

Not all editorial links are the same: an editorial link from, say, nytimes.com or stanford.edu carries a lot, lot more weight than an editorial link from a relatively unknown site. Still, Google loves editorial links and gives them the highest value.

What are outreach links?

These are links you acquire by emailing other webmasters, from directory submissions, or even by paying other webmasters for a listing. Search engines sometimes find it hard to tell the difference between editorial links and outreach links, so these links can be very valuable too.

Non-editorial links

Blog comments, links from article directories, forum profile links, and guest book links are known as non-editorial links. These used to work very well until the Penguin update devalued them. Using exact- or partial-match anchors in these types of links is not a good practice.

Too many of these links is not good either: such practices can get your site into hot water now, more so if you use exact-match anchors while linking to your site with these types of links. One of the easiest ways for a search engine to conclude that a site is trying to game the system is if it sees lots and lots of spam links with exact-match and partial-match anchors.

19. What Does Site Popularity Mean in On-Page SEO?

Website popularity, or website authority, is very important, and the degree of benefit your site receives from a link depends on two things:

1. The topical relevance of the site you are getting a link from
2. The popularity or authority of that site

The most common measures of a domain's popularity and authority are covered in the sections below.


20. What Is Google PageRank, a Metric to Measure Website Authority?

Google's PageRank is a page-level metric, not a domain-level metric. PageRank is Google's proprietary scoring system, and it was, and perhaps still is, the most important of all SEO metrics.

It is possibly the most important and reliable measure of a page's authority, but it is no longer public. PageRank is a number Google assigns to every page in its index.

This number indicates how important Google thinks a page is. Unfortunately, Google does not make this data public anymore, so there is no way to check the current PageRank of any page.

Google computes this number by finding out how many links point to a page, and how important Google thinks each of the pages linking to it is. The PageRank of a page is scored on a logarithmic scale that goes from 0 to 10.

A page with a PageRank of N/A is either a new page or a page that has been penalized by Google. Since the scale is logarithmic, a page with a PageRank of 5 is 10 times as powerful as a page with a PageRank of 4, and 100 times as powerful as a page with a PageRank of 3. A PageRank of 10 is the highest possible score, and at any point in time there are just a handful of sites with pages that score this highly.

A page with a PageRank of 8 would be a very, very important page in the eyes of Google. If you get a dofollow link from a page, you are in essence getting a bit of that page's PageRank. The higher the PageRank of the page the link originates from, the more juice your site will get as a result of that link.

The amount of juice your site gets also depends on the number of outbound links on the page linking to you: the juice passed is divided more or less equally between them. So the amount of juice you get from a page with 5 outbound links is a lot more than you would get if the same page had 100 links. While PageRank is a hugely important part of Google's ranking system, it is unfortunate that they do not make this data public anymore.
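The idea of juice being divided between outbound links matches the simplified PageRank formula from Brin and Page's original paper; Google's production system is far more complex, so treat this as a sketch:

```text
PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

  T1...Tn : the pages that link to page A
  C(T)    : the number of outbound links on page T
  d       : a damping factor, typically set to 0.85
```

Note how each linking page's score PR(T) is divided by its outbound link count C(T): that is exactly why a link from a page with 5 outbound links passes more than the same link from a page with 100.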

21. How to Check Domain Authority of a Website. 

Domain authority can be said to be the strength of a domain, and Moz.com's Domain Authority, or DA, is the common measure of this strength. DA is computed by taking more than 40 individual factors into account.

Domains with a high Domain Authority have a lot of Google's trust, while domains with a very low Domain Authority are ones Google does not yet fully trust.

To check domain authority, you can use Moz's free link research tool, Link Explorer.

For example, if you check google.com there, you will find that Google has a perfect Domain Authority score of 100/100, the highest possible; so do Facebook, Twitter, and YouTube.

Very few affiliate websites have a domain authority of more than 50. Any website with a Domain Authority above 50 or 60 will very easily rank for a lot of easy and medium competition phrases with hardly any link building, because these sites already have a lot of authority and Google trusts them.

High-DA sites also have an easier time in the sense that they are very unlikely to get a Google slap or penalty no matter what they do. For instance, if a site with a Domain Authority of 5 gets a lot of spam links, it might get a penalty from Google; but if the same thing happens to a site with a DA of 80, it will not be affected in the least, because it has already earned Google's trust. That is why new sites with a very low Domain Authority need to be very careful.

You will not want to overuse the same set of anchors, you will need to be mindful of your link velocity, and you will need to be very careful about who you link to. These rules are a lot more relaxed for high-DA sites.

If you have a DA 85 site, you can blast links at it without hurting it in the least; but if you blast links at a new site with a very low DA, it will almost certainly get penalized in some way by Google.

Getting a link from a site that is very relevant to your topic and has a high DA would be very good indeed, more so if the link happens to be a homepage link.

22. Page Authority (PA) measurement through Moz and its importance in On-Page SEO 

Page Authority is a metric created by Moz.com. It is an indication of a page's ability to rank high in the SERPs. While Domain Authority measures the strength of a domain, Page Authority measures the strength of an individual page to perform well in the SERPs. Like DA, Page Authority is measured on a 100-point logarithmic scale, so it is a lot easier to go from PA 10 to PA 20 than it is to go from PA 80 to PA 90. Moz takes several factors into account while computing PA and does not reveal the exact formula. 

MozRank is Moz.com's global link popularity score. Just like Google's PageRank, MozRank is measured on a logarithmic scale that goes from 0 to 10, with 10 being the highest. MozRank and PA are now being used by many SEOs instead of Google's PageRank because PageRank is not publicly visible anymore. Links from topically relevant pages with a PA above 40 are considered very valuable. These pack serious power and can result in very good SERP improvements. 

To measure any page's PA, you can use the same Moz tool and check, for example, the authority of Google.com. The Page Authority of Google.com is 97, while its Domain Authority is a perfect 100. 

23. What are Trust Flow and Citation Flow & their Impact on On-Page SEO

Trust Flow (TF) is a metric created by Majestic.com. They rate a website on a scale of 1 to 100, and sites with a higher Trust Flow are considered to have greater authority. Majestic collated a set of high-authority sites - a seed set. If a site has links from the trusted sites in this seed set, its Trust Flow will be very high. 

On the other hand, if a site is several hops away from that seed set of high-authority sites, its Trust Flow will be low. You can say TF is a number that measures the degrees of separation of a site from the seed set of high-authority sites. A site that has links from sites in the seed set is more trustworthy than a site whose links come from several hops away from it. 

What is Citation Flow in SEO:

Citation Flow, on the other hand, is a measure of the link juice flowing to a site. The more links pointing to a site, the higher its Citation Flow will be. You can check both metrics on Majestic.com. 

Let's check the Trust Flow and Citation Flow of Google.com on Majestic.com. Enter the URL in the search area and click the Search button. 

The Trust Flow of Google.com shows at around 97 and its Citation Flow at 96. These are extremely high numbers, and you would want to remember that these are logarithmic scales.  

One ratio you will want to compute is the Trust Ratio. It is the ratio of Trust Flow to Citation Flow. As far as possible, you want to get links from domains with a Trust Ratio greater than or equal to one. 

At the very least, you want this ratio to be as close to 1 as possible, and certainly not a lot less than 1. This ratio is a very good indicator of whether the links to a site are trustworthy or spam. If the Trust Ratio of a site is greater than 1, it means the Trust Flow is greater than the Citation Flow - in other words, the links are mostly from highly trusted sites. If the Trust Ratio is equal to 1, it's still alright: it means at least a few of the links are from trusted sites.

If the Trust Ratio is far less than 1, it means the site has mostly spam links with very low Trust Flow, and you would not want a link from that domain. Any website with a lot of links pointing to it will have a high Citation Flow; if most of those links are from very low-quality sites, its Trust Flow will be very low. 

A low Trust Ratio is one indication of a site with a spam link profile, and you would not want a link from such a site. In the example above, we saw that for Google.com the Trust Flow is greater than the Citation Flow, so the Trust Ratio of Google.com is greater than 1. 
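The Trust Ratio check described above is simple enough to script. A minimal sketch in Python, assuming you have already looked up the Trust Flow and Citation Flow numbers on Majestic (the values below are the illustrative Google.com figures from the text):

```python
def trust_ratio(trust_flow, citation_flow):
    """Trust Ratio = Trust Flow / Citation Flow.
    A ratio >= 1 suggests links from mostly trusted sites;
    a ratio far below 1 suggests a spammy link profile."""
    if citation_flow == 0:
        return 0.0
    return trust_flow / citation_flow

# Google.com figures mentioned above: TF 97, CF 96
ratio = trust_ratio(97, 96)
print(round(ratio, 2))  # → 1.01
```

You would still judge borderline cases by eye, but this makes it easy to screen a list of prospective linking domains.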

24. What is Ahrefs' Domain Rating and its Importance for On-Page SEO

Ahrefs' Domain Rating is another very important metric that seems to have a strong correlation with a domain's ability to perform well in Google. Domain Rating is measured on a scale that goes from 1 to 100, with 100 being the strongest. 

The higher the Domain Rating of a domain, the more likely it is to perform well in the SERPs. It is computed by taking into account all of the backlinks to the site. Sites that have a high Domain Rating also tend to have a high Trust Ratio and a high Domain Authority. You can check it on ahrefs.com.


You will need to create an account and log in before you can check the Domain Rating of a website. To find the Domain Rating, enter the URL in the search box and click the explore button. The tool gives you a lot of information along with the Domain Rating, and you can use it for a full SEO audit of a website to find the factors that need improvement. 

25. What is Duplicate Content 

Is duplicate content really bad? This is a question that gets asked a lot in SEO circles. News websites use syndicated content, which is essentially duplicate content, as each news story gets picked up and published by several websites. Google estimates that about 20% to 30% of all content on the web is duplicate. If they were to treat all duplicate content as spam, then a large percentage of the web would be marked as spam.

What Google does not like is onsite duplicate content. If it sees that several pages on your site are copies of each other, that will hurt your site and you will want to avoid it. Some CMSes (Content Management Systems) like WordPress actually cause onsite duplicate content issues if they are not set up the right way.

If you are using posts on WordPress and have category, date and author archives on your site and use tags, then you are certain to have a large percentage of onsite duplicate content. In addition, duplicate title and meta description tags are not good either. Every page on your site needs to have a unique title and description. 

Onsite duplicate content is like dead weight that keeps your site from performing well on the search engines. It may not cause a penalty, but it will affect the performance of your site. 

The best way to handle onsite duplicate content in WordPress is to use the Yoast SEO plugin and noindex,follow all of the category, tag, author and date archives. Finally, if you really need to use a lot of text in the footer of your website pages, convert the text to an image so that it will not be seen as duplicate content. You'll also want to specify canonical URLs. WordPress makes this very easy, but it used to be a huge issue with HTML sites. 

26. Black Hat SEO Vs White Hat SEO techniques an Overview. 

The main difference between White Hat and Black Hat SEO is the off-page methods used to rank websites. Black hatters use techniques like private blog networks, 301 redirects, link spamming tier 1 properties, buying links and so on. On-page black hat methods like keyword stuffing, hidden text and multiple H1 tags used to work, but these can now seriously hurt your site. Black hat link building methods do work when done the right way, but doing it right is not easy. If Google discovers a site is using black hat methods, the site may be penalized. 

White Hat SEO, on the other hand, relies mainly on high-quality content and getting editorial links, while black hat methods are designed mainly to fool search engines into thinking a site is a high-authority site in the niche. 

White Hat SEO aims to build real authority sites that stand the test of time. Ever since the advent of the Panda algorithm, regardless of the type of links used to rank a site, the content has to be of very high quality - something that can help users with whatever they are seeking. It's no longer possible to rank with links alone, so now even black hatters need to build very high-quality websites. 

27. What is Google Panda?

Google's Panda update - also called the Farmer update - first rolled out in February 2011. Since then, there have been several Panda updates. Panda looks mainly at on-page factors. 

How it basically works is that they had a panel of expert site quality raters look at a whole bunch of sites. These raters either like or dislike every site they see. Google then uses machine learning algorithms to figure out the common parameters of the sites the raters liked and the common parameters of the sites they disliked.

They also decide upon a set of parameters that every good site should have, and the common parameters of the bad sites. They then apply this across their index and sort the good sites from the bad sites. You can say the Panda update focuses on the experience a site gives a user. They use machine learning algorithms to answer questions about a site like: Would a user trust this site with their credit card? Would a user trust the medical information on this site? And so on.  

So what are the things you can do to pass the test with flying colors? You would want to focus on these rules. Your website should be designed around user experience: it should look good, be easy to navigate, load fast and render well on mobile devices. If a site has too many ads, it is likely to score very poorly. 

If a site has too little content and looks like a cheesy sales pitch, it is almost certain to score very poorly. If the graphics do not look good and the site looks like one of those old HTML sites built in 1995, it is very likely to score very poorly. If the pages are very slow to load, that will certainly affect user experience negatively. 

Even if a site has just a few very poorly designed pages, that can affect the ability of the rest of the pages on the site to rank well. Some sites do not render well on mobile devices - that affects the site negatively as well. Content quality is another factor: it's no longer sufficient that your content is good, unique and grammatically correct. 

The bar has been raised. Your content should be share-worthy: anyone who sees it should want to share it. A few years ago, a lot of SEOs would write articles that were between 300 and 500 words long. 

These were mostly unique, grammatically correct pieces of content, and that used to be sufficient to rank. That is no longer the case. To rank, a page needs much more than that now: at least 1,500 to 2,500 words of extremely informative content that is unique and grammatically correct, with a good number of unique and relevant images and/or videos. And such pages need to be well designed and visually appealing. 

That way, a user spends a good amount of time absorbing the content and possibly shares it as well. User metrics are another factor: in addition to design and content, Panda also considers usage metrics - how many people click through from the SERPs to the page, how much time they spend on the page, how many people just bounce off the page and go back to the SERP results, and how many people visit your site directly.  

You would want to note that the Panda algorithm does not take into account the number and quality of links or the social signals a site has, and this is done on purpose. A site may have really good links from very high-authority and topically relevant domains, but if it does not provide a good user experience, all of the links and authority may not matter. The site will be pushed down in the SERPs. Google is trying to ensure only sites that provide a good user experience rise to the top of the SERPs. 

Panda is now a part of Google's core algorithm and is no longer a mere update that happens every few months like it used to be. This only goes to show how particular Google is about the kind of sites it wants at the top of the results. 

28. What is Google's Penguin?

Google's Penguin update is primarily focused on finding out which sites use spam links to game Google's system and removing them from the search results. The first update rolled out in April 2012. The Penguin update is not focused on user experience but on combating link spam. As we discussed in the dofollow links chapter, every dofollow link passes some juice. 

Links from topically relevant authority sites pass a lot more juice than links from low-authority, topically irrelevant sites. But still, a few years ago, if you managed to get thousands of low-quality links to your site, your site would rise in the SERPs. Google saw that people were gaming the system by generating thousands or even hundreds of thousands of spam links. 

All such links had exact or partial match anchors pointing to their sites, and Google wanted to change that. The Penguin update was designed to devalue these spam links and even penalize sites that indulged in such practices. 

You would want to know that Penguin does not care about how good the content on a site really is. If it concludes that the site is trying to game the system, it will devalue the links and maybe even penalize the site. 

Suppose a site provides excellent user experience and has great content, but the Penguin algorithm finds that it has been using spam links to try to manipulate the search results. The site may be penalized, or the spam links propping up the site may be devalued. In either case, the site will fall in the SERPs. So Google is willing to punish even sites with great content if they use spam to try to get to the top. With this update, Google seems to be indicating it wants to see only editorial links - not links that are paid for or user-generated. 

It is believed that Penguin, like Panda, may now be a part of Google's core algorithm. There may be no more Penguin updates, but if it really is a part of the core algorithm, sites that build spam links will be penalized faster than they would have been in the past. 

29. What is Google's Hummingbird and its Purpose 

Unlike Panda and Penguin, which were designed to combat low-quality content and link spam, Hummingbird is not an update and neither is it a penalty. It is a completely redesigned version of the Google algorithm itself. Panda and Penguin were more like plugins that were plugged into the algorithm; Hummingbird is a redesign of the entire algorithm. 

It is believed this was developed because more and more people were using speech recognition on their mobile devices to search for what they wanted instead of using a traditional keyboard to type their queries in. When someone uses a speech recognition app, they structure their query differently than they would when using a keyboard. 

Queries that are spoken into a mobile device tend to be longer and more specific, while queries that are typed tend to be shorter in general. Hummingbird was designed to better understand the queries people were using on mobile devices. It was not trying to combat any kind of spam or low-quality content. It attempts to understand context better, so the search results more closely match what a user is looking for when performing a voice search. 

So websites are not being penalized by Hummingbird. It is only the Google algorithm evolving and becoming better. 

30. What is RankBrain?

RankBrain is a machine learning system that Google uses to filter results and select relevant pages based on what users search for. Nowadays more and more people are using mobile devices to search, and RankBrain is designed to understand speech and give results that closely match what users are looking for. In the past, before RankBrain was developed, Google's algorithm was coded around several hundred parameters its engineers had chosen. The algorithm would rank pages based on their scores for each of those parameters. 

This algorithm was not capable of learning new information by itself. In other words, the algorithm would not change unless Google's engineers changed it. But RankBrain might be changing all of that. Google has released hardly any data on how it works, but one thing is clear from what they have made public: the system uses artificial intelligence and is capable of learning.

It is said to be able to interpret language almost the way humans do and can make very intelligent guesses about what users may actually be looking for. Computers are not designed to understand human speech. 

Input to computers normally needs to be structured in a very rigid way. RankBrain can interpret human speech and make intelligent guesses about what a human may be looking for when they speak into their phones. 

So the difference between Google's old algorithm and RankBrain is this: RankBrain learns and becomes better all by itself. Google does feed it data, and RankBrain might also be able to learn from all the content in Google's index. 

One thing is very clear: as Google starts incorporating more machine learning into its algorithm, search results will keep getting better. Google has actually said RankBrain is now its third most important ranking signal and that the system is handling a very large fraction of the hundreds of millions of voice queries Google receives every day. 

31. Google Penalties and the Link Disavow Tool

Google is a company with extremely smart engineers and its algorithm is very good. But still, there are spam websites that get through its filters. The Google algorithm, while extremely good, still cannot catch 100% of all spam. 

So they do have humans who check the results manually, looking for sites that violate the webmaster guidelines but have slipped through the filters. When they chance upon a page that is ranking higher than it really should, they penalize the site. 

There is no telling what the penalty might be. The site may be de-indexed, or simply demoted, which means it would fall in the SERPs but still remain in Google's index. 

Legal Reasons :

A manual penalty can be levied on a site for several legal reasons, and content can also be de-indexed for legal reasons. If a court orders Google to remove a certain result, it has to comply. 

Malware Virus or Trojan:

If Google finds that a certain site is infected with malware like a virus or a Trojan, it will be removed from the index. 

Spam Report:

A site's competitor can also file a spam report against a site they suspect is using black hat methods. In that case, the site will be reviewed manually, and if something is wrong it will be hit with a penalty. Manual penalties are different from Panda and Penguin penalties. 

Panda and Penguin penalties are algorithmic, while Google reviewers - humans - levy manual penalties. It's important to know how to tell the difference. You should be able to tell whether a certain site has been hit with an algorithmic penalty or whether a manual penalty has been levied on it. 

If your site receives a manual penalty, you will have a message in your Google Webmaster Tools. You will need to address the issues specified in the warning and then file a reconsideration request. Panda and Penguin penalties are algorithmic, and filing a reconsideration request will not be of any help there. You will want to remove all unnatural links and disavow those you cannot remove using the link disavow tool. 

Penguin penalizes sites with unnatural links, and disavowing these links using the disavow tool might help. But there is no telling how long it will take to lift a penalty. 

Panda penalizes sites with low-quality content. The disavow tool will not be of any help with Panda penalties, as the penalty has nothing to do with links. The disavow tool can help if a site has been penalized due to unnatural links; it can help with manual or Penguin penalties. 

Google adds an invisible nofollow to any link that you disavow using the disavow tool, and those links will not harm your site anymore. To use the tool, you will need to log in to your Google Webmaster Tools / Search Console. 

Now you will need to create a text file and add all of the links to the site that you want to disavow. This text file needs to be in a certain format. 

If you have lots of spam links, do not try to disavow individual links from a domain - disavow the entire domain. Here is the format for this file. 

Copy the format below:
___________________________________

#disavow links from spammy sites
domain:domainname

___________________________________

Do not prefix the domains with http:// or www. For example, to disavow all links from a spam domain like www.abc.com, the file would contain:

#disavow links from spammy sites
domain:abc.com


Many people include a lot of unnecessary text in the file, perhaps hoping to influence the reader in some way. Remember that including extra text can cause the request to be rejected, so do not include any documentation, a story or any explanation like that. The software that reads this file expects it to be in a certain format.  

Anything else is rejected. No person reads this file; software reads it. So the file needs to be a plain text file in the standard format - submitting a Microsoft Word document or a spreadsheet will not work. Once you have created the file, click on Disavow Links. A warning message will come up; click again, then choose the file and click the Submit button. 
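If you have a long list of spam domains to disavow, generating the file with a short script avoids formatting mistakes. A sketch in Python; the domain names are hypothetical examples:

```python
# Hypothetical spam domains collected from your link audit
spam_domains = ["spamsite1.com", "http://www.spamsite2.net"]

lines = ["#disavow links from spammy sites"]
for d in spam_domains:
    # The tool expects bare domains: strip any scheme or www prefix
    d = d.replace("https://", "").replace("http://", "")
    if d.startswith("www."):
        d = d[4:]
    lines.append("domain:" + d)

# Plain text, one directive per line - nothing else
with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```

The output is a plain text file in exactly the format the disavow tool expects, with no extra commentary that could get the request rejected.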

Why should you know this stuff? Well, there are people who may try to harm a site by pointing a lot of spam links at it. 

If that happens to your site and you see links in your Webmaster Tools that look like spam, you will want to know how to handle the situation. Hopefully all this information will help you in that regard. 

31. What are Canonical URLs?

Search engines might see the www and non-www versions of a website as separate websites. This was a major issue many websites faced in the past, and it can happen even now unless you handle it. It can cause serious duplicate content issues and seriously hurt your chances of ranking well in the SERPs. For example, your site may be reachable through four URLs like:

www.abc.com
www.abc.com/index.php
abc.com/index.php
abc.com

Such URLs may be treated as distinct pages by search engines. If you type each of these URLs into your browser's address bar and they come up with different pages, then search engines most likely see them as different pages as well, which may cause serious duplicate content issues.  
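A quick way to see whether such URL variants collapse to one canonical form is to normalize them and compare. A sketch in Python using only the standard library; abc.com is a placeholder domain and the normalization rules here are illustrative:

```python
from urllib.parse import urlsplit

def normalize(url):
    """Reduce a URL to a comparable canonical form:
    lower-case host without 'www.', path without a trailing index.php."""
    parts = urlsplit(url if "//" in url else "https://" + url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    path = parts.path
    if path.endswith("index.php"):
        path = path[: -len("index.php")]
    return host + (path.rstrip("/") or "/")

variants = ["www.abc.com", "www.abc.com/index.php",
            "abc.com/index.php", "abc.com"]
print({normalize(v) for v in variants})  # → {'abc.com/'}
```

On a live site, the fix is a 301 redirect from the non-preferred variants to the canonical URL, plus a rel=canonical tag on the page.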


32. Thin Affiliate Sites. 

Google wants every page in its index to add value to the web, and it does not like pages that are created only to earn money from ads or from selling products. There are several kinds of thin content. 

There is software that can scrape other people's content and create websites with thousands of pages in just a few minutes. Each of these pages is laden with ads for various products and services. Needless to say, Google hates them with a passion. Doorway pages are another kind of thin content. The aim of doorway pages is not to provide quality content that users may be looking for, but to drive them to another site as soon as they land on the page. 

They often crowd the search results for specific search phrases, aiming to maximize the traffic they can siphon off from the search engines. Some websites do not resort to doorway pages but fill their sites with pages that are almost all copies of each other, with very little unique content. Suppose a real estate marketing company created a website with separate pages for every city they offered their services in. 

The only difference between the different pages on the site would be the city name. Those pages would not be adding any value to the web in Google's eyes. And then there are thin affiliate sites. These are sites that just grab the merchants' data feeds and use them to populate their pages. The reason this is a problem is that there may be hundreds or even thousands of affiliates for every merchant - all doing the same thing.

They are all creating websites with exactly the same content because they all use the same feeds from the same set of merchants. That creates a lot of duplicate content, and none of these affiliate sites add any value to the web. 

There are also affiliate sites filled with content purchased from article banks, spun content, or even just scraped or syndicated content. These have no unique content of their own. 

The only aim of such sites is to get as much free traffic from the search engines as possible by having hundreds or even thousands of pages, none of which are unique, and to generate affiliate revenue from whatever traffic they may be getting.  

They copy content from other sites, put several affiliate links on each page, and then hope they will be able to earn money from such sites. 

Google does not hate affiliate sites. What it does not like is affiliate sites that add no value to the web. It wants all websites - affiliate sites included - to add value to the web. 

Google wants to see genuine reviews, usable information, pros and cons of various products and so on. A good example would be any Amazon page. Each product page has a detailed product description, images, and scores upon scores of reviews from real users who actually purchased the products. Each Amazon page provides lots of useful and unique information about a product. Google wants to see that kind of content. 

Instead, if a site scrapes content or grabs a merchant's data feed and creates pages off it, that kind of content is not unique and in many cases may not be useful either. On the other hand, an affiliate site with original descriptions, unique images or videos and detailed product reviews provides useful information that helps users make an informed buying decision, and Google values it. 

Affiliate sites that provide value perform well in Google. You would want to add your own insights, detailed product reviews, analyses, research, videos and images - the stuff that makes your content very helpful to users. 

You have to add content that your users like enough to bookmark, content that excites them. Those are the kinds of pages that can rank high in Google. 


32. Link Cloaking, iFrames and URL Shorteners.

Cloaking, or domain cloaking, is a black hat method where the content presented to the user is different from the content presented to the search engine spiders. Doing this will get a site banned by the search engines. Link cloaking, on the other hand, is a method where affiliate URLs that are very long and unsightly are made shorter and prettier. It also refers to various methods used to hide affiliate links from both search engines and site users. 

Some people use URL shorteners like goo.gl, TinyURL and others to shorten their affiliate links. You would not want to use one of those URL shorteners to shorten your Amazon affiliate URLs. 

You would not want to use a plugin like Pretty Links or any URL shortener for this. It is against the Amazon terms of service, and you would not want to break those terms. 

There are other, much more sophisticated methods people use to hide their URLs from search engines and humans. iFrames are subpages within web pages, and they are used to hide affiliate links. You can use the iFrame HTML tag to create frames. 

Using iFrames this way is a very sneaky technique that can be used to fool unsuspecting people. If someone uses this method to cloak Amazon affiliate links, they will be banned from the program. You would not want to use any such method or a variation of it. There is a lot of software that promises to help you boost your affiliate earnings using such clever methods; you would want to stay away from it. 

Do not attempt to hide your Amazon affiliate links from your site visitors. Amazon does not want its affiliates to hide affiliate links in any way. Remember, Amazon has a built-in URL shortener; if you want to shorten your affiliate links, use that and not any third-party software. Frankly, you would not want to shorten your Amazon affiliate links at all, as that might affect your CTR. 

33. What Broken Links Mean and How to Fix Them 

As far as possible, there should not be any broken links on any page of your site. This issue does not get a lot of attention from many SEOs. 

Every page on your site should link to several very relevant, high-authority external sources. But over time, sites go offline, pages move and sites get restructured, so the pages you linked to may move or simply disappear. Broken links are natural and bound to occur over time. 

But a page that has a lot of broken links does not provide a good user experience and clearly signals that the page is not actively maintained. 

So pages with a lot of broken links tend to fall in the rankings. You can find any broken links on your blog or site by using Google Search Console (Webmaster Tools).

If there are any broken links on your site, go to Search Console, then Crawl, then Crawl Errors; it will show you whether your site has any broken links. Every few months you should run this check and see if your site has any broken links. 
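Besides Search Console, you can spot-check your outgoing links with a short script. A sketch in Python using only the standard library; the URL in the usage comment is a placeholder:

```python
import urllib.request
import urllib.error

def check_link(url, timeout=10):
    """Return (url, status): an HTTP status code, or an error string
    for dead domains, timeouts and similar failures."""
    req = urllib.request.Request(
        url, method="HEAD", headers={"User-Agent": "link-checker/0.1"})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return url, resp.status
    except urllib.error.HTTPError as e:
        return url, e.code          # e.g. 404 for a broken page
    except urllib.error.URLError as e:
        return url, str(e.reason)   # DNS failure, timeout, etc.

# Hypothetical usage:
# for url, status in map(check_link, ["https://example.com/some-page"]):
#     if status != 200:
#         print("broken:", url, status)
```

Anything that does not come back as 200 deserves a manual look: 404s are broken links, while redirects may just mean the link needs updating.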

35. Domain Registration Length

Does it matter how long your domain registration length is? 

There are a lot of SEOs who believe it does and many who believe it does not. Does the age of your domain matter? Most SEO experts believe it does, and some believe it does not really matter. But you would want to err on the side of caution and register your domain for at least 3 years. 

It is possible that domain registration length is a signal Google looks at. Google looks at hundreds of signals, and it's very possible that domain registration length is one of them. Most of the highly trusted brands online register their sites for at least 3 to 5 years. 

Most companies who are serious about their websites register them for 3 years or more, and many of the highly reputed brands online register their domains for as long as 10 years. 

On the other hand, almost all spammers who build churn-and-burn sites register their sites for one year. Most PBN domains are registered for one year. 

These people are worried their domain might get a penalty within a year and so do not want to risk their money by registering their domains for more than a year. Spammers almost never register their domains for 3 years. The methods they use are very risky, and they know they are certain to get a penalty sooner or later.

If the domain does not get penalized within a year, they renew it. If it does get penalized, they let it expire. That's how they operate, and Google knows all this. It might also matter where your site is hosted; most spam sites are hosted on shared hosting. 

A lot of PBN domains are hosted on SEO hosting. This does not mean Google sees all sites on shared hosts as spam sites. It's widely believed that Google looks at the neighborhood of a site - the sites hosted on the same server, the sites that link to the site, and the sites that are linked to from the site. So it would be good to register your domain for at least 3 years. 

It doesn't hurt to host your site on a shared host, but once your site starts generating revenue, you might want to move it to a cloud host or a VPS. That would be good for your site speed as well. 

36. SEO Expert Checklist: Things that can hurt Your site 

Keyword stuffing / Over-optimization 

Using your focus keyword too many times in the copy will not work. It's better to err on the side of caution and under-optimize your content; otherwise your website may get slapped by Google with an over-optimization penalty.

Exact Match Domains (EMDs)

Google does not penalize exact match domains just because of the domain name. The EMD update, which rolled out in September 2012, targeted EMD sites with low quality content. Even so, building links to an EMD is like walking a tightrope.

Building an authority site around an EMD is not easy either; the domain name limits how big your site can grow.

Using Your Keyword in Multiple H1 Tags

Using your keyword in multiple H1 tags is another very bad idea. This used to work well in the past, but it is no longer regarded as good practice. Google does not penalize a page just because it has a few H1 tags, but some people put their focus keyword in every H1 tag and have several such tags on the page. That is a big no-no.

Trust me, it is a horrible idea to have several keyword-laden H1 tags on a single page.

For a site that displays several posts on the front page, it might be alright to have 10 to 20 H1 tags on a single page, since the posts cover different topics and are unrelated to each other.

But on an affiliate site like the ones we are building, all of the pages cover closely related topics. If you really have to use multiple H1 tags on a page for some reason, you would not want to use CSS to style them to look like normal text. Multiple H1 tags by themselves do not invite a penalty, but abusing the H1 element surely does.
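As a hypothetical sketch (the page topic and headings below are made up for illustration), a sane heading structure keeps a single H1 and demotes the subsections to H2:

```html
<!-- Bad: several keyword-stuffed H1 tags on one page -->
<h1>Best Running Shoes</h1>
<h1>Best Running Shoes for Men</h1>
<h1>Best Running Shoes for Women</h1>

<!-- Better: one H1 for the page topic, H2s for the subsections -->
<h1>Best Running Shoes</h1>
<h2>Top Picks for Men</h2>
<h2>Top Picks for Women</h2>
```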

Reciprocal Links

"I link to you and you link to me" does not work anymore. Too many reciprocal links can get a site penalized now.

On-Site Duplicate Content:

Not all duplicate content is bad. Syndicated content does not get penalized, but on-site duplicate content can hurt a site's ability to perform well in the SERPs.

Grammatical Errors and Misspellings

Search engines do not penalize sites for misspellings or grammatical errors, but if a site has too many errors of this kind, they might conclude it does not deserve to rank well. Targeting misspellings of high-traffic keywords used to be a good SEO technique: you could get a good deal of traffic by using misspelled words in your title tag and H1 tags. But you would not want to do that now.

Duplicate Title Tags and Meta Descriptions

Duplicate title tags and meta descriptions are another kind of on-site duplicate content and can hold a site back in the search engines.
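To illustrate (the page names and wording here are made up), every page should get its own title tag and meta description rather than sharing one site-wide pair:

```html
<!-- page-one.html -->
<title>How to Choose a Domain Name</title>
<meta name="description" content="A beginner's guide to picking a domain name for your new site.">

<!-- page-two.html -->
<title>How to Speed Up a WordPress Site</title>
<meta name="description" content="Practical tips for improving WordPress load times.">
```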

Link Cloaking, URL Shorteners and iFrames

Hiding affiliate links is not something Google likes, but it is not something Google penalizes either. Hiding Amazon affiliate links, however, can mean getting banned from the Amazon affiliate program.

Thin Content 

The Panda algorithm was created to punish sites with thin content that adds too little value. Too much useless content on a site can get the site penalized.

Ads Above the Fold

This is another bad practice and can result in a Google slap.

Buying links on public blog networks :

Google Penguin goes after sites selling links, and sites buying links are not spared either. Google frequently takes down entire link networks for indulging in shady practices, and sites that have links from such networks get hurt too. It may be tempting to sell links once your site becomes popular, but you would not want to do that on your own blog either.

Not having an XML sitemap:

Not having an XML sitemap might mean not all of your pages get crawled. This can be an issue with bigger sites that have a lot of pages.
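A minimal sitemap.xml follows the sitemaps.org protocol and looks like the sketch below (the example.com URLs and dates are placeholders); you can then submit the file to Google through Search Console:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-04-27</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
  </url>
</urlset>
```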

No Anchor Text Diversity:

Too many links with exact-match and partial-match anchors will almost certainly hurt a site's ability to perform well, and might earn it a penalty as well.

Not Linking to Relevant, High Authority Sites:

If you link to several very relevant, high authority sites, Google starts seeing your own site as possibly being important in the niche.

Broken links 

Broken links are a sign of a site that is not actively maintained, and that can be a bad thing.
