December 17, 2012

Density of backlink keywords

At the beginning of last month I was on holiday in Namibia, spending my cash (before departure somebody had compromised my credit card) on my favorite entertainment. When I got back, I found that the toolbar PageRank contained new values. I made a quick check to see how my sites had passed this exam, and I counted 18 sites with PR5, with the others (I have about 85 at the moment) between 3 and 4. A nice reward for the effort I put in. But I already knew that this success would not be reflected in the SERPs, and unfortunately I was right. Finally, though, I think I know where the catch is.

None of the measures and procedures I undertook in the past period gave me any certainty that I was on the right track, and I could not attract the much-desired traffic from search engines. I continued with tests, verifications and checks, but also with research in the forums. In conclusion, I think I have figured out the cause of my trouble. To tell the truth, I already knew the concept, but I had missed some parameters.

With the Panda update in April 2012 (I am beginning to hate that animal: small, ugly and lazy), Google introduced penalties for sites that are over-optimized. There are two kinds of optimization, internal and external to the site itself. The internal one mainly refers to the construction of the site and the concentration of keywords on the page, while the external one is seen as the density of the keywords with which the page in question is linked. I guess the concept is as follows:
  • Let's say the accepted limit of keyword density is about 4%; if you go over it, a penalty is immediately triggered. Google divides the actual percentage by the limit, thus obtaining the internal optimization coefficient. The maximum is 1.
  • Considering all the keywords with which the site is linked, the percentage for each one is calculated. Assuming that all the links use the same keyword, you get 100%. Dividing by the physical limit, 100%, you obtain in this case an external optimization factor equal to 1. It would seem that beyond the value of 0.5 (i.e. 50%; in the past I assumed 70%, and that was my mistake) you take a penalty automatically on this item alone.

Moreover, multiplying or adding these two values gives a single number for the page. If this number exceeds a certain limit (I have no idea of its value), the page takes the penalty and moves back in the rankings. My weak point is the backlinks, and I'm trying to dilute their keyword density.
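The reasoning above can be sketched in a few lines of Python. To be clear, this is only my guesswork turned into code: the 4% on-page limit, the 50% anchor-text trigger, the capping at 1, and the idea of counting the most common anchor are all assumptions from this post, not known Google behavior.

```python
from collections import Counter

# Hypothetical thresholds from the post, NOT documented Google values.
ON_PAGE_DENSITY_LIMIT = 0.04   # assumed ~4% keyword density limit on the page
ANCHOR_PENALTY_TRIGGER = 0.5   # assumed 50% anchor-text share that triggers a penalty

def internal_coefficient(keyword_count, total_words):
    """On-page keyword density divided by the assumed 4% limit, capped at 1."""
    density = keyword_count / total_words
    return min(density / ON_PAGE_DENSITY_LIMIT, 1.0)

def external_coefficient(anchor_texts):
    """Share of backlinks using the most common anchor text (100% -> 1.0)."""
    counts = Counter(anchor_texts)
    top_share = counts.most_common(1)[0][1] / len(anchor_texts)
    return top_share  # dividing by the 100% limit leaves the share unchanged

# Example: 30 keyword hits in a 1000-word page, and 6 of 10 backlinks
# sharing the same (hypothetical) anchor text.
inner = internal_coefficient(30, 1000)   # 0.03 / 0.04 = 0.75
outer = external_coefficient(
    ["blue widgets"] * 6 + ["example.com", "click here", "widgets", "home"])
print(inner, outer)                      # 0.75 0.6
print(outer > ANCHOR_PENALTY_TRIGGER)    # True: over the assumed 50% mark
```

In this sketch the example site would be safe on internal optimization alone but over the assumed external threshold, which matches the diagnosis in the post: the backlink anchors, not the pages, are what needs diluting.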

Sometimes one is not really sure whether a penalty has been assessed or whether the others simply worked better and are ahead in the rankings. An aid to understanding the situation is a visit to a site whose existence I discovered recently: a branch of Google that does not apply the Panda filter. If your site ranks much higher there than on Google itself, you can be almost certain that you are being punished, and your life will become very hard in your attempts first to understand the problem and then to overcome it.