December 17, 2012

Keyword density of backlinks

At the beginning of last month I was on holiday in Namibia, spending my cash (before departure somebody compromised my credit card) on my favorite entertainment. When I got back, I found that the toolbar PageRank had been updated with new values. I made a quick check to see how my sites had passed this exam, and I counted 18 sites with PR5, with the others (I have about 85 at the moment) between 3 and 4. A nice reward for the commitment I put in. But I already knew that this success would not be reflected in the SERPs, and unfortunately I was right. Finally, though, I think I know where the catch is.

All the measures and procedures I undertook in the past period gave me no certainty that I was on the right track, and I could not attract the much-desired traffic from search engines. I continued with tests, verifications and checks, but also with research in the forums. In the end I think I have figured out the cause of my trouble. To tell the truth, I already knew the concept, but I had some of the parameters wrong.

With the Panda update (I am beginning to hate that animal: small, ugly and lazy) in April 2012, Google introduced penalties for sites that are over-optimized. There are two kinds of optimization, internal and external to the site itself. The internal one mainly refers to the construction of the site and the concentration of keywords on the page, while the external one is measured as the density of keywords with which the page in question is linked from outside. I guess the concept works as follows:
  • Let's say the accepted limit of on-page keyword density is about 4%; if you go over it, a penalty is triggered immediately. Google divides the actual percentage by the limit, and thus obtains the internal optimization coefficient. The maximum is 1.
  • Considering all the keywords with which the site is linked, the percentage for each one is calculated. If all the links use the same keyword, you get 100%. Dividing by the physical limit of 100% gives, in this case, an external optimization factor equal to 1. It would seem that beyond a value of 0.5 (i.e. 50%; in the past I assumed 70%, and that was my mistake) you take a penalty automatically on this item alone.

Moreover, multiplying or adding these two values gives a single score for the page. If this number exceeds a certain limit, and I have no idea what that limit is, the page takes the penalty and moves back in the rankings. My weak point is the backlinks, and I'm trying to dilute their keyword density. A rough sketch of how I imagine this calculation appears below.
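To make the idea concrete, here is a minimal sketch in Perl (the language I already use for my scripts) of how such a score might be computed. The 4% density limit, the 0.5 backlink threshold and the way the two factors are combined are only my own guesses, not anything Google has published.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use List::Util qw(max);

    # My guessed limits: 4% on-page keyword density, and a 0.5 share of
    # backlinks using the same anchor text before a penalty kicks in.
    my $DENSITY_LIMIT  = 4.0;    # percent
    my $ANCHOR_LIMIT   = 100.0;  # percent (the "physical" maximum)
    my $ANCHOR_PENALTY = 0.5;    # share beyond which I suspect a penalty

    # Internal coefficient: actual density divided by the limit, capped at 1.
    sub internal_coefficient {
        my ($density_percent) = @_;
        my $c = $density_percent / $DENSITY_LIMIT;
        return $c > 1 ? 1 : $c;
    }

    # External factor: share of backlinks that use the most common anchor
    # keyword, divided by the 100% maximum.
    sub external_factor {
        my (@anchor_counts) = @_;    # counts per distinct anchor text
        my $total = 0;
        $total += $_ for @anchor_counts;
        return 0 unless $total;
        my $top_share = 100.0 * max(@anchor_counts) / $total;
        return $top_share / $ANCHOR_LIMIT;
    }

    # Example: 3.2% on-page density, 70 of 100 backlinks with the same anchor.
    my $internal = internal_coefficient(3.2);
    my $external = external_factor(70, 20, 10);
    my $combined = $internal * $external;    # or $internal + $external, who knows

    printf "internal=%.2f external=%.2f combined=%.2f\n",
           $internal, $external, $combined;
    printf "external factor alone is over my %.1f threshold\n", $ANCHOR_PENALTY
        if $external > $ANCHOR_PENALTY;

With these example numbers the internal coefficient is 0.80 and the external factor 0.70, already above my assumed 0.5 threshold on its own.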

Sometimes you cannot really tell whether you have been hit by a penalty or the others simply did a better job and are ahead of you in the rankings. One aid to understanding the situation is to visit googleminusgoogle.com, whose existence I discovered recently. It is a branch of Google that does not apply the Panda filter. If your site ranks much higher there than on Google itself, you can be almost certain you have been punished, and your life will become very hard as you try first to understand the problem and then to overcome it.

July 13, 2012

Continuous Updates

In my previous mental rumination I described my problems with a site frowned upon by Mister G. It's been 6 months since I made a freeware program available to my visitors, enriching the content and the service offered, hoping to improve the image of my website in the eyes of the inspectors, but so far there is no improvement. In fact I have no way of knowing whether anyone has checked the content, and I don't know if it is possible to force the verification. Meanwhile, another of my sites has deteriorated: from a habitual second or third position on the first page of search results, I slipped first to fifth and then to sixth. With this change, I went down from nearly 1000 visitors from Google per day to about 300: bad and very discouraging. I started to study, and one of the main recommendations I have found is to have a dynamic site whose content changes often and which is updated constantly.

But I'm not a professional webmaster, and tending to this job every day is not my wish, partly because I'm a bit lazy and partly for lack of free time. I reflected on the problem, looking for a solution that would suit me, and I found a way. On the home page I have included an SSI (server side include) directive tied to a small program, written in the Perl programming language, that changes one paragraph on the page every day. I keep a text file in which each line is a paragraph, and whenever I find a little motivation I enrich this file so that the content is richer and less repetitive. In this way the page is different every day, even if the content repeats after a while, because so far I have written only 16 sections. This also has another purpose: to force Google to visit my page more often and to keep the Google cache of my page always very fresh. A sketch of this kind of script follows.
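For illustration, here is a minimal sketch of such a rotation script. The file name paragraphs.txt and the SSI directive shown in the comment are just example assumptions, not my real configuration.

    #!/usr/bin/perl
    # rotate.pl - print one paragraph from paragraphs.txt, chosen by the day
    # of the year, so the page shows a different paragraph each day.
    # Included from the home page with an SSI directive such as:
    #   <!--#include virtual="/cgi-bin/rotate.pl" -->
    use strict;
    use warnings;

    my $file = 'paragraphs.txt';    # one paragraph per line (assumed name)

    open my $fh, '<', $file or die "Cannot open $file: $!";
    chomp(my @paragraphs = <$fh>);
    close $fh;

    # Day of the year (0..365) modulo the number of paragraphs picks today's one.
    my $day   = (localtime)[7];
    my $index = $day % scalar @paragraphs;

    # Emit a CGI header only when the script is actually run as a CGI.
    print "Content-type: text/html\n\n" if $ENV{GATEWAY_INTERFACE};
    print $paragraphs[$index], "\n";

Since the choice depends only on the date, the same paragraph appears for the whole day and the rotation repeats after as many days as there are lines in the file.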

I introduced this innovation about 2 months ago and have not yet seen any material consequence, so I am about to make another change. I also noticed that the distribution of my keywords is not always uniform in the text; they should appear a bit at the beginning, in the middle and at the end of the content, as most experts recommend. So I have to work on a good remedy for this defect. Lately I found another piece of information that could be valuable: some years ago the minimum amount of text needed to be taken seriously was about 300 words. That amount has increased since then, and I try to keep my content at about 400, even 550 words, but some sources claim the figure is now about 900 words, or even 1100.

It seems a stretch to me, since quantity often comes at the expense of quality. Getting lost in a maze of words just to say that white is the opposite of black is useless, but if Mr. Searcher wants it, I will try to accommodate him. Even this means a lot of work and time spent, so I will choose two or three sites on which to verify the usefulness of this parameter. You can see from this post that I am already stretching my language, a little training for the more general task that awaits me.

Bad things are happening: in the world, the economic and financial crisis spreads a little more each day, while I am in a constant battle with a virtual thing that does not physically exist; it is only a piece of code, a miserable two bytes, that I am trying to understand and uncover.