April 10, 2014

Webmaster on vacation

There hasn't been much news lately on the webmaster front, so I decided to tell you a story related to the themes of this blog. Here it is!

A successful webmaster, who had become rich thanks to his job, decided to take a year off from work. He had always wanted to sail around the world in a luxury yacht. There was no need to buy one; he decided to charter it. Obviously he couldn't leave his business unmonitored, so he took his laptop with him, and for the remote stretches in the middle of the ocean, where an internet connection was impossible, he bought the coolest satellite phone on the market. Both the phone and the connection were really expensive. Luckily he had no financial problems.

The journey started from the west coast of the United States. For the first two months everything was relaxing; he was in no rush. He was genuinely happy about his decision to take some time off, and elated about his new page rankings. He realized how productive it was to work in such a beautiful atmosphere. Thanks to his new spiritual energy, his creativity grew stronger too. He was in the middle of the Pacific Ocean when his yacht was overturned by a tropical storm. He immediately understood there was nothing to do but put on a life jacket, and so he did. That was the last thing he remembered. When he opened his eyes he was lying on white sand, and he realized he was on a remote island somewhere in the ocean.

He lived on the island struggling against hunger and the weather; the nights were cold and the days hot, but he kept hoping that one day someone would find him. After six months the hope was fading and he was getting desperate. Being a rational man, well aware of his situation, he decided to build a hut to protect himself from the elements. One morning he woke up and saw, right in front of his eyes, a beautiful, tanned, blonde woman. She explained that she too was a castaway, living on the other side of the island.

After a two-hour walk they reached the hut the woman had built, and inside everything was luxurious! No branches and palm leaves, but metal and plastic panels. She also had a fridge, a gas cooker, a table, chairs and plenty of kitchen tools. In the bedroom there was a bed with a mattress and a wardrobe, and behind the house a diesel generator produced electricity. She explained to the man that everything had been salvaged from the ship she had been traveling on, which sank two years earlier.

The man, amazed by this unexpected miracle, took a shower, ate something and took a nap in the hammock under a palm tree. When he woke up, the woman was standing in front of him, completely naked. She told him:
- Now you will get what you have been dreaming about for the past six months alone on this island.
He, excited, replied:
- Really? I can't believe it! You also have an Internet connection?

October 31, 2013

Unfair rules

The summer has passed and the days are getting shorter and colder, so many of us are going back to work with new enthusiasm. During the summer I took two separate two-week vacations: a trip to China and Tibet, and a holiday by the sea in Croatia, under a beach umbrella, enjoying clean water and a good Croatian beer called "Ozujsko pivo". So my batteries are charged for the autumn and the coming winter.

When I got back home, I found a very bad surprise: two domains from which I had a lot of links had been closed; the webmasters hadn't renewed them. So I lost a few dozen links, which means I will have to work very hard to recover them. This happens rather often. Just as in life, there are no fixed points on the Internet you can count on forever. As for my search engine positions, there was not much news; many of my pages were still penalized, and I want to say something about the way Google applies penalties.

My modest opinion is that each webmaster bears full responsibility for his own site, for the content and the links present on its pages, but cannot be held responsible for factors external to the site itself. For some time I had heard rumors about Google penalizing sites for bad incoming links. I always refused to believe them, but my sites kept sliding, and finally I decided to check whether there was any truth in that talk. Two of my sites were doing very badly in the SERPs and I had nothing to lose, so in April I removed some links that could be considered toxic. After three weeks I noticed an improvement in my rankings. I was happy about the improvement, but very unhappy about the discovery I had made.

To have final proof, I should now try pointing a couple of wicked links at one or two sites and see whether their positions go down. And this is the main problem, because this interpretation leads straight to a global war against our competitors. Wouldn't it be enough to simply ignore bad links instead of applying a penalty? That way nobody could hurt other sites, and dishonest webmasters would only waste their time acquiring links that pass no value.

Latest news from Google

Google's spokesman Matt Cutts said recently that the PageRank Toolbar will not be updated this year, and avoided answering the question about an update next year. This announcement immediately led some webmasters to conclude that Google's page ranking is conceptually dead, a suspicion that comes back regularly. But one can draw a different conclusion: that Google wants to hide its real intention and give this parameter more weight in the future, while leaving the old PageRanks visible on the Toolbar so webmasters cannot know the real worth of their sites.

April 21, 2013

Disappointments continue

Thanks to my own activities on the Internet, well, actually more the past ones than the recent ones, I don't feel the European and global financial crisis. I bought my own apartment and a car (it's from 1999 but still runs), and I have some savings in the bank. Recently the revenues from my sites have declined, but I'm not complaining; my past work allows me to live well. A few months ago I noticed the problems my sites are running into, but I'm trying not to give the matter too much importance. I love my job and I'm grateful for that; I don't think too much about its final purpose, I mean the money.

One of my activities that continues without interruption is the search for new backlinks from domains I have never used before. Two days ago I found a great source, good for thousands of links, each with a PageRank of some value. I firmly believe backlinks, together with PageRank, will regain the importance they had in the past; it actually seems to me this is the only practical way to distinguish one site from another. The devices Google has adopted in the last year don't look very effective to me and don't improve the search results.

Recently I discovered the penalties inflicted on sites that have advertising banners at the top of the screen. According to one of my analyses, Google applies sanctions even when the top of the page holds a big image. The excuse for this is the supposed complaints from users who don't like scrolling down to reach the topic they're interested in. On some sites the user doesn't have to drag the scrollbar because all the information he needs is already visible; in other cases, with a little physical effort, he can find what he was looking for. I think there is a bit of a mess in all of this.

Not being able to reach the visitors who arrive from the major search engines, I try to get them from other sources, for example from my sites that carry no sponsors or paid advertising of any kind; they have a small number of visitors but also some decent SERP placements. It's clear that users who visit those pages are not looking to buy anything, but sometimes they can still be a source of income; it's like building a house, one brick on top of another, and at the end of the work you have your own home.

Another method I use is to subscribe to top lists and try to reach the first positions by clicking manually, sometimes even through proxies. I admit it's not correct, but as the Bible says, "let him who is without sin cast the first stone". Am I right? Everything I do is aimed at being on the first page of the most important search engine in the world, waiting to be judged positively and to keep my lifestyle the same as it was in past years.

December 17, 2012

Density of backlink keywords

At the beginning of last month I was on holiday in Namibia, spending my cash (before departure somebody had compromised my credit card) on my favorite entertainment. When I got back, I found that the toolbar PageRank contained new values. I made a quick check to see how my sites had passed this exam, and I counted 18 sites with PR5, with the others (I have about 85 at the moment) between 3 and 4. A nice reward for the effort I put in. But I already knew that this success would not be reflected in the SERPs, and unfortunately I was right. Finally, though, I think I know where the catch is.

All the measures and procedures I had undertaken in the previous period gave me no certainty that I was on the right track, and I could not attract the much-desired traffic from the search engines. I continued with tests, verifications and checks, but also with research in the forums. In the end I think I figured out the cause of my trouble. To tell the truth I already knew the concept, but I was missing some parameters.

With the Panda update (I am beginning to hate that animal, small, ugly and lazy) in April 2012, Google introduced penalties for sites that are over-optimized. There are two kinds of optimization, internal and external to the site. The internal one mainly refers to the construction of the site and the concentration of keywords on the page, while the external one is measured by the density of the keywords with which the page in question is linked. I guess the concept works as follows:
  • Let's say the accepted limit of keyword density is about 4%; if you go over it, a penalty is triggered immediately. Google divides the actual percentage by the limit, obtaining the internal optimization coefficient. The maximum is 1.
  • Considering all the keywords with which the site is linked, the percentage for each one is calculated. Assuming all the links use the same keyword, you get 100%. Dividing by the physical limit, 100%, gives in this case an external optimization factor equal to 1. It would seem that beyond a value of 0.5 (i.e. 50%; in the past I assumed 70%, and that was my mistake) you take a punishment on this item alone.

Moreover, multiplying or adding these two values gives a single number for the page. If this number exceeds a certain limit (I have no idea what that limit is), the page takes the punishment and moves back in the rankings. My weak point is backlinks, and I am trying to dilute their keyword density.
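
For anyone who wants to see the arithmetic, here is a minimal sketch of how I picture the calculation; all the numbers, the 4% limit and the 0.5 threshold included, are my own assumptions, not confirmed values:

  #!/usr/bin/perl
  use strict;
  use warnings;

  # Invented example values; only the structure of the calculation matters.
  my $page_density  = 0.03;   # the keyword makes up 3% of the words on the page
  my $density_limit = 0.04;   # assumed on-page limit of about 4%
  my $anchor_share  = 0.80;   # 80% of incoming links use the same keyword
  my $anchor_limit  = 1.00;   # the physical limit is 100%

  my $internal = $page_density / $density_limit;   # internal optimization, max 1
  my $external = $anchor_share / $anchor_limit;    # external optimization, max 1

  printf "internal optimization factor: %.2f\n", $internal;
  printf "external optimization factor: %.2f\n", $external;
  printf "combined (product):           %.2f\n", $internal * $external;

  # By my reading, going over about 0.5 on the external factor alone
  # is already enough to trigger a penalty.
  print "external factor above 0.5: penalty risk\n" if $external > 0.5;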

Sometimes you are not really sure whether you have been hit by a penalty or the others have simply worked better and are ahead of you in the rankings. One aid to understanding the situation is to visit googleminusgoogle.com, whose existence I discovered recently. It is a branch of Google that does not apply the Panda filter. If your site ranks much higher there than on Google itself, you can be almost certain you have been punished, and your life will become very hard, first in understanding the problem and then in overcoming it.

July 13, 2012

Continuous Updates

In my previous mental rumination I described my problems with a site frowned upon by Mister G. It's been six months since I made a piece of freeware available to my visitors, enriching the content and the service offered, hoping to improve the image of my website in the eyes of the inspectors, but so far there has been no improvement. In fact I have no way of knowing whether anyone has checked the content, and I don't know if it is possible to force a review. Meanwhile, another of my sites has deteriorated: from a habitual second or third position on the first page of search results, it dropped first to fifth and then to sixth. With this change I went from nearly 1000 visitors from Google per day to about 300: bad and very discouraging. I started to study, and one of the main recommendations I found is to have a dynamic site that changes its content often and is updated constantly.

But I'm not a professional webmaster, and tending to this job every day is not my idea of fun, partly because I'm a bit lazy and partly for lack of free time. I thought the problem over, looking for a solution that would suit me, and I found a way. On the home page I included an SSI (server-side include) tied to a small program written in Perl that changes one paragraph on the page every day. I keep a text file where each line is a paragraph, and whenever I find a little motivation I enrich this file so the content becomes richer and less repetitive. In this way the page is different every day, even if the content repeats after a while, because for now I have written only 16 sections. This also serves another purpose: to force Google to visit my page more often and to keep the Google cache of my page always fresh.
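
My own script differs in the details, but the idea is roughly this (a minimal sketch; the file name paragraphs.txt and the way the paragraph is chosen are just examples):

  #!/usr/bin/perl
  use strict;
  use warnings;

  # Called through the SSI include on the home page; prints one paragraph,
  # chosen by the day of the year, so the visible text changes every day.
  open my $fh, '<', 'paragraphs.txt' or die "Cannot open paragraphs.txt: $!";
  my @paragraphs = grep { /\S/ } <$fh>;
  close $fh;

  my $day   = (localtime)[7];              # day of the year, 0..365
  my $index = $day % scalar @paragraphs;   # wraps around when the file is short

  print $paragraphs[$index];

On the page itself the include looks something like <!--#include virtual="/cgi-bin/rotate.pl" -->, assuming the server is configured to execute it.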

I introduced this innovation about two months ago and so far I haven't seen any tangible result, so I'm about to make another change. I also noticed that the distribution of my keywords is not always uniform in the text, that is, a bit at the beginning, in the middle and at the end of the content, as most experts recommend. So I have to work on a good remedy for this defect. Lately I found another piece of information that could be valuable: some years ago the minimum amount of text needed to be taken seriously was about 300 words. This amount has grown since then, and I try to keep my content at about 400, even 550 words, but some sources claim the figure has become about 900 words, or even 1100.

It seems a stretch to me, since quantity often comes at the expense of quality. Getting lost in linguistic mazes just to say that white is the opposite of black is useless, but if Mr. Seeker wants it, I will try to accommodate him. Even this means a lot of work and time spent, so I will pick two or three sites to test the usefulness of this parameter. You can see from this post that I'm already forcing the language to stretch it out, a little training for the more general task that awaits me.

Bad things are happening: in the world, the economic and financial crisis spreads a little more each day, while I'm in a constant battle with a virtual thing that does not physically exist, that is only a piece of code, a miserable couple of bytes I am trying to understand and decipher.

November 11, 2011

HTML elements and text

Since PageRank influences search results much less than it did a few years ago, after my last post I focused my work on other aspects that matter for placement in the search engines. Google's last update came at the end of June this year, and at the time I was on vacation, two wonderful weeks in South Africa paid for with the money I earned from my work on the Internet. Back home, I found an e-mail from a friend warning me about what had happened, namely that Google had updated the toolbar PageRank. A quick browse through my sites gave me great pleasure; the new PageRanks were substantially better than the previous ones. I went so far as to have 5 sites with PR5. But since the final aim is not PageRank, I went to check my positions in the search results. What a disappointment: most sites had lost many positions, in some cases going from the first to the fourth page of the SERPs.

Several days later I noticed that almost all the PageRanks had gone back to their old values. I had to say bye-bye to all five of my PR5s. What the hell is going on? Perhaps the people in the forums had seen some light, understood something I couldn't see. But nothing there either, only more confusion. In fact, even in the forums I found a gloomy atmosphere, cold and despairing, because nobody understood what had happened, or nobody wanted to reveal useful information. Hundreds of hypotheses on how to proceed, on what to do, but no certainty, no confirmation that one approach is better than another. In the end I was left alone with myself, and I busied myself comparing the top results for different keywords, trying to work out the elements and methods I should introduce to strengthen my sites. Without explaining how and why, here are two insights I am currently applying, hoping to see a positive outcome.

I remembered a conversation some time ago with an acquaintance who insisted on the importance of the various HTML elements on a page. In fact, looking at some well-placed pages, I seem to have found some confirmation of this. So I am trying to introduce new HTML tags into my pages, such as UL, EMBED, CITE and so on, trying to make the pages' markup richer. It also means more complication for me. Personally I don't like it, because I believe that beauty lies in simplicity, but I must try to adapt to the situation if I want to take another nice trip, or a long vacation.

As for the content of the page, with the usual care to keep certain keywords present, I try to repeat other words as little as possible, using plenty of synonyms and thus obtaining a greater linguistic diversity. I remember that at school the English teacher always recommended avoiding repetition of the same words. I wrote a small program in Visual Basic that counts the words within a text, and I noticed that in some cases I was repeating words that had nothing to do with my main keywords. The worst thing is that those words had a density very close to the density of my keywords. So, as in finance, diversification of coding elements, of the words in the text and of the site's content is very important.
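
My tool is in Visual Basic, but the same count can be sketched in a few lines of Perl (just an illustration, not the actual program):

  #!/usr/bin/perl
  use strict;
  use warnings;

  # Count how often each word occurs in a text file and print its density,
  # so the accidentally repeated words stand out next to the real keywords.
  my $file = shift or die "usage: $0 text.txt\n";
  open my $fh, '<', $file or die "Cannot open $file: $!";
  my $text = do { local $/; <$fh> };
  close $fh;

  my @words = map { lc } $text =~ /([A-Za-z']+)/g;
  my %count;
  $count{$_}++ for @words;

  my $total = @words;
  die "No words found\n" unless $total;
  for my $word (sort { $count{$b} <=> $count{$a} } keys %count) {
      printf "%-20s %4d  %5.2f%%\n", $word, $count{$word}, 100 * $count{$word} / $total;
  }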

I almost forgot! You know that today's date should be very lucky: there are six number 1s in it. At least the Chinese think so. Good luck to all of you.

May 19, 2011

Googleations

I'm crazy about new inventions in the field of language, and I was very happy when I got the idea for the title. But had someone else already had this illumination? Easy to check with the gentleman named in the heading. The fact is that many had this inspiration before me, but I'll keep it anyway, because I swear it is an original product of my brain. These days we are more or less one year on from the great renewal Google made to its algorithm, and we can sum up the results. There are two main aspects to consider: our point of view, the webmasters trying to place our masterpieces on the first page, and the experience of the other surfers, who are simply seeking the information they need for their work or, even better, for fun. Webmasters have digested the changes and adopted the new rules. For some it went well, but many still have not been able to regain the levels of success they had yesterday, and unfortunately the guy writing this column is among them.


But for the surfer things have gone even worse. An American magazine specializing in the topic of Internet search published a survey carried out among ordinary network users and those who search professionally, and the conclusion was catastrophic. Google is no longer the best search engine, and the quality of the search results it provides is worse than in the pre-Caffeine period (Caffeine being the new version introduced last year). To verify this news I personally started to do some searches, and I was left really disappointed by what Google showed as the best sites. Searching for "online forex" on one national Google, 4 of the first 10 sites listed gave no useful information at all. Indeed, in first place was a post from a forum containing only one sentence, with no substance, and with a PageRank of 0. Very surprising. Google's management noticed what was happening and, afraid of losing customers (let's remember that a lot of advertising money is at stake), announced that they are reviewing the algorithm deeply and will try to improve it as quickly as possible, to give end users a good product.

For those of us who work in this field, these changes promise new, hard work: first to understand the modifications that will be introduced, and then to apply the appropriate steps to reach the goal of being present on the first page. There will be many sleepless nights, a lot of caffeine to keep us awake, and new frustrations as we watch our sites slip 10 or 20 pages back. But this is the Internet and you have to adapt. One way to come out of these continuous changes successfully, without excessive suffering, is to build the various sites in different ways, applying a policy of diversification of content and backlinks, so that they have different profiles and you can hope one of them is already well optimized for the changes to come.

November 17, 2010

SEO parameters - second part

Yes, it's been a while since my last post, but during the summer I prefer to spend time in nature, walking and riding my bike, instead of hanging over the computer. Summer is gone now and it's time to get back to work. The first article was posted in April, and since that month something important has happened for all webmasters. Google has introduced Cocaine. Yes, you read that right: the new version isn't called Caffeine, as they want us to believe, and I have the evidence to prove it. The rankings the new version produces, compared to the previous ones, could only come from a heavily doped entity: you simply can't understand how it works. Or at least that was the case at first, a shock for those involved in this sector, but now webmasters are slowly, slowly beginning to understand the new rules and adapt to them. The changes are many, and in many cases it will take a lot of time and effort to get back to earlier positions, but this is the Internet, characterised by its dynamism and constant change: today you are in and tomorrow you are out. But let's get back to our theme.

Often the second part of a movie or a book has the same title as the first, with the addition of the word "revenge". So this title could be SEO parameters: the revenge of the SE. In fact, over the years the algorithms became so complicated that only a few were able to understand them. But the turning point was the introduction of backlinks as a parameter. Replicating a site is easy, but having the same external links is much more difficult, especially if the guy we want to copy has his own network of sites linking to him; he will certainly not give a link to his rival.

With the birth of the backlink the market for links was also born: I pay you and you post a link to my site on your web page. The result was that the strong, meaning those with a solid financial base, became even stronger, and the weaker ones, often full of enthusiasm and great content, almost disappeared from the front pages of the results. Google noticed this too and introduced new rules, but without any significant change; it is difficult to prove that a link has been sold, or rather bought. So the Internet reflects today's society very well, pure capitalism: money earns money. But don't despair, there are many exceptions to this rule.

One of the products of backlinks is also PageRank, widely analyzed in these pages, which lately has become less important than it once was; but unlike the many webmasters who believe it is dead (the argument in their favor being that it has not been updated for more than half a year), I think it is still an important parameter.

Returning to the words used to link our site, Cocaine has introduced an innovation: sites over-optimized for search engines are penalized. Too much uniformity in the keywords of the anchor tags pointing to us is, for example, considered the kind of optimization mentioned above. So, according to the latest knowledge in the sector, it is important to diversify the keywords. For example, if our main keyword is "ads online", we should dilute its presence to about 70% of cases and in the remaining 30% use alternative phrases such as "best ads online", "buy and sell ads", etc.
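
Just to illustrate the proportions, here is a tiny sketch with a made-up list of anchor texts pointing to a site and the share of each one:

  #!/usr/bin/perl
  use strict;
  use warnings;

  # Invented anchor texts: 70% the main keyword, 30% alternative phrases.
  my @anchors = (
      ('ads online') x 7,
      ('best ads online') x 2,
      'buy and sell ads',
  );

  my %count;
  $count{$_}++ for @anchors;

  for my $text (sort { $count{$b} <=> $count{$a} } keys %count) {
      printf "%-20s %2d  %3.0f%%\n", $text, $count{$text}, 100 * $count{$text} / @anchors;
  }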

There are obviously other criteria as well, they say more than a hundred, that Google considers, many of them little known, but we will look at them in the sequel.

April 15, 2010

SEO parameters - first part

In this and some future articles I will talk about the parameters that affect page rankings, especially in Google. Knowing these parameters, and being able to measure them in some way, gives us a great advantage because it allows us to optimize our SEO work in order to get good results. For newbies, the abbreviation SEO means Search Engine Optimization. Google it, and you'll get over twenty-seven million hits; that tells you how important SEO is. But before going any further, let me talk a bit about how I started in this business and how it was done back in the stone age of the Internet, more than 10 years ago.

In the late 90s, when I started to access the Internet using an ancient 48Kb/sec modem, I had already understood how important a good placement in search engines really is for attracting quality web traffic, visitors who click the banners and order the goods or services offered by sponsors. Back then, alongside AltaVista, the most popular search engine was Infoseek. I spent hours analyzing the sites that appeared on the first page of results, trying to discover the mechanisms that decide which site stands at the top of the rankings and which at the bottom. In those days the SEs did not take backlinks into consideration, so it was pretty easy to work out the ranking concept: all I had to do was analyze the content of the pages. My effort was rewarded, and after two weeks of hard work I succeeded in finding the mathematical formula that assigned scores to a web page. The formula was based on the presence of keywords in the title, text and links. It was enough for me to analyze the source code of the first-placed site, and I could produce a page with a better score than the one I had analyzed. The result was that my site ended up at the top of the search list.
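
Just to give the flavour of that kind of scoring, here is a toy sketch; the weights are invented for illustration only and have nothing to do with the actual formula I found back then:

  #!/usr/bin/perl
  use strict;
  use warnings;

  # Toy page score: keyword occurrences in the title, in link texts and in
  # the plain text, with invented weights.
  my ($keyword, $file) = @ARGV;
  die "usage: $0 <keyword> <page.html>\n" unless $keyword and $file;
  open my $fh, '<', $file or die "Cannot open $file: $!";
  my $html = do { local $/; <$fh> };
  close $fh;

  my $in_title = $html =~ /<title>[^<]*\Q$keyword\E/i ? 1 : 0;
  my $in_links = () = $html =~ /<a\s[^>]*>[^<]*\Q$keyword\E/gi;
  my $in_text  = () = $html =~ /\Q$keyword\E/gi;

  my $score = 5 * $in_title + 3 * $in_links + $in_text;
  print "score for \"$keyword\": $score\n";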

The first sites I developed, and the first pennies I earned, came from the sponsorship of the adult industry. Being ranked first for the word "porn" translated into a few thousand dollars a month, and I often succeeded in being first. I was already thinking about quitting my job and becoming a full-time webmaster. After all, my regular job was paying only a fraction of what I was making on the Internet, and besides, I was my own boss and worked whenever I wanted.

However, I soon realized that others had had the same idea and the competition was getting fierce. I needed to work day and night in order to beat the competitors and keep my rankings. I remember days when ten or more different sites would take turns in the first spot; everything was changing that fast. Soon thereafter, the people at Infoseek began to change the search algorithms very frequently, making everything even more complicated. For instance, they introduced limits on keyword density: if a keyword exceeded, let's say, 6% of all the words on the page, the site was penalized. I had developed a word generator that repeated the keywords as many times as required, but I soon realized that this was not going to work any more.
To get a high ranking, good content was becoming a necessity, the texts became longer, and the files grew larger. The game was getting really complicated.

November 2, 2009

PageRank Sculpting

PageRank sculpting, what is this? How can one sculpt an immaterial thing like PageRank? In fact we don't sculpt PageRank itself, but its distribution within a site. All of us who have one or more sites must admit that some pages are more important than others, and we would like some of them, not just our home page, to appear among the search results. I don't know who had this idea of modeling the site, but the concept is this: there are important pages within the site that I care about, and others I care less about because they have no relevance for the search engines. The idea is to pass more PageRank from the unimportant pages to the important ones: it is a redistribution of PageRank. And how can we do it?

The technical procedure is very simple and accessible to every webmaster. The trick is the use of the "nofollow" tag (this famous tag again). The pages you want to be ignored must be linked from the other pages using that tag. In this way the internal links do not pass PageRank to the less important pages, and the available PageRank is distributed to the most important ones. Easy to apply, and it seems very useful in some cases. But does this concept work, is it correct? In my opinion no, and I will explain my point of view.

Anyone familiar with the formula for calculating PageRank knows that the PageRank passed to the linked pages is reduced by a damping factor (not of interest for this reflection) and depends on the number of links on the page. The originators of this technique got confused and thought: by putting the nofollow tag on some links, those links will not be taken into account in the distribution of PageRank. And here is the error. The number of links represents the probability that one of the links on the page is clicked by a visitor. But the visitor does not see whether a link has been tagged, and it changes nothing in his intentions.
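
To make the arithmetic concrete, here is a rough sketch; all the numbers are invented, and whether the nofollow links are counted in the divisor is exactly the point in dispute:

  #!/usr/bin/perl
  use strict;
  use warnings;

  # In the classic formula, the PageRank a page passes through each of its
  # links is d * PR / C, where C is the number of links on the page.
  my $d              = 0.85;   # damping factor
  my $pr             = 4;      # PageRank of the linking page (made-up value)
  my $total_links    = 20;     # all links on the page
  my $nofollow_links = 8;      # links carrying rel="nofollow"

  my $share_counting_all = $d * $pr / $total_links;
  my $share_sculpted     = $d * $pr / ($total_links - $nofollow_links);

  printf "PR passed per link, counting every link: %.3f\n", $share_counting_all;
  printf "PR passed per link, ignoring nofollow:   %.3f\n", $share_sculpted;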

My conclusion is that this technique not only does not help the site, it hurts it: the pages linked with nofollow lose PageRank and pass nothing to the important pages, and at the end of the story the whole site is penalized, because with fewer pages it drains less PageRank from the external links pointing to it. To those who are not convinced by my argument and would like to try this technique anyway, I can say that it is recognized as a legitimate procedure directly by Google. If your findings differ from mine, let me know.

July 30, 2009

How to get good placement with Search Engines

First of all, let's define what I consider a good placement in the search engines. It's simple: the best thing is to be in first place on the first page of the search results, but being anywhere on the first page is normally considered a good position. If your site is on the second or third page, you will probably get some traffic, but not enough to make decent money from your little business, supposing you use sponsors that pay you per click or, more usually, per product sold. Let's remember that the best customers are the visitors who come from search engines. And we want a lot of them, so they can click our banners and buy our products.

As you can see from various articles on this blog, the recipe for reaching a good placement in Google and the other search engines is very simple: have a good PageRank and a lot of backlinks to your site. Both of these are obtained in the same way, by submitting the site to directories and top lists and by exchanging links with other webmasters who have the same goal as you. The concept is basically very natural and simple, but it is not always easy to translate into practice. Some directories, and top lists too, are specialized in particular themes and don't accept sites with unsuitable content. A lot of webmasters don't want to trade links with certain kinds of sites, for example adult sites, or those that promote online games or high-yield investments.

The solution to this problem is to create a little network, better still a big one, of your own sites, dedicated to the most varied themes and with very different content. From my point of view it's better not to go too far afield; the themes should be ours, things that interest and excite us, that we know something about, because that way it is much easier to dedicate free time to these sites. When we create a network of ten or twenty sites, it becomes rather hard to maintain and promote them if we have no passion. Part of our network can consist of blogs that we can open for free with various providers, and the less important sites can be hosted for free (be careful, though, to choose a good free host; there are some on the Internet). The most important thing is to differentiate the topics and subjects.

So now we can submit these, let's call them secondary sites, to the places where our main sites would never be accepted and listed because they are off topic. When our secondary sites finally reach a Google PageRank one day (I forgot to say that these secondary sites must be promoted too, but that seemed superfluous), we put in the links that point to our primary sites and the game is done. In practice we have obtained indirect links from sites that would never have given us a direct link.

Building a little network of at least 10 sites, but no more than 50 if the network is managed by one person alone, is an excellent solution that can greatly increase our revenues, but it requires 2 to 3 years of work. With such a nice network, tomorrow when we open a new site we can immediately give it a whole set of links that will guarantee it a fat helping of PageRank. The best thing is that the network is ours, so we can do what we want with it and link whatever we need, without depending on any other subject of Internet life.

March 16, 2009

Google’s rules

It seems that Google has recently become more severe in evaluating the sites it visits, and that it applies more penalties than in the past. For webmasters who earn money working on the Internet, being in a situation where the site is penalized and doesn't appear in the search results amounts to a catastrophe.

Because the basic rule for earning money with an Internet site is very simple, and it consists in being on the first page of the search results. No visitors from the search engines, and we all know that Google is used by more than 90% of surfers, means no money for webmasters. It has happened to me too, to be a victim of this last wave of penalizations, and with this article I want to review Google's rules together with you; it has been useful for me and I hope it will be for you, too.

The basic condition, during the preparation of a site and during its promotion, is to examine and respect the rules given by Google, which can be found on the following page:
http://www.google.com/support/webmasters/bin/answer.py?answer=35769.
Let's look at the basic concepts together. They are separated into three groups, and I have organized my exposition in the same way, adding something from my own experience.

Design and content


In short, the advice is to use a site structure with static links, and to make sure that every page you want to be considered is linked from at least one other page: in this way the search engines can easily find all the resources of the site. The body text must contain the keywords, those tied to the topic of the site. Always use the Title and Alt tags, and if the Description tag is used, it should be at least 20 words long. There are also tips on how to organize dynamic pages, but my advice is to avoid them where reasonable. Moreover, check that all the links work, and the same goes for linked images. The number of links on a page should not exceed 100, counting internal and external links together. Finally, it is advised to write important text as text, not to embed it in images; Google is not able to read text inserted in images.
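
Just to show how mechanical some of these checks can be, here is a small sketch (crude regular expressions, not a real HTML parser):

  #!/usr/bin/perl
  use strict;
  use warnings;

  # Rough checks over a saved page: a Title tag, Alt attributes on images,
  # and not more than about 100 links in total.
  my $file = shift or die "usage: $0 page.html\n";
  open my $fh, '<', $file or die "Cannot open $file: $!";
  my $html = do { local $/; <$fh> };
  close $fh;

  my $links           = () = $html =~ /<a\s[^>]*href=/gi;
  my $images          = () = $html =~ /<img\b[^>]*>/gi;
  my $images_with_alt = () = $html =~ /<img\b[^>]*\balt=/gi;

  print "Missing Title tag\n" unless $html =~ m{<title>.*?</title>}is;
  print "Too many links: $links (the limit is about 100)\n" if $links > 100;
  printf "Images without an Alt attribute: %d of %d\n",
      $images - $images_with_alt, $images if $images_with_alt < $images;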

Technical aspects of the site


During the creation of the site, and before putting it online, it is advised to check its "readability" with different browsers, for example Microsoft Internet Explorer, Firefox, etc., in order to ensure compatibility and correct accessibility with all of them. This is important for visitors, but also for the robots of the search engines, because they can interpret the same HTML code in different ways. I always try to produce the simplest HTML code possible, excluding JavaScript, Visual Basic and CSS where not strictly necessary. This part also contains a series of tips on server settings, the use of the robots.txt file, and so on.


Quality requirements for the site


This part of the rules is the most important, because failing to respect some of them can induce Google to penalize the website, as it declares explicitly. First of all, Google insists that you create your site for your visitors, not for the search engines. The idea is that if the site is good enough to attract the attention of surfers, the same should happen with the search engines, provided of course that the site respects Google's policy. There is a list of "dangerous" actions that can cause penalization or, still worse, expulsion of the site from Google's search results. It is prohibited to use hidden text (a practice much used in the past, and still remembered by some webmasters), cloaking, i.e. the practice of showing different pages to search engines and to visitors, duplicate content copied from other sites, and so-called doorway pages, pages written only for the search engines in order to obtain placement for several keywords.

On the other pages connected to these guidelines some details are explained, but new points can be found too. One of them, very important, is the concept of the natural link: the idea that a site should be linked spontaneously by other sites, thanks to its quality. This concept produces another rule: no paid links, because buying links to your own site isn't natural. And here is another catch: the links pointing to sponsors can also be considered paid links, and your site can be penalized for this.

This rule is debatable, but the master (read: Google) is always right, so what can we do to solve the problem? Never fear, the solution is very simple and also confirmed by Google: it is enough to insert the attribute rel="nofollow" in all the links that go to sponsors' sites.

February 11, 2009

More about the nofollow tag

A few days ago I was contacted by a webmaster who wanted to exchange index page links with one of my sites. I made a detailed analysis of his site, as I always do to be sure it's a good trade, and I discovered that he has more than a hundred links, about 130 if I remember well, on the index page where my link would be inserted. One of the rules I respect when trading links is not to exchange with pages that have more than 100 links (counting external and internal links together), because this is against Google's policy.

I replied to the webmaster that I was not interested and explained the reason. He let me know that more than 60 of the links on his page carry the nofollow tag, so he supported the thesis that he has only about 70 links. I thought about it and concluded that the nofollow tag shouldn't influence the PageRank value passed to the linked page, because the concept, as we can see from the PageRank formula, is that the total number of links represents the probability that one of them will be clicked by a surfer visiting the site. So the nofollow tag, which isn't visible to a normal surfer (you must open the HTML source to see it), doesn't increase the probability that the other links, the ones without it, will be chosen by the visitor.

I sent an e-mail to the webmaster and told him I was not interested, due to my opinion that the nofollow tag doesn't change the real PageRank values passed. I'm not completely sure my opinion is right, but I have put it in my know-how notes. Of course the webmaster didn't share my thinking, and in the end we didn't make the trade.

I want to conclude this article with a question I have been asking myself these days: when will the next Google update of the Toolbar PR be, and will Google keep the same regularity it did last year? Incidentally, Google did update at the end of January, but that update wasn't visible on the Toolbar.

December 31, 2008

PageRank updated on the last day of the year

This year, 2008, Google updated very regularly, every 2 months, so everybody was expecting an update in November, but Google disappointed us, and many of us webmasters thought that the regularity of the updates was over and we were back to the past, when this event seemed quite random. This morning I switched on my modem, got my Internet connection and wanted to run some checks on a few of my new sites. I opened one of them and wow, there was a PageRank on the Toolbar. After that I checked some of my old sites and noticed changes in PR, so Google really had updated its Toolbar PageRank. For all webmasters who work seriously this is a nice gift for the end of the year. Let's list Google's updates this year, with some approximation: January 10, February 28, May 1, July 27, September 27 and finally December 31. Only this last gap is over 3 months.

In general, my sites did very well with this last update. Some less important sites lost some PR, but the majority gained something, so now I have 2 sites with PR5, which isn't bad. I hope this will bring some money too, from my Internet business.

Hoping that this update was good for you too, I wish you a Happy New Year; may your sites have good PageRank and be on the first page of Google and the other search engines. Happy New Year to all of you.

November 10, 2008

Pay attention while trading links

When you trade links with another site you must pay special attention to one thing to avoid being cheated. There are a lot of webmasters who try to take advantage of the trade, taking a backlink from you without giving one back, and you can't tell whether this is happening unless you examine the HTML code of the page where your link should be present. There are two widespread techniques for this cheat.

Some webmasters put the robots noindex meta tag in the head of the page. This tag "forbids" robots to process the page. This way, when the robot/spider of a search engine visits the page, it doesn't index it, and you don't get the link to your site. The head meta tag in question is written like this: <meta name="robots" content="noindex"> .

Other webmasters instead use the nofollow attribute in the anchor tag. This attribute tells the search engine's robots not to follow the link to which it is applied, and again there is no backlink for your site. An example of this technique is:
<a href="http://www.mysite.com" rel="nofollow">Link to my site </a>
The same effect can be achieved with a meta tag, by inserting the nofollow attribute there. It is a less used approach, since then none of the links on the page are followed, including the internal links of the site itself, and this is a disadvantage for the webmaster who applies the method.

When I exchange links with somebody, I always check the inner code of the page to see if any of these tags are there, and I make 2 other checks: I check whether the page with my link is reachable from the main page of the site (dishonest webmasters often hide the page from the rest of the site), and whether the page is present in the Google cache: if not, maybe robots.txt prevents the page from being crawled, so in that case I check the robots.txt file, too.
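
For whoever wants to automate the first of these checks, something along these lines will do (a quick sketch using LWP::Simple and crude pattern matching, not a real HTML parser):

  #!/usr/bin/perl
  use strict;
  use warnings;
  use LWP::Simple qw(get);

  # Fetch the partner's page and look for the two tricks described above:
  # a robots noindex meta tag, and rel="nofollow" on the link to my site.
  my ($url, $my_domain) = @ARGV;
  die "usage: $0 <page url> <my domain>\n" unless $url and $my_domain;

  my $html = get($url) or die "Could not fetch $url\n";

  print "WARNING: robots noindex meta tag found\n"
      if $html =~ /<meta[^>]+name=["']?robots["']?[^>]*noindex/is;

  my $found = 0;
  while ($html =~ /<a\s+([^>]*href=["'][^"']*\Q$my_domain\E[^"']*["'][^>]*)>/gi) {
      my $attributes = $1;
      $found++;
      if ($attributes =~ /rel=["'][^"']*nofollow/i) {
          print "WARNING: the link to my site carries rel=\"nofollow\"\n";
      } else {
          print "OK: a normal link to my site is present\n";
      }
  }
  print "WARNING: no link to my site found at all\n" unless $found;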