January 10, 2017

Free blog

Happy new year to all of you. Personally, I hope for a better year than the last one. I had a lot of problems with my health, due to my advancing age: I'm not far from sixty. Nothing really serious, but unpleasant: a lot of pain, and money spent on doctors, tests and medicines. The money, though, was the least of my problems. I still have some trouble with my teeth, and my dentist hasn't discovered the reason for the pain I feel. Well, I don't want to turn this blog into a health blog, so let's talk about webmastering. I keep working at it, but slowly, without conviction. I feel I don't have enough stimulus. There are periods when I work without knowing why, or what the purpose of the work is. Often I do useless things, just as an excuse to be doing something anyway.

To wake myself up I decided to visit a fair for the online gambling and financial industries, in Berlin, in the second half of November last year. (In parentheses: that is where my tooth problems began.) My expectation was to see the providers' latest news, listen to the SEO experts and find new inspiration, the right direction, some creativity. The first day brought a bad surprise with the organization of the fair: the opening and the registration of participants were delayed by an hour. Aren't we in Germany, I wondered? I didn't find anyone I knew, but by the third day I had made some acquaintances. Interesting and nice people, but too tied to work; there was nothing else to chat about with them. The usefulness of the trip was very relative. It had more touristic value than business value: I visited the capital of Germany for the first time. I did find a new Forex sponsor, on the advice of my current one, but the experience so far is not very encouraging.

Many of us use free blogs to secure links to sites that belong to us; I call them support blogs. The best thing would be to own the domain and the hosting, but that costs money, so we go for free solutions. The problem is that not all hosts are good, for various reasons. One problem is that providers often drop the service and we have to find a new one. Sometimes the system automatically adds a nofollow tag to the links we insert, which we don't like; still, it can be acceptable, because it ensures some diversification in the types of links (it seems Google appreciates this). Sometimes the inserted links are turned into dynamic addresses, virtually unrecognizable.

This is the case with blogs hosted by jimdo.com. One might say this is part of the diversification and could be acceptable, but there are other inconveniences. Just yesterday I closed my blog on this host, and the main reason was the barely comprehensible and barely controllable interface. The first problem was logging into my account: the form contains all the fields, without distinction, for both logging in and registering new users. I tried different browsers, but the problem persisted. Once inside (I had run into this before), you could not figure out how to insert pictures into an article. In the end I managed to fix everything, checking it all carefully. In the afternoon, probably following an intuition, I opened the page again and the pictures were missing. Except one, which showed only half of the original photograph.

Fed up with the matter, I decided to cancel the account. It took me more than 15 minutes to figure out how to do it. That has nothing to do with an intuitive interface. The only reasonable approach is to click on everything and hope to find what you need before the last attempt, and misfortune often puts its finger in here. The blog I am talking about was on its third host; the first two closed down, and jimdo.com I closed myself. Now I'm on my fourth attempt. But don't they say that trouble always comes in threes? In the new information age, it seems we also need to update the popular sayings.

May 4, 2016

Invisible PageRank

The introduction of PageRank as a measure of a site's popularity is one of the key points in the history of online search. Until then it was enough to copy a site that was at the top of the results and the job was done. With the establishment of the PageRank concept this changed dramatically: it was virtually impossible to have the same links the copied site had. In the early period, the first pages of search results contained only sites with a very high PageRank (at least 3, or even 6, depending on the commercial value of the search terms): in fact, among the many parameters influencing the ranking, PR was one of the most important. So it was logical for webmasters to try to obtain a large number of links, and here begins the problem whose solution is Google's latest invention: don't show the data at all.

Mister G's idea was really simple: the better your site, the more people will link to it spontaneously. But who is linking to me spontaneously? It's a very uncertain possibility, and I cannot afford to base my business on such an uncertain fact. So people started to spam blog comments that always carried a link, and the wealthier ones started to buy links. The original idea began to lose its sense, because webmasters had begun to influence the SERPs. Google reacted by penalizing sites where it saw wrongdoing, and in the end it also gave this parameter less weight. Lately PageRank has had very little influence on the rankings, and the results have become even worse. Users have been complaining about this for years.

And that's why the PageRank value has disappeared from view. For several weeks the Toolbar has worked as before, but it shows PR 0 for all websites. Google has officially announced that PageRank still exists and is still among the parameters that influence positions in the SERPs, but it is no longer visible. In other words, users don't know which sites have a low or high PR. As time passes, the historical traces of the values will be lost, although it will always be possible to guess a site's value. From this point of view, I expect that in a couple of months Google will start to increase the importance of PageRank in the ranking, hoping to recover the search quality, and the trust of its own users, which at the moment is somewhat lost.

And what will webmasters do? It's not easy to guess, but I have an idea. All those who, like me, believe the importance of PageRank will be restored will reopen the hunt for links, even more ruthlessly than before. This hunt has dropped off significantly in the last 2-3 years, and many webmasters focused on internal work on their pages. Now we will go back to hunting links much more intensively; since we won't know the value we are getting, we will need to obtain the largest possible number of links.

Quantity will become more important than quality, for the simple reason that quality has become unknowable. And there will be attempts to reconstruct the value Google operates with, by simulating the calculations that lead to it. To perform these simulations you need data on millions of sites, and processing that lasts for days. Whoever manages to achieve this certainly won't give the results away for free, but will try to earn good money for the work. In the end, those with pockets full of money will take the advantage.
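
By the way, the originally published PageRank formula is no secret; whoever attempts such a simulation will be iterating something like this toy JavaScript sketch, only over millions of pages instead of three. The damping factor 0.85 and 50 rounds are the usual textbook choices, not Google's real settings.

// Toy PageRank over a tiny link graph: graph[i] lists the pages that
// page i links to.
function pagerank(graph, damping = 0.85, rounds = 50) {
  const n = graph.length;
  let pr = new Array(n).fill(1 / n);
  for (let r = 0; r < rounds; r++) {
    const next = new Array(n).fill((1 - damping) / n);
    for (let i = 0; i < n; i++) {
      const out = graph[i];
      if (out.length === 0) {
        // a dangling page spreads its rank over everybody
        for (let j = 0; j < n; j++) next[j] += damping * pr[i] / n;
      } else {
        for (const j of out) next[j] += damping * pr[i] / out.length;
      }
    }
    pr = next;
  }
  return pr;
}

// Page 2 is linked by both of the others and gets the highest value.
console.log(pagerank([[1, 2], [2], [0]]));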

November 15, 2015

Cell phone solution

I have been a webmaster for well over ten years; next year I will be celebrating twenty years in the trade. Right now I'm a little exhausted. For a long period my main purpose was making money, as much as possible. There were other aspects I liked, e.g. being in first position on the search engines. That was a great satisfaction, confirming my skills, or maybe my luck, who knows? Anyway, I was making money. After putting aside good earnings, here they came: the tough times. So many young competitors, very creative and driven to do their best. My personal excuse was that I had reached my goal and decided to relax. I started taking it easy, without analyzing the market. Someone once said that knowledge is everything. In recent years I have been working from the old foundations, without any updates on the new rules of this business.

Last spring, mister G sent me an e-mail informing me that one of my websites wasn't optimized for cell phones, the so-called smartphones. I had heard, not long before, that more Internet users now come from mobiles than from PCs and notebooks. I had to do something for my business, hoping to improve my returns. I was advised to visit https://developers.google.com/speed/pagespeed/insights/ to check my pages. You just insert the URL of the page you want to check and Google analyzes all the aspects that matter for a mobile user. The final response is a user-experience mark ranging from 1 to 100. If the mark is high, from 85 to 100, it appears in green. If the value is from 70 to 85 you are in the yellow area, which means: pay attention. If the mark is lower still, it appears in red. In the yellow and red cases, Google lowers the score of the web page and pushes it back in the search results for those who search from mobile phones.

All my web pages were in the red area, and that was a really bad starting point for trying to make money from smartphone users. The classic positioning was no good either. In a few words, I no longer existed on the net. I hadn't cared about this new technology, especially because I don't own a cell phone. Incredible but true. However, I realized the importance of being present in this new area, and so I acted. There is one key sentence in the header of the page's code:

<meta name="viewport" content="width=device-width, initial-scale=1.0">

This code tells the cell phone to render the web page in the best possible way, using the full width of the screen and maintaining a 1:1 scale. It all seemed very contradictory to me, but in the end I figured it out. If you work only with text and with tables that have no fixed width, and the font is big enough to stay readable when the page is scaled down, everything is fine. The probability of that being the case is very low, though. If you have a fixed-width page wider than the cell phone's screen, part of it remains off-screen and Google gives you the yellow or the red color. Another problem is images and videos: they have a fixed width and don't fit in the viewport.

After analyzing all this, I fixed three of my web sites. I switched to flexible tables, setting the proportions between the columns as percentages, not pixels. For photos and videos I had to insert JavaScript code to adapt their width to the screen they are shown on. This step was really hard for me. I quickly found code that worked in Chrome but not in Internet Explorer 8, the old browser I still use to check compatibility with all versions. After three days I finally found the solution: put <!doctype html> at the beginning of the page and the work displays fine in both browsers. I put a lot of effort into understanding all these new tips and tricks, but so far without results. Probably it's my fault: other Google optimizations are still missing.
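
To give the idea, here is a minimal skeleton of such a page. The class names and percentages are only an example, and the CSS max-width rule is one common way to handle images, not necessarily the exact script I used:

<!doctype html> <!-- standards mode, so old IE behaves like the others -->
<html>
<head>
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<style>
  /* columns in percentages, never pixels, so the table follows the screen */
  table.layout { width: 100%; }
  td.menu      { width: 25%; }
  td.content   { width: 75%; }
  /* a common trick for images: shrink with the viewport, keep proportions */
  img { max-width: 100%; height: auto; }
</style>
</head>
<body>
  <table class="layout"><tr>
    <td class="menu">...</td>
    <td class="content"><img src="photo.jpg" alt="example"></td>
  </tr></table>
</body>
</html>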

October 14, 2014

Verify the facts

A lot of information can be found on the net, a bit of everything; in particular, webmasters can find what they need for their work, above all tips and tricks on how to improve SERP positions, at national or international level. Despite all this information about the webmaster's craft, much of it is unverified, so its "value" is dubious. Often someone's idea is just his own interpretation of things. If the person can explain that idea with "convincing" words, the hypothesis becomes a truth, even when in fact it is not. The best thing to do is to verify for ourselves, where possible, and check the facts. So lately I worked hard to find evidence on the question of "bad links".

One of my websites dedicated to online gambling, targeting a small country, was for years the leader of the SERP: first position for almost all the significant keywords. This was possible because that market wasn't very interesting from the economic point of view and competitors were almost nonexistent. Thanks to the legislation with which many countries have regulated this market, revenues have declined markedly, and so even those smaller states have become attractive, both for providers and for webmasters. In July 2013, for the first time, I noticed a change: my website went down to second place on Google, and then things got even worse (at the moment I am at the bottom of the first page) because I had new, powerful competitors. That is why I was forced to turn to my own creativity.

I identified one of the sites ranked above mine and decided to point undesirable links at it, from sites dedicated to adults, mostly for men. At the beginning I placed 6 links, then 12, and in the end 24: all that were available to me and that I could take advantage of. The result really surprised me: my "target", after the first 6 links, went up from third to first place. So I thought: "Perhaps it's necessary to reach a certain percentage of bad links. Even the guys at Google must have a little heart; they can't penalize you for a couple of bad things." So I continued to add links, but the result stayed the same.

Nowadays there is no tool that tells you how many backlinks a site has, so I tried the same thing with a lower-ranked site. The outcome was identical: my "victim" went up to second place. How is this possible? Maybe adult sites are thrown into the same trash bin as casino sites, so links between them count as valid. I decided to try another category, financial services, reusing the previous links. Curious about the result of this experiment? The persecuted site disappeared from the pages of the most used search engine. So the concept of "bad links" is valid, but you have to consider which category the linked site belongs to, to see whether the links will be treated the way you want.

Some of you have probably already hypothesized that in the end I pointed the 24 links at my own website and that I'm now back on the pedestal. Wrong! I thought about it many times, but I wasn't brave enough to try. What if I were missing some parameter that my test subjects had and I did not? It may be that the reasoning about the percentage of bad links is correct, and my "victims" simply had many more links than mine. There are many questions, only a few answers, and just one certainty: my site still brings me some money, even from its bad position. A mistake now could take away my small profit, one of the last remnants of the years of glory. So why take the risk? I stopped. Out of fear. Nothing else.

April 10, 2014

Webmaster on vacation

In this period there is a lack of news about my webmaster's job, so I decided to tell you a story related to the themes of this blog. Here it is!

A successful webmaster, who became rich thanks to his job, decided to take a year off from work. He had always wanted to sail around the world on a luxury yacht. There was no need to buy one; he decided to charter it. Obviously he couldn't leave his business without monitoring it, so he took his laptop with him, and for the remote areas in the middle of the ocean, where an Internet connection was impossible, he bought the coolest satellite phone on the market. It was really expensive, both the phone and the connection. Luckily he had no financial problems.

The journey started from the west coast of the United States. For two months everything was relaxing; he was in no rush. He was actually happy about his decision to take some time off, and elated about his new page rankings. He realized how productive it was to work in such a beautiful atmosphere. Thanks to his new spiritual energy, his creativity grew stronger too. He was in the middle of the Pacific Ocean when his yacht was overturned by a tropical storm. He understood immediately that there was nothing to do but put on the lifejacket, and so he did. That was the last thing he remembered. When he opened his eyes he was lying on white sand, and he realized he was on a remote island in the ocean.

He lived on the island struggling against hunger and the weather (the nights were cold and the days hot), but he kept hoping that one day someone would find him. After six months the hope was fading and he was desperate. Being a rational man, very aware of his situation, he decided to build a hut to protect himself from the elements. One morning he woke up and saw before his eyes a beautiful, tanned, blonde woman. She explained that she too was a castaway, from the other side of the island.

After a two-hour walk they reached the hut the woman had built, and inside everything was luxurious! No branches and palm leaves but metal and plastic panels. She also had a fridge, a gas cooker, a table, chairs and plenty of kitchen tools. In the bedroom there was a bed with a mattress and a wardrobe, and behind the house a diesel generator produced electricity. She explained that all this had been recovered from the ship she was traveling on, which sank two years earlier.

The man, amazed by this unexpected miracle, took a shower, ate something and took a nap in the hammock under a palm tree. When he woke up, the woman was standing in front of him, completely naked. She told him:
- Now you will get what you have been dreaming about for the past six months, alone on this island.
He, excited:
- Really? I can't believe it! You also have an Internet connection?

October 31, 2013

Unfair rules

Summer has passed and the days are getting shorter and colder, so many of us will return to work with new enthusiasm. During the summer I took two separate two-week vacations: a trip through China and Tibet, and a holiday at the seaside in Croatia, under a beach umbrella, enjoying the clean sea and a good Croatian beer called "Ozujsko pivo". So my batteries are loaded for the autumn and the coming winter.

Back home, I found a very bad surprise: two domains from which I had a lot of links were closed; their webmasters hadn't renewed them. So I lost a few dozen links, and this news means I must work very hard to recover them. This happens rather often. Just as in life, on the Internet there are no fixed points you can bet on forever. As for my search engine positions, there was not much news; many of my pages are still penalized, and I want to say something about the way Google applies penalties.

My modest opinion is that each webmaster bears full responsibility for his own site, for the content and the links present on its pages, but he cannot have any obligation regarding factors external to the site. For some time I had heard voices about Google penalizing sites for bad incoming links. I always refused to believe it, but my sites went down, and finally I decided to verify whether there was any truth in those rumors. Two of my sites had very bad SERP positions and I had nothing to lose, so in April I removed some links that could be considered wicked. After 3 weeks I noticed improvements in my rankings. I was happy about the improvements, but very unhappy about the discovery I had made.

To have final proof of this, I should now try to point a few wicked links at one or two sites and confirm that their positions go down. And this is the main problem, because this interpretation takes us into a global war against our competitors. Wouldn't it be enough to simply ignore the bad links, without applying any penalty? That way nobody could hurt other sites, and incorrect webmasters would only waste their time chasing the kind of connections that give no benefit.

Latest news from Google

Google's spokesman Matt Cutts said these days that the PageRank Toolbar will not be updated this year, and avoided answering the question about next year's update. This announcement immediately led some webmasters to conclude that Google's page ranking is conceptually dead, a suspicion that returns frequently. But one can draw a different conclusion: that Google wants to hide its real intention, which is to give this parameter greater weight in the future without letting webmasters know the real worth of their sites, leaving the old PageRanks visible on the Toolbar.

April 21, 2013

Disappointments continue

Thanks to my activities on the Internet (well, actually more the past ones than the recent ones), I don't feel the European and global financial crisis. I bought my own apartment and a car (it's from 1999, but still functional), and I have some savings in the bank. Recently the revenues of my sites have declined, but I'm not complaining; my past work allows me to live well. A few months ago I recognized the problems my sites are encountering, but I'm trying not to give the matter too much importance. I love my job and I'm grateful for it; I don't think too much about the final purpose, I mean the money.

One of my activities that continues without interruption is the search for new backlinks, from domains never used before. Two days ago I found a great source, good for reaching thousands of links from pages with a PageRank of some value. I firmly believe that backlinks, together with PageRank, will regain the importance they had in the past; actually it seems to me this is the only possible way to distinguish between different sites. The devices Google has used in the last year don't seem very punchy to me, and they don't improve the search results.

Recently I discovered the penalties inflicted on sites that have advertising banners at the top of the screen. According to one of my analyses, it seems Google applies sanctions even when the head of the page merely contains a big image. The excuse for this is the presumable complaints from users who don't like scrolling down the screen to get to the topic of interest. There are sites where the user doesn't have to drag the scrollbar at all because all the information he needs is already visible; in other cases, with a little physical effort, you can find what you were looking for. I think there is a bit of a mess in all this.

Not being able to get visitors from the best search engines, I try to get them from other sources: for example, from my sites that have no sponsors or economic advertising of any kind. They have a small number of visitors but also some decent SERP placements. Clearly the users who visit those pages are not looking to buy something, but sometimes they can still be a source of income; it's like building a house, one brick on top of another, and at the end of the work you have your own home.

Another method I use is to subscribe to top lists and try to reach the first positions by helping myself with manual clicks, sometimes even through proxies. I admit it's not correct, but as the Bible says, "let he who is without sin cast the first stone". Am I right? Everything I do is aimed at being present on the first page of the most important search engine in the world, waiting to be judged positively and to be able to keep my lifestyle the same as in past years.

December 17, 2012

Density of backlink keywords

At the beginning of last month I was on holiday in Namibia, spending my cash (before departure somebody compromised my credit card) on my favorite entertainment. When I got back, I found that the Toolbar PageRank contained new values. I did a quick check to see how my sites had passed this exam, and I counted 18 sites with PR5, and the others (I have about 85 at the moment) between 3 and 4. A nice reward for the commitment I put in. But I already knew this success would not be reflected in the SERPs, and unfortunately I was right. Finally, though, I think I know where the catch is.

All the measures and procedures I undertook in the past period gave me no certainty that I was on the right track, and I could not attract the much-desired traffic from the search engines. I continued with tests, verifications and checks, but also with research in the forums. In the end I think I figured out the cause of my trouble. To tell the truth, I already knew the concept, but I was missing some parameters.

With the Panda update (I'm beginning to hate that animal: small, ugly and lazy) in April 2012, Google introduced penalties for sites that are over-optimized. There are two kinds of optimization, internal and external to the site. The internal one mainly concerns the construction of the site and the concentration of keywords on the page, while the external one is measured as the density of the keywords with which the page in question is linked. I guess the concept is as follows:
  • Let's say the accepted limit of keyword density is about 4%; go over it and a penalty is triggered immediately. Google divides the actual percentage by the limit, obtaining the internal optimization coefficient. The maximum is 1.
  • Considering all the keywords with which the site is linked, the percentage for each one is calculated. Assuming all the links use the same keyword, you get 100%. Dividing by the physical limit, 100%, in this case gives an external optimization factor equal to 1. It would seem that beyond the value of 0.5 (i.e. 50%; in the past I assumed 70%, and that was my mistake) you take a punishment automatically on this item alone.

Moreover, multiplying (or adding) these two values gives a single number for the page. If this number exceeds a certain limit, whose value I have no idea about, you take the punishment and the page moves back in the placements. My weak point is the backlinks, and I'm trying to dilute their keyword density. The little sketch below puts numbers on this model.
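
If you like code, here is the model in a few lines of JavaScript. The thresholds are the ones I guessed above; the real formula, of course, is known only to Google.

// My guessed over-optimization model, with the thresholds named above.
function overOptimization(keywordDensity, anchorShares) {
  const DENSITY_LIMIT = 0.04;        // ~4% on-page keyword density
  const inner = Math.min(keywordDensity / DENSITY_LIMIT, 1);
  // anchorShares: the fraction of incoming links using each anchor text
  const outer = Math.max(...anchorShares);
  const outerPenalty = outer > 0.5;  // seems to trip a penalty on its own
  return { inner, outer, outerPenalty, combined: inner * outer };
}

// Example: 3% density on the page, 80% of the backlinks share one anchor.
console.log(overOptimization(0.03, [0.8, 0.15, 0.05]));
// -> inner 0.75, outer 0.8, outerPenalty true, combined 0.6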

Sometimes you cannot really be sure whether you have been given a penalty or others have simply worked better and stand in front of you in the placements. An aid to understanding the situation is to visit googleminusgoogle.com, whose existence I discovered recently. It's a branch of Google that does not apply the Panda filter. If your site ranks much higher there than on Google itself, you can be almost certain that you have been castigated, and your life will become very hard as you try first to understand the problem and then to overcome it.

July 13, 2012

Continuous Updates

In my previous mental rumination I described my problems with a site frowned upon by mister G. It's been 6 months since I made a piece of freeware available to my visitors, enriching the content and the service offered, hoping to improve the image of my website in the eyes of the inspectors, but so far there is no improvement. In fact I have no way of knowing whether someone has checked the content, and I don't know if it is possible to force the verification. Meanwhile, another of my sites has deteriorated: from the habitual second or third position on the first page of search results, I went first to fifth and then to sixth. With this change I dropped from nearly 1000 Google visitors per day to about 300: bad and very discouraging. I started to study, and one of the main indications I found is to have a dynamic site that changes content often and is updated constantly.

But I'm not a professional webmaster, and standing behind this job every day is not my desire, partly because I'm a bit lazy and partly for lack of free time. I reflected on the problem, trying to read my way to a solution, and I found one. On the home page I included an SSI (server-side include) associated with a small program written in Perl that changes one paragraph on the page daily. I set up a text file where each line is a paragraph, and whenever I find a little motivation I enrich this file, to make the content richer and less repetitive. This way the page is different every day, even if the content repeats after a while, because so far I have made only 16 sections. This also has another purpose: to force Google to visit my page more often, and to keep the Google cache of my page always fresh.
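
The logic is trivial; here is one simple way to sketch it in JavaScript (my real version is a small Perl script called through SSI, and the file name below is just an example):

// One paragraph per line in a text file; picking by day of year is one
// simple choice, so the page changes daily and the cycle restarts when
// the lines run out.
const fs = require('fs');

function paragraphOfTheDay(file) {
  const paragraphs = fs.readFileSync(file, 'utf8')
    .split('\n')
    .filter(line => line.trim().length > 0);
  const now = new Date();
  const startOfYear = new Date(now.getFullYear(), 0, 1);
  const dayOfYear = Math.floor((now - startOfYear) / 86400000);
  return paragraphs[dayOfYear % paragraphs.length];
}

console.log(paragraphOfTheDay('paragraphs.txt'));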

I introduced this innovation about 2 months ago and so far I have seen no material consequence, so I'm about to make another change. I also noticed that the distribution of my keywords in the text is not always uniform: they should appear a bit at the beginning, the middle and the end of the content, as most experts recommend. So I have to work on a good remedy for this defect. Lately I found another piece of information that could be valuable: some years ago the minimum amount of text needed to be clearly seen was about 300 words. That amount has increased since then, and I try to keep my content at about 400, even 550 words, but some sources claim the figure has become about 900 words, or even 1100.

It seems a stretch to me, since quantity often comes at the expense of quality. Getting lost in the maze of language just to say that white is the opposite of black is useless, but if Mr. Seeker wants it, I will try to accommodate him. Even this means a lot of work and time spent, so I'll choose two or three sites on which to verify the usefulness of this parameter. You can see from this post that I'm forcing the language to stretch it: a little training for the more general task that awaits me.

Bad things are happening in the world: the economic and financial crisis spreads further each day, while I am in constant battle with a virtual thing that does not physically exist. It is only a piece of code, two miserable bytes, that I am trying to understand and decipher.

March 1, 2012

Useless site

I have worked as a webmaster for years, but I never invested in this activity. The only costs I have are two hosting providers and my various domains. I would have the cost of my personal computer and Internet connection anyway, even without this job. I never bought links, as many of you do, and never invested in a SEO campaign or any kind of promotion. But two years ago Google AdWords offered me 50 euros to test their advertising program. Since it was an offer, I decided to try it, to broaden my horizons. Once I subscribed to the program, they asked me to add 5 euros of my own, and I made that little sacrifice.

I decided to promote my Forex site and followed the results. Nothing! Nothing! I had a few visitors, some from the various sites that use AdSense and lend their space to advertising, and some directly from Google's search pages, but the financial result was nil. I had imagined I could invest, let's say, 1000 euros and get back 1500: capital with interest. But this test was entirely negative. Last year, coming back from vacation, I found another promotion in which Google gave me 100 euros, this time without any additional requests. Being totally free, it would have been silly not to try again. But this time I chose a different tactic: I promoted not my site but my sponsor directly. I entered my sponsor's code directly into AdWords and started to follow the results. Very hopeful, I must say, because the idea seemed very good to me. Why send a potential customer to my site if I can send him directly to the sponsor?

The statistics my Forex sponsor provides are quite detailed and I can see where the traffic originates. And the visitors came, and everything matched the stats I had from AdWords very well: you know that we affiliates are a bit skeptical about the correctness of what we are paid and what we pay for certain services, but in this case everything was perfect. Except the main thing: 100 euros invested (which fortunately were not mine), 0 cents earned. I've heard there are people who earn this way, but it didn't happen to me. Either it simply doesn't work, at least with the services I promote, or I made mistakes in my ignorance. Still, I gained a new experience. Sorry, two.

While the campaign was underway, I received a communication from the AdWords staff referring to my site, the one present in the account from two years before but inactive in the current campaign. They said plainly that my site was useless, because it gave no useful information to visitors, and that with only one sponsor, the visitor did not have much choice of where to register to trade currencies. They also gave me an address with two examples of the utility/futility of two similar sites.

A bad thing, compounded by the fact that a month earlier my site had dropped out of the first page of search results and finished on the fourth. I did not know what to do, how to look "useful" from the point of view of Mr. G. Obviously the human reviewer had not examined the pages carefully or gone through the content in detail, because there are things there that cannot be found elsewhere. I needed something visible, something the "inspector" could see at first glance as valid and original for a surfer. It took me some time to get there: I created a free Forex program that follows currency trends, which visitors can download at no cost. It took me two weeks of work, and I hope the job will pay off in terms of usefulness. Now I just have to wait for future developments: I will let you know my conclusions in one of the subsequent posts.

November 11, 2011

HTML elements and text

Since PageRank influences search results much less than a few years ago, after the last post I focused my work on other aspects important for placement in the search engines. Google's last update came at the end of June this year, while I was on vacation: two wonderful weeks in South Africa, paid for with the money I earned from my work on the Internet. Back home, I found an e-mail from a friend alerting me to what had happened, namely that Googhi had updated the Toolbar PageRank. A quick browse through my sites gave me great pleasure: the new PageRanks were substantially better than the previous ones. I went so far as to have 5 sites with PR5. But since the final aim is not PageRank, I went to check my positions in the search results. What a disappointment: most of my sites had lost many positions, in some cases going from the first to the fourth page of the SERPs.

Several days later I noticed that almost all the PageRanks had gone back to their old values. I had to say bye-bye to all five of my PR5s. What the hell is going on? Perhaps the people in the forums had seen some light, understood something I couldn't see. But nothing there either, only more confusion. In fact, even in the forums I found a gloomy atmosphere, cold and despairing, because nobody understands what happened, or nobody wants to reveal useful information. Hundreds of hypotheses on how to proceed, what to do, but no certainty, no confirmation of the known facts, or that one approach is better than another. In the end I was left alone with myself, and I have been busy comparing the top places for different keywords, trying to find the elements and methods to introduce to strengthen my sites. Without explaining how and why, here are two insights I am currently applying, hoping to see a positive outcome.

I remembered a conversation from some time ago with an acquaintance who insisted on the importance of having various HTML elements on the page. In fact, looking at some well-placed pages, I seem to have found some confirmation of this. So I am trying to introduce new HTML tags into my pages, such as UL, EMBED, CITE and so on, trying to enrich the pages' markup. This makes things even more complicated for me. Personally I don't like it, because I believe that beauty lies in simplicity, but I must try to adapt to the situation if I want to take another nice trip, or a long vacation.

As for the content of the page, with the usual care to keep certain keywords present, I try to repeat other words as little as possible, making heavy use of synonyms and thus obtaining greater linguistic diversity. I remember that at school the English teacher always recommended avoiding repetition of the same words. I wrote a small program in Visual Basic that counts the words within a text, and I noticed that in some cases I was repeating words that had nothing to do with my main keywords. The worst thing is that those words had a density very close to the density of my keywords. So, as in finance, diversification (of coding elements, of the words in the text, of the content of the site) is very important.
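
My counter was written in Visual Basic, but the logic fits in a few lines of JavaScript; the sample text is invented, just to show the idea:

// Count how often each word appears and report its density, sorted from
// the most repeated down.
function wordDensities(text) {
  const words = text.toLowerCase().match(/[a-z]+/g) || [];
  const counts = {};
  for (const w of words) counts[w] = (counts[w] || 0) + 1;
  return Object.keys(counts)
    .map(word => ({ word, density: counts[word] / words.length }))
    .sort((a, b) => b.density - a.density);
}

const sample = 'casino bonus list: the best casino bonus, casino news';
console.log(wordDensities(sample).slice(0, 3));
// -> casino 0.33, bonus 0.22, and so on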

I almost forgot! You know, today's date should be very lucky: there are six number 1s in it. At least the Chinese think so. Good luck to all of you.

May 19, 2011

Googleations

I'm crazy about new inventions in the field of language, and I was very happy when I got the idea for this title. But had someone else already had this illumination? Easy to check with the gentleman named in the heading. The fact is that many have had this inspiration before me, but I'll keep it anyway, because I swear it is an original product of my brain. These days we are more or less one year from the great renewal Google made in its algorithm, and we can sum up the results. There are two main aspects to consider: our point of view, that of the webmasters trying to place our masterpieces on the first page, and the experience of the other navigators, who are simply seeking the information they need for their work, or even better, for fun. Webmasters have digested the changes and adopted the new rules. For some it went well, but many still have not been able to regain the success they had before, and unfortunately the guy writing this column is among them.

For the surfer it has gone even worse. An American magazine specializing in Internet search published a survey carried out among both ordinary users and professional ones, and the conclusion was catastrophic. Google is no longer the best search engine, and the quality of the results it provides is worse than in the pre-Caffeine era, as they call the time before the new version introduced last year. To verify this news I personally ran some searches, and I was really disappointed with what Google presented as the best sites. Searching for "online forex" on one national Google, 4 of the first 10 sites listed gave no useful information at all. Indeed, in first place was a post from a forum containing only one sentence, with no substance, and with a PageRank of 0. Very surprising. Google's management has noticed what is happening and, afraid of losing customers (let's remember there is a lot of advertising money at stake), announced that they are reviewing the algorithm deeply and will try, as quickly as possible, to improve it and give end users a good product.

For those of us who work in this world, these changes promise new, hard work: first to understand the modifications that will be introduced, then to apply the appropriate steps to achieve the goal of being present on the first page. There will be many sleepless nights, a lot of caffeine to keep us awake, and new frustrations as we watch our sites slip 10 or 20 pages back. But this is the Internet, and you have to adapt. One way to come out successful, without suffering these continuous changes too much, is to build your various sites in different ways, applying a policy of diversification of content and backlinks. That way they have different profiles, and you can hope that one of them happens to be well optimized for the changes to come.

November 17, 2010

SEO parameters - second part

Yes, it's been a while since my last post, but during the summer I prefer to spend time in nature, walking and riding my bike, instead of hanging on the computer. Summer is gone now and it's time to return to work. The first article was posted in April, and that month something important happened for all webmasters: Google introduced Cocaine. Yes, you read that right: the new version isn't called Caffeine, as they want us to believe, and I have the evidence to prove it. The rankings the new version produces, compared to the previous ones, could only come from an entity that is heavily doped: you simply can't understand how it works. At least it was like that early on, a shock for everyone in this sector, but now webmasters are slowly, slowly beginning to understand the new rules and adapt to them. The changes are many, and in many cases it will take a lot of time and effort to get back to the earlier positions, but this is the Internet, characterized by its dynamism and constant change: today you are in and tomorrow you are out. But let's go back to our theme.

Often the second part of a movie or a book has the same title as the first, with the addition of the words "the revenge". So this title could be: SEO parameters, the revenge of the SE. In fact, over the years the algorithms became so complicated that only a few were able to understand them. The turning point was the introduction of backlinks as a parameter. Replicating a site is easy, but having the same external links is much more difficult, especially if the guy we want to reproduce has his own network of sites linking to him; he is certainly not going to give a link to his rival.

With the birth of the backlink, the market for links was also born: I pay you, and you post a link to my site on your web page. The result was that the strong (I mean those with a solid financial base) became even stronger, while the weaker ones, often full of enthusiasm and great content, all but disappeared from the front pages of the results. Google noticed this too and introduced new rules, but without any significant change; it is difficult to be sure that a link has been sold, or rather bought. So the Internet reflects today's society very well: pure capitalism, money earns money. But no despair: there are many exceptions to this rule.

One product of backlinks is PageRank itself, widely analyzed in these pages. Lately it has become less important than it once was, but unlike the many webmasters who believe in its death (the argument in their favor being that it has not been updated for more than half a year), I think it is still an important parameter.

Returning to the words used to link to our site: Cocaine introduced an innovation here. Sites over-optimized for search engines were penalized. Too much uniformity in the keywords of the anchor tags linking to us is, for example, considered the kind of optimization mentioned above. So, according to the latest knowledge in the sector, it's important to diversify the keywords. For example, if our main keyword is "ads online", we should dilute its presence to about 70% of cases and use alternative phrases such as "best ads online" or "buy and sell ads" for the remaining 30%.

There are obviously other terms of comparison as well (they say more than a hundred) that Google considers, many of them less known, but we will look at those in the sequel.

April 15, 2010

SEO parameters - first part

In this and some future articles I will talk about the parameters that affect page rankings, especially in Google. Knowing these parameters, and being able to measure them in some way, gives us a great advantage, because it allows us to optimize our SEO work to get good results. For newbies: the abbreviation SEO means Search Engine Optimization. Google it and you'll get over twenty-seven million hits; that tells you how important SEO is. But before going any further, let me talk a bit about how I started in this business and how it was done back in the stone age of the Internet, more than 10 years ago.

In the late 90's, when I started to access the Internet using an ancient 48Kb/sec modem, I had already understood how important a good placement in the search engines really is for attracting quality web traffic: visitors who click the banners and order the goods or services offered by the sponsors. Back then, alongside AltaVista, the most popular search engine was Infoseek. I spent hours analyzing the sites that appeared on the first page of results, trying to discover the mechanisms that decide which site stands at the top of the rankings and which at the bottom. In those days the SEs did not take backlinks into consideration, so it was fairly easy to discover the ranking concept: all I had to do was analyze the content of the pages. My effort was rewarded, and after two weeks of hard work I succeeded in finding the mathematical formula that assigned scores to a web page. The formula was based on the presence of keywords in the title, the text and the links. It was enough to analyze the source code of the first-placed site, and I could produce a page with a better score than the one I had analyzed. The result was that my site took its place at the top of the search list.
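
To give the flavor of it, here is a toy version of that kind of formula in JavaScript; the weights are invented for illustration, not the ones I actually worked out back then:

// A toy scorer: keyword hits in the title, the link texts and the body,
// each with its own weight.
function score(page, keyword) {
  const hits = text => text.toLowerCase().split(keyword).length - 1;
  return 5 * hits(page.title)     // the title weighs most
       + 3 * hits(page.linkText)  // then the link texts
       + 1 * hits(page.body);     // then plain occurrences in the text
}

const page = {
  title: 'forex news',
  body: 'daily forex analysis and forex news',
  linkText: 'forex'
};
console.log(score(page, 'forex'));  // 5*1 + 3*1 + 1*2 = 10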

The first sites I developed, and the first pennies I earned, came from sponsors in the adult industry. Being ranked first for the word "porn" translated into a few thousand dollars a month, and I often succeeded in being first. I was already thinking about quitting my job and becoming a full-time webmaster. After all, my regular job was paying only a fraction of what I was making on the Internet, and besides, I was my own boss and worked whenever I wanted.

However, I soon realized that others had had the same idea, and the competition was getting fierce. I needed to work day and night to beat the competitors and keep my rankings. I remember days when ten or more different sites would occupy the first spot; everything was changing that fast. Soon thereafter the people at Infoseek began changing the search algorithms very frequently, making everything even more complicated. For instance, they introduced limits on keyword density: if a keyword exceeded, let's say, 6% of all the words on the page, the site was penalized. I had developed a word generator that repeated the keywords as many times as required, but I soon realized this was not going to work any more. To get a high ranking, good content was becoming a necessity; the texts became longer and the files grew larger. The game was getting really complicated.

November 2, 2009

PageRank Sculpting

PageRank sculpting: what is this? How can one sculpt an immaterial thing like PageRank? In fact we don't sculpt PageRank itself, but its distribution within a site. All of us who have one or more sites must admit that some pages are more important than others, and we would like some of them, not just the home page, to appear among the search results. I don't know who first had this idea of modeling the site, but the concept is this: there are important pages within the site that I care about, and others I care less about because they have no relevance for the search engines. The idea is to pass more PageRank from the unimportant pages to the important ones: a redistribution of PageRank. And how can we do it?

The technical procedure is very simple and accessible to every webmaster. The trick is the use of the "nofollow" tag (that famous tag again). The pages you want to be ignored must be linked from the other pages using that tag. This way the internal links do not pass PageRank to the less important pages, and the available PageRank is distributed to the most important ones. Easy to apply, and it even seems useful in some cases. But does this concept work? Is it correct? In my opinion, no, and I will explain my point of view.
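
In HTML the trick looks like this (the page name is just an example):

<a href="/terms.html" rel="nofollow">Terms and conditions</a>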

Anyone familiar with the formula for the calculation of PageRank knows that the PageRank passed to the linked pages is reduced by a damping factor (not of interest for this reflection) and depends on the number of links on the page. The originators of this technique got confused and thought: by putting the nofollow tag on some links, those links will not be taken into account in the distribution of PageRank. And here is the error. The number of links models the probability that a visitor clicks one of the links on the page. But the visitor does not see whether a link has been tagged, and it changes nothing in his intentions.
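
A single calculation shows the disagreement; the numbers are illustrative, not Google internals:

// A page with PageRank 1.0, damping 0.85 and 10 outgoing links, 4 of
// them nofollowed. How much does each followed link receive?
const damping = 0.85, pr = 1.0, totalLinks = 10, nofollowed = 4;

// The sculptors' assumption: nofollowed links leave the denominator,
// so the remaining links share everything.
const sculptorView = damping * pr / (totalLinks - nofollowed);  // ~0.142

// My reading: all the links still divide the rank, and the nofollowed
// share simply evaporates instead of being redistributed.
const myView = damping * pr / totalLinks;                       // 0.085

console.log(sculptorView, myView);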

My conclusion is that this technique not only does not help the site, but hurts it: the pages linked with nofollow lose PageRank and pass nothing on to the important pages, and at the end of the story the whole site is penalized, because with fewer ranking pages it drains less PageRank from the external links pointing to it. To those who are not convinced by my argument and would like to try this technique anyway, I can say that it is recognized as a legitimate procedure directly by Google. If your findings differ from mine, let me know.