Using Automated SEO Software Tools for the Best Results - Part 1 of 2

How to Use Automated SEO Software Tools for the Best Results

A discussion of software tools that automate various aspects of search engine optimization (SEO) for websites, including keyword insertion, creating backlinks, and generating unique content through article spinning.

Before I assess the pros and cons of automated Search Engine Optimization (SEO) software tools, there's a preamble, but I think it serves to set the scene nicely.

I opt to receive junk e-mail from Internet marketers because, occasionally (very rarely, in fact), I find an item of software that is actually useful.

As a qualified professional software engineer, I can easily distinguish good software from bad. Free software usually doesn't work well and, if it's created by an amateur, may even be detrimental to the computer in some way. I've found that even the software that works well is almost always poorly designed, but the fact that it produces results matters more than how it looks or its poor usability, I suppose.

Every day I receive hundreds of junk e-mails. Among those that arrived this morning (04 Sep 2010), for example, were the following three items.

One was selling "Keyword Research Software" which seemed only to show search engine results from Google and YouTube when a keyword is typed in. It had nothing at all to do with keyword research, and no price was stated for the software. Much of the sales page was taken up with the seller's life story (poor education, "littlle" (sic) brother died, etc.). It then went on to offer article-spinning software for $47 with this enticement:

"College student plagerizes ...gets thrown out of school...To Bad He Didn't Have The Best Research Software...Click Here to Order Now!"

This is just one example of thousands of similar sales pages sloshing around on the Web. It raises the question: what sane person would even consider handing over money to such an unbusinesslike individual, who is barely coherent, can't spell and can't string sentences together properly?

The second junk e-mail I want to mention included this: "How to use little known, yet brutally effective techniques for hijacking commissions." "How to hijack all their hard work for massive paydays."

Good grief! What are Internet marketers coming to? This person is encouraging buyers of this "system" to steal from others! Inciting cybercrime! How would honest people, trying to make an honest living with the Internet, feel about this?

Again, this is only one of many similar appeals to cheat or steal from others that I've seen in my long experience.

The third junk e-mail, which is very relevant to the subject, talked about an "Instant Ranking Formula" that claims to get Google to rank any website highly. It stated: "Get any new site indexed quickly and backlinks to it automatically - and this happens simultaneously."

(Let's ignore the all-too-frequent poor English in this case.) Software like this usually scatters backlinks (explained later) around sites belonging to a group of like-minded people who have also paid for backlinks, regardless of whether the textual content containing the backlink is relevant to the site it links to.

This is not the purpose intended by search engines for backlinks, and backlinks from sites whose subject matter is unrelated can harm the ranking of both sites. Google can detect this, and assumes that the site owners are trying to "beat the system" by robotic means. Moreover, even relevant outward-bound links, if they are too numerous, can get a site labelled as a "link farm", which renders all the linked sites liable to be penalized. Many purveyors of automated SEO software, keyword research software, etc. unashamedly claim that their software "cheats" the search engines.

Seeing these three examples today spurred me to write this article, with the message that attempting to cheat the search engines with fully automated SEO software may come back to bite you.

Google's 'Quality Guidelines' for websites state: "Don't participate in link schemes designed to increase your site's ranking or PageRank. In particular, avoid links to web spammers or 'bad neighborhoods' on the web, as your own ranking may be affected adversely by those links."

This is confirmed on Google's web page 'Google-friendly sites', where it says: "Keep in mind that our algorithms can distinguish natural links from unnatural links. Natural links to your site develop as part of the dynamic nature of the web when other sites find your content valuable and think it would be helpful for their visitors. Unnatural links to your site are placed there specifically to make your site look more popular to search engines. ... Only natural links are useful for the indexing and ranking of your site."

Regarding deceptive techniques generally, Google's 'Quality Guidelines' for websites also state: "Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit." Enough said!

What are Search Engines looking for, then?

That is the question, rather than "To cheat, or not to cheat?"

Most people who have investigated how to raise a website's position in search engines have, at some time or another, encountered the phrase "Content is King!" This declaration has always been, and always will be, true. The reputation of search engines relies on the usefulness to the visitor of the search results they return. They will, therefore, reward those websites that contain useful content by ranking them highly in their search results. It's that simple.

Because I know more about Google than Yahoo, Bing (previously MSN) and the others, I'll refer to Google's criteria. It is very likely that other search engines use similar criteria, and many of the smaller ones draw their results from the major engines anyway.

So, how does Google determine whether or not website content is useful? Google uses robots ("spiders") to crawl websites, and these robots cannot read content as humans do, so programmatic methods are employed. The most important of these determine how relevant the content is to the search term used by the visitor. Relevance is measured in several ways.

How does Google determine the relevance of a website to a search term?

1. Keyword Density

Keyword density is the number of times a keyword (or phrase) occurs in a web page, expressed as a percentage of the total number of words in the page. Search engines calculate it to help determine whether a web page is relevant to a specified keyword (or phrase).

Keyword density is less important nowadays as a factor for determining page rank (PR), simply because it is too easily manipulated by website owners. Indeed, cramming too many keywords into a web page is regarded as an attempt to cheat the search engines and can cause the page to be penalized. This practice is known as "keyword stuffing".

The optimum keyword density is considered by many SEO experts to be between 1% and 3%. A keyword (phrase) density of 4% or more may be regarded as "search spam".
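The calculation above is simple enough to sketch in a few lines of Python. This is purely illustrative; the naive tokenizer and the convention of counting every word of a matched phrase are my assumptions, not any search engine's actual method:

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Occurrences of `phrase` in `text`, as a percentage of the total
    word count. Each word of a matched phrase is counted, so a two-word
    phrase found twice contributes four words to the numerator."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Count exact, in-order matches of the phrase in the word stream.
    hits = sum(
        words[i:i + n] == phrase_words
        for i in range(len(words) - n + 1)
    )
    return 100.0 * (hits * n) / len(words)
```

For example, `keyword_density("seo tools help seo work", "seo")` gives 40.0 — a density that, by the 1%-3% rule of thumb above, would certainly look like keyword stuffing.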

2. Latent Semantic Indexing

It's a grand title for a bold attempt to simulate human intelligence. To keep the explanation simple: a search engine that employs Latent Semantic Indexing (LSI) in its algorithms analyzes the entire content of a web page to discover what its subject matter is. Once it "knows" this, it indexes the page to appear in results for search terms that are conceptually similar in meaning, even if the actual search term does not appear in the text.

It is, therefore, fruitless to use software simply to create a nonsensical page and inject keywords into it at random. As search algorithms become more "intelligent", it becomes ever more essential to add the human touch to websites, to distinguish them from those cranked out by automation software. By all means use software tools to automate repetitive mini-tasks that are beyond human capacity, but use them judiciously. There is no substitute for the human brain.
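For the curious, classic LSI (as described in the information-retrieval literature; Google's actual algorithm is proprietary) applies a truncated singular-value decomposition to a term-document matrix, so that documents sharing vocabulary land near each other in a low-dimensional "concept" space. A toy sketch with an invented five-term vocabulary:

```python
import numpy as np

# Toy term-document matrix: rows = terms, columns = documents.
# Docs 0 and 1 are about cars; doc 2 is about cooking.
terms = ["engine", "wheel", "motor", "recipe", "oven"]
A = np.array([
    [2, 0, 0],   # engine: appears only in doc 0
    [1, 1, 0],   # wheel:  shared by docs 0 and 1
    [0, 2, 0],   # motor:  appears only in doc 1
    [0, 0, 2],   # recipe: doc 2 only
    [0, 0, 1],   # oven:   doc 2 only
], dtype=float)

# LSI: truncated singular-value decomposition of the matrix.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2                                    # keep the top-2 latent "concepts"
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T   # each document in concept space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Query for "motor": doc 0 never contains that word, yet it scores
# highly because it shares the "car" concept (via "wheel") with doc 1,
# while the cooking document scores near zero.
q = np.zeros(len(terms))
q[terms.index("motor")] = 1.0
q_vec = q @ U[:, :k]                     # fold the query into concept space
sims = [cosine(q_vec, d) for d in doc_vecs]
```

Real LSI systems also weight the matrix (e.g. with TF-IDF) before the decomposition; raw counts keep the example short.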

3. External Backlinks

Backlinks are links to a website from other websites. Genuine backlinks are created by people (not robots) who think that a certain website contains useful information from which other web users could benefit, so they publish a link to it, thus making their own site more useful. This is how natural backlinking is done. Google rewards websites that have natural backlinks with a higher page rank (PR), because it reckons that, if many other people think the website is useful enough to link to, then it must be useful. Fairly logical, really.

Any other kind of backlinking is considered by Google to be unnatural. An example of unnatural backlinking is "reciprocal linking", whereby two websites simply exchange links with each other. This kind of backlinking was easily detected by Google, which purportedly adjusted its algorithm so that the two links cancelled each other out and neither website gained any benefit. Reciprocal linking became useless and has all but died out.

People then began to develop "three-way linking" in order to try to beat the system. This works perfectly well and is acceptable to Google, provided that it is done in the spirit intended.

For example, Company A programs and builds websites, Company B specializes in graphic design, and Company C is expert in Search Engine Optimization (SEO). None of these companies is a competitor to either of the others; their businesses complement each other in that all three specialisms are related and are needed for creating a successful website. If Company A's site contains a link to Company B's site, Company B's site contains a link to Company C's site, and Company C's site contains a link to Company A's site, all three companies benefit from the link exchange, and Google is happy. The only way to achieve such a natural three-way link exchange is to make a personal, individual approach to related websites.

If, on the other hand, three-way links are created by a robot, or by a human for the sole purpose of trying to increase page rank, Google frowns on the practice and all the sites involved run the risk of being penalized.

Google values links to a website from other websites highly, and usually increases its page rank (PR), provided that:
A) The website containing the link has, itself, a high page rank;
B) The website containing the link is in a related industry;
C) The website containing the link is an "authority" site;
D) The website containing the link is not a "link farm", i.e., its primary purpose must be to provide useful information to visitors, and not to be a repository for dozens of links to other websites.

Unless these four criteria are met, a backlink to a website is unlikely to have any effect on its page rank. Moreover, if criteria B and D are not met, an adverse effect on the website is possible.

There are two fundamental problems with using automatic backlinking software tools:
1) They do not distinguish between relevant and irrelevant websites on which to place the backlinks, thus failing criterion B. They create backlinks only on general blog, forum and social media sites, or on sites in a group where links are distributed to each other.
2) Because they are automated, they create the backlinks on the same sites for everybody, thus swelling the number of links on those sites and failing criterion D.

A third drawback of automatic backlinking software tools is that any Tom, Dick or Harry can achieve exactly the same link results. Thus, the playing field is just as level as it was beforehand, and no advantage (which is questionable on such general sites, anyway) is gained by any user of the software. There is no more benefit for anyone who is prepared to put in some effort to achieve better results than for anyone who is lazy and just wants to click a button.

The expression "link juice" came into vogue some time ago. It's quite apt. If a website is linked to by many "authority" sites of high page rank which have only a few links on them to other sites, those links will be very valuable to that website, and its page rank is very likely to rise. If, however, the sites contain many links to other sites, the "link juice" is diluted and the links are less valuable. Moreover, if the number of links continues to grow and the informational content does not, it's only a matter of time before Google classifies such sites as "link farms", and penalizes them. Once that happens, the websites that they link to are likely to suffer, as Google will consider them to have tried to beat the system unfairly.
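The dilution described above can be pictured with the original PageRank idea that a page divides its "vote" equally among its outbound links. This is a deliberately simplified model of my own for illustration; real ranking uses many more signals:

```python
# Simplified "link juice" model, loosely based on the original PageRank
# notion that a page passes an equal share of its own rank to every
# page it links to. (An illustrative assumption, not Google's formula.)

def link_value(page_rank: float, outbound_links: int) -> float:
    """Rank passed to each linked site by one linking page."""
    return page_rank / outbound_links

# An authority page of rank 6 with only 3 outbound links passes far
# more value per link...
focused = link_value(6.0, 3)    # 2.0 per link
# ...than the same page with its juice diluted across 60 links.
diluted = link_value(6.0, 60)   # 0.1 per link
```

Twenty times the outbound links means one-twentieth of the value per link, which is exactly the dilution the "link juice" metaphor describes.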

Every day I receive e-mails from Web Masters, asking for a three-way links arrangement with one or other of my websites. I decline respectfully, unless:
A) The domain name of the site they're offering to link to mine is relevant to my site's content; AND
B) The content of the site they're offering to link to mine is relevant to my site's content; AND
C) The domain name of the site to which they want my site to link is relevant to my site's content; AND
D) The content of the site to which they want my site to link is relevant to my site's content; AND
E) The site they're offering to link to mine is not an obvious "link farm", i.e., its primary purpose must be to provide useful information to visitors, and not to be a repository for dozens of links to other websites.

If only some of these criteria are met, I write back, offering the Web Master pre-paid advertising on my site instead of a link exchange. Sometimes, albeit rarely, they agree to pay for the link. Incidentally, most of my Internet income is derived from advertisers who write to me, offering directly to pay for one-way text links to their website from mine. Again, the websites must be on a related subject, otherwise my website could suffer, and theirs surely would. A prime example is TheGamesForum.com, which makes thousands of dollars annually from only a few advertisements. The reason for its high page rank is not so much its backlinks (which are all natural ones) as its content, which consists of hundreds of pages about games. In short, it is an "authority" website.

[This article is continued in "Using Automated SEO Software Tools for the Best Results - Part 2 of 2", which deals in depth with article-spinning software and the most important aspects of SEO. Copy and paste "Using Automated SEO Software Tools for the Best Results" into the 'Search Articles' box, and select 'Search by Article Title'. Please read that article next.]




