Monday, January 12, 2009

How was SEO born?


In 1995 (the early days of Yahoo – take a moment to reminisce if you were online back then – grey backgrounds, single-column websites, crazy spinning GIFs), optimization was born out of the roots of the AAA, A#1 and Acme-style alphabetical listings in the yellow pages and white pages.

This meant that if you wanted to be at the top of the list, you would name your site or business something like !@#AAA 123 Business – it seems insane now, but it worked. This was the first example of SEO – the manipulation of rankings for your own gain.

In late 1996, search engines finally realized how the text-matching techniques of traditional databases could be applied to the much greater database of the web.

In 1997, the first algorithm crackers appeared, and the early cracks were quite basic, simply filling up pages of results on many of the major search engines. The first major "page jacking" and "bait and switch" incidents began to happen: people copied the content of top-ranking pages for their own use, achieved high rankings with it, and then switched the content to their own.

In late 1997, it was very easy for search engines, especially Infoseek, to crawl and index a site. In fact, back in the day it was possible to submit a site in the morning and have it appear in the Search Engine Results Pages (SERPs) that same afternoon.

Because indexing was so easy (often within 24 hours on Infoseek), spam became a very serious problem for the search engines, as deceitful spam sites began to understand the algorithms and how to manipulate them.

In 1998, the off-page criteria of link popularity and directory listings grew in importance for SEO as search engines tried to lessen their reliance on on-page factors alone. Around the same time, it also became apparent that because each search engine used a different algorithm, the same query produced a different top 10 in each engine's results.

In late 1998 and early 1999, AltaVista (then the #1 search engine, with the biggest index) declared war on "too many URLs" and banned huge segments of sites, along with sites using automatic doorway-page generators. Google introduced its PageRank technology while the other search engines self-destructed under management chaos and mountains of pages, with no idea how to determine which page was more relevant than another.

In late 1999, search engines introduced pay-per-click (PPC) advertising to make their services profitable. This enabled businesses to take a short-cut to the top rankings by agreeing to pay a small amount every time a surfer clicked on their link.

Wednesday, January 7, 2009

Dynamic Site SEO Tips and Hints

What are dynamic websites:

Dynamic websites are websites whose pages are created on the fly, generated from a database. All of the content (links, images, footer, etc.) comes from the database. Examples of dynamic websites are shopping carts, ecommerce sites, news sites, forums, real estate sites, and manufacturer & buyer sites.
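As a minimal sketch of the idea (the shop.db file and the products table here are hypothetical), a dynamic page is assembled from database rows at request time rather than stored as a file:

```python
import sqlite3

# Hypothetical example: build a product page on the fly from a database.
# Assumes a shop.db file with a products(name, price) table.
conn = sqlite3.connect("shop.db")
rows = conn.execute("SELECT name, price FROM products").fetchall()

html = "<html><body><h1>Products</h1><ul>"
for name, price in rows:
    html += f"<li>{name} - ${price}</li>"
html += "</ul></body></html>"

print(html)  # a CGI script would send this HTML back for every request
```

Nothing in this page exists until the request arrives; change a row in the database and the next request sees different HTML.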

Dynamic Sites are Spidered Slower than Static Sites:

Google in particular has made it clear that dynamic sites are spidered more slowly than static sites. The reason is actually webmaster-friendly: dynamically generated sites can potentially have unlimited pages, so Google assumes a dynamic-looking site (a site with URLs like dynamic-page.php?page=1) is big, and slows the crawl speed accordingly. It does this to limit server load, because if a dynamic site (or any site) has millions of pages, Googlebot and the other spiders could cause the server to crash by requesting too many pages at one time.

Problems with dynamic websites:

Because all the pages are generated from a database, a dynamic page does not exist as a file on the hosting server the way a static HTML page does; it is only created when a request for it comes in. When a crawler visits the website, it may not find those pages to crawl every time, so the crawler may start avoiding them on later visits.

The URLs of database-generated pages are called dynamic URLs, and a dynamic URL is not easy to read for humans or for search engine spiders.

How to overcome this problem:

a) Place links to the dynamic pages on static pages. These static pages can then be submitted to the search engines manually, and most of the dynamic pages will be indexed by the search engines this way. Use appropriate alt and title tags on your static page, and be sure to follow all of the search engine guidelines before submitting your static pages along with the dynamic pages.

b) ColdFusion and other software packages are available that can replace special characters like ?, % and & with alternative text.

c) With the help of CGI/Perl, we can convert the query string in the URL into readable path text. PATH_INFO and SCRIPT_NAME are environment variables that contain the complete URL, including the strings passed to dynamic pages (see the sketch after this list).

d) URL rewriting is another method: convert the dynamic URL to a static-looking URL. If you are using the Apache server, mod_rewrite is used to rewrite the dynamic URL; on Windows (IIS), an ISAPI rewrite filter is used. In addition, there is plenty of software to convert your dynamic URLs to static ones. For example:

Dynamic URL - http://www.seoinfotech.com?a=seo&b=services&c=India

After rewriting it looks like:

Static URL - http://www.seoinfotech.com/seo/services/india.html

This is easy to read and easy for spiders to crawl (see the mod_rewrite sketch after this list).
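For tip (c), the text names CGI/Perl; here is a sketch of the same PATH_INFO idea written in Python instead, with made-up segment names (category/service/region). Requested as /cgi-bin/page.py/seo/services/india, the web server places /seo/services/india into the PATH_INFO environment variable, so no query string is needed:

```python
#!/usr/bin/env python
# CGI sketch: read URL segments from PATH_INFO instead of a query string.
import os

path = os.environ.get("PATH_INFO", "")      # e.g. "/seo/services/india"
parts = [p for p in path.split("/") if p]   # ["seo", "services", "india"]
category, service, region = (parts + ["", "", ""])[:3]

print("Content-Type: text/html\n")
print(f"<h1>{category} {service} in {region}</h1>")
```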
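And for tip (d), a mod_rewrite rule on Apache might look like the following .htaccess sketch; the script name index.php is an assumption, while the parameters a, b and c match the example URL above:

```
# .htaccess sketch: serve the static-looking URL
# /seo/services/india.html from the underlying dynamic script.
RewriteEngine On
RewriteRule ^([^/]+)/([^/]+)/([^/]+)\.html$ index.php?a=$1&b=$2&c=$3 [L,QSA]
```

With a rule like this in place, visitors and spiders only ever see the static-looking URL, while the dynamic script quietly keeps doing the work.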

Tips to Optimize Dynamic Websites:

Now you know what makes it hard for search engine bots/crawlers to index your website. What you need to know next is how to get your valuable website indexed by the search engines; the more of your web pages are indexed, the better your website will impress search engines.

1. Create an HTML sitemap with 100 text links or less. If you have more than 100 links, break the sitemap into more than one web page (a small generator script is sketched after this list)

2. A Google Sitemap will also be an advantage, especially if your website is big and dynamic (the XML format is shown after this list)

3. Get inbound links deep into your website from other relevant websites such as directories, classified directories, and vertical industry portals

4. Convert dynamic web pages into static-looking web pages with the help of URL-rewriting techniques, as in the mod_rewrite sketch shown earlier

5. You can use plug-in applications that will change your existing dynamic URLs into static ones; especially for shopping carts there are plenty of applications available

6. Avoid using session IDs in the URL, especially when the user has not logged in

7. If you do need to include parameters, limit them to two and limit the number of characters per parameter to ten or less

8. If you have a small dynamic website and enough time, you can apply this technique: go page by page through your website, view the source code, and save each page as a new static page with a .htm or .html extension (a short script that automates this is sketched below)
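For tip 1, a sitemap generator only needs to chunk the link list into groups of 100. A minimal sketch, with placeholder URLs and file names:

```python
# Split a list of site URLs into HTML sitemap pages of 100 links each.
urls = [f"http://www.example.com/page{i}.html" for i in range(1, 251)]

for n, start in enumerate(range(0, len(urls), 100), start=1):
    chunk = urls[start:start + 100]
    links = "\n".join(f'<li><a href="{u}">{u}</a></li>' for u in chunk)
    with open(f"sitemap-{n}.html", "w") as f:
        f.write(f"<html><body><ul>\n{links}\n</ul></body></html>")

# 250 URLs -> sitemap-1.html, sitemap-2.html, sitemap-3.html
```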
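For tip 2, a Google Sitemap is just an XML file listing your URLs in the sitemaps.org format; the URL below is a placeholder:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/seo/services/india.html</loc>
  </url>
</urlset>
```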
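And tip 8 does not have to be done by hand, page by page. A sketch that fetches each dynamic page and saves a static copy, assuming you already know the list of dynamic URLs (the URLs and file names here are placeholders):

```python
import urllib.request

# Fetch each dynamic page and save its HTML as a static file.
pages = {
    "products.html": "http://www.example.com/index.php?page=products",
    "contact.html":  "http://www.example.com/index.php?page=contact",
}

for filename, url in pages.items():
    html = urllib.request.urlopen(url).read()
    with open(filename, "wb") as f:
        f.write(html)
```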