Tuesday, March 18, 2008

Guidelines to Create Search Engine Friendly Websites

The big question today is, “What is Internet Marketing and why do we need it?” Have you ever wondered why everybody nowadays is talking about Internet Marketing, Search Engine Optimization and the like? I think it is because it is the need of the hour: to stay in the market or increase publicity, one must consider this aspect of the web. Today, everybody who wants an online presence has a website. It helps them reach a target audience that is physically far away but, with the help of the Internet, is just one mouse click away. Now what better option is there to reach your target audience than a website or blog of your own? Internet Marketing does not only mean SEO or getting higher rankings in the SERPs; it is much more than that, and I will be talking about all these aspects of Internet Marketing in my blog. But first things first. If you want to market your product, services, brand or simply yourself on the Net, the very first thing you need is a Search Engine friendly website, and to create such a site you need to follow certain guidelines. I have prepared a list of guidelines to follow to create a Search Engine friendly site. The list is based on the Google webmaster guidelines, and you can visit the Google Webmaster Help Center for more help and guidance.

A. Design Guidelines

  1. Websites should be built with a clear hierarchy and text links. Every page should be reachable from at least one static text link.
  2. JavaScript and CSS should be defined in separate files.
  3. Content hidden behind JavaScript toggles (show/hide) should be avoided, as crawlers may not see it.
  4. No inline CSS should be present.
  5. No unnecessary whitespace should be present in the markup.
  6. Link depth should not exceed three levels.
  7. Nested tables should be avoided; a DIV-based layout is best from an SEO point of view because search engine crawlers can crawl it more easily.
  8. No hidden text or hidden links should be present on the page. (Sometimes this happens by mistake due to CSS.)
  9. No broken links or links returning 404 errors should be present.
  10. Text should be used instead of images to display important names, content or links. The Google crawler does not read or recognize text contained in images.
  11. Pop-ups should be avoided.
  12. Uniformity should be maintained throughout the website.
  13. All pages should be W3C validated.
  14. Pages should not exceed 100 KB.
  15. A sensible image-to-text ratio should be maintained and overuse of images avoided.
  16. Maintain a healthy code-to-text ratio.
  17. Frames should be avoided when building websites.
  18. Flash should be used minimally.
  19. There should be no doorway pages.
  20. Multiple copies of a page with different URLs should not be created. Many sites offer text-only or printer-friendly versions that contain the same content as the corresponding graphic-rich pages. If the need for such pages arises, then, to ensure that your preferred page is included in Google search results, block the duplicates from spiders using a robots.txt file.
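The duplicate-page advice in point 20 can be put into practice with a short robots.txt file. This is a minimal sketch; the /print/ and /text-only/ directory names are hypothetical examples of where such duplicates might live.

```
User-agent: *
Disallow: /print/
Disallow: /text-only/
```

With a file like this at the site root, compliant crawlers skip the duplicate versions and only the preferred graphic-rich pages get indexed.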

B. Development & Technical Guidelines:

  1. JavaScript should be defined in separate files.
  2. Try to reduce code by using functions and include files.
  3. The database should be optimized so that response time decreases and pages load faster.
  4. Search bots should be allowed to crawl the website without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different: using them may result in incomplete indexing of the site, as search engine bots may not be able to eliminate URLs that look different but actually point to the same page.
  5. Make use of a robots.txt file on the web server. This file tells crawlers which directories can or cannot be crawled. Don't accidentally block the Googlebot crawler. You may visit http://www.robotstxt.org/wc/faq.html to learn how to instruct robots when they visit your site. You can also test your robots.txt file with the robots.txt analysis tool available in Google Webmaster Tools.
  6. Don't attempt to "cloak" pages or put up "crawler-only" pages.
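The session-ID problem in point 4 can be mitigated by serving crawlers clean URLs. As a minimal sketch (the parameter names sid, sessionid and phpsessid below are examples, not a standard list; use whatever your site actually emits), a URL can be normalized like this:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Query parameters that only track a visit, not identify content
# (example names — adjust to your own site's session parameters).
TRACKING_PARAMS = {"sid", "sessionid", "phpsessid"}

def strip_session_ids(url):
    """Return the URL with session-tracking query parameters removed."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

print(strip_session_ids("http://example.com/page?sid=abc123&cat=shoes"))
# http://example.com/page?cat=shoes
```

Two URLs that differ only in their session ID then collapse to one address, so the crawler sees a single page instead of many apparent duplicates.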

C. Content Guidelines:

  1. Content should be original and unique.
  2. SEO copywriting should be practiced while developing the content for the websites.
  3. Don’t load pages with irrelevant keywords or phrases.
  4. Don’t create pages with duplicate content.
  5. Don't stuff the pages with lists of keywords.
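As a rough sanity check for points 3 and 5, you can measure how often a keyword appears relative to the total word count. This is an illustrative sketch only; search engines publish no exact density threshold, so treat an unusually high figure as a warning sign rather than a rule.

```python
import re
from collections import Counter

def keyword_density(text, keyword):
    """Fraction of all words in `text` that are `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

page = "buy shoes online, best shoes, cheap shoes, shoes shoes shoes"
print(f"{keyword_density(page, 'shoes'):.0%}")  # 60%
```

A page where a single keyword makes up more than half the text, as in this deliberately stuffed example, is exactly what guideline 5 warns against.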

D. Server Administration:

D.1 Technical Server Side Factors:

  1. The web server should support the If-Modified-Since HTTP header. This allows the server to tell Google whether the website content has changed since the crawler last visited, which also saves bandwidth and overhead.
  2. Duplicate URLs should be canonicalized using 301 permanent redirects. (http://example.com/ and http://www.example.com/ are different URLs in the eyes of the search engines.)
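On an Apache server, the 301 canonicalization above can be done with mod_rewrite in an .htaccess file. This is a minimal sketch, assuming Apache with mod_rewrite enabled and that the www version is your preferred URL; other servers need their own equivalent.

```apache
# .htaccess — send non-www requests to www with a 301 (permanent) redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The R=301 flag tells search engines the move is permanent, so they consolidate ranking signals onto the preferred URL.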

D.2 Steps for Switching Hosting or Changing IP:

  1. Bring a copy of the site up at the new IP address.
  2. Update the DNS records so the domain points to the new IP address.
  3. Once the pages from the new IP address are fetched by the Search Engine spiders (typically within 24-48 hours), it's safe to take down the copy of the site at the old IP address.
