Showing posts with label Tips and Tricks. Show all posts

Tuesday, July 08, 2008

10 Tips to Avoid SEO Spam

In the SEO fraternity, the word spam describes unethical techniques and practices adopted to boost search engine rankings artificially. Search engines treat these tactics as serious violations and ban the sites that adopt them.

The word spam itself has come to describe the dark and often deceptive side of everything from email marketing to abusive forum behaviour.

In this post I am listing some techniques which are treated as spam by the search engines and should always be avoided while promoting any website.

1. Cloaking
Cloaking is the technique that involves serving one set of information to search engine robots or crawlers and an entirely different set of information to the general visitors.

2. IP Delivery
IP delivery is a simple form of cloaking which involves serving targeted or different content to users based on their IP address. If you need to use this technique in some geolocation-specific cases, make sure that search bots see the same content as a typical user from the same IP address would see.

3. Leader Pages
Leader pages are a series of similar documents, each designed to meet the requirements of a different search engine algorithm. The major search engines consider them spam because they see multiple instances of what is virtually the same document.

4. Mini-Site networks
Mini site networks were designed to exploit a critical vulnerability in early versions of Google's PageRank algorithm. These are very much like leader pages and involve the creation of several topic or product related sites all linking back to a central sales site. Each mini-site is designed to meet specific requirements of each major search engine and has its own keyword enriched URL.

5. Link Farms
Link farms are groups of websites in which every page links to every other page in the group. Most link farms are created through automated programs and are a form of spamming a search engine's index, also referred to as spamdexing.

6. Blog and/or Forum Spam
Blogs and forums are essential communication platforms on the World Wide Web. In some cases they establish high PR values for their documents, which makes them targets for unethical SEOs looking for high-PR links back to their own websites or those of their clients. Google in particular has clamped down on blog and forum abuse.

7. Keyword Stuffing
At one time, search engine algorithms ranked websites almost entirely on the number of keywords found on their pages. That limitation led webmasters to stuff keywords everywhere they possibly could. Overusing keywords on a page in this way is keyword stuffing, and it is treated as spam.
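There is no official stuffing threshold, but a rough keyword-density check can reveal when a page is over-optimized. A minimal sketch in Python; the sample page text, and the rule of thumb that anything far above a few percent looks stuffed, are illustrative assumptions:

```python
import re

def keyword_density(text, keyword):
    """Return the percentage of words in `text` equal to `keyword`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

page = ("Cheap phones. Buy cheap phones here. "
        "Cheap phones are cheap. Cheap cheap cheap.")
density = keyword_density(page, "cheap")
# A density this high is a classic sign of keyword stuffing.
```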

8. Hidden Text
There are two types of hidden text. The first is text in the same colour and shade as the background, which makes it invisible to human visitors but not to search robots. The second is text hidden behind images or under document layers. Search engines dislike both forms of hidden text and devalue the pages containing them.

9. Hidden Tags
There are various types of tags, such as comment tags, which are sometimes used by SEOs to increase keyword density on a page. These should be avoided since they are treated as spam techniques.

10. Misuse of Web 2.0 Formats (ie: Wiki, Social Networking and Social Tagging)
A very common form of SEO spam is the misuse of user-input media formats such as Wikipedia or social networking and bookmarking sites. As with blog comment spamming, the instant live-to-web nature of Web 2.0 formats provides an open range for SEO spam technicians.

Many of these exploits may find short-term success, but in the long run they are always penalized by the search engines.

Thursday, July 03, 2008

Image Optimization Tips for Search Engines

With the growing impact of visual trends, the importance of images on a website, and of image search, has increased drastically. Image search can be defined as query results, accompanied by thumbnail graphics and supplemented by contextual information, that best match users' search queries.


In this post, I have listed a few points which will help you optimize your images for search engines and rank them in image search.

1. Place the images where image search results appear and where they are indexable into the general search engines' contextual results, including:

  • Major Search Engines - either within contextual search results or vertical image search
  • Photo sharing sites (Flickr, Webshots, PBase, Fotki)
  • Social image sharing sites (MySpace, Facebook)

2. Take original photos, so that you can brand them with your trademark, logo or URL. On business listing sites, add your business logo to create a stronger impression on users' minds.


3. Use good quality pictures and images, and make necessary resolution adjustments between full size images and thumbnails. Pictures with good contrast tend to work better.

4. Save your photos as JPG files, and other graphic images as GIFs.


5. Give appropriate file names to your images that match and represent the theme. Image names should be descriptive such as mobile-phone.jpg rather than untitled1.jpg.


6. Give appropriate alt tag and title tag to the images.


7. Use clear images, as distorted images cannot convey their message.


8. Keep image file sizes small, as heavy images increase page loading time and consume extra bandwidth.


9. Always specify the width and height of an image when you define it on a webpage. If you omit them, the browser has to work out the image size itself, which adds extra time to page layout.
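If the dimensions aren't handy, they can be read straight from the image file itself. A minimal sketch, assuming a PNG file (the file name photo.png and the generated tag are placeholders), that pulls width and height from the PNG header so they can be written into the img tag:

```python
import struct

PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"

def png_dimensions(data):
    """Read width and height from the IHDR chunk, the first chunk
    in every PNG file (bytes 16-24 of the file)."""
    if data[:8] != PNG_SIGNATURE or data[12:16] != b"IHDR":
        raise ValueError("not a PNG file")
    width, height = struct.unpack(">II", data[16:24])
    return width, height

# A minimal fake PNG header for illustration: signature, IHDR chunk
# length (13), chunk type, then an 800x600 image size.
header = (PNG_SIGNATURE + struct.pack(">I", 13) + b"IHDR"
          + struct.pack(">II", 800, 600))
w, h = png_dimensions(header)
img_tag = '<img src="photo.png" width="%d" height="%d" alt="photo">' % (w, h)
```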


10. Content-rich pages embedded with pictures have a better chance of ranking for both those images and the pages themselves. For example, if you provide services, add photos of smiling employees; if you sell products, include images of all the products on the site. So add images to content-rich pages.


11. Add a map image, or a link to a map, from your site pages. It increases your site's usability.


12. In the alt tag of map images, add the physical address of your business.


13. You can also add testimonials from customers, celebrity snaps or award-winning snapshots. This builds trust and attracts more visitors.


14. Add your logo on press releases and link them to your home page.


15. While submitting your site to directories, add logo to your profile.


16. Upload images to Google Base.


17. Include images and logo into your newsletter (email marketing campaign) and get linked from them.


18. Get more links from clients, partners or B2B sites through images.


19. Use thumbnails (i.e. small size images) instead of large images and get the large images linked from those thumbnails. It will decrease the load time of your web page and also increase its usability.


20. Enable enhanced image search in Google Webmaster Tools in order to add valuable tags to your images in Google Image Search.


21. Link every image, as hyperlinked images have a much better chance of being included in image searches, even if they link to themselves.


22. Bookmark your images using social networking sites such as Facebook, Digg etc.


23. Search engines also look at the text surrounding a graphic image to determine relevancy, so take care with the surrounding text and use related text near the images.


24. Do not exclude your graphic images directory from search robots or limit search engine access to graphic-image files.


25. Don't use JavaScript code to display the large version of an image. Search engines still can't understand JavaScript completely, so never do this.


Image Search Engines:-

http://images.google.com/

http://www.live.com/?&scope=images

http://www.exalead.com/image/results?q=

http://www.pixsy.com/

http://www.picsearch.com/

http://www.altavista.com/image/

http://www.ithaki.net/images/

http://www.graphicsfactory.com/

http://www.ditto.com/

http://pro.corbis.com/

http://www.animationfactory.com/en/

http://www.faganfinder.com/img/ (Specifies all search engines, Stock photographs, graphics and clip-art, photo sharing sites and artwork related images sites)

http://images.search.yahoo.com/

http://www.ask.com/?#subject:img|pg:1

http://www.fastimagesearch.com/

www.fotosearch.com

www.webplaces.com/search

Friday, June 06, 2008

How to Get Listed in DMOZ Directory – The Open Directory Project (ODP)

DMOZ is the largest and most comprehensive human-edited open directory on the web. Being listed in this directory definitely helps Google rankings, but getting in can take a very long time. This is often because webmasters fail to fulfill the criteria and submission guidelines, which leads to rejection. Websites must fulfill certain criteria before being submitted to DMOZ, and they are manually reviewed for quality and relevance.

Although the directory plays very little role in generating traffic, as not many people actually use DMOZ for searches, a DMOZ listing can improve your search engine rankings and Google PageRank dramatically. The DMOZ directory data is syndicated throughout hundreds of web directories, and even Google uses this data.

As we all know, PageRank is an integral part of Google's ranking algorithm, and higher PageRank helps in getting higher rankings in Google SERPs. A listing in DMOZ creates two significant backlinks for your website: one from DMOZ itself and one from the Google Directory. It also gets your website links from the thousands of small sites which download and use the DMOZ directory data.

Now, there are some points to remember while submitting a site to DMOZ. It rejects spammy/MFA sites, sites with too many affiliate links and sites without unique content. Another major reason a site may be rejected is failure to adhere to the submission guidelines: if the title and description provided in the submission don't follow the Open Directory's guidelines, the site gets rejected. While submitting a site, it is essential to read and follow the submission guidelines.

So, how can you submit a website to DMOZ? According to some DMOZ editors, it is a 5-step process. These steps are not official DMOZ submission guidelines, but they may help. Here are those 5 steps…

  1. Read the submission guidelines carefully: The Open Directory does not include mirror sites, sites with duplicate and illegal content, or sites consisting largely of affiliate links.

  2. Try to choose and submit in the regional section rather than the main one.

  3. Submit in the appropriate category under which your website belongs.

  4. Follow the guidelines while creating the Title and description for submission.

  5. Make the title official: Sometimes a non-promotional description works. Experts advise using business names as titles.


So, follow the steps listed above and get your website listed in the DMOZ directory. And yes, don't forget to continue working on your site to get the desired results.

Wednesday, May 21, 2008

XML Sitemaps

A few days back, I published a post on 10 basic thumb rules for SEO success. I received many queries regarding the 7th point, i.e. the XML sitemap. So I am publishing this post on XML sitemaps for those who want to learn more about them.

What is a Sitemap?

Sitemaps are an easy way for webmasters to inform the Search Engines about all the pages on their websites that are available for indexing or crawling. These are basically the tree structure of the website showing the hierarchy of the pages and the clear structure of website architecture and navigation. Usually Search Engine crawlers discover a new page from the links within the site and from other sites. Creation of sitemaps helps in providing this data to the crawlers.

Sitemaps can be of two types: HTML and XML. HTML sitemaps are simple HTML files containing links to the individual pages of the website.

XML Sitemap

In its simplest form, an XML Sitemap is an XML file that lists the URLs of a site along with additional metadata about each URL: the last modified date, the change frequency, and the importance of the page relative to the other URLs on the site. This is additional information for the search engine crawlers.

The Sitemap protocol format consists of XML tags. The file must be UTF-8 encoded and data values in the Sitemap must be entity-escaped. The XML Sitemap must:

  • Begin with an opening <urlset> tag and end with a closing </urlset> tag

  • Specify the namespace (protocol standard) within the <urlset> tag

  • Include a <url> entry for each URL, as a parent XML tag

  • Include a <loc> child entry for each <url> parent tag

All other tags like <lastmod>, <changefreq> and <priority> are optional, and support for these optional tags may vary among search engines.

Another very important aspect to be remembered is that all URLs in a Sitemap must be from a single host, such as www.example.com.

A Sample XML Sitemap

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9
http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd">
   <url>
      <loc>http://www.example.com/</loc>
      <lastmod>2005-01-01</lastmod>
      <changefreq>monthly</changefreq>
      <priority>0.8</priority>
   </url>
</urlset>
A Sample XML Sitemap with All Attributes

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
   <url>
      <loc>http://www.example.com/</loc>
      <lastmod>2008-05-21</lastmod>
      <changefreq>monthly</changefreq>
      <priority>0.8</priority>
   </url>
   <url>
      <loc>http://www.example.com/page1.html</loc>
      <changefreq>weekly</changefreq>
   </url>
   <url>
      <loc>http://www.example.com/page2.html</loc>
      <lastmod>2008-05-21</lastmod>
      <changefreq>weekly</changefreq>
   </url>
   <url>
      <loc>http://www.example.com/page3.html</loc>
      <lastmod>2008-04-20T18:00:15+00:00</lastmod>
      <priority>0.3</priority>
   </url>
   <url>
      <loc>http://www.example.com/page4.html</loc>
      <lastmod>2008-03-21</lastmod>
   </url>
</urlset>
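A Sitemap like the samples above can also be generated programmatically. A minimal sketch using Python's standard library; the entry data is illustrative:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """Build an XML Sitemap from dicts holding a required 'loc' and
    optional 'lastmod', 'changefreq' and 'priority' keys."""
    urlset = ET.Element("urlset", {"xmlns": SITEMAP_NS})
    for entry in entries:
        url = ET.SubElement(urlset, "url")
        for tag in ("loc", "lastmod", "changefreq", "priority"):
            if tag in entry:
                ET.SubElement(url, tag).text = str(entry[tag])
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    {"loc": "http://www.example.com/", "lastmod": "2008-05-21",
     "changefreq": "monthly", "priority": "0.8"},
    {"loc": "http://www.example.com/page1.html", "changefreq": "weekly"},
])
```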

Using Sitemap index files (to group multiple sitemap files)

You can also provide multiple Sitemap files, but each Sitemap file must contain no more than 50,000 URLs and must be no larger than 10MB. To list more than 50,000 URLs, you must create multiple Sitemap files.

If you create multiple Sitemaps, you should list each Sitemap file in a Sitemap index file.

The Sitemap index file must:
  • Begin with an opening <sitemapindex> tag and end with a closing </sitemapindex> tag

  • Include a <sitemap> entry for each Sitemap, as a parent XML tag

  • Include a <loc> child entry for each <sitemap> parent tag

  • The optional <lastmod> tag is also available for Sitemap index files
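Splitting a large URL list into protocol-sized files and building the index can be sketched in Python (standard library only; the URLs and Sitemap file names are illustrative):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS = 50000  # protocol limit per Sitemap file

def chunk_urls(urls, size=MAX_URLS):
    """Split a long URL list into Sitemap-sized chunks."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

def build_sitemap_index(sitemap_locations):
    """Build a Sitemap index listing each child Sitemap's URL."""
    index = ET.Element("sitemapindex", {"xmlns": SITEMAP_NS})
    for location in sitemap_locations:
        entry = ET.SubElement(index, "sitemap")
        ET.SubElement(entry, "loc").text = location
    return ET.tostring(index, encoding="unicode")

all_urls = ["http://www.example.com/page%d.html" % i for i in range(120000)]
chunks = chunk_urls(all_urls)  # 120,000 URLs -> 3 Sitemap files
index_xml = build_sitemap_index(
    ["http://www.example.com/sitemap%d.xml" % i
     for i in range(1, len(chunks) + 1)])
```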

Syndication Feeds

An RSS (Really Simple Syndication) 2.0 or Atom 0.3 or 1.0 feed can also be provided, which is generally done when the site already has a syndication feed. Make sure that the feed is located in the highest-level directory. Search engines extract the information from the feed as follows:
  • <link> field - indicates the URL
  • modified date field (the <pubDate> field for RSS feeds and the <modified> or <updated> date for Atom feeds) - indicates when each URL was last modified
Use of the modified date field is optional.

Text File

You can provide a simple text file that contains one URL per line. The following guidelines must be followed when creating a text file:

  • The text file must have one URL per line. The URLs cannot contain embedded new lines

  • You must fully specify URLs, including the http

  • Each text file can contain a maximum of 50,000 URLs and must be no larger than 10MB (10,485,760 bytes). If your site includes more than 50,000 URLs, you can separate the list into multiple text files and add each one separately

  • The text file must use UTF-8 encoding. You can specify this when you save the file (for instance, in Notepad, this is listed in the Encoding menu of the Save As dialog box)

  • The text file should contain no information other than the list of URLs

  • The text file should contain no header or footer information

  • If you would like, you may compress your Sitemap text file using gzip to reduce your bandwidth requirement

  • You should upload the text file to the highest-level directory you want search engines to crawl and make sure that you don't list URLs in the text file that are located in a higher-level directory
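The guidelines above can be sketched as a small Python helper (standard library only; the URLs are placeholders):

```python
import gzip

def write_text_sitemap(urls, gzipped=False):
    """Serialize a plain-text Sitemap: one fully specified URL per
    line, UTF-8 encoded, at most 50,000 URLs, optionally gzipped."""
    if len(urls) > 50000:
        raise ValueError("50,000-URL limit: split into multiple files")
    for url in urls:
        if not url.startswith(("http://", "https://")):
            raise ValueError("URLs must be fully specified: %r" % url)
    body = ("\n".join(urls) + "\n").encode("utf-8")
    return gzip.compress(body) if gzipped else body

data = write_text_sitemap(["http://www.example.com/",
                           "http://www.example.com/page1.html"])
```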

Location of a Sitemap File

The location of a Sitemap file determines the set of URLs that can be included in that Sitemap. An XML Sitemap located at http://www.example.com/dir1/sitemap.xml can include any URLs starting with http://www.example.com/dir1/ but not URLs starting with http://www.example.com/dir2/.

So, the Sitemap should always be located under the root directory to include all the pages of the website.

Informing the Search Engines

After creating the Sitemap and placing it on the web server, you must inform the search engines that support this protocol of its location. This can be done by:
  • Submitting it to the search engine via their submission interface

  • Specifying the location in the robots.txt file, e.g. Sitemap: http://www.example.com/sitemap.xml

  • Sending an HTTP request

The search engines can then retrieve the Sitemap and make the URLs available to their crawlers.
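The HTTP-request option can be sketched as follows. The /ping endpoint shown is an assumption modelled on the interface Google and others have offered, so check each engine's documentation for the exact URL:

```python
from urllib.parse import urlencode

def sitemap_ping_url(engine_base, sitemap_url):
    """Build the HTTP GET URL used to notify a search engine of a
    Sitemap's location (the Sitemap URL must be percent-encoded)."""
    return engine_base + "?" + urlencode({"sitemap": sitemap_url})

ping = sitemap_ping_url("http://www.google.com/ping",
                        "http://www.example.com/sitemap.xml")
# Fetching `ping` with any HTTP client (e.g. urllib.request.urlopen)
# performs the actual notification; no request is made here.
```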

Wednesday, April 09, 2008

Successful SEO Campaign - 10 Basic Thumb Rules

Search Engine Optimization is the stepping stone for developing internet marketing strategies for any website. It lays the foundation for successful web marketing and advertising. If you want to increase the online presence of your website, SEO is the first step. The basic performance of any SEO campaign can be measured by examining the website's ranking in the Search Engine Result Pages, or simply SERPs. SEO is an ever-evolving, fast-moving field. Several techniques can accomplish the task, but the big challenge is deciding which techniques should be adopted for a particular campaign to get the desired results. The best practice is to plan properly before proceeding with the SEO process and to adopt a combination of the most suitable techniques, which depends entirely on the website's theme and goals. Success depends on how intelligently you use each technique.

The new and evolving trends are always good to try and experiment with, but don't forget to implement the basics of SEO. There are 10 basic thumb rules which, in my opinion, should always be followed to create a solid foundation. These are:

1. Unique Titles and Meta Tags: All the pages should contain unique and descriptive titles and meta tags.

2. Original Content: There is a famous saying in the SEO fraternity: content is king. So always put unique, high-quality and informative content on your website.

3. Quality Link Backs: Get quality backlinks for your website. Don't chase junk links in high quantity.

4. Ethical Approach: Don’t try to fool the Search Engines. Always adopt the ethical approach to optimize your website. It may take some time to show in the SERPs, but believe me the results will be long lasting.

5. Website Architecture: Have a proper architecture and navigation for your website. It helps the Search Engine spiders to do deep crawling of the site.

6. No Unnecessary Code: Remove all the junk or unnecessary code from the site. Make it a neat and clean website.

7. XML Sitemaps: Always create and submit xml sitemaps at regular intervals for proper indexing of the website.

8. No Broken Links: Check and remove all the broken or not found links from your website at regular intervals.

9. Regular Updates: Search engines love websites which have something new to offer visitors. They do not like dead sites. Update your website regularly with fresh and original content.

10. Be Consistent: Consistency is the key to success. So keep on working on your SEO project to reap the benefits.

The 10 points mentioned above are the basic SEO ranking factors and, if followed with proper planning, point the way to a successful SEO campaign.

Tuesday, March 18, 2008

25 Free Link Building Tips

25 Great Ways to Get Free One Way Backlinks for Your Website

  1. Write articles and submit in niche article directories with a link back to your website in the resource section.
  2. Write informative articles so that other sites reprint and publish it on their sites with your link in the resource.
  3. Participate in forums and create a signature link that points to your website.
  4. Comment on other blogs with a link back to your website.
  5. Do viral marketing for your website.
  6. Participate in Yahoo! Answers and put a link to your website in the signature.
  7. Participate in Google Groups.
  8. Write news worthy press releases and publish in Online PR sites.
  9. Submit your site to free web directories.
  10. Create a blog and post regularly. The most important thing is to post great content which attracts visitors and gets you links.
  11. Review a product or company and, if your remarks are positive, email the company and ask them to feature your remark in their press section.
  12. Add a link for “bookmark this site” on your site.
  13. Create a freebie product like ebooks, whitepapers, free and original tools etc. to give away.
  14. Hold a competition or opinion poll for the Top 10, 50 or 100 (you can decide the number) websites in your niche. Post the result and watch lots of the sites giving you a link back to say what their position was.
  15. Submit relevant and informative videos with plenty of substance to video sharing sites like YouTube and Metacafe. Don't forget to include a link in the description and within the actual video.
  16. Create pages on places like Squidoo, Hubpages etc. and put links to your website.
  17. Place free classified ads on relevant online classified sites.
  18. Conduct an open survey and publish the results. Also make sure you let people know about it.
  19. Try to get listed in Google’s News search but make sure that your site contains unique and quality content.
  20. If your company is a fairly well-known and reputed one, create a page about it on Wikipedia.
  21. Submit a story to Digg or del.icio.us that links back to an article on your website.
  22. Publish RSS feeds for your content.
  23. Review related websites on Alexa.
  24. Swap some links with relevant partners (but make sure you don't overdo it).
  25. Start networking and creating a positive buzz to attract traffic.
 
