Monday, July 28, 2008

Alexa Ranks – How Much Are They Worth?

As an Internet Marketing Consultant, I have come across many clients who are very concerned about the Alexa ranking of their own and their competitors’ websites. As a professional Search Marketer, my personal opinion is that Alexa ranking hardly matters when measuring the success of an internet marketing plan for any website.

What Are Alexa and the Alexa Rank? Alexa, a subsidiary of Amazon.com, is a website that provides information on traffic levels for any website, and the Alexa rank is measured according to the number of users who visit a website with the Alexa toolbar installed.

Alexa's ranking scheme is based on the level of traffic each website receives from visitors who have the Alexa toolbar installed.

This traffic rank is based on three months of aggregated historical data that is gathered from millions of Alexa Toolbar users. This ranking is a combined measure of page views and visitors and is based on the geometric mean of these two quantities averaged over time. A lower Alexa number means a greater level of traffic. Alexa simply provides a rough snapshot of what is going on and does not mean that much by itself.
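The combined measure described above can be sketched in a few lines of Python. The function name and the sample figures below are illustrative assumptions, not Alexa's actual code; the point is simply how a geometric mean of page views and visitors, averaged over the window, behaves:

```python
import math

# Hypothetical sketch of the rank metric described above: the geometric
# mean of daily page views and daily visitors, averaged over the window.
def traffic_rank_score(daily_pageviews, daily_visitors):
    daily_scores = [math.sqrt(pv * uv)
                    for pv, uv in zip(daily_pageviews, daily_visitors)]
    return sum(daily_scores) / len(daily_scores)

# A higher score corresponds to more traffic, and hence a lower
# (better) Alexa rank number.
small_site = traffic_rank_score([100, 120], [40, 50])
big_site = traffic_rank_score([10000, 12000], [4000, 5000])
```

The geometric mean keeps one inflated quantity (say, a page-view spike) from dominating the score, which is why it is used instead of a plain average.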

Problems with the Alexa Rank

The main problem with Alexa ranking is that it is heavily skewed towards websites with large webmaster or web-savvy audiences, since those visitors are much more likely to have installed the Alexa toolbar than visitors who are unaware of Alexa. This makes Alexa a vastly inaccurate method of measuring a website’s reach, traffic and potential.

But unfortunately, Alexa is still heavily used by webmasters and ad networks when measuring the value of advertising on a website, and it is considered a central element in site monetization strategies.

However, if you really want to increase your Alexa rank, watch this space for the next post, where I shall explain how to get started with Alexa and give you some handy tips and strategies that can dramatically increase your Alexa Rank in both the short and the long run.

Thursday, July 10, 2008

Link Baiting

Link baiting is simply getting more and more inbound links, with a twist: rather than hunting out links, link baiting brings the links to you through unique and popular site content. Matt Cutts defines link bait as anything “interesting enough to catch people’s attention, and that doesn’t have to be a bad thing.” Link baiting involves producing link-worthy content, video or images, which in turn generates a massive number of one-way inbound links.

Link baiting has long been regarded as a black-hat technique, but if it is interesting information or fun, it doesn’t have to have negative connotations. Content can be both white hat and interesting enough to generate buzz. Floating information or ideas that people talk about is a sure-shot method of generating links.

You don't have to be a big brand or have a lot of money to create link bait. Many people have created link bait on purpose and many have created link bait without even knowing it. For search marketers, the techniques, if handled properly in an ethical manner, can be quite helpful in producing good quality one way links.

How Does Link Bait Work?

Link baiting is just like fishing. You send out bait (content) into the pond (the internet) and patiently wait for a bite (a linkback). The article is the bait, and the link is the catch. A properly created page can attract a huge number of links on its own with little to no effort from you.

The Hooks

Before any real ‘fishing’ can take place you need a link baiting hook, and they come in all shapes and sizes:
  • News Hooks
  • Resource Hooks
  • Contrary Hooks
  • Attack Hooks
  • Humor Hooks
News Hooks

For this to work you almost always have to break the ice on the matter; that means you have to be the first to break the news. Whether it’s the latest gadget or the latest Britney Spears goof, you have to be first, and the news should be big enough to generate sufficient curiosity.

Resource Hooks

A resource hook is more of an informational page. Do some research and build a great, unique and remarkable article in a field in which you’re an expert. It’s important that you know what you’re talking about, or else you’ll get trolled.

Contrary Hooks

Contrary hooks are when you contradict someone else’s statement. It should be about someone prominent in the industry and it should be controversial.

Attack Hooks

Attack hooks are the next level of contrary hooks: rather than just debunking someone’s theories, you launch personal attacks on the person.

Humor Hooks

This is the easiest type of link bait: just blog about a funny video or story and you can get some major exposure. But it isn’t as easy as it looks; the niche is really crowded, which means you need something truly funny on your hands, and a lot of luck.

10 Great Examples of Link Baiting:

Rhea Drysdale - SEO Dream Team
Mixed Markets Top 10 Reasons Why This List Will Be Popular on Digg
SEOmoz Top 100 Digg Users Control 56% of Digg’s HomePage Content
The Hot 100 List
SEOmoz’s Web 2.0 Awards
Lifehacker Top 10 iPhone applications
Performancing 10 Articles All Bloggers Should Read (at least once)
Engadget Blu-ray vs HD DVD: State of the Division
2decides 2008 Presidential Election Candidates on the Issues
SingleGrain 301 Useless Facts

Tuesday, July 08, 2008

10 Tips to Avoid SEO Spam

In the SEO fraternity, the word Spam describes unethical techniques and practices adopted to boost search engine rankings artificially. These techniques are a real no-no for the search engines, and sites that adopt them are banned for using unethical business practices.

The word Spam itself is used to describe the dark and often deceptive side of everything from email marketing to abusive forum behaviour.

In this post I am listing some techniques which are treated as Spam by the SEs and should always be avoided while promoting any website.

1. Cloaking
Cloaking is the technique that involves serving one set of information to search engine robots or crawlers and an entirely different set of information to the general visitors.
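Purely as an illustration of what the technique looks like (the bot signatures below are examples, and this is the sort of code search engines ban, not something to deploy), cloaking in its simplest form is just a branch on the visitor's user agent:

```python
# Substrings that identify common crawlers (illustrative, not exhaustive)
BOT_SIGNATURES = ("Googlebot", "Slurp", "msnbot")

def is_search_bot(user_agent):
    """Crude user-agent sniffing, as a cloaker would do it."""
    return any(sig in user_agent for sig in BOT_SIGNATURES)

def choose_content(user_agent):
    """Serve keyword-stuffed copy to crawlers and normal copy to humans.
    This is exactly the behaviour search engines penalize."""
    if is_search_bot(user_agent):
        return "keyword-stuffed page built only for the crawler"
    return "regular page shown to human visitors"
```

Search engines detect this by occasionally crawling with a browser-like user agent and comparing the two responses, which is why cloaking is so risky.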

2. IP Delivery
IP delivery is a simple form of cloaking which involves serving targeted or different content to users based on their IP address. If you need to use this technique in geolocation-specific cases, make sure that search bots see the same content as a typical user from the same IP address would see.

3. Leader Pages
Leader pages are a series of similar documents, each of which is designed to meet the requirements of a different search engine algorithm. They are considered spam by the major search engines, which see multiple incidents of what is virtually the same document.

4. Mini-Site networks
Mini site networks were designed to exploit a critical vulnerability in early versions of Google's PageRank algorithm. These are very much like leader pages and involve the creation of several topic or product related sites all linking back to a central sales site. Each mini-site is designed to meet specific requirements of each major search engine and has its own keyword enriched URL.

5. Link Farms
Link farms are groups of websites in which every page links to every other page in the group. Most link farms are created by automated programs and are a form of spamming a search engine's index, also referred to as spamdexing.

6. Blog and/or Forum Spam
Blogs and forums are amazing platforms for communication over the World Wide Web. In some cases, blogs and forums establish high PR values for their documents, which makes them targets for unethical SEOs looking for high-PR links back to their own websites or those of their clients. Google in particular has clamped down on blog and forum abuse.

7. Keyword Stuffing
At one time, search engine algorithms ranked websites almost entirely on the number of keywords found on their pages. That limitation led webmasters to stuff keywords everywhere they possibly could. Overusing keywords on a page in this way is keyword stuffing, and it is treated as spam.

8. Hidden Text
There are two types of hidden text. The first is text in the same colour and shade as the background, which makes it invisible to human visitors but not to search robots. The second is text hidden behind images or under document layers. Search engines dislike both forms of hidden text and devalue pages containing them.

9. Hidden Tags
There are various types of tags, such as comment tags, which are sometimes used by SEOs to increase keyword density on a page. These should be avoided since they are treated as spam techniques.

10. Misuse of Web 2.0 Formats (i.e. Wikis, Social Networking and Social Tagging)
A very common form of SEO spam is the misuse of user-input media formats such as Wikipedia or social networking and bookmarking sites. Like blog comment spamming, the instant live-to-web nature of Web 2.0 formats provides an open range for SEO spam technicians.

Many of these exploits may find short-term success, but in the long run they are penalized by the search engines.

Thursday, July 03, 2008

Image Optimization Tips for Search Engines

With the rising impact of visual trends, the importance of images on a website, and of image search, has drastically increased. Image search can be defined as query results that are accompanied by thumbnail graphics and supplemented by contextual information that best match users' search queries.

In this post, I have listed a few points which will help you optimize your images for search engines and rank them in image search.

1. Place your images where image search results appear and where general search engines can index them into contextual results, including:

  • Major Search Engines - either within contextual search results or vertical image search
  • Photo sharing sites (Flickr, Webshots, PBase, Fotki)
  • Social image sharing sites (MySpace, Facebook)

2. Take original photos so that you can brand them with your trademark, logo or URL. At business listing sites, adding your business logo creates a stronger impression on users.

3. Use good quality pictures and images, and make necessary resolution adjustments between full size images and thumbnails. Pictures with good contrast tend to work better.

4. Save your photos as JPG files, and other graphic images as GIFs.

5. Give appropriate file names to your images that match and represent the theme. Image names should be descriptive such as mobile-phone.jpg rather than untitled1.jpg.

6. Give appropriate alt tag and title tag to the images.

7. Use clear images, as distorted images cannot get their message across.

8. Keep image file sizes small, as heavy images increase page load time and consume extra bandwidth.

9. Always specify the width and height of an image when you place it on a web page. If you don't, the browser has to work out the image's dimensions itself, which adds to rendering time.

10. Content-rich pages embedded with pictures have better chances of ranking for both the images and the pages themselves. For example, if you provide services, add smiling faces of employees; if you sell products, include all product images on the site. So add images to your content-rich pages.

11. Add a map image, or a link to a map, on your site's pages. It increases your site's usability.

12. In the alt tag of map images, include the physical address of your business.

13. You can also add customer testimonials, celebrity snaps or award-winning snapshots. This will build trust and attract more visitors.

14. Add your logo on press releases and link them to your home page.

15. While submitting your site to directories, add logo to your profile.

16. Upload images to Google Base.

17. Include images and logo into your newsletter (email marketing campaign) and get linked from them.

18. Get more links from clients, partners or B2B sites through images.

19. Use thumbnails (i.e. small size images) instead of large images and get the large images linked from those thumbnails. It will decrease the load time of your web page and also increase its usability.

20. Enable enhanced image search in Google Webmaster Tools in order to add valuable tags to your images in Google Image Search.

21. Link every image, as hyperlinked images have much better chances of being included in image searches, even if they link to themselves.

22. Bookmark your images using social networking sites such as Facebook, Digg etc.

23. Search Engines also look at the text surrounding a graphic image to determine relevancy so take care of the surrounding text and use related text near the images.

24. Do not exclude your graphic images directory from search robots or limit search engine access to graphic-image files.

25. Don’t use JavaScript code to display the large version of an image. Search engines still can’t understand JavaScript completely, so never do this.
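Several of these tips (descriptive file names, alt and title text, explicit dimensions, and a thumbnail linking to the full-size image) come together in markup along these lines; the file names and dimensions here are illustrative:

```html
<!-- Thumbnail with a descriptive name, alt/title text and explicit
     dimensions, hyperlinked to the full-size image -->
<a href="/images/mobile-phone-large.jpg" title="Mobile phone - full size">
  <img src="/images/mobile-phone-thumb.jpg"
       alt="Black flip mobile phone"
       title="Mobile phone"
       width="120" height="90" />
</a>
```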

Image Search Engines: general search engines, stock photograph, graphics and clip-art, photo sharing and artwork-related image sites.

Wednesday, July 02, 2008

Optimizing Dynamic Websites

Dynamic content used to be a real obstacle for any search engine friendly website. Search engines did not crawl and index these pages, which made optimization of such websites nearly impossible. But times have changed, and so have the search engines and their indexing processes. Search engines now include dynamically generated pages in their indexes.

Although search engines have started indexing and ranking dynamic websites, some particulars of dynamic pages can still be obstacles to getting indexed. Today most websites have some level of dynamic or CMS-managed content, which means you need to follow certain guidelines to avoid major pitfalls and ensure that your dynamic body of work is search engine friendly from head to toe.

Here are some guidelines that help your web pages to be crawled properly by the Search Engines:

Allow Search Engine Robots to Follow Regular HTML Links to All The Pages of Your Website

Search engine robots reach a page by following links to that particular page. Sometimes it is difficult for search robots to reach all the pages of a dynamic website, as the following example shows.

Suppose you have a form on your website and you ask people to select their preferred category from a dropdown menu, and then when people submit the form your website generates a page with content that is specifically written for that particular category. Since Search Engine robots are software programs, they don't fill out forms or select from dropdown menus, so there will be no way for them to get to that page.

This problem can be rectified by providing standard HTML links that point to all the dynamic pages. The easiest way to do this is to add these links to your site map.

Set Up an XML Sitemap

If providing static HTML links is not possible, you can use an XML site map to tell search engines the locations of all your pages.

You can tell Google and Yahoo! about your sitemap through their webmaster tools (Google Webmaster Tools and Yahoo Webmaster Tools).
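A minimal sitemap file follows the sitemaps.org protocol; the URL and date below are placeholders for your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/category.php?id=5</loc>
    <lastmod>2008-07-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

Each dynamic page gets its own `<url>` entry, so crawlers can find pages they could never reach through forms or dropdown menus.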

Keep The Dynamic URLs Short and Tidy

Here are a few guidelines you should follow for your website parameters:

  • Limit the number of parameters in the URL to a maximum of 2
  • Use the parameter "?id=" only when in reference to a session id
  • Be sure that the URL functions if all dynamic items are removed
  • Be sure your internal links are consistent - always link with parameters in the same order and format

Avoid Dynamic Looking URLs

Dynamic-looking URLs are less attractive not only to search engines but also to your human visitors. Most people prefer URLs that clearly communicate the content of the page, and such pages get more hits than dynamic-looking URLs.

Static-looking URLs are more likely to receive inbound links, because usually people are reluctant to link to pages with very long or complicated URLs. Also keywords in the URL are one of the factors in Search Engine Ranking Algorithms.

There are many tools available that can re-create a dynamic site in static form and re-write your URLs, if you have too many parameters, to "look" like regular non-dynamic URLs.
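On Apache, for example, mod_rewrite can map a static-looking URL onto the underlying dynamic one. This is a sketch assuming a hypothetical `product.php?id=` scheme; adapt the pattern to your own parameters:

```apache
# .htaccess: serve the friendly URL /product/123
# from the dynamic script product.php?id=123
RewriteEngine On
RewriteRule ^product/([0-9]+)$ /product.php?id=$1 [L]
```

Visitors and search engines see only the clean URL, while your script still receives its parameter.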

De-Index Stubs

Website stubs are pages that are generated by dynamic sites but have no real content of their own. For example, if your website is a shopping cart for apparel, a page may be generated for the category “Age 8-10 Skirts” even though you have no products in that category. Stub pages are not only a real no-no for the search engines but are also annoying to searchers. So if you cannot avoid them, exclude them from indexing using the robots.txt file or a robots meta tag.
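For example, assuming your stub pages share a common URL pattern (the path below is illustrative), a robots.txt rule can keep crawlers out of them:

```
# robots.txt - keep crawlers out of empty category pages
User-agent: *
Disallow: /skirts/age-8-10/
```

Alternatively, add `<meta name="robots" content="noindex, follow">` to the head of each stub page itself, which removes the page from the index while still letting crawlers follow its links.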

De-Index Pages With Duplicate Content

While this problem is not specific to dynamic websites, dynamic sites are more likely to face it than static ones. If there are multiple pages on your site with identical or nearly identical content, exclude the duplicates from indexing using the robots.txt file or a robots meta tag. Choose the most appropriate and relevant page and exclude the rest.

Since dynamic content is usually timely and useful, more and more webmasters are opting for such websites and Search Engines are ranking them in their result pages. And now you know how to optimize your dynamic website and make it reach its full Search Engine potential so that it can rank well in the SERPs.

Related Posts:

XML Sitemaps


E Marketing Strategies Copyright © 2009 Blogger Template Designed by Bie Blogger Template