
Friday, August 29, 2008

10 Ways to Increase Your Alexa Rank

  1. Install the Alexa toolbar and set your website as your homepage.
  2. Put up an Alexa rank widget on your website.
  3. Encourage your friends, your site visitors and fellow webmasters to use the Alexa toolbar.
  4. Place your URL in webmaster forums, where it will bring fellow webmasters to your website.
  5. Create a webmaster tools section on your website and write content that is related to webmasters.
  6. Get Dugg or Stumbled; the massive influx of visitors will have a positive impact on your Alexa rank.
  7. Buy banners and links for traffic from webmaster forums and websites.
  8. Develop quality content to attract and maintain a large audience.
  9. Promote your content on social networking websites and webmaster forums.
  10. Great, link-worthy content will increase your website traffic organically and passively lift your Alexa rank.

Now you have ten ways to boost your Alexa Rank and increase your site’s monetization potential.

Monday, July 28, 2008

Alexa Ranks – How Much Are They Worth?

As an Internet Marketing Consultant, I have come across many clients who are very concerned about the Alexa ranking of their own and their competitors’ websites. As a professional search marketer, my personal opinion is that Alexa ranking hardly matters when measuring the success of a website’s internet marketing plan.

What is Alexa and Alexa Rank?

Alexa.com is a subsidiary of Amazon.com. Alexa is a website that provides traffic information for any website, and the Alexa rank is measured from the number of users who visit a website with the Alexa toolbar installed.

In other words, the Alexa ranking scheme is based on the level of traffic each website receives from visitors who have the Alexa toolbar installed.

This traffic rank is based on three months of aggregated historical data that is gathered from millions of Alexa Toolbar users. This ranking is a combined measure of page views and visitors and is based on the geometric mean of these two quantities averaged over time. A lower Alexa number means a greater level of traffic. Alexa simply provides a rough snapshot of what is going on and does not mean that much by itself.
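
Alexa does not publish the exact formula, but as a rough sketch of the geometric-mean idea described above, the score behind the rank behaves something like:

    traffic score ≈ sqrt(reach × page views), averaged over the 3-month window

so a site has to grow both its visitor count and its page views to climb the rankings.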

Problems with the Alexa Rank

The main problem with Alexa ranking is that it is heavily skewed towards websites with large webmaster or web-savvy audiences, who are much more likely to have installed the Alexa toolbar than visitors who are unaware of Alexa. This makes Alexa a vastly inaccurate method of measuring a website’s reach, traffic and potential.

Unfortunately, Alexa is still heavily used by webmasters and ad networks to measure the value of advertising on a website, and it is considered a central element in site monetization strategies.

However, if you really want to increase your Alexa rank, watch this space: in the next post I shall explain how to get started with Alexa and give you some handy tips and strategies that can increase your Alexa rank dramatically in both the short and the long run.

Tuesday, July 08, 2008

10 Tips to Avoid SEO Spam

In the SEO fraternity, the word spam describes unethical techniques and practices adopted to boost search engine rankings artificially. These techniques are a real no-no for the search engines, and sites that adopt them are banned for using unethical business practices.

The word spam itself is used to describe the dark and often deceptive side of everything from email marketing to abusive forum behaviour.

In this post I list some techniques which are treated as spam by the search engines and should always be avoided while promoting any website.

1. Cloaking
Cloaking is the technique that involves serving one set of information to search engine robots or crawlers and an entirely different set of information to the general visitors.

2. IP Delivery
IP delivery is a simple form of cloaking which involves serving targeted or different content to users based on their IP address. If you need to use this technique in some geolocation-specific cases, make sure that search bots see the same content as a typical user from the same IP address would see.

3. Leader Pages
Leader pages are a series of similar documents, each designed to meet the requirements of a different search engine algorithm. The major search engines consider this spam because they see multiple instances of what is virtually the same document.

4. Mini-Site networks
Mini-site networks were designed to exploit a critical vulnerability in early versions of Google's PageRank algorithm. They are very much like leader pages and involve creating several topic- or product-related sites, all linking back to a central sales site. Each mini-site is designed to meet the specific requirements of each major search engine and has its own keyword-enriched URL.

5. Link Farms
Link farms are groups of websites in which every page links to every other page in the group. Most link farms are created by automated programs and are a form of spamming a search engine's index, also referred to as spamdexing.

6. Blog and/or Forum Spam
Blogs and forums are essential communication platforms on the World Wide Web. In some cases they establish high PR values for their documents, which makes them targets for unethical SEOs looking for high-PR links back to their own websites or those of their clients. Google in particular has clamped down on blog and forum abuse.

7. Keyword Stuffing
At one time, search engine algorithms ranked websites almost entirely on the number of keywords found on their pages. That limitation led webmasters to stuff keywords everywhere they possibly could. Overusing keywords on a page amounts to keyword stuffing and spamming.

8. Hidden Text
There are two types of hidden text. The first is text in the same colour and shade as the background, which makes it invisible to human visitors but not to search robots. The second is text hidden behind images or under document layers. Search engines dislike both forms of hidden text and devalue pages containing them.
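
For illustration, these are the kinds of patterns (hypothetical markup, not taken from any real site) that search engines flag as hidden text:

    <!-- Hidden-text patterns to AVOID: -->
    <body bgcolor="#ffffff">
      <!-- text in the same colour as the background -->
      <font color="#ffffff">stuffed keywords invisible to visitors</font>
      <!-- text hidden under a layer via CSS -->
      <div style="display: none;">more stuffed keywords</div>
    </body>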

9. Hidden Tags
There are various types of tags, such as comment tags, which are sometimes used by SEOs to increase keyword density on a page. These should be avoided, since they are treated as spam techniques.

10. Misuse of Web 2.0 Formats (i.e. Wikis, Social Networking and Social Tagging)
A very common form of SEO spam is the misuse of user-input media formats such as Wikipedia or social networking and bookmarking sites. As with blog comment spamming, the instant live-to-web nature of Web 2.0 formats provides an open field for SEO spam technicians.

Many of these exploits may find short-term success, but in the long run they are always penalized by the search engines.

Thursday, July 03, 2008

Image Optimization Tips for Search Engines

With the growing impact of visual trends, the importance of images on a website, and of image search, has increased drastically. Image search can be defined as query results that are accompanied by thumbnail graphics and supplemented by contextual information that best matches users' search queries.


In this post, I have listed a few points which will help you optimize your images for search engines and rank them in image search.

1. Place your images where image search results are drawn from and where they can be indexed into the general search engines' contextual results, including:

  • Major Search Engines - either within contextual search results or vertical image search
  • Photo sharing sites (Flickr, Webshots, PBase, Fotki)
  • Social image sharing sites (MySpace, Facebook)

2. Take original photos, so that you can brand them with your trademark, logo or URL. At business listing sites, add your business logo to create a stronger impression on users.


3. Use good quality pictures and images, and make necessary resolution adjustments between full size images and thumbnails. Pictures with good contrast tend to work better.

4. Save your photos as JPG files, and other graphic images as GIFs.


5. Give appropriate file names to your images that match and represent the theme. Image names should be descriptive such as mobile-phone.jpg rather than untitled1.jpg.


6. Give appropriate alt and title attributes to the images (see the markup sketch after this list).


7. Use clear images, as distorted images cannot convey their message.


8. Keep image file sizes small, as heavy images increase page loading time and use extra bandwidth.


9. Always specify the width and height of an image when you place it on a web page. If you omit them, the browser has to work out the image dimensions itself, which adds extra rendering time.


10. Content-rich pages embedded with pictures have a better chance of ranking, both for the images and for the pages themselves. For example, if you provide services, add smiling faces of employees; if you sell products, include images of all products. So add images to content-rich pages.


11. Add a map image, or a link to a map, from your site pages. It increases your site's usability.


12. In the alt text of map images, add the physical address of your business.


13. You can also add testimonials from customers, celebrity snaps or award-winning snapshots. This will create trust and attract more visitors.


14. Add your logo on press releases and link them to your home page.


15. While submitting your site to directories, add logo to your profile.


16. Upload images to Google Base.


17. Include images and logo into your newsletter (email marketing campaign) and get linked from them.


18. Get more links from clients, partners or B2B sites through images.


19. Use thumbnails (i.e. small size images) instead of large images and get the large images linked from those thumbnails. It will decrease the load time of your web page and also increase its usability.


20. Enable enhanced image search in Google Webmaster Tools so that valuable tags can be added to your images in Google Image Search.


21. Link every image, as hyperlinked images have a much better chance of being included in image searches, even if they link to themselves.


22. Bookmark your images using social networking sites such as Facebook, Digg etc.


23. Search engines also look at the text surrounding an image to determine relevancy, so take care with the surrounding text and use related text near your images.


24. Do not exclude your graphic images directory from search robots or limit search engine access to graphic-image files.


25. Don’t use JavaScript to display the large version of an image. Search engines still can't fully interpret JavaScript, so never do this.
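
Putting several of the tips above together (descriptive file name, alt and title text, explicit dimensions, and a thumbnail linking to the full-size image), a minimal markup sketch with illustrative file names might look like this:

    <a href="/images/mobile-phone-large.jpg">
      <img src="/images/mobile-phone-thumb.jpg"
           alt="Silver mobile phone, front view"
           title="Mobile phone"
           width="120" height="90" />
    </a>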


Image Search Engines:

http://images.google.com/

http://www.live.com/?&scope=images

http://www.exalead.com/image/results?q=

http://www.pixsy.com/

http://www.picsearch.com/

http://www.altavista.com/image/

http://www.ithaki.net/images/

http://www.graphicsfactory.com/

http://www.ditto.com/

http://pro.corbis.com/

http://www.animationfactory.com/en/

http://www.faganfinder.com/img/ (Specifies all search engines, Stock photographs, graphics and clip-art, photo sharing sites and artwork related images sites)

http://images.search.yahoo.com/

http://www.ask.com/?#subject:img|pg:1

http://www.fastimagesearch.com/

www.fotosearch.com

www.webplaces.com/search

Wednesday, July 02, 2008

Optimizing Dynamic Websites

Dynamic content used to be a real obstacle to a search-engine-friendly website. Search engines did not crawl and index dynamic pages, which made optimizing such websites nearly impossible. But times have changed, and so have the search engines and their indexing processes. Search engines now include dynamically generated pages in their indexes.

Although search engines have started indexing and ranking dynamic websites, some particulars of dynamic pages can still be obstacles to getting indexed. Today most websites have some level of dynamic or CMS-managed content, which means you need to follow certain guidelines to avoid major pitfalls and ensure that your dynamic body of work is search engine friendly from head to toe.

Here are some guidelines that will help your web pages get crawled properly by the search engines:

Allow Search Engine Robots to Follow Regular HTML Links to All The Pages of Your Website

Search engine robots reach a page by following links to that particular page. Sometimes it is difficult for robots to reach all the pages of a dynamic website, as the following example shows.

Suppose you have a form on your website asking people to select their preferred category from a dropdown menu; when they submit the form, your website generates a page with content written specifically for that category. Since search engine robots are software programs, they don't fill out forms or select from dropdown menus, so there is no way for them to reach that page.

This problem can be rectified by providing standard HTML links that point to all the dynamic pages. The easiest way to do this is to add these links to your site map.


Set Up an XML Sitemap

If providing static HTML links is not possible, you can use an XML site map to tell search engines the locations of all your pages.

You can tell Google and Yahoo! about your sitemap through their webmaster tools (Google Webmaster Tools and Yahoo Webmaster Tools).
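
A minimal sitemap in the standard sitemaps.org format looks like this (the URL is a placeholder):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/products.php?category=12</loc>
        <lastmod>2008-07-01</lastmod>
        <changefreq>weekly</changefreq>
      </url>
      <!-- one <url> entry for every dynamic page you want crawled -->
    </urlset>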


Keep The Dynamic URLs Short and Tidy

Here are a few guidelines you should follow for your website parameters:

  • Limit the number of parameters in the URL to a maximum of 2
  • Use the parameter "?id=" only when in reference to a session id
  • Be sure that the URL functions if all dynamic items are removed
  • Be sure your internal links are consistent - always link with parameters in the same order and format


Avoid Dynamic Looking URLs

Dynamic-looking URLs are less attractive not only to search engines but also to your human visitors. Most people prefer URLs that clearly communicate the content of the page, and such pages get more hits than dynamic-looking URLs.

Static-looking URLs are more likely to receive inbound links, because usually people are reluctant to link to pages with very long or complicated URLs. Also keywords in the URL are one of the factors in Search Engine Ranking Algorithms.

There are many tools available that can re-create a dynamic site in static form and, if you have too many parameters, rewrite your URLs to look like regular non-dynamic URLs.
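
On an Apache server, for instance, a simple mod_rewrite rule in the .htaccess file can map a static-looking URL onto the underlying dynamic script (a sketch; the paths and parameter names are illustrative):

    # Serve /products/12.html from the dynamic script product.php?id=12
    RewriteEngine On
    RewriteRule ^products/([0-9]+)\.html$ /product.php?id=$1 [L]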


De-Index Stubs

Website stubs are pages generated by dynamic sites that have no independent content of their own. For example, if your website is a shopping cart for apparel, a page may be generated for the category “Age 8-10 Skirts” even though you have no products in that category. Stub pages are not only a real no-no for the search engines but also annoying to searchers. So, if you cannot avoid them, exclude them from indexing using the robots.txt file or the robots meta tag.


De-Index Pages With Duplicate Content

While this problem is not specific to dynamic websites, dynamic sites are more likely to face it than static ones. If there are multiple pages on your site with identical or nearly identical content, exclude the duplicates from indexing using the robots.txt file or a robots meta tag: choose the most appropriate and relevant page and exclude the rest.
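
Both exclusion mechanisms are sketched below, with illustrative paths; use whichever fits your setup:

    # robots.txt: keep an empty stub category out of the index
    User-agent: *
    Disallow: /skirts/age-8-10/

    <!-- or, in the <head> of the stub/duplicate page itself: -->
    <meta name="robots" content="noindex, follow" />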


Since dynamic content is usually timely and useful, more and more webmasters are opting for such websites, and search engines are ranking them in their result pages. Now you know how to optimize your dynamic website and help it reach its full search engine potential so that it ranks well in the SERPs.

Related Posts:

XML Sitemaps

Saturday, June 21, 2008

Unlock Online Success with Keyword Research


Keyword research is the process of selecting the most appropriate keywords and phrases to help your visitors find your website. The keywords and phrases you select to promote your website play a vital role in the online marketing of your business. They can decide the fate of your internet marketing strategy and can make or break your listings in the search engine result pages. The aim of any internet marketing plan is not only to get high rankings but to generate maximum business via the web; even sites with excellent rankings will not benefit if those rankings are for unsuitable keywords. Therefore, the first step in any SEO campaign is identifying the niche target audience and researching what keyword phrases they might type into the search engines to locate a site.

Keyword research is the core of an SEO campaign and is very similar to customer research, because you are studying the words your potential clients use while searching for your services or products on the Internet. While selecting the keywords and phrases for your website, remember to choose search terms that describe your products and services in the most logical, simple and specific way. This ensures not only higher traffic but targeted traffic for your site. For any marketing strategy to succeed, it is critical to know your audience and the means to reach them. A certain focus is required, which could be location, region or country specific, or business, trade, service or product specific, since we are targeting a specific audience.

Since keywords are the foundation of any website's internet marketing plan, always consider tapping numerous resources to locate a variety of keywords. Identify the best keywords and phrases that are relevant to the products, services, or information you are promoting. Also focus on phrases with high searchability and low competition. I know this is a difficult task, but trust me, the effort is worth it.

There are various keyword suggestion tools, such as the Google keyword tool, Wordtracker, Keyword Discovery and Digitalpoint, which can be used to research potential keywords.


LSI and Synonyms: Today, Search engines not only look for the exact keywords but also for their synonyms due to implementation of LSI or Latent Semantic Indexing. You can learn more about Latent Semantic Indexing at my previous post.

To make use of LSI for online promotion of your website, use a thesaurus to find terms that are related to your primary keyword. You can visit some good websites like Webreference and Merriam-Webster to get your desired phrases.

So, the whole idea behind keyword research is traffic optimization, not traffic maximization. Good, effective keyword research brings qualified, targeted and focused traffic to your website, which leads to a higher rate of sales conversion.

Related Posts:

http://e-marketing-strategies.blogspot.com/2008/06/5-most-effective-tips-on-keyword.html

Friday, June 06, 2008

How to Get Listed in DMOZ Directory – The Open Directory Project (ODP)

DMOZ is the largest and most comprehensive human-edited open directory on the web. Being listed in this directory definitely helps Google rankings, but getting in can take a very long time. This is often because webmasters fail to fulfil the criteria and submission guidelines, which leads to rejection. Websites must fulfil certain criteria before being submitted to DMOZ, and they are manually reviewed for quality and relevance.

Although the directory itself plays very little role in generating traffic, as not many people actually use DMOZ for searches, a DMOZ listing can improve your search engine rankings and Google PageRank dramatically. The DMOZ directory data is syndicated throughout hundreds of web directories, and even Google uses this data.

As we all know, PageRank is an integral part of Google's ranking algorithm, and higher PageRank helps you get higher rankings in the Google SERPs. A listing in DMOZ creates two significant backlinks for your website: one from DMOZ itself and one from the Google Directory. It also gets your website links from the thousands of small sites which download and use the DMOZ directory data.

Now, there are some points to remember while submitting a site to DMOZ. It rejects spammy/MFA sites, sites with too many affiliate links and sites without unique content. Another major reason a site may be rejected is failure to adhere to the submission guidelines: if the title and description provided in the submission don't follow the Open Directory's guidelines, the site gets rejected. While submitting a site, it is essential to read and follow the submission guidelines.

So, how should you submit a website to DMOZ? According to some DMOZ editors, it is a five-step process. These steps are not official DMOZ submission guidelines, but they may help. Here are those five steps…

  1. Read the submission guidelines carefully: the Open Directory does not include mirror sites, sites with duplicate or illegal content, or sites consisting largely of affiliate links.

  2. Try to choose and submit in the regional section rather than the main one.

  3. Submit in the appropriate category under which your website belongs.

  4. Follow the guidelines while creating the Title and description for submission.

  5. Make the title official: sometimes a non-promotional description works best. Experts advise using the business name as the title.


So, follow the steps listed above and get your website listed in the DMOZ directory. And yes, don’t forget to keep working on your site to get the desired results.

Latent Semantic Indexing (LSI) - An Integral Part of SEO Copywriting

Latent Semantic Indexing, or LSI, is a technique which helps search engines determine the theme of a page and what the page is about beyond specifically matching the search query text. It is an algorithm used by search engines based on LSA, or latent semantic analysis, a technique in vectorial semantics.

LSI is the process by which search engines infer what a page is about from words and phrases other than the official keywords of that page, by considering the latent semantic content of the text. The idea behind the approach is to minimize keyword stuffing: LSI allows you to write about anything without mentioning the keyword more than once or twice.

Latent Semantic Indexing is meant to give search engines a more natural way to view and rank websites. The principles of LSA for determining the content of web pages were used by a small company called Oingo, which changed its name to Applied Semantics and developed a search system to determine the relevance of page content for specific advert placement. They called this AdSense. The company was in turn bought by Google in April 2003, and AdSense replaced Google's own system, which was still under development. AdSense, then, was not developed by Google but purchased by it.

LSI is about words, keyword usage and their placement within the page content to form sentences, so that Google can get an idea of what the web page actually wants to convey to users. The technique helps a page rank for search phrases related not only to its keywords but to their synonyms as well. Using the Latent Semantic Indexing concept, a web page about music could also include related words such as radio, mp3 players, iPods and so on. Rather than repeating the same words throughout the site, a wider variety of text and phrases can be used for keywords and search phrases.

If we take the example of "SEO", the LSI algorithm will look for SEO-related terms like search engine optimization, website optimization, web promotion, search engine marketing and so on. This also helps increase the relevance of search results and decrease search engine spam.

A simple way to get an idea of the synonyms and equivalent words Google considers for a particular term is to use the tilde (~) operator in a Google search. For example, if you search for “~phone”, the first result you get is for Nokia, which means Google considers Nokia closely related to phone.

Google uses latent semantic analysis and indexing primarily to detect spam and to determine a page's theme and relevancy. It is also used to determine the true meaning of homonyms, heteronyms and polysemes, which it distinguishes by analysing the other words in the text.

The web pages on your site should be related and focus mainly on one topic while using different words to describe it. Use variations of your keyword and synonyms; this makes it easier for search engines to determine the topic of your site.

Saturday, May 31, 2008

SEO Copywriting

SEO copywriting, as defined in Wikipedia, is "a technique of writing the text on a web page in a way that it can be read and understood by the surfer, and that also uses the keywords and key phrases targeted for the website. The purpose of SEO-centric copywriting is to rank highly in the search engines for the targeted keywords and phrases."

It can also be explained as the technique of writing the text on a web page so that it reads well for the surfer while also targeting specific search terms. The challenge lies in creating content that is both user and search engine friendly. The purpose of SEO copywriting, or search engine copywriting, is to make a web page that ranks highly in the search engines for the targeted search terms while conveying its message to surfers in an effective, easy-to-understand manner.

While optimizing websites, I have come across many situations where you are handed a readymade website with very little or no content at all, along with a long list of keywords that are not at all related to the theme or goal of the website. What worsens the situation is that you cannot find a single page that deals with the search phrases you have been given.

Generally, novice users, or users new to the world of the Internet, fail to understand the actual process of web development and the relationship between website design and search engine optimisation. According to these users, SEO is all about creating and placing the title and other meta tags, and stuffing keywords into the page content. In fact, the task of an SEO expert starts right from the design phase, where potential keywords are identified before the content is developed and the page URLs are decided.

Search engines look for genuine and unique content on web pages that relates to the search terms and to the keywords used in the title and other meta tags. The content of a page should not mislead users; it should reflect the true picture and the idea behind the page's creation. This ensures that users get exactly what they expect to find on a particular page. Search engines check this by matching the words in the text of the page against the page title and meta tags. LSI, or Latent Semantic Indexing, is a technique which helps them accomplish this task: synonyms of the search terms or phrases are also taken into consideration while generating the SERPs.

The task begins right at the design phase, before content development. Thorough keyword research is required for each individual page. Once the keywords are identified according to the theme of the page, content is developed with strategic placement of the search terms and phrases within the text and other on-page elements. Synonyms of the search terms are also used to maintain the LSI ratio.
The reverse approach can also be taken for content development: first the target key phrases are identified according to the business, website theme and goal, and then web pages are created to represent the actual aim of the website.

The URL or filename is also chosen based on the target keywords, to give more emphasis to the potential search terms.

This helps the search engines understand that the page is actually about those keywords and ensures users are given the correct information.

SEO copywriting is one of the major factors deciding the ranking of a page in the SERPs, or search engine result pages. So it is always good practice to apply SEO copywriting when creating any website or page, so that it ranks well and provides genuine, useful information to users.

Wednesday, May 14, 2008

Google Penalty: Probable Causes and Solutions

A Google penalty is a nightmare for SEOs. Just imagine losing all SERP rankings for your website because Google has penalized it. Sometimes this happens out of sheer ignorance: many SEOs are unaware of the factors which can trigger a penalty. There are certain guidelines which, if followed properly, can save your website from being penalized by Google. I researched the probable causes of, and solutions to, Google penalties and consolidated the data in one place, so that it is handy for anyone who wants a solution to this problem. Two websites were very helpful in preparing this material: http://www.growler.com/Pro/SEO/aaSEO2/Google-penalty.htm and http://www.ksl-consulting.co.uk/google_penalty.html.


What Exactly Triggers A Google Penalty?
  • Hidden text or hidden links
  • Sneaky redirects or cloaking
  • Sending automated queries to Google
  • Stuffing pages with irrelevant words
  • Creating multiple domains, subdomains, or pages with substantially duplicate content
  • Creating "Doorway" pages only for the Search Engines or other "cookie cutter" approaches such as affiliate programs containing little or no original and unique content
  • Link buying and selling
  • Excessive use of keywords
  • Linking to banned sites
  • Linking from banned or spam sites
  • Getting links from BAD sites
  • Violating Google Webmaster Guidelines
  • Excessive linking in a short span of time
  • Affiliate links on the site
GoogleGuy Says:

Abigail, do you have a lot of links/keywords on a page that could look like stuffing? For example, if a page is pretty fair but then has 200-300 keyword-stuffed links, I wouldn't be surprised if that would go over a threshold at some point. Do you have a lot of those sorts of links, or affiliate links or something similar on your site?

In general, anything hidden from the human visitor but visible to the robots, or vice versa, is asking for a Google penalty. So don't put coloured text or links on backgrounds of the same colour - Google penalty. Same for teeny tiny font sizes, especially if they carry links - Google penalty. And off-page content that only a robot sees - Google penalty.

If you're creating content for humans, it is very unlikely that you will trigger a Google penalty. But if you're making an effort to fool the robots, you're messing with the potential for a Google penalty, big time.

Google Penalty Checklist
  • Linking to banned sites - Run a test on all outbound links from your site to see if you are linking to any sites which have themselves been Google banned. These will be sites which are Google de-listed and show Page Rank 0 with a grayed out Toolbar Page Rank indicator.

  • Linking to bad neighbourhoods - Check you are not linking to any bad neighbourhoods, link farms or doorway pages. Bad neighbourhoods include spam sites and doorway pages, whilst link farms are just pages of links to other sites, with no original or useful content.

  • Automated query penalty - Google penalties can be caused by using automated query tools which break Google's terms of service as laid out in the webmaster guidelines. Google allows certain automated queries into its database using its analytic tools and when accessing through the Google API account. Unauthorised types of automated query can cause problems.

  • Over optimization penalties - These can be triggered by poor SEO techniques such as aggressive link building using the same keywords in link anchor text. When managing link building campaigns, always vary the link text used and incorporate a variety of different keyword terms. Use a backlink anchor text analyser tool to check backlinks for sufficient keyword spread. Optimising for high paying keywords can further elevate risk, so mix in some long tail keywords into the equation. For brand new domains, add no more than 5 new one way backlinks a week and use deep linking to website internal pages, rather than just homepage link building.

  • Website cross linking & link schemes - If you run more than one website and the Google penalty hits all sites at the same time, check the interlinking (cross linking) between those sites. Extensive interlinking of websites, particularly if they are on the same C Class IP address (same ISP), can be viewed as "link schemes" by Google, breaking their terms of service. The risks are even higher where site A site-wide links to site B and site B site-wide links back to site A. If you must use site-wide links, make sure they are not reciprocal. Link schemes built around links in the footer of each webpage are particularly risky. The reality is that site-wide links do little to increase site visibility in the Google SERPS, nor do they improve PageRank more than a single link, as Google only counts one link from one site to another. It is also believed that Yahoo! now applies a similar policy. There is some evidence that extensive use of site-wide links can lower a website's Google trust value, which can subsequently reduce ranking.

  • Hidden text or links - Remove any hidden text in your content and remove any hidden keywords. Such content may be hidden from view using CSS or alternatively, text may have been coded to be the same colour as the page background, rendering it invisible. These risky SEO techniques often lead to a Google penalty or web site ban.

  • Overt keyword stuffing - Remove excessive keyword stuffing in your website content (unnatural repetitions of the same phrase in body text). Always use natural, well written web copywriting techniques.

  • Automated page redirects - The use of automated browser redirects on any of your pages. Meta Refresh and JavaScript automated redirects often lead to a Google penalty because pages using them are perceived to be doorway pages. The technique is especially dangerous if the refresh time is less than 5 seconds. To avoid Google penalties, use a 301 redirect or a mod_rewrite rule instead of these methods; this involves setting up a .htaccess file on your web server (see the sketch after this checklist).

  • Link buying - Check for any paid links (I.E. buying text links from known link suppliers / companies). There is some evidence that buying links can hurt rankings and this was implied by comments from Matt Cutts (a Google engineer) on his Google SEO blog. Matt states that Google will also devalue links from companies or suppliers of text links, such that they offer zero value to the recipient in terms for improving website rankings or Page Rank.

  • Reciprocal link building campaigns - These may also trigger a Google penalty, or cause a SERPS filter to be applied, when the same or very similar link anchor text is used over and over again and large numbers of reciprocal links are added in a relatively short time. The danger is made worse by adding reciprocal links to low-quality sites or websites with an unrelated theme. This can lead to a backlink over-optimisation penalty (known as a BLOOD to SEO experts!), which causes a sudden, often severe, drop in SERPS ranking. To avoid this problem, reciprocal link exchange should only be used as part of a more sustainable SEO strategy which also builds quality one-way links to original website content. Adding reciprocal links to unrelated sites is a risky SEO strategy, as is reciprocal link exchange with low-quality websites. If you can't find a website's homepage in the top 20 of the Google search results (SERPS) when you search for the first 4 words of the site's HTML title (shown at the top of the Internet Explorer window), then undertaking reciprocal link exchange with that site may offer few advantages. Don't forget to check that prospective reciprocal link partners have a similar theme to your homepage too.

  • Check Google Webmaster Guidelines - Read the Google webmaster guidelines and check website compliance in all respects.

  • Google Webmaster Tools - According to Matt Cutts's Blog, Google is improving webmaster communication with respect to banned sites and penalties. Google is now informing some (but not all) webmasters the cause of a website ban or penalty, via their excellent new Webmaster Console. In addition, a Google re-inclusion request can be made from the same interface. For this reason, if you've been hit by a web site ban or penalty, it is worthwhile signing up for Google Webmaster Tools and uploading an XML Sitemap onto your site and then to check site status in the Google Webmaster Console. This is an easy 15 minute job and may help to identify the cause and fix for the problem!
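
On the automated-redirects point above, a permanent 301 redirect on an Apache server takes one line in the .htaccess file (the paths are illustrative):

    # Replace a Meta Refresh / JavaScript redirect with a 301
    Redirect 301 /old-page.html http://www.example.com/new-page.html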

Initial Tests for a Penalty

Just because you lose rank does not mean you have a Google penalty. Your ranking for keyword sets depends on many factors, including how many others are competing for the same keyword sets, how much content exists for that keyword set, and off site links. Also, the search engines are constantly updating the algorithms that determine rank, and as these change, ranks do as well. There are sites, and pages from sites that temporarily disappear from the index, but return later for no obvious reason, so don't be too quick to blame a Google penalty when your ranks are swirling.

But you can definitely tell whether your site is still in Google's index. Simply search for the URL ("mysite.com"). If no information is returned, the URL is not indexed. You can also see all the pages that have been indexed by searching for "site:mysite.com". If you were ranking before but this search shows nothing, you have a Google penalty. If you have many pages on your site but only see the homepage in the results, something is very wrong, and you may have incurred a Google penalty.

It is also possible for Google to impose a manual suppression of your site that is almost impossible to detect, a kind of lower-level Google penalty. Recently a case with such a penalty was unwound: the site owner had been told by Google that no penalty was imposed and that his poor ranks were the result of a lack of content. The tipoff to the suppression was that all the other search engines showed high positions for the same keywords, yet the company's trade name was not at the #1 rank but on page 4.

When a penalty is suspected, start by checking how many URLs Google has indexed. This can be done using the site:yourdomainname.com -asdfasdf command in a Google search window. If no URLs are indexed and no backlinks show up when link:yourdomain.com is entered, then there is a high probability of a Google penalty, especially if your site used to be indexed and used to show backlinks. The exception to this rule is a new website with few backlinks, which may not be Google indexed since it is still waiting to be crawled. Such websites frequently show no backlinks, but this doesn't imply they have received a Google penalty!



Not all Google penalties result in a loss of Page Rank. For example, various Google filters can be triggered by irregularities in backlinks and by excessive reciprocal link exchange, backlinks from spam or banned sites etc.

If you suspect your website has received a Google penalty, you can contact Google by sending an e-mail to help@Google.com to ask for help. They will usually check the spam report queue and offer some form of assistance.

Interestingly, in a recent move by Google, web sites which are in clear violation of Google's webmaster guidelines or terms of service may receive an e-mail from Google advising them to clean up their act, warning of a penalty and website de-indexing. When the breach of Google's terms (e.g. spam or hidden text) is removed from the offending site, Google will usually automatically remove the penalty and re-index the site when the webmaster completes a Google re-inclusion request.

How to Check for A Google Penalty?

To check for a Google penalty with any degree of certainty is very difficult. The very first thing you need to do is determine the exact cause of the Google penalty. This is often not obvious. But if you know you have violated one of Google's published guidelines, that is where you should start. Unwinding a Google penalty means removing the offensive content, links, or strategy, and then informing Google that your site has achieved compliance with their guidelines. A Google penalty will not go away by itself, and a strongly proactive approach is required to both uncover the offensive material, and be certain it is completely addressed before approaching Google.

It is strongly recommended NOT to contact Google immediately upon discovering a Google penalty. You want to be certain that, after expending time and energy arguing your case, the penalty will not be reimposed later because of an oversight on your part. Make sure your site is completely Google compliant before contacting them. Demonstrating that you understand why the penalty was imposed is instrumental to unwinding it.

Here's where to go when you're ready: http://www.google.com/support/bin/request.py

If you are uncertain as to the cause, you should seek help. It is not recommended that you contact Google until you can approach them with knowledge. Their contact form will generate an automated reply (if any), and they do not provide a service to diagnose the cause of a Google penalty. Unwinding a Google penalty usually requires an acknowledgement of the problem by the site owner, a clean site, and a statement of compliance with Google's guidelines. More than one attempt will probably be required to undo a sitewide Google penalty. Even in cases where the Google penalty involves only a limited number of pages from a site, be prepared to commit significant time and energy to setting things straight.

Check for a Google Website Ban

If you've used unethical SEO techniques your website could be Google banned and de-indexed. Check for a Google website ban using the free SEO tool at http://www.seojunkie.com/2006/05/09/google-ban-checker/. Please note that the results from this free site ban tool may not be entirely accurate or reliable!

Google Penalty Recovery Strategy

Recovering from a Google penalty normally involves fixing the cause of the problem and then waiting for Google to remove any over-optimisation penalties or SERPS filters. Fully recovering Google rankings may take around 2-3 months after all website problems are corrected. The Google algorithm can automatically remove a penalty if the affected website is still Google indexed. If your website has been Google de-indexed and has lost PageRank, you will need to make a Google re-inclusion request. Where the reason for the penalty is clear, it helps to provide details of any changes you've made to correct violations of the Google webmaster guidelines.

The best recovery strategy from any Google penalty is to check for any recent Google algorithm changes and to evaluate recent changes made to your website prior to the sudden drop in Google ranking. Don't forget to check your link building strategy, as poor SEO often causes Google penalties. Start by removing any reciprocal links to low-quality websites, or sites having no relevance to your website theme.



Wednesday, May 07, 2008

Subdomains or Subfolders: Which Are Better for SEO?

Subdomains and subfolders both have their advantages, especially when setting up blogs.

For blogs, I prefer a subfolder (http://www.seomoz.org/blog/) because the link juice which is sent to that blog is going to be naturally distributed to that main domain, and other subfolders under the domain.

Furthermore, the forum/blog will, by default, point its logo, home page and other links back to the subfolder. If you set it up on a subdomain, by default the links in the forum/blog will all point back to the subdomain. So, with a subfolder, both the inbound and internal linking structures favour the entire site. With a subdomain, the forum or blog will be listed as a separate entity in the Google search results, which is good for owning the results and for reputation management. However, Google and other engines will generally not list more than two such subdomains in the search results, unless those subdomains can prove to Google that they are independent and relevant entities.

I would like to quote Vanessa Fox, an ex-Googler and contributor to Search Engine Land:

Google is no longer treating subdomains (blog.widgets.com versus widgets.com) independently, instead attaching some association between them. The ranking algorithms have been tweaked so that pages from multiple subdomains have a much higher relevance bar to clear in order to be shown.

It’s not that the “two page limit” now means from any domain and its associated subdomains in total. It’s simply a bit harder than it used to be for multiple subdomains to rank in a set of 10 results. If multiple subdomains are highly relevant for a query, it’s still possible for all of them to rank well.

Home Depot is one site which has cleared the relevancy bar at Google with subdomains at HomeDepot.com that are actually marketed as individual sites. Take careers.homedepot.com and look at its backlinks: even if this subdomain were on an entirely different domain, such as HomeDepotJobs.com, it would probably rank just as highly.

So, in conclusion, if you’d like to build the equity of one web site or entity, I suggest using a subfolder. If you’d like to build an entire new entity with its own equity, launch a subdomain.


Via [http://www.searchenginejournal.com/subdomains-or-subfolders-which-are-better-for-seo/6849/]

Tuesday, April 15, 2008

On Page Optimization

On-page optimization is the process of making a website search engine friendly, or optimizing the site according to the guidelines suggested by the search engines. It helps crawlers crawl and index the pages easily and helps the site achieve better rankings in the SERPs. The term "on page" exists because these activities are performed directly on the web pages. The process includes the following steps:
  1. Unique Title and Meta Tags
  2. Code and Design Optimization
  3. Website Architecture and Site Navigation
  4. Content Optimization
1. Unique Title and Meta Tags
  • Unique title and meta tags are prepared and placed on all the pages based on the theme of the page.
  • Proper alt tags are added for the images.
2. Code and Design Optimization
  • All the inline css are shifted to an external file.
  • The unnecessary spaces are removed.
  • The broken links (if present) are removed.
  • The page size is optimized.
3. Website Architecture and Site Navigation
  • The website hierarchy and text links are arranged so that every page can be reached from at least one static text link, within three linking levels or clicks.
  • All the pages are linked and the site navigation is made structured.
  • The pages are linked with proper anchor texts (keyword oriented).
  • The important pages are linked directly with the home page.
4. Content Optimization
  • Content is prepared with SEO copywriting and placed on the pages. The important keywords and their synonyms are used with adequate keyword frequency and density.
  • The headings are optimized with the help of header and bold (or italics) tags.
  • If required different font colours and sizes can also be used to optimize the content.
  • The content should be informative and related to the theme of the website.
On-page optimization is a very important step towards high rankings in the SERPs. Don't overdo it, though, as the search engines penalize over-optimization.
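
As a quick illustration of points 1 and 2, a unique, theme-based head section kept free of inline CSS might look like this (all names and text are placeholders):

    <head>
      <title>Mobile Phone Reviews and Buying Guide | Example.com</title>
      <meta name="description" content="Independent reviews and buying advice for the latest mobile phones." />
      <meta name="keywords" content="mobile phone reviews, buying guide" />
      <!-- inline CSS moved to an external file -->
      <link rel="stylesheet" type="text/css" href="/css/style.css" />
    </head>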

Wednesday, April 09, 2008

Successful SEO Campaign - 10 Basic Thumb Rules

Search Engine Optimization is the stepping stone for developing internet marketing strategies for any website; it lays the foundation for the success of web marketing and advertising. If you want to increase the online presence of your website, SEO is the first step. The basic performance of any SEO campaign can be measured by examining the website's ranking in the search engine result pages, or SERPs. SEO is an ever-evolving, fast-moving field. There are several techniques which can be used, but the big challenge is deciding which to adopt for a particular campaign to get the desired results. The best practice is to plan properly before proceeding with the SEO process and to adopt a combination of the most suitable techniques, which depends entirely on the website's theme and goals. Success depends upon how intelligently you use those techniques.

The new and evolving trends are always good to try and experiment with, but don't forget to implement the basics of SEO. There are 10 basic rules of thumb which, in my view, should always be followed to create a solid foundation. These are:

1. Unique Titles and Meta Tags: All the pages should contain unique and descriptive titles and meta tags.

2. Original Content: There is a famous proverb in the SEO fraternity: content is king. So always put unique, high-quality and informative content on your website.

3. Quality Link Backs: Get quality backlinks for your website. Don't chase junk links in high quantities.

4. Ethical Approach: Don’t try to fool the Search Engines. Always adopt the ethical approach to optimize your website. It may take some time to show in the SERPs, but believe me the results will be long lasting.

5. Website Architecture: Have a proper architecture and navigation for your website. It helps the Search Engine spiders to do deep crawling of the site.

6. No Unnecessary Code: Remove all the junk or unnecessary code from the site. Make it a neat and clean website.

7. XML Sitemaps: Always create and submit XML sitemaps at regular intervals for proper indexing of the website.

8. No Broken Links: Check and remove all the broken or not found links from your website at regular intervals.

9. Regular Updates: Search engines love websites which have something new to offer visitors; they do not like dead sites. Update your website regularly with fresh, original content.

10. Be Consistent: Consistency is the key to success. So keep on working on your SEO project to reap the benefits.

The 10 points mentioned above are the basic SEO ranking factors; followed with proper planning, they mark the sure-shot path to a successful SEO campaign.
 
