Monday, November 03, 2008

How to reduce Bounce Rate percentage?



First, let us understand what Bounce Rate means in Google Analytics. Most of the definitions found on the web are a bit confusing (at least I found them so), so I would like to be clearer about what Bounce Rate actually means.

Bounce rate represents the single-page visitors who leave the website without viewing any other page. It is the percentage of visits in which the visitor saw only the entry page and then left: instead of visiting other pages of the website, the visitor closes the window, types in another URL or clicks a link to a different website. This is different from the Exit Rate. Exit Rate is the ratio of visitors who exited the website from a given page to all visitors of that page, and those visitors can be single-page or multiple-page visitors. Bounce rate considers only single-page visitors.
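To make the arithmetic concrete, here is a minimal Python sketch with made-up visit counts (not Google Analytics' actual implementation, just an illustration of the percentage):

# Hypothetical data: number of pages viewed in each visit to the site
# (the figures are assumed purely for illustration).
pages_per_visit = [1, 1, 3, 5, 1, 2, 1, 4]

total_visits = len(pages_per_visit)
bounces = sum(1 for pages in pages_per_visit if pages == 1)  # single-page visits

bounce_rate = bounces / total_visits * 100
print(f"Bounce rate: {bounce_rate:.1f}%")  # 4 of 8 visits bounced -> 50.0%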

Now, why do these bounces occur? If you have a high bounce rate for your website or for particular web pages, here are a few points to ponder to get first-time visitors to go deeper into the website:

1. The content on the page is not interesting to the visitors.
2. Too many external links on the page invite visitors to click away to other websites.
3. Too few links take visitors to the inner pages (or those links are not clearly visible).
4. Irrelevant visitors are coming to the website (not the target audience).
5. The page takes too long to load, so visitors close the window rather than wait for it.
6. Visitors feel confused after seeing the page.

If you do want to show external links on the page, open them in a new window, in the hope that the visitor may come back to your site later, find it interesting and move on to the inner pages.

Thursday, July 31, 2008

SEO Basics For A Rookie

The Search Engine Optimization process is briefly described by the following stages:

1. Keyword Research:

Proper research ensures that your website is optimized around the keywords most often used by your visitors. Without proper research, optimization is likely to yield poor results. The research phase produces a finalized set of keywords that may contain anywhere from 1 to 150 keywords. You can use the free tool available at http://freekeywords.wordtracker.com. Prepare a set of keywords for every page of the website, depending on the content of that page.

2. Identifying your Competition:

After selecting the best keywords, it is important to know which websites rank highest in the major search engines. Websites that appear most often are referred to as your 'Competition'. Comparing your website to your competition helps to give an idea of where the initial efforts of optimization should be focused. Identify and study the keywords being used. Check the link popularity and PR of the competitors and make a note of them. Identify the traffic the competitors are getting. You can use these free tools: http://www.marketleap.com, http://www.trafficestimate.com, http://www.alexa.com.

A detailed analysis of your website and the competition provides the details necessary to plan a successful Optimization.

3. Meta Tags, Title Tag, Hyperlinks, Image Alt Tags:

Optimizing the HTML code includes creating Title tags, Meta description tags, Meta keyword tags, hyperlinks, and possibly headings & image alt attributes. All of these are factors in helping the search engine spiders to properly classify your site, with some being more important than others. Rather than go into detail on how to perform all of these tasks, I'll direct you to some of my previous SEO articles.

When your code is optimized, it's finally time to upload your new pages to your server. In the past, at this point we would then resubmit them to the search engines. However, these days, this is an unnecessary step. As long as you're dealing with an existing site with some links already pointing to it, the search spiders should visit and index at least your home page fairly quickly. It may take a bit longer for them to re-index the inner pages, but you can rest assured that they will.

Do not spam the description or keyword meta tags by stuffing in meaningless keywords, and do not spend too much time on these tags. SEO pros agree that these tags are not as important today as they once were.
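As a small illustration of what checking these tags can look like in practice, here is a sketch using Python's standard html.parser module to pull out the title and meta description of a page; the sample markup and company name are made up, and the length printed is just something to eyeball against common rules of thumb:

from html.parser import HTMLParser

class HeadTagChecker(HTMLParser):
    # Collects the <title> text and the meta description from an HTML page.
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Made-up markup for illustration only.
sample = ("<html><head><title>Hand-made Blue Widgets | Example Co</title>"
          "<meta name='description' content='Hand-made blue widgets, shipped worldwide.'>"
          "</head></html>")
checker = HeadTagChecker()
checker.feed(sample)
print("Title:", checker.title, f"({len(checker.title)} characters)")
print("Meta description:", checker.meta_description)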

4. Content Optimization:

Once the keyword phrases are chosen for each page of the site, it's time to get down to the nitty-gritty and start using them within the copy. If you're rewriting from scratch, be sure your copywriter understands which phrases need to be used, what the goals of the site are, and who the target market is. Obviously this information will affect how they write the copy, so the more they know, the more accurate your copy will turn out. If you're able to work the necessary keywords into the existing copy yourself, that's your next step. Once your copy is finished and approved, you should have a number of pages focusing on two or three keyword phrases each.

5. HTML Site Map:

Create an HTML sitemap providing links to all the pages. Have the keywords in the body text and in the anchor links if possible.
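A minimal sketch of how such a page could be generated, assuming a hand-maintained list of URLs and keyword-rich anchor texts (the URLs and phrases below are placeholders):

# Placeholder (URL, anchor text) pairs; replace with your real pages and key phrases.
pages = [
    ("/", "Blue Widgets Home"),
    ("/blue-widgets/", "Hand-made Blue Widgets"),
    ("/contact/", "Contact the Blue Widget Makers"),
]

items = "\n".join(f'  <li><a href="{url}">{anchor}</a></li>' for url, anchor in pages)
print("<h1>Site Map</h1>\n<ul>\n" + items + "\n</ul>")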

6. XML Site Map:

Create XML sitemaps and submit them to Google and Yahoo. Free tools are available at http://www.xml-sitemaps.com/, www.google.com/webmasters/ and http://siteexplorer.search.yahoo.com/.
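If you would rather generate the file yourself than use one of these tools, here is a minimal sketch using Python's standard xml.etree library; the domain and URLs are placeholders:

import xml.etree.ElementTree as ET

# Placeholder URLs; replace with the real pages of your site.
urls = ["http://www.example.com/", "http://www.example.com/about/"]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = page

# Writes a sitemap.xml file ready to submit through the webmaster tools above.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)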

7. Website Structure Analysis:

Website structure analysis really means website architecture: an approach to the design and planning of websites that involves technical, aesthetic and functional criteria. The main focus is on the user and user requirements when designing the structure of the website. Attention is primarily given to web content, usability, interaction design, information architecture and web design.

For search engine optimization, the navigation scheme of the website should be designed first and foremost for visitors. Through usability testing we can easily determine what type of navigation scheme visitors prefer, for example a text-based or a graphics-based one.

First and foremost, for a website to be successful it should satisfy your target audience. At the same time, by making the site simple and easily accessible we provide the easiest path for the search engine robots to index it. This strategy helps the long-run success of search engine optimization: no matter how often the search engines' algorithms change, good content and simple navigation are always rewarded.

Search engine friendly design: The easier the navigation and the more text on the page, the better it is, not only for the visitor but also for the search engine robots. While designing a search engine friendly site, avoid long JavaScript in the source code, dynamically generated pages, frames and so on, as the search engine robots have difficulty crawling them.

There are techniques to overcome these indexing problems, but the best way to make sure the pages will be indexed is to keep them simple, which also encourages return visits from both the search engines and the visitors.

8. Robots.txt file:

Many search engines use programs called robots to gather web pages for indexing. These programs are not limited to a pre-defined list of web pages; they can follow links on pages they find, which makes them a form of intelligent agent. The process of following links is called spidering, wandering, or gathering.

The robots.txt file,

• is a plain text file created in Notepad or any text editor
• should be placed in the top-level directory (root) of the website or server space
• should be named in all lower case letters (robots.txt)

Through the robots.txt file we can,

• exclude all robots from visiting the server
• allow complete access to all robots
• exclude robots from accessing a portion of the server
• exclude a specific robot
• exclude certain types of files from being accessed, by specifying their file extensions.

Here are a few examples of robots.txt entries and what they mean.

User-agent: *
Disallow:

The asterisk (*) in the User-agent field is shorthand for "all robots". Because nothing is disallowed, everything is allowed.

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /private/

In this example, all robots can visit every directory except the three mentioned.

More examples are available in my previous post - http://seocolumn.blogspot.com/2005/12/search-engine-robots-and-robottxt.html
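To double-check how a given set of rules will be interpreted, Python's standard urllib.robotparser module can evaluate them; in this sketch the rules from the example above are fed in directly rather than fetched from a live site, and the domain is a placeholder:

from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# "*" stands for any robot; the URLs are placeholders.
print(parser.can_fetch("*", "http://www.example.com/index.html"))         # True
print(parser.can_fetch("*", "http://www.example.com/private/page.html"))  # False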

9. One-way links and Directory links:

Some SEOs recommend that you do these before all the other work so that you can start getting rankings right away. However, I prefer to wait until the copy changes have been made and uploaded. The newly focused copy helps the directory reviewers to more easily understand what your site is all about, and they'll be less apt to edit your submitted description. If you submit before your site's copy is using your specific keyword phrases within the copy, the reviewer may feel these keywords don't belong within your directory description.

When link building, think quality, not quantity. One single, good, authoritative link can do a lot more for you than a dozen poor quality links, which can actually hurt you.

Directory submission to Yahoo, DMOZ and the like, on the other hand, can happen immediately. You just need to make sure the details these submission forms ask for are ready with you, and make sure the home page has some relevant content prior to submission to these directories.

10. Reciprocal links:

As with directory submissions, I prefer to wait until the site is in perfect condition before starting to request links. You can certainly get started researching link partners before the SEO is complete; however, the better your site is, the more likely others will be willing to link to it. Link building can be done in a quick burst, but should also be an ongoing process. You should always be on the lookout for sites that are a good fit with yours and contact them about exchanging links.

In some cases, additional content pages may also need to be created, depending on the keywords you want to rank for.

11. Article Submissions:

The fundamental base of the internet is not the page, but the link. A great way to accumulate links is to write articles and submit them to places interested in quality articles.

When you write articles for free on certain websites, you are typically given a by-line or author bio in which you can link to your website. A few websites also accept links within the body text using HTML tags.

A few good article resources are listed below.

http://goarticles.com/
http://ezinearticles.com/add_url.html
http://amazines.com
http://www.articlecity.com/
http://bpubs.com/
http://businessknowhow.com/
http://www.certificate.net/wwio/ideas.shtml
http://www.promotionworld.com/
http://promotenewz.com/

12. Adopt Web 2.0 Technologies:

Do not think that the traditional SEO tactics (optimizing meta tags, keyword density, keyword prominence and so on) have lost their importance. They are still important, and one needs to follow them for better rankings.

Here are a few easy steps: http://seocolumn.blogspot.com/search/label/web 2.0

Guidelines for content writing to improve ranking in search engines

This is only for front-end content writing...

Keyword Relevancy: The more relevant the content is to a specific search term, the more likely the page will rank at the top of the search results for that query. Each page needs to be targeted at a specific keyword or key phrase, rather than targeting every page at every keyword.

Keyword Density: Keyword density is the ratio of a given keyword to the overall amount of text on a particular page. Keeping this ratio at a sensible level is very important for SEO strategy.
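There is no universally agreed "optimal" ratio, but the calculation itself is straightforward; here is a rough sketch counting how often a single-word keyword appears relative to the total word count (the sample copy is made up):

import re

def keyword_density(text, keyword):
    # Return the keyword's share of all words on the page, as a percentage.
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words) * 100

copy = "Blue widgets are our speciality. We make blue widgets by hand."
print(f"{keyword_density(copy, 'widgets'):.1f}%")  # 2 of 11 words -> 18.2%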

Header Tags: These work in combination with the words found in the title, the headings on the web page and the body text. Each page needs to have the keyword in its header tag and title tag.

Keyword Prominence: Let the header tag and/or the first line start with the keyword. Also have the keyword in the first line of each paragraph (the keyword needs to appear in the first 90 characters). The same goes for closing the content: have the keyword phrase twice in the closing paragraph, as well as in the last 90 characters.
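A small sketch, under the same assumptions, that checks whether a key phrase appears within the opening and closing 90 characters of the copy (the sample text and phrase are made up):

def keyword_prominence(text, phrase, window=90):
    # Check whether the phrase appears in the opening and closing window of the copy.
    lowered, phrase = text.lower(), phrase.lower()
    return {
        "in_first_90": phrase in lowered[:window],
        "in_last_90": phrase in lowered[-window:],
    }

copy = ("Blue widgets are our speciality. " * 5) + "Order hand-made blue widgets today."
print(keyword_prominence(copy, "blue widgets"))
# {'in_first_90': True, 'in_last_90': True}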

Anchor Text / Links: Link the key phrases to the appropriate pages. Even here - do not try to link all the main keywords to all the pages. Identify a key phrase for each page and link it. Treat every web page as a different website.

Tuesday, July 29, 2008

SEO of Dynamic Pages

Here are some things to keep in mind when optimizing dynamic pages:
  1. Dynamic URLs are definitely search engine friendly.
  2. If possible, specify the title tag, or use a CMS which has this option. The title tag is the most important on-page SEO factor. Most CMSs use the site name as the default title tag. The worst ones use it as the title for all pages; less bad ones use it in a form like "Site Name - Page Title". Ideally, you want the page title first (see the small title-building sketch after this list). And make sure that the title of that content page is well optimized for the keywords you're trying to target.
  3. Specify meta tags. This is not so important for SEO, but can make a difference to your click-throughs from the SERPs (search engine results pages) - particularly the meta description. Ideally, you can specify the content for the meta description when you're creating content. Same goes for the meta keywords, but this is very minor. Major search engines don't really care anymore.
  4. Proper use of heading tags. Make sure that you use an h1 tag for the page heading, and use subheadings (h2, h3, etc.) for sub-headers in the content. These are relatively well-weighted SEO factors. If you're using a WYSIWYG editor in the CMS, make sure that it outputs real heading tags as opposed to just changing the font size.
  5. Clean code. I've seen so many dynamic sites that are messily coded. Ideally, the outputted HTML code is as clean as possible and the actual content appears relatively close to the top of the HTML. Using CSS-based, non-table layouts will make a difference here as well.
  6. With regards to listing all the dynamic links on a static page, that's not really necessary, though it's still a good idea to have a sitemap - both HTML and XML.
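Here is a tiny sketch of the "page title first" pattern mentioned in point 2, as a CMS template might implement it; the separator, site name and 65-character guideline are arbitrary illustrative choices:

def build_title(page_title, site_name="Example Co", limit=65):
    # Compose a <title> with the page title first, trimmed to a common length guideline.
    title = f"{page_title} | {site_name}"
    return title if len(title) <= limit else page_title[:limit]

print(build_title("Hand-made Blue Widgets"))
# Hand-made Blue Widgets | Example Co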