Thursday, November 29, 2007

Three Words About Spam

Don’t do it.
You can spam search engines by stuffing your site with keywords, submitting it over and over, or filling your pages with links, but these tactics will just get you blacklisted.

Don’t do it. It’s just not worth it. It used to be accepted practice to create doorway pages (duplicates of a home page filled with different keywords), but search engines won’t accept these anymore.

They will accept smart pages, though. If you want to use more keywords than you can fit on your site, create a second page that is totally different from your homepage but still based on your product or service. Here you can expand on a topic you merely touched on in your homepage. A great example of a smart page is an article on the benefits of your services, written around a different set of keywords to those on your home page.

To sum it up, this chapter gave you a detailed view of many of the proven and effective search engine optimization techniques. SEO is one of the most important traffic-generating mechanisms, and when done correctly it can do wonders for your website and your profits.

Monitoring Your Progress

Okay, so you’ve decided on your keywords, inserted your links and submitted your sites. Now all you have to do is wait for the cash to start pouring in, right?

Well, not quite. You might get lucky with your first shot, but it has never happened to me that way. Once you’ve submitted all your links, you need to keep a close eye on them, and see which need improving and which can be dropped.

The crucial factor here is to keep track of your Search Engine Statistics. These will tell you how many people have come from the various search engines and how many of those become customers. But it’s not enough to know how much traffic you’re receiving. You also want to know how you’re doing in the rankings. There are two ways to do that:

Manual Searches

Dead easy: simply log on and look. First, enter your URL to make sure your site has been approved and listed; that can take a little while. Once you can see that you’re online, you want to see how you’re ranked for each keyword. To do that, simply enter each keyword into the search engine and browse the pages until you find your listing. This works, but it takes a while.

Computerized Searches

It’s been a long time since I did a manual search. If you’ve got ten sites and you want to check ten keywords for each once a week, you’re going to lose at least a day’s work a month. That’s too much for me. To automate this task, I suggest using a software program found at WebPositionGold.com. You tell the software all the keywords you want checked and it gives you an automated report.

When you look at your statistics, pay attention to which keywords are bringing in the most traffic. In general, the higher you are, the more traffic you’ll receive and the more sales you’ll make. But that isn’t always true. It might pay more to be fifteenth on a keyword that gets a million searches a month than first on a keyword that gets just a thousand. And if you’re fifteenth, you’ve still got room for improvement.

It’s the improvement that’s the key. If you see that your link is stuck at the bottom of a list somewhere, try adding more links, putting that keyword in more pages or adding keyword-rich content.

Optimizing Your Website

To get listed correctly in the search engines, each page of your site that you want listed needs to be optimized to the best of your ability. The keywords you decide to target will be used throughout the optimization process, so choosing the right ones is essential; pick the wrong keywords and you will not be found in the search engines. Because this choice matters so much, I’ve put together a few tips to help you make the right selections. Use them for each page that you want listed in the search engines.

Think "specific keyword phrases" not "keywords". Due to the extreme amount of competition for general terms in the search engines, if your keyword phrases are too general it is very unlikely you will rank well in the search engines. You stand a far better chance to rank well for specific phrases where there is less competition. The resulting traffic, since it is more highly targeted, should also be much higher quality too.

You should try to come up with as many keyword phrases as you can think of that relate to the page you are optimizing. Ask a few friends and family members what they would search for when looking for a site like yours. Check out your competition for ideas: do a search using keywords you already know you want to target, click through to the top sites that come up, and view each page’s HTML source to see the keywords in its meta tags. To view the HTML code, simply click “View” at the top of your web browser, then select “Source” or “Page Source”. Searching competitors’ meta tags will give you many more ideas, but make sure to only use keywords that relate to YOUR site or page. Using these tips, develop a list of keyword phrases for each page that you optimize for the search engines. Apart from these, there are certain aspects that should be avoided.

These are:
.. Dead Links - As search engines index your entire site by crawling through hypertext links, you must make sure you check for dead links before submitting.

.. Graphics and Image Maps - Search engines cannot read images; be sure to include alternative text (alt) tags.

.. Frames - Many search engines aren't frames compatible. Meta tags are important in this instance.

.. Spamming - Avoid resubmitting your pages repeatedly to search engines if your site does not get listed in the first few weeks. Allow at least six weeks before resubmission. Continual resubmission (such as that caused by automatic submission software) can cause your site to be penalized.

Inward Link Analysis

Like reciprocal linking, inward links to your website can be an effective strategy to increase your website’s visibility. Inward links are links that point to your website from other websites without a reciprocal link back from yours.

There are many techniques to improve inward linking, and many of them have enjoyed success. The most proven technique is writing and distributing free reprint articles. You simply write (or hire a ghostwriter to write) a few 500-700 word articles related to the theme of your website, and include an “About the Author” resource box at the end of each article with a link back to your website. Then you visit article directories (websites that accept article submissions) and submit your article, giving full permission for other websites to reprint it as long as they include your resource box and a link back to your website. This is a win-win scenario: webmasters get needed content to make their sites “sticky” and useful to visitors, and you get free inward links pointing back to your website.

Other techniques include posting free ebooks, newsletters, news stories and press releases at other websites, particularly industry specific and general portals. All of these would contain a free link pointing to your website, thus, promoting your website.

Picking Your Partner

Your link partners should be sites your target market will visit. Think about your product and its subject area and brainstorm to determine where people interested in your product might be looking online. For example, if you’re trying to sell a book about blackjack strategy, it makes sense that the people visiting online casinos would make great customers. Online casinos then could be good partners. Identify top-ranked, high quality casino sites and find the email address of their webmasters. You can also identify your competitors, see where they trade links and then follow suit.

Tips for Talking to Webmasters

Before you contact webmasters, place a link to their site on your home page or resource page to assure them that you will actually provide a high quality link.

Create a subject line that will encourage them to read your message rather than deleting it. You don’t want them to think your message is spam. (Something about their site or product is sure to capture their attention; they will probably open and read it, thinking that you’re a potential customer.)

Begin your message by talking about your visit to their site and what you found interesting about it. Detail your product or service in one line and ask them to exchange links with you.


Tell them in detail where you have placed their link and emphasize that it is only one click away from your homepage. Tell them that if you don’t hear back from them within 2 weeks, you will consider that to be a negative response and that you will remove their link from your site.

Link Popularity and Link Analysis

The majority of the major search engines use link popularity as an important factor in ranking relevancy. As search engines have become more sophisticated, so too has link popularity. Link popularity simply is the number of links from other websites that point to your website. This strategy has gained immense success due to the crawling nature of most search engines. Spiders crawl from link to link and store pages into their database. Link popularity is generally gained through reciprocal linking. Other websites would usually link to your website only if you have a link to their website from yours first.

Years ago, the number of websites linking to your site gauged link popularity; little emphasis was placed on the "content relevancy" of the linking site. In an effort to gain more link popularity, "link farms" began sprouting up across the web. For a nominal fee, a website owner could join link farms and enjoy increased link popularity overnight.

Search engines caught onto this tactic and created better tools for detecting legitimate links. Websites that have links from websites with "similar" or "relevant" content score higher, thus earn better placement in search engines.

This being said, you should avoid joining "link farms" because most search engines consider them a form of spam. Many engines will actually penalize sites for maintaining an abundance of links from unrelated websites. It is more important than ever to develop a solid link-popularity strategy. One excellent, although time-consuming, method is to simply contact complementary websites via email or telephone and request a link exchange with the webmaster.

Link analysis is somewhat different from measuring link popularity. While link popularity is generally used to measure the number of pages that link to a particular site, link analysis goes beyond this and analyzes the popularity of the pages that link to your pages. In a way, link analysis is a chain analysis system that gives a weight to every page linking to the target site, with weights determined by the popularity of those pages. Search engines use link analysis in their page-ranking algorithms. They also try to determine the context of those links, in other words, how closely those links relate to the search string. For example, if the search string was “toys”, and other sites linked to a page with the word “toys” within the link or close to it, the ranking algorithm would treat those as higher-priority links and rank the linked-to page higher.

Reciprocal Links and Partner Sites

Keywords and AdWords aren’t the only ways to earn visibility on the search engines; links to other similar sites are another important factor in how relevancy is scored. Keywords have been so abused by some webmasters that links are winning much more relevancy points, and Google is said to love them. It might sound strange to suggest that your users should check out your competitors, but they probably know about them anyway. If your competitors have a higher ranking than you, linking to them can often give you a higher relevancy score with the search engines, and the subsequent increase in traffic will make it worth your while. Alternatively, you can link to your own site by creating a subdirectory. This is like building another web page, but the URL will include your keyword. So if you were selling stuffed toys, the new URL would be YourDomain.com/stuffed-toys/stuffed-toys.html. You could then write a short paragraph on the home page, describing the new page and including a link. You’ll get big relevancy points for this.

Reciprocal Links

Reciprocal linking means forming partnerships with other sites that place a link from their web pages to yours. You give them a similar link in return.

When you look for people to swap links with, make sure that you don’t reduce the quality or content of your own site. You don’t want users to click straight through without reading your content; you want them to buy first. One way to stop them from running away too quickly is to create a “Resources Page” and link to that page from your homepage. This doesn’t take away from the content on your homepage and the links are just one click away rather than being buried deep within the site, giving value to your partners. In any case, you want to be sure that your site is more than just a page full of links. If your site contains more links than content, it will not be attractive to webmasters, search engines or users.

Google AdWords

If you do a search at Google, you’ll notice that you don’t just get a list of all the sites that match your keyword; you also get a list of relevant ads on the right of the page and at the top of the listing. These are part of Google’s AdWords program. Advertising like this can certainly be an important part of your marketing plan: well-developed ads with clever wording can prompt an immediate response from the reader to visit your site. Google makes a lot of money with this kind of advertising, and if they’re making money, you can be sure their advertisers are too. You can learn more about Google’s AdWords program here: www.adwords.google.com/select/

Buying AdWords advertising on Google is a relatively simple and cost-effective way to promote your website. In effect, Google has combined the pay-per-click system with their own relevancy calculations. To get started, you’ll need to select a keyword and write a short description. You will also have to choose how much you wish to pay for each click, but that won’t guarantee your position.

Google advertisers enter a maximum bid per click and this is multiplied by the click-through rate (the percentage of users who click on the ad). That’s the score Google uses to allocate position. So, for example, if you were prepared to pay a dollar per click, and one user in a hundred who saw your ad clicked on it, you would get a rank number of $1 x 1% = 0.01.

Let’s say that gives you top position. You might then get even more users and a higher click-through rate of 2%. That higher rate would reduce your price to 50 cents (0.01 divided by 2%). All very nice, and it’s always fun to pay less than you’ve said you can afford, but how it works is less important than the fact that it does work. All you have to do is figure out how much you’re prepared to pay for each click, how much you can afford to pay each month, and write a great description.
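To make that arithmetic concrete, here is a minimal sketch of the bid-times-CTR idea in Python. The function names and numbers are mine, purely for illustration; Google's real AdWords formula has changed over the years and includes other quality factors beyond bid and click-through rate.

```python
def ad_rank(max_bid_dollars, click_through_rate):
    # Rank score as described above: the bid multiplied by the CTR.
    # Illustrative only -- the real AdWords formula includes other
    # quality factors and has changed over the years.
    return max_bid_dollars * click_through_rate

def effective_cpc(max_bid_dollars, old_ctr, new_ctr):
    # If your CTR improves, the price needed to hold the same rank
    # score drops in proportion.
    return ad_rank(max_bid_dollars, old_ctr) / new_ctr

# The example from the text: a $1.00 bid at a 1% CTR scores 0.01.
print(ad_rank(1.00, 0.01))              # 0.01
# At a 2% CTR, holding that same score costs only about $0.50 a click.
print(effective_cpc(1.00, 0.01, 0.02))  # 0.5
```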

And once again, it’s the description that’s key. As with the PPCs, your description has to persuade users that you’re relevant; it doesn’t have to play to the search engine’s software. By all means repeat the keyword, but also make sure you have good call-to-action copy like “Grab a great deal on DVDs today!” or “Buy now, while stocks last!” Remember, the more clicks you get, the more sales you’ll make and the less you’ll pay.

Always place your AdWords ad in the most appropriate category and track the responses you receive from it. Be proactive in redefining your strategy if you receive minimal response. You will probably need to experiment with the wording of your ads and your keyword selection for a while until you get the results you want.

Submitting to Pay-Per-Clicks

Submitting your site to a PPC is certainly a lot faster than submitting to a search engine or a directory. You must, however, make sure you consider the following:

.. The maximum amount you can bid (I can’t stress that enough!)

.. The keywords you wish to bid on.
.. The titles and descriptions of the site.
That last point is very important for making the most of PPC’s.

Just because you don’t have to worry about putting keywords in your title and descriptions to please the search engines, that doesn’t mean relevance isn’t important. On the contrary, relevance still matters. You need to let the user know that your site is exactly what they’re looking for. That means putting the keyword in the title and having a catchy, informative description. Remember, the more good clicks you get, the more money you are likely to make.

Show Me the Money!

With PPC’s, the name of the game is profit. You need to be careful not to get carried away with the ranking so that your promotion doesn’t cut into your revenues. This is essential! There’s no point in being top if you’re out of business in a month. You have to figure out what you can afford and keep to it. Base your decision on your visitor-to-sales-ratio (the number of visitors on average that it takes to generate a sale) and your net profit per sale.

So, for example, if you get a sale from every tenth visitor, and you net a profit of $20 from each sale, then you can’t pay more than $2 for each click without operating at a loss (unless you have an effective back-end sales campaign set up, as discussed earlier). In practice, you might make one sale for every 100 or so clicks and pay perhaps 15 or 20 cents for each visitor, depending on your market. It’s absolutely crucial for you to know your visitor-to-sales-ratio. It’s also important to keep that ratio as low as possible (the fewer visitors it takes to make a sale, the better), and that means only bidding on relevant keywords. If you pay for visitors who are looking for something completely different than the services you’re offering, you’re just throwing your money away. They aren’t going to buy, and even at five cents a shot, those wasted nickels soon add up.
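The break-even math above is simple enough to sketch in a few lines of Python. This only restates the calculation from the text; the helper name and the second scenario's figures are illustrative.

```python
def max_cpc(visitors_per_sale, profit_per_sale):
    # Break-even cost per click: the most you can pay for a visitor
    # before paid traffic stops being profitable (ignores back-end
    # sales, as noted above).
    return profit_per_sale / visitors_per_sale

# The example from the text: one sale per ten visitors, $20 profit each.
print(max_cpc(10, 20.0))   # 2.0 -- pay more than $2 a click and you lose money

# A more typical scenario from the text: one sale per 100 clicks.
print(max_cpc(100, 20.0))  # 0.2 -- so 15-20 cent clicks still leave a margin
```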


On the other hand, because you can pay so little, it is worth bidding on as many relevant keywords as possible.

The key is to balance high payments for top keywords with low payments that bring in less traffic.

You should also consider the quality of visitors the site will send you. The more targeted a directory, the better your visitor to sale ratio will be and that might make it worth increasing your bid price.

Pay-Per-Click — Buying Status

Pay-per-click programs (PPC’s) allow you to buy a prime position in a search engine by selecting the price you wish to pay for each visitor you receive. This can place you exactly where you want to be in the listing, or let you decide precisely how much you want to spend on advertising.

The big advantage of PPC’s is that you don’t have to worry about messing with keywords or links or any of that. You can just figure out how much you want to pay for a keyword and buy your position. In addition, you only pay for people who actually click on your link (for banner ads, you often have to pay when someone sees it.) And you can also get cheap visitors. Bids usually start at around five cents per click. The top three bids though are often promoted across a network of sites so there can be big bonuses for bidding high.
This is how most pay-per-click programs work:
.. You create your page title, description and link as you want it to appear in the search results.

.. You enter the keywords and phrases that will prompt your listing to appear.

.. You enter your keyword bid (the amount you are willing to pay for each click to your site).

.. Your keyword bid is compared to that of other bidders for the same keyword. The results are returned to the user with the highest bid appearing first.

Submitting to Search Directories

Submitting your site to a search directory is a little tougher than submitting to a search engine. Directories don’t have spiders. They rely on humans. When you submit your site to Yahoo! or any of the other directories, you’ll have to complete a form that will include your “URL”, “Page Title”, “Keywords” and a “Page Description”. Your keywords and title will play some role in your ranking, but for the description, it’s much better to put a hard sell that will attract users. There’s no point having a link at the top of a category if no one wants to click on it.

Bear in mind that because each submission to a directory is checked by a human editor, it can take quite a while for your site to be approved and listed. Some sites do have express services, but these are pretty pricey (Yahoo! wants $299, or $600 for adult sites!), and if they decide your site isn’t suitable for a category, you don’t get your money back. It’s usually worth the wait.

Search Directories – The Benefits of Browsing

Search directories differ from search engines by providing a range of categories for users to browse. Rather than enter a keyword into a search box, users click through categories and sub-categories narrowing down their options.

You could say that search engines are like going straight up to the sales assistant and asking what they have in evening wear; search directories are like browsing through the store and seeing what catches the eye.

How you make your site catch the eye in a directory is actually pretty similar to standing out in a search engine. It’s all about relevancy — a mixture of keywords and links.

Submitting to Search Engines

Submitting sites to search engines is much easier than submitting them to directories or pay-per-clicks. In fact, you only have to submit the home page. The search engine’s “spider” (a neat little software program) will then follow all the links from the home page and include your other pages. Spidering actually increases your relevancy score more than hand-submitting your internal pages yourself.


The disadvantage of spidering is that it can be slow. Google has the best spider, but even they can take up to a month to index all your pages. For other search engines you can wait three times as long.

Say what?! Is all of this search engine talk confusing to you? Do you want to make money online, but don’t want to figure it all out by yourself? You’re not alone. Click here for a turn-key money-making system that you can start earning from within the next 24 hours.

Web Copy for SEO

The search engines will scan the text on a web page to see if your site is relevant to the search term. That means that in effect, your web copy is going to have to do two things: persuade a customer to buy, and persuade a search engine it’s relevant. When you write your copy aim for about 500 words a page, but throw in 4-8 keywords. You’ll have to try to balance a smooth text flow against getting in all the keywords you need to be listed.
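If you want a quick way to check a draft against those rough targets, a few lines of Python will do it. This helper is hypothetical and only counts words and keyword occurrences; it is not how any search engine actually scores a page.

```python
import re

def copy_stats(text, keyword_phrase):
    # Rough word count and keyword-phrase count for a block of web copy.
    # A quick aid for the ~500-words / 4-8-keywords rule of thumb above;
    # it is not how any search engine actually scores a page.
    words = re.findall(r"[A-Za-z0-9']+", text)
    hits = len(re.findall(re.escape(keyword_phrase.lower()), text.lower()))
    return len(words), hits

draft = "Grab a great deal on stuffed toys today. Our stuffed toys ship free."
word_count, keyword_count = copy_stats(draft, "stuffed toys")
print(word_count, keyword_count)  # 13 words, 2 occurrences of the phrase
```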


You can also consider adding text-only pages such as how-to articles, tips or tutorials to your site. Throw in some keywords and they can turn up in search engines and create opportunities for link exchanges.

So there are a few ways you can try to improve the position of your site in a search engine. More important than where you put the keywords is choosing the right keywords, and that’s not really a huge challenge, as your competitors are likely to have done the job for you. Of course, even if you do get everything right, it doesn’t mean you’re going to shoot straight to the top of Google. One of the criteria for relevancy is how long you’ve been online, so success on the search engines won’t come overnight. The sooner you start submitting, though, the sooner you can start to rise.

Search Engine Page-Ranking Algorithms

Most Search Engine page-ranking algorithms rank pages based on the following aspects:

.. Content of the website
.. Representation of content, keywords, and links on websites
.. Location and number of inward and outward links on websites
.. Relevancy of search terms as compared to the websites

Given below is a brief description of the page-ranking algorithms of some of the most popular search engines.

Google
You can submit your website to Google at the following URL: www.google.com/addurl.html. Submitting your site will only make Google aware that your page exists; it is quite possible that your pages may get crawled even if you have not submitted. It is advisable to submit the home page and some inside pages; the inside pages are added to the submission in case the home page is too slow to load or crawl. The pages that are submitted should link to the rest of the pages. Google indexes the full text that is visible on any page that it crawls. It generally does not index the meta tags – keywords or descriptions.

When Google lists your page in the search results, the description that is displayed is the extract of text around the first line where the search word appears on the page. It may thus be a good idea to write a good description of the page, built around the most likely search term(s), and place it near the top of your page.

You should remember that one sure way of getting your site listed and indexed is to have several links pointing to your site, appearing on web pages that in turn have several other links pointing to them. The term “link popularity” is used for this. Google analyzes the links of the pages it has visited, and this “link analysis” helps to determine the ranking of a page. Google uses a proprietary PageRank algorithm for determining relevance and ranking of pages in the search results. Location and frequency of the search term on your web page are no doubt factors in ranking; however, off-the-page factors such as link analysis are more important. Generally, Google provides search results based on relevancy, meaning that it returns a list of pages ranked by the number of other web pages linking to each page, as well as other mathematical algorithms.
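Google's actual PageRank formula is proprietary, but the basic link-analysis idea can be illustrated with the textbook-style iteration below. Treat it as a sketch only: the three-page graph, the damping factor and the function name are illustrative assumptions, not Google's implementation.

```python
def simple_pagerank(links, damping=0.85, iterations=50):
    # Textbook-style PageRank iteration: each page splits its score
    # among the pages it links to, so popular pages pass on more weight.
    # Google's production algorithm is proprietary and far more involved;
    # this only shows the link-analysis idea.
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            for target in targets:
                new_rank[target] += damping * rank[page] / len(targets)
        rank = new_rank
    return rank

# Hypothetical three-page web: A and C both link to B, B links back to A,
# so B ends up with the highest score.
print(simple_pagerank({"A": ["B"], "B": ["A"], "C": ["B"]}))
```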

Yahoo!

Yahoo! maintains a human-powered directory and offers its results to visitors. The directory is supplemented by a web page index created by crawling. The directory is an important channel in search engine marketing: it is popular, it is used extensively by people to locate sources of information, and it gives your site a valuable boost for crawling and ranking in other search engines, because the directory provides a high-quality link to your website.

A visitor looking for information on relevant sites can either browse through the hierarchy of directories and subdirectories or search for an appropriate directory through a search interface. As your site can generally be listed in just one category, the choice of category is an important step. Choose the category that a target visitor making a search would be most likely to select out of the different categories offered. Select your target keywords and find out which categories relate to those keywords. For submission of non-commercial sites, the Yahoo! Express submission is recommended rather than the Standard submission option.

The results page in your chosen category will list your site in two possible sections (for most categories). One section is called "Most Popular Sites" and this is on top, while the second section contains the remaining listings in Alphabetical order. Yahoo! does not reveal how it includes certain sites in the “Most Popular Sites” list. However, link analysis and click-throughs are likely to be factors. You cannot pay to be included in this section. Certain sites with sunglasses shown next to their name or an “@” symbol shown at the end of the name reflect that Yahoo! considers those sites to be excellent.

Inktomi (MSN Search, AOL Search, Hotbot)

Inktomi is a search engine that does not offer its search services through its own site, but through partner sites, prominent ones being MSN Search, AOL Search, HotBot and others. Inktomi, through its crawler, creates three different indexes. The “Best of the Web” index holds around 110 million pages that it considers high in link analysis. The next set of around 390 million pages is indexed as “Rest of the Web”, considered lower in link analysis. The third index is for paid inclusion. Inktomi also offers specialized regional indexes as well as targeted news, multimedia and directory indexes, and it avoids duplicating the same page in more than one index.

Link crawling and paid inclusion are the two most effective ways to get covered. For bulk submissions to its paid program, it offers IndexConnect (for 1,000 or more pages), again on a cost-per-click basis with a monthly minimum. Ranking at Inktomi is determined by a combination of factors including HTML links, and keywords and description tags near the top of the page or in the title tag. If the search string matches what is found at these places on the page, the ranking is higher. Link analysis and analysis of click-throughs are other important criteria.

AltaVista

AltaVista will accept free listings through its “addurl” link, but it also has paid inclusion features. Generally, their crawler may visit every four weeks. Paid inclusion may be desirable if you have a new website or web pages, or if you alter your web pages frequently and do not wish to wait until the next cycle of crawling. There is an “Express Paid” inclusion service of the self-service type for up to 500 pages at a time; this service enables weekly crawling. Their bulk program, called “Trusted Feed”, enables pages to be linked directly to their index. Pricing for “Trusted Feed” is on a cost-per-click model with a monthly minimum. In this program you can submit the meta data, descriptions and keywords directly to the index. Nevertheless, the engine will check whether the destination page has the same meta data or not and could levy a penalty for spam.

AltaVista’s ranking policies are a combination of various factors. The frequency and positioning of keywords and descriptions is important, as are title tags or words that appear near the top of the page. AltaVista applies link analysis to determine relevancy and page ranking. It penalizes spamming and does not recognize invisible or tiny text, keyword stuffing, identical pages, mirror sites, or quick meta refresh tags.

Keywords — Optimizing Your Site to Get Top Billing at Search Engines

When a user enters a search term, also known as a “keyword”, into a search engine, the engine runs through the billions of pages in the database and awards each one a “relevancy score”. The higher your score, the higher your listing. If your site doesn’t contain the keyword used by the searcher, the only score it’s going to get is a big, fat zero. Your first task then is to make sure you know which keywords are most relevant for each of your sites.

There are three ways to figure out your keywords:


Ask Your Competitors

This is the cheapest way to find many of the most important keywords. Simply log on to a search engine (AltaVista is good, Google is better) and carry out a search for sites like yours. Open the top site and, once the home page has loaded, click on “View” in your browser and then “Source”. That will reveal all the HTML used to build the web page, including all the keywords that have been specially inserted. Some of those keywords will be relevant to your site; others won’t be, and there will be plenty that aren’t an obvious fit for your topic (“vitamins”, for example). You can repeat the process on other sites, using different keywords, and build up a pretty long list.
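If you would rather not repeat the View > Source step by hand for every competitor, a short script can pull the meta keywords and description tags for you. This is a minimal sketch using only Python's standard library; the URL is a placeholder, and pages with badly broken HTML may need a more forgiving parser.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class MetaTagReader(HTMLParser):
    # Collects the content of <meta name="keywords"> and
    # <meta name="description"> tags -- the same information you see
    # with View > Source, just gathered automatically.
    def __init__(self):
        super().__init__()
        self.found = {}

    def handle_starttag(self, tag, attrs):
        if tag.lower() == "meta":
            attrs = dict(attrs)
            name = (attrs.get("name") or "").lower()
            if name in ("keywords", "description"):
                self.found[name] = attrs.get("content", "")

# Placeholder URL -- substitute a top-ranked competitor's address.
html = urlopen("http://www.example.com/").read().decode("utf-8", "ignore")
reader = MetaTagReader()
reader.feed(html)
print(reader.found)
```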

Ask the Pay-Per-Clicks

Pay-per-click sites actually let you see how popular a keyword is. They’re not being kind; they’re trying to make money. The more webmasters bid on those keywords, the higher the bids are going to rise, and the more money the pay-per-clicks are going to make. FindWhat, for example, has a Keyword Center, and other pay-per-click sites offer similar features. One of the most popular keyword discovery tools, however, was provided by former PPC giant Overture.

You can play around with this free keyword selector tool at: www.inventory.overture.com/d/searchinventory/suggestion/

Use a Specialized Tool

Not too surprisingly, a number of companies have popped up to supply specific keyword services for a fee. The best of these is WordTracker.com. They’re not bargain basement, but you get what you pay for. They’ll give you all the keywords you need and, in my experience, they’re a sound investment. GoogleFight.com is another useful tool to see whether one keyword is more popular than another. The site compares two keywords and tells you which is more popular. It’s free and has a limited use, but it’s fun to play with.

As you make up your list of keywords, bear in mind that it’s also worth looking at key phrases. It’s quite possible that a user looking to buy flowers online might search for “red roses” or “cheap bouquets” as well as just “flowers”. Key phrases are often overlooked by competitors, so you’ve got a pretty good chance of getting a high placement with the right combination. Don’t worry too much about the competition, though. Some people will tell you that you’re better off trying to find keywords that no one else has thought of, and others will tell you to throw in keywords that are only slightly relevant to your business. In my experience, that’s a waste of time. If your competitors are using certain keywords, it’s because they know they work. And if any of your visitors found your site using irrelevant keywords, you’re not going to sell them anything. Don’t try to reinvent the wheel here: just try to figure out the most popular keywords and the best key phrases to put on your site.

Whichever of these methods you use — and I tend to use more than one — you should end up with a pretty comprehensive list of keywords that you can stick into your website. The next question, then, is how do you use them? When a search engine assigns relevancy to a site, it looks for the keywords in a number of specific areas.

Title Tag

The title tag is written in the head section of the web page.


The title tag is usually between 50 and 80 characters, including spaces. Different search engines have different limits, so you want to make sure that your most important words are near the beginning of the title. The rest of the title is made up of keywords and phrases, but you don’t want to put in too many keywords here. Just place one keyword as the second or third word in the title; too many, and your site could be seen as spamming. You can also list more keywords in the meta keywords and description tags of the head area, but because these areas have been so abused in the past, a number of search engines today will skip right past them and go straight to the web copy.
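Here is a small, hypothetical helper that assembles a title and the meta tags just described, with a length check based on the 50-80 character guideline above. The exact limits vary by engine, so treat it as a sanity check rather than a rule.

```python
def build_head_tags(title, keywords, description, max_title_len=80):
    # Assemble the title and meta tags described above. The 50-80
    # character title guideline comes from the text; exact limits vary
    # by search engine, so treat this as a sanity check, not a rule.
    if len(title) > max_title_len:
        raise ValueError("Title is %d characters; trim it." % len(title))
    return "\n".join([
        "<title>%s</title>" % title,
        '<meta name="keywords" content="%s">' % ", ".join(keywords),
        '<meta name="description" content="%s">' % description,
    ])

print(build_head_tags(
    "Stuffed Toys - Soft, Safe Stuffed Animals Shipped Free",  # keyword near the front
    ["stuffed toys", "stuffed animals", "plush toys"],
    "Grab a great deal on stuffed toys today - buy now, while stocks last!",
))
```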

Top Search Engines

I’ve studied how search engines work. An integral part of any Internet marketing or search engine optimization campaign is to know exactly which search engines to target. This section discusses some of the top search engines today.

Google

Google has increased in popularity tenfold over the past several years. They went from beta testing to becoming the Internet's largest index of web pages in a very short time. Their spider, affectionately named "Googlebot", crawls the web and provides updates to Google's index about once a month.

Google.com began as an academic search engine. Google, by far, has a very good algorithm of ranking pages returned from a result, probably one of the main reasons it has become so popular over the years. Google has several methods which determine page rank in returned searches.

Yahoo!

Yahoo! is one of the oldest web directories and portals on the Internet today, and the site went live in August of 1994. Yahoo! is also one of the largest traffic generators around, as far as web directories and search engines go. Unfortunately, however, it is also one of the most difficult to get listed in, unless of course you pay to submit your site. Even if you pay it doesn't guarantee you will get listed.

Either way, if you suggest a URL, it is "reviewed" by a Yahoo! editor and, if approved, it will appear in the next index update.

AltaVista

Many who have access to web logs may have seen a spider named “scooter” accessing their pages. Scooter used to be AltaVista's robot. However, since the Feb 2001 site update, a newer form of Scooter is now crawling the web.

It will usually take several months for AltaVista to index your entire site. Unlike Google, AltaVista will only crawl and index 1 link deep, so it can take a long time to index your entire site, depending on how large your site is.

Inktomi

Inktomi's popularity grew years ago as it powered the secondary search database that drove Yahoo!. Its spiders are named "Slurp", and different versions of Slurp crawl the web many times throughout the month, as Inktomi powers many sites’ search results. There isn't much more to Inktomi than that. Slurp puts heavy weight on title and description tags, and will rarely deep-crawl a site; it usually only spiders pages that are submitted to its index. Inktomi provides results to a number of sites, including America Online, MSN, Hotbot, Looksmart, About, Goto, CNet, Geocities, NBCi, ICQ and many more.

Lycos
Lycos is one of the oldest search engines on the Internet today, next to AltaVista and Yahoo!. Their spider, named "T-Rex", crawls the web and provides updates to the Lycos index from time to time. The FAST crawler provides results for Lycos in addition to its own database.

The Lycos crawler does not weigh META tags too heavily. Instead, it relies on its own ranking algorithm to rank pages returned in results. The URL, META title, text headings, and word frequency are just a few of the methods Lycos uses to rank pages. Lycos does support pages with Frame content. However, any page that isn't at least 75 words in content is not indexed.

Excite

Excite has been around the web for many years now. Much more of a portal than simply a search engine, Excite used to be a fairly popular search engine until companies such as Google started dominating the search engine market. Recently, Excite stopped accepting URL submissions and appears to no longer spider the web. To get into the Excite search results, you need to be listed with either Yahoo! or Inktomi.



Looksmart

Getting listed with Looksmart could mean getting a good amount of traffic to your site. Looksmart's results appear in many search engines, including AltaVista, MSN, CNN, and many others. Looksmart has two options for submitting your site: if your site is generally non-business related, you can submit it to Zeal (Looksmart's sister site), or if you are a business, you can pay a fee to have your site listed. Either method will get you listed in Looksmart and its partner sites if you are approved. Once your site is submitted and approved, it can take up to about 7 days for it to be listed on Looksmart and its partner sites.

Search Engine Page-Ranking Algorithms

A search engine's main job is to provide results that best satisfy a user's query. If it presents a result that the user visits and doesn't agree is about their query, there is a very good chance the user won't use that search engine again. Most search engines pay no attention at all to the meta description tags.

Meta description and keyword tags are hidden attributes that you can add to the head of your document and that are supposed to annotate and describe it. Since users never see this information, it is tempting to stick in irrelevant keywords or to let the description drift out of line with the document's contents, which is usually what happens, and that is exactly why most engines give these tags so little weight.

How Search Engines Work

Internet search engines are special sites on the Internet that are designed to help people find information stored on other sites. There are differences in the ways various search engines work, but they all perform three basic tasks:

.. They search the Internet -- or select pieces of the Internet -- based on important words.

.. They keep an index of the words they find, and where they find them.

.. They allow users to look for words or combinations of words found in that index.

Early search engines held an index of a few hundred thousand pages and documents, and received maybe one or two thousand inquiries each day. Today, a top search engine will index hundreds of millions of pages, and respond to tens of millions of queries per day.

Spidering

Before a search engine can tell you where a file or document is, it must be found. To find information on the hundreds of millions of web pages that exist, a search engine employs special software robots, called spiders, to build lists of the words found on websites. When a spider is building its lists, the process is called crawling. In order to build and maintain a useful list of words, a search engine's spiders have to look at a lot of pages.

How does any spider start its travels over the web? The usual starting points are lists of heavily used servers and very popular pages. The spider will begin with a popular site, indexing the words on its pages and following every link found within the site. In this way, the spidering system quickly begins to travel, spreading out across the most widely used portions of the web.
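The crawling loop itself is simple enough to sketch. The toy spider below, written against Python's standard library, just fetches a page, queues the links it finds, and repeats; the starting URL is a placeholder, and a real crawler would add robots.txt handling, politeness delays and far more robust parsing.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    # Pulls the href out of every <a> tag on a page.
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag.lower() == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def crawl(start_url, max_pages=10):
    # Toy breadth-first spider: fetch a page, queue the links it finds,
    # and repeat until the page budget runs out. A real crawler adds
    # robots.txt handling, politeness delays, duplicate detection and
    # enormous scale; this only shows the basic loop.
    seen, queue, fetched = set(), deque([start_url]), []
    while queue and len(fetched) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url).read().decode("utf-8", "ignore")
        except Exception:
            continue  # skip pages that fail to download
        fetched.append(url)
        collector = LinkCollector()
        collector.feed(html)
        queue.extend(urljoin(url, link) for link in collector.links)
    return fetched

print(crawl("http://www.example.com/"))  # placeholder starting point
```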

Indexing

Once the spiders have completed the task of finding information on web pages, the search engine must store the information in a way that makes it useful. There are two key components involved in making the gathered data accessible to users:

.. The information stored with the data
.. The method by which the information is indexed


In the simplest case, a search engine could just store the word and the URL where it was found. In reality, this would make for an engine of limited use, since there would be no way of telling whether the word was used in an important or a trivial way on the page, whether the word was used once or many times, or whether the page contained links to other pages containing the word. In other words, there would be no way of building the ranking list that tries to present the most useful pages at the top of the list of search results.

To make for more useful results, most search engines store more than just the word and URL. An engine might store the number of times that the word appears on a page. The engine might assign a weight to each entry, with increasing values assigned to words as they appear near the top of the document, in sub-headings, in links, in the meta tags or in the title of the page. Each commercial search engine has a different formula for assigning weight to the words in its index. This is one of the reasons that a search for the same word on different search engines will produce different lists, with the pages presented in different orders.

An index has a single purpose: it allows information to be found as quickly as possible. There are quite a few ways for an index to be built, but one of the most effective ways is to build a hash table. In hashing, a formula is applied to attach a numerical value to each word. The formula is designed to evenly distribute the entries across a predetermined number of divisions. This numerical distribution is different from the distribution of words across the alphabet, and that is the key to a hash table's effectiveness.

The Search Engine Program

The search engine software or program is the final part. When a person requests a search on a keyword or phrase, the search engine software searches the index for relevant information. The software then provides a report back to the searcher with the most relevant web pages listed first.
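To tie the indexing, weighting and search steps together, here is a tiny inverted index in Python. Python's dict is itself a hash table, so lookups behave much like the hashed index described above. The weighting scheme (occurrence counts, doubled for title words) and the sample pages are purely illustrative; every real engine uses its own formula.

```python
from collections import defaultdict

def build_index(pages):
    # Tiny inverted index of the kind described above: each word maps to
    # the pages it appears on, with a crude weight (occurrences, doubled
    # when the word appears in the title). The weighting scheme is purely
    # illustrative -- every real engine uses its own formula.
    index = defaultdict(dict)
    for url, (title, body) in pages.items():
        for word in body.lower().split():
            index[word][url] = index[word].get(url, 0) + 1
        for word in title.lower().split():
            index[word][url] = index[word].get(url, 0) + 2
    return index

def search(index, word):
    # Return matching pages, best-weighted first -- the "report back to
    # the searcher" step in miniature.
    matches = index.get(word.lower(), {})
    return sorted(matches, key=matches.get, reverse=True)

# Hypothetical crawled pages: url -> (title, body text).
pages = {
    "site-a.com": ("Stuffed Toys", "stuffed toys and plush animals for sale"),
    "site-b.com": ("Garden Tools", "toys are mentioned here only once"),
}
index = build_index(pages)
print(search(index, "toys"))  # site-a.com ranks first
```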

Traffic through Search Engines

It doesn’t matter how great your website is, if no one sees it, you’re not going to make a penny. You can spend days producing the perfect design, weeks tweaking the copy, and months writing the code and uploading the pages, but if no one knows where you are, how are they going to know they should buy from you?

When I first started selling online, the first major problem I ran into was bringing customers to my door. I put banner ads on other sites, organized reciprocal links and joined web rings. Those methods all worked to some extent, but what really did it for me, what turned my business from a small earner into a major money-grabber, was figuring out how to use search engines.

Sure, I’d submitted my sites to the major search engines as soon as I’d finished building them, but I didn’t really pay them much attention. After all, I figured search engines are just for people who are looking for information; they’re not really good for commercial sites.



One day, I sat down and checked out which sites were popping up first in the categories that suited my businesses. I found that all the top-ranked sites were my biggest competitors. And when I say biggest, I mean these guys were in a whole other league. They had incomes that were ten or twenty times the size of mine. No wonder they had top billing at Yahoo! and Google! And then it clicked. Search engines don’t list sites by size, they list them by relevance. These sites weren’t listed first because they were big; they were big because they were listed first!

That was when I began to optimize my pages and think about meta-tags and keywords. As my sites rose through the listings, my traffic went through the roof. And not just any old traffic! The people that came to my sites from search engines hadn’t just clicked on a banner by accident or followed a link from curiosity, they’d actually been looking for a site like mine. My sales ratio went up like a rocket. I’d created my own big break.