1. Use search-engine-friendly URLs with hyphens instead of underscores; avoid dynamic or session URLs, which often confuse crawlers.
2. Rewrite dynamic URLs into friendly ones via .htaccess.
3. Use robots.txt to control which pages crawlers analyze and index. For example, restrict robots from "print" HTML pages to avoid duplicate content issues.
4. Put each domain on a separate IP address. Sharing IP addresses with other sites that could potentially be spamming or using black hat techniques isn't smart, and at $3-5 a month for a unique IP it's worth it.
5. If you're running Linux on your web hosting server, read about creating an .htaccess file to avoid canonicalization issues.
6. If you're running Windows on your web hosting server, learn how to accomplish the equivalent of .htaccess (for example, with an IIS rewrite module).
7. Invest in a quality web hosting service. Most sites aren't set up to handle the traffic from hitting the top of Digg, but you should be able to handle a steadily increasing amount of traffic from your SEO efforts. If your site goes down while the search engines try to index it, you may be penalized, and the results can be dramatic.
8. Validate your HTML to W3C standards. Although there is no evidence that Google requires this, it provides a clean experience for visitors.
9. Keep URLs fewer than four directory levels deep to avoid crawling and other issues. If you have a blog, just the post name is optimal; the date and other information are unnecessary.
10. When launching a new site, consider purchasing a domain name that has been registered for quite some time. This avoids the perils of the "Google Sandbox" and gives you the advantage of already-established back links and other SEO benefits.
11. If no domain names are available that suit your company, consider purchasing a domain name that contains your specific keywords. Limit the hyphens to a maximum of two.
12. When creating pages, ensure that no page is more than two clicks from the home page. This helps ensure that your pages will be indexed.
13. Check for dead/broken links in Google Webmaster Tools, then use a 301 redirect to point each one to the correct location or to the home page.
14. Register your domain name for the maximum amount of time. This is a cheap step, and it signals to the search engines that you plan to be around for a long time rather than running a one-year plan and disappearing.
15. Ensure that outgoing links from your site go to quality, on-topic sites. Linking to poker/porn/Viagra sites from a health site, for example, will hurt rather than help you.
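Tips 2, 5, and 13 above all come down to Apache rewrite rules. Here is a minimal .htaccess sketch, assuming Apache with mod_rewrite enabled (example.com, product.php, and the slug pattern are placeholders, not from the original post):

```apache
# 301-redirect the bare domain to the www host to avoid canonicalization issues
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Map a friendly, hyphenated URL onto a dynamic script, so crawlers see
# /products/blue-widget instead of /product.php?slug=blue-widget
RewriteRule ^products/([a-z0-9-]+)/?$ product.php?slug=$1 [L,QSA]
```

The same file can also hold the 301 redirects for any broken links you find in Webmaster Tools.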
2. Place your keywords in your body text. Don't worry about keyword density as much as making the copy read well for users while still targeting your specific keywords for the search engines.
3. Use keywords in the H1, H2, and H3 tags that are relevant to your site.
4. Place your targeted keywords in the page URL, separated by a dash between keywords.
5. Target keywords above and beyond just the ones with the most traffic. Misspellings, alternative words, and lower-traffic keywords can still be lucrative by driving targeted traffic to your site.
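The keyword-in-URL advice above is easy to automate. A minimal sketch in Python (the `slugify` helper and the sample title are illustrative, not from the post):

```python
import re

def slugify(title):
    """Turn a page title into a hyphen-separated, keyword-rich URL slug."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse non-alphanumeric runs to hyphens
    return slug.strip("-")

print(slugify("Best Organic Dog Food Reviews"))  # -> best-organic-dog-food-reviews
```

Run every new page title through something like this and you get hyphen-separated keyword URLs for free.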
On Page Optimization
1. Add a unique title to each page that targets your specific keywords. Typically this takes the form of your targeted keywords followed by your company name. If your brand is strong enough, you can switch this around and put your company name first.
2. Add a unique meta description to each page that again targets your specific keywords. This is limited to 160 characters in Google, 165 in Yahoo, and slightly over 200 for MSN. Make the description unique for each page so that you don't fall into any duplicate content issues.
3. Add a unique meta keywords tag to each page that lists only keywords that are relevant and appear on the page. Although the meta keywords tag is not used by Google, it still carries some weight with Yahoo.
4. Make sure that all images have alt tags associated with them to ensure that your site is accessible for everyone.
5. Develop content that is unique for every page. Similar content on multiple pages can trip duplicate content filters, and your pages could go supplemental.
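Putting these on-page tips together, a page's markup might look like this sketch (the keywords, company name, and file names are hypothetical):

```html
<head>
  <!-- Unique, keyword-first title: targeted keywords, then company name -->
  <title>Organic Dog Food Reviews - Acme Pets</title>
  <!-- Unique meta description, kept under Google's ~160-character limit -->
  <meta name="description" content="Independent reviews of organic dog food brands, with ratings and feeding guides.">
  <!-- Meta keywords: only terms that actually appear on this page -->
  <meta name="keywords" content="organic dog food, dog food reviews">
</head>
<body>
  <!-- Relevant keywords in the heading tags -->
  <h1>Organic Dog Food Reviews</h1>
  <!-- Alt text keeps images accessible to everyone, crawlers included -->
  <img src="kibble.jpg" alt="Bowl of organic dog kibble">
</body>
```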
1. Download WebCEO and submit your site to the various search engines and directories through the automatic process.
2. Download DigiXMAS and semi-auto submit your site to several hundred directories.
3. Don't worry about incoming directory links, and whether you are submitting too quickly. You will be denied from some, others will take months to approve you, and the resulting links will appear natural.
4. Submit your site to DMOZ. Although DMOZ has had problems recently, it is still a highly trusted source and the premier free directory. It may take months or even years to get added, but it is still worth the 10 minutes it takes to submit.
5. Pay to submit your site to the Yahoo directory. Along with DMOZ, the Yahoo directory is one of the most trusted directories out there. The $299 fee is nominal and instills trust with Yahoo and the other search engines.
6. Create "link bait" content for submission to the social websites such as Digg, Reddit, etc. Getting an article to the top of a social website can result in thousands of additional back links almost overnight.
7. Manual link building is a highly effective way to increase your rankings in the SERPs. Use WebCEO to find link partners who are related to your respective field, and would be willing to link to you.
8. Often, the best links are the links your competitors have. Go to Yahoo, type in "link:www.competitor.com", and look for links they have that you should pursue.
9. .edu and .gov links are typically more respected and trusted than .com links, as those sites have built up thousands of back links, produce quality content, and are unlikely to engage in questionable techniques. If you can get these links, get them.
10. If you want to buy links, do so in a manner that is not obvious. Purchase links from relevant sites, and don't publish the details in forums or other public locations where they could be seen.
11. When linking out or posting links, surround the link with text so that the page does not look like a generic links page and instead flows naturally with relevant information.
12. Post your link in forums that you frequent with the anchor text that you are targeting.
13. Comment on blogs, again with the same anchor text and link that you're targeting.
14. Although Wikipedia applies the "nofollow" attribute to external links, adding your relevant links to Wikipedia can generate a significant amount of targeted traffic to your website.
15. Create a Squidoo lens to generate back links as well as hopefully generate traffic to your website.
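The anchor-text tips above hinge on how the link reads in context. A quick sketch of the difference (URLs and keywords are placeholders):

```html
<!-- Weak: generic anchor text on a bare link -->
<p>For more information, <a href="http://www.example.com/">click here</a>.</p>

<!-- Stronger: keyword-rich anchor text embedded in flowing, relevant copy -->
<p>Our guide to <a href="http://www.example.com/organic-dog-food">organic dog
food</a> covers the brands veterinarians recommend most often.</p>
```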
2. Submit your site to Google Webmaster Tools, and then verify it.
3. Inside Google Webmaster Tools choose Preferred Domain and pick www.yoursite.com or yoursite.com to avoid canonicalization issues.
4. Submit your site to Yahoo! Site Explorer, and then verify it.
5. Download Firefox and then head over to SEOBook to download Aaron Wall's SEO Tools for Firefox. This neat utility allows you to check back links, Edu links, Wikipedia entries, cache date, and other SEO info that is crucial for analyzing data.
6. Use page analysis to look at the anchor text of incoming links to your site: see what is working, how spread out your anchors are, and whether you can consolidate around a few anchor texts.
7. Develop articles that visitors may want to link to. Interviews with industry leaders and others are a quick way to develop links.
8. Submit your article to article submission services only after it has been indexed by the search engines, to avoid any duplicate content filters.
9. Ignore PageRank. Although it may seem important to those who aren't as familiar with the SEO process, it does not determine your rankings, and the visible toolbar value is only updated every 3-4 months. Actual PageRank is updated by Google on a daily basis.
10. Ignore Alexa data. Alexa rankings are calculated from users who install its toolbar, a very narrow segment of the population. Companies that browse their own sites with the toolbar installed thus rank unnaturally high in Alexa.
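The anchor-text analysis in tip 6 can be sketched in a few lines of Python (the data format and sample links are hypothetical, standing in for a backlink-report export):

```python
from collections import Counter

def anchor_distribution(backlinks):
    """Summarize how incoming anchor text is spread across a link profile.

    `backlinks` is a list of (anchor_text, source_url) pairs -- a
    hypothetical export from a backlink report.
    """
    counts = Counter(anchor.lower() for anchor, _ in backlinks)
    total = sum(counts.values())
    return {anchor: round(n / total, 2) for anchor, n in counts.most_common()}

links = [
    ("organic dog food", "http://blog-a.example"),
    ("organic dog food", "http://blog-b.example"),
    ("click here", "http://forum.example"),
    ("dog food reviews", "http://news.example"),
]
print(anchor_distribution(links))
```

A report like this makes it obvious which anchors dominate and where a stray "click here" link is diluting your targeted phrases.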