Monthly Archives: July 2008

List of TLDs

I found this useful and thought I would share!

List of common generic domain name extensions


List of restricted TLD / Domains

.BIZ    Restricted for Business
.NAME    Reserved for individuals
.PRO    Restricted to credentialed professionals and related entities
List of country codes / country domain name extensions

.AC    Ascension Island
.AD    Andorra
.AE    United Arab Emirates
.AF    Afghanistan
.AG    Antigua and Barbuda
.AI    Anguilla
.AL    Albania
.AM    Armenia
.AN    Netherlands Antilles
.AO    Angola
.AQ    Antarctica
.AR    Argentina
.AS    American Samoa
.AT    Austria
.AU    Australia
.AW    Aruba
.AX    Aland Islands
.AZ    Azerbaijan
.BA    Bosnia and Herzegovina
.BB    Barbados
.BD    Bangladesh
.BE    Belgium
.BF    Burkina Faso
.BG    Bulgaria
.BH    Bahrain
.BI    Burundi
.BJ    Benin
.BL    Saint Barthelemy
.BM    Bermuda
.BN    Brunei Darussalam
.BO    Bolivia
.BR    Brazil
.BS    Bahamas
.BT    Bhutan
.BV    Bouvet Island
.BW    Botswana
.BY    Belarus
.BZ    Belize
.CA    Canada
.CC    Cocos (Keeling) Islands
.CD    Congo, The Democratic Republic of the
.CF    Central African Republic
.CG    Congo
.CH    Switzerland
.CI    Cote d’Ivoire
.CK    Cook Islands
.CL    Chile
.CM    Cameroon
.CN    China
.CO    Colombia
.CR    Costa Rica
.CU    Cuba
.CV    Cape Verde
.CX    Christmas Island
.CY    Cyprus
.CZ    Czech Republic
.DE    Germany
.DJ    Djibouti
.DK    Denmark
.DM    Dominica
.DO    Dominican Republic
.DZ    Algeria
.EC    Ecuador
.EE    Estonia
.EG    Egypt
.EH    Western Sahara
.ER    Eritrea
.ES    Spain
.ET    Ethiopia
.EU    European Union
.FI    Finland
.FJ    Fiji
.FK    Falkland Islands (Malvinas)
.FM    Micronesia, Federated States of
.FO    Faroe Islands
.FR    France
.GA    Gabon
.GB    United Kingdom
.GD    Grenada
.GE    Georgia
.GF    French Guiana
.GG    Guernsey
.GH    Ghana
.GI    Gibraltar
.GL    Greenland
.GM    Gambia
.GN    Guinea
.GP    Guadeloupe
.GQ    Equatorial Guinea
.GR    Greece
.GS    South Georgia and the South Sandwich Islands
.GT    Guatemala
.GU    Guam
.GW    Guinea-Bissau
.GY    Guyana
.HK    Hong Kong
.HM    Heard Island and McDonald Islands
.HN    Honduras
.HR    Croatia
.HT    Haiti
.HU    Hungary
.ID    Indonesia
.IE    Ireland
.IL    Israel
.IM    Isle of Man
.IN    India
.IO    British Indian Ocean Territory
.IQ    Iraq
.IR    Iran, Islamic Republic of
.IS    Iceland
.IT    Italy
.JE    Jersey
.JM    Jamaica
.JO    Jordan
.JP    Japan
.KE    Kenya
.KG    Kyrgyzstan
.KH    Cambodia
.KI    Kiribati
.KM    Comoros
.KN    Saint Kitts and Nevis
.KP    Korea, Democratic People’s Republic of
.KR    Korea, Republic of
.KW    Kuwait
.KY    Cayman Islands
.KZ    Kazakhstan
.LA    Lao People’s Democratic Republic
.LB    Lebanon
.LC    Saint Lucia
.LI    Liechtenstein
.LK    Sri Lanka
.LR    Liberia
.LS    Lesotho
.LT    Lithuania
.LU    Luxembourg
.LV    Latvia
.LY    Libyan Arab Jamahiriya
.MA    Morocco
.MC    Monaco
.MD    Moldova
.ME    Montenegro
.MF    Saint Martin
.MG    Madagascar
.MH    Marshall Islands
.MK    Macedonia, The Former Yugoslav Republic of
.ML    Mali
.MM    Myanmar
.MN    Mongolia
.MO    Macao
.MP    Northern Mariana Islands
.MQ    Martinique
.MR    Mauritania
.MS    Montserrat
.MT    Malta
.MU    Mauritius
.MV    Maldives
.MW    Malawi
.MX    Mexico
.MY    Malaysia
.MZ    Mozambique
.NA    Namibia
.NC    New Caledonia
.NE    Niger
.NF    Norfolk Island
.NG    Nigeria
.NI    Nicaragua
.NL    Netherlands
.NO    Norway
.NP    Nepal
.NR    Nauru
.NU    Niue
.NZ    New Zealand
.OM    Oman
.PA    Panama
.PE    Peru
.PF    French Polynesia
.PG    Papua New Guinea
.PH    Philippines
.PK    Pakistan
.PL    Poland
.PM    Saint Pierre and Miquelon
.PN    Pitcairn
.PR    Puerto Rico
.PS    Palestinian Territory, Occupied
.PT    Portugal
.PW    Palau
.PY    Paraguay
.QA    Qatar
.RE    Reunion
.RO    Romania
.RS    Serbia
.RU    Russian Federation
.RW    Rwanda
.SA    Saudi Arabia
.SB    Solomon Islands
.SC    Seychelles
.SD    Sudan
.SE    Sweden
.SG    Singapore
.SH    Saint Helena
.SI    Slovenia
.SJ    Svalbard and Jan Mayen
.SK    Slovakia
.SL    Sierra Leone
.SM    San Marino
.SN    Senegal
.SO    Somalia
.SR    Suriname
.ST    Sao Tome and Principe
.SU    Soviet Union (being phased out)
.SV    El Salvador
.SY    Syrian Arab Republic
.SZ    Swaziland
.TC    Turks and Caicos Islands
.TD    Chad
.TF    French Southern Territories
.TG    Togo
.TH    Thailand
.TJ    Tajikistan
.TK    Tokelau
.TL    Timor-Leste
.TM    Turkmenistan
.TN    Tunisia
.TO    Tonga
.TP    Portuguese Timor (being phased out)
.TR    Turkey
.TT    Trinidad and Tobago
.TV    Tuvalu
.TW    Taiwan
.TZ    Tanzania, United Republic of
.UA    Ukraine
.UG    Uganda
.UK    United Kingdom
.UM    United States Minor Outlying Islands
.US    United States
.UY    Uruguay
.UZ    Uzbekistan
.VA    Holy See (Vatican City State)
.VC    Saint Vincent and the Grenadines
.VE    Venezuela
.VG    Virgin Islands, British
.VI    Virgin Islands, U.S.
.VN    Viet Nam
.VU    Vanuatu
.WF    Wallis and Futuna
.WS    Samoa
.YE    Yemen
.YT    Mayotte
.YU    Yugoslavia (being phased out)
.ZA    South Africa
.ZM    Zambia
.ZW    Zimbabwe

A new search engine called Cuil

There has been a lot of talk today about a new search engine launched today called Cuil, pronounced “COOL”!

We will see. I'm not too convinced at this stage about the quality of the results. I do like the search refinement widget on the top right-hand side. It occasionally gives some good alternative searches, but in these teething days it also suggests searches with no results.

Currently it sends no traffic to any of my sites – but we will see.

It would be nice to add some competition to the UK searchscape.

Common causes of duplicate content

I was trying to list out all the causes of duplicate content – well, accidental duplication at least.

Here is the list to date

  • Inconsistent URLs and links, especially search results or inventory via different attribute routes
  • Similar products or bundles of products with similar description, this can be on your sites of resellers
  • Print friendly pages inc. white papers, pdf downloads.
  • Hostname and protocol inconsistencies, e.g. http:// vs http://www, or http vs https!
  • Content management systems that put session IDs in URLs

These all create duplicates – effectively you are ‘pissing in your own pool’ – and you risk duplicate content penalties, even though you probably don’t realise you have caused them. The good news is that all of these can be fixed fairly easily.

As I resolve more here I will add to this list – send me more if you have them!
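To make the hostname and session-ID fixes concrete, here is a rough Python sketch of URL canonicalisation. The hostnames and parameter names are just examples, not an exhaustive list – swap in your own:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters that only track sessions and so create duplicate URLs.
# These names are common examples, not a complete list.
SESSION_PARAMS = {"phpsessid", "jsessionid", "sid", "sessionid"}

def canonicalise(url, preferred_host="www.example.com", preferred_scheme="http"):
    """Map the many accidental variants of a URL onto one canonical form."""
    scheme, host, path, query, _fragment = urlsplit(url)
    # Normalise the hostname: example.com and www.example.com are duplicates.
    host = host.lower()
    if host == preferred_host.replace("www.", "", 1):
        host = preferred_host
    # Drop session IDs from the query string, keep the real parameters.
    params = [(k, v) for k, v in parse_qsl(query) if k.lower() not in SESSION_PARAMS]
    return urlunsplit((preferred_scheme, host, path or "/", urlencode(params), ""))
```

Run every internal link through something like this (and 301-redirect the non-canonical forms) and most of the list above goes away.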

Error page handling

We spend a lot of time ‘optimising’ our sites, adding content, building links, keeping the site fresh and ensuring that it all links up nicely.

But, in many corporates, server hygiene and best practice is commonly overlooked. This is normally by omission rather than any kind of malice.

I believe that using and handling errors correctly can be beneficial for SEO. You should set up a Server Response Code monitoring process.

404 errors can be good!

Handling your 404 page properly is an essential part of on-site optimisation that helps both your site visitors and search engines.

Legitimate reasons

  • The page no longer exists
    • Inventory has gone
    • Page is no longer relevant
  • There is an inbound link that is wrong!

Bad reasons

  • You moved something and didn’t redirect
  • Your rewriter tool has broken/failed.
  • You changed the URL rules / business logic

How to handle them!

Your server should return that page with a proper 404 header. This way any bots will know the status of that page and clean up their index.

What should be on that page? Well anything you want really.

Normal convention would suggest a polite message that indicates to the real user that the page no longer exists, but ensure it doesn’t blame them!

Maybe a signpost page is a good idea. Use your main search widget, or some static links. Or, if appropriate, show that message alongside the contents of a normal page, commonly the homepage.
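To show what ‘proper 404 header plus signpost page’ looks like in practice, here is a minimal Python WSGI sketch – the paths and messages are placeholders. The trap to avoid is the ‘soft 404’: a friendly error page served with a 200 status, which tells bots the page is fine.

```python
# KNOWN_PATHS stands in for your real routing logic.
KNOWN_PATHS = {"/", "/about"}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path in KNOWN_PATHS:
        start_response("200 OK", [("Content-Type", "text/html")])
        return [b"<h1>Welcome</h1>"]
    # The polite message AND the correct status code, so bots can
    # clean up their index while humans get a signpost page.
    start_response("404 Not Found", [("Content-Type", "text/html")])
    return [b"<h1>Sorry, that page no longer exists.</h1>"
            b'<p>Try the <a href="/">homepage</a> or the site search.</p>']
```

Any web framework can do the same thing; the point is that the status line and the friendly copy travel together.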

Errors happen – how you deal with them is the mark of a good digital marketer.

Other codes to monitor, for good and bad reasons

Watch the use of 301s, 302s and 500s too. By monitoring all of these you can tell the health of your site, and it helps you tidy up.
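A monitoring process can be as simple as bucketing crawled status codes. A rough Python sketch – the buckets and the notes on each are my reading of the advice above, adjust to taste:

```python
from collections import defaultdict

def health_report(crawl_results):
    """Group crawled (url, status) pairs into buckets worth watching.

    crawl_results: iterable of (url, http_status_code) pairs, e.g. from
    your log files or a link-checking crawl.
    """
    buckets = defaultdict(list)
    for url, status in crawl_results:
        if 200 <= status < 300:
            buckets["ok"].append(url)
        elif status == 301:
            buckets["moved permanently"].append(url)   # fine, but update old links
        elif status in (302, 307):
            buckets["temporary redirect"].append(url)  # often should be a 301
        elif status == 404:
            buckets["not found"].append(url)           # fix, redirect, or signpost
        elif status >= 500:
            buckets["server error"].append(url)        # investigate urgently
        else:
            buckets["other"].append(url)
    return dict(buckets)
```

Run it weekly and watch the ‘server error’ and ‘not found’ buckets trend towards empty.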

Generally a healthy site is a happy site!

Useful links

Best practice guides for Meta Data & H1’s

Best Practices for Title Tag

  • Limit the length of the titles between 70 – 90 characters including spaces
  • Focus on keywords that most closely relate to the purpose of the page
  • Repeat the primary keyword at least twice
  • Use a secondary keyword within the title
  • Create custom titles for top categories and locations on the site
  • Be careful with “templated” titles – they tend to be discounted as they are easy to detect as database-generated. Ensure they are differentiated, even if this is just adding ‘Page 2’ or similar!
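If you want to sanity-check titles in bulk, something like this Python sketch applies the rules of thumb above. The limits are this post's suggestions, not gospel:

```python
def check_title(title, primary_keyword, secondary_keyword=None):
    """Check a page title against the rules of thumb above.

    The 70-90 character band and the repetition counts mirror the list
    above; treat them as guidelines, not hard limits.
    """
    problems = []
    if not 70 <= len(title) <= 90:
        problems.append(f"length {len(title)} outside 70-90 characters")
    if title.lower().count(primary_keyword.lower()) < 2:
        problems.append("primary keyword appears fewer than twice")
    if secondary_keyword and secondary_keyword.lower() not in title.lower():
        problems.append("secondary keyword missing")
    return problems
```

Point it at an export of all your titles and the templated ones show up quickly.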

Best Practices for Meta Description Tag

  • Meta descriptions should be crafted as marketing messages, since they can significantly impact click-through rates from search result pages; there is no point being on page one if you are not going to get the attention of the user.
  • The meta description should be no more than 155 characters including spaces
  • Avoid using templated meta descriptions – customise the descriptions for the top areas of the site. If you are on a big site, be creative with database variables and output tags.
  • Repeat the keywords used in title within the meta description
  • Use variations of the main terms in different orders and tenses

Best Practices for Meta Keywords Tag

  • Meta keywords tag should be limited to keywords most closely related to the concept of the page – Avoid using keywords that do not directly relate to the content on the page
  • Limit to about 12-15 keyword phrases (2 – 3 words each)
  • Alternatively, limit the length of the keywords field to 255 characters at the very maximum (including spaces) – If the recommendation of closely related keywords is followed, you should find this limit extremely difficult to hit
  • Separate with “, “ (comma followed by space and then next keyword)
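The keywords tag can be built and checked the same way – this Python sketch enforces the “, ” separator and the limits above (again, this post's rules of thumb):

```python
def build_meta_keywords(phrases, max_phrases=15, max_length=255):
    """Join keyword phrases with ", " as suggested above.

    Both limits come from the list above and are rules of thumb,
    not anything a search engine publishes.
    """
    if len(phrases) > max_phrases:
        raise ValueError(f"too many phrases: {len(phrases)} > {max_phrases}")
    content = ", ".join(phrases)
    if len(content) > max_length:
        raise ValueError(f"keywords field is {len(content)} chars, limit {max_length}")
    return content
```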

Best Practice for Page Elements

  • Use H1 tags
    – Repeat the Top Keyword related to the page at least twice and use 2nd best keyword at least once
  • Use H2 tags
    – The H2 tag can (and should) be slightly longer than the H1 tag; make it a true sub-headline
    – Focus on slightly different variations of the Top 2 keywords
    – Depending on length try working in a 3rd keyword
  • SEO Content
    – Keyword-rich content is essential to gaining high rankings. Make sure you use genuinely useful content, which just happens to have the exact SEO keywords in it!

Read it aloud; if you have to ask whether it sounds bad – it probably is.

Have fun.

If you have any other top tips – let me know and I will add them in.

p.s. This is always a work in progress!

My Inner Geek!

I was shown this video in a presentation the other day. It really brought out my inner geek. Maybe I am suited to my line of work!

Anyway – take a look at this video called “Web 2.0 … The Machine is Us/ing Us” by Michael Wesch.

Well, what do you think?

Link to YouTube >>

What do Search Engines want?

What do search engines want … (written in March 2007 and moved here in July 2008) – will update at some point!

… this can depend on who you ask – but here are some general notes.


  • Original and unique content of genuine value
  • Pages designed primarily for humans, with search engine considerations secondary
  • Hyperlinks intended to help people find interesting, related content, when applicable
  • Metadata (including title and description) that accurately describes the contents of a web page
  • Good web design in general


  • A site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.
  • Make sure that your TITLE and ALT tags are descriptive and accurate. Check for broken links and correct HTML.
  • A site map for your users with links that point to the important parts of your site.
  • A useful, information-rich site, with pages that clearly and accurately describe your content.
  • Use text instead of images to display important names, content, or links. The Google crawler doesn’t recognize text contained in images.


  • Make sure that each page is accessible by at least one static text link.
  • Keep the text that you want indexed outside of images. For example, if you want your company name or address to be indexed, make sure it is displayed on your page outside of a company logo.
  • Add a site map. This enables MSNBot to find all of your pages easily. Links embedded in menus, list boxes, and similar elements are not accessible to web crawlers unless they appear in your site map.
  • Keep your site hierarchy fairly flat. That is, each page should only be one to three clicks away from the home page.
  • Keep your URLs simple and static. Complicated or frequently changed URLs are difficult to use as link destinations. For example, a short static URL is easier for MSNBot to crawl and for people to type than a long URL with multiple parameters. Also, a URL that doesn’t change is easier for people to remember, which makes it a more likely link destination from other sites.


  • Sites should load quickly and be polished, easy to read and easy to navigate.
  • Sites should be well maintained and updated regularly.
  • Sites should offer thorough and accurate information that is highly relevant to a user’s search term(s).
  • Sites should offer additional links or information related to a user’s search term(s).
  • Sites should demonstrate credibility by providing author and source citations and contact information.

You can get all that directly from their help sections, or by deduction!

E-commerce sites
There are a few additional considerations for transactional sites or sites with secure areas.
These additional guidelines for e-commerce sites are listed below:

  • Sites should provide secure transactions (preferably by SSL/SET)
  • Sites should disclose policies for customer privacy, returns, exchanges and other customer concerns
  • Sites should offer many types of the product being sought, relevant brands and/or an appropriate range of products
  • Sites should provide adequate product information
  • Sites should offer customer service by phone, preferably 24 hours a day

How do we give search engines what they want:
Use the words users would type to find your pages, and make sure that your site actually includes those words within it. Understand your customer and use their language. Isn’t this marketing 101?

Dynamic pages (i.e., the URL contains a “?” character) – not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and few in number. Seems obvious, but CMS systems and cookies are the first culprits in ruining this!
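A quick way to audit this is to flag URLs with too many parameters, very long parameter values, or session tracking in the URL. A rough Python sketch – the thresholds and parameter names are illustrative, not anything a search engine publishes:

```python
from urllib.parse import urlsplit, parse_qsl

def dynamic_url_warnings(url, max_params=2, max_param_length=20):
    """Flag dynamic URLs likely to be hard for spiders to crawl.

    The thresholds are illustrative; the point is 'few and short' parameters.
    """
    params = parse_qsl(urlsplit(url).query)
    warnings = []
    if len(params) > max_params:
        warnings.append(f"{len(params)} parameters (try to keep to {max_params} or fewer)")
    for key, value in params:
        if len(value) > max_param_length:
            warnings.append(f"parameter '{key}' has a very long value")
        if key.lower() in ("sessionid", "sid", "phpsessid", "jsessionid"):
            warnings.append(f"session ID '{key}' in URL - spiders may see endless duplicates")
    return warnings
```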

Use a text browser such as Lynx to examine the site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling it.

Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behaviour, but the access pattern of bots is entirely different. Using them may result in incomplete indexing of your sites, as bots may not be able to eliminate URLs that look different but actually point to the same page. Accessibility is the single quickest way to fail in search engines!

Make sure your web server supports the If-Modified-Since HTTP header. This allows your web server to tell a crawler whether your content has changed since it last crawled your site. Supporting this feature saves you bandwidth and overhead.
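Here is a rough Python sketch of the server-side decision If-Modified-Since enables: return a bodyless 304 when nothing has changed, and a full 200 with a Last-Modified header otherwise. The function and its signature are just an illustration of the logic, not any server's actual API:

```python
from email.utils import formatdate, parsedate_to_datetime

def respond(last_modified_ts, if_modified_since=None):
    """Decide between a full 200 response and a bandwidth-saving 304.

    last_modified_ts: unix timestamp of the page's last change.
    if_modified_since: the request's If-Modified-Since header value, if any.
    """
    if if_modified_since is not None:
        since = parsedate_to_datetime(if_modified_since).timestamp()
        if last_modified_ts <= since:
            return 304, None  # nothing to resend; the crawler keeps its copy
    headers = {"Last-Modified": formatdate(last_modified_ts, usegmt=True)}
    return 200, headers
```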

The robots.txt file on the web server tells crawlers which directories can or cannot be crawled. Make sure it’s current so that you don’t accidentally block the Googlebot crawler. You can test your robots.txt file to make sure you’re using it correctly with the analysis tool available in Google Sitemaps / Webmaster Central (many others are available).
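You can also test robots.txt rules locally with Python's standard-library parser before deploying them. The rules below are examples, not a recommendation:

```python
from urllib.robotparser import RobotFileParser

# Example rules: block internal search results and session paths.
robots_txt = """\
User-agent: *
Disallow: /search
Disallow: /sessions/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Confirm you are not accidentally blocking the pages you care about.
print(parser.can_fetch("Googlebot", "http://example.com/products/widget"))  # True
print(parser.can_fetch("Googlebot", "http://example.com/search?q=widget"))  # False
```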

Make sure that your content management system can export your content in a way that search engine spiders can crawl.


  • Avoid hidden text or hidden links.
  • Don’t employ cloaking or sneaky redirects.
  • Don’t send automated queries to Google.
  • Don’t load pages with irrelevant words.
  • Don’t create multiple pages, subdomains, or domains with substantially duplicate content.
  • Don’t create pages that install viruses, trojans, or other badware.
  • Avoid “doorway” pages created just for search engines, or other “cookie cutter” approaches such as affiliate programmes with little or no original content.
  • If your site participates in an affiliate program, make sure that your site adds value. Provide unique and relevant content that gives users a reason to visit your site first.

How to make a site search engine friendly:

  • Give visitors the information they’re looking for
  • Provide high-quality content on pages, especially the homepage. This is the single most important thing to do. If your pages contain useful information, their content will attract many visitors and entice webmasters to link to your site naturally. In creating a helpful, information-rich site, write pages that clearly and accurately describe your offering. Utilise keyword research findings by using those keywords on the page.

Links help crawlers find your site and can give your site greater visibility in search results. When returning results for a search, Google combines PageRank (their view of a page’s importance) with sophisticated text-matching techniques to display pages that are both important and relevant to each search. Google counts the number of votes a page receives as part of its PageRank assessment, interpreting a link from page A to page B as a vote by page A for page B. Votes cast by pages that are themselves “important” weigh more heavily and help to make other pages “important.”

Keep in mind that Google’s algorithms can distinguish natural links from unnatural links. Natural links to your site develop as part of the dynamic nature of the web when other sites find your content valuable and think it would be helpful for their visitors. Unnatural links to your site are placed there specifically to make your site look more popular to search engines. Some of these types of links (such as link schemes and doorway pages) are covered in Google’s webmaster guidelines.

Only natural links are useful for the indexing and ranking of your site.

Make your site easily accessible
Build your site with a logical link structure. Every page should be reachable from at least one static text link.

Use a text browser, such as Lynx, to examine your site. Most spiders see your site much as Lynx would. If features such as JavaScript, cookies, session IDs, frames, DHTML, or Macromedia Flash keep you from seeing your entire site in a text browser, then spiders may have trouble crawling it.

Consider creating static copies of dynamic pages. Although the Google index includes dynamic pages, they comprise a smaller portion of the index. If you suspect that your dynamically generated pages (such as URLs containing question marks) are causing problems for crawlers, you might create static copies of these pages. If you do, don’t forget to add the dynamic versions to your robots.txt file to prevent them being treated as duplicates.

Things to Avoid

Don’t fill your page with lists of keywords, attempt to “cloak” pages, or put up “crawler only” pages. If your site contains pages, links, or text that you don’t intend visitors to see, Google considers those links and pages deceptive and may ignore your site.

Don’t use images to display important names, content, or links. Google’s crawler doesn’t recognize text contained in graphics. Use ALT tags.

Don’t create multiple copies of a page under different URLs.

And a new thing – think about how you present your search results. Ideally these should not be treated as many, many pages. If they add value then that is OK, but don’t think that multi-criteria search results creating hundreds of thousands of very similar pages is a great thing.

Think about what adds value and is easy for your users and then SEO should be taken care of.

Who is responsible for SEO in your company?

I say everyone!