Author Archives: admin

Google Analytics And WordPress

Google Analytics is my tool of choice for tracking visits to my websites. WordPress is my CMS of choice (at the moment) for powering my websites. Google Analytics and WordPress, therefore, are a good combination for me. This piece explains how to add your Google Analytics code to your WordPress-driven website.

Installing Google Analytics In WordPress

This article assumes that you know how to get hold of the Google Analytics code that you'll add to your WordPress installation. According to Google, you need to copy their JavaScript code block into every webpage you want to track, immediately before the </body> tag. As WordPress uses templates to construct your pages, it's an easy job to find the one you need to amend and add the code in the correct place. The code will then automatically appear on every page, and therefore track visitors to every page of your site.

In your WordPress control panel in the Appearance section, click Editor.

This takes you to the Edit Themes page. Over on the right, underneath Theme Files, you will see a list of all the templates used to create your website. Different WordPress themes use different templates, although some are common to all themes, so what I see in my list may not be the same as what you see in yours. We need to find the template that contains the </body> tag so that we can insert our Google Analytics code immediately before it. As the </body> tag appears towards the end of a web page, there is a good chance that it will be found in the template called footer.php. In fact, footer.php contains both the </body> tag and the </html> tag, so this is the file we need to edit.

Clicking on footer.php loads its code into the code editor for us to amend. Place your cursor just before the </body> tag and press Enter a few times to create some space in which to work. The blank lines aren't necessary to make the JavaScript work, but they do make the code a little easier to read. Copy and paste your Google Analytics code into the space preceding the </body> tag.
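For reference, the standard ga.js tracking block from this era looks like the following. The "UA-XXXXXXX-X" profile ID is a placeholder, so copy the exact code from your own Analytics account rather than from here:

```html
<script type="text/javascript">
// Choose http or https host for the ga.js script depending on the page protocol
var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>
<script type="text/javascript">
try {
  var pageTracker = _gat._getTracker("UA-XXXXXXX-X"); // placeholder profile ID
  pageTracker._trackPageview();
} catch(err) {}
</script>
<!-- this whole block sits immediately before </body> -->
```

If your snippet looks different (Google has revised it over time), paste whatever your Analytics account gives you; the placement before </body> is what matters.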

Click Update File to apply your changes. With this mouse click you've just added your Analytics code to every page on your site. You can easily verify this by loading a page of your site in your browser and viewing the source. Navigate to the bottom of the source code and you should see your Analytics code. To speed this process up, I usually search for the characters 'gajshost', as that variable name uniquely identifies the JavaScript code.

WordPress Themes That Handle Your Google Analytics Code

Some clever designers out there have made the task even easier (but let's face it, the above procedure is hardly rocket science) by providing an input box in the WordPress control panel that appears if you use their WordPress theme. Woo Themes, for example, provide a theme-specific options page in which you can supply your Google Analytics JavaScript code, which means that you don't even need to be in the same room as a template file's code.


Use Expired Domains For Affiliate Sites

Back in April 2009, I was watching an expired domain that was won by someone in the Godaddy Expiring Domain Auctions, and recently I decided to check on the progress that the new owner had made since their purchase. Doing this is good practice if you want to get ideas for how to monetise expired domains. The domain in question is mistertab, and the statistics that the domain claimed in the auction were:

  • Traffic – 4037 unique visitors per month
  • PR – 4

Using An Expired Domain For Your Own Purposes

The site used to hold a huge library of guitar tablature and one of my concerns when I was thinking about bidding on the domain myself was how on earth I would maintain all that tablature. It seemed too much like hard work to me. What the new owner has done, though, is slap a sales page on the domain’s homepage that hard sells a Clickbank guitar learning course. No maintenance necessary here. The site receives an impressive 4,000 unique visitors per month who are all interested in playing the guitar. The traffic is not as targeted as “buy guitar course” PPC traffic, but the traffic is free and in the guitar playing niche.

Checking Inbound Links

The site now consists of only one page, but what about all the old pages that used to be on the site and that may still have links pointing at them? In Yahoo Site Explorer, I found a staggering 57,581 links to the entire site and 57,398 to the domain only. That means there are only 183 links to internal pages of the site other than the homepage – i.e. pages that don't exist anymore.

When buying an expired domain, it's a good idea to look after all the existing links and make sure they don't lead to any "page not found" errors. This is done by redirecting them to another page on the site. An expired domain's inbound links are like gold dust and must be taken care of. However, in this case the vast majority of the links point at the domain anyway, so even if the new owner is lazy and doesn't bother with redirects, most of the link juice will remain intact.

I'm making the (potentially dangerous) assumption that those extra 183 links aren't exceedingly valuable (for example, from places like DMOZ, Wikipedia or .edu sites) – those are links you wouldn't want to lose. The only way for me to tell what the owner has done with them is to find each URL they point to, visit it and see whether it redirects to the homepage. But, of course, we don't know how to identify those links, and chasing them is only a distraction from the main focus of this page – how to monetise an expired domain. I suspect that the value of those links is negligible, though. The next public toolbar PR update will tell us whether they passed any significant amount of PR.
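If you wanted to audit leftover inbound-link targets like this yourself, a short script can fetch each old URL and report whether it 404s, still exists, or gets redirected. This is a minimal sketch; the URLs and homepage below are placeholders, not real addresses from the auction:

```python
import urllib.request
import urllib.error

def fetch(url):
    """Follow redirects and return (final_status, final_url)."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status, resp.geturl()
    except urllib.error.HTTPError as e:
        return e.code, url

def classify(old_url, status, final_url, homepage):
    """Label what happened to an old inbound-link target."""
    if status == 404:
        return "404 - link juice wasted"
    if final_url.rstrip("/") == homepage.rstrip("/"):
        return "redirected to homepage"
    if final_url == old_url:
        return "page still exists"
    return "redirected elsewhere"

# Usage sketch (placeholder domain):
# for url in old_urls:
#     status, final = fetch(url)
#     print(url, "->", classify(url, status, final, "http://example.com/"))
```

The classification is deliberately crude: anything that isn't a 404, the homepage, or the original URL is lumped together as "redirected elsewhere".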

The Dangers Of Using Copy And Paste Sale Pages

I worry about that sales page. There are multiple versions of it scattered around the internet, and duplicate content is bad news for search engine rankings. Although some duplicate content does escape the Google detector vans, a site that uses duplicate content risks being filtered out of the SERPs or being removed from the search engine's index completely. It's best avoided. It could be that the new owner knows the risks and that the 4,000 UVs/month arrive via the humongous number of links anyway, so search engine rankings are of no concern.

Expired Domain Development

If you had bought this domain, one option would be to do a little development on the site yourself. This would involve doing some keyword research on guitar-related phrases to determine what people are searching for in this niche and then writing a page about each phrase. See my article on using keyword research to drive content creation for more information on how this can be done. Although the domain receives more than 4,000 uniques/month anyway, I feel confident that the strength of the PR4 domain would help internal pages to rank well with minimal effort. In other words, it wouldn't take long to boost the traffic to well beyond 4,000, simply by writing more highly targeted content. And just maybe, the search phrases that bring those visitors would match some other affiliate products that could be pushed. More moola than you'll know what to do with.

A Quick Way To Monetise Expired Domains

The lesson here? If you find an expired domain that has the following properties, you can quickly put it to work by using it to drive traffic to an affiliate sale:

  • Decent traffic. This is important, as you don’t want to work too hard to build traffic from scratch yourself. Don’t buy an expired domain on the strength of its domain name, links or PR. If you read my 5 Minute Guide To Buying Expired Domains, you can find out how to quickly sift through the thousands of expired domains up for auction at Godaddy – and find a gem.
  • The domain is in a niche that is served by an affiliate product.

Only the owner of mistertab will know how successful their endeavour is, but there is nothing stopping you experimenting with similar expired domains. And if it works, rinse and repeat.

Make Money With Expired Domains

If you’ve followed the 5 Minute Guide To Buying Expired Domains, you may feel pretty confident about quickly and efficiently finding and buying a good expired domain. Once in your possession you could use your expired domain to drive affiliate sales. But is that the only money making option open to you? How do you find alternative methods of monetizing your domain?

Here’s the trick: copy what other successful webmasters do. We have our own little laboratory in which other people perform the experiments at their own risk, and we can observe and assess. The only problem with our study is that we can’t gauge the experimenter’s success. But we can generate ideas for our own endeavours. In a nutshell, we need to:

  • Identify expired domains that have been bought with the sole purpose of making money.
  • Recognise the monetization methods employed for that domain.
  • Assess what kinds of sites are suitable for this kind of monetization.
  • Seek out and buy domains that fall into that category and monetize in the same way.

I’ve done the hard work in the form of analysis, so you don’t need to. Over the last few months I’ve been keeping a record of all the expired domains that were receiving high traffic numbers and that were bought by someone in the Godaddy Auctions. I revisited each domain around a month after the auction end date. This time allows for the 7 day Godaddy transfer period, and also gives some time for the new owner to set the site up.

The results are interesting.

Each entry below shows the auction end date, the claimed traffic (unique visitors per month), the PR, and how the domain is now being monetized:

  • 20th Apr 2009 – 886 UVs – PR 2 – Cash parking.
  • 20th Apr 2009 – 5,397 UVs – PR 4 – I think the site was accidentally dropped by the owner, as the content now on it is the same as the Wayback Machine shows from over a year ago. Many of the links to articles redirect to another site that has been suspended, and I remember that when I checked the site previously, it wasn't suspended. I suspect dodgy dealings, but I have no way of knowing exactly what. When I originally checked this site, it looked like a traffic funnel to another site the owner ran.
  • 20th Apr 2009 – 5,625 UVs – PR 4 – Cash parking.
  • 20th Apr 2009 – 4,037 UVs – PR 4 – Affiliate sales page.
  • 20th Apr 2009 – 684 UVs – PR 4 – Affiliate sales page.
  • 20th Apr 2009 – 593 UVs – PR 2 – Ha ha – it's not what you think! Affiliate sales page.
  • 7th Jun 2009 – 582 UVs – PR 3 – Bought for resale; cash parked in the meantime.
  • 7th Jun 2009 – 928 UVs – PR 5 – Generic content to give the appearance of a genuine content site, but the links in the right sidebar give the game away. The site was bought purely for PR (this domain has a PR5). I would imagine that either the owner owns the sites linked in the sidebar (they're linking to their own sites to give them a boost in the search engines) or that the owner is selling links. I'm sure that checking the WHOIS for those sites would give an indication.
  • 12th Jun 2009 – 12,477 UVs – PR 5 – Generic content used simply as a vehicle for Google Adsense. The site gets a lot of visits so the earnings might make the domain purchase worthwhile. The site isn't indexed by Google though, so there is a dependence on referral traffic. Even so, 12,477 unique visitors via referring sites isn't to be sneezed at.
  • 8th Jul 2009 – 21,537 UVs – PR 0 – Cash parking.
  • 8th Jul 2009 – 15,912 UVs – PR 0 – Adsense for domains?
  • 8th Jul 2009 – 11,852 UVs – PR 1 – Cash parking.
  • 8th Jul 2009 – 10,863 UVs – PR 2 – Cash parking.
  • 8th Jul 2009 – 10,737 UVs – PR 0 – Cash parking.
  • 8th Jul 2009 – 10,096 UVs – PR 0 – Cash parking.
  • 8th Jul 2009 – 9,401 UVs – PR 1 – Cash parking.
  • 8th Jul 2009 – 8,058 UVs – PR 0 – Cash parking.
  • 8th Jul 2009 – 7,734 UVs – PR 0 – Cash parking.
  • 8th Jul 2009 – 7,676 UVs – PR 0 – Cash parking.
  • 8th Jul 2009 – 7,506 UVs – PR 1 – Cash parking.
  • 8th Jul 2009 – 6,842 UVs – PR 2 – Cash parking.
  • 8th Jul 2009 – 6,557 UVs – PR 0 – Cash parking.
  • 8th Jul 2009 – 5,260 UVs – PR 0 – Cash parking.
  • 8th Jul 2009 – 5,219 UVs – PR 3 – WordPress has been installed but no content as yet. Because a CMS is there, I'm guessing that this site won't join the hordes of domains parked for cash that we've already seen. I suspect that the owner is going to develop something here.
  • 8th Jul 2009 – 5,132 UVs – PR 3 – Cash parking.
  • 8th Jul 2009 – 5,108 UVs – PR 0 – Cash parking.
  • 8th Jul 2009 – 3,068 UVs – PR 0 – Cash parking.
  • 9th Jul 2009 – 44,582 UVs – PR 2 – Seems like dummy content used as a vehicle for adverts. Nothing much going on here, but the 44K unique visitors are bound to pull some ad revenue.
  • 9th Jul 2009 – 21,965 UVs – PR 4 – Cash parking.
  • 9th Jul 2009 – 18,978 UVs – PR 0 – Wow. Real content for a change. Strangely, the content dates back to before the domain was being auctioned. Did the original owner forget to renew and have to buy it back?
  • 9th Jul 2009 – 18,834 UVs – PR 0 – Cash parking.
  • 9th Jul 2009 – 18,669 UVs – PR 0 – Some weird content here (not surprising given the macabre domain name) – and that is the highest Adsense/content ratio I've ever seen! 18.5K UVs/month with all those ads must surely be lucrative. Just a (high-traffic) Adsense vehicle.
  • 9th Jul 2009 – 18,206 UVs – PR 0 – Cash parking.
  • 9th Jul 2009 – 15,821 UVs – PR 3 – OMG. The new owner has had a brainwave and is actually doing something useful with this domain. FAIL-style posters. Yes, there's Adsense, but the content is good (is it generic – has the owner just copied existing FAIL-style pics?).
  • 9th Jul 2009 – 15,686 UVs – PR 0 – Cash parking.
  • 9th Jul 2009 – 14,952 UVs – PR 2 – Cash parking.
  • 9th Jul 2009 – 13,942 UVs – PR 0 – Cash parking.
  • 9th Jul 2009 – 10,896 UVs – PR 0 – Cash parking.
  • 9th Jul 2009 – 9,207 UVs – PR 0 – No content yet.
  • 9th Jul 2009 – 8,488 UVs – PR 0 – Cash parking.
  • 9th Jul 2009 – 8,456 UVs – PR 0 – Cash parking.
  • 9th Jul 2009 – 8,110 UVs – PR 0 – Cash parking.
  • 9th Jul 2009 – 8,023 UVs – PR 2 – Cash parking.
  • 9th Jul 2009 – 7,880 UVs – PR 0 – Cash parking.
  • 9th Jul 2009 – 7,315 UVs – PR 0 – Generic content used as a vehicle for Adsense.
  • 9th Jul 2009 – 6,342 UVs – PR 0 – Cash parking.
  • 9th Jul 2009 – 5,473 UVs – PR 0 – Cash parking.
  • 9th Jul 2009 – 5,103 UVs – PR 0 – Cash parking.
  • 9th Jul 2009 – 4,521 UVs – PR 2 – Cash parking.
  • 9th Jul 2009 – 4,341 UVs – PR 2 – Cash parking.
  • 9th Jul 2009 – 4,330 UVs – PR 0 – Cash parking.
  • 15th Jul 2009 – 7,270 UVs – PR 0 – WordPress blog installed but no content yet.
  • 15th Jul 2009 – 19,323 UVs – PR 0 – Cash parking.
  • 15th Jul 2009 – 11,858 UVs – PR 0 – Cash parking.
  • 15th Jul 2009 – 11,293 UVs – PR 3 – Kitten healing? Cash parking!
  • 15th Jul 2009 – 2,955 UVs – PR 0 – Cash parking.

Make Money With Cash Parking

As you can see, the majority of the expired domains above were bought and then parked. On the one hand I’m a little disappointed that sites with obvious potential are simply being parked to make money, instead of being developed. However, on the other hand it’s a business decision to do so. If you can buy a domain that receives 20,000 unique visitors every month why not park it? Developing a site from scratch takes time, effort and money, whereas cash parking takes two minutes to set up. You can then sit back and watch the cost/revenue balance redress itself.

Which Expired Domains Are Right For Cash Parking

Obviously, traffic is an important factor in choosing a domain for cash parking. The more traffic you get, the more revenue you'll make. There is no point in parking a domain that receives no traffic. The lowest-traffic domain above was receiving fewer than 1,000 unique visitors per month. As for what niche is best for your cash-parked domain, it doesn't seem to matter, as we've seen a real mixture. In summary, it seems that if your expired domain is receiving more than 1,000 visitors a month, it is ideal for cash parking. With that in mind, I can feel a little experiment coming on.

Weird Postscript

I’ve just bought the domain for $5. A bargain, especially when you consider that although it has only 4 inbound links, one of them is from a PR8 .edu page! Alas, that page isn’t even indexed, and has no inbound links so the PR showing on the toolbar must be wrong. Hence my link here to get the page indexed and see what happens. Oh well, it was only $5…

Website Sellers

Website sellers. What are we to do with them? It seems that the vast majority are not dealing from a full pack. Take this one, for example. The very first question, a request to post traffic stats, is asked on a Friday, and the seller only gets around to responding to it the following Thursday. That’s customer service for you. You would think the seller doesn’t want to part with their beloved site after all. Strike 1!

You can see that there were actually two questions before the seller decided to respond, and when he did, he completely ignored the first! Strike 2. Making bidders wait and then ignoring their questions. This guy hasn't gotten round to reading "How To Win Friends And Influence People".

I PMed the seller to add me to his Google Analytics account, which, to his credit, he promptly did. Regarding the claimed traffic of 2,100 uniques / month – guess what? That’s right, the real traffic was nowhere near. It was more like 330/month. But of course, a monthly total of 330 uniques won’t sell as well as a total of 2,100. So let’s simply make up the stats. If anyone questions them, we can always silence them with an outburst of gibberish.

Gibberish Corner

If anyone knows what this gibberish means, please enlighten me.

Hmmm – I am getting more than that. I only have it on one page and have been having problems with it. Income is still strong and if Google Analytics is your SOLE source of information that you rely on, then have fun with that as I anyone can manipulate analyitcs :)


You are getting more than that? How? We are both looking at the same stats! And what do you mean you have it on only one page? Just take a look at any random page you pick on the site and you can see the Google Analytics code there! What do you mean Anyone can manipulate analytics???? Duh. If that was true, why didn’t you manipulate it to show decent traffic stats? And just how can someone manipulate those stats? Anyway, it doesn’t matter whether people can manipulate those stats – they weren’t manipulated in this case. Jeez, I don’t know why I bother wasting my time on an auction site that is full of junk and is host to the weakest members of the gene pool who don’t know how to interrelate with other people and see nothing wrong with lying about their sites’ performance. Schoolkids and failed “internet marketers” mostly.

You think sellers are bad? Have you seen the Flippa staff lately? Stay tuned, I can't wait to tell you about that "disappointment". You don't yet know the meaning of "vitriolic rant"!

For G’s sake get me my meds, I’m having a heart attack here!

Therapist’s Note: Paul won’t be posting for a short time while we work on some issues.

Website Analysis – 404 HTTP Error

Yes, it’s time to address those dreaded 404 not found errors. A 404 is returned whenever you have a link to a page that doesn’t exist. For example, somebody links to your about-us.html page but a year later you decide that “About Us” pages are so last year and remove it. Now the link points to… nothing. That creates a 404 not found error.

But what do we care? Well, links are very important for helping our pages rank highly in the search engines and also for funnelling visitors to our site. If we take care of our 404s we can:

  • divert the otherwise wasted SEO value of the links to real pages on our site and help them rank highly.
  • deliver visitors to a meaningful and engaging page on our site instead of a generic and unhelpful 404 not found page.

So, how do we detect 404s and how do we fix them?

Detecting 404 HTTP Errors

Enter Google Webmaster Tools. GWT is easy to set up and provides useful link information about your site (among other things). If you have GWT set up, from the dashboard click Not Found in the Crawl errors section. This presents a list of URLs on your site that don't exist but that have links pointing to them.

Let's pretend you have a million 404s and only a limited time to sort them out. Which ones do you fix first? Fortunately for us, there is a handy column on the right called Linked From that tells us how many inbound links each missing page has. Intuitively, we know we should fix the pages that have the most links first, as then we will gain the most SEO value from those links, and we will also be making the most potential visitors happy. Unfortunately, you can't re-sort this list by descending Linked From value in GWT itself. Luckily, you can "download this table" as a CSV, open it in Excel and then sort the list by descending Linked From value. Phew.
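The CSV-and-Excel shuffle can also be done in a few lines of script. A sketch, assuming the export has columns named "URL" and "Linked From" (check the headers in your own export, as they may differ):

```python
import csv

def top_404s(csv_path, limit=10):
    """Return (url, link_count) pairs, most-linked missing pages first."""
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))
    # Sort by inbound link count, descending: fix the biggest wins first
    rows.sort(key=lambda r: int(r["Linked From"]), reverse=True)
    return [(r["URL"], int(r["Linked From"])) for r in rows[:limit]]
```

Run it over the downloaded file and the top of the list is your redirect to-do list.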


Fixing 404 HTTP Errors

I ain’t technical, but I know I can fix a 404 error with a 301 redirect in my .htaccess file. If these things mean nothing to you, do not despair. There is an upcoming article due at any moment that explains what these arcane terms mean.

Fixing 404 HTTP Errors With A 301 Redirect

Download the .htaccess file from your server so you know you're working on the most up-to-date (or at least the "live") version. Open it in Notepad and paste in the following, substituting your own missing path and destination URL:

Redirect 301 /old.htm http://www.example.com/new.htm

Beware the initial forward slash on the old path. I've missed that off a few times and the sky fell on my head each time.

Now you have the tools to redirect missing URLs, but where do you redirect them to? That is the 64 million dollar question. You have four options:

  • Create a new page whose URL exactly matches the 404 and, hey presto, you don’t even need a redirect. The link will simply point to that new page.
  • Create a new version of the missing page and redirect the old URL to that.
  • Redirect the missing URL to the homepage.
  • Redirect the missing URL to the best matching page. For example, the missing page might be about monkey training, but you actually have a page about dog training – what the hell, it's nearly the same thing. Redirect monkeys to dogs. This reminds me of the time I transplanted a monkey's brain into a dog. Man, that was crazy. Redirecting pages might not be as much fun as transplanting brains, but it has more influence on the search engines. Unless it's Matt Cutts's brain we're talking about…
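To make options 2 to 4 concrete, here is what each might look like in .htaccess. The paths and example.com destinations are invented for illustration; option 1 needs no entry at all, since the new page simply reuses the old URL:

```apache
# Option 2: recreate the page at a new URL and redirect the old one to it
Redirect 301 /about-us.html http://www.example.com/about.html

# Option 3: send a missing URL to the homepage
Redirect 301 /old-news.html http://www.example.com/

# Option 4: send a missing URL to the closest matching page you have
Redirect 301 /monkey-training.html http://www.example.com/dog-training.html
```

One old path gets exactly one Redirect line; the destination should be an absolute URL.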


Buying New Sites And 404s

When you buy an existing site, the chances are that you’ll have to sort out some 404s somewhere along the line. The site I’m currently analysing has a mere 6 pages missing – but it’s early days yet. It could be that Google simply hasn’t found any others yet. One nightmare of a site I bought last year had around 100 404s I had to redirect.

New Website Development – Website Traffic Analysis

Imagine the scene. You’ve just bought an existing and operational website that is currently receiving traffic. However, the website is still an entity unknown to you so you need to start recording traffic statistics that you can analyse at a later date. Only when you’ve analysed those traffic numbers can you get an idea of how the site is performing, how visitors behave on the site and how best to change the site. This article illustrates the process I use when analysing traffic statistics of a site I’ve just acquired. It’s not meant to be a blueprint that you follow religiously (though there are far worse things you could do :) ). Rather, it would be better to try out the ideas presented here, keep what works, throw away what doesn’t and then add your own ideas to… The Method.

Website Traffic Analysis

My tool of choice for recording website traffic is Google Analytics. There may be other analysis packages out there that are better, but my opinion is that GA is the most comprehensive free one available. The first problem we face when taking possession of a new site is that our Google Analytics tracking code is not currently on any of the site’s pages.  We need to add the code to every page.

Adding Google Analytics To A Content Management System

If the site currently runs on some content management system, adding your tracking code is usually a breeze. Some WordPress themes, for example, provide an input box whose sole purpose is to accept your tracking code. Adding it there will automatically put that code on every page. If the theme does not provide such an input box, then you will have to edit one of the templates (usually footer.php) to add it there. Again, once those changes are saved the code appears on all pages of the site. Here is a handy article on adding Google Analytics to WordPress.

Another popular CMS is Joomla. A search of the Joomla extensions directory yields a large selection of modules you can use to add your tracking code. Using a module saves you the hassle of finding the right template whose code you'd otherwise have to amend by hand.

Adding Google Analytics To An HTML Site

I've just bought an existing website and am in the collecting-traffic-data stage. Unfortunately for me, the site was written in straight HTML. This has many advantages, but applying the Google Analytics tracking code to every page is not one of them! In this case I had to work systematically down the list of HTML files and add the code to each one. Here is my streamlined process:

  1. Open the next HTML file in Notepad.
  2. Jump to the end of the file.
  3. Position the cursor before the </body> tag.
  4. Paste in my Google Analytics code.
  5. Ctrl-S to save the file.
  6. Close the file.
  7. Go to step 1.

My new site has around 300 pages, and the total time spent on this mind numbing task was around 30 minutes. That’s 10 pages updated / min. You can use that as a rough guide if you need to estimate how long it will take you to do your site. I did this as soon as I was in control of the hosting account for the domain as I wanted to start recording statistics ASAP. The sooner the analysis stage is complete, the sooner we can actually start working on the site.
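The seven manual steps above are also easy to script if you'd rather not spend 30 minutes in Notepad. A sketch in Python, where SNIPPET is a stand-in for your real tracking code:

```python
import pathlib

SNIPPET = "<!-- Google Analytics code goes here -->"  # placeholder for your real snippet

def add_tracking(root):
    """Insert SNIPPET just before </body> in every .html file under root.

    Skips files that already contain the snippet or have no </body> tag.
    Returns the number of files updated.
    """
    updated = 0
    for path in pathlib.Path(root).rglob("*.html"):
        text = path.read_text(encoding="utf-8")
        if SNIPPET in text or "</body>" not in text:
            continue
        path.write_text(text.replace("</body>", SNIPPET + "\n</body>", 1),
                        encoding="utf-8")
        updated += 1
    return updated
```

Because already-tagged files are skipped, the script is safe to re-run. Back up the site before letting any script loose on 300 live pages, though.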

Monitoring The Site’s Traffic Statistics

I took possession of my site late on 27/07/2009 and frantically added GA code to all 300 pages of the site. That means I have 8 full days of traffic statistics to look at. It makes interesting reading. Here’s what I’ve gleaned so far.

General Website Traffic Analysis

I’m encouraged by the fact that the site received 229 visitors on the first day. So far, in the 8 days I’ve been tracking, the site has been visited by 1,532 unique visitors. That’s an impressive amount straight out of the box, without me even changing a thing on the site. The traffic distribution over sources looks like this:


The search engine share is a good amount for me, because I just love doing SEO. I have great plans for SEOing this site. Once the analysis stage is complete, it’ll be keyword research all the way and already I have ideas for new content based on what I know people are searching for when they reach my site.

The referral traffic amounts to around 1,000 unique visitors / month, which again is a reasonable amount. If the search engines drop the site (I can’t imagine this happening) I know I will still get at least 1,000 visitors / month from other sources.

The direct traffic is puzzling. I know the site hasn't changed in 3 years, and it's obvious that the site is pretty static, so why people have bookmarked the site, or are returning via type-ins, is anybody's guess.

Traffic Trends At Weekends

We’ve seen traffic slumps at the weekend before, and this site experiences them too.


Affiliate Sales Ahoy

The most popular page after the home page (as far as pageviews are concerned) is a page called xxx.html where xxx is the name of a type of product, like “suitcases”. You know what this means, don’t you? Yes, that’s right. That page is perfect for selling items that fall into that category of product. All I need to do is find some affiliate products in that category. I’ve checked Commission Junction and there are plenty of products I can advertise. This will be on the “things to do” list that is produced by the planning stage.

Search Phrases

It’s more good news. The top search phrases that bring traffic to the site all support the idea that affiliate products can be sold here. Let’s pretend that the most popular page mentioned above is called suitcases.html. The top search terms are phrases like:

  • travel suitcase
  • samsonite suitcase
  • vintage suitcase

If people are searching for a product category like “travel suitcase”, there is a good chance they want to buy one. And if they want to buy one, I certainly want to sell them one!

Average Time On Page

The average time spent on a particular page is a useful thing to know. It tells us which pages keep our audiences riveted and which pages make them run a mile. I think there is a danger of reading too much into time on page if the number of pageviews is small, though. For example, the fact that one person visited a page and spent half an hour there doesn't tell us much, because they may just have passed out at the keyboard from boredom and clicked off when they came to. If, however, 100,000 people visited the page and the average time they spent there was half an hour, then you know you have a mesmerising page on your hands!

For my purposes, I'm going to ignore all pages that have fewer than 50 pageviews. I set the pageviews threshold so low because I only have statistics for 8 days. After a month, we'll be able to do a more accurate study.

But… there is one page that has received 5 pageviews and that has an average time on page of 28 minutes and 57 seconds. They can’t all have passed out, surely? Anyway, I’ll look at that when I have more data.


To see the average time spent on page in Google Analytics, click Content > Top Content. To sequence the list by descending times, click on the column heading for Avg. Time On Page. Pages whose visitors stay the longest now appear at the top. Just scanning the list of pages with PVs > 50, I can see some impressively long times:

  • 8:11
  • 4:55
  • 2:25
  • etc

This implies that those pages are getting read in their entirety. That means they are of high enough quality to engage the reader. This is good.

What about pages that don't engage? Let's click on the column heading for Avg. Time On Page to sequence the list by ascending times. Again, we're going to ignore pages that haven't been viewed enough times for these values to be meaningful. The first page I find with more than 50 pageviews has an average time on page of 38 seconds. Is this good or bad? Well, the page simply presents a list of linked articles for the reader to navigate to. That the average person spends 38 seconds reading down the list to find an article they want to read is not unreasonable. For other pages, 38 seconds might be abominably low. For a list of articles, I think it's acceptable. Next page!

The next page has an average time on page of over a minute. Now we are in safe territory. My gut feeling is that any page that keeps the visitor reading for over a minute is doing something right. As all pages further down the list will have longer times, let's stop there. Conclusion: there isn't much wrong with engagement for the pages we looked at. Of course, we only have 8 days of data, and that isn't nearly enough for an accurate assessment. It gives me a warm glow, though.
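The filter-then-sort routine I'm applying by hand in GA can be expressed in a few lines, which also makes the pageview threshold explicit. A sketch over (url, pageviews, avg_seconds) tuples, as you might export them from the Top Content report:

```python
def pages_by_engagement(rows, min_pageviews=50, ascending=False):
    """Sort pages by average time on page, ignoring low-traffic pages.

    rows: (url, pageviews, avg_time_seconds) tuples.
    Descending order surfaces the most engaging pages first;
    ascending=True surfaces the least engaging ones instead.
    """
    kept = [r for r in rows if r[1] >= min_pageviews]
    return sorted(kept, key=lambda r: r[2], reverse=not ascending)
```

The threshold lives in one place, so when a month's worth of data arrives it's a one-character change to demand, say, 500 pageviews instead of 50.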

Website Traffic Analysis – Conclusion

This was only a preliminary investigation to get a rough feel for how the site was performing, and to see whether there was anything seriously wrong we needed to fix. So far it's looking good, but let's not count our chickens. When we have a month's worth of data, we'll be in a better position to draw meaningful conclusions.

One more thing: in addition to analysing website traffic, you’ll need to analyse inbound links. More precisely, you’ll need to find and fix your 404 HTTP errors.

Website Revenue Proof

The guy that listed his website for sale here knew what he was doing. The auction details show a potential bidder everything they need to know without overwhelming them with unnecessary information. The owner has produced a video showing proof of revenue and traffic statistics and in it he comes across as a very affable guy. His profile picture is fantastic. Look at his face and tell me you don’t trust him already! He looks completely dependable.

The site is pretty good too: 60 pages of well written content – a rarity on Flippa. The traffic is good: around 7,000 unique visitors per month, most of them coming via the search engines. Monthly revenue was reported to be $450 which is a very nice sum considering that no work is required to maintain it at that level. That $450 is mostly profit.

So far, this site looks like a Good Thing.

There was only one thing that bothered me about the revenue for this site. The seller did provide proof of earnings that looked valid. However, there was no way that the owner could prove that those earnings came solely from the site being sold. It’s possible that the seller owns other sites in the same niche that display the same affiliate links and that also contribute to the same pot of money. I’m not alleging that he is doing that, only that it’s a possibility we shouldn’t dismiss. The fact that the seller has had an affiliate relationship with the merchant that started years before the site was created lends credibility to the possibility of other streams feeding that revenue. Long before the site was started, the seller made substantial amounts of money by selling the merchant’s product.

This distrust has nothing to do with the seller and how he conducted himself – he couldn’t have been more helpful, and provided a shining example of how to manage an auction. I even got a “don’t let this valuable website slip away” email towards the end of the auction! The distrust arises purely because it’s not possible to prove a one-to-one relationship between the earnings claimed and the site being sold. Adsense, Chitika, Amazon etc. all offer the facility to use channels to segregate data for different sites (and also for different areas within the same site), but unfortunately the affiliate program that the seller was using didn’t have that facility. If only there was some way to uniquely identify the site being sold in the earnings control panel…

This lack of concrete proof didn’t stop the site from fetching a hearty $6,950, though. I think that’s a good price, assuming that the revenue is legit. If there were no affiliate product to push, I imagine that the volume of traffic the site attracts, and also the nature of the content, would lead to some decent Adsense revenue.

Analysing Inbound Links

The steps outlined below involve the use of Microsoft Access 2007 and SQL. If you don’t have this application, or you are scared of SQL (don’t be), turn back now! Or read on, suggest an alternative approach, and I’ll incorporate it into a future post. I initially tried to analyse inbound links using Microsoft Excel, but Excel limits the number of conditions you can specify in your filter criteria to two. If this means nothing to you now, just wait until you’ve read the whole post! It will mean even less.

I was looking at a site for sale today. It was two months old and had decent traffic for a site so young. I’m not much of a link builder, so I thought I would investigate the site’s inbound links. If the site had a large number of inbound links, I would consider buying it; if it didn’t have many, I would forget it and move on.

Get All The Inbound Links Using Yahoo Site Explorer

My first port of call was Yahoo Site Explorer to determine all the links to the site that Yahoo knew about. Note that these aren’t necessarily the links that Google and the other search engines know about, but it’s close enough to get a good idea. I may as well give you the site I was looking at, for demonstration purposes. These are the inbound links, excluding internals. That’s a big list. Straight away I can see that there are links gained from blog commenting on sites like desmondblog and theuniversitykid. I discount these immediately as they are nofollowed and carry no value in Google.
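Whether a given backlink is nofollowed can be checked by looking for `rel="nofollow"` on the anchor tag in the linking page’s source. A minimal Python sketch (a real crawler would parse the HTML properly rather than rely on a regex, and the example URLs are made up):

```python
import re

def is_nofollowed(anchor_html):
    """Return True if an <a> tag carries a rel attribute containing 'nofollow'."""
    match = re.search(r'rel\s*=\s*["\']([^"\']*)["\']', anchor_html, re.I)
    return bool(match) and "nofollow" in match.group(1).lower()

# A typical blog-comment link versus a plain followed link.
comment_link = '<a href="http://example.com/" rel="external nofollow">me</a>'
plain_link = '<a href="http://example.com/">me</a>'

print(is_nofollowed(comment_link))  # True
print(is_nofollowed(plain_link))    # False
```

Blog platforms like WordPress add `rel="nofollow"` to comment links automatically, which is why comment links can be discounted wholesale.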

Further down the list is a vast number of links from namepros. So the site owner is accumulating forum sig links too. I’m not interested in these either, as they contribute very little SEO value. Also, we both know that as soon as the site is transferred to me, those sig links will be long gone. There is no point in the ex-owner promoting a site that is no longer his.

So, I’ve got a big list of links and I want to remove all the ones that I recognise as blog comments and forum posts. Whatever links remain may (only may) be of value.

Export The List Of Inlinks From Yahoo Site Explorer

Those wonderful people at Yahoo provide us with the marvelous facility to export the list of links in TSV (tab separated value) format. We can export the list as a file, save it to our PC and then query the file to get just the links we’re interested in.

When you click the TSV link, you’ll be presented with the familiar ‘save file as’ dialogue box. Find a place on your hard drive to save it and ensure that the file extension is .txt and not .tsv.

Import Your Link Data Into Microsoft Access

Now that we have a file full of inbound links, we can import it into Microsoft Access. I created a database called InboundLinks and then imported the .txt file as a table within that database. I kept the name of the database generic so that I can import other site-specific data in the future. Cunning, I know.

To import the links (you can now see why we changed the file extension to .txt):

1. Click External Data > Import > Text File.
2. Ensure that Import the source data into a new table in the current database is selected, click the Browse button to locate and select your text file, then click OK.
3. On the next dialogue box, ensure that the Delimited option is selected. Although we saved our file as a .txt, it was exported as a TSV (tab separated). Click Next.
4. Select Tab as the delimiter that separates the fields and click Next.
5. We’ll keep things simple and not bother renaming our fields, so click Next again.
6. Let Access add a primary key by making sure that option is selected, then click Next.
7. Either leave the Import To Table field at the default value or give it a meaningful name like ‘OMG I’m going to be rich once I’ve done this link analysis and bought that site‘.
8. Click Finish and then close the dialogue box.

The data is in!
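For readers without Access, the same import can be done for free with Python’s built-in sqlite3 module. A sketch under the assumption that the renamed export is a two-column TSV (title, then inlink URL) – the sample rows and filename are made up:

```python
import csv
import sqlite3

# A couple of sample rows standing in for the renamed TSV export
# from Yahoo Site Explorer: title <tab> inlink URL.
sample_tsv = (
    "Some post\thttp://desmondblog.example/post\n"
    "A forum thread\thttp://namepros.example/thread\n"
)
with open("inlinks.txt", "w") as f:
    f.write(sample_tsv)

# An in-memory database; pass a filename like "InboundLinks.db" to persist it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Url_inlinks (field1 TEXT, field2 TEXT)")

# Read the tab-separated file and load every row into the table.
with open("inlinks.txt", newline="") as f:
    reader = csv.reader(f, delimiter="\t")
    conn.executemany("INSERT INTO Url_inlinks VALUES (?, ?)", reader)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM Url_inlinks").fetchone()[0]
print(count)  # 2
```

Once the rows are in, the same SELECT queries shown later in this post work almost unchanged.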

Analyse The Links With SQL

Double click on the table name on the left and then click the Create tab. We’re going to create a query. Click Query Design and if the Show Table dialogue box appears, close it. We’re going to start loosening nuts and bolts with SQL. In the Results category of the Design tab, click SQL. I know for a fact that some of the inlinks are from desmondblog. We don’t want these, so I’m going to type a SQL statement that selects all records except those whose link contains that string.

SELECT * FROM Url_inlinks WHERE field2 NOT LIKE '**'

Url_inlinks is the name of my inlinks table and field2 is the inlinks field (or column, in Excel terms). Once that’s typed in, click Run to see the links our SQL statement selected. This shortens the list slightly, but there are still links from namepros in there that I don’t want. In fact, here is the list of sources of links I’m going to exclude:


and here is the corresponding SQL statement that excludes those sources:

SELECT * FROM Url_inlinks WHERE field2 NOT LIKE '**'
AND field2 NOT LIKE '**'
AND field2 NOT LIKE '**'
AND field2 NOT LIKE '**'

This shortens the list enough for me to assess the remaining inlinks but, to be honest, I wish I hadn’t embarked on this lengthy, rambling excursion because the links are pretty much worthless! There’s one from the Sitepoint auction page, and a few more assorted blog comments and forum posts. All inconsequential, and none add any real value to the site for sale.
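The same exclusion query can be run in SQLite, with one wrinkle: SQLite’s LIKE wildcard is % rather than Access’s *. A self-contained sketch, with made-up URLs standing in for the real inlinks and exclusion patterns:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Url_inlinks (field2 TEXT)")
conn.executemany(
    "INSERT INTO Url_inlinks VALUES (?)",
    [
        ("http://desmondblog.example/post",),      # blog comment -- exclude
        ("http://namepros.example/thread",),       # forum sig link -- exclude
        ("http://someeditorial.example/review",),  # possibly worthwhile
    ],
)

# NOT LIKE with % wildcards drops every inlink whose URL contains
# one of the sources we want to exclude.
remaining = conn.execute(
    """
    SELECT field2 FROM Url_inlinks
    WHERE field2 NOT LIKE '%desmondblog%'
      AND field2 NOT LIKE '%namepros%'
    """
).fetchall()
print(remaining)
```

Each extra source to exclude is just one more `AND field2 NOT LIKE` clause, with no limit of two conditions like Excel’s filter.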

Still, this will be my blueprint for future link analysis. I’m going to call it the Link Checking Blueprint.

I know that the Office 2007 suite is an extravagant expense simply for checking links, but I’m told that there is database and SQL functionality in OpenOffice. Is there an easier way to get a list of inlinks and then omit the useless ones from view? Do you do this and have a better method? Let me know and I’ll steal your idea and write an ebook around it.

Create A PR7 Site

How To Create A High PR Site

This here is a clever chap. I don’t know where he got his brilliant idea from, but what he’s done is pure genius. He has either created himself, or had someone else create, translations of some core pages on the W3C site. W3C is always looking for volunteers to translate certain pages on its site, so I’m sure his offer of translation services was gratefully received. He may already have established connections with them. Continue reading

Find Connected Sites

Thanks to Experienced People for listing some very useful tools for due diligence on websites.

When researching a site you’re thinking of buying, it’s helpful to know what other sites have the same Google Analytics code or Adsense code embedded on them. Unscrupulous sellers will often use the same Google Analytics code on multiple sites to create the impression that the site they are selling gets more traffic than it really does. Continue reading