Monthly Archives: October 2014

Website Sellers

Website sellers. What are we to do with them? It seems that the vast majority are not dealing from a full pack. Take this one, for example. The very first question, a request to post traffic stats, is asked on a Friday, and the seller only gets around to responding to it the following Thursday. That’s customer service for you. You would think the seller doesn’t want to part with their beloved site after all. Strike 1!

You can see that there were actually two questions before the seller decided to respond, and when he did, he completely ignored the first! Strike 2. Making bidders wait and then ignoring their questions. This guy hasn’t gotten round to reading “How to Win Friends and Influence People”.

I PMed the seller and asked to be added to his Google Analytics account, which, to his credit, he promptly did. Regarding the claimed traffic of 2,100 uniques / month – guess what? That’s right, the real traffic was nowhere near that. It was more like 330 / month. But of course, a monthly total of 330 uniques won’t sell as well as a total of 2,100. So let’s simply make up the stats. If anyone questions them, we can always silence them with an outburst of gibberish.

Gibberish Corner

If anyone knows what this gibberish means, please enlighten me.

Hmmm – I am getting more than that. I only have it on one page and have been having problems with it. Income is still strong and if Google Analytics is your SOLE source of information that you rely on, then have fun with that as I anyone can manipulate analyitcs :)

Thanks

You are getting more than that? How? We are both looking at the same stats! And what do you mean you have it on only one page? Just take a look at any random page you pick on the site and you can see the Google Analytics code there! What do you mean Anyone can manipulate analytics???? Duh. If that was true, why didn’t you manipulate it to show decent traffic stats? And just how can someone manipulate those stats? Anyway, it doesn’t matter whether people can manipulate those stats – they weren’t manipulated in this case. Jeez, I don’t know why I bother wasting my time on an auction site that is full of junk and is host to the weakest members of the gene pool who don’t know how to interrelate with other people and see nothing wrong with lying about their sites’ performance. Schoolkids and failed “internet marketers” mostly.

You think sellers are bad? Have you seen the Flippa staff lately? Stay tuned, I can’t wait to tell you about that “disappointment”. You don’t know what “vitriolic rant” means yet!

For G’s sake get me my meds, I’m having a heart attack here!

Therapist’s Note: Paul won’t be posting for a short time while we work on some issues.

Website Analysis – 404 HTTP Error

Yes, it’s time to address those dreaded 404 not found errors. A 404 is returned whenever you have a link to a page that doesn’t exist. For example, somebody links to your about-us.html page but a year later you decide that “About Us” pages are so last year and remove it. Now the link points to… nothing. That creates a 404 not found error.
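
Under the hood, “returns a 404” just means the web server answers the request with a 404 status line instead of the usual 200. Stripped right down, the exchange looks something like this (hypothetical URL, most headers omitted):

GET /about-us.html HTTP/1.1
Host: www.yourdomain.com

HTTP/1.1 404 Not Found
Content-Type: text/html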

But what do we care? Well, links are very important for helping our pages rank highly in the search engines and also for funnelling visitors to our site. If we take care of our 404s we can:

  • divert the otherwise wasted SEO value of the links to real pages on our site and help them rank highly.
  • deliver visitors to a meaningful and engaging page on our site instead of a generic and unhelpful 404 not found page.

So, how do we detect 404s and how do we fix them?

Detecting 404 HTTP Errors

Enter Google Webmaster Tools. GWT is easy to set up and provides useful link information about your site (among other things). Once it’s set up, from the dashboard click Not Found in the Crawl errors section. Doing this presents a list of URLs on your site that don’t exist but that have links pointing to them.

Let’s pretend you have a million 404s and only a limited time to sort them out. Which ones do you fix first? Fortunately for us, there is a handy column to the right called Linked From that tells us how many inbound links each missing page has. Intuitively we know we should fix the pages with the most links first, as they will yield the most SEO value and make the most potential visitors happy. Unfortunately, you can’t sort this list by descending Linked From value within GWT itself. Luckily, you can “download this table” as a CSV, open it in Excel and then sort the list by descending Linked From value. Phew.

[Image: redirect-404]

Fixing 404 HTTP Errors

I ain’t technical, but I know I can fix a 404 error with a 301 redirect in my .htaccess file. If these things mean nothing to you, do not despair. An article explaining what these arcane terms mean is due any moment now.

Fixing 404 HTTP Errors With A 301 Redirect

Download the .htaccess file from your server so you know you’re working on the most up-to-date (or at least the “live”) version. Open it in Notepad and paste in the following:

Redirect 301 /old.htm http://www.yourdomain.com/new.htm

Beware the initial forward slash on the old path. I’ve missed it off a few times and the sky fell on my head each time.
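
To make that concrete, here is the mistake next to the fix (page names invented for illustration):

# Wrong – no leading slash on the old path, so Apache rejects the directive:
Redirect 301 old.htm http://www.yourdomain.com/new.htm

# Right:
Redirect 301 /old.htm http://www.yourdomain.com/new.htm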

Now you have the tools to redirect missing URLs, but where do you redirect them to? That is the 64 million dollar question. You have four options:

  • Create a new page whose URL exactly matches the 404 and, hey presto, you don’t even need a redirect. The link will simply point to that new page.
  • Create a new version of the missing page and redirect the old URL to that.
  • Redirect the missing URL to the homepage.
  • Redirect the missing URL to the best matching page. For example, the missing page might be about monkey training, but you actually have a page about dog training – what the hell, it’s nearly the same thing. Redirect monkeys to dogs. This reminds me of the time I transplanted a monkey’s brain into a dog. Man, that was crazy. Redirecting pages might not be as much fun as transplanting brains, but it has more influence on the search engines. Unless it’s Matt Cutts’s brain we’re talking about… (There’s a sketch of options 2 to 4 in .htaccess form right after this list.)
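
As promised, a minimal .htaccess sketch of options 2 to 4 (all page names invented for illustration):

# Option 2: send the old URL to a rebuilt version of the page
Redirect 301 /about-us.html http://www.yourdomain.com/about.html

# Option 3: send the old URL to the homepage
Redirect 301 /old-news.html http://www.yourdomain.com/

# Option 4: send the old URL to the best matching page
Redirect 301 /monkey-training.html http://www.yourdomain.com/dog-training.html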

[Image: transplant-a-monkey-brain]

Buying New Sites And 404s

When you buy an existing site, the chances are that you’ll have to sort out some 404s somewhere along the line. The site I’m currently analysing has a mere 6 pages missing – but it’s early days yet. It could be that Google simply hasn’t found any others yet. One nightmare of a site I bought last year had around 100 404s I had to redirect.

New Website Development – Website Traffic Analysis

Imagine the scene. You’ve just bought an existing and operational website that is currently receiving traffic. However, the website is still an entity unknown to you so you need to start recording traffic statistics that you can analyse at a later date. Only when you’ve analysed those traffic numbers can you get an idea of how the site is performing, how visitors behave on the site and how best to change the site. This article illustrates the process I use when analysing traffic statistics of a site I’ve just acquired. It’s not meant to be a blueprint that you follow religiously (though there are far worse things you could do :) ). Rather, it would be better to try out the ideas presented here, keep what works, throw away what doesn’t and then add your own ideas to… The Method.

Website Traffic Analysis

My tool of choice for recording website traffic is Google Analytics. There may be other analysis packages out there that are better, but my opinion is that GA is the most comprehensive free one available. The first problem we face when taking possession of a new site is that our Google Analytics tracking code is not currently on any of the site’s pages.  We need to add the code to every page.

Adding Google Analytics To A Content Management System

If the site currently runs on some content management system, adding your tracking code is usually a breeze. Some WordPress themes, for example, provide an input box whose sole purpose is to accept your tracking code. Adding it there will automatically put that code on every page. If the theme does not provide such an input box, then you will have to edit one of the templates (usually footer.php) to add it there. Again, once those changes are saved the code appears on all pages of the site. Here is a handy article on adding Google Analytics to WordPress.

Another popular CMS is Joomla. Just searching the Joomla extensions directory yields a large selection of modules you can use to add your tracking code. Using a module saves you the hassle of finding the right template whose code you’d otherwise have to amend by hand.

Adding Google Analytics To An HTML Site

I’ve just bought an existing website and am in the traffic-data-collection stage. Unfortunately for me, the site was written in straight HTML. This has many advantages, but applying the Google Analytics tracking code to every page is not one of them! In this case I had to work systematically down the list of HTML files and add the code to each one. Here is my streamlined process:

  1. Open the next HTML file in Notepad.
  2. Jump to the end of the file.
  3. Position the cursor before the </body> tag.
  4. Paste in my Google Analytics code (there’s a sketch of the end result just after this list).
  5. Ctrl-S to save the file.
  6. Close the file.
  7. Go to step 1.
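
And here’s that sketch of the end result. It uses the asynchronous ga.js snippet Google was handing out around that time – the exact code shown in your own Analytics account is what you should actually paste, and UA-XXXXXXX-X is a placeholder for your property ID:

<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXXXX-X']); // placeholder - use your own ID
  _gaq.push(['_trackPageview']);
  (function() {
    // Load ga.js asynchronously so it doesn't block page rendering.
    var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
  })();
</script>
</body>

(Google’s own instructions put the async flavour just before the closing </head> tag, but either location records the pageview.)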

My new site has around 300 pages, and the total time spent on this mind-numbing task was around 30 minutes. That’s 10 pages updated per minute. You can use that as a rough guide if you need to estimate how long it will take to do your own site. I did this as soon as I was in control of the hosting account for the domain, as I wanted to start recording statistics ASAP. The sooner the analysis stage is complete, the sooner we can actually start working on the site.
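
If you’re handier with scripts than I am, something like this little Python sketch would do the same job in seconds. It assumes the tracking snippet is saved in a file called ga.txt and the .html files live in the current directory – both assumptions, so test it on copies first:

import os

# Read the Google Analytics snippet once (saved beforehand in ga.txt).
with open('ga.txt') as f:
    ga_code = f.read()

for name in os.listdir('.'):
    if not name.endswith('.html'):
        continue
    with open(name) as f:
        html = f.read()
    # Insert the tracking code just before the closing </body> tag,
    # skipping files that lack the tag or already carry the snippet.
    if '</body>' in html and ga_code not in html:
        with open(name, 'w') as f:
            f.write(html.replace('</body>', ga_code + '\n</body>', 1))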

Monitoring The Site’s Traffic Statistics

I took possession of my site late on 27/07/2009 and frantically added GA code to all 300 pages of the site. That means I have 8 full days of traffic statistics to look at. It makes interesting reading. Here’s what I’ve gleaned so far.

General Website Traffic Analysis

I’m encouraged by the fact that the site received 229 visitors on the first day. So far, in the 8 days I’ve been tracking, the site has been visited by 1,532 unique visitors. That’s an impressive amount straight out of the box, without me even changing a thing on the site. The traffic distribution over sources looks like this:

[Image: traffic-sources]

The search engine share is a good amount for me, because I just love doing SEO. I have great plans for SEOing this site. Once the analysis stage is complete, it’ll be keyword research all the way and already I have ideas for new content based on what I know people are searching for when they reach my site.

The referral traffic amounts to around 1,000 unique visitors / month, which again is a reasonable amount. If the search engines drop the site (I can’t imagine this happening) I know I will still get at least 1,000 visitors / month from other sources.

The direct traffic is puzzling. I know the site hasn’t changed in 3 years, and it’s obviously pretty static, so why people have bookmarked it, or keep returning via type-ins, is anybody’s guess.

Traffic Trends At Weekends

We’ve seen traffic slumps at the weekend before, and this site experiences them too.

[Image: traffic-down-at-weekends]

Affiliate Sales Ahoy

The most popular page after the home page (as far as pageviews are concerned) is a page called xxx.html where xxx is the name of a type of product, like “suitcases”. You know what this means, don’t you? Yes, that’s right. That page is perfect for selling items that fall into that category of product. All I need to do is find some affiliate products in that category. I’ve checked Commission Junction and there are plenty of products I can advertise. This will be on the “things to do” list that is produced by the planning stage.

Search Phrases

It’s more good news. The top search phrases that bring traffic to the site all support the idea that affiliate products can be sold here. Let’s pretend that the most popular page mentioned above is called suitcases.html. The top search terms are phrases like:

  • travel suitcase
  • samsonite suitcase
  • vintage suitcase

If people are searching for a product category like “travel suitcase”, there is a good chance they want to buy one. And if they want to buy one, I certainly want to sell them one!

Average Time On Page

The average time spent on a particular page is a useful thing to know. It tells us which pages keep our audience riveted and which make them run a mile. I think there is a danger of reading too much into time on page when the number of pageviews is small, though. For example, the fact that one person visited a page and spent half an hour there doesn’t tell us much, because they may just have passed out at the keyboard from boredom and clicked away when they came to. If, however, 100,000 people visited the page and the average time they spent there was half an hour, then you know you have a mesmerising page on your hands!

For my purposes, I’m going to ignore all pages with fewer than 50 pageviews. I set the threshold that low because I only have statistics for 8 days. After a month, we’ll be able to do a more accurate study.

But… there is one page that has received 5 pageviews and that has an average time on page of 28 minutes and 57 seconds. They can’t all have passed out, surely? Anyway, I’ll look at that when I have more data.

[Image: average-time-on-page]

To see the average time spent on page in Google Analytics, click Content > Top Content. To sequence the list by descending times, click on the column heading for Avg. Time On Page.  Now pages whose visitors stay on the page for the longest time appear at the top. Just scanning the list of pages with PVs > 50 I can see some impressively long times:

  • 8:11
  • 4:55
  • 2:25
  • etc

This implies that those pages are getting read in their entirety. That means they are of high enough quality to engage the reader. This is good.

What about pages that don’t engage? Let’s click on the column heading for Avg. Time On Page to sequence the list by ascending times. Again, we’re going to ignore pages that haven’t been viewed enough times for these values to be meaningful. The first page I find with pageviews greater than 50 has an average time on page of 38 seconds. Is this good or bad? Well, the page simply presents a list of linked articles for the reader to navigate to. That the average person spends 38 seconds reading down the list to find an article they want is not unreasonable. For other pages, 38 seconds might be abominably low. For a list of articles, I think it’s acceptable. Next page!

The next page has an average time on page of over a minute. Now we are in safe territory. My gut feel is that any page that keeps the visitor reading for over a minute is doing something right. As all pages further down the list are going to have longer times, let’s stop there. Conclusion: there isn’t much wrong with engagement for the pages we looked at. Of course, we only have 8 days of data, and that isn’t nearly enough for an accurate assessment. It gives me a warm glow, though.

Website Traffic Analysis – Conclusion

This was only a preliminary investigation to get a rough feel for how the site was performing, and to see whether there was anything seriously wrong that needed fixing. So far it’s looking good, but let’s not count our chickens. When we have a month’s worth of data, we’ll be in a better position to draw meaningful conclusions.

One more thing: in addition to analysing website traffic, you’ll need to analyse inbound links. More precisely, you’ll need to find and fix your 404 HTTP errors.

Website Revenue Proof

The guy that listed his website for sale here knew what he was doing. The auction details show a potential bidder everything they need to know without overwhelming them with unnecessary information. The owner has produced a video showing proof of revenue and traffic statistics and in it he comes across as a very affable guy. His profile picture is fantastic. Look at his face and tell me you don’t trust him already! He looks completely dependable.

The site is pretty good too: 60 pages of well written content – a rarity on Flippa. The traffic is good: around 7,000 unique visitors per month, most of them coming via the search engines. Monthly revenue was reported to be $450 which is a very nice sum considering that no work is required to maintain it at that level. That $450 is mostly profit.

So far, this site looks like a Good Thing.

There was only one thing that bothered me about the revenue for this site. The seller did provide proof of earnings from somewhere that looked valid. However, there was no way that the owner could prove that those earnings came solely from the site being sold. It’s possible that the seller owns other sites in the same niche that display the same affiliate links and that also contribute to the same pot of money. I’m not alleging that he is doing that, only that it’s a possibility that we shouldn’t dismiss. The fact that the seller has had an affiliate relationship with the merchant that started years before the site was created lends credibility to the possibility of other streams feeding that revenue. Way before the site was started, the seller made substantial amounts of money by selling the merchant’s product.

This distrust has nothing to do with the seller and how he conducted himself – he couldn’t have been more helpful, and provided a shining example of how to manage an auction. I even got a “don’t let this valuable website slip away” email towards the end of the auction! The distrust arises purely because it’s not possible to prove a one to one relationship between the earnings claimed and the site being sold. Adsense, Chitika, Amazon etc all offer the facility to use channels to segregate data for different sites (and also different areas within the same site), but unfortunately the affiliate program that the seller was using didn’t have that facility. If only there was some way to uniquely identify the site being sold in the earnings control panel…

This lack of concrete proof didn’t stop the site being sold for a hearty $6,950 though. I think that’s a good price, assuming that the revenue is legit. If there was no affiliate product to push, I imagine that the volume of traffic that the site attracts, and also the nature of the content, would lead to some decent Adsense revenue.

Analysing Inbound Links

The steps outlined below involve the use of Microsoft Access 2007 and SQL. If you don’t have this application, or you are scared of SQL (don’t be) turn back now! Or read on and then suggest an alternative approach and I’ll incorporate that into a future post. I initially tried to analyse inbound links using Microsoft Excel, but Excel limits the number of conditions you can specify in your filter criteria to 2. If this means nothing to you now, just wait until you’ve read the whole post! It will mean even less.

I was looking at a site for sale today. It was two months old and had decent traffic for a site so young. I’m not much of a link builder so I thought I would investigate the site’s inbound links. If the site had a large number of inbound links, then I would consider buying the site and if it didn’t have many then I would forget it and move on.

Get All The Inbound Links Using Yahoo Site Explorer

My first port of call was Yahoo Site Explorer, to determine all the links to the site that Yahoo knew about. Note that these aren’t necessarily the links that Google and the other search engines know about, but it’s close enough to get a good idea. I may as well give you the site I was looking at, for demonstration purposes. These are the inbound links, excluding internals. That’s a big list. Straight away I can see that there are links gained from blog commenting on sites like desmondblog and theuniversitykid. I discount these immediately as they are nofollowed and have no value in Google.

Further down the list is a vast selection of links from namepros. So the site owner is accumulating forum sig links too. I’m not interested in these either, as they contribute very little SEO value. Also, we both know that as soon as the site is transferred to me, those sig links will be long gone. There is no point in the ex-owner promoting a site that is no longer his.

So, I’ve got a big list of links and I want to remove all the ones that I recognise as blog comments and forum posts. What links remain may (only may) be of value.

Export The List Of Inlinks From Yahoo Site Explorer

Those wonderful people at Yahoo provide us with the marvellous facility to export the list of links in TSV (tab separated values) format. We can export the list to a file, save it to our PC and then query the file to get just the links we’re interested in.

When you click the TSV link, you’ll be presented with the familiar ‘save file as’ dialogue box. Find a place on your hard drive to save it and ensure that the file extension is .txt and not .tsv.

Import Your Link Data Into Microsoft Access

Now that we have a file full of inbound links, we can import it into Microsoft Access. I created a database called InboundLinks and then imported the .txt file as a table within that database. I kept the name of the database generic so that I can import other site specific data in the future. Cunning, I know.

To import the links, click External Data > Import > Text File (you can now see why we changed the file extension to .txt). Ensure that Import the source data into a new table in the current database is selected and click the browse button to locate and select your text file. Click OK. On the next dialogue box, ensure that the Delimited option is selected. Although we saved our file as a .txt, it was exported as a TSV (tab separated). Click Next and then select Tab as the delimiter that separates the fields. Click Next. We’ll keep things simple and not bother renaming our fields, so click Next again. Let Access add a primary key by ensuring that that is selected and click Next. Either leave the Import To Table field at the default value or give it a meaningful name like ‘OMG I’m going to be rich once I’ve done this link analysis and bought that site‘. Click Finish and then close the dialogue box.

The data is in!

Analyse The Links With SQL

Double click on the table name on the left and then click the Create tab. We’re going to create a query. Click Query Design and if the Show Table dialogue box appears, close it. We’re going to start loosening nuts and bolts with SQL. In the Results category on the Design tab, click SQL. I know for a fact that some of the inlinks are from desmondblog. We don’t want these so I’m going to type a SQL statement that selects all records except those whose link contains ‘desmondblog.com’.

SELECT * FROM Url_inlinks WHERE field2 NOT LIKE '*desmondblog.com*'

Url_inlinks is the name of my inlinks table and field2 is the inlinks field (or column, if you think in Excel terms). Note that Access uses * as its LIKE wildcard where most other SQL dialects use %. Once that’s typed in, click Run to see the links our SQL statement selected. This shortens the list slightly, but there are still links from namepros in there that I don’t want. In fact, here is the list of link sources I’m going to exclude:

  • desmondblog.com
  • theuniversitykid.com
  • namepros.com
  • webdesigntalk.net

and here is the corresponding SQL statement that excludes those sources:

SELECT * FROM Url_inlinks WHERE field2 NOT LIKE '*desmondblog.com*'
AND field2 NOT LIKE '*theuniversitykid.com*'
AND field2 NOT LIKE '*namepros.com*'
AND field2 NOT LIKE '*webdesigntalk.net*'

This shortens the list enough for me to assess the remaining inlinks but, to be honest, I wish I hadn’t embarked on this lengthy, rambling excursion because the links are pretty much worthless! There’s one from the Sitepoint auction page, and a few more assorted blog comments and forum posts. All inconsequential, and none add any real value to the site for sale.
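
Incidentally, if you just want a headline number rather than eyeballing rows, a small variation on the same query (same table and field names as above) counts whatever survives the cull:

SELECT Count(*) AS RemainingLinks
FROM Url_inlinks
WHERE field2 NOT LIKE '*desmondblog.com*'
AND field2 NOT LIKE '*theuniversitykid.com*'
AND field2 NOT LIKE '*namepros.com*'
AND field2 NOT LIKE '*webdesigntalk.net*'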

Still, this will be my blueprint for future link analysis. I’m going to call it the Link Checking Blueprint.

I know that the Office 2007 suite is an extravagant expense simply for checking links, but I’m told there is database and SQL functionality in OpenOffice. Is there an easier way to get a list of inlinks and then omit the useless ones from view? Do you do this and have a better method? Let me know and I’ll steal your idea and write an ebook around it.

Create A PR7 Site

How To Create A High PR Site

This here is a clever chap. I don’t know where he got his brilliant idea from, but what he’s done is pure genius. He has either created himself, or had someone else create, translations of some core pages on the W3C site. W3C are always looking for volunteers to translate certain pages on their site, so I’m sure his offer of translation services would have been gratefully received. He may already have established connections with them. Continue reading

Find Connected Sites

Thanks to Experienced People for listing some very useful tools for due diligence on websites.

When researching a site you’re thinking of buying, it’s helpful to know what other sites have the same Google Analytics code or Adsense code embedded on them. Unscrupulous sellers will often use the same Google Analytics code on multiple sites to create the impression that the site they are selling gets more traffic than it really does. Continue reading

Buying And Selling Websites Forum

Thank the Lord that this has happened! There is finally a forum devoted to the discussion of buying and selling websites. This was well overdue. The sub forum at sitepoint filled this space for a while but was soon overwhelmed by the ubiquitous, and inevitable, sig link spammers. Any frequent visitor to the SP forum will have noted its decline. Continue reading

How To Block Flash Ads In Adsense

Recently, a couple of my sites were overwhelmed by irrelevant Adsense adverts relating to the credit/loans niche. The mention of Harrington Brooks and Adsense in the same sentence still makes me shudder! My sites had nothing to do with credit/loans, so of course no visitors were likely to click on these adverts. Which means wasted page impressions that bring my Adsense CTR way down. Continue reading