Tips, Tricks, Tools & Techniques

for Internet Business, Life, the Universe and Everything


Month: September, 2007

How to install a web analytics tool the right way

28 September, 2007 (13:20) | Testing, Tools, WordPress | By: Nick Dalton

Welcome to the first installment in the Tracking, Testing and Tweaking Your Web Site series. If you haven’t yet suggested topics you’d like to see covered in the future, it’s not too late to do so.

In this first article I’m going to cover installing a web analytics program for your web site. It is the first tool you should get into place to track your visitors. Even if you’re already using Google Analytics or a similar tool, please read on as there are some important tips in this article that will help you later on.

Web Analytics Tools

Your web hosting account probably came with a web statistics tool like Webalizer or AWStats. These tools can show you some statistics about your visitors and their page views, but they do not track visitor behavior very well. So ignore those tools.

The “installation” of most modern web analytics programs consists of adding a small piece of JavaScript code to each web page on your site. Since this code needs to go on every page, most vendors recommend that you place it in the footer of your site (assuming that you have a footer file that is included on all web pages). There is another reason for placing the code at the end of the page: the JavaScript code takes some time to execute, and if it’s at the bottom of the page the visitor can view and read the full web page while the browser is still processing the web analytics JavaScript. However, if you want to do more advanced web analytics, installing the JavaScript at the end of the page is sometimes the wrong thing to do. More about that later.

I suggest you use two different web analytics tools: one simple tool for real time information and another one for more advanced analytics. Real time information can be useful to react to abnormal traffic. For example: Last week one of my old blog posts became popular on StumbleUpon and I received over 500 visitors within a few hours. While it was a good blog post, it didn’t have any affiliate links. So I quickly added a few to capitalize on the traffic surge.

There are many, many web analytics tools to choose from. The simple tools have similar basic web analytics features and offer detailed data on the last 100–500 page views for free. If you want a larger buffer you have to pay a monthly fee. (Obviously if you’re receiving 10,000 page views per day then 500 is only going to last you an hour or so, which is probably not all that useful.) The advantage of these simple tools is that the data is shown practically in real time. You can literally follow along as visitors are surfing your site. Not that this is a good use of your time…


The simple web analytics tool that I use is StatCounter. Sign up for a free account and then answer a few questions about your site. Of these questions there are a few that are significant: Make sure that you add your own IP address to the IP Blocking section so that your own page views on the site are not included in the stats. StatCounter helpfully displays your current IP address next to the entry field. If your ISP doesn’t give you a static IP address you can block out a whole range of addresses with a * at the end (76.81.54.*). If you still see your own traffic occasionally you might consider blocking an even larger range (76.81.*.*).
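If you’re unsure what a wildcard like 76.81.54.* will and won’t match, the shell’s own glob matching follows the same intuition. Here is a small sketch; the function name is mine, and the assumption that StatCounter’s * behaves like a shell glob is mine too, not StatCounter’s documentation:

```shell
# Check whether an IP address falls inside a wildcard block pattern.
# This mimics the intuitive meaning of StatCounter's * wildcard using
# the shell's own glob matching; the exact matching rules are theirs.
ip_blocked() {
  ip="$1"
  pattern="$2"
  case "$ip" in
    $pattern) echo "blocked" ;;   # pattern is left unquoted on purpose so it acts as a glob
    *)        echo "counted" ;;
  esac
}

ip_blocked "76.81.54.162" "76.81.54.*"   # your own dynamic IP: blocked
ip_blocked "76.82.1.1"    "76.81.54.*"   # an ordinary visitor: counted
```

The wider pattern 76.81.*.* works the same way, just with two wildcard positions, which is why it catches more of your own traffic at the cost of also blocking other customers of your ISP.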

The other important configuration with StatCounter is the look of the “counter”. In the early days of the web it was cool to display a counter on your web pages to show how popular your site was. For a business web site this looks very unprofessional, so you should be sure to select the “Invisible” counter.

After all the questions are answered you will get a snippet of JavaScript code to install on your web site. It will look something like this:
<!-- StatCounter -->
<script type="text/javascript">
var sc_project=xxxxxxx;
var sc_invisible=1;
var sc_partition=29;
var sc_security="xxxxxxxxx";
</script>
<script type="text/javascript" src=""></script>
<noscript><div class="statcounter"><a class="statcounter" href=""><img class="statcounter" src="" alt="website stats" /></a></div></noscript>

This JavaScript code should be added just before the </body> tag at the end of each of your web pages. If you’re using WordPress, the file to edit is called footer.php within the Theme you have selected for your blog. After you have added the code, do a View Source in your browser to make sure that it was installed correctly. If you do a View Source on this web page and scroll down to the bottom you will see what it should look like. Also, given the real-time nature of StatCounter, you should quickly start seeing traffic statistics in the tool.
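You can also verify the installation from the command line instead of View Source. A sketch, assuming curl and grep are available; the page written below is a stand-in for your own site’s HTML, which in real use you would fetch first with something like curl -s http://www.yoursite.com/ -o page.html:

```shell
# Verify that the StatCounter snippet actually made it into the served HTML.
has_tracker() {
  grep -qi "statcounter" "$1" && echo "found" || echo "missing"
}

# Simulate a fetched page so the check can be demonstrated locally:
cat > page.html <<'EOF'
<html><body>Hello
<!-- StatCounter -->
<script type="text/javascript">var sc_project=1234567;</script>
</body></html>
EOF

has_tracker page.html
```

If the check reports "missing", the usual suspects are editing the wrong theme file or a caching plugin serving an old copy of the page.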

Google Analytics

The other web analytics tool that I use is Google Analytics. It is pretty advanced and you can’t beat the price: free. (You may have to sign up for a Google AdWords account to use Google Analytics, which may require a $5 activation fee. But there is no requirement to actually spend money on AdWords to use Google Analytics.)

In Google Analytics you should also block your own IP address. This is done in Analytics Settings -> Profile Settings -> Add Filter. Select Filter Type “Exclude all traffic from an IP address” and then enter your IP in this format: 76\.81\.54\.162 (Google uses full regular expressions here, hence the backslash in front of each dot. Click on the help link for more information.)
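The escaping is mechanical, so you can generate it rather than typing the backslashes by hand. A small sketch using sed; the IP address is the example from above:

```shell
# Convert a plain IP address into the backslash-escaped form that the
# Google Analytics filter field expects. In a regular expression an
# unescaped dot matches any character, so each dot must become \.
ip="76.81.54.162"
escaped=$(printf '%s' "$ip" | sed 's/\./\\./g')
echo "$escaped"   # prints 76\.81\.54\.162
```

Without the escaping, the pattern 76.81.54.162 would also match addresses like 76a81b54c162, which is harmless here but sloppy; with a range filter the difference can actually matter.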

I recommend that you place the Google Analytics JavaScript code in the <head> section of your web pages. This may seem counter to what I said above. But GA offers a JavaScript function called urchinTracker that allows you to track outbound links, among other things. To use urchinTracker the GA tracking code must be placed before any links or events that try to call it. The <head> section is a good place to ensure this.

For WordPress users the file to edit for the <head> section is header.php. If you’re not comfortable editing PHP files then there are several WordPress plugins that make it easier to install web analytics tracking code. Just make sure you get a plugin that allows you to control where on the page the code is added.

Again View Source on this page to see what the completed installation should look like.

That completes the installation of your web analytics tools. I know this was a very brief tutorial, so if there is enough interest I may create a Camtasia video to illustrate each step.

Customize your BlogRush widget with this WordPress plugin

28 September, 2007 (00:15) | Tools, WordPress | By: Nick Dalton

This post used to contain a nice description of a new WordPress plugin that I wrote. The plugin allowed you to completely change the style of the BlogRush widget by supplying your own stylesheet. With this feature it’s much easier to get the BlogRush widget to blend into the style of your blog. However John Reese made a good point that if each blog customizes the look of the BlogRush widget, click-through rates may vary wildly throughout the system. Therefore John requested that I take down the original post and the link to my plugin.

This is not the first nor will it be the last useful/controversial WordPress plugin I’m developing. So stay tuned! (A good way to do that is to subscribe to the RSS feed or to the email notifications for this blog.)

How to track, test and tweak everything on your web site

21 September, 2007 (15:20) | Testing | By: Nick Dalton

Half the time and money you spend on driving traffic to your web site is wasted. Do you know which half?

Whether your web site is for fun or profit, you probably want to keep improving it in some way: more readers, more product sales, higher conversion rates, etc. To accomplish this – other than by dumb luck – you need to constantly track, test and tweak every aspect of your site.

I’m writing a series of blog posts on this theme starting with simple topics like using a web analytics program to track referrers and keywords that bring traffic to your site. More advanced topics will include:

  • How to set up and track outbound links from your site.
  • How to set up split testing and multivariate testing, and know the difference.
  • How to track the number of people who sign up to your mailing list or RSS feed.
  • How to track keywords and referrer information all the way to sales through an affiliate network like ClickBank or Commission Junction.
  • How to track the effectiveness of your article marketing.

What are you struggling with in terms of tracking and testing? What would you like to learn more about?

Even if you can’t think of any new topics, please vote for one of the above so that I know what to cover in depth and what topics there is little interest in.

Please post your questions as a comment or trackback below. No questions are too simple or too trivial, but please keep them on topic.

And don’t forget to bookmark this site and come back to read the entire article series. Or better yet subscribe to the RSS feed using the link at the top right corner of this page, or if you prefer email notifications, just fill out the small form to the right.

Update: Read the first installment now – How to install a web analytics tool the right way

How do you write a 100 page book in 12 hours?

20 September, 2007 (10:08) | Copywriting, Life | By: Nick Dalton

Preselling a product before you create it is usually a good strategy. That way you can determine if there is a viable market before you spend time and resources on developing the product. My friend Ken McArthur (famous for jvAlert, jvAlert Live and his digital watch) has managed to presell his upcoming book so well that a major publisher is making it their headline book this spring. A big advance has been paid, full color ads are being printed, everything for a big book campaign is in motion.

There is just one little problem: Ken hasn’t written his book yet! Maybe he took this preselling concept a little too far…

I’ve written many reports and ebooks so I know that writing a full length book is a lot of work, at least it is for me. So I would feel under pressure and be a bit worried if I was in Ken’s shoes. When I’m embarking on a new venture where I have no prior experience I always try to find a good mentor who’s already successful at what I’m trying to accomplish. For his book writing project Ken has engaged one of the best in the business: Glenn Dietzel and his team from Awaken the Author Within. They are confident that Ken will be able to write a bestselling book in just 12 hours.

12 hours for a hundred pages is about 8 pages per hour, or about 30 words per minute. Whoa! That’s half the speed of a good typist doing clerical work. Ken not only has to type this fast, but presumably also put some thought into what he is writing. I’ve got to see how this system works! At that speed I could crank out 10 long posts per day for this blog…

Ken has invited me (and you) to watch over his shoulder as he’s writing his book. You will learn the tips and tricks Glenn and his team use to make Ken a bestselling author in record time. If you’re doing any amount of writing this could be a very interesting and educational journey to watch. Ken is a big and generous man, but there is limited space to watch over his shoulder. You can sign up for a spot at

Can BlogRush survive its own success?

19 September, 2007 (13:19) | Technical Architecture | By: Nick Dalton

When you sign up for a web hosting account you have to select from a myriad of server configurations with different memory sizes and processor speeds. How do you know what to choose? If you are just running a blog or a product sales site then I can tell you that in almost all cases it doesn’t matter. Even the smallest web server will handle the load of a new web site. But what if your web site becomes the next YouTube overnight? Then it doesn’t matter either. A single server will never handle that amount of traffic.

What if you’re John Reese and you have this brilliant idea called BlogRush and you have the marketing resources to make it an overnight hit, then how do you decide beforehand how many servers you need?

I’ve architected web sites for Fortune 500 companies where sites regularly receive 5 million pageviews per day. In coming up with the architecture and the hardware requirements, one of my first questions for the client is: how much traffic do you expect in the next year? Typically they have no idea. This could be because they have poor web stats for their existing site or because they are launching a genuinely new service. That’s when it’s time to make educated guesses and use a thought process something like this:

Let’s use BlogRush as our hypothetical example. Assume that 10,000 blogs sign up initially. Given the value proposition of receiving free traffic I’m guessing that 99% of the blogs that do sign up have very little traffic, say 100 pageviews per day. But some of the more popular bloggers like ShoeMoney, John Chow, DoshDosh, Yaro Starak, Rosalind Gardner, Mike Filsaime and Terry Dean, are also going to sign up. Assume that these larger blogs each contribute on average 10,000 pageviews per day. The total load on the BlogRush servers would then be close to 2 million requests to show their widget per day. Since the blogger audience is global we can assume that traffic is spread about evenly across the day giving us an average of 23 requests per second.
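The arithmetic behind that estimate can be checked in a few lines of shell. All the numbers are the assumptions above, not actual BlogRush figures:

```shell
# 10,000 sign-ups: 99% small blogs, 1% large blogs (assumed averages).
small_blogs=9900;  small_pv=100     # pageviews per day each
large_blogs=100;   large_pv=10000   # pageviews per day each

total_per_day=$(( small_blogs * small_pv + large_blogs * large_pv ))
per_second=$(( total_per_day / 86400 ))   # 86,400 seconds in a day

echo "$total_per_day widget requests per day"     # 1990000, close to 2 million
echo "$per_second requests per second on average" # 23
```

Note that the average hides the peaks: even with a globally spread audience, real traffic is bursty, so the servers have to be sized for several times the average rate.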

A single well tuned web server can easily serve 23 requests per second, if the content is static. The problem is that the content served by BlogRush is very dynamic. The headlines to be served to each blog have to be gathered, the number of credits each blog has accumulated has to be tracked, and the number of links displayed for each blog has to be subtracted from the accumulated credits. The calculations get significantly more complex due to credits being tracked through 10 referral levels. And all this has to be done 23 times per second. Not a trivial problem.

One way that very large sites handle their traffic is through aggressive caching. You pay a service like Akamai $10k+ per month and 90% of your traffic problem goes away. But for this to work the majority of the content needs to be cacheable, and we have already determined that BlogRush does not fit that profile.

Let’s look at the next tier of servers. After a request hits the web server most of the heavy lifting is going to be done by the application or database servers. Since the functionality of BlogRush is very data intensive I would implement most of it in a few stored procedures inside the database.

This brings us to scaling the database server. There are basically two approaches here: Use one very large server to serve all requests and then have one mirrored standby server in case the first one goes down. This is the “big iron” approach and it’s pretty expensive. The other approach is to use several smaller database servers. This is seldom done in reality because few applications are suited to distributing the load across several databases that are not actively linked.

At first thought the BlogRush application seems to fall in this latter “not possible” category, since you presumably need to keep track of advertisement credits in one central location. But assume for a moment that the credits are spread out across several unlinked databases that are updated independently of each other. Sure, a given blog could run out of credits in one database while there are credits remaining in others. Over time that shouldn’t matter; all credits will accumulate and be used correctly.

So given the 10 servers BlogRush reportedly has, I would dedicate one to the members’ pages that you see when you log in to your account, one server to constantly polling new headlines from all blog feeds, and the remaining eight servers each running a web server and a database. A load balancer and firewall sit in front of it all to direct traffic to the least utilized server. As traffic grows you just add more servers.

According to Mike Filsaime, John Reese sees BlogRush becoming a $100 million company. But he has a very sizable traffic problem that goes along with a hugely successful business, and it’s growing exponentially. If the number of blogs that sign up to BlogRush increases by a factor of 10, then the load on the servers will increase more than 10 times. That’s what I call a scalability problem. A quite interesting one.

Note that I don’t have any direct insight into the systems or operations behind BlogRush. These are just my educated guesses based on my 10 years of experience as a technical architect for some rather large web sites.

A copywriting tip that anyone can implement

16 September, 2007 (22:03) | Copywriting, Life | By: Nick Dalton

Copy Protégé recently had a good tip that anyone – regardless of copywriting skills – can put to use immediately: After you have finished writing your copy, put it aside for 24-48 hours. When you haven’t consciously thought about the text for a day or two, read through it again with a fresh mind before you submit or publish it.

Terry Dean dispenses similar advice in his Monthly Mentor Club Newsletter: After you have done all your research, but before you start writing, take a break and let your subconscious work on the problem for a while.

While I’m by no means in the same league as Terry or the copywriting professionals at AWAI, I use a similar technique for my blog posts. I prefer to write using a nice pen and old fashioned paper. And since I have the great fortune to live in the beautiful Rocky Mountains, I work outdoors as much as possible; inspired by the sounds of nature.

My handwriting is much slower than my typing, so it gives me extra time to think about what I’m writing. Also I don’t worry about spelling, URLs, checking quotes and facts at this point. I just let the text flow. Then I let it sit for an hour, or a couple of days before I type it into my computer. Later as I’m reading and typing, I typically make significant improvements to the text.

Writing while offline also has the great benefit of minimizing distractions. As Matt says, it’s amazing the amount of work you can get done when you’re disconnected from the Internet.

The significance of digital watches

11 September, 2007 (19:36) | Life | By: Nick Dalton

My friend Ken McArthur has managed to stir up a controversy over his watch. Not an ordinary watch, mind you, a digital watch. Now digital watches are nothing to joke about. They have a very special meaning in life, the universe and everything – as any Douglas Adams devotee will tell you.

There are of course many problems connected with life, of which some of the most popular are – Why are people born? Why do they die? Why do they spend so much of the intervening time wearing digital watches?

Here is my advice to Ken: You should keep your digital watch. Not only does it define you, it actually defines the whole human race.

Far out in the uncharted backwaters of the unfashionable end of the Western Spiral arm of the Galaxy lies a small unregarded yellow sun. Orbiting this at a distance of roughly ninety-two million miles is an utterly insignificant little blue-green planet whose ape-descended life forms are so amazingly primitive that they still think digital watches are a pretty neat idea.

Database backups

9 September, 2007 (01:22) | Security, Tools, WordPress | By: Nick Dalton

Databases cannot be backed up as regular files: the database continuously writes to its files, so you are very likely to back up an incomplete or corrupt set of files. Therefore you need to use a special database backup program to create the backup. After the backup program has done its job you can copy, move and archive the backup files just like any ordinary file.
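For MySQL the standard such program is mysqldump, which writes the whole database out as an ordinary SQL file. A minimal sketch; the database name and user are placeholders, and the --single-transaction option assumes your tables use the InnoDB engine:

```shell
# Dump a database to a date-stamped SQL file, then compress it.
# --single-transaction takes a consistent snapshot of InnoDB tables
# without locking them while the dump runs.
backup_db() {
  db="$1"
  outfile="${db}_$(date +%Y%m%d).sql"
  mysqldump --single-transaction -u backup_user -p "$db" > "$outfile" \
    && gzip "$outfile"
}

# Once created, the dump is just a file; date-stamped names give you
# a rolling history you can copy, move and archive like anything else:
echo "my_database_$(date +%Y%m%d).sql.gz"
```

The date stamp in the filename is the design choice that matters: it stops each night’s backup from silently overwriting the previous one.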

Most web server control panels come with a database administration program; popular options are phpMyAdmin for MySQL databases and phpPgAdmin for PostgreSQL. Here are step by step backup instructions for some of the more popular control panels. And equally important: restore instructions.

The drawback with these database administration programs is that you have to perform the backup manually. If you’re like most people you will put off a manual chore like this until it’s too late. Therefore the goal should always be to automate your backups. If you’re running a WordPress blog on your web site you should definitely install the excellent WordPress Database Backup plugin. This plugin used to be distributed with WordPress 2.0 but was later mysteriously dropped from the standard distribution. If you have other programs on your web site that store information in a MySQL database then you need a full backup script like AutoMySQLBackup. Note that this latter solution requires some Linux shell knowledge to set up.

A great feature of both the WordPress Database Backup plugin and AutoMySQLBackup is the ability to email the backup files to yourself every day. I recommend that you set up a new Google Gmail account, which has over 2 GB of storage, to receive your backup files. Unless you’re a very prolific blogger, 2 GB should last you for quite a while. Then once a month or so you can log in to the email account and delete old files.
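To make the backups truly automatic, schedule the script with cron. A sample crontab entry; the script path is a placeholder for wherever you installed AutoMySQLBackup or your own backup script:

```shell
# Run the backup script every night at 3:15 am.
# Field order: minute hour day-of-month month day-of-week command.
# Edit your crontab with: crontab -e
15 3 * * * /usr/local/bin/automysqlbackup.sh
```

Pick an off-peak hour for your audience so the dump doesn’t compete with visitor traffic for database resources.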

Just like with regular backups, it is critical that you test your restore procedure occasionally. For this you can install MySQL or Postgres locally on your PC and restore the data to it. If the file restores without errors it is very likely that the backup is intact, so you don’t have to verify the contents of all the data; spot-checking a few recent records is usually enough.

Now go back something up!

Who’s got time to learn how to fish?

6 September, 2007 (06:32) | General, Life, Tools | By: Nick Dalton

Mike Filsaime has a thought provoking post today titled “Fish, Poles, Boxes, and Buttons…” where he argues that people do not want to learn how to fish, rather they want to push a button and get instant results. While I think this is a sad trend, it is a reflection of today’s society of instant gratification and short attention spans.

If I accept the fact that I cannot change society as a whole, how can I adapt my own products to this reality?

Most of my products are software tools so they fall in the “poles” category: My Article Tracker, My Ranking Tracker, rApogee and Unique Article Publisher. rApogee stands out from the group since it requires a service from a separate provider which costs $1000 per month. So my customers for rApogee are already very committed and thus my refund rate is very low.

These software tools were developed to automate work that would otherwise have to be done manually. But work is still required. If you are looking for a push button solution for your article marketing then I highly recommend the Content Spooling Network; articles are written and submitted for you automagically.

The Digital Security Report is an ebook and a series of videos that show you how to protect your digital products from free downloads. Definitely falls in the “fish” category. However there is an upsell where I offer to perform a security audit of your web site. It is expensive so it attracts people who do not have the time to learn how to do it themselves. I would categorize this as a solution in a box. It’s not quite a button since the web site owner still has to implement the suggestions in the security audit. Offering the implementation as a service would be possible too, but it would require that I be granted access to servers and the authority to make changes. So far I have not seen a lot of demand for this, but if this is a service you would buy, please let me know.

I am currently developing a line of professional WordPress plugins. More “tools/poles”… An easy way to make this a solution in a box would be to bundle the plugins with WordPress. One install and everything is configured and ready to go. Add a couple of optimized templates/themes and the best plugins preconfigured, and it’s definitely a box. Taking the last step to a “push button” offering is certainly doable too. Register domain names and add hosting just like Mike and Ray did for their 1,000 sites that sold out in a day. There will be some added logistics for transferring domains and hosting, but since this will be a premium offering the price should be able to support it.

Thanks for the ideas Mike!

Backup your web site

4 September, 2007 (14:57) | Security | By: Nick Dalton

If something were to happen to your web server right now, how long would it take you to restore all files and all functionality to your site? Maybe your web hosting plan includes daily backups, so why worry? There are many reasons to worry, and in this case a little proactive worry is a good thing.

When was the last time you actually tested the backup and restore function offered by your web host? Assuming that checks out ok, there are other reasons why you may need to quickly restore your web site to a different location: a billing dispute with your current web host, spam complaints against your server, or copyright infringements on your site. (If you are guilty of sending spam or using copyrighted material without permission, you certainly deserve to lose your site. But even allegations against an innocent site owner may lead to losing control of the site during the investigation.) The hosting company may also simply go out of business. The ways you can lose your web site are many and unpredictable.

There are several parts to a complete backup of your web site:

1. All the files on the web server

This one is the easiest. Just use your favorite ftp program and copy all the files in the web server root directory (httpdocs or public_html depending on your control panel) and all subdirectories to your PC. Do it right now.

2. Any software you have installed

If you have installed WordPress, a forum, a shopping cart or any other software that did not come with your hosting plan, make sure you have a backup copy of that software and the accompanying installation instructions. Don’t rely on the software vendor to be able to provide you with the exact old version of their software that you have installed. (When you need to quickly restore your web site is not a good time to upgrade software components. You have enough things to worry about as it is.)

It is also better to have the original installation files and reinstall them from scratch, rather than to rely on finding all the necessary installed files scattered on the web server file system.

3. Configurations

Don’t forget to keep notes of all configuration changes you make to your web site. This includes DNS settings, Apache configuration changes, and anything you setup in your control panel.

It is difficult to automate backups of configurations since you typically don’t have ready access to where the settings are saved. I just keep a text file on my PC where I make running notes as I make configuration changes to my sites. One text file per web server.

4. Databases

Backing up databases can be tricky. I’ll cover this topic in a subsequent post.


Any backup system is worthless if you cannot restore your data. (Actually it’s worse than worthless: you’ve wasted time backing up data and you’ve been lulled into a false sense of security.) The time to test if you can restore your data is not after disaster strikes. Make sure that you test your procedures before you need to rely on them.

A simple (but not complete) check is to restore all the files to a different subdirectory. Create a new “restore” directory and try to restore all files there. Then compare the contents of the files with the originals. You probably cannot “run” your web site from this restore directory but at least you have verified that your backup does contain valid data and that you are able to restore it.
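This whole check can be scripted. Here is a sketch using a local directory as a stand-in for your web root and a tar archive as a stand-in for whatever backup format your host or ftp routine produces:

```shell
# "live" stands in for your web server root; restore/ receives the backup.
mkdir -p live restore
echo "<html>home page</html>" > live/index.html

tar -cf backup.tar -C live .    # simulate the backup you already have
tar -xf backup.tar -C restore   # restore it to a different directory

# Compare restored files against the originals, file by file:
diff -r live restore && echo "backup verified"
```

diff -r walks both trees recursively and exits non-zero on any difference, so it catches both corrupted files and files missing from the backup.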

A more complete test would involve getting a separate web server to restore your files to. This can be a cheap shared hosting plan as long as it has the same software installed as your regular web server. If you are able to restore all your data and create a functional web site on this new web server, then you are in good shape. Repeat this exercise on a regular basis. If you have outsourced your web master tasks then this is a good exercise to ensure that the outsourced resource fully understands your site and your systems.

Keeping this second web site online and regularly updated using the procedures above makes it a disaster recovery site. Many large corporations have such a site available and ready to switch over to should disaster strike the main site. All you have to do is change the DNS settings to point to this site instead of the original site and you’re back in business while you work on restoring the main site. Even if the disaster recovery site is not quite up-to-date it is still probably better than having no site at all.

What are some tools and techniques that you use to backup your web server?