Wednesday, December 19, 2012

Proud to be a Java Code Geek!

Hi All,

Just to say I'm proud to be part of Java Code Geeks - if you are interested in all things Java, go take a look and learn how to code! I'm taking a look at Android Apps over Christmas!

Friday, December 7, 2012

Software testing is common sense - right?

I'm talking here about functional testing, not performance testing, automated testing or penetration testing where you may need specialist technical skills. I may be playing devil's advocate - I'll let you decide.

I read and hear zealots advocating various tools and techniques, plying their trade. Consultancies all have their own 'new', bespoke way of doing things. It's special and better than the rest - but it isn't cheap.

I sometimes wonder, does it all need to be this complicated? Do I need to learn these techniques to know what to do and how to behave?

I don't think so - there I have said it - what do you think?

I think there are a handful of things a tester needs to have at the forefront of their mind when testing:

  • what are the business (or otherwise) goals of the software?
  • what problem is the software trying to solve?
  • what risks are there - how likely are they and how serious are they?
  • have empathy with the customer/user - get inside their heads - behave like them
So, do testers need the latest new-fangled jargon and terminology? Well, I think you know where I stand. I think testers need:
  • To be intelligent
  • To have business acumen (when testing business software)
  • Common sense
  • Good communication skills
  • People skills
  • Attention to detail
  • (you can't make a silk purse out of a sow's ear no matter what framework you use)
Testers should free themselves from the processes sometimes imposed on them to focus the mind on the problem at hand - the software and the business goals of that software.

Tuesday, December 4, 2012

Is your website mission critical?

For some businesses, their website isn't just a static or even dynamic information-giving tool. For some, it is more than just an advert providing information on how to get in touch. For some, it is the absolute life-blood of their business.

A few years back I worked for an airline. I can't remember the exact figures, but the website turned over approximately £300M per annum and that was approximately, for the sake of argument, 80% of the business.

If the website went down for any length of time, it was very serious indeed. In a very competitive marketplace where margins are tight, it could spell the end. Not only is there the immediate cost of lost sales and the added cost of having to increase the head count in the call-centre, but the damage to the brand is considerable. If customers have a bad experience, they can be quite unforgiving and go elsewhere. The cost of trying to win back those customers is very high indeed, and may even be impossible if your competitors look after them well. Also, disgruntled customers tend to be more vocal than the happy ones - so you can be sure they will be telling anyone who will listen not to bother going to XYZ airlines!

So what can you do?

Well, I have blogged before about software testing and whether you need it or not. As the owner of a company which advocates and provides specialist software testing services, I would say you need it, wouldn't I!?

So software testing can go a long way towards ensuring that your website, the life-blood of your company, keeps running smoothly.

However, it isn't the be all and end all. Sometimes, even the most robust software, well written and thoroughly tested can go down. Why? There are lots of reasons. Perhaps it relies on a flaky 3rd party service. Perhaps there is a power cut and the UPS only has 3 hours of juice in it. Maybe there is a hardware failure.

In these situations, day or night, you need to know about it, so you can fix it before your customers even notice if at all possible.

With its expertise in test automation, my company can provide a bespoke system to monitor your e-commerce web site 24x7. In the case of an airline, for example, we could drive bookings through the website, checking the data returned during the booking process - did the seat assignment work, did the insurance booking work, did the car hire booking work, did the hotel booking work and, of course, did the flight booking work! If any of these fail, an alert email and/or text can be sent to the support desk and they can investigate. Get in touch if this is of interest to your business.
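To make the idea concrete, here is a very stripped-down sketch in plain Java - this is not our actual monitoring system, and the URL and the alerting are just placeholders - of the kind of check that would run on a schedule:

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class SiteMonitor {

    /** Returns true if the address answers with an HTTP 2xx/3xx status within the timeout. */
    static boolean isUp(String address, int timeoutMillis) {
        try {
            HttpURLConnection conn = (HttpURLConnection) new URL(address).openConnection();
            conn.setConnectTimeout(timeoutMillis);
            conn.setReadTimeout(timeoutMillis);
            conn.setRequestMethod("HEAD");
            int status = conn.getResponseCode();
            return status >= 200 && status < 400;
        } catch (Exception e) {
            return false; // timeout, DNS failure, connection refused, malformed URL...
        }
    }

    public static void main(String[] args) {
        String site = "https://www.example.com"; // placeholder - your e-commerce site here
        if (!isUp(site, 5000)) {
            // in a real system this is where the email/text to the support desk goes
            System.out.println("ALERT: " + site + " appears to be down");
        } else {
            System.out.println(site + " is up");
        }
    }
}
```

Our real monitoring drives whole bookings end-to-end rather than a simple HTTP check, but the shape is the same: check on a schedule, and shout loudly when something fails.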

Monday, December 3, 2012

Twitter Scheduler Application

I'm quite new to Twitter, but believe it is a powerful tool for small businesses such as mine.  I wanted to make it even more powerful for my software testing consultancy business, without finding it taking up too much of my time.

The functionality I wanted was to be able to set up my planned tweets first thing in the morning, before the 'real' work began - and then leave some sort of application or robot working in the background to actually send the tweets when I wanted them sent.

This enables me to write a few tweets and get them sent at key points during the day (or night) for my target markets - for example when the West Coast USA is finishing the working day - I'm UK based.

There are commercial tools which have this functionality, but they are paid for and also I am told quite fiddly to set up.  Plus I wanted to write my own!

It turns out it is very easy....

I wrote the application in Java using the twitter4j library.  I needed only a handful of classes:

I had a Tweet class which had fields such as message, dateTimeToSend and hasBeenSent.

I had a TweetScheduleReader which utilised the Java CSV library.  This reads my planned tweets in from a CSV file - a list of messages, the time to send each one and a flag recording whether it has been sent yet.  These get converted into a List of Tweets held in memory.
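I didn't post the Tweet class itself, but it's tiny - something along these lines (the date format and the fromCsvLine helper here are just illustrative; the real code used the Java CSV library rather than a plain split, which won't cope with commas inside messages):

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class Tweet {

    private final String message;
    private final Date dateTimeToSend;
    private boolean hasBeenSent;

    public Tweet(String message, Date dateTimeToSend, boolean hasBeenSent) {
        this.message = message;
        this.dateTimeToSend = dateTimeToSend;
        this.hasBeenSent = hasBeenSent;
    }

    public String getMessage() { return message; }
    public Date getDateTimeToSend() { return dateTimeToSend; }
    public boolean isHasBeenSent() { return hasBeenSent; }
    public void setHasBeenSent(boolean hasBeenSent) { this.hasBeenSent = hasBeenSent; }

    /**
     * Builds a Tweet from one line of the schedule file, assumed here to look like:
     *   message,yyyy-MM-dd HH:mm,false
     */
    public static Tweet fromCsvLine(String line) throws Exception {
        String[] fields = line.split(",");
        SimpleDateFormat format = new SimpleDateFormat("yyyy-MM-dd HH:mm");
        return new Tweet(fields[0], format.parse(fields[1]), Boolean.parseBoolean(fields[2]));
    }
}
```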

Then there is a class with a main method which sets up a Timer to run the TimerTask on a schedule:

timer.schedule(timerTask, new Date(), DELAY);

So, starting from the moment this line of code is called, the TimerTask's run method is executed every DELAY milliseconds (in my case every minute).  So what gets run?  See below.

I had a class which extended TimerTask and the run method was this:

public void run() {

    //Login to twitter - the consumer key/secret are picked up from twitter4j.properties
    Twitter twitter = new TwitterFactory().getInstance();
    String accessToken = getSavedAccessToken();
    String accessTokenSecret = getSavedAccessTokenSecret();
    AccessToken oauthAccessToken = new AccessToken(accessToken, accessTokenSecret);
    twitter.setOAuthAccessToken(oauthAccessToken);

    //read the list of tweets from CSV
    List<Tweet> tweetsToSend = scheduleReader.getTweets();
    if (tweetsToSend == null || tweetsToSend.isEmpty()) {
        return; //nothing scheduled - try again on the next run
    }

    Date currentTime = new Date();
    for (Tweet tweet : tweetsToSend) {
        if (currentTime.after(tweet.getDateTimeToSend()) && !tweet.isHasBeenSent()) {
            //send it!
            try {
                //I appended the date to make the tweet unique - else it gets rejected by Twitter
                twitter.updateStatus(tweet.getMessage() + " " + new Date());
                //mark it as sent - this updates the tweets held in memory
                tweet.setHasBeenSent(true);
            } catch (TwitterException e) {
                System.err.println("Failed to send tweet: " + e.getMessage());
            }
        }
    }
}


I found this tutorial very useful regarding the authorisation:

Wednesday, November 28, 2012

How to measure the effectiveness of your testers - some metrics suggestions

A long time ago I studied for an MBA, and one of the Management Accounting papers provided us with a lot of data about some sales teams.  We were supposed to analyse the information and state which sales staff were performing well.  It was a bit of a trick question really - drawing the exam candidate into seeing which sales people made the most sales and therefore announcing that they were the ones most worthy of a pay rise/bonus.  Of course, some of the sales staff had peachy sales regions to cover and it would have been hard NOT to sell stuff - others were harvesting stony ground, and any sales they achieved should have been well rewarded.

That's the problem with metrics - often we measure what is easy to measure and use it to motivate our staff - this can be a very bad thing and drive behaviour which is not optimal for the organisation.

Now I work in IT and software testing in particular, and I have come across organisations looking to collate metrics for their software testing effort while working at my software testing consultancy business.

I hear people jump to metrics like the number of bugs raised per day - I have even read blogs suggesting such metrics.  This is an interesting point, I believe, as many crowd-sourcing software testing organisations use a pay-per-bug model - perhaps they are measuring (and rewarding) the wrong thing because it is easy to measure?

Clearly this is not a good metric to measure - which hopefully the sales team example illustrates.

But it isn't easy to come up with good ones.  I'll give it a go, for people to shoot down or maybe build on - perhaps between us we can come up with some good ones!

So what is the behaviour we want to reward, and therefore what metrics might we measure to encourage that behaviour?

Well, how about we want to find bugs earlier in the SDLC and avoid serious bugs in production?  Therefore how about these metrics?

  • For a given software module, measure the number and seriousness of bugs raised in the live environment.  The lower this number, the better the testers and developers who worked on that module during development and QA phases.  This sort of metric is measurable using bug tracking systems such as Bugzilla, Jira and others.  However, be careful - if the testers are finding bugs right up to the release date, maybe you are releasing your software too soon and haven't given the testers a long enough QA phase.
  • Further to the above metric, analysis of bugs found in the live environment can be conducted, such as whether they should have been easy, moderate or hard to find in the testing phase.  Clearly, if testers who tested a module are letting 'easy to find' bugs through to production, something needs to happen - look at the way they test, some training or maybe they didn't have enough time?
  • For a QA phase, measure the number of bugs raised over time.  The earlier the bugs are raised the better the tester - we want to reward this behaviour as quicker feedback results in quicker and cheaper fixes.
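To make the first metric concrete, here is a toy sketch of how a severity-weighted "escaped to live" score per module could be computed.  The Bug class and the weighting are just illustrative - in practice this data would come out of your bug tracker, such as Bugzilla or Jira:

```java
import java.util.Arrays;
import java.util.List;

public class EscapedDefectMetric {

    /** A minimal bug record: the module it was raised against, a severity from
     *  1 (cosmetic) to 5 (critical), and whether it was found in the live environment. */
    public static class Bug {
        final String module;
        final int severity;
        final boolean foundInLive;

        public Bug(String module, int severity, boolean foundInLive) {
            this.module = module;
            this.severity = severity;
            this.foundInLive = foundInLive;
        }
    }

    /** Severity-weighted total of the bugs that escaped to live for one module -
     *  the lower the score, the better that module was tested before release. */
    public static int escapedSeverityScore(List<Bug> bugs, String module) {
        int score = 0;
        for (Bug b : bugs) {
            if (b.module.equals(module) && b.foundInLive) {
                score += b.severity;
            }
        }
        return score;
    }

    public static void main(String[] args) {
        List<Bug> bugs = Arrays.asList(
                new Bug("booking", 5, true),   // a serious bug that escaped to live
                new Bug("booking", 2, false),  // caught during the QA phase - doesn't count
                new Bug("search", 1, true));   // a minor live bug
        System.out.println("booking: " + escapedSeverityScore(bugs, "booking")); // booking: 5
        System.out.println("search: " + escapedSeverityScore(bugs, "search"));   // search: 1
    }
}
```

Even a simple score like this rewards the right behaviour - keeping serious bugs out of production - rather than the easy-to-measure bug count.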

I'm not totally happy with these metrics, but when you start to think about this topic, you will find it is decidedly tricky!

Please share your ideas below!


Thursday, November 22, 2012

A Selenium/WebDriver example in Java

A couple of years back, I was pitching for some work and the client wanted to see how I would tackle a real world problem.  They asked me to automate some tasks on a web site.

The task was to go to various woot web sites and to read the product name and price of the offer of the day.

I wrote a little bit of Selenium code and thought I'd post it here in case any of it is useful to anyone.

I got the job - so it can't be too bad.

First up I defined an interface to represent a woot page:


import com.thoughtworks.selenium.Selenium;

/**
 * This interface defines the methods we must implement for classes
 * of type Woot.  Woot web sites have one item for sale every 24 hours.
 * @author Tony
 */
public interface Woot {

    /**
     * Defines the interface of the method we use to get the price
     * of the item for sale on a Woot website
     * @param selenium the selenium object we pass in which is used to interact
     * with the browser/web page
     * @return String representation of the price of the item for sale
     */
    public String getPrice(Selenium selenium);

    /**
     * Defines the interface of the method we use to get the product name
     * of the item for sale on a Woot website
     * @param selenium the selenium object we pass in which is used to interact
     * with the browser/web page
     * @return String representation of the product name of the item for sale
     */
    public String getProductName(Selenium selenium);
}


Then I implemented this interface a few times to represent the actual behaviour of the various woot pages - here, for example, is the WineWoot page:

import com.thoughtworks.selenium.Selenium;

public class WineWoot extends BaseWoot {

    /**
     * Constructor
     * @param url pass in the url of the web site
     */
    public WineWoot(String url) {
        super(url);
    }

    /**
     * Implementation of the method to get the price of the object for sale on
     * the Woot web site.
     */
    public String getPrice(Selenium selenium) {
        //if you need to update the xpath to the piece of text of interest - use the xpather firefox plugin
        String xPath = "//html/body/header/nav/ul/li[8]/section/div/a/div[3]/span";
        selenium.waitForCondition("selenium.isElementPresent(\"xpath=" + xPath + "\");", "12000");
        return selenium.getText(xPath) + " ";
    }

    /**
     * Implementation of the method to get the product name of the item for sale
     * on the Woot web site
     */
    public String getProductName(Selenium selenium) {
        //if you need to update the xpath to the piece of text of interest - use the xpather firefox plugin
        String xPath = "//html/body/header/nav/ul/li[8]/section/div/a/div[2]";
        selenium.waitForCondition("selenium.isElementPresent(\"xpath=" + xPath + "\");", "12000");
        return selenium.getText(xPath) + " ";
    }
}

Note - back then I used the xPather plugin - this doesn't work for recent versions of firefox, so now I use firebug.

Then I wrote the actual 'test':


import com.thoughtworks.selenium.*;

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.util.ArrayList;
import java.util.List;

/**
 * This class is where we define tests of the Woot web sites
 * @author Tony
 */
public class TestWoots extends SeleneseTestCase {

    /** Output stream for our results file */
    private BufferedWriter out;

    /** Our list of Woot web sites we want to test */
    private List<BaseWoot> sites = new ArrayList<BaseWoot>();

    /**
     * This is where we do any set up needed before our test(s) run.
     * Here we add the list of Woot web sites we want to test and we create an
     * output stream ready to write results to file
     */
    public void setUp() throws Exception {
        sites.add(new BaseWoot(""));
        sites.add(new ShirtWoot(""));
        sites.add(new WineWoot(""));
        try {
            //let's append to our file...
            FileWriter fstream = new FileWriter("out.csv", true);
            out = new BufferedWriter(fstream);
            out.write("Site, Product Name, Product Price");
            out.newLine();
        } catch (Exception e) {
            System.err.println("Error creating a file to write our results to: " + e.getMessage());
        }
    }

    /**
     * Tests getting the item name and price for the item for sale on each Woot web site we test.
     * We see the results of the test in std out in the form of a table and we also write the
     * results to a csv file.  If there are any errors getting the information, this is displayed instead.
     * How to run me: open a command prompt and, from the directory where our selenium server is
     * located, type: java -jar selenium-server-standalone-2.0b3.jar (or equivalent) and wait for the
     * server to start up.  Then just run this unit test.
     */
    public void testGetItemsAndPrices() throws Exception {
        //for each Woot site in our list of sites we want to test
        for (BaseWoot woot : sites) {
            //let's put this in a try catch block as we want to try ALL the sites - some may be down or slow...
            try {
                selenium = new DefaultSelenium("localhost", 4444, "*firefox", woot.getUrl());
                selenium.start();
                selenium.open("/");
                //print out the information we need - the site, the title of the item for sale and the price
                String siteUrl = woot.getUrl();
                String productName = woot.getProductName(selenium);
                String productPrice = woot.getPrice(selenium);
                //sometimes there are commas which mess up our csv file - so we substitute with ;
                productName = productName.replace(",", ";");
                System.out.print("website: " + siteUrl + " ");
                System.out.print("product name: " + productName);
                System.out.println("price: " + productPrice);
                out.write(siteUrl + ", " + productName + ", " + productPrice);
                out.newLine();
            } catch (Exception ex) {
                //here we may see that the web site under test has changed and the xpath to the price
                //or product name may need to be changed in the Woot class
                System.out.println("problem getting the data for: " + woot.getUrl() + " " + ex.getMessage() + " ");
            } finally {
                if (selenium != null) {
                    selenium.stop();
                }
            }
        }
    }

    /**
     * Any tear-down we need to do to cleanup after our test(s).
     * Here we just stop selenium and close the output stream
     */
    public void tearDown() throws Exception {
        super.tearDown();
        if (out != null) {
            out.close();
        }
    }
}

I know this code worked for a couple of years, and I have made some minor changes to get it to work with the current web sites - all I had to do was get the latest selenium-server-standalone.jar for it to work with the latest firefox and also to update the xpaths to the price and product name information.  That would be a good improvement to the code - to make it data driven - such that we could just update the xpaths in a config file rather than changing the hard-coded ones I have used here.  That was the only feedback from the client actually.

Anyway - hope you find it useful - and if you need expert automation engineers get in touch with Doogle!

Tuesday, November 20, 2012

To automate or not to automate - that is the question

OK, first of all, my main area of work is software functional test automation, so you might think I would advocate the automation of everything and claim it is the panacea to war and pestilence. Not so. In fact, I think the exaggerated claims made by some software vendors and automation practitioners damage the industry.

However, as you might expect, I certainly believe that software test automation done the right way can be very valuable and cost effective.

I have a client I have been working with for 4 years now and they are still getting a great ROI from the Silk Test project.  What are the critical success factors of the project?  Here are some characteristics which I believe have made it so successful:

  • The software under test is quite stable - certainly the GUI is pretty stable and so is the API. This means that the tests are not too hard to maintain. Also, the way the tests have been written means that changes to the GUI require code changes (where possible) in only a few places.
  • The test suite replaces many many man hours of manual testing. This is because tests have been chosen where lots of data changes and needs to be verified - very slow and difficult to do manually - a great way to use computing power which will check the same things over and over again without getting bored.
  • The tests are written in an OO way such that a number of versions of the SUT can be tested using the same test code.
  • The SUT is enterprise/distributed software which can be configured in many ways - we are able to test a huge number of permutations with the same test software.
  • The software under test is deployed on many virtual machines using different operating systems (e.g. Windows Server 2008, Windows Server 2008 R2, Windows Server 2003 and so on) - you can imagine how long it would take to regression test changes in the SUT on all these combinations - hence the benefit of automation increases - as it saves even more man-hours of testing.
  • Doing a full regression using automation gives much quicker feedback to the development team than regression testing manually - quicker feedback equals quicker to market and cheaper to bug fix.
  • The regression suite gives the customer a great deal of confidence and frees up manual testers to be more creative.  They are less bored and more motivated - and they find more bugs as a consequence.
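To illustrate the OO point, here is a toy sketch in Java (the real suite is written in Silk Test, and the locators here are made up): the trick is to keep the test logic in a base class and override only what differs between versions of the SUT.

```java
/** Base abstraction: the test logic is written once against this class. */
abstract class LoginPage {
    abstract String usernameFieldLocator();
    abstract String passwordFieldLocator();

    /** Shared logic - in the real suite this would drive the GUI via the locators. */
    String describe() {
        return "login via " + usernameFieldLocator() + " / " + passwordFieldLocator();
    }
}

/** One subclass per SUT version: only the bits that differ are overridden. */
class LoginPageV1 extends LoginPage {
    String usernameFieldLocator() { return "//input[@id='user']"; }
    String passwordFieldLocator() { return "//input[@id='pass']"; }
}

class LoginPageV2 extends LoginPage {
    String usernameFieldLocator() { return "//input[@name='username']"; }
    String passwordFieldLocator() { return "//input[@name='password']"; }
}

public class VersionedTests {
    public static void main(String[] args) {
        // the same test code runs against whichever version we happen to be testing
        LoginPage page = (args.length > 0 && args[0].equals("v2"))
                ? new LoginPageV2() : new LoginPageV1();
        System.out.println(page.describe());
    }
}
```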
As always, let me know what you think!

Sunday, November 18, 2012

Trying to get more traffic!

I'm going through the process of claiming my blog with Technorati.  Apparently I need to post this code on my blog, so here goes. CVQVPZAGTDGE

If I do get loads more traffic I'll write back here to let you know it works!

Thursday, November 15, 2012

So - you don't need software testing?!

As promised, I'm feeling inspired to write another post - I'm getting a bit of a bug here - quite appropriate for a software tester!

This one is a bit of a list really of software 'disasters' - just in case you don't think you need software testers - if you do need software testing - be sure to check out my software testing company in Devon, UK!

These are in no particular order, but first up is Tesco in the UK.  They apparently didn't do sufficient cross-browser testing and IE9 didn't work properly for their banking customers for a whole week!  Customers were left unable to access their accounts.  The story is here:

Another UK bank also had issues recently.  This time it was NatWest, who didn't know how much money was in people's accounts after an 'upgrade'.  Commentators here put this down to insufficient testing:

A famous bug caused the Ariane 5 rocket to explode - the rocket cost £7 billion to produce.  The bug is described in this link - basically, they tried to fit a 64-bit number into a 16-bit space:
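The Ariane code was Ada rather than Java, but the same narrowing hazard is easy to demonstrate in any language: squeeze a value that needs more than 16 bits into a 16-bit type and you get garbage.

```java
public class OverflowDemo {
    public static void main(String[] args) {
        long velocity = 40000L;              // a value that needs more than 16 bits
        short converted = (short) velocity;  // 16-bit signed range is -32768..32767
        System.out.println(converted);       // prints -25536, not 40000
    }
}
```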

Software plays an increasing role in our lives - in cars and aeroplanes for example which more than ever are 'fly-by-wire'.  For example, Toyota recently had to recall 400,000 cars due to a software glitch which caused problems with their braking:

Another famous and very serious software problem in the medical field was the Therac-25 radiation machine which led to overdoses - more can be read about that one here:

OK - some of these are very serious indeed and not all software bugs will have such serious consequences.  However, if you are a software development company, or have a public facing web site for your business, or even develop software for in-house use in your corporate organisation, can you afford to ignore the need for software testing?

I'd love to hear of your examples and comments - please post them below - and I'll add them to the list!

Near shoring software testing

Hi - I'd like to write a bit about this subject as it is dear to my heart - as I run a software test house specialising in test automation in the UK and conduct a lot of my work remotely.

There are of course a lot of reasons to require the services of a test house/team of freelance testers - let's just assume for now that you are sold on the need for some testing already - maybe that subject could be my next post - I think it will be actually:

  • You can't afford a full time team as you don't need people 5 days a week throughout the year - you just need to handle the peaks and troughs
  • Your in-house team needs help sometimes - maybe due to workload or maybe due to a skills gap
  • You may be required by a customer to get a third party view of your development work - they may not be happy for you to sign off quality in-house
There are many other reasons - post some - I'll add to this list!

Then of course you have a lot of choices how to out-source:
  • You could go off-shore - this may prove to be cheaper - at least on the surface
  • You could get some contractors to come and work in your offices
  • You could try crowd-sourcing
My belief is that near-shoring has a lot of benefits compared to the other options:
  • There are no cultural differences between your company and the out-sourced company
  • There are no language barriers
  • There is no time-zone difference (in the UK at least)
  • Arranging a face-to-face meeting can be done quickly and easily - and is generally quite low cost - it may even be possible to avoid overnight stays etc
  • You may not have office space for contractors and, let's say you are based in a high wage part of the UK or elsewhere, near-shoring could have cost savings in terms of wages
  • I find that working remotely often has benefits for the customer - one major one is I find people don't get dragged into lots of internal meetings which can soak up a lot of time and energy - the test house can focus on the task at hand
  • Out-sourcing in general provides a new view on your software - the test house doesn't suffer from possible 'group-think' which may have emerged in your own organisation
  • Crowd sourcing has its place - there are some very good reasons to use it sometimes - for example you may want coverage geographically, with many browser types and platforms - crowd sourcing is great for that - I just wonder if sometimes it is useful to get a very stable team who get to know your product in detail
Anyway - I'll probably add to this post and it would be great to incorporate your views.  It can be quite a contentious subject - so let's have some good debate!  At the end of the day, there is room for all models - it's just a question of picking what is right for you.

Wednesday, November 14, 2012

Software testing tips

OK here we go with my first real post.

As I have mentioned, I want to share software tips - especially software testing tips.

Here are some - I hope to add to this list over time.

Need to be more agile in your development? Need Continuous Integration? I have written a tutorial on NUnit/Jenkins.

I found this useful about how to view .pages files on a PC/Windows machine.  Basically, rename the file as a .zip and open the preview image inside.

If you need Jenkins to run from the DOS prompt (say SYSTEM user doesn't have the permissions needed for your build).

.NET 4.0/4.5 inconsistencies - an example of why we need to test on different platforms.

Some more on this subject - need to upgrade to .NET4.5 to get some WPF bugs fixed?

Actually, this sort of thing, having to regression test on many platforms/browsers, is a very good argument for test automation - and that's exactly what Doogle Ltd is expert at - automated software testing experts.

Useful conversion from pdf to word:

Another useful thing! I needed an electronic signature - I used fountain pen free for the iPad.

Need to get your head around Business Driven Development (BDD)? I found this useful.

Quick tip. Copy-paste the content from your web site and spell/grammar check in your word-processor of choice.

Need to share source code with others in an SVN repository? I like unfuddle:

Need to convert between file types for say an inline email? Try

Tip of the week: take screen shots and videos of applications under test and share with Jing:

Top tip: Need to check your web site is up? I use this:

Need to simulate poor network for testing purposes? I use TMnetSim:

Need to view a very large text file (a log file maybe?) and Notepad can't cope? Try Large Text File Viewer:

Old PC or laptop running slow? Try an excellent Linux OS - it's free!

Here's another software testing tip. Ever needed to take a screen shot with an iPad? Here's how:

I used to use XPather a plugin for firefox (but it no longer supports the latest versions of firefox).  So I have been using Firebug - I find this excellent to find Xpaths to use in my Selenium scripts.

Testing Tip: If you need to test an iPad, iPhone or iPod app, install the .ipa with i-Funbox