Twitter is over capacity

I like friendly error messages; I don’t think you can get much cuter than Twitter’s over capacity error message:

Twitter error message

I just love how the whale is smiling. It makes the error message so much better. I don’t think many companies would be game to use a whale on their capacity error page (if they even had one), but in this case it just works.

Austin Workshop on Test Automation (AWTA) 2009

WatirCraft are organising the Austin Workshop on Test Automation, to be held on 16-18 January 2009 in Austin, Texas.

I have been approved to attend. It means three long flights from Australia (about 25 hours each way) but I am really looking forward to attending and meeting different people who are involved in Watir.

I haven’t been to America before either, so it should be really good.

Watir podcast eight

Episode eight of the Watir Podcast was released today in which Željko Filipin interviews me about my experience in using Watir, and also about my newly announced role as the Watir Wiki Master.

Check it out if you’re interested: http://watirpodcast.com/alister-scott/

Automated testing quick wins, low hanging fruit, breathing space & oxygen

I’ve seen a lot of automated testing efforts fail, and have also had to personally deal with the repercussions and expectations that have been set by these failed efforts.

For example, I clearly remember the first day on my new job that I moved 1500km to Brisbane for. I was being introduced to the Project Director whose first words to me were:

“I have never seen automated testing succeed, so I will be watching you very closely!”

Not the best thing to hear on your first day in a job!

I’ve been thinking a fair amount about why automated testing fails to meet expectations. Sure, there is a lot of sales hype generated by test tool vendors and consultants, which doesn’t help, and there are also practitioners out there without the skills or discipline to deliver successful automated testing solutions, but there must be something else.

The problem is, I believe, that the time and effort needed to deliver a successful automated testing solution is huge. An automated testing framework might be deemed unsuccessful before it has even been given a chance to succeed! This is why I am a strong believer in first identifying some automated testing quick wins: some low-hanging fruit, pardon the idiom.

A quick win is something that requires a small amount of effort (input) for a large amount of gain (output). These are sometimes hard to find, but almost always deliver a good outcome: some breathing space.

An example I can use is a simple application monitoring script. A place I worked had a problem with the availability of a public-facing web app. Server monitoring wasn’t effective: the web/app server could be running fine, yet no one could log on via the web! There was no way to know when the app was unavailable to users without first getting complaints via email and phone calls from unhappy people.

It only took me a few hours to develop and test a Ruby/Watir script that I set to run continuously to monitor the web app’s availability. If the web app was unavailable, it would send off an email/SMS to the people responsible for getting it running again.
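For illustration, here’s a minimal sketch of the kind of script I mean, written against the current Watir API rather than the exact code I used back then. The URL, expected text, SMTP host and addresses are all placeholders, and I’ve left the SMS part out:

```ruby
# Minimal availability-monitor sketch (all hosts/addresses are hypothetical).
require 'watir'
require 'net/smtp'

APP_URL        = 'https://example.com/login'   # hypothetical app URL
EXPECTED_TEXT  = 'Sign in'                     # text we expect on a healthy page
CHECK_INTERVAL = 300                           # seconds between checks
RECIPIENTS     = ['oncall@example.com']        # hypothetical support contacts
SMTP_HOST      = 'mail.example.com'            # hypothetical mail server

# Open a browser, load the page and check it contains the expected text.
def app_available?
  browser = Watir::Browser.new
  browser.goto(APP_URL)
  browser.text.include?(EXPECTED_TEXT)
rescue StandardError
  false # any error (timeout, DNS, browser crash) counts as "down"
ensure
  browser&.close
end

# Send a plain-text alert email to the people who can fix the app.
def send_alert
  message = <<~MAIL
    From: monitor@example.com
    To: #{RECIPIENTS.join(', ')}
    Subject: Web app appears to be down

    The availability check against #{APP_URL} failed at #{Time.now}.
  MAIL
  Net::SMTP.start(SMTP_HOST) do |smtp|
    smtp.send_message(message, 'monitor@example.com', *RECIPIENTS)
  end
end

# Check forever, alerting whenever the app looks unavailable.
loop do
  send_alert unless app_available?
  sleep CHECK_INTERVAL
end
```

The point isn’t the particular code; it’s that a few dozen lines of a free scripting language and a free browser-driving library were enough to close the gap that server monitoring had left.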

The script was hugely successful. Downtime was reduced drastically, and because people could now see patterns in when the app was going down, it was easier to determine the cause of the availability problem. Since the script used only free/open source software (Ruby and Watir), there were no costs and no delays in acquiring the software needed. People’s reaction was ‘wow, we didn’t know we could do this so easily’.

I recently attended the Test Automation Workshop on the Gold Coast in Australia, and one presentation stuck in my mind. It was by Don Taylor, who used to work in my team in Canberra, and it was called “Oxygen for Functional Automated Testing”. He told us that quite a few people emailed him when the workshop program went out, asking: “What’s this tool called Oxygen?” But it wasn’t about a tool at all; it was about oxygen, the breathing space you need for successful automated testing.

And that’s what I consider to be the biggest output from these quick wins; it’s what automated testing needs to be successful. The breathing space generated by short-term quick wins has enabled me to put time and effort into creating robust automated testing frameworks, designed to be maintainable and successful in the long term. A whole heap of oxygen.

Mac Bank’s monochromatic website is a refreshing design change

As I’ve mentioned before, I surf the web a lot and I’ve got used to seeing a lot of different sites. Every now and then I come across a web page that, design-wise, completely blows my mind. Last night that web page was the homepage of Macquarie Private Wealth.

Macquarie Private Wealth home page

I think their monochromatic design, with its limited shades of purple, is aesthetically superb. It strikes me as bold, confident and professional. It’s a brave move: most modern websites use a lot of colour, but they’ve resisted, and in my opinion it’s worked.

Perpetual home page

Just compare it to Perpetual’s home page, which also uses monochromatic images but lots of colour elsewhere. I don’t find that page has anywhere near as much impact.

Automated testing SWOT analysis

I attended the Test Automation Workshop at Bond University on the Gold Coast (Australia) last week (LinkedIn Group). It was good to see what others in the field are doing and share my views on automated testing. In the final session, participants were asked to share their own SWOT analysis on automated testing as it currently stands. Here’s mine (remember, it’s my personal view only):

(S) Strengths

  • Testing community
  • Level of knowledge

(W) Weaknesses

  • Automated testing is WAY TOO COMPLEX: too much code, too many spreadsheets, too many system programming languages in use, too many vendorscripts
  • Requirements-based testing has flaws (see my diagram)
Requirements Based Testing Venn Diagram

(O) Opportunities

  • Open source testing tools growth (Watir, Selenium, FIT)
  • Use them at home, write about them! Share your knowledge.
  • Done well, automated testing gives you breathing space to do other things.

(T) Threats

  • Management’s expectations: replacing manual testing, ‘codeless’ automated test frameworks. Instead, focus on doing better testing and doing it more quickly.
  • Poor practitioners: they give automated testing a bad name (possibly because they don’t have a personal development framework).
  • Bad metrics: don’t compare with something you wouldn’t have done anyway (e.g. ‘saved 10,000 hours of execution’), or rely on metrics based around bug counts.

If you disagree (or agree) with any of these leave a comment and let me know why!