Austin Workshop on Test Automation (AWTA) 2009

WatirCraft is organising the Austin Workshop on Test Automation, to be held on 16-18 January 2009 in Austin, Texas.

I have been approved to attend. It means three long flights from Australia (about 25 hours each way) but I am really looking forward to attending and meeting different people who are involved in Watir.

I haven’t been to America before either, so it should be really good.

Watir podcast eight

Episode eight of the Watir Podcast was released today in which Željko Filipin interviews me about my experience in using Watir, and also about my newly announced role as the Watir Wiki Master.

Check it out if you’re interested: http://watirpodcast.com/alister-scott/

Automated testing quick wins, low hanging fruit, breathing space & oxygen

I’ve seen a lot of automated testing efforts fail, and have also had to personally deal with the repercussions and expectations that have been set by these failed efforts.

For example, I clearly remember the first day of the new job that I had moved 1500 km to Brisbane for. I was introduced to the Project Director, whose first words to me were:

“I have never seen automated testing succeed, so I will be watching you very closely!”

Not the best thing to hear on your first day in a job!

I’ve been thinking a fair amount about why automated testing fails to meet expectations. Sure, there is a lot of sales hype generated by test tool vendors and consultants, which doesn’t help, and there are also practitioners out there without the skills or discipline to deliver successful automated testing solutions, but there must be something else.

The problem, I believe, is that the time and effort needed to deliver a successful automated testing solution is huge. An automated testing framework might be deemed unsuccessful before it has even been given a chance to succeed! This is why I am a strong believer in first identifying some automated testing quick wins, some low hanging fruit, pardon the idiom.

A quick win is something that requires a small amount of effort (input) for a large amount of gain (output). These are sometimes hard to find, but almost always deliver a good outcome: some breathing space.

An example I can use is a simple application monitoring script. A place I worked had a problem with the availability of a public facing web app. Server monitoring wasn’t effective: the web/app server could be running fine while no one could log on via the web! There wasn’t a way to know when the app was unavailable to users without first getting complaints via email and phone calls from unhappy people.

It only took me a few hours to develop and test a Ruby/Watir script that I set to run continuously to monitor the web app’s availability. If the web app was unavailable, it would send off an email/SMS to the people responsible for getting it running again.
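To give a flavour of how little code this takes, here’s a minimal sketch of that kind of monitoring loop. The names (`monitor_app`, the helper names in the commented wiring) are my own for illustration; the page check and the alert are passed in as callables, since the real script did its check by driving the login page with Watir and sent its alerts by email/SMS.

```ruby
require 'time'

# Sketch of an availability monitor. The real check drove the app's login
# page with Watir (open a browser, goto the login URL, verify the logon
# form is present); here it is just a callable so the loop logic stands
# on its own.
def monitor_app(cycles:, interval: 0, check:, alert:)
  failures = []
  cycles.times do
    up = begin
      check.call
    rescue StandardError
      false # a crashed check counts as the app being down
    end
    unless up
      failures << Time.now
      alert.call("Web app unavailable at #{failures.last.iso8601}")
    end
    sleep interval
  end
  failures
end

# Hypothetical wiring; page_has_logon_form? and send_email_and_sms are
# placeholders for the Watir check and the notification code.
# monitor_app(cycles: 1, interval: 300,
#             check: -> { page_has_logon_form? },
#             alert: ->(msg) { send_email_and_sms(msg) })
```

Keeping the check and the alert pluggable also means the loop itself can be tested without a browser or a mail server in sight.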

The script was hugely successful. Downtime was reduced drastically, and since people could see patterns in when the app was going down, it was easier to determine the cause of the availability problem. Since the script used only free/open source software (Ruby and Watir), there weren’t any costs, or time taken, to acquire the software needed. People were like ‘wow: we didn’t know we could do this so easily’.

I recently attended the Test Automation Workshop on the Gold Coast in Australia and one presentation stuck in my mind. It was by a guy called Don Taylor, who used to work in my team in Canberra, and his presentation was called “Oxygen for Functional Automated Testing”. He told us that quite a few people emailed him when the program for the workshop went out asking: “What’s this tool called Oxygen?” But it wasn’t about a tool at all; it was about oxygen, the breathing space you need for successful automated testing.

And that’s what I consider to be the biggest output from these quick wins. It’s what automated testing needs to be successful. The breathing space generated by short term automated testing quick wins has let me put time and effort into creating robust automated testing frameworks, designed to be maintainable and successful in the long term. A whole heap of oxygen.

Mac Bank’s monochromatic website is a refreshing design change

As I’ve mentioned before, I surf the web a lot and I’ve got used to seeing a lot of different sites. Every now and then I come across a web page that, design-wise, completely blows my mind. Last night that web page was the homepage of Macquarie Private Wealth.

Macquarie Private Wealth home page

I think their monochromatic design, with limited shades of purple, is superbly elegant. It strikes me as bold, confident and professional. It’s a brave move: most modern websites use a lot of colour, but they’ve resisted, and in my opinion it’s worked.

Perpetual home page

Just compare it to Perpetual’s home page, which also uses monochromatic images, but with lots of colour elsewhere. I don’t find this page has anywhere near as much impact.

Automated testing SWOT analysis

I attended the Test Automation Workshop at Bond University on the Gold Coast (Australia) last week (LinkedIn Group). It was good to see what others in the field are doing and share my views on automated testing. In the final session, participants were asked to share their own SWOT analysis on automated testing as it currently stands. Here’s mine (remember, it’s my personal view only):

(S) Strengths

  • Testing community
  • Level of knowledge

(W) Weaknesses

  • Automated testing is WAY TOO COMPLEX: too much code, too many spreadsheets, too many system programming languages in use, too many vendorscripts
  • Requirements based testing has flaws (see my diagram)
Requirements Based Testing Venn Diagram

(O) Opportunities

  • Open source testing tools growth (Watir, Selenium, FIT)
  • Use them at home, write about them! Share your knowledge.
  • Done well, automated testing gives you breathing space to do other things.

(T) Threats

  • Management’s expectations: replacing manual testing, ‘codeless’ automated test frameworks. Instead, focus on doing better testing, and doing it quicker.
  • Poor practitioners: give automated testing a bad name. (Possibly because they don’t have a personal development framework)
  • Bad metrics: don’t compare with something you wouldn’t have done anyway (e.g. ‘saved 10,000 hours of execution’), and don’t rely on metrics around bug counts.

If you disagree (or agree) with any of these leave a comment and let me know why!

Create fancy wiki home pages with Confluence, Lozenge and Nuvola icons

I think that it’s important to have nice looking wiki pages, especially for those high level pages that are viewed by a large audience. I’ve put this post together to explain how to setup Atlassian’s Confluence wiki with some freely available macros and icons to make fancy wiki pages to impress.

Prerequisites

1) Atlassian Confluence

You need a running instance of Atlassian’s Confluence wiki – you can get a free 30-day trial, or a free personal server license for personal use.

2) Content Formatting Macro

You need to have the free Content formatting macros installed. This can be easily done in Confluence under Administration->Plugin Repository.

3) Nuvola Icon sets attached to Confluence

You need to download the Nuvola icon set locally and attach the icons to your Confluence instance. Nuvola is an elegant looking icon set released free under the LGPL.

Download this file, then unpack it (on Windows use something like 7-Zip), then upload all the 48×48 icons to new wiki pages in Confluence using the Confluence File Uploader, which conveniently runs directly from your browser.

I created five separate wiki pages under a master page called ‘Icons’. I called these icon pages: ‘Icons Actions’, ‘Icons Apps’, ‘Icons Devices’, ‘Icons Filesystems’ and ‘Icons Mimetypes’. You can create these pages under any space, so if you have an admin type space you’re probably best to use that.

You should put {gallery:columns=8} in each icon wiki page so the icons appear as thumbnails. You can put {children} in your Icons page to display the other child icon page names.

Creating the Actual Page

Creating the actual page is pretty easy once you have all the icons uploaded. The page uses three macros: columns, sections and lozenges. It’s a matter of reading the doco for each and experimenting with what looks best. I’ve posted my wiki source below.
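To give an idea of the shape, here’s roughly what a page combining those three macros looks like. I’m writing the lozenge parameters from memory, so treat them as placeholders and check the macro doco; the page titles, blurbs and icon names are just examples pointing at the Nuvola icon pages created earlier.

```
h2. Team Home

{section}
{column:width=50%}
{lozenge:title=Getting Started|link=Getting Started|icon=Icons Actions^player_play.png}
New starters: environments, accounts and who to ask.
{lozenge}
{column}
{column:width=50%}
{lozenge:title=Automated Testing|link=Automated Testing|icon=Icons Apps^terminal.png}
Our Ruby/Watir framework and how to run it.
{lozenge}
{column}
{section}
```

The icon references use Confluence’s page^attachment form to point at attachments on the icon pages, which is what makes uploading the Nuvola sets once, centrally, worthwhile.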

You can then easily create a page template with some icons and blank labels so that others who use your wiki can easily create pages that look like this.

The Result

The final result


Short, friendly URLs

Here’s a good comparison that shows why I like short, friendly URLs.

HP Quality Center

URL: https://h10078.www1.hp.com/cda/hpms/display/main/hpms_content.jsp?zn=bto&cp=1-11-127-24_4000_100__
Length: Poor: 98 characters
Friendliness: Poor: There’s no mention of Quality Center or what it is, just lots of meaningless numbers and codes. Also, does a product page really require a secure page?
Google Search Blurb: Poor: “HP Quality Center is designed to address the wide-ranging challenges that…”

Google search for HP Quality Center

Atlassian JIRA

URL: http://www.atlassian.com/software/jira/
Length: Very good: 39 characters
Friendliness: Excellent: Includes both JIRA and software, so you know what it is. No unnecessary numbers or codes.
Google Search Blurb: Excellent: “Browser-based bug, issue, task and defect tracking system, and project management software solution used for open source and enterprise projects.”

Google search for Atlassian JIRA

It’s easy to see which one is better.