
Delivering a vision

Vision is hard. Often it can feel fluffy and woolly. The vision for our group was “Fulfilling customer needs, through innovative trusted solutions, that we take pride in”, but it wasn’t tangible. So I established some quality pillars for it: one around discovering behaviours, with a bit of a BDD vibe to it, and a second around quality attributes (NFRs).

Sharing the vision

When I shared this vision, I did have a picture of how I thought we could deliver with quality and talked about what this might look like. I also highlighted the various changes, improvements and pathways for us to get there. I tried to ground it in practicalities.

I acknowledged that it would take years and that, in truth, we might never truly achieve it. I also acknowledged that things could, and probably would, change. It was a vision, not an expectation.

Delivering for the vision

To make the vision relevant, I’d refer back to these pillars with each initiative I pushed. This meant that when pitching an idea or sharing progress in sprint reviews, I could highlight how it tied in with our vision. This helped share the reasoning and get buy-in.

Some of my initiatives hadn’t had time to fully settle in and show their value, so they couldn’t be declared massively successful, but the ones tied to my vision never flopped. They at least made *some* improvement and stepped us slightly forward.

Having a vision is nice, but you need to back it up. I’ve rarely felt that the visions I’ve been sold meant anything, so I take great pride in having been involved in setting a quality vision and then delivering tangible, real changes that could help us achieve it.

A slight reality check

It is worth calling out that I called this “Delivering a vision” and yes, in a year I had made a number of improvements, but we were still so far off. I also failed to get people to recall the vision. Heck, our quality vision was meant to be part of our group’s overall vision, and I was probably the only one from our group who could remember it.

I would balance that by pointing out that I understood and acted upon our vision. For most, it was word flotsam; for me, it was a destination to steer towards.


User journeys in refinement

Thinking about user journeys in testing isn’t a particularly new topic (although it’s probably practised a lot less than we would like to admit). I suspect even rarer is considering the user journey in design and planning, at least once engineering teams are involved. This is something that I’ve had little chance to explore in practice, but it really interests me.

I’ve explored a few techniques in my own time, from feature mapping to story mapping. I like the structure of story mapping (and how it ties to example mapping), letting us consider the workflow, the MVP and priorities.
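To make that structure concrete, here’s a toy sketch of the shape a story map takes – the feature, activities and slices are all invented for illustration, not taken from a real project.

```python
# A toy story map: the backbone is the user's workflow (left to right),
# with stories stacked under each activity, then sliced horizontally
# into releases. Everything here is invented for illustration.
story_map = {
    "backbone": ["Find product", "Add to basket", "Check out", "Track order"],
    "stories": {
        "Find product":  ["search by keyword", "filter by category"],
        "Add to basket": ["add single item", "adjust quantity"],
        "Check out":     ["pay by card", "guest checkout", "apply voucher"],
        "Track order":   ["view order status", "email notifications"],
    },
    # Horizontal slices capture the MVP and what can wait.
    "slices": {
        "MVP": ["search by keyword", "add single item",
                "pay by card", "view order status"],
    },
}

# Walking the backbone shows the thinnest end-to-end slice first.
for activity in story_map["backbone"]:
    mvp = [s for s in story_map["stories"][activity]
           if s in story_map["slices"]["MVP"]]
    print(f"{activity}: MVP -> {mvp}")
```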

Once, the PO from our group took a really interesting approach that I quite liked.

We started off by discussing the parts of our larger feature that were up for consideration. We talked through personas. We drafted a few workflows. Then we prioritised what we thought was most important and dug into those workflows some more. Which was the most important one?

This gave the team more enthusiasm for, and ownership of, what we were working on. However, six months on, real grumblings and discontent crept in. We’d abandoned trying to identify user journeys, because the business had already decided the workflows and priorities around two years earlier. We were lagging well behind the design of the system: we picked up a solution to implement, then tried to inject the “user” into our stories a little artificially.

What made this frustrating was how awesome the UX guy was at being open to collaboration. But that lag between a workflow being agreed, mock-ups being demoed to customers and then, later, us getting involved really broke our engagement with trying to solve customer issues. Yes, I pushed using personas and we included user benefits in our demos, but it was “tacked on”.

My takeaway from this experience was that when the development teams got involved in considering the user early, it made a real difference. Who is this for? What are they trying to achieve? However, this only works if you get development, test, UX/design and product in the room together.

Get these key stakeholders involved in mapping out the problem and desired solution.


User Journeys & Testing

I wanted to share a really cool activity that I did with a couple of development teams a few months ago.

Two teams had (roughly) a sprint of testing, with a push to be user focused. I was brought in to lead this effort (with no notice!). The end result was a number of new bugs, insights into our users, better bug reports and developers running a demo of what they’d learnt. It was pretty awesome!

As developers freed up, we put them into pairs, and then the three of us (each pair plus me) would have a 60-90 minute workshop to design some tests.

We started by talking about the feature the pair were looking at. What is it? Why is it used? Who uses it? How does it fit into a daily workflow? On hand were some personas that I’d previously created, to help guide us on who would perform different actions and how they might be doing things in parallel.

Once we had a rough idea of what someone is trying to achieve when using our feature, we plotted out a journey using sticky notes. When we hit decision points, they became notes for future journeys (e.g. is this a first-time or a returning user?). When we realised that we had different people involved, we mixed up the sticky-note colours.
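If it helps to picture the board, here’s a rough sketch of how one of those journeys could be captured in code – the personas Alex and Sam come up again below, but the steps and decision points here are invented for illustration.

```python
from dataclasses import dataclass
from typing import Optional

# A sticky-note journey: ordered steps, each tagged with the persona
# performing it (the sticky-note colour on our board). The steps and
# decisions are invented; only the persona names come from the post.
@dataclass
class Step:
    persona: str                    # who performs this step
    action: str                     # what they're trying to do
    decision: Optional[str] = None  # branch point spawning a future journey

journey = [
    Step("Alex", "signs in for the first time",
         decision="returning user? -> separate journey"),
    Step("Alex", "creates a report and shares it with the team"),
    Step("Sam", "opens the shared report"),
    Step("Sam", "comments whilst Alex is still editing"),  # parallel work
]

for step in journey:
    note = f"[{step.persona}] {step.action}"
    if step.decision:
        note += f"  (future journey: {step.decision})"
    print(note)
```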

Eventually we had our journeys, or tours. The teams then optionally wrote them up… or added annotations to our board. They then set up accounts for each persona, and we used a shared, customer-like environment with no debuggers or sims in sight (sort of – we had a way to inject ourselves to look into bugs without polluting the environment).

To execute the tests, the devs would pair: one person drove whilst acting as Alex, then they’d huddle around the other person’s screen as Sam did their tasks, and back again. All the while, they took notes on the experience, what they learnt, and specific actions and timings. Not everyone was perfect at this (it’s a skill), but the group embraced it well.

I bounced around to help, and also picked up one task myself, live streaming my testing so people could watch how I’d work.

The feedback from the group afterwards was great. Not only did it find new issues and show a new way to test, but people enjoyed it. Developers enjoyed hands-on testing. Whilst obviously there are things I would have done better, especially given the timeline, it was definitely a success.

A final parting thought: I’d never been able to get this type of testing onto the agenda, and I wasn’t convinced that I ever would. However, that didn’t mean that I’d forget about it. You never know when you’ll have your chance to shine, so always have something in your back pocket.


Collaborative Test Strategy

I’ve always disliked writing test strategies and plans. Reviewing them was even worse: just tedious, long documents that tell me very little, and usually almost a copy-paste, as projects tend to be pretty similar. I did play with the one-pager, but still, it felt like a pointless exercise. We had a ways of working that incorporated testing.

In fact, inspired by Robbie Falck, I wrote our test strategy as a ways of working (WoW). That was well received by the teams, but there was a push from the business to have documented test strategies per epic.

Not the WoW I used, but we did map the various stages of a story and the activities performed.

I ended up taking inspiration from the one-pager: I organised a meeting with the team and we filled it in together. I then carried on, mixing the format up as I went. Eventually, I started seeing the value. It wasn’t the document – that was still as pointless as ever. The value was in the conversations we had, the risks identified and the outcomes of the discussions about what we’d need to do.

I liked doing this in phases. In our first session we typically started from a diagram of the system. What are we changing? What is impacted? What is the technology? (Although later in my time I started by asking… what is the problem we’re solving?) I’d also try and get a feel for what we knew and didn’t know. We’d ask about API changes – do we need to do threat modelling? Finally, if there were barriers to testing the feature (kit, environment, etc.), we’d highlight those early.

Deliberately blurry – but there’s a model of the system, discussion points and then some notes on key questions we want to ask.

I could then catch up with the team, or a couple of folk, again and ask: what new things have we learnt? What new risks have appeared, and what progress have we made on the potential risks from our first chat? This was, again, run in a collaborative way. By this point we should know even more about the architecture, so I could tap into performance and load testing as well.

Whilst I evolved my templates for facilitating, I did explore different methods; it depended on the feature and our knowledge. I loved a diagram, but it varied: sometimes a series of prompts to ask questions, sometimes a mind map of SFDIPOT. This was to try and get us asking some slightly different questions and keep things fresh. The point is the discussion, not filling in a form – which is what leads to copy-paste strategies.

A mix of approaches
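As a flavour of those prompts, here’s an illustrative set in code form – these aren’t my exact questions, just the kind of thing each session asked. (SFDIPOT is the Structure, Function, Data, Interfaces, Platform, Operations and Time set of lenses from James Bach’s Heuristic Test Strategy Model.)

```python
# Illustrative facilitation prompts for the phased strategy chats – not
# the exact template used, just the flavour of the questions per session.
SESSION_PROMPTS = {
    "first session": [
        "What is the problem we're solving?",
        "What are we changing, and what is impacted?",
        "What is the technology? Any API changes - threat modelling needed?",
        "What do we know, and what don't we know yet?",
        "Any barriers to testing (kit, environments, data)?",
    ],
    "follow-up": [
        "What new things have we learnt?",
        "What new risks have appeared?",
        "How have the risks from our first chat progressed?",
        "Do we know enough about the architecture to talk performance and load?",
    ],
}

def run_session(phase: str) -> None:
    """Print the prompts for a given session phase."""
    print(f"--- {phase} ---")
    for prompt in SESSION_PROMPTS[phase]:
        print(f"* {prompt}")

run_session("first session")
run_session("follow-up")
```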

In terms of planning *how* to test everything, we focused on that per story. If we identified dedicated testing activities, they became their own story. We shouldn’t need a document saying that we’ll do performance tests and unit tests; they are part of the definition of done or the acceptance criteria.
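To make that concrete, here’s a hypothetical story carrying its testing needs directly – the story, criteria and numbers are all invented for illustration.

```python
# A hypothetical story ticket: the testing lives in the acceptance
# criteria and definition of done, not in a separate strategy document.
story = {
    "title": "Export report as CSV",  # invented example story
    "acceptance_criteria": [
        "Export succeeds for reports up to 100k rows",
        "Special characters are escaped correctly",
        "A 100k-row export completes within 30 seconds (performance)",
    ],
    "definition_of_done": [
        "Unit tests cover escaping and row limits",
        "Exploratory testing notes attached to the ticket",
    ],
}

for section in ("acceptance_criteria", "definition_of_done"):
    print(section.replace("_", " ").title())
    for item in story[section]:
        print(f"  - {item}")
```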

So I’m happy to do away with test strategy documents; they are still worthless in my view. What isn’t worthless is facilitating discussions involving the various team members to identify the risks and challenges we’ll face, then documenting the testing needs through the usual tickets.

At the end of the day, if we’re trying to shift left, then why have distinct documents about testing? Instead, yes, let’s talk about the testing and intertwine that with what is required to close a story.