A useful list of standards for APIs
It is so easy to be solemn; it is so hard to be frivolous
I have just started the discovery phase of a new project. As we think about how we want the team to work, I am keen to make the case for cheerfulness being a core behaviour. This has, in part, been driven by thinking about the Royal Marines ethos having seen them at Beating Retreat earlier this month. A paper by Exeter University on the RM ethos states:
In referring to a common heritage, [humour] affirms the group and in the shared euphoria of laughter it binds members together ever more closely. Laughter in the face of danger is as important for group solidarity as symbols of collective identity such as the green beret.
We are not in any danger – this is IT after all – but I think we should strive for cheerfulness in the face of adversity.
Architects understand how to design and develop the functional elements of a system. Approaches like Test Driven Development (TDD) have made it easier to test functionality from the beginning of the development process. However, a lot of important considerations (security, scalability, agility, etc.) are often lumped together into a bucket labelled “non-functional requirements” and left to a point in the process where it is difficult and expensive to fix any problems.
Evaluating Software Architectures sets out the Architecture Tradeoff Analysis Method (ATAM) as an approach to ensuring that, as architects, we understand the qualities of a system. ATAM evaluations are a collaborative process between a project and an evaluation team that includes the following steps:
1. Present the ATAM. The evaluation leader describes the approach so that everyone understands what is going on.
2. Present business drivers. Someone from the project (the product owner if you are using some form of agile) describes the business outcome sought from the change being made. This gives you your first clue as to what the primary architectural drivers might be (e.g. high availability, time to market, or high security).
3. Present architecture. An architect describes the architecture, focusing on how it addresses the business drivers.
4. Identify architectural approaches. The architect then identifies the architectural approaches – and hopefully there are some recognisable patterns – that have been taken.
5. Generate quality attribute utility tree. The team then works out the quality factors that comprise system “utility” (performance, availability, security, modifiability, usability, etc.) and prioritises them.
6. Analyse architectural approaches. Taking the highest priority qualities, the team then tests the architecture (for example, if the architecture is focused on high performance, the team will probe the architecture to see if it will perform). During this step, the team notes any architectural risks, sensitivity points (parts of the architecture that have a significant influence on a quality), and tradeoff points (parts of the architecture where a change can affect a number of qualities).
7. Brainstorm and prioritise scenarios. The project and the evaluation team generate a larger set of scenarios (a form of user story) from stakeholders. This set of scenarios is prioritised via a voting process involving the entire stakeholder group.
8. Analyse architectural approaches (again!). This step repeats the exercise in Step 6, but using the highly ranked scenarios from Step 7. You are using these scenarios as test cases to confirm the earlier analysis.
9. Present results. Based on the information collected in the ATAM (approaches, scenarios, attribute-specific questions, the utility tree, risks, non-risks, sensitivity points, tradeoffs), the team presents the findings to the assembled stakeholders. The project should then act on the findings.
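The quality attribute utility tree described above can be sketched as a simple data structure. In ATAM, each quality attribute refines into concrete scenarios, each rated for business importance and architectural difficulty; the riskiest (high importance, high difficulty) scenarios get analysed first. The attribute names, scenarios, and ratings below are purely illustrative, not part of ATAM itself:

```python
# A minimal sketch of an ATAM-style utility tree: quality attributes
# refine into concrete scenarios, each rated (importance, difficulty)
# as High/Medium/Low. All scenario text here is made up for illustration.

RANK = {"H": 3, "M": 2, "L": 1}

utility_tree = {
    "performance": [
        ("Process a payment in under 2 seconds at peak load", "H", "M"),
    ],
    "availability": [
        ("Recover from a node failure within 30 seconds", "H", "H"),
        ("Apply a patch with no downtime", "M", "H"),
    ],
    "modifiability": [
        ("Add a new payment provider in under a week", "M", "L"),
    ],
}

def prioritise(tree):
    """Flatten the tree and sort scenarios so that high-importance,
    high-difficulty scenarios (the riskiest) come first."""
    scenarios = [
        (attr, text, importance, difficulty)
        for attr, leaves in tree.items()
        for text, importance, difficulty in leaves
    ]
    return sorted(
        scenarios,
        key=lambda s: (RANK[s[2]], RANK[s[3]]),
        reverse=True,
    )

for attr, text, imp, diff in prioritise(utility_tree):
    print(f"({imp},{diff}) {attr}: {text}")
```

In a real evaluation the ratings come from the stakeholder voting in Step 7, not from the architects alone; the point of the sketch is simply that the tree plus two ratings per leaf is enough to drive the analysis order.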
The first thing to say is that the book itself is a very academic, dry read. However, the process itself is relatively straightforward and common sense. The practices around scenarios and testing, for example, are familiar to anyone using an agile process.
I use the process as part of our Technical Design Authority process to test architectures before they become too developed (and therefore too difficult to change). It is also a useful tool to use when looking at different solution options. You can inform your decisions by testing each option. Lastly, involving users and other stakeholders in the process opens up the conversation about non-functional requirements (“do you really want 99.999% availability?”) earlier in the process.
The theme that Mark Ronson takes here has a lot of parallels with what we do as technologists. You can take a view that every new idea (virtualisation, cloud, social media, mobile, etc, etc) is just a rehash of something that has gone before. Alternatively, like Mark, you can see it as “…we take something we love and we build on it.”
Steven Johnson, in his book Where Good Ideas Come From, talks about the adjacent possible – the idea that innovation doesn’t appear from nowhere but comes from combining things we already know. Music is like this. So is enterprise technology. We shouldn’t dismiss this, but work with it as a natural part of the creative process.
You can’t learn any of it from books; you have to learn music on the bandstand. The first row is not close enough; you have to be up there on the bandstand.
Wynton Marsalis in Si Sos Brujo: A Tango Story (via David Byrne, How Music Works)
I was prompted to start a blog by Simon Wardley as a way to share my experiences as an IT professional in government. So here goes…here is my first post.
My plan is to update this regularly with something useful. I will post a longer entry at least once a month – I hope to do better than that but am starting with an MVP. Most of the time I will post something short:
- a useful link
- a quick review of a book
- a quote or comment
I should start with a warning – reading this blog may not help! As Wynton Marsalis observes, the best way of learning is by doing…
A classic resource if you are stuck for ideas on presenting data.