Tuesday, October 23, 2007
This has been, perhaps, my most successful project to date, and certainly the one of which I am most proud. Several things worked well: we followed an agile approach, with Crystal as its philosophy and Extreme Programming as its practice, and this delivered on its promises. Having said that, we didn't do much pair programming, and that was perhaps a serious omission, since each developer became an expert in only one area of the system. Perhaps more importantly, we only had an on-site customer for about twenty percent of the whole time, which meant that we guessed requirements far too often. And although we released frequently, at least once per month, we received nowhere near enough feedback along the way. If I could change one thing, then, it would be far more involvement by the customer's decision makers and business experts, rather than just their technical staff.
Comparing this project to earlier ones, I see that upper management can (and did) make a great difference to a project. If a team is just left to "get on with it", it is highly unlikely to magically materialise something that adds business value. Upper management needs to help everybody stay focused on delivering clearly articulated business value, communicate about progress and plans, and collaborate in steering towards them. Collaboration, I have learned, is not just about putting people in the same room and telling them to work together, but about ensuring that people really care about, and are committed to, the success of the project, and put in personal effort to help one another be as effective as possible. "Team players", then, are not hired or born, but cultivated.
The end of this particular project has actually caused me to reflect on a few of the project's successes:
Unlike just about every other project I have worked on over the past 20 years, this is one I am actually quite proud of. There have been ups and downs along the way, but overall I sense that the project has moved forward with a regular heartbeat, has resulted in a very happy client, and has taught me a tremendous amount both technically and professionally.
In fact, the project has been so inspiring that I am going to take its core concepts and rework them over the coming months into a product with much wider appeal. As a sneak preview: in the first instance I will be developing several technologies that considerably simplify the construction of CRUD applications. Although many developers look down on CRUD applications, I have found that non-trivial ones are very challenging, and very valuable to businesses.
It won't be until the end of the year that I start full time on this, since I am deep in the process of complete hand-over of my current client work. Starting in the new year, though, I will be developing pretty rapidly and full-time, and then releasing my output as open source software.
One vital point is that this is not intended to be a technology that can be thrown at any project to make it successful. Rather, the intent is to help reduce the technical burdens that plague many projects, so that developers and business experts can spend more time on communication and collaboration in working out business-value-laden requirements. The developers will still have to be technical, but should become more productive, freeing them to spend more time on systems analysis with customers. Furthermore, the technology will, I hope, match well with the agile principle of being able to keep pace with requirements change, so that the collaboration between developers and business experts can remain ongoing and fruitful. That is, as well as eliminating big up-front design, we can let go of big up-front requirements analysis.
The ultimate aim is then to establish a reputation as an effective collaborator in the development of practical software, which helps companies focus on delivering business value rather than being held back by technical distractions. From this, I hope to establish a new direction in my consultancy career: helping people to use the technologies I have released, in conjunction with several others that bring clear benefit. The focus will always be on understanding evolving business value and how that maps to changing requirements in the software systems supporting that business value. It will be, for me, a fresh challenge, and I hope, for future clients, a rewarding experience.
Wednesday, January 10, 2007
I have worked on several agile projects and found they went really well. We talked to the customers, found out what they wanted at each stage (and even let them change their mind sometimes), delivered frequently, and proved it all worked with a fine body of tests.
These projects were clearly successful – so maybe agile methodologies were the answer. I did have one niggling concern, though. I noticed it was always the best people we put onto these projects – not just in terms of their programming abilities, but also in their analytical ability to work out with customers what the requirements were (albeit only in small-scale increments).
These developers, then, had somehow gained the requirements gathering skills seen in analysts of old – they weren't drawing lots of code-level diagrams, nor were they simply writing down what the customer said. Rather, they were doing real analysis of their interactions with customers.
I often wondered what would have happened if the folks involved had not been so highly skilled in their analysis abilities. Eventually, I found the answer (at least in a single project, which shall remain nameless to protect the guilty). A customer asked a team with which I was working to develop some business software over two years. We were to deliver it incrementally – taking an agile approach – with deliverables every month or two. For each increment, the team negotiated requirements just-in-time with business folks, gave demonstrations to the customer, developed lots of tests to prove the software worked properly, and was given the thumbs up at every stage that we were doing a great job.
Then, after two years (and about 15 releases) we were told "that isn't what we wanted at all".
Well, what a knock to the head that was. How could we have been “doing a great job” throughout the whole project, and then have delivered the wrong thing at the end? We had been "fully agile": we had customer involvement and approval at every step. It didn't add up.
It took me a while, but eventually I realised that we were taking customer "thumbs up" as a sign that we were doing the right thing, whereas the customer was taking our frequent deliverables as a sign that the project had a heartbeat rather than stalling, and hence was in no need of interference. As one business person later said to me, “I am too old to understand all this – we leave it to you younger technical guys”. Nobody was questioning whether we were actually building the right system. All the little pieces were correct, but the overall system was completely wrong.
Eventually, for this and many other reasons, the company hired a new, very senior manager to oversee most of the business operations. He turned out to be just what was needed: somebody with the authority and insight to keep asking "is this strategically correct?" That is, he remained very much focused on continually asking not "can we make a computer system that meets tactical obligations?" but, more importantly, "what are the strategic business needs towards which our computer systems should contribute?"
His questions were quite simple, but they caused a major shake-up. The two year agile project was frozen (a euphemism for canceled). The project started again, but this time with a small group of very experienced people from both the business side and the technical side, continually driving requirements decisions from strategic business needs. That is, not just asking "What and How?" but also "Why?"
I see this as the real heart of the problem in many of the projects in which I have been involved. We were forever trying to get people to pin down the "what, and how?" requirements of the computer systems in a specification that we could then translate into code. This is not analysis; this is transcription.
Analysis – the part developers rarely have the background and (perhaps more importantly) the authority to do – is the art of asking "Why?" all the time.
One thing I have noticed throughout my career is the increasing absence of “systems analysts”. In the old days, there used to be folks, separate from the programmers, whose job (at least in theory) was to have a foot in both the technical and the business camps. The idea was that they could speak the language of both sides.
Analysts were there for a reason. Software development requires things to be defined in a pretty orderly way, whereas the business world is a pretty chaotic place – it doesn't have all the clearly defined rules and structures that programmers wish it had. The role of the analyst was to try to make order from chaos.
The analyst could speak to the business folks in their own terms, eliciting just enough formality to work out what was needed, and at the same time speak to the technical folks with the precision and certainty they hope for.
Analysts have gone out of style. Nowadays we have the generic term "developer": a technical guru who can also speak directly to customers. Perhaps the idea of a “do it all” developer came about when object-oriented languages made us think we were modeling the world as it really is.
Unfortunately, in the business world the objects are often not easy to work out unless you know a lot about the business area you are working in. You are therefore supposed to ask the business experts. But the business experts are experts at doing their jobs; they often don't have the necessary experience or inclination to "pin down the rules" the way we would like.
The answer is, of course, for the developers and the business experts to be able to work together. This is where we find that most developers sorely lack the “requirements gathering” skills that good analysts developed over many years. Developers tend not to be good at dealing with "fluffy thinkers" and race to capture the “domain objects” as quickly as possible. To achieve this, many developers grasped at graphical object oriented notations to “express” requirements agreed with business experts.
Getting the business folks and developers talking is a good thing. But focusing those discussions on capturing requirements in terms of unambiguous object-oriented diagrams meant that developers often ended up viewing the diagrams as something that can be translated straight from boxes and lines into code. Supporting this view, a number of software tools vendors advocated the effectiveness of “round-trip engineering”, wherein the diagrams and program code are simply seen as different notations for the same precise model.
The goal, then, became getting the diagrams tied down, so the developer could then directly reflect them in program code.
This meant we didn't have much analysis going on any more – rather, the focus was on demanding a diagrammatic specification, adorned with supportive text where needed, that was precise enough to be translated into a programming language.
This led inevitably to big, fat, object-oriented, diagram-heavy specification documents that most business folk could never write well, and that most developers complained had big holes and contradictions. This meant that, despite the “progress”, neither side was happy.
Agile methodologies came to the rescue with the obvious idea that big problems can be solved in small steps: focus on a bit at a time and your chances of going wrong are minimized. Furthermore, if you keep the customer closely involved in each step, and measure your progress to make sure everybody is happy, people can see good progress, and even learn from each step as a way of deciding what the next step should be about.
Fresh from my PhD, I started working in Geneva at a large bank producing financial indices. They had an old legacy system they wanted to replace, and my PhD research was just the thing to do it. I met one of the senior managers there, and it was giddy times for the first few days as together we dug deep into the existing system and came up with all sorts of fresh strategies for addressing the problems within it.
Plans in hand, we went to visit the head of development – and there the problems began. The man was a genius – alas, he was also obsessed with pet projects such as devising his own programming language with which the new system should be built. Pointing out that this was a fun but irrelevant technical distraction from the actual issues of the index calculation engine soon put me in his bad books. The next few months involved me being forced to battle against the willpower of the head of development and his dreams of a personally designed programming language. I failed, and so did the project.
Ultimately, the head of development was moved sideways, and I was given higher authority to address the real problems. With a team of around half a dozen top notch developers, we learned all we could about index calculations and produced monthly releases to the bank for the next six months. They seemed quite pleased with the results, but by now the top level in the bank had already lost faith in the Geneva branch, and eliminated dozens of people and their projects – me included.
This was frustrating – since we had received quite some acclaim for the effectiveness of the indexing system we were producing. However, the very top level in the company decided they could manage quite well without it. This made me ponder just how real those business needs could have been. I was starting to learn lessons that even my four years of deep study at university had not taught me.
By now, even with my own limited reflective powers, I was starting to see troubling patterns across all of the development projects in which I had been involved.
Was I a jinx? After all, most of the projects I had been on never really amounted to much, and those that did were very murky and unsatisfying compared to what I imagined they could potentially have been. Maybe I was the problem – maybe the projects would have been great successes if I hadn't been there to mess them up. Or maybe I had just been unlucky, and all my projects were atypical. Or maybe, just maybe, I was waking up to the sobering reality that most software projects are far less pleasant on the inside than they appear from the outside. I decided I needed to investigate.
I left the bank and went back to university to pursue a PhD – I wanted to immerse myself in understanding where things were going wrong. Now, most folks in the department turned out to be working on things far removed from “real life” – they were investigating parallel garbage collection schemes for Haskell, and the like.
Thankfully, my own supervisor (Stuart Kent – now at Microsoft) turned out to be a wonderfully practical man. We spent oodles of time researching modeling notations, software architectures, design patterns, and all sorts of interesting things. My PhD ended up exploring why and how software architectures decay over time, resulting in fragile systems that people are afraid to change – it even offered a field-proven solution, taking an agile, refactoring-driven approach to restoring adaptability. Importantly, my research advocated a reflective process, wherein the development team monitors customer satisfaction and adapts the process accordingly.
It was great – I felt that I now understood lots more about the theories underpinning software development, and what can go wrong. With my new PhD in hand, I was ready to return to in-the-trenches software development and apply my adaptive process in the real world.