Where are we?

July 21st, 2013

For a multitude of reasons, we’ve moved our blogging to agileteams.wordpress.com. See you there! (and feel free to drop us a note if you can recommend an interesting wordpress.com blog to follow)

self-consistency

November 12th, 2011

Some of the theories around human judgment and decision-making suggest that humans find it hard to be consistent. Every so often I’ve had a chance to assess my own consistency, and confirmed that yes, I’m very human! Tonight I had a chance to re-test that theory on myself.

I just finished reviewing 11 conference abstracts. There are some good topics I’m looking forward to: I hope the other reviewers and the Program Committee agree with me about accepting them 😉 Reviewing and submitting my ratings on each abstract took me an average of 17 minutes. Since I did them in my spare time, in subsets over the past two weeks, it seemed possible I hadn’t judged the first few quite the same way as I did the rest, due to ‘decision fatigue’ or other factors. So before I called my opinions ‘final’, I decided to do a quick check on whether I’d been consistent.

The review process calls for 3 binary yes/no judgments, where a ‘no’ indicates a submission isn’t a good fit in some way; 6 Likert-scale ratings from 1-5, where 1=strongly disagree, 2=disagree, 3=neutral, 4=agree, 5=strongly agree; specific comments to the author and the committee; and a final vote (reject, weak reject, neutral, weak accept, accept). So I put all 11 entries into a spreadsheet, with my yes/no and 1-5 ratings and my votes.

The first sanity check I did was to calculate an average and a median for 5 of the Likert values, to compare against the 6th, the “overall” rating. I was inconsistent on 1 of the 11. After re-reading that abstract and comparing it to the others, I adjusted its overall score upward 1 notch to align it better with the 5 specific ratings.

Then I built a quick pivot table cross-tabulating my overall ratings against my votes. This let me see whether entries receiving the same vote all had the same overall rating, or a range of them. Of the 11, it seemed at first that I was inconsistent on 3. For instance, 3 entries initially had the same overall rating, but 1 of the 3 had a different vote; the explanation was that it had a NO, which made it (IMHO) not acceptable, while the other 2 had only YESes. In 2 other cases, I had given different votes to abstracts rated the same. I re-read them to see whether my vote was wrong, my rating was wrong, or other factors were at play. I ended up adjusting the overall rating on one and the vote on the other.
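
(If you’d rather script these two checks than eyeball a spreadsheet, here’s a minimal C# sketch of both. The sample data and the flag-a-gap-of-a-notch-or-more threshold are my own illustrative assumptions, not the actual reviews or the committee’s criteria.)

```csharp
using System;
using System.Linq;

// One reviewed abstract: 5 specific Likert ratings (1-5), an overall
// rating (1-5), and a final vote.
record Review(string Id, int[] Specific, int Overall, string Vote);

class ConsistencyChecks
{
    static double Median(int[] xs)
    {
        var s = xs.OrderBy(x => x).ToArray();
        return s.Length % 2 == 1
            ? s[s.Length / 2]
            : (s[s.Length / 2 - 1] + s[s.Length / 2]) / 2.0;
    }

    static void Main()
    {
        var reviews = new[]   // hypothetical entries, not the real abstracts
        {
            new Review("A01", new[] { 4, 4, 5, 4, 4 }, 4, "accept"),
            new Review("A02", new[] { 3, 2, 3, 3, 2 }, 4, "weak accept"),
            new Review("A03", new[] { 2, 1, 2, 2, 2 }, 2, "reject"),
        };

        // Check 1: does each overall rating track its 5 specific ratings?
        foreach (var r in reviews)
        {
            double mean = r.Specific.Average();
            if (Math.Abs(r.Overall - mean) >= 1.0)   // flag a gap of a notch or more
                Console.WriteLine(
                    $"{r.Id}: overall {r.Overall} vs. mean {mean:F1}, median {Median(r.Specific)}");
        }

        // Check 2: a textual stand-in for the pivot table; shows the spread
        // of overall ratings given to each vote.
        foreach (var g in reviews.GroupBy(r => r.Vote))
            Console.WriteLine(
                $"{g.Key}: {string.Join(", ", g.Select(r => r.Overall).OrderBy(x => x))}");
    }
}
```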

What does this prove? Well, not much – it’s a small, and not-unbiased, study in decision-making! But I did draw two conclusions.

  1. Even with good guidance, clear rating scales, and experience as a reviewer, I am still humanly inconsistent.

2. Doing just a wee bit of simple analysis gave me more uniform ratings and votes.

I think it was well worth the 15 minutes it took to build the spreadsheet, and the additional 20 minutes to review, adjust, update, and submit my revised ratings. I feel more confident that I made better judgments, and I figure I owe it to the submitters to try to be as objective as possible in assessing their hard work!

The final validation, of course, will be seeing how well my votes align with the program committee’s final choices. For that, I’ll have to wait for the program to be announced …

3 INCOSE takeaways on requirements

July 1st, 2011

Last week’s INCOSE International Symposium was refreshing. The sessions and events offered great opportunities to network with other industry professionals interested in systems, not just software, and I attended some useful tutorials in systems engineering. I’m still distilling my thoughts on “systems engineering vs. software engineering”, and will post later on that topic.

Participating in the “Is Requirements Engineering Really Necessary?” panel with Brian Berenbach, Mark Sampson, and James Hulgan was great fun. We don’t have the official session survey ratings yet, but we drew an audience of several hundred who never ran out of questions for us. My top 3 takeaways from our discussions are:

  1. Emphasize activities, not titles. The more stakeholders and team members who understand and can use Requirements Engineering methods effectively, the more the system and business will benefit. RE advocates have to remember, though, that most systems engineers aren’t, and don’t want to become, “requirements engineers” or even “requirements analysts”. They are committed “control systems engineers”, “electronics engineers”, “software system engineers”, or “power systems engineers” who are passionate about their own domains of expertise. To be most useful, training in requirements elicitation and analysis should be aligned to their domain worlds, instead of expecting systems engineers to align with the world of RE.
  2. System requirements need systems thinking, too. How formally requirements are managed should depend on the risks and consequences – not all requirements are “created equal”. Likewise, how requirements are documented should depend on who they are being documented for – the audience who needs to understand and use them. With today’s increasingly complicated systems and escalating time-to-market pressures, the same old mountain-of-text-documents approaches don’t scale; we need to adapt, and start ‘system-engineering’ how we handle our requirements to fit the needs of the business and the system.
  3. No silver RE bullets. Requirements engineering isn’t a panacea that can solve any and all problems in a system. Requirements aren’t mushrooms to be “gathered” for analysis. They’re more like truffles that need to be carefully searched for and unearthed. RE techniques can help you find the truffles and ensure that key needs aren’t overlooked. And RE can help you analyze and manage needs to ensure that requirements are well-defined, prioritized, verifiable, and necessary. But RE can’t guarantee that you’ll never miss a requirement, include extraneous features, or misinterpret an important aspect. Using a mixture of senior and junior staff can help: experienced people are guided by the pain of having overlooked key requirements or quality attributes in the past, and junior people can help the team avoid “expertosis” by questioning assumptions and asking “why?”.

(My “point of view” slide can be downloaded from the Agile Teams website, as well as my 1-page position statement.)

management lessons from Bones

June 2nd, 2011

I don’t watch much TV, but one forensics show I’ve occasionally caught and enjoyed is the American TV series “Bones”. An old episode from the start of season 2 recently caught my attention for its clues on good software management. (This is an occupational hazard for me, but a fun one: I find myself seeing software development parallels, paradigms, and patterns in all kinds of non-software related contexts!) Read the rest of this entry »

RESS’11 wants YOU

April 10th, 2011

Requirements engineering for ‘systems of systems’ is an emerging research area that is critical to many of the complex challenges we face today. Key domains include transportation, hospital networks, smart buildings and smart grids, and defense systems, among many others.

Are YOU doing interesting work with requirements for such systems? The RESS (Requirements Engineering for Systems of Systems) workshop at RE’11 is looking for contributions addressing issues, challenges, and solutions related to requirements engineering for such systems. Please consider submitting a paper describing your work and ideas!

All papers (4-page position papers, 6-page experience reports, or 8-10 page full papers) must be submitted by May 16. The full CFP (call for papers) is online at http://re.cs.depaul.edu/RESS/pages/workshop_papers.html. The workshop will be held on Aug. 30, 2011 in beautiful Trento, Italy as part of the RE’11 Conference – see re11.org for more information.

back to the future

February 27th, 2011

Over the past few months, my work scope has grown to encompass a new area: data mining and advanced analytics. As part of my newest project, I’ve been enjoying the opportunity to do some classical and new-fangled data analysis, and some real coding. I’ve written – and tested, and refactored – about 1000 lines of C# code in the last few weeks. That’s nowhere near the amount I used to produce early in my career, but you know what? It’s still fun. 🙂 Anyway, I thought I’d share 3 key observations that have begun to jell as a result of my new work.

  1. What’s changed the most are the development tools. Compared to the simple text editors I started with, modern IDEs offer far more built-in guidance and ‘accelerators’ (although they do sometimes get in the way). And I’m learning to leverage the vast amount of online help and forums available nowadays. Learning to navigate the complexities of newer IDEs and debuggers, and to become hyper-efficient in using them, will still take a little time, I’m sure.
  2. Oddly, the languages themselves haven’t changed all that much. I began with assembler and FORTRAN and quickly moved into RatFor, a C-like Rational FORTRAN preprocessor. I also did some work in Pascal and Ada, and lots of batch scripting, before moving to C and C++, then into Java. Picking up C# over the last few weeks has been straightforward.
  3. My background in both agile methods and more formal approaches to architecture and requirements is clearly influencing how I do my work now. Thinking about what might be ‘the simplest thing that will possibly work’, and how to test it, steers my ‘XP For One’ task planning. I mull up front whether I need to design for performance to handle the huge datasets I’m working with now, and plan spikes to help me test early. For this project, throughput on my laptop is more than adequate so far: my program runs through 4 years’ worth of data in just a few minutes. Robustness and error detection in how I clean and process my data are critical, though (a sketch of what I mean follows below).
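
(To illustrate that last point, here’s a hypothetical sketch of the kind of defensive row handling I mean; the file name and field layout are invented, but the pattern is the heart of it: validate every field with TryParse, and count-and-skip malformed rows instead of throwing.)

```csharp
using System;
using System.Globalization;
using System.IO;

class DataCleaner
{
    static void Main()
    {
        int good = 0, bad = 0;
        double total = 0;

        // Expecting rows like "2011-02-27,sensor-12,42.7" (layout is made up).
        foreach (var line in File.ReadLines("measurements.csv"))
        {
            var fields = line.Split(',');
            if (fields.Length != 3
                || !DateTime.TryParse(fields[0], CultureInfo.InvariantCulture,
                                      DateTimeStyles.None, out _)           // validate the timestamp
                || !double.TryParse(fields[2], NumberStyles.Float,
                                    CultureInfo.InvariantCulture, out var value))
            {
                bad++;      // count the bad row and move on; one malformed line
                continue;   // shouldn't abort a multi-year batch run
            }
            good++;
            total += value; // stand-in for the real accumulation/analysis
        }

        Console.WriteLine($"parsed {good} rows (sum {total:F1}), rejected {bad}");
    }
}
```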

Bottom line: it’s gratifying to know that after so many years focusing more on management and process, I do still have the design, programming, and math skills to tackle and solve technical problems hands-on. It’s still cool to wake up in the morning with a piece of a solution to a coding challenge I fell asleep thinking about, or to find my mind puzzling out an answer while I’m in the shower. And I like that this experience is building the mental framework I’ll need for future technical leadership, whether in management, coaching, or research: a real understanding of what product development teams using these latest tools are coping with. It’s all good!

upcoming events in Requirements

October 30th, 2010

REFSQ’11 (Requirements Engineering: Foundation for Software Quality) in Essen, Germany, March 28-30, 2011: Proposals for the Empirical Research Fair, new in 2011, are being accepted now through January 7 (I’m on its program committee). The Fair is intended to offer “lively discussion between academics and industrials to identify the right context for empirical studies” as well as to identify “empirical studies that can be conducted during the REFSQ 2011 itself”. Are you a researcher seeking to address the needs of people doing real requirements engineering in practice? Or an industry practitioner who would like to find an academic or two interested in RE research that’s genuinely relevant to your business? See the call for proposals, and please consider submitting one!

INCOSE (International Council on Systems Engineering) IS2011 in Denver, CO, June 20-23, 2011: I’ve been invited to participate in a proposed Requirements Engineering panel at the 21st INCOSE International Symposium. My point of view will focus on bringing an agile perspective to requirements engineering in systems engineering. This promises to be great fun if our proposal is accepted (we’ll find out around Feb. 22). We hope to generate some light, with minimal heat, in the discussion. Oh, and if you’ve been working on advancing the state of the art in engineering critical systems, check out the call for papers: they’re looking for a diverse range of submissions (due Nov. 3).

5 steps to customer dis-service

September 11th, 2010

How to lose a customer in 5 easy steps, motivated by a customer dis-service experience I had today with a company I pay lots of money every month for tech-related services: Read the rest of this entry »

my 5-step Flash installation workaround

September 9th, 2010

Nowadays it seems that I always have to work around problems with installing Adobe Flash Player updates on Windows. I’ve had outright failures, and I’ve had the installation appear to work for IE only to break Firefox, and vice versa. And I always have multiple computers to update, whenever there’s a new version. So I’ve decided it’s time to actually document my ‘process’ for doing it, to save myself time from now on. Let me know if this tip helps you, too. Read the rest of this entry »

on coaching agile teams

August 18th, 2010

Today’s webinar on “Confessions and Coaching for Leading Agile Teams”, hosted by PMI Agile, was well-attended (at least 324 people) and insightful. Here’s a brief summary of Lyssa Adkins’ key points.

[The webinar was marred only by a few technical problems: an attendee’s un-muted children in the background, and some key slides that showed up blank in the Acrobat Connect Pro session. The noise impediment was quickly ‘bulldozed’ away by competent moderators, and the slides will be posted along with the webinar recording to compensate for the second issue.]

The ‘confessions’ part of the webinar consisted of announcing the winners of the ‘Confessions of an Agile Project Manager’ video contest. Update: The videos are available online via the PMI Agile wiki page for Confessions of an Agile Project Manager videos.

The bulk of the webinar was Lyssa talking about her lessons learned in coaching agile teams as a ‘recovering command-and-control-aholic’. I won’t go into details here, since the entire webinar should be available online within 24 hrs. In brief, she covered 8 ‘radical thoughts’, followed by some further confessions and thoughts on how to improve your effectiveness as an agile coach:

  1. Be detached from outcomes
  2. Take it to the team
  3. Be a mirror
  4. Master your face (be sure it shows you believe in the team)
  5. Let there be silence (learn to “get comfortable with uncomfortable silence”)
  6. Model being outrageous (to help team members “see brick walls”)
  7. Let the team fail (at small things, all the time – they’ll get stronger)
  8. Be their biggest fan (on how they’re growing as a team and as individuals) 

The second half of Lyssa’s talk focused on becoming a better coach by improving both your agile mentoring skills and your professional coaching skills. She compared being an agile coach to being a river raft guide: the trip down the river is different every time, but experience helps greatly with knowing where to rest, how to help the team navigate around hazards, and how to get people back into the raft and moving downstream again after a spill.

Her closing point: Agile coaching is “40% doing, 60% being” – it’s important to be what you want your teams to do, to walk the talk of commitment, simplicity, etc. Journaling and finding a ‘reflective surface’ can help.

In short, good stuff. I’ll definitely be downloading the slides and getting the webinar recording. I’ve already signed up for her free weekly coaching inspiration emails, and her book’s now moved higher up in my wish list.