The Joel Test is a crude but effective tool for checking the maturity of a software development team, using simple, concrete questions to determine whether a team is following core best practices. The test could use a little sprucing up to reflect improvements in the state of the practice over the last 10 years, and to take into account some of the better ideas introduced with XP and Scrum. For example, “Do you make daily builds?” (question 3) should be updated to ask whether the team is following Continuous Integration. You could argue that “Do you do hallway usability testing?” (question 12) should be replaced with a question that asks whether the team works closely and collaboratively with the customer (or a customer proxy) on requirements, product planning and prioritization, usability, and acceptance. And one of the questions should ask whether the team conducts technical (design and code) reviews, or does pair programming.
A number of other people have considered how to improve and update the Joel Test. But all in all, the Joel Test has proved useful and has stood the test of time. It is simple, easy to remember, easy to understand and apply; it is directly relevant to programmers and test engineers (the people who actually do the work); it is provocative; and it is fun. It makes you think about how software should be built, and how you measure up.
How does the Joel Test work?
It consists of 12 concrete yes/no questions that tell a lot about how the team works, how it builds software, and how disciplined it is. A yes-score of 12 is perfect (of course), 11 is tolerable, and a score of 10 or less indicates serious weaknesses. The test can be used by developers or managers to rate their own organization; by developers who are trying to decide whether to take a job at a company (ask the prospective employer the questions, to see how challenging or frustrating the job will be); or for due diligence, as a quick “smoke test”.
A recent post by Thomas Ptacek of Matasano Security explores how to apply the Joel Test to network security and IT management. In the same spirit, I am proposing a “Joel Test” for software security: a simple, concrete, informal way to assess a team’s ability to build secure software. This is a thought experiment, a fun way of thinking about software security and what it takes to build secure software, following the example of the Joel Test, its principles and its arbitrary 12-question framework. It is not, of course, an alternative to comprehensive maturity frameworks like SAMM or BSIMM, which I used as references in preparing this post, but I think a simple test like this can still provide useful information.
So, here is my attempt at the 12 questions that should be asked in a Software Security “Joel Test”:
1. Do you have clear and current security policies in place, so developers know what they should be doing and what they should not be doing? Realistic, concrete expectations, not legalese or boilerplate. Guidelines that programmers can follow, and do follow, in building secure software.
2. Do you have someone (a person or a team) who is clearly responsible for software security? Someone who helps review design and code from a security perspective, who can coach and mentor developers and test engineers, provide guidelines and oversight, and make risk-based decisions on security issues. If everybody is accountable for security, then nobody is accountable for security. You need someone who acts as both coach and cop, and who has the knowledge and authority to do so.
3. Do you conduct threat modeling, as part of, or in addition to, your design reviews? This could be lightweight or formal, but some kind of structured security review needs to be done, especially for new interfaces and major changes.
4. Do your code reviews include checks for security and safety issues? If you have to ask, “ummm, what code reviews?”, then you have a lot of work ahead of you.
5. Do you use static analysis checking for security (as well as general quality) problems as part of your build?
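To make question 5 concrete, here is a toy sketch (not any particular product's implementation) of the kind of check a security-focused static analyzer performs as part of a build: walking the parsed source and flagging known-dangerous calls. The `find_security_smells` function and the `DANGEROUS_CALLS` list are illustrative assumptions; a real build would wire a dedicated analysis tool into the build script and fail the build on findings.

```python
import ast

# Illustrative-only list of injection sinks a security check might flag.
DANGEROUS_CALLS = {"eval", "exec"}

def find_security_smells(source: str):
    """Return (line number, call name) for each dangerous call in the source."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        # Match simple calls like eval(...) by name.
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            if node.func.id in DANGEROUS_CALLS:
                findings.append((node.lineno, node.func.id))
    return findings

sample = "x = eval(user_input)\nprint(x)\n"
print(find_security_smells(sample))  # [(1, 'eval')]
```

In a real pipeline, a non-empty findings list would be reported and would break the build, so security problems surface as early as compile errors do.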
6. Do you perform risk-based security testing? Does this include destructive testing, regular penetration testing by expert pen testers, and fuzz testing?
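As a sketch of the fuzz-testing idea in question 6, the snippet below throws random byte strings at a toy parser and counts failures that fall outside the parser's documented error contract. The parser, the `fuzz` driver, and the fixed seed are all assumptions for illustration; real fuzzing uses dedicated tools and far more iterations.

```python
import random

def parse_record(data: bytes):
    """Toy parser under test: expects 'name:age' where age is ASCII digits.
    Its contract is to raise ValueError on malformed input, nothing else."""
    name, _, age = data.partition(b":")
    if not age.isdigit():
        raise ValueError("age must be numeric")
    return name.decode("utf-8", errors="replace"), int(age)

def fuzz(iterations: int = 1000, seed: int = 42) -> int:
    """Feed random bytes to the parser; count unexpected failure modes."""
    rng = random.Random(seed)  # fixed seed so the run is reproducible
    crashes = 0
    for _ in range(iterations):
        blob = bytes(rng.randrange(256) for _ in range(rng.randrange(1, 32)))
        try:
            parse_record(blob)
        except ValueError:
            pass          # expected rejection of malformed input
        except Exception:
            crashes += 1  # unexpected failure: a real finding
    return crashes

print(fuzz())  # 0 means the parser rejected all the garbage gracefully
```

The point is the harness shape, not this particular parser: anything other than a clean, documented rejection of hostile input is a bug worth filing.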
7. Have you had an expert, security-focused review of your product’s architecture and design? This ensures that you have a secure baseline, and catches fundamental flaws in your design that need to be corrected.
8. Do product requirements include security issues and needs? Are you, and your customers, thinking about security needs up front?
9. Does the team get regular training in secure development and defensive coding? Microsoft’s SDL recommends that team members get training in secure design, development and testing at least once per year to reinforce good practices and to stay current with changes in the threat landscape.
10. Does your team have an incident response capability for handling security incidents? Are you prepared to deal with security incidents? Do you know how to escalate, contain, and recover from security breaches, respond to security problems found outside of development, and communicate with customers and partners?
11. Do you record security issues and risks in your bug database / product backlog for tracking and follow-up? Are security issues made visible to team members for remediation?
12. Do you provide secure configuration and deployment and/or secure operations guidelines for your operations team or customers?
These are the 12 basic, important questions that come to my mind. It would be interesting to see alternative lists, to find out what I may have missed or misunderstood.