Code Review Checklist

The following checklist collects lessons learned from code reviews that peers have done for me.

Before Sending A Code Review

  • Is there a chance that testing was done without updating all segments of the application? (e.g. the front end was changed, but the back end was not.)
  • Update the repository and rerun all tests before making the diff
  • Is the test data weak enough that a pass or fail doesn't truly indicate whether the change works?
  • Are there any files not included in the diff?
  • Have all tests and/or validations been run?
  • Use the rest of this checklist to do a self-review of your own code
  • Are there related or reciprocal classes that also need to be changed?
  • If this is a bug fix, are similar bugs potentially lurking in other parts of the code?
  • Am I trying to abuse a data type or class to work in a way it was not specifically designed for?

Process

  • This completely addresses the given MKS unless the comment specifies it is a partial solution with more code pending
    • For QA bugs, the solution addresses all details given
  • Does this impact QA or Build in a way that they are not expecting (e.g. install or configuration changes)?
    • If so, has this been communicated?

General Coding

  • Result of function calls captured, verified, and handled
  • Variables' scopes are minimal
  • Variables' names are clear
  • Variables are used close to where they are declared
  • Check loop boundaries (off-by-one errors, empty ranges)
  • Ensure correct breaks in switch statements
  • For non-C code, approximately adheres to C coding standards
  • Follow failure paths through each function:
    • make sure all paths are handled
    • make sure error messages make sense for all paths that terminate there
  • Sufficient comments?
  • Sufficient logging?
  • Is there any corresponding code that needs to be changed as well? (e.g. a change to a deserializer without a change to the corresponding serializer.)
  • Are error messages easy to understand and well formed?
  • For 'if' statements, are all conditions dealt with (especially if there is no 'else' clause)?
  • If this is a second round of review, ask how much of the testing was redone.
  • Has code been copy/pasted from other parts of the code? Is code duplicated?
  • If this code imposes new restrictions on data, does it also handle migrating existing data?
  • Newly unused methods/properties/classes removed?
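
Several of the points above (capturing and verifying function results, following every failure path, handling all conditions of an 'if') can be shown in one small sketch. The function name is hypothetical, and Python is used only for brevity:

```python
def parse_port(text):
    """Return a TCP port number, or raise ValueError with a clear message."""
    try:
        port = int(text)
    except ValueError:
        # Failure path terminates here; the message names the bad input.
        raise ValueError("not a number: %r" % text)
    if not 1 <= port <= 65535:
        # The out-of-range condition is handled explicitly, not left implicit.
        raise ValueError("port out of range (1-65535): %d" % port)
    return port

# The caller captures and verifies the result instead of discarding it.
port = parse_port("8080")
assert port == 8080
```

A reviewer following the failure paths here would check that both ValueError messages make sense for the inputs that reach them.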

Unit Tests

  • Each change has sufficient unit tests
    • Are there further ways to break the code than those tested?
  • For each unit test (see What Makes a Good Test?)
    • Unit test is clear (Clarity means that a test should serve as readable documentation for humans, describing the code being tested in terms of its public APIs.)
    • Unit test is complete and concise (A test is complete when its body contains all of the information you need to understand it, and concise when it doesn't contain any other distracting information.)
    • Unit test is resilient (A resilient test doesn't have to change unless the purpose or behavior of the class being tested changes.)
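
A minimal sketch of these three qualities, with a hypothetical `slugify` function and Python used only for brevity:

```python
def slugify(title):
    """Lower-case a title and join its words with hyphens."""
    return "-".join(title.lower().split())

def test_slugify_joins_words_with_hyphens():
    # Clear: the name and body describe the behaviour via the public API.
    # Complete: the input and the expected output are both right here.
    # Concise: no setup that is irrelevant to the behaviour under test.
    assert slugify("Code Review Checklist") == "code-review-checklist"

test_slugify_joins_words_with_hyphens()
```

The test is also resilient: it would only need to change if the public behaviour of `slugify` changed, not if its internals were rewritten.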

Testing

  • If the new code relies on a file, are there tests for missing and invalid files?
  • Sufficient testing?
    • Was the app tested on all supported platforms (OS, browser)?
    • Was the data used adequate to test the change?
    • Does the 'expected result' section capture all the expected behaviour or only one part of it?
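
The missing-file and invalid-file cases in particular are easy to exercise. A sketch with a hypothetical `load_config`, in Python for brevity:

```python
import json
import os

def load_config(path):
    """Return the parsed config, or {} if the file is absent or invalid."""
    if not os.path.exists(path):
        return {}  # missing file: fall back to defaults
    try:
        with open(path) as f:
            return json.load(f)
    except ValueError:
        return {}  # invalid file: fall back rather than crash

# Test the failure modes, not just the happy path.
assert load_config("/nonexistent/config.json") == {}
```

A review should confirm that both branches are covered by tests, since the happy path alone would pass with either branch broken.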

Using SQL in code

  • Bind variables to correct fields of SQL statements
  • Read fields of SQL selects to correct variables
  • Are there any side-effects from JOINs?
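
The first two points are about positional mismatches: placeholders and selected columns are bound by order, so swapping two values compiles and runs but silently writes or reads the wrong fields. A sketch using Python's stdlib sqlite3 module:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, age INTEGER)")

# Placeholders are positional: the tuple order must match the column order.
conn.execute("INSERT INTO users (name, age) VALUES (?, ?)", ("alice", 30))

# Read each selected field into the matching variable, in SELECT order.
name, age = conn.execute("SELECT name, age FROM users").fetchone()
assert (name, age) == ("alice", 30)
```

Reversing the tuple to `(30, "alice")` would be accepted by the database but corrupt the row, which is why a reviewer should line up each placeholder against its column by hand.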

C/C++

  • Coding standards adhered to:
    • no () casting
  • No unnecessary qualification of member functions in headers (breaks gcc4)
  • Check that all exceptions are being caught
  • No reference variables pointing to temporaries

C#

  • Check that all exceptions are being caught
  • Are all the "using" statements still necessary?

Perl

  • strict/warning/-w used
  • use perlcritic?
  • String-to-int conversions done without suppressing warnings
  • Variable name typos (should be checked by strict/warning/-w)
  • Are variable assignments happening in the expected context?

XML

  • Tags matched
  • Schema updated
  • Schema validation performed
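
Tag matching can be machine-checked before review rather than by eye. A sketch with Python's stdlib parser (full schema validation needs an external library such as lxml, not shown):

```python
import xml.etree.ElementTree as ET

def is_well_formed(text):
    """Return True if the XML parses, i.e. every tag is matched."""
    try:
        ET.fromstring(text)
        return True
    except ET.ParseError:
        return False

assert is_well_formed("<config><item>1</item></config>")
assert not is_well_formed("<config><item>1</config>")  # mismatched tags
```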

Upgrade Scripts

  • For each action performed by the script, does it depend on components or data that may not yet be present on the box being upgraded?
  • Does this upgrade depend on another upgrade, and if so, are they guaranteed to run in the correct order?
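
One way to make such dependencies explicit is a guard at the top of each step. The step names below are hypothetical, and the sketch assumes steps record their completion in a shared set; Python is used only for brevity:

```python
completed = set()

def run_step(name, requires, action):
    """Run an upgrade step only after every step it depends on has run."""
    missing = [r for r in requires if r not in completed]
    if missing:
        raise RuntimeError("step %r requires %s to run first" % (name, missing))
    action()
    completed.add(name)

# Steps declare their ordering instead of relying on it implicitly.
run_step("create-table", requires=[], action=lambda: None)
run_step("migrate-data", requires=["create-table"], action=lambda: None)
```

With the guard in place, a mis-ordered script fails loudly on the box being upgraded instead of silently acting on data that is not there yet.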
Unless otherwise stated, the content of this page is licensed under Creative Commons Attribution-ShareAlike 3.0 License