Each line is prefixed with the corresponding review number(s).

Main weaknesses:
1 - "seems incremental over previous work such as DDT"
2 - "I found it hard to understand the whole story of how SymDrive helps programmers--it's really a collection of tools for building hardware-independent tests of hardware-dependent software. Despite the author's best efforts, it's hard to make a convincing case that their system protects against the kinds of bugs that they claim it does."
3 - "This work seems like useful but fairly simple extension of past work. The evaluation is somewhat modest."
4 - "Most ideas in the paper were present in previous work (DDT), such as symbolic execution of driver code, and symbolic hardware."
5 - "There doesn't seem to be much novelty other than the application of symbolic execution to a new domain. The claimed benefit is thin as it improves testing, but doesn't necessarily find bugs." "The problem is that the paper seems to be light on the insights and novel conclusions"

Other comments:
1 - Introduction oversells. Need to set expectations right, since we test individual functions.
1,3 - Unconvincing experiments.
1,3 * Need more real patches tested
1 * Need false positives/negatives reported
1 * Need time to execute a function
1 * Limits on sizes of functions that we can test?
3 * Find new bugs
4 * Report coverage of the entire driver, not just a few functions.
2 - Clarify: we should say "Many aspects of device driver correctness have nothing to do with the underlying device."
2 - Clarify: the driver could still pass all the structural test cases that we write, but fail to work.
2 - Paper is just a collection of tools. Unclear precisely how testing a driver would proceed. Exposition unclear.
3 - Not novel
1 * Work is incremental compared to DDT
3 * Testing framework is just assertions
3 * Search heuristic is DFS/BFS
4 * Symbolic hardware / symbolic execution of driver code
4 * Paper is weak compared to DDT
  > binary vs. source code requirement
  > 6 drivers vs. 2
  > DDT provides more(?) bug-detection tools
5 * SymDrive simply reuses previous symbolic execution techniques while applying them to device drivers
5 - Significant burden on developer to write new test specifications and checks (see the check sketch at the end of these notes)

Specific questions:
2 - What did we achieve 100% coverage of? Paths, lines of code, what?
2 - SymDrive is good at finding structural errors - buffer overflows, missed cases, dead paths. How common are these kinds of errors? Is that the point of SymDrive?
2 - [gleaned from comments]: when precisely are specification tests executed ("certain points")?
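
To help draft responses to 3 ("testing framework is just assertions"), 5 (burden of writing checks), and 2's question about when checks execute: a minimal, self-contained sketch of the kind of pre/post assertion check under discussion. All names here (struct net_dev_state, check_remove_pre/post, driver_remove) are hypothetical illustrations, not SymDrive's actual interface; the point is only that checks are ordinary C assertions that a framework could invoke at entry-point boundaries.

#include <assert.h>
#include <stddef.h>
#include <stdlib.h>

/* Hypothetical driver state, for illustration only. */
struct net_dev_state {
    int registered;    /* set by probe() when the device is registered */
    void *dma_buffer;  /* allocated by probe(), released by remove() */
};

/* Precondition check, run immediately before the driver's remove() entry point. */
static void check_remove_pre(const struct net_dev_state *s)
{
    /* remove() should only run on a device that probe() registered. */
    assert(s->registered);
}

/* Postcondition check, run immediately after remove() returns. */
static void check_remove_post(const struct net_dev_state *s)
{
    /* remove() must release the DMA buffer it owns. */
    assert(s->dma_buffer == NULL);
}

/* Stub remove() standing in for real driver code. */
static void driver_remove(struct net_dev_state *s)
{
    free(s->dma_buffer);
    s->dma_buffer = NULL;
    s->registered = 0;
}

int main(void)
{
    struct net_dev_state s = { .registered = 1, .dma_buffer = malloc(64) };

    check_remove_pre(&s);   /* "certain point": boundary before the entry point */
    driver_remove(&s);
    check_remove_post(&s);  /* "certain point": boundary after the entry point */
    return 0;
}

If this matches how SymDrive actually schedules its checks, the response can name those entry-point boundaries explicitly to answer reviewer 2's "certain points" question.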