Proactive security review and test efforts are a necessary component of the software development lifecycle. Resource limitations often preclude reviewing the entire code base. One way to prioritize security efforts is risk-based attack surface approximation (RASA), a technique that uses crash dump stack traces to predict what code may contain exploitable vulnerabilities. […]
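The excerpt only hints at how RASA prioritizes code, so here is a minimal sketch of the underlying intuition, not the paper's actual method: code entities that appear on crash dump stack traces are ranked by how many crashes they are implicated in. All file names and traces below are hypothetical.

```python
from collections import Counter

def rank_by_crash_appearance(stack_traces):
    """Rank code entities (e.g., binaries or source files) by how many
    crash dump stack traces they appear on. Frequently crashing code
    approximates a risk-based attack surface to review first."""
    counts = Counter()
    for trace in stack_traces:
        # Count each entity at most once per trace, even if it
        # occurs in several stack frames of the same crash.
        for entity in set(trace):
            counts[entity] += 1
    return counts.most_common()

# Hypothetical crash traces, each a list of frames attributed to files:
traces = [
    ["parser.c", "net.c", "main.c"],
    ["parser.c", "io.c"],
    ["net.c", "parser.c"],
]
print(rank_by_crash_appearance(traces))
# parser.c appears in all three traces, so it ranks first.
```

A real pipeline would parse symbolized crash dumps instead of hand-written lists, but the ranking step stays this simple.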

Read More

Peter Rigby and I are organizing a workshop on Industrial Software Testing at the 24th ACM SIGSOFT International Symposium on the Foundations of Software Engineering (FSE) in Seattle, WA, USA. When? The workshop will be held on Monday, November 14, 2016. The submission deadline is July 1, 2016. Please check the workshop website for the […]

Read More

The public data sets for our paper “Untangling Code Changes”, published at the Working Conference on Mining Software Repositories in 2013, have moved. Please find the data sets in our GitHub repository: https://github.com/kimherzig/untangling_changes. The git repository snapshots of the individual projects used in our analyses can be found in the download section below. General […]

Read More

In 2013, we published a paper titled “The impact of tangled code changes” at the 10th Working Conference on Mining Software Repositories. This year, we published an extended version of this paper showing the impact of tangled code changes on actual defect prediction models proposed by many researchers over the past decade: […]

Read More

While Microsoft product teams have adopted defect prediction models, they have not adopted vulnerability prediction models (VPMs). Seeking to understand this discrepancy, we replicated a VPM for two releases of the Windows Operating System, varying model granularity and statistical learners. We reproduced binary-level prediction precision (~0.75) and recall (~0.2). However, binaries often exceed 1 million […]
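For readers unfamiliar with the reported metrics, the precision (~0.75) and recall (~0.2) figures quoted above can be reproduced on toy data as follows. This is only an illustration of the metric definitions; the binary names and counts are invented, not the study's data.

```python
def precision_recall(predicted, actual):
    """Precision and recall of a set of predicted-vulnerable binaries
    against the set of binaries that actually had vulnerabilities."""
    tp = len(predicted & actual)  # true positives
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(actual) if actual else 0.0
    return precision, recall

# Hypothetical: the model flags 4 binaries, 3 of them truly vulnerable,
# out of 15 vulnerable binaries overall.
predicted = {"b1", "b2", "b3", "b4"}
actual = {"b1", "b2", "b3"} | {f"v{i}" for i in range(12)}
print(precision_recall(predicted, actual))  # (0.75, 0.2)
```

High precision with low recall, as here, means most flagged binaries are worth inspecting, but most vulnerable binaries go unflagged.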

Read More

Security testing and review efforts are a necessity for software projects, but they are time-consuming and expensive to apply. Identifying vulnerable code supports decision-making during all phases of software development. One approach to identifying vulnerable code is to approximate its attack surface: the sum of all paths for untrusted data into and out of a system. […]

Read More

Testing large systems such as the Microsoft Windows operating system requires complex test infrastructures, which may lead to test failures caused by faulty tests and test infrastructure issues. Such false test alarms are particularly annoying, as they demand engineers' attention and require manual inspection without providing any benefit. The goal of this work is to use empirical data to minimize the number of false test alarms reported during system and integration testing. To achieve this goal, we use association rule learning to identify patterns among failing test steps that are typical for false test alarms and can be used to classify them automatically.
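A minimal sketch of that idea, assuming labeled historical test runs (the step names, thresholds, and rule shapes are hypothetical, not the paper's actual miner): mine co-failing step patterns whose support and confidence for the "false alarm" label exceed thresholds, then use the surviving rules as a classifier.

```python
from collections import defaultdict
from itertools import combinations

def mine_false_alarm_rules(runs, min_support=2, min_confidence=0.8):
    """Mine simple association rules {failing steps} -> false alarm.
    `runs` is a list of (failing_step_set, is_false_alarm) pairs
    labeled from past manual inspections."""
    pattern_total = defaultdict(int)
    pattern_false = defaultdict(int)
    for steps, is_false in runs:
        # Consider single steps and pairs of co-failing steps.
        for size in (1, 2):
            for pattern in combinations(sorted(steps), size):
                pattern_total[pattern] += 1
                if is_false:
                    pattern_false[pattern] += 1
    rules = {}
    for pattern, total in pattern_total.items():
        confidence = pattern_false[pattern] / total
        if total >= min_support and confidence >= min_confidence:
            rules[pattern] = confidence
    return rules

# Hypothetical runs: infrastructure steps that fail together tend to be
# false alarms; a failing test suite itself is a real alarm.
runs = [
    ({"setup_vm", "copy_build"}, True),
    ({"setup_vm", "copy_build"}, True),
    ({"run_suite"}, False),
]
rules = mine_false_alarm_rules(runs)
print(("copy_build", "setup_vm") in rules)  # True
```

A new failing run whose steps match a mined rule can then be auto-classified as a false alarm instead of paging an engineer.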

Read More