Team Member

Rainer Niedermayr


… is a consultant for software quality at CQSE. He studied information systems at the Technische Universität München and at Aalto University in Espoo, Finland. After graduating, he worked for two years as a software engineer in a medium-sized software company. Alongside his consulting activities, he is a Ph.D. student in the area of software testing.

  • +49 163 7163900
  • niedermayr@cqse.eu
  • @nrainer2

Blog Posts


Issue metrics are an exciting feature of Teamscale. They allow you to analyze and visualize the issues (a.k.a. tickets) of an issue tracker using a query language. Benjamin presented this feature in a previous blog post.

In this post, I will show how issue metrics can be used in threshold configurations to assess the number of critical bugs.
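
To give a purely hypothetical illustration of the idea (this is not Teamscale's query language, and the ticket fields are made up), such an issue metric essentially boils down to counting the tickets that match certain criteria:

  # Illustrative sketch: count open critical bugs in a hypothetical ticket export.
  issues = [
      {"id": 1, "type": "bug", "severity": "critical", "status": "open"},
      {"id": 2, "type": "bug", "severity": "minor", "status": "open"},
      {"id": 3, "type": "feature", "severity": "major", "status": "open"},
      {"id": 4, "type": "bug", "severity": "critical", "status": "closed"},
  ]

  open_critical_bugs = sum(
      1 for issue in issues
      if issue["type"] == "bug"
      and issue["severity"] == "critical"
      and issue["status"] == "open"
  )

  print(open_critical_bugs)  # -> 1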

Read more...


Teamscale computes a number of metrics for each analyzed project (lines of code, clone coverage, comment completeness, …) and supports uploading further metrics from external systems (e.g. from the build server). The computed metrics are updated with every commit and provide an overview of the state of a software project. As it is sometimes hard to tell what is good and what is bad, the next Teamscale version will be equipped with a powerful new feature that provides an assessment of metric values based on (built-in or custom) threshold configurations. It will help the user interpret the values.
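
To make the idea more tangible, here is a minimal sketch in Python (purely hypothetical names; this is not Teamscale's actual configuration format or API) that maps a single metric value to a traffic-light rating based on two threshold values:

  from dataclasses import dataclass

  @dataclass
  class Threshold:
      """Hypothetical threshold pair: values up to `yellow` rate green,
      values up to `red` rate yellow, everything above rates red."""
      yellow: float
      red: float

  def assess(value: float, threshold: Threshold) -> str:
      """Return a traffic-light assessment for a single metric value."""
      if value <= threshold.yellow:
          return "GREEN"
      if value <= threshold.red:
          return "YELLOW"
      return "RED"

  # Example: assess a clone coverage of 7.3% against an assumed configuration.
  print(assess(7.3, Threshold(yellow=5.0, red=10.0)))  # -> YELLOW

A real threshold configuration of course covers many metrics at once and may also distinguish whether higher or lower values are better.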

Read more...


The Software Maintainability Index (MI) is a single-value indicator for the maintainability of a software system. It was proposed by Oman and Hagemeister in the early nineties [1]. The Maintainability Index is computed by combining four traditional metrics. It is a weighted composition of the average Halstead Volume per module, the Cyclomatic Complexity, the number of lines of code (LOC) and the comment ratio of the system.
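
For reference, the commonly cited four-metric variant of the formula can be sketched as follows (the exact coefficients and the definition of the comment term vary slightly between sources, and the input values below are made up):

  import math

  def maintainability_index(ave_volume, ave_complexity, ave_loc, per_cm):
      """Four-metric MI variant: average Halstead Volume, average cyclomatic
      complexity, average lines of code per module, and the percentage of
      comment lines (per_cm). Treat this as a sketch, not a canonical
      implementation."""
      return (171
              - 5.2 * math.log(ave_volume)
              - 0.23 * ave_complexity
              - 16.2 * math.log(ave_loc)
              + 50 * math.sin(math.sqrt(2.4 * per_cm)))

  print(round(maintainability_index(1000, 5, 200, 30), 1))  # roughly 88.5 for these made-up values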

Read more...


Publications


Rainer Niedermayr, Stefan Wagner:

Is the Stack Distance Between Test Case and Method Correlated With Test Effectiveness?

Proceedings of the 23rd International Conference on Evaluation and Assessment in Software Engineering (EASE’19), 2019.

Roman Haas, Rainer Niedermayr, Tobias Roehm, Sven Apel:

Poster: Recommending Unnecessary Source Code Based on Static Analysis.

Proceedings of the 41st International Conference on Software Engineering Companion (ICSE’19), 2019.

Roman Haas, Rainer Niedermayr, Elmar Juergens:

Teamscale: Tackle Technical Debt and Control the Quality of Your Software.

Proceedings of the 2nd International Conference on Technical Debt (TechDebt’19), 2019.

Rainer Niedermayr, Tobias Roehm, Stefan Wagner:

Poster: Identification of Methods with Low Fault Risk.

Proceedings of the 40th International Conference on Software Engineering Companion (ICSE’18), 2018.

Jakob Rott, Rainer Niedermayr, Elmar Juergens, Dennis Pagano:

Ticket Coverage: Putting Test Coverage into Context.

Proceedings of the 8th Workshop on Emerging Trends in Software Metrics (WETSoM’17), 2017.

Rainer Niedermayr, Elmar Juergens, Stefan Wagner:

Will My Tests Tell Me If I Break This Code?

Proceedings of the International Workshop on Continuous Software Evolution and Delivery (CSED’16), 2016.