The Goal Question Metric Method: A Practical Guide for Quality Improvement of Software Development






The leaders of the software unit then decide on a set of actions, i.e., a set of scenario steps. Some of these scenario steps directly generate measurement goals. These measurement goals are then the starting point for applying traditional GQM to determine questions and specific metrics and data.
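As a rough illustration of how a measurement goal fans out into questions and metrics under traditional GQM, the sketch below models a tiny goal tree in Python. The concrete goal facets, question, and metric are hypothetical (they echo the review-efficiency example used later in this section), not values prescribed by the method.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Metric:
    name: str
    unit: str

@dataclass
class Question:
    text: str
    metrics: List[Metric] = field(default_factory=list)

@dataclass
class MeasurementGoal:
    # Classic GQM facets: object, purpose, quality aspect, viewpoint, context
    obj: str
    purpose: str
    quality_aspect: str
    viewpoint: str
    context: str
    questions: List[Question] = field(default_factory=list)

# Hypothetical measurement goal derived from a scenario step
goal = MeasurementGoal(
    obj="code inspections",
    purpose="improve",
    quality_aspect="efficiency",
    viewpoint="quality manager",
    context="software unit X",
    questions=[
        Question(
            text="What influences the efficiency?",
            metrics=[Metric("found defects per reviewer per hour", "defects/(person*h)")],
        )
    ],
)

for q in goal.questions:
    print(q.text, "->", [m.name for m in q.metrics])
```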

This not only provides a meaningful rationale for the software measurement effort, but also provides a blueprint for interpreting the data at each level up the chain. Assumptions may be reexamined and modified to create new strategies and scenarios. Most importantly, the effects of any changes can be understood in the context of the entire goal set. However, for certain application fields, it makes sense to focus on certain goal and strategy levels and give them more concrete names, as in our case for software-oriented organizations. Table 1 defines all conceptual elements and Figure 2 gives an overview of the relationships between those elements.

Table 1. Conceptual elements.

Strategy decisions: A set of possible approaches for achieving business goals.
Software goals: A set of goals directly related to the software process or product for implementing the strategy decisions.
Scenarios: Sets of concrete steps for achieving selected software goals. Templates are available to support this.
Measurement goals: Goals that help to make scenario steps operational through measurement.

Interpretation models: Models that help interpret data to determine whether goals at all levels are achieved.
Assumptions: Estimated unknowns that can affect the interpretation of the data.
Context factors: Environmental variables that change the kind of models and data that can be used.
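The chain of elements defined in Table 1 (and related in Figure 2) can be pictured as a simple linked structure. The sketch below is one possible encoding with hypothetical field names; it is not the authors' notation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MeasurementGoal:
    description: str

@dataclass
class Scenario:
    steps: List[str]
    measurement_goals: List[MeasurementGoal] = field(default_factory=list)

@dataclass
class SoftwareGoal:
    description: str
    scenarios: List[Scenario] = field(default_factory=list)

@dataclass
class StrategyDecision:
    description: str
    software_goals: List[SoftwareGoal] = field(default_factory=list)

@dataclass
class BusinessGoal:
    description: str
    refined_by: List["BusinessGoal"] = field(default_factory=list)      # more specific business goals
    strategies: List[StrategyDecision] = field(default_factory=list)    # strategy decisions
    assumptions: List[str] = field(default_factory=list)                # estimated unknowns
    context_factors: List[str] = field(default_factory=list)            # environmental variables
```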



A business goal may be refined by more specific business goals. Four different types of business goals are distinguished: growth goals, success goals, maintenance goals, and specific focus goals. Growth goals include acquiring new projects within the current competence areas, expanding the existing project set, evolving existing competencies, building new competencies, etc. Success goals include delivering good products to customers, controlling costs, shrinking schedules, increasing profits, and achieving corporate visibility.

Maintenance (internal) goals include transparency, employee satisfaction, controlled risk, and a learning environment; the idea is to measure in order to assure that there is no decrease. Specific focus goals include such things as making the helpdesk more efficient or predicting whether a proposed effort has a good ROI. Context factors and assumptions are used to explicitly define the rationale behind a business goal, select a strategy, interpret a software goal, select a relevant scenario, and define measurement goals.


When the complete goal hierarchy is defined, the measures can be taken and interpretations made to see whether the goals at all levels have been achieved.

[Figure: example GQM tree for the question "What influences the efficiency?", measured via "Found defects per reviewer per hour" and refined into human factors and technical factors.]
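An interpretation model for the efficiency question above can be as simple as comparing the measured rate against an agreed baseline. The following sketch is purely illustrative; the baseline value is an invented placeholder, not part of the method.

```python
def review_efficiency(found_defects: int, reviewers: int, hours: float) -> float:
    """Found defects per reviewer per hour."""
    return found_defects / (reviewers * hours)

def interpret(efficiency: float, baseline: float = 0.5) -> str:
    """Toy interpretation model: the baseline of 0.5 defects/(reviewer*hour) is a made-up example."""
    return "goal achieved" if efficiency >= baseline else "investigate human and technical factors"

print(interpret(review_efficiency(found_defects=12, reviewers=3, hours=2.0)))  # 2.0 -> "goal achieved"
```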

For instance, an organization may want to safeguard its place in the market and therefore increase customer satisfaction. The template consists of eight sections. First, the main focus (cost, profit, turnover, market share, prestige, customer satisfaction) and the object (people, market, a project, a collection of projects, a customer) of the business goal are addressed, as well as the basic activity that is performed (reducing, increasing, achieving, pursuing, or providing the main focus of the business goal).

Finally, the scope needs to be defined (the whole organization, a certain business unit, or a person), as well as constraints (limited influence on certain factors, laws, mission statement, and basic principles) and relations with other goals (tradeoffs, hierarchy, and ordering). Context factors and assumptions restrict the set of applicable strategies. For instance, in order to "test in" reliability, the software test processes must be examined.
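To make the template concrete, here is one possible filled-in business goal expressed as a plain Python dictionary. Six of the eight sections (activity, focus, object, scope, constraints, relations) are named in the text above; magnitude and timeframe are assumed here as the remaining two, since they are mentioned later for formalized goals. All values are hypothetical.

```python
# Hypothetical instantiation of the eight-section business goal template
business_goal = {
    "activity":    "increase",                          # reducing, increasing, achieving, pursuing, providing
    "focus":       "customer satisfaction",             # cost, profit, turnover, market share, prestige, ...
    "object":      "customers of business unit A",      # people, market, a project, ...
    "magnitude":   "raise satisfaction index by 10%",   # assumed section (see note above)
    "timeframe":   "within the next 12 months",         # assumed section (see note above)
    "scope":       "business unit A",                   # whole organization, business unit, or person
    "constraints": ["limited influence on pricing", "mission statement and basic principles"],
    "relations":   ["trade-off with the cost-reduction goal"],
}

print(business_goal["activity"], business_goal["focus"])
```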


The most promising goal, considering feasibility, cost, and benefit, is selected for each potential software goal. Again, context factors and assumptions help to define the right selection criteria.
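Selecting the most promising candidate by feasibility, cost, and benefit can be sketched as a simple scoring step. The scoring function, weights, and example values below are invented placeholders, not something the source prescribes.

```python
def score(candidate: dict, weights=(1.0, 1.0, 1.0)) -> float:
    """Toy selection criterion: reward feasibility and benefit, penalize cost."""
    w_f, w_c, w_b = weights
    return w_f * candidate["feasibility"] - w_c * candidate["cost"] + w_b * candidate["benefit"]

candidates = [
    {"name": "reduce rework in testing", "feasibility": 0.8, "cost": 0.4, "benefit": 0.7},
    {"name": "shorten release cycle",    "feasibility": 0.5, "cost": 0.6, "benefit": 0.9},
]

best = max(candidates, key=score)
print(best["name"])  # -> reduce rework in testing
```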

Again, measures, and models demonstrating how to aggregate and interpret them, are defined based on the magnitude and timeframe definition of the formalized software goal. The results of these inquiries lead to a list of context factors and assumptions.


The measurement object, purpose, quality aspect, viewpoint, and context are defined. For a certain goal (business goal, software goal, or measurement goal), a set of complementary goals may exist that additionally support the current goal. A set of competing goals may exist that conflict with the current goal, while other goals may be totally unaffected by the current goal; these are referred to as indifferent goals.
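One lightweight way to record such relationships is a small enumeration attached to pairs of goals; this encoding is illustrative only and not prescribed by the method.

```python
from enum import Enum

class Relation(Enum):
    COMPLEMENTARY = "additionally supports the current goal"
    COMPETING = "conflicts with the current goal"
    INDIFFERENT = "unaffected by the current goal"

# Hypothetical relationship table between goal descriptions
relations = {
    ("increase customer satisfaction", "reduce defect rate"): Relation.COMPLEMENTARY,
    ("increase customer satisfaction", "cut the testing budget"): Relation.COMPETING,
}

print(relations[("increase customer satisfaction", "cut the testing budget")].name)  # COMPETING
```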

Implementing a Measurement Program

After a measurement program has been defined, it must be deployed within the context of a specific organization. Typical instrumentation activities include creating a measurement plan and preparing data collection tools.


A Measurement Plan defines the measurement process. In principle, a certain measure may be collected once or multiple times.


Multiple measurements may be performed at a designated frequency or on an event basis. The number-of-requirements measure, for instance, may be collected after the requirements freeze (once), at the end of each development life cycle phase (time basis), or after each requirements change (event basis).

Additionally, a measurement plan may include suggested implementation guidelines. Data collection tools facilitate collecting, storing, maintaining, and retrieving measurement data. Depending on the type of collected data (qualitative or quantitative), measurement instruments may range from manual, paper-based data collection forms to semi-automated forms, email-triggered collection, and online forms.
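A single measurement plan entry of the kind described above — what is collected, on which trigger (once, time basis, or event basis), with which instrument, and by whom — could be captured in a small structure like the following; the field names and example values are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class PlanEntry:
    measure: str             # what is measured
    trigger: str             # "once", "per-phase" (time basis), or "on-event"
    event: Optional[str]     # event name if trigger == "on-event"
    instrument: str          # e.g., online form, plug-in, static analyzer
    data_provider: str       # role responsible for providing the data

plan: List[PlanEntry] = [
    PlanEntry("number of requirements", "once", None, "requirements tool export", "requirements engineer"),
    PlanEntry("number of requirements", "on-event", "requirements change", "online form", "requirements engineer"),
]

for entry in plan:
    print(entry.measure, entry.trigger)
```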

Fully automated collection may involve using static analyzers of software development products. Other data collection instruments may be stand-alone tools or plug-ins, or may be an integral part of larger software development environments. An important issue to consider during the selection of data collection instruments is their acceptance by data providers. Without such acceptance, data collection usually suffers from missing, faked, or manipulated data. The experience factory (EF) (Basili et al.) provides mechanisms to access and modify (reuse) stored data in order to meet the information needs of a specific project, based upon actual business, software, and measurement objectives.

Data Collection and Validation

Data collection is not limited to gathering the measurement data. Correct implementation of a measurement program requires data validation to assure that later data analysis and interpretation are based upon valid data. As already mentioned, the selection of appropriate people (data providers) and tools is a key determinant of valid data.

Therefore, the purposes of the measurement should be clearly communicated to the affected personnel, and data collection should be automated whenever possible. Besides such preventive actions, a postmortem analysis of measurement outputs might be used to identify potentially invalid data. For that purpose, basic descriptive statistics or simple consistency checks might be used. Descriptive statistics might be used to identify potentially invalid data such as outliers.
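A postmortem validity check of this kind can be as simple as flagging missing values and extreme outliers. The sketch below uses plain Python and an interquartile-range rule, which is one common convention rather than anything the source prescribes; the effort data is invented.

```python
from statistics import quantiles

def flag_suspect_values(values, k: float = 1.5):
    """Return indices of missing values and IQR-based outliers as candidates for manual review."""
    present = [v for v in values if v is not None]
    q1, _, q3 = quantiles(present, n=4)          # quartiles of the observed data
    iqr = q3 - q1
    low, high = q1 - k * iqr, q3 + k * iqr
    return [i for i, v in enumerate(values) if v is None or v < low or v > high]

effort_hours = [12, 14, 13, None, 15, 16, 14, 13, 200]   # hypothetical effort reports; 200 looks implausible
print(flag_suspect_values(effort_hours))                  # -> [3, 8]
```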



Potentially invalid data may then be either excluded from the analysis or explicitly considered during analysis and interpretation. Consistency checks may, for instance, insist on collecting the same data with various instruments. Such an approach would, however, require collecting redundant data. On the other hand, consistency checks might be based on common sense and expectations regarding the measured output. If measured data does not match intuitive expectations, the underlying reasons should be investigated. Finally, the data collection process should be controlled with respect to the measurement plan.

This includes tracking that the data is collected according to the definitions and that it is provided according to schedule.

Data Analysis, Visualization, and Interpretation

Data analysis and visualization typically consist of first looking at descriptive statistics and then applying quality models. Yet, various analysis techniques require specific characteristics of the input data. Statistical methods, for instance, usually assume completeness and a normal distribution of the data. On the other hand, the computational complexity of some machine learning techniques limits their practical applicability to large data sets.

Moreover, certain analysis methods require input data to be measured on specific measurement scales. Consequently, applying certain analysis methods requires, in practice, prior data preprocessing.
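Before applying a statistical method that assumes complete, normally distributed input, a quick preprocessing gate like the following can help. It assumes scipy is available and uses the Shapiro-Wilk test with a conventional 0.05 threshold; both choices are assumptions, not requirements of the measurement approach.

```python
from scipy import stats

def ready_for_parametric_analysis(values, alpha: float = 0.05) -> bool:
    """Check completeness and (roughly) normality before running a parametric analysis."""
    if any(v is None for v in values):       # incomplete data: impute or exclude first
        return False
    _, p_value = stats.shapiro(values)       # Shapiro-Wilk normality test
    return p_value > alpha                   # fail to reject normality at the chosen level

print(ready_for_parametric_analysis([10.1, 9.8, 10.4, 10.0, 9.9, 10.2]))
```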

As the first step of the analysis, descriptive statistics are usually employed to understand the nature of the analyzed data, represented by such characteristics as range, central tendency, and dispersion. Example descriptive statistics include the mean, median, and standard deviation of the data. Next, inferential analysis is used to analyze dependencies between measures and to interpret them with respect to the achievement of goals at various abstraction levels. The impact of measures and goals at lower levels of abstraction on measures and goals at higher levels is evaluated. During data analysis, the context information should be considered in order to interpret the results properly.
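The two steps named here — descriptive statistics first, then a look at dependencies between a lower-level and a higher-level measure — might look like this in plain Python (the correlation helper requires Python 3.10+). The data series and the use of Pearson correlation are illustrative assumptions.

```python
from statistics import mean, median, stdev, correlation

defect_density   = [2.1, 1.8, 2.5, 1.2, 0.9, 1.5]   # hypothetical lower-level measure per release
customer_reports = [14, 11, 17, 8, 6, 10]           # hypothetical higher-level measure per release

# Descriptive statistics: central tendency and dispersion
print(mean(defect_density), median(defect_density), stdev(defect_density))

# Simple dependency analysis across abstraction levels (Pearson correlation)
print(correlation(defect_density, customer_reports))
```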

Context, represented by a set of context factors, characterizes important attributes of the setting in which the measurement objects (products, processes, etc.) are embedded. Context specification is an important part of defining goals and deriving measures, since it prevents drawing wrong conclusions from the analysis. Visualization supports the analysis of both raw measurement data and the results of data analysis, provides a basis for interpreting the results of data analysis, and supports understanding complex data characteristics and interactions.

Numerous graphical notations exist for presenting data. The selection of a certain notation depends on the purpose of the visualization. The most common graphs include box plots, pie charts, histograms, bar charts, and scatter plots.
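A minimal visualization of measurement data, assuming matplotlib is available, might combine a histogram and a box plot from the list above; the dataset is invented.

```python
import matplotlib.pyplot as plt

review_effort_hours = [3.5, 4.0, 2.8, 5.1, 4.4, 3.9, 6.0, 4.2]   # hypothetical per-review effort

fig, (ax_hist, ax_box) = plt.subplots(1, 2, figsize=(8, 3))
ax_hist.hist(review_effort_hours, bins=5)    # distribution of the measure
ax_hist.set_title("Histogram")
ax_box.boxplot(review_effort_hours)          # range, central tendency, and dispersion at a glance
ax_box.set_title("Box plot")
plt.tight_layout()
plt.show()
```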

Management of a Measurement Program

As with any other engineering process, in order to be effective, measurement must be integrated within project and organizational processes and must be managed appropriately. Typical management activities include (1) planning measurement, (2) performing measurement, and (3) evolving measurement. Planning measurement involves establishing commitment at the management and project levels, defining a measurement program and integrating it into existing technical and management processes, as well as setting up organizational structures for executing the measurement program. Performing measurement covers the data collection, validation, analysis, visualization, and interpretation activities described above. Finally, evolving measurement applies monitoring and improvement methods to the measurement process itself. The capability of the established measurement program is, for example, evaluated with respect to costs and benefits.

Iterative control and improvement of the measurement program is supposed to keep it tailored to the changing characteristics of a particular application context.

Practical Example

The example presented in this section is a hypothetical project, but it builds upon real measurement project experience gathered over many years.