Measuring Data Quality for Ongoing Improvement

A Data Quality Assessment Framework


Author: Laura Sebastian-Coleman

Publisher: Newnes

ISBN: 0123977541

Category: Computers

Page: 376

View: 6439

The Data Quality Assessment Framework shows you how to measure and monitor data quality, ensuring quality over time. You’ll start with general concepts of measurement and work your way through a detailed framework of more than three dozen measurement types related to five objective dimensions of quality: completeness, timeliness, consistency, validity, and integrity. Ongoing measurement, rather than one-time activities, will help your organization reach a new level of data quality. This plain-language approach to measuring data can be understood by both business and IT, and it provides practical guidance on how to apply the DQAF within any organization, enabling you to prioritize measurements and effectively report on results. Strategies for using data measurement to govern and improve the quality of data, and guidelines for applying the framework within a data asset, are included. You’ll come away able to prioritize which measurement types to implement, knowing where to place them in a data flow and how frequently to measure. Common conceptual models for defining and storing data quality results for purposes of trend analysis are also included, as are generic business requirements for ongoing measuring and monitoring, including the calculations and comparisons that make the measurements meaningful and help you understand trends and detect anomalies. • Demonstrates how to leverage a technology-independent data quality measurement framework for your specific business priorities and data quality challenges • Enables discussions between business and IT with a non-technical vocabulary for data quality measurement • Describes how to measure data quality on an ongoing basis with generic measurement types that can be applied to any situation

Data Quality Assessment


Author: Arkady Maydanchik

Publisher: Technics Publications

ISBN: 163462047X

Category: Computers

Page: 336

View: 4626

Imagine a group of prehistoric hunters armed with stone-tipped spears. Their primitive weapons made hunting large animals, such as mammoths, dangerous work. Over time, however, a new breed of hunters developed. They would stretch the skin of a previously killed mammoth on the wall and throw their spears, while observing which spear, thrown from which angle and distance, penetrated the skin the best. The data gathered helped them make better spears and develop better hunting strategies. Quality data is the key to any advancement, whether it’s from the Stone Age to the Bronze Age or from the Information Age to whatever age comes next. The success of corporations and government institutions largely depends on the efficiency with which they can collect, organize, and utilize data about products, customers, competitors, and employees. Fortunately, improving your data quality doesn’t have to be such a mammoth task. DATA QUALITY ASSESSMENT is a must-read for anyone who needs to understand, correct, or prevent data quality issues in their organization. Skipping theory and focusing purely on what is practical and what works, this text contains a proven approach to identifying, warehousing, and analyzing data errors – the first step in any data quality program. Master techniques in: • Data profiling and gathering metadata • Identifying, designing, and implementing data quality rules • Organizing rule and error catalogues • Ensuring accuracy and completeness of the data quality assessment • Constructing the dimensional data quality scorecard • Executing a recurrent data quality assessment This is one of those books that marks a milestone in the evolution of a discipline. Arkady's insights and techniques fuel the transition of data quality management from art to science, from crafting to engineering. From deep experience, with thoughtful structure, and with engaging style, Arkady brings the discipline of data quality to practitioners.
David Wells, Director of Education, Data Warehousing Institute

Toward a Framework for Assessing Data Quality


Author: Carol S. Carson

Publisher: International Monetary Fund


Category: Economic indicators

Page: 69

View: 5589

This paper describes work in progress on data quality, an important element of greater transparency in economic policy and financial stability. Data quality is being dealt with systematically by the IMF through the development of data quality assessment frameworks complementing the IMF’s Special Data Dissemination Standard (SDDS) and General Data Dissemination System (GDDS). The aim is to improve the quality of data provided by countries to the IMF and to assess evenhandedly the quality of countries’ data in Reports on the Observance of Standards and Codes. The frameworks bring together best practices, including those of the United Nations Fundamental Principles of Official Statistics.

Data Quality Assessment Methods for the Eastern Range 915 MHz Wind Profiler Network


Author: National Aeronautics and Space Administration (NASA)

Publisher: Createspace Independent Publishing Platform

ISBN: 9781722141646


Page: 50

View: 3852

The Eastern Range installed a network of five 915 MHz Doppler Radar Wind Profilers with Radio Acoustic Sounding Systems in the Cape Canaveral Air Station/Kennedy Space Center area to provide three-dimensional wind speed and direction and virtual temperature estimates in the boundary layer. The Applied Meteorology Unit, staffed by ENSCO, Inc., was tasked by the 45th Weather Squadron, the Spaceflight Meteorology Group, and the National Weather Service in Melbourne, Florida, to investigate methods that will help forecasters assess profiler network data quality when developing forecasts and warnings for critical ground, launch, and landing operations. Four routines were evaluated in this study: a consensus time period check, a precipitation contamination check, a median filter, and the Weber-Wuertz (WW) algorithm. No routine was able to effectively flag suspect data when used by itself. Therefore, the routines were used in different combinations. An evaluation of all possible combinations revealed two that provided the best results. The precipitation contamination and consensus time routines were used in both combinations. The median filter or WW was used as the final routine in the combinations to flag all other suspect data points. Lambert, Winifred C. and Taylor, Gregory E. Kennedy Space Center NAS10-96018...

Uncertainty and Data Quality in Exposure Assessment


Author: World Health Organization

Publisher: World Health Organization

ISBN: 9241563761

Category: History

Page: 158

View: 9470

Assessment of human exposure to chemicals is a critical input to risk assessment and ultimately to decisions about control of chemicals. This two-part publication aims to improve the quality of information available to decision-makers and its communication. Part one sets out ten principles for characterizing and communicating uncertainty in exposure assessment. A tiered approach to the evaluation of uncertainties using both qualitative (simple) and quantitative (more complex) methods is described. Different sources of uncertainty are identified, and guidance is provided on selecting the appropriate approach to uncertainty analysis as dictated by the objectives of the assessment and the information needs of decision-makers and stakeholders. Part two addresses the quality of data used in exposure assessment and sets out four basic hallmarks of data quality: appropriateness, accuracy, integrity, and transparency. These hallmarks provide a common vocabulary and set of qualitative criteria for use in the design, evaluation, and use of exposure assessments to support decisions. This publication is intended for exposure assessors, risk assessors, and decision-makers.

Elements of Spatial Data Quality


Author: S.C. Guptill,J.L. Morrison

Publisher: Elsevier

ISBN: 1483287947

Category: Science

Page: 202

View: 5825

Elements of Spatial Data Quality outlines the need for, and suggests potential categories for, the content of a comprehensive statement of data quality that must be embedded in the metadata that accompanies the transfer of a digital spatial data file or is available in a separate metadata catalog. Members of the International Cartographic Association's Commission on Spatial Data Quality have identified seven elements of data quality: positional accuracy, attribute accuracy, completeness, logical consistency, lineage, semantic accuracy, and temporal information. In the book the authors describe: components of each data quality element, possible metrics that can be used to measure the quality of each criterion, possible testing and rating schemes, and how these parameters might differ from a producer or user point of view. Finally, no volume of this nature would be complete without a chapter devoted to necessary future research in this subject area. The chapter points out areas in need of further investigation and speculates about the use and transfer of digital spatial data in tomorrow's electronic world and about developments in presenting specified data quality information in a visualization. This book will be of interest to all of those individuals involved in geographical information systems and spatial data handling.

The Practitioner's Guide to Data Quality Improvement


Author: David Loshin

Publisher: Elsevier

ISBN: 9780080920344

Category: Computers

Page: 432

View: 4211

The Practitioner's Guide to Data Quality Improvement offers a comprehensive look at data quality for business and IT, encompassing people, process, and technology. It shares the fundamentals for understanding the impacts of poor data quality, and guides practitioners and managers alike in socializing, gaining sponsorship for, planning, and establishing a data quality program. It demonstrates how to institute and run a data quality program, from first thoughts and justifications to maintenance and ongoing metrics. It includes an in-depth look at the use of data quality tools, including business case templates, and tools for analysis, reporting, and strategic planning. This book is recommended for data management practitioners, including database analysts, information analysts, data administrators, data architects, enterprise architects, data warehouse engineers, and systems analysts, and their managers.

Executing Data Quality Projects

Ten Steps to Quality Data and Trusted Information (TM)


Author: Danette McGilvray

Publisher: Elsevier

ISBN: 0080558399

Category: Computers

Page: 352

View: 7393

Information is currency. Recent studies show that data quality problems are costing businesses billions of dollars each year, with poor data linked to waste and inefficiency, damaged credibility among customers and suppliers, and an organizational inability to make sound decisions. In this important and timely new book, Danette McGilvray presents her “Ten Steps” approach to information quality, a proven method for both understanding and creating information quality in the enterprise. Her trademarked approach—in which she has trained Fortune 500 clients and hundreds of workshop attendees—applies to all types of data and to all types of organizations. * Includes numerous templates, detailed examples, and practical advice for executing every step of the “Ten Steps” approach. * Allows for quick reference with an easy-to-use format highlighting key concepts and definitions, important checkpoints, communication activities, and best practices. * A companion Web site includes links to numerous data quality resources, including many of the planning and information-gathering templates featured in the text, quick summaries of key ideas from the “Ten Steps” methodology, and other tools and information available online.

Forest inventory and analysis national data quality assessment report for 2000 to 2003


Author: James E. Pollard,Rocky Mountain Research Station (Fort Collins, Colo.)

Publisher: N.A


Category: Forest surveys

Page: 43

View: 2885

The Forest Inventory and Analysis program (FIA) is the key USDA Forest Service (USFS) program that provides the information needed to assess the status and trends in the environmental quality of the Nation's forests. The goal of the FIA Quality Assurance (QA) program is to provide a framework to assure the production of complete, accurate, and unbiased forest information of known quality. Specific Measurement Quality Objectives (MQO) for precision are designed to provide a window of performance that we are striving to achieve for every field measurement. These data quality goals were developed from knowledge of measurement processes in forestry and forest ecology, as well as the program needs of FIA. This report is a national summary and compilation of MQO analyses by regional personnel and the National QA Advisor. The efficacy of the MQO, as well as the measurement uncertainty associated with a given field measurement, can be tested by comparing data from blind check plots where, in addition to the field measurements of the standard FIA crew, a second QA measurement of the plot was taken by a crew without knowledge of the first crew's results. These QA data were collected between 2000 and 2003 and analyzed for measurement precision between FIA crews. The charge of this task team was to use the blind-check data to assess the FIA program's ability to meet data quality goals as stated by the MQO. The results presented indicate that the repeatability was within project goals for a wide range of measurements across a variety of forest and nonforest environments. However, there were some variables that displayed noncompliance with MQO goals. In general, there were two types of noncompliance: the first is where all the regions were below the MQO standard, and the second is where a subset of the regions was below the MQO standards or was substantially different from the remaining regions. Results for each regional analysis are presented in appendix tables.
In the course of the study, the task team discovered that there were difficulties in analyzing seedling species and seedling count variables for MQO compliance, and recommends further study of the issue. The task team also addresses trees missed or added and recommends additional study of that issue. Lastly, the team points out that traditional MQO analysis of the disturbance and treatment variables may not be adequate. Some attributes where regional compliance rates are dissimilar suggest that regional characteristics (environmental variables such as forest type, physiographic class, and forest fragmentation) may have an impact on the ability to obtain consistent measurements. Additionally, differences in data collection protocols may cause differences in compliance rates. For example, a particular variable may be measured with a calibrated instrument in one region, while it is ocularly estimated in another region.

Data Quality Assessment: Statistical Methods for Practitioners, EPA QA/G-9S - Scholar's Choice Edition


Author: U S Environmental Protection Agency

Publisher: Scholar's Choice

ISBN: 9781298045478


Page: 202

View: 2969

This work has been selected by scholars as being culturally important, and is part of the knowledge base of civilization as we know it. This work was reproduced from the original artifact, and remains as true to the original work as possible. Therefore, you will see the original copyright references, library stamps (as most of these works have been housed in our most important libraries around the world), and other notations in the work. This work is in the public domain in the United States of America, and possibly other nations. Within the United States, you may freely copy and distribute this work, as no entity (individual or corporate) has a copyright on the body of the work. As a reproduction of a historical artifact, this work may contain missing or blurred pages, poor pictures, errant marks, etc. Scholars believe, and we concur, that this work is important enough to be preserved, reproduced, and made generally available to the public. We appreciate your support of the preservation process, and thank you for being an important part of keeping this knowledge alive and relevant.