Executing Data Quality Projects

Ten Steps to Quality Data and Trusted Information (TM)

Author: Danette McGilvray

Publisher: Elsevier

ISBN: 0080558399

Category: Computers

Page: 352

Information is currency. Recent studies show that data quality problems are costing businesses billions of dollars each year, with poor data linked to waste and inefficiency, damaged credibility among customers and suppliers, and an organizational inability to make sound decisions. In this important and timely book, Danette McGilvray presents her "Ten Steps" approach to information quality, a proven method for both understanding and creating information quality in the enterprise. Her trademarked approach, in which she has trained Fortune 500 clients and hundreds of workshop attendees, applies to all types of data and to all types of organizations. * Includes numerous templates, detailed examples, and practical advice for executing every step of the "Ten Steps" approach. * Allows for quick reference with an easy-to-use format highlighting key concepts and definitions, important checkpoints, communication activities, and best practices. * A companion Web site includes links to numerous data quality resources, including many of the planning and information-gathering templates featured in the text, quick summaries of key ideas from the Ten Steps methodology, and other tools and information available online.

Executing Data Quality Projects

Ten Steps to Quality Data and Trusted Information (TM)

Author: Danette McGilvray

Publisher: Morgan Kaufmann

ISBN: 0080951589

Category: Computers

Page: 352

Information is currency. In today's world of instant global communication and rapidly changing trends, up-to-date and reliable information is essential to effective competition. Recent studies show that data quality problems are costing businesses billions of dollars each year, with poor data linked to waste and inefficiency, damaged credibility among customers and suppliers, and an organizational inability to make sound decisions. In Executing Data Quality Projects: Ten Steps to Quality Data and Trusted Information, Danette McGilvray presents a systematic, proven approach to improving and creating data and information quality within the enterprise. She describes a methodology that combines a conceptual framework for understanding information quality with the tools, techniques, and instructions for improving and creating information quality. Her trademarked "Ten Steps" approach applies to all types of data and to all types of organizations. * Includes numerous templates, detailed examples, and practical advice for executing every step of the Ten Steps approach. * Allows for quick reference with an easy-to-use format highlighting key concepts and definitions, important checkpoints, communication activities, and best practices. * A companion Web site includes links to numerous data quality resources, including many of the planning and information-gathering templates featured in the text, quick summaries of key ideas from the Ten Steps methodology, and other tools and information available online.

Executing Data Quality Projects

Ten Steps to Quality Data and Trusted Information

Author: Danette McGilvray

Publisher: Morgan Kaufmann

ISBN: 9780123743695

Category: Business & Economics

Page: 325

Introduces a systematic, effective approach to enhancing and creating data and information quality that integrates a conceptual framework with essential tools, techniques, and instructions, accompanied by helpful templates, real-world examples, and advice, as well as highlighted definitions, key concepts, checkpoints, warnings, communication activities, and best practices. Original. (Intermediate)

The Practitioner's Guide to Data Quality Improvement

Author: David Loshin

Publisher: Elsevier

ISBN: 9780080920344

Category: Computers

Page: 432

The Practitioner's Guide to Data Quality Improvement offers a comprehensive look at data quality for business and IT, encompassing people, process, and technology. It shares the fundamentals for understanding the impacts of poor data quality, and guides practitioners and managers alike in socializing, gaining sponsorship for, planning, and establishing a data quality program. It demonstrates how to institute and run a data quality program, from first thoughts and justifications to maintenance and ongoing metrics. It includes an in-depth look at the use of data quality tools, including business case templates, and tools for analysis, reporting, and strategic planning. This book is recommended for data management practitioners, including database analysts, information analysts, data administrators, data architects, enterprise architects, data warehouse engineers, and systems analysts, and their managers.

Data Quality

The Accuracy Dimension

Author: Jack E. Olson

Publisher: Elsevier

ISBN: 9780080503691

Category: Computers

Page: 300

Data Quality: The Accuracy Dimension is about assessing the quality of corporate data and improving its accuracy using the data profiling method. Corporate data is increasingly important as companies continue to find new ways to use it. Likewise, improving the accuracy of data in information systems is fast becoming a major goal as companies realize how much it affects their bottom line. Data profiling is a new technology that supports and enhances the accuracy of databases throughout major IT shops. Jack Olson explains data profiling and shows how it fits into the larger picture of data quality. * Provides an accessible, enjoyable introduction to the subject of data accuracy, peppered with real-world anecdotes. * Provides a framework for data profiling with a discussion of analytical tools appropriate for assessing data accuracy. * Is written by one of the original developers of data profiling technology. * Is a must-read for any data management staff, IT management staff, and CIOs of companies with data assets.
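The profiling process described above can be sketched in a few lines. A minimal, hypothetical example (the table, column names, and chosen statistics are illustrative, not the book's own method):

```python
def profile_column(values):
    """Compute simple profile statistics for one column of data."""
    non_null = [v for v in values if v not in (None, "")]
    return {
        "count": len(values),                      # total rows
        "nulls": len(values) - len(non_null),      # missing values
        "distinct": len(set(non_null)),            # cardinality
        "min": min(non_null) if non_null else None,
        "max": max(non_null) if non_null else None,
    }

def profile_table(rows, columns):
    """Profile every column of a table given as a list of dicts."""
    return {c: profile_column([r.get(c) for r in rows]) for c in columns}

# Tiny illustrative data set
rows = [
    {"id": 1, "state": "CA"},
    {"id": 2, "state": None},
    {"id": 3, "state": "CA"},
]
report = profile_table(rows, ["id", "state"])
```

Real profiling tools compute far richer statistics (value patterns, distributions, cross-column dependencies), but the principle is the same: derive facts about the data from the data itself.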

Data Quality Assessment

Author: Arkady Maydanchik

Publisher: Technics Publications

ISBN: 163462047X

Category: Computers

Page: 336

Imagine a group of prehistoric hunters armed with stone-tipped spears. Their primitive weapons made hunting large animals, such as mammoths, dangerous work. Over time, however, a new breed of hunters developed. They would stretch the skin of a previously killed mammoth on the wall and throw their spears, while observing which spear, thrown from which angle and distance, penetrated the skin the best. The data gathered helped them make better spears and develop better hunting strategies. Quality data is the key to any advancement, whether from the Stone Age to the Bronze Age or from the Information Age to whatever age comes next. The success of corporations and government institutions largely depends on the efficiency with which they can collect, organize, and utilize data about products, customers, competitors, and employees. Fortunately, improving your data quality doesn't have to be such a mammoth task. Data Quality Assessment is a must-read for anyone who needs to understand, correct, or prevent data quality issues in their organization. Skipping theory and focusing purely on what is practical and what works, this text contains a proven approach to identifying, warehousing, and analyzing data errors, the first step in any data quality program. Master techniques in: • Data profiling and gathering metadata • Identifying, designing, and implementing data quality rules • Organizing rule and error catalogues • Ensuring accuracy and completeness of the data quality assessment • Constructing the dimensional data quality scorecard • Executing a recurrent data quality assessment
"This is one of those books that marks a milestone in the evolution of a discipline. Arkady's insights and techniques fuel the transition of data quality management from art to science, from crafting to engineering. From deep experience, with thoughtful structure, and with engaging style, Arkady brings the discipline of data quality to practitioners." - David Wells, Director of Education, Data Warehousing Institute
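The rule-and-error-catalogue idea above can be illustrated with a small sketch; the rules and field names here are invented for the example, not taken from the book:

```python
# A data quality rule is a named predicate over a record; violations are
# logged to an error catalogue keyed by rule name.
rules = {
    "id_not_null": lambda rec: rec.get("id") is not None,
    "age_in_range": lambda rec: rec.get("age") is None or 0 <= rec["age"] <= 120,
}

def assess(records):
    """Run every rule over every record; return an error catalogue."""
    catalogue = {name: [] for name in rules}
    for i, rec in enumerate(records):
        for name, rule in rules.items():
            if not rule(rec):
                catalogue[name].append(i)   # record index of each violation
    return catalogue

data = [{"id": 1, "age": 34}, {"id": None, "age": 200}]
errors = assess(data)
```

Cataloguing violations by rule, rather than fixing them on sight, is what makes the assessment repeatable and lets error counts be tracked over successive runs.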

Measuring Data Quality for Ongoing Improvement

A Data Quality Assessment Framework

Author: Laura Sebastian-Coleman

Publisher: Newnes

ISBN: 0123977541

Category: Computers

Page: 376

The Data Quality Assessment Framework shows you how to measure and monitor data quality, ensuring quality over time. You'll start with general concepts of measurement and work your way through a detailed framework of more than three dozen measurement types related to five objective dimensions of quality: completeness, timeliness, consistency, validity, and integrity. Ongoing measurement, rather than one-time activities, will help your organization reach a new level of data quality. This plain-language approach to measuring data can be understood by both business and IT, and it provides practical guidance on how to apply the DQAF within any organization, enabling you to prioritize measurements and effectively report on results. Strategies for using data measurement to govern and improve the quality of data, and guidelines for applying the framework within a data asset, are included. You'll come away able to prioritize which measurement types to implement, knowing where to place them in a data flow and how frequently to measure. Common conceptual models for defining and storing data quality results for purposes of trend analysis are also included, as well as generic business requirements for ongoing measuring and monitoring, including calculations and comparisons that make the measurements meaningful and help you understand trends and detect anomalies. * Demonstrates how to leverage a technology-independent data quality measurement framework for your specific business priorities and data quality challenges. * Enables discussions between business and IT with a non-technical vocabulary for data quality measurement. * Describes how to measure data quality on an ongoing basis with generic measurement types that can be applied to any situation.
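As a rough illustration, two of the five dimensions named above, completeness and validity, might be measured like this; the field values and the valid-value domain are assumptions for the example:

```python
# Illustrative valid domain for a "state" field (an assumption, not DQAF data)
VALID_STATES = {"CA", "NY", "TX"}

def completeness(values):
    """Fraction of values that are populated at all."""
    filled = sum(1 for v in values if v not in (None, ""))
    return filled / len(values)

def validity(values, valid_set):
    """Fraction of populated values that fall within the valid domain."""
    populated = [v for v in values if v not in (None, "")]
    if not populated:
        return 1.0
    return sum(1 for v in populated if v in valid_set) / len(populated)

states = ["CA", "NY", "", "ZZ"]   # one empty, one out-of-domain value
c = completeness(states)           # 3 of 4 populated
v = validity(states, VALID_STATES) # 2 of 3 populated values are valid
```

Measuring each dimension separately, as the framework suggests, keeps the results interpretable: a column can be fully populated yet largely invalid, and the two scores distinguish those situations.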

Enterprise Knowledge Management

The Data Quality Approach

Author: David Loshin

Publisher: Morgan Kaufmann

ISBN: 9780124558403

Category: Business & Economics

Page: 493

This volume presents a methodology for defining, measuring and improving data quality. It lays out an economic framework for understanding the value of data quality, then outlines data quality rules and domain- and mapping-based approaches to consolidating enterprise knowledge.

Practical Data Migration

Author: Johny Morris

Publisher: BCS, The Chartered Institute

ISBN: 1906124841

Category: Database management

Page: 266

This book is for executives and practitioners tasked with the movement of data from old systems to a new repository. It uses a series of steps, developed in real-life situations, that will take the reader from an empty new system to one that is working and backed by the user population. Recent figures suggest that nearly 40% of data migration projects run over time, run over budget, or fail entirely. Using this proven methodology will vastly increase the chances of achieving a successful migration.

Data Quality and Record Linkage Techniques

Author: Thomas N. Herzog,Fritz J. Scheuren,William E. Winkler

Publisher: Springer Science & Business Media

ISBN: 9780387695051

Category: Computers

Page: 234

This book offers a practical understanding of issues involved in improving data quality through editing, imputation, and record linkage. The first part of the book deals with methods and models, focusing on the Fellegi-Holt edit-imputation model, the Little-Rubin multiple-imputation scheme, and the Fellegi-Sunter record linkage model. The second part presents case studies in which these techniques are applied in a variety of areas, including mortgage guarantee insurance, medical, biomedical, highway safety, and social insurance as well as the construction of list frames and administrative lists. This book offers a mixture of practical advice, mathematical rigor, management insight and philosophy.
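The Fellegi-Sunter model mentioned above scores candidate record pairs by summing log-likelihood weights over field comparisons, where m is the probability a field agrees among true matches and u the probability it agrees among non-matches. A hedged sketch (the fields and the m/u values below are invented for illustration):

```python
import math

# field: (m, u) -- illustrative probabilities, not estimated from data
FIELDS = {
    "surname":  (0.95, 0.01),
    "zip_code": (0.90, 0.05),
}

def match_weight(rec_a, rec_b):
    """Sum log2 likelihood ratios over the compared fields."""
    weight = 0.0
    for field, (m, u) in FIELDS.items():
        if rec_a.get(field) == rec_b.get(field):
            weight += math.log2(m / u)               # agreement weight
        else:
            weight += math.log2((1 - m) / (1 - u))   # disagreement weight
    return weight

a = {"surname": "Smith", "zip_code": "94101"}
b = {"surname": "Smith", "zip_code": "94101"}
w = match_weight(a, b)   # positive: evidence for a match
```

In practice the m and u probabilities are estimated from the data (often via EM), and pairs are classified as matches, non-matches, or clerical-review cases by comparing the total weight against two thresholds.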

Data Quality for Analytics Using SAS

Author: Gerhard Svolba

Publisher: SAS Institute

ISBN: 162959802X

Category: Mathematics

Page: 356

Analytics offers many capabilities and options to measure and improve data quality, and SAS is perfectly suited to these tasks. Gerhard Svolba's Data Quality for Analytics Using SAS focuses on selecting the right data sources and ensuring data quantity, relevancy, and completeness. The book is made up of three parts. The first part, which is conceptual, defines data quality and contains text, definitions, explanations, and examples. The second part shows how the data quality status can be profiled and the ways that data quality can be improved with analytical methods. The final part details the consequences of poor data quality for predictive modeling and time series forecasting.

The Data Governance Imperative

Author: Steve Sarsfield

Publisher: IT Governance Publishing

ISBN: 1849280134

Category: Business

Page: 161

This practical book covers both strategies and tactics around managing a data governance initiative to help make the most of your data.

How to Build a Business Rules Engine

Extending Application Functionality Through Metadata Engineering

Author: Malcolm Chisholm

Publisher: Morgan Kaufmann

ISBN: 9781558609181

Category: Computers

Page: 484

* This is the only book that demonstrates how to develop a business rules engine. Covers user requirements, data modeling, metadata, and more. * A sample application is used throughout the book to illustrate concepts. The code for the sample application is available online at http://www.refdataportal.com. * Includes conceptual overview chapters suitable for management-level readers, including general introduction, business justification, development and implementation considerations, and more.
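The metadata-driven idea, rules stored as data and interpreted by a small engine, can be sketched as follows; the rule schema here is invented for illustration and is far simpler than the book's design:

```python
# Operators the engine understands
OPS = {
    "eq": lambda a, b: a == b,
    "gt": lambda a, b: a > b,
    "in": lambda a, b: a in b,
}

# Rule metadata: each rule names a field, an operator, a comparison value,
# and the action to take when the condition holds. Changing behavior means
# editing this data, not the engine code.
RULES = [
    {"field": "order_total", "op": "gt", "value": 1000, "action": "require_approval"},
    {"field": "country", "op": "in", "value": {"US", "CA"}, "action": "standard_shipping"},
]

def evaluate(record):
    """Return the actions fired by the rule metadata for one record."""
    return [r["action"] for r in RULES
            if OPS[r["op"]](record.get(r["field"]), r["value"])]

actions = evaluate({"order_total": 1500, "country": "US"})
```

The payoff of this arrangement, as the book argues at length, is that business users can change rules without redeploying application code, since the rules live in metadata tables rather than in program logic.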

The Data and Analytics Playbook

Proven Methods for Governed Data and Analytic Quality

Author: Lowell Fryman,Gregory Lampshire,Dan Meers

Publisher: Morgan Kaufmann

ISBN: 0128025476

Category: Computers

Page: 292

The Data and Analytics Playbook: Proven Methods for Governed Data and Analytic Quality explores the way in which data continues to dominate budgets, along with the varying efforts made across a variety of business enablement projects, including applications, web and mobile computing, big data analytics, and traditional data integration. The book teaches readers how to use proven methods and accelerators to break through data obstacles and provide faster, higher-quality delivery of mission-critical programs. Drawing upon years of practical experience, and using numerous examples and an easy-to-understand playbook, Lowell Fryman, Gregory Lampshire, and Dan Meers discuss a simple, proven approach to the execution of multiple data-oriented activities. They present a clear set of methods to provide reliable governance, controls, risk, and exposure management for enterprise data and the programs that rely upon it, and they discuss a cost-effective approach to providing sustainable governance and quality outcomes that enhance project delivery while also ensuring ongoing controls. Example activities, templates, outputs, resources, and roles are explored, along with different organizational models in common use today and the ways they can be mapped to leverage playbook data governance throughout the organization. * Provides a mature and proven playbook approach (methodology) to enabling data governance that supports agile implementation. * Features specific examples of current industry challenges in enterprise risk management, including anti-money laundering and fraud prevention. * Describes business benefit measures and funding approaches using exposure-based cost models that augment risk models for cost avoidance analysis, and accelerated delivery approaches using data integration sprints for application, integration, and information delivery success.

Making Enterprise Information Management (EIM) Work for Business

A Guide to Understanding Information as an Asset

Author: John Ladley

Publisher: Morgan Kaufmann

ISBN: 0123756960

Category: Computers

Page: 552

Making Enterprise Information Management (EIM) Work for Business: A Guide to Understanding Information as an Asset provides a comprehensive discussion of EIM. It endeavors to explain information asset management and place it into a pragmatic, focused, and relevant light. The book is organized into two parts. Part 1 provides the material required to sell, understand, and validate the EIM program. It explains concepts such as treating information, data, and content as true assets; information management maturity; and how EIM affects organizations. It also reviews the basic process that builds and maintains an EIM program, including two case studies that provide a bird's-eye view of the products of the EIM program. Part 2 deals with the methods and artifacts necessary to maintain EIM and have the business manage information. Along with overviews of information asset concepts and the EIM process, it discusses how to initiate an EIM program and the necessary building blocks to manage changes to managed data and content. * Organizes information modularly, so you can delve directly into the topics that you need to understand. * Based in reality, with practical case studies and a focus on getting the job done, even when confronted with tight budgets, resistant stakeholders, and security and compliance issues. * Includes applicable templates, examples, and advice for executing every step of an EIM program.

Building and Managing the Meta Data Repository

A Full Lifecycle Guide

Author: David Marco

Publisher: Wiley

ISBN: 9780471355236

Category: Computers

Page: 416

"This is the first book to tackle the subject of meta data in data warehousing, and the results are spectacular... David Marco has written about the subject in a way that is approachable, practical, and immediately useful. Building and Managing the Meta Data Repository: A Full Lifecycle Guide is an excellent resource for any IT professional." - Steve Murchie, Group Product Manager, Microsoft Corporation. Meta data repositories can provide your company with tremendous value if they are used properly and if you understand what they can, and can't, do. Written by David Marco, the industry's leading authority on meta data and well-known columnist for DM Review, this book offers all the guidance you'll need for developing, deploying, and managing a meta data repository to gain a competitive advantage. After illustrating the fundamental concepts, Marco shows you how to use meta data to increase your company's revenue and decrease expenses. You'll find a comprehensive look at the major trends affecting the meta data industry, as well as steps on how to build a repository that is flexible enough to adapt to future changes. This vendor-neutral guide also includes complete coverage of meta data sources, standards, and architecture, and it explores the full gamut of practical implementation issues. Taking you step-by-step through the process of implementing a meta data repository, Marco shows you how to: evaluate meta data tools; build the meta data project plan; design a custom meta data architecture; staff a repository team; implement data quality through meta data; create a physical meta data model; and evaluate meta data delivery requirements. The CD-ROM includes a sample implementation project plan, a function and feature checklist of meta data tool requirements, and several physical meta data models to support specific business functions. Visit the companion Web site at www.wiley.com/compbooks/marco

Competing with High Quality Data

Concepts, Tools, and Techniques for Building a Successful Approach to Data Quality

Author: Rajesh Jugulum

Publisher: John Wiley & Sons

ISBN: 111841649X

Category: Business & Economics

Page: 304

Create a competitive advantage with data quality. Data is rapidly becoming the powerhouse of industry, but low-quality data can actually put a company at a disadvantage. To be used effectively, data must accurately reflect the real-world scenario it represents, and it must be in a form that is usable and accessible. Quality data involves asking the right questions, targeting the correct parameters, and having an effective internal management, organization, and access system. It must be relevant, complete, and correct, while falling in line with pervasive regulatory oversight programs. Competing with High Quality Data: Concepts, Tools and Techniques for Building a Successful Approach to Data Quality takes a holistic approach to improving data quality, from collection to usage. Author Rajesh Jugulum is globally recognized as a major voice in the data quality arena, with a high-level background in international corporate finance. In the book, Jugulum provides a roadmap to data quality innovation, covering topics such as: * The four-phase approach to data quality control. * Methodology that produces data sets for different aspects of a business. * Streamlined data quality assessment and issue resolution. * A structured, systematic, disciplined approach to effective data gathering. The book also contains real-world case studies to illustrate how companies across a broad range of sectors have employed data quality systems, whether or not they succeeded, and what lessons were learned. High-quality data increases value throughout the information supply chain, and the benefits extend to the client, employee, and shareholder. Competing with High Quality Data provides the information and guidance necessary to formulate and activate an effective data quality plan today.

Data Quality

Concepts, Methodologies and Techniques

Author: Carlo Batini,Monica Scannapieco

Publisher: Springer Science & Business Media

ISBN: 3540331735

Category: Computers

Page: 282

Poor data quality can seriously hinder or damage the efficiency and effectiveness of organizations and businesses. The growing awareness of such repercussions has led to major public initiatives like the 'Data Quality Act' in the USA and the 'European 2003/98' directive of the European Parliament. Batini and Scannapieco present a comprehensive and systematic introduction to the wide set of issues related to data quality. They start with a detailed description of different data quality dimensions, like accuracy, completeness, and consistency, and their importance in different types of data, like federated data, web data, or time-dependent data, and in different data categories classified according to frequency of change, like stable, long-term, and frequently changing data. The book's extensive description of techniques and methodologies from core data quality research, as well as from related fields like data mining, probability theory, statistical data analysis, and machine learning, gives an excellent overview of the current state of the art. The presentation is completed by a short description and critical comparison of tools and practical methodologies, which will help readers to resolve their own quality problems. This book is an ideal combination of sound theoretical foundations and applicable practical approaches. It is ideally suited for everyone: researchers, students, or professionals interested in a comprehensive overview of data quality issues. In addition, it will serve as the basis for an introductory course or for self-study on this topic.
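One of the dimensions discussed, consistency, can be illustrated with a minimal cross-field check; the city/ZIP reference data below is invented for the example:

```python
# Illustrative reference data (an assumption for this sketch)
ZIP_TO_CITY = {"10001": "New York", "94101": "San Francisco"}

def consistent(record):
    """True when the record's city agrees with its ZIP code.

    Unknown ZIP codes are not flagged; consistency rules can only judge
    combinations the reference data actually covers.
    """
    expected = ZIP_TO_CITY.get(record.get("zip"))
    return expected is None or expected == record.get("city")

records = [
    {"zip": "10001", "city": "New York"},     # fields agree
    {"zip": "94101", "city": "Los Angeles"},  # fields contradict each other
]
flags = [consistent(r) for r in records]
```

Note that both records could individually pass accuracy and completeness checks; only a consistency rule across fields exposes the second one, which is why the book treats the dimensions as distinct.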

Data Strategy

Author: Sid Adelman,Larissa Terpeluk Moss,Majid Abai

Publisher: Addison-Wesley Professional

ISBN: N.A

Category: Computers

Page: 342

Without a data strategy, the people within an organization have no guidelines for making decisions that are absolutely crucial to the success of the IT organization and to the entire organization. The absence of a strategy gives a blank check to those who want to pursue their own agendas, including those who want to try new database management systems, new (often unproven) technologies, and new tools. This type of environment provides no hope for success. A data strategy should result in the development of systems with less risk, higher quality, and reusable assets. This is key to keeping cost and maintenance down, thus running lean and mean. Data Strategy provides a CIO with a rationale to counter arguments for immature technology and data strategies that are inconsistent with existing strategies. This book uses case studies and best practices to give readers the tools they need to create the best strategy for their organization.

Corporate Data Quality

Prerequisite for Successful Business Models

Author: Boris Otto,Hubert Österle

Publisher: epubli

ISBN: 3737575932

Category: Business & Economics

Page: N.A

Data is the foundation of the digital economy. Industry 4.0 and digital services are producing previously unknown quantities of data and making new business models possible. Under these circumstances, data quality has become the critical factor for success. This book presents a holistic approach to data quality management and illustrates it with ten case studies. It is intended for practitioners dealing with data quality management and data governance as well as for scientists. The book was written at the Competence Center Corporate Data Quality (CC CDQ) in close cooperation between researchers from the University of St. Gallen and Fraunhofer IML as well as representatives from more than 20 major corporations. Chapter 1 introduces the role of data in the digitization of business and society and describes the most important business drivers for data quality. It presents the Framework for Corporate Data Quality Management and introduces essential terms and concepts. Chapter 2 presents practical, successful examples of managing the quality of master data, based on ten case studies conducted by the CC CDQ. The case studies cover every aspect of the Framework for Corporate Data Quality Management. Chapter 3 describes selected tools for master data quality management. The three tools are distinguished by their broad applicability (a method for DQM strategy development and DQM maturity assessment) and their high level of innovation (the Corporate Data League). Chapter 4 summarizes the essential factors for successful management of master data quality and provides a checklist of immediate measures that should be addressed right after the start of a data quality management project. This ensures a quick start into the topic and provides initial recommendations for actions to be taken by project and line managers. Please also check out the book's homepage at http://www.cdq-book.org/