BUS610 Study Guide

Site: Saylor Academy
Course: BUS610: Business Intelligence and Analytics
Book: BUS610 Study Guide

Navigating this Study Guide

Study Guide Structure

In this study guide, the sections in each unit (1a., 1b., etc.) are the learning outcomes of that unit. 

Beneath each learning outcome are:

  • questions for you to answer independently;
  • a brief summary of the learning outcome topic; and
  • resources related to the learning outcome. 

At the end of each unit, there is also a list of suggested vocabulary words.

 

How to Use this Study Guide

  1. Review the entire course by reading the learning outcome summaries and suggested resources.
  2. Test your understanding of the course information by answering questions related to each unit learning outcome and defining and memorizing the vocabulary words at the end of each unit.

By clicking on the gear button on the top right of the screen, you can print the study guide. Then you can make notes, highlight, and underline as you work.

Through reviewing and completing the study guide, you should gain a deeper understanding of each learning outcome in the course and be better prepared for the final exam!

Unit 1: Introduction to Business Intelligence

1a. Define business intelligence and outline the major historical eras of business intelligence

  • What is business intelligence, and how does it influence businesses today?
  • How do business intelligence systems differ from other kinds of systems?
  • How do business intelligence systems support managerial decision-making?
Business intelligence (BI) is the set of concepts and methods that support decision-making through information analysis, delivery, and processing. BI encompasses the data analysis, reporting, and query tools that help users derive valuable information from their data.
 
The following processes comprise BI architecture:
 
  1. Data collection – the operational systems that provide the required BI data
  2. Data integration – the ETL (extract-transform-load) functions needed to transfer data from its original source into a format compatible with the other data stored in the data warehouse
  3. Data storage – the data warehouse or data mart in which the data is stored
  4. Data processing – the concepts and tools used to evaluate and analyze the data
  5. Data presentation – the process of preparing and presenting the analysis results
The following figure provides an overview of the individual processes and the components which belong to each process step.
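As a minimal illustration of these five steps, consider the following Python sketch. It assumes a hypothetical sales.csv operational export and uses a local SQLite file as a stand-in for the data warehouse; the table and field names are invented for the example.

    import csv
    import sqlite3

    # Data collection: read records exported from a hypothetical operational system.
    with open("sales.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    # Data integration (ETL): transform source fields into the warehouse format,
    # dropping records that are missing the amount field.
    cleaned = [
        (r["order_id"], r["region"].strip().upper(), float(r["amount"]))
        for r in rows
        if r["amount"]
    ]

    # Data storage: load the transformed rows into the warehouse table.
    con = sqlite3.connect("warehouse.db")
    con.execute("CREATE TABLE IF NOT EXISTS sales (order_id TEXT, region TEXT, amount REAL)")
    con.executemany("INSERT INTO sales VALUES (?, ?, ?)", cleaned)
    con.commit()

    # Data processing and presentation: a simple query summarizing sales by region.
    for region, total in con.execute("SELECT region, SUM(amount) FROM sales GROUP BY region"):
        print(f"{region}: {total:.2f}")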
 


Business intelligence (BI) combines analytics, data warehousing, data mining, visualization, and data infrastructure to help organizations make effective data-driven decisions. This is a different focus than information management and data processing. Information management generally includes the management and reporting of transactions and the operational systems necessary to operate the business, which differs from BI's focus on supporting decision-making. Data processing, as generally defined, involves the use of relational database systems and transaction processing, perhaps with some use of SQL for queries and reports.
 
Making effective use of these capabilities requires a thoughtful process to identify which data is most relevant to any given decision problem.
 
Since the advent of data-driven decision-making in the 1950s, business intelligence has always striven to collect and analyze the largest amount of data possible. The focus of BI is on the data needed to make decisions; this is not necessarily only the most current data, since there is often a need for historical data to make forecasts and support decision-making. The focus of a BI system should not be on the sheer amount and variety of data but rather on the data most relevant to the decision-maker's needs.
 
To review, see Business Intelligence.
 

1b. Explain how business intelligence is used today to support decision-making and process improvement

  • How do business intelligence systems support decision-making?
  • What kind of data is needed in business intelligence systems?
  • How is data obtained?
  • How is a business intelligence system managed?
Business Intelligence systems can be utilized to uncover hidden patterns, unexpected relationships, and market trends or reveal preferences that may have been difficult to discover previously. Armed with this information, organizations can make better decisions about production, financing, marketing, and sales than they could before.
 
Data-driven decision-making is collecting data, extracting patterns and facts from that data, and using those facts to guide decisions. Utilizing data in decision-making is superior to using a person's, group's, or organization's intuition. It can create better decisions, generate new business insights, and identify new business opportunities.
 
To begin making data-driven decisions, the organization must start with a clear objective about what it is trying to accomplish. It could be increased sales, reduced manufacturing costs, improved process efficiency, or other measurable outcomes. Once the objective(s) have been determined, the organization gathers and analyzes the available data to make decisions. After the decision is implemented, it is important to determine whether the results validated the analysis.
 
Primary content is intentionally created by humans, typically users, through their social media activity, browser history, videos, academic papers, blogs, and other direct activities. When thousands or more of these items are combined and anonymized, they can be used to analyze popular or emerging trends, or mined, for instance, for sentiment analysis.
 
Four distinct leadership roles are taking on the challenges of navigating big data and analytics for business intelligence:

  1. Chief Data Officer - Acts as the data owner and architect and should set data definitions and strategies
  2. Data Scientist - Classically trained as data engineers, mathematicians, computer scientists, or statisticians
  3. Chief Analytics Officer - Owns a broad realm of responsibilities and functions to maintain forward-thinking progress
  4. Data Manager - Oversees a fluid connection between the data agenda and technology agenda
A way to develop an organizational culture that emphasizes empowerment toward analytics is to invest in employee training in analytics. This can create a data-literate company capable of infusing analytics throughout the organization. If an organization creates a culture where all individuals have a working knowledge of data science, they will be able to ask the right questions and make stronger data-driven decisions. This emphasis on data literacy can also be promoted by adding analytics competencies to every employee role in some manner, so that the organizational culture rests on a steady foundation of analytics.
 
To review, see Business Intelligence.
 

1c. Assess how business intelligence is likely to evolve in the future based on changing business needs and technology

  • What are some of the newer technologies being incorporated into business intelligence systems?
  • How do business intelligence systems support unstructured decisions?
  • How must organizational culture support business intelligence?
From the 1950s until about the 1990s, BI systems mainly focused on well-organized data in easily comparable formats. Since then, the design focus of these systems has been shifting and expanding to support other types of decisions, particularly unstructured decisions. Unstructured decisions have always been difficult to formulate and implement on computerized systems, including BI. At the current time, however, BI is expanding to include some of the newer systems and technologies that support unstructured decision-making.
 
BI systems are constantly evolving. The use of new tools like neural networks and autonomous AI is now facilitating the expansion of BI capabilities into areas of decision-making that have typically only been the realm of human decision-makers.
 
A key to priming an organization to be a leader in business intelligence and analytics in the future is creating a culture that values transparency and trust. Building an organizational culture that values information transparency supports an atmosphere of trust and openness. Scholars also refer to relational transparency as a primary component of authentic leadership. By openly displaying metrics, organizations hold themselves accountable to improve weak areas and encourage members to present new, innovative solutions.
 
To review, see 1.3: The Future of BI.
 
 

Unit 1 Vocabulary

This vocabulary list includes the terms that you will need to know to successfully complete the final exam.
 
  • artificial intelligence (AI)
  • big data
  • big data analytics
  • business intelligence
  • business intelligence architecture
  • chief analytics officer
  • chief data officer
  • database management systems
  • data manager
  • data mining
  • data processing
  • data scientist
  • data warehousing
  • decision support
  • information management
  • primary content
  • transaction processing systems
  • unstructured decisions
  • visualization

Unit 2: Business Intelligence as Decision Support

2a. Outline how business intelligence teams "define the problem" by learning more about the context of "the problem" and its relationship to other aspects of the operation 

  • What is meant by the term analytics mindset?
  • What are the three major decision-making styles?
  • What is the benefit of making decisions driven by data?

To make informed decisions based on data, managers must have an analytics mindset to understand how the data is derived, interpreted, and communicated. Therefore, they need to develop an analytical skill set that helps them know what makes sense and understand where analytics adds value. This enables them to be confident enough to ask pertinent questions of the analyst.
 
Decision-making boils down to choosing between alternatives against a defined set of selection criteria. There are usually costs versus benefits, advantages versus disadvantages, and alignment with preferences. The more factors to consider, the more challenging the decision will be. Adding time limits and personal emotions further complicates the process. Utilizing data to help optimize the alternatives informs the decision-maker with much more support to make their decision.
 
There are different decision-making styles. Each decision-making style is characterized by either a task or social focus and a high or low tolerance for ambiguity. Styles with a high tolerance for ambiguity can work with unknown variables as they come to a conclusion. Those with a low tolerance for ambiguity want as much clarity as possible in all the circumstances and information that lead to their decisions.
 
Decision-making styles can generally be grouped into three categories:

  1. Psychological, which is based on the decision-maker's needs, desires, preferences, and/or values.
  2. Cognitive, which involves an integrated feedback system between the decision-maker and the environment's reaction to those decisions. This style includes iterative cycles and regular reassessments to measure the impacts of the decision.
  3. Normative, which derives decisions based on the communication and sharing of logic within an organization's normative construct. The decision has to fit in and support the organization's mission and goals.

A major part of decision-making involves analyzing a defined set of alternatives against selection criteria. These criteria usually include costs and benefits, advantages and disadvantages, and alignment with preferences. The ability to make effective decisions that are rational, informed, and collaborative can greatly reduce opportunity costs while building a strong organizational focus.
 
Data-driven decisions are supported by facts, not guesses or hunches. Making decisions supported by data versus guesses or hunches helps the organization make better decisions, enables replication, and minimizes liabilities from making unsupported decisions.
 
The classic model of decision-making has been in existence for many decades and forms the basis from which other, more modern decision-making models build. A solid understanding of and the ability to apply the classical model is an essential core skill of any decision-maker.
 
The following figure provides an overview of the managerial decision-making process.



To review, see Overview of Managerial Decision-Making.
 

2b. Compare the different methods BI uses to support management teams, such as data mining, reporting, and visualization, trend and statistical analysis, and predictive analytics 

  • What are the possible outcomes of the decision-making process?
  • What types of analysis tools can be used to support decision-making?
  • What methods are used to store, manage, and mine data to support decisions?

Data-driven decision-making uses a variety of machine learning approaches for data analysis by characterizing a decision problem and ascertaining the connections between the problem variables (input, internal, and output variables) without having explicit knowledge of the physical behavior of the decision model.
 
The processes included in BI architecture are data collection, data integration, data storage, and data processing.
 
For a business problem to be well-defined, it must be measurable and the operation repeatable over a specific period. To be measurable, the results must be able to be measured or counted to determine if the prediction was accurate. A repeatable operation requires that the chosen attribute to measure must occur regularly and have a repeatable pattern. A specific period requires the variable to have a specific beginning and end, such as a week, month, or quarter.
 
Intelligence analysis and decision-making can have elements of both art and science. This is because standard approaches and tested techniques can codify the processes and help make them orderly and productive. However, these methods are only as good as the individuals who implement them. Humans can be unpredictable, and even the best forecaster can be confounded by the outcome of a project, even when the analysis has been sound.
 
Technological advances have changed the practice of BI exponentially. The ability of sophisticated software to collect and process data from myriad sources allows so much more information to be available to analysts and managers, and it can overload them. Dashboards are important for presenting data. Something as simple as an electronic catalog from which you can search for library materials could be considered a basic dashboard. You put in your search terms, which act as data filters, and the system shows you the best matches.
 
To review, see Decision-Making Tools.
 

2c. Describe various management tasks, from policymaking to performance evaluation to improving procurement strategies to identify relevant trends to understand how they benefit from using BI 

  • What are the differences between the four different analytical models to frame tactical and strategic questions?
  • What are the various analytics domains that could be deployed in an organization?

A well-defined business problem should start with a question that needs to be answered. It needs to be measurable, and the operation must be repeatable over a specific period. An organization needs to have a starting point for what it would like to know, such as a relationship between x and y. Defining the requirements and the objective of a decision is a critical first step in the whole process. If the requirements are not defined well, the subsequent steps in the process will only take the decision-maker further from their objectives: if we ask the wrong question, we will get the wrong answer. In defining the objectives and requirements of a given decision, the decision-maker will also need to look at the information in new and novel ways. The managerial catchphrase you sometimes hear is thinking "out of the box" and not letting past assumptions limit your ability to see the bigger context of a given decision-making problem.
 
Analytics can assess and visualize decisions, describe the implications of historical data, predict and model future expectations, and optimize internal processes. Navigating and deriving value from big data is critical to successful organizational management.
 
There are four primary tactical and strategic analytical models managers may use to frame analytics (a brief illustrative sketch follows the list):

  1. Descriptive analytics: what happened
  2. Decision / diagnostic analytics: why it happened
  3. Predictive analytics: what will happen
  4. Prescriptive analytics: how we can make it happen
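As a rough illustration of how these models differ in practice, here is a toy Python sketch applied to hypothetical monthly sales figures. The data, the ad-spend driver, and the response curve are invented assumptions for the example, not methods taken from the course readings.

    import numpy as np

    sales = np.array([100, 110, 120, 115, 130, 140])  # hypothetical monthly sales

    # Descriptive: what happened.
    print("average monthly sales:", sales.mean())

    # Diagnostic: why it happened - e.g., how strongly sales track ad spend.
    ad_spend = np.array([10, 12, 14, 11, 15, 17])  # hypothetical driver
    print("correlation with ad spend:", np.corrcoef(sales, ad_spend)[0, 1])

    # Predictive: what will happen - fit a linear trend and extrapolate.
    months = np.arange(len(sales))
    slope, intercept = np.polyfit(months, sales, 1)
    print("forecast for next month:", slope * len(sales) + intercept)

    # Prescriptive: how we can make it happen - choose the ad budget that
    # maximizes an assumed response model.
    budgets = np.linspace(10, 20, 11)
    predicted = 50 + 6 * budgets - 0.1 * budgets**2
    print("recommended budget:", budgets[np.argmax(predicted)])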

Forecasting is how an analyst or team uses analysis to develop estimates on what is likely to happen in the future. This works much better in the short term than in the longer term, as conditions are unlikely to change as much in the next 6-12 months as they are in the next 5-7 years. However, as the COVID-19 pandemic has shown us, sometimes there are shocks in the environment that make even the best short-term forecasting look unreliable in hindsight. Keep in mind that such extreme external shocks are rare. The last global pandemic, for instance, was a century ago. Thus, there is definite value in data-driven forecasting.
 
Analytics can impact nearly every domain in an organization, including finance, marketing, talent, customers, risk management, transportation, and sales.
 
To review, see Why You Think You're Right Even if You're Wrong.
 

Unit 2 Vocabulary 

This vocabulary list includes the terms that you will need to know to successfully complete the final exam.

  • alternatives
  • analytical model
  • analytics
  • analytics mindset
  • business problem
  • classic model
  • cognitive analytics
  • dashboard
  • data mining
  • decision-making style
  • descriptive analytics
  • diagnostic analytics
  • directive
  • forecasting
  • measurable
  • normative
  • psychological
  • prediction/predictive
  • reporting
  • visualization

Unit 3: Data Mining and Text Mining

3a. Choose appropriate datasets to meet the requirement 

  • What is big data, and how is it used?
  • How are data mining systems used to extract data from the data warehouse?
  • What are the differences between big data produced intentionally or unintentionally by humans and machines?

Big data typically describes data sets so large or complex that traditional data-processing techniques often prove inadequate. The structure of big data is described by:

  • Volume: amount measured in gigabytes or terabytes
  • Velocity: the rate at which data arrives, from one-time snapshots to high-frequency streams
  • Variety: structured, numeric, alpha, unstructured, text, sound, image or video, genomics
  • Veracity: validation, noise level, deception detection, relevance, ranking
  • Value: the usefulness of the data in supporting decisions that add economic value

This figure illustrates these characteristics.


We store big data in the data warehouse and use data mining techniques to extract data for use by business intelligence systems. Data mining systems are designed to find patterns and correlations in data from data warehouses and generally prepare data for use in the decision support systems used by decision-makers. This means that they facilitate decision-making but are not directly involved in the decision-making process.
 
The vast majority of current data was created in just the past few years. The challenge is to extract value from it and put it to work for organizations and individuals. The vast amount of personal data produced by citizens can be of value to both the public and private sectors.
 
To review, see Big Data.
 

3b. Describe the four stages of the data mining process: data generation, data acquisition, data storage, and data analytics 

  • How are data mining systems used to extract data from the data warehouse?
  • What is involved in the data preparation process?
  • Why is the data preparation and cleaning process important in supporting a BI system?

Data mining is a data analysis technique that focuses on modeling and knowledge discovery for predictive rather than purely descriptive purposes. Data mining arose primarily along with data warehouses to address some of the limitations of OLTP (online transaction processing) systems.
 
Data mining is often implemented to populate a data warehouse. It evolved to address the inability of transaction processing systems to deal with massive data sets with high dimensionality, new data types, and multiple heterogeneous data resources.
 
The following figure illustrates the lifecycle of the data preparation process. Data must first be gathered from both internal and external sources. This data is likely to be stored in a wide variety of formats. We then use data discovery processes, like data mining, to understand the information and insights that the data might provide to the managerial decision-maker. After this, we must clean the data. Data gathered from a wide variety of sources is likely to contain inaccuracies and inconsistencies and lack the degree of integrity needed to provide a reliable source of actionable information. After the data has been cleaned, it will likely be necessary to transform it into formats and structures that will be more appropriate to support a business intelligence system. We may also enrich the data by providing additional insights, expansions, and clarifications. Finally, we must store the data in the data warehouse.
 
This data preparation process is fundamental to the ultimate success of the business intelligence system. Again, the adage of "garbage in, garbage out" comes into play. Suppose we do not remove the biases and inconsistencies from the source data. In that case, these biases and inconsistencies will find their way into our decision-making process, and the quality of our decisions will suffer.


The four leadership roles that are needed to take on the challenges of implementing big-data analytics in an organization include:

  1. Chief Data Officer: the data owner and architect who sets the data definitions and strategies
  2. Chief Analytics Officer: holds a board-level realm of responsibilities to maintain forward-thinking progress
  3. Data Scientists: provide high technical skills and are proficient in their understanding of the business
  4. Data Manager: serves as the organizer and architect of the data

Online analytical processing (OLAP) systems use computing methods that enable users to easily and selectively extract and query data to analyze it from different points of view. These systems are recipients of data provided through data mining. They are also capable of dealing with high dimensionality, new data types, and multiple heterogeneous data resources.
 
To review, see Practical Real-world Data Mining.
 

3c. Standardize and exploit text and develop a taxonomy 

  • What are some of the reasons we use text mining?
  • How is text mining accomplished?

Much of the information and data that we are interested in using to support decision-making takes the form of natural language text. A natural language is any human language – English, Spanish, German, Chinese, etc. Natural languages are complicated and create myriad problems for the text refinement methods used to identify textual relationships. One example is words having the same spelling but divergent meanings, such as "live" (the verb, to live) and "live" (the adjective, as in seeing a performance live). Text mining treats both as the same word, even though one is a verb and the other an adjective.
 
Text mining is transforming unstructured natural language text into a structured format to identify meaningful patterns and new insights. As these systems continue to evolve, the next major innovation is likely to include some of the recently developed AI systems. Technological advances will likely enhance analysts' ability to standardize and exploit text. AI techniques, especially those that can speed up the data mining process, are some of the most recent developments in the field. Their inclusion in advanced text mining systems is likely to be the most transformational.
 
Text analytics enables businesses to discover insights and meaning from unstructured text-based data. Through the analytic processing of unstructured text, the underlying facts of the situation are discovered.
 
Text analysis is how information is automatically extracted and classified from text data. For example, a text could take the form of survey responses, emails, support tickets, call center notes, product reviews, social media posts, and any other feedback given in free text format. Text analytics enables businesses to discover insights from within this unstructured data.
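A minimal sketch of this idea in Python, using a few invented survey responses and simple word-frequency counting as the step that turns free text into structure:

    from collections import Counter

    responses = [
        "Shipping was slow but support was helpful",
        "Great product, slow shipping",
        "Support resolved my issue quickly",
    ]  # hypothetical free-text feedback

    stopwords = {"was", "but", "my", "the", "a"}

    # Transform unstructured text into a structured frequency table.
    words = Counter(
        word
        for text in responses
        for word in text.lower().replace(",", "").split()
        if word not in stopwords
    )
    print(words.most_common(3))  # recurring themes, e.g., shipping, support, slow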
 
To review, see Introduction to Text Mining.
 

3d. Evaluate data quality based on source reliability, accuracy, timeliness, and application to the requirement 

  • What factors constitute data quality?
  • How can data quality be evaluated?

Data is obtained from a wide variety of sources and is widely diverse in terms of reliability, accuracy, timeliness, and appropriateness to the application.
 
Quantitative data is information that can be tabulated and measured. Data is measured by numbers and is clearly defined. For example, researchers can calculate the number of specific responses to a multiple-choice or yes/no question. Qualitative data is descriptive and can tell researchers how respondents feel about a particular product or service and what influences their purchase decisions.
 
Qualitative data are measures of 'types' and may be represented by a name, symbol, or number code. Qualitative data are data about categorical variables (what type or name). Quantitative data are measures of values or counts and are expressed as numbers. Quantitative data are about numeric variables (how many, how much, or how often).
 
There are some common issues when dealing with Big Data. Two critical ones are data quality and data variety (such as multiple formats within the dataset); deep learning techniques, such as dimension reduction, can be used to address these problems. Traditional data models and machine learning methods struggle with these issues, further supporting the case for deep learning, since they cannot handle complex data within the framework of Big Data.
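Dimension reduction itself need not involve deep learning; as a point of reference, here is a sketch of principal component analysis (a classical dimension-reduction technique, chosen here for illustration) on an invented data set:

    import numpy as np

    # Hypothetical data set: 100 records with 5 correlated numeric features.
    rng = np.random.default_rng(0)
    base = rng.normal(size=(100, 2))
    X = base @ rng.normal(size=(2, 5)) + 0.05 * rng.normal(size=(100, 5))

    # Principal component analysis via the singular value decomposition:
    # center the data, then project onto the top two components.
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    reduced = Xc @ Vt[:2].T

    print(X.shape, "->", reduced.shape)  # (100, 5) -> (100, 2)
    print("variance retained:", (S[:2] ** 2).sum() / (S ** 2).sum())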
 
Knowledge discovery in databases (KDD) is the process of discovering useful knowledge from a collection of data. The overall goal is to extract information from a data set and transform it into an understandable structure for further use. Data mining is just one step of the knowledge discovery process, though it is the core step. Among the steps that follow are pattern evaluation, which interprets the mined patterns and relationships (much like the analytic process), and knowledge consolidation, which is similar to reporting your findings, although the findings ought to be more robust than simply consolidated knowledge if they are to responsibly respond to the requirements. Like analysis, KDD is an iterative process: if the patterns evaluated after the data mining step are not useful, the process can begin again from the earlier steps.
 
To review, see Big Data Analytics for Disparate Data.
 

3e. Identify methods for optimization, filtering, or "cleaning" data for standardization and effective comparison 

  • What are some ways we can optimize or filter data for standardization?

Raw data is usually not suitable for direct analysis. This is because the data might come from different sources in different formats. Therefore, data preparation is an essential task that transforms or prepares data into a form that's suitable for analysis.
 
The following are some of the more common methods of preparing data (a brief sketch follows the list):

  1. Aggregation – Multiple columns are reduced to fewer columns. Records are summarized
  2. Normalization – Data is scaled or shifted, perhaps to a range of 0-1
  3. Augmentation – Expand the dataset size without collecting more data. For example, in image data via cropping or rotating
  4. Formatting – Data is modified to a consistent form
  5. Imputation – Fill missing values using estimates from available data
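A brief pandas sketch of two of these steps, imputation and normalization, on an invented data set:

    import pandas as pd

    df = pd.DataFrame({
        "revenue": [120.0, 95.0, None, 210.0],  # contains a missing value
        "visits": [3000, 1500, 2200, 6000],
    })

    # Imputation: fill the missing revenue with the column mean.
    df["revenue"] = df["revenue"].fillna(df["revenue"].mean())

    # Normalization: rescale every column to the 0-1 range.
    normalized = (df - df.min()) / (df.max() - df.min())
    print(normalized)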

Data lineage includes the data's origin, what happens to it, and where it moves over time – essentially the whole journey of a piece of data. Lineage is useful for tracing errors back to their root cause in the data process. It is also a way of debugging Big Data pipelines, though the process is not simple: there are many challenges, such as scalability, fault tolerance, and anomaly detection.
 
Your data must be rigorous and contain a highly representative sample to achieve the most relevant, reliable, and reflective insights. It is pointless to collect data from only one subset of a large population when you wish to market to the whole.
 
The database administration process and the database administrator are responsible for the design and administration of data models and the data integrity constraints included in those models. Missing data elements are likely caused by poor data integrity controls and would thus represent a result of poor administration. Database administration is the function of managing and maintaining database management systems (DBMS) software. As a part of this, database administrators are responsible for the data modeling and design process and ensure that operational databases are designed to high professional standards.
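As a small illustration of such integrity controls, here is a sketch using SQLite; the table, columns, and constraints are invented for the example:

    import sqlite3

    con = sqlite3.connect(":memory:")
    # Integrity constraints declared in the data model: customer names may not
    # be missing, and order amounts must be positive.
    con.execute("""
        CREATE TABLE orders (
            order_id INTEGER PRIMARY KEY,
            customer TEXT NOT NULL,
            amount REAL CHECK (amount > 0)
        )
    """)
    con.execute("INSERT INTO orders VALUES (1, 'Acme', 250.0)")  # accepted
    try:
        con.execute("INSERT INTO orders VALUES (2, NULL, 99.0)")  # rejected
    except sqlite3.IntegrityError as exc:
        print("rejected by integrity constraint:", exc)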
 
To review, see Capturing Value from Big Data.
 

Unit 3 Vocabulary 

This vocabulary list includes the terms that you will need to know to successfully complete the final exam.

  • aggregation
  • augmentation
  • big data
  • big data analytics
  • business intelligence
  • business intelligence architecture
  • business knowledge
  • chief analytics officer
  • chief data officer
  • cognitive
  • data collection
  • data integration
  • data lineage
  • data manager
  • data mining
  • data quality
  • data scientist
  • data warehouse
  • formatting
  • imputation
  • knowledge discovery
  • natural language
  • normalization
  • text mining
  • transaction processing system
  • value
  • variety
  • velocity
  • veracity
  • volume

Unit 4: Data Warehousing and Integration

4a. Describe how data storage and integration have changed over time to enable the prediction of future trends in data storage

  • What are some of the issues associated with cloud data storage?
  • What role does data governance play in the administration of data?

The history of data storage is wide, varied, and extremely complex, stretching from punch cards used to communicate information to equipment long before computers were developed, to Professor Frederic Williams creating RAM in 1948. The longest-serving era belongs to IBM, whose magnetic disk storage development dominated the market from the mid-1950s to approximately 2003. Since then, storage technology has advanced dramatically in speed, although for large mainframes capacity has remained relatively unchanged.
 
Before the advent of relational databases, most transaction processing systems were characterized by application-specific data structures. Applications were not integrated, and thus there was no way to share data between applications. Since the advent of relational databases, there has been a much higher degree of centralization and coordination of transaction processing system data.
 
A cloud system consists of IT components (hardware, software, and infrastructure) that enable the delivery of cloud computing services such as SaaS (software as a service), PaaS (platform as a service), and IaaS (infrastructure as a service) via a network, typically the public internet. Cloud systems must be highly flexible, and the vendors and service providers who support them must be able to integrate many different types of technology and systems of different vintages, standards, and vendors. New technology and systems are constantly being developed, and cloud systems must allow these new technologies to be integrated with the older technologies already in use.
 
Cloud services vendors must be able to provide non-proprietary network management solutions to allow for the wide range of technologies that must be integrated into the cloud system.
 
Data governance is the collection of processes, roles, policies, standards, and metrics that ensure the effective and efficient use of information in enabling an organization to achieve its strategic goals. Data governance defines who can take what action, upon what data, in what situations, and using what methods. Data governance frameworks and maturity models have been developed to aid the organization in ensuring that its governance policies and processes are serving the organization's needs in the most effective way.
 
The operational DBA would typically be concerned with the operational database and not the administration of the data warehouse. A specific DBA role should be created for the administration and management of the warehouse. In addition, one or both of these database systems may include cloud systems and may not even involve the use of a locally-managed data center.
 
This figure is a conceptualization of a cloud storage system.


To review, see Modeling and Management of Big Data in Databases.
 

4b. Explain the basic concepts and theories of data warehousing, such as dimensional modeling and ETLA (extract, transform, load, analyze) 

  • What are some of the challenges of data warehouse administration?
  • How is data extracted from the data warehouse?

The fundamental purpose of a data warehouse is to store data extracted from internal transaction processing systems and external sources. The data will be reformatted to meet the needs of the BI systems that will use it. The data warehouse will not be integrated with and will not contain operational data from the transaction processing systems. In addition, the data warehouse may or may not be segmented into specialized data marts.
 
To support the needs of BI systems, the DBA must ensure that the data stored in operational and transaction processing systems can be extracted and moved to the data warehouse supporting the BI system. This extraction process must also allow for the conversion of the operational data into whatever format meets the needs of the warehouse and the BI system.
 
Organizational data is expected to be utilized more than ever to support business intelligence applications, data warehousing, data marts, and advanced analytics for business decision support.
 
Extraction systems will only extract and transform the information they are specified to handle; they have no mechanisms for auditing or checking data quality, completeness, or reliability. Such systems exist to automate extracting data from the warehouse, transforming it into the appropriate formats, and loading it into the BI systems.
 
The following figure provides a high-level conceptual representation of the data warehousing and management process. Notice that data is fed to the data warehouse from various sources, both internal and external to the organization. This means that the data will be in many different formats and will need to be adapted for inclusion in the warehouse, which is the process of the staging phase. The warehouse itself may be subdivided into smaller sections called data marts. The structure of these data marts would depend on the needs of the user.


To review, see Data Warehouse Strategies.
 

4c. Explain data warehouse administration and security issues, such as user access and accountability, encryption, and emerging challenges 

  • What are some of the challenges of data warehouse administration?
  • How can security be managed and enforced in the data warehouse?

The database administrator of a warehouse will typically not be involved in the typical roles of an operational database. Their focus is specific to the warehouse and the BI systems it supports. Thus, the time it takes to make decisions, including the time it takes to extract from the warehouse, would be a primary concern.
 
The database administrator of a warehouse may (rarely) be a team member on an applications redesign project but does not bear primary responsibility for such projects.
 
As security policies are developed to support business operations, good security practice ensures that data is only made accessible to those staff who have a documented business need to access the data. Security policies should be designed so that only those users who have a legitimate need to access particular data are given access to that data. This is particularly important in the case of sensitive information like customer data.
 
Fault tolerance simply means a system's ability to continue operating uninterrupted despite the failure of one or more of its components. This is true whether it is a computer system, a cloud cluster, a network, or something else. You can make a BI Server architecture more fault tolerant by using multiple instances that will tend to increase redundancy and result in a more fault-tolerant configuration.
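A rough sketch of this idea in Python: a client that fails over across replicated BI server instances (the endpoint URLs are hypothetical):

    import urllib.request

    # Hypothetical replicated BI server instances; any one can serve a request.
    instances = [
        "http://bi-node1.example.com/report",
        "http://bi-node2.example.com/report",
    ]

    def fetch_report(urls):
        """Try each instance in turn; redundancy keeps the service available."""
        for url in urls:
            try:
                with urllib.request.urlopen(url, timeout=5) as resp:
                    return resp.read()
            except OSError:
                continue  # this instance failed; fall through to the next one
        raise RuntimeError("all BI server instances are unavailable")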
 
A number of factors have changed recently regarding the future of data storage. As part of a move toward better security, containers are increasingly used as more microservice architectures are implemented, and the operational issues around them will be a key trend to address. As cloud infrastructure grows, so does the market for on-premises storage, as more businesses want in-house control.
 
To review, see Big Data Management.
 

Unit 4 Vocabulary 

This vocabulary list includes the terms that you will need to know to successfully complete the final exam.

  • analytics development
  • cloud storage
  • containers
  • database administrator (DBA)
  • data center
  • data governance
  • data warehouse
  • data mart
  • fault tolerant
  • network management
  • platform as a service (PaaS)
  • relational database
  • security
  • software as a service (SaaS)
  • transaction processing system

Unit 5: Data Analytics

5a. Explain the difference between describing and analyzing data

  • How is data analytics used to study data?
  • How do we draw conclusions and support managerial decisions?
Data analytics is the "thinking" part of BI. Once the information has been mined, organized, and stored, the analyst must access it through structured queries. The analysis process applies rigorous methodologies to study information and interpret the results. Using these methodologies allows the analyst to determine how the information relates to the needs of their management team. Data analysis is often done using dashboard tools such as Tableau.
 
Analytics is where information becomes intelligence. It is transformed from disparate data points that can be described in terms of data sets into patterns resulting from the analysis. This is where the real brainwork of the analytic process takes place. The methods are myriad and are highly dependent upon both the available inputs and the requirements for your particular project.
 
Managing and extracting valuable meaning from big data is not only a science challenge but, more than anything, a leadership challenge. Becoming a big-data-enabled organization requires a culture of empowerment, trust, transparency, and inquiry. These qualities allow analytics to be woven throughout an organization's fabric, which elevates the investment and commitment to analytics. Across the literature, it is acknowledged that big data's managerial and leadership challenges outrank the technical challenges associated with utilizing big data to solve business goals.
 
Conclusions are based on an analysis of the data. Descriptive data is beneficial for presentation and facilitating conclusions but is not a conclusion itself.
 
To review, see Data Analysis.
 

5b. Apply various analytic techniques to various datasets to make analytic estimates

  • How do you use analytic techniques to find meaning in data?
  • What are some of the more commonly used classes of analytic techniques?
  • What are the criteria for selecting the best graphics to display data to a particular audience?
Descriptive analytics collects historical data from reporting, scorecards, clustering, and various other sources of information, enabling managers to highlight trends and identify opportunities and risks. It is one of the lower levels of analytics, focusing on the past, with moderate benefits and complexity.
 
Predictive analytics leverages statistical models and machine learning to enable managers to predict future outcomes with varying degrees of statistical confidence. Decision analytics uses data-driven models and visualizes the outcomes of specific organizational behaviors, allowing managers to see the various results of different strategic approaches. Prescriptive analytics uses optimization and simulation, enabling managers to produce recommended decisions through analytical modeling. In summary, the classes of analytics are:
 
  1. Prescriptive – What we should do based on the data
  2. Diagnostic – Identify causal relationships
  3. Cognitive – Use artificial intelligence and machine learning
  4. Descriptive – Summarization and aggregation of data
  5. Predictive – Determine the likelihood of future behavior
It is important to know how many times a value appears when organizing data. Consider questions like how many hours students study or what percentage of families have multiple pets. Frequency (also called absolute frequency), relative frequency, and cumulative relative frequency are measures that answer questions like these.
 
The absolute frequency is the number of times a value occurs in the data. The relative frequency is the ratio of the number of times a value occurs to the total number of values. The cumulative relative frequency is the running total of the relative frequencies; the final cumulative value adds up to 1 (or 100%).
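These measures are simple to compute; a short Python sketch with an invented set of study hours:

    from collections import Counter

    hours = [2, 3, 3, 4, 2, 3, 5, 4, 3, 2]  # hypothetical hours studied
    n = len(hours)

    cumulative = 0.0
    for value, count in sorted(Counter(hours).items()):
        relative = count / n       # relative frequency
        cumulative += relative     # cumulative relative frequency
        print(value, count, relative, round(cumulative, 2))
    # The final cumulative relative frequency is 1.0 (100%).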
 
Analytics has defined stages of development, from descriptive analytics to cognitive analytics. As an organization moves along with analytics development, the benefits of the outcomes and implementation complexity increase.
 
Many analytic techniques can be applied to the attributes of a particular decision problem space. It is important to understand these techniques and under what circumstances each is most appropriate; a brief sketch follows the list.
 
  1. Clustering - Used to group related attributes in sets that have common characteristics
  2. Classification - Identifying attributes and assigning them to sets based on their characteristics
  3. Prediction - Using the past values of an attribute to assign a future value
  4. Profiling - Searching for attributes that have preselected characteristics
  5. Smoothing - Taking the average of an attribute over time
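For instance, smoothing can be as simple as a moving average. A short sketch with invented weekly sales:

    def moving_average(values, window=3):
        """Smooth a series by averaging each run of `window` values."""
        return [
            sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)
        ]

    weekly_sales = [120, 180, 90, 200, 160, 210, 130]  # hypothetical data
    print(moving_average(weekly_sales))
    # [130.0, 156.66..., 150.0, 190.0, 166.66...]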
Decision trees are one technique that can be structured to solve many different types of problems. Applying the correct tree type to the problem under consideration is important.
 
When displaying data to an audience, it's important to choose the right chart to help them quickly understand the point being made. Some simple charts that can be used (two of which are sketched after this list) include:
 
  1. Line charts for comparing trends, multiple datasets over time, or correlations
  2. Area charts for comparing change over time from two or more variables
  3. Column charts for showing frequency distribution and comparing datasets
  4. Bar charts for ranking datasets or comparing datasets
  5. Pie charts for comparing datasets as percentages of a whole.
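As an illustration of the first and fourth choices, a brief matplotlib sketch with invented sales data:

    import matplotlib.pyplot as plt

    months = ["Jan", "Feb", "Mar", "Apr"]
    online = [120, 135, 150, 170]    # hypothetical sales figures
    in_store = [200, 190, 185, 180]

    # Line chart: comparing trends in two datasets over time.
    plt.plot(months, online, label="online")
    plt.plot(months, in_store, label="in-store")
    plt.legend()
    plt.title("Monthly sales by channel")
    plt.show()

    # Bar chart: comparing the same datasets for a single month.
    plt.bar(["online", "in-store"], [online[-1], in_store[-1]])
    plt.title("April sales by channel")
    plt.show()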
To review, see Prediction and Inference in Data Science.
 

5c. Determine what kinds of scenarios and simulations would be most useful for your business case

  • How is scenario construction used in decision-making?
  • How are simulation techniques used to model real-world systems?
Scenarios place analysts in the role of the decision-maker or other figure whose decisions, influences, agendas, and profiles the analyst is attempting to model or forecast. Just as in the military, these games often also include "Red Teaming", which means trying to anticipate what your adversary will do given certain conditions. In real military war games, the physical "red team" is given a challenge, and along the way, key options or needed equipment or sources or something they expected to rely upon to win is taken away. The value of the exercise is to see how adaptive the military unit, or in this case, the analyst team, can be when environmental challenges present themselves and all requirements, timelines, and other elements of the process remain the same.
 
Simulation of a system is the operation of a model in terms of time or space, which helps analyze the performance of an existing or proposed system. Simulations are similar to scenarios, although today simulations often take the form of computational models representing some problem to be solved that might be too expensive or dangerous to attempt in the real world. These computer simulations enable analysts to see what happens in a given situation and then ask, "What happens if X is changed?" Simulations are often used to experiment with environmental conditions or to predict behavior, such as that of consumers in a marketplace when a new competitor is introduced. The simulation process should follow a defined procedure.
 
Once the outcomes from the decision are captured, it is important to return to determine if they were supported by the analysis. The results can then be utilized to better inform the analysis for future uses.
 
The following figure illustrates the basic process of simulation. We create a model of the real world that we represent as a "black box." Within this box are all the mathematical details of our model. We then subject our model to various inputs and observe the outputs that result. Assuming that we have created a reasonable model of the real-world situation, the output from our model should be a very realistic approximation of the results that would be obtained in the real world.
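A small Monte Carlo sketch of this black-box idea in Python; the demand model and cost figures inside the "box" are invented assumptions:

    import random

    def black_box(price, trials=10_000):
        """Toy model: average profit at a given price under uncertain demand."""
        total = 0.0
        for _ in range(trials):
            demand = max(0, random.gauss(1000 - 8 * price, 50))  # assumed model
            total += (price - 20) * demand                       # unit cost: 20
        return total / trials

    # Vary the input ("What happens if X is changed?") and observe the output.
    for price in (40, 60, 80):
        print(price, round(black_box(price)))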





Unit 5 Vocabulary

This vocabulary list includes the terms that you will need to know to successfully complete the final exam.

  • area chart
  • bar chart
  • classification
  • clustering
  • cognitive
  • diagnostic
  • frequency
  • frequency distributions
  • line chart
  • prescriptive
  • prediction
  • profiling
  • scenario
  • simulation
  • smoothing

Unit 6: Data Reporting and Visualization

6a. Examine visualization best practices for different audiences 

  • How do visualizations make large amounts of data easier to understand?
  • How do visualizations reveal data at several levels of detail, from a broad overview to a fine structure?

Well-crafted data visualizations present data in easily understood images and, when done well, enable the viewer to quickly perceive insights they may have missed if presented in summary tables and spreadsheets. Good data visualization does not merely convert large amounts of data into images; it engages the viewer and tells a story.
 
Well-crafted visualizations present complex ideas or results and communicate them with clarity, precision, and efficiency. Visualizations should:

  • show the data;
  • focus the viewer on the substance instead of the methodology;
  • avoid distorting the data;
  • present many numbers in a small space;
  • make large data sets easier to understand; and
  • present the data at several levels of detail, from a high-level overview to a deep data dive.

With the advent of better software, faster processors, and cheaper memory, it has become easier to create and iterate visualizations. With this power comes responsibility, as it is crucial to building good visualizations that clearly articulate the analyst's point. Visualizations can be effective or ineffective, generating very strong feelings either way.
 
Data visualizations are useful for data cleaning, exploring data structure, detecting outliers and unusual groups, identifying trends and clusters, spotting local patterns, evaluating modeling output, and presenting results. It is essential for exploratory data analysis and data mining to check data quality and improve an analyst's familiarity with the structure and features of the data before them.
 
Presentation graphics are usually a select number of graphics created for any number of people. They need to be well-designed and well-crafted, with effective explanatory text delivered either verbally or in writing. They are used to convey known or summarized information.
 
Exploratory graphics can include many graphics created for an individual, such as the analyst. They don't need to be perfect, but they provide alternate views and additional information.
 
To review, see The Beauty of Data Visualization.
 

6b. Explain the increasing ways in which human decision-making is converging with artificial intelligence to improve both 

  • How is OLAP used to conduct BI analysis?
  • How are emerging AI technologies being used and converging with human decision-making?

The business sector developed Online Analytical Processing technology (OLAP) to conduct business intelligence analysis and look for insights. An OLAP data cube is a multidimensional array of data. A data cube is designed to organize data by grouping it into different dimensions, indexing the data, and precomputing queries frequently. It is important to understand the operations we can perform on a data cube and apply them correctly to the problem under consideration. OLAP systems allow flexible and dynamic questions to be asked of big data. Combining OLAP with the techniques of multicriteria decision-making allows business executives to incorporate insights from real-world data into the systematic evaluation of different business options. This improves the quality of complex decisions and leads to better business outcomes with the same resources.
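A rough sketch of cube-style operations, using a pandas pivot table as a stand-in for an OLAP data cube; the dimensions and figures are invented for the example:

    import pandas as pd

    sales = pd.DataFrame({
        "region":  ["East", "East", "West", "West", "East", "West"],
        "quarter": ["Q1", "Q2", "Q1", "Q2", "Q1", "Q1"],
        "product": ["A", "A", "A", "B", "B", "B"],
        "amount":  [100, 120, 90, 150, 80, 110],
    })

    # Build a small "cube": amount indexed by region x quarter x product.
    cube = sales.pivot_table(index="region", columns=["quarter", "product"],
                             values="amount", aggfunc="sum")

    # Roll-up: aggregate away the product dimension.
    print(cube.T.groupby(level="quarter").sum().T)

    # Slice: fix one dimension (quarter = Q1) and view what remains.
    print(cube["Q1"])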
 
In the age of big data and artificial intelligence, there are several unintended risks and potential malicious uses of the technology. Every organization needs to adopt a set of data governance standard operating procedures; without them, it opens itself up to unnecessary litigation and a negative reputation.
 
The primary objective of a weak AI is to emulate human mental faculties through the use of models implemented on a computer. Notice that the focus is on the output, not the process: we are trying to achieve a result similar to what a human would have achieved, not to model or duplicate the cognitive process that humans used. The objective of a weak AI system is not to embody human capabilities, which would imply that the AI "thinks" in the same manner as a human; it is to emulate them on a computer through models.
 
We use models that attempt to emulate the results of human cognition on a computer, but we cannot model many aspects of human cognition. In particular, emotions are poorly understood, even in humans, and machines are not, at present, capable of modeling or emulating them.
 
For the full benefits of technological advances to be gained in society, a collaborative approach to machines and humans working together must continue to be paramount. For generations, humans and machines have worked together. Why would it stop now? Humans will continue to offer creativity, social skills, and qualitative aspects to the partnership. Machines will bring quantitative aspects, speed, and the ability to scale rapidly. The nuances of language, such as the ability to joke, still remain outside of the grasp of machines, while quantum computing is nearly impossible for humans. By combining forces, true innovation is bound to happen.
 
As AI capabilities and ubiquity are extended, humans will need to learn how to work with it and ensure that its influence on human well-being is positive. We will need to ensure that human judgment maintains its primacy and that we are up to the task. There will be immense economic pressure to adopt AI. We must train a new generation of both data scientists and data science users who can guide this adoption to the benefit of humankind.
 
To review, see Artificial Intelligence and the Future of Humans.
 

6c. Describe the critical elements of reporting that clearly communicates analytic estimates to decision-makers 

  • How are data visualizations used to communicate information to decision-makers?
  • How can we effectively communicate via memoranda and other written forms?

Mass communication takes many forms in business; memoranda and letters are two generally used in an official capacity. How your reports are written will set you apart: strong content and form, aesthetically pleasing data visualizations, and a framework such as the SMART model to showcase goal setting backed by robust data.
 
Visualizations are a tool to help the audience better understand large data sets. They should not distract from or distort the data, and they should help the viewer focus on the substance rather than the methodology.
 
Good data visualization uses different visual characteristics (color, size, orientation, etc.) to encode information effectively at higher densities than would be possible in a plain text version. It is essential to test your visualizations with real users during the development process; this testing should focus on measuring the expressiveness and effectiveness of the presentation. A good visualization is a piece of data art composed to achieve a purpose. Whenever somebody looks at your visualization, you want them to reach the same conclusion as you, and they should be able to do so without having to dissect the information.
 
During data analysis, some helpful tips to keep in mind include:

  1. Communicate all of the results
  2. Try to avoid bias when interpreting data
  3. Failure to confirm the original hypotheses does not mean the research results are useless

Reducing the need for the audience to interpret the findings in visualization is the key to an effective presentation. This can be accomplished by the type of chart used or by highlighting key points through color choices.
 
Storytelling has been a useful tool to communicate information and knowledge over time. Using visualizations to tell a story with data helps make the information more concise and memorable. The most effective storytelling helps the audience reach the right conclusion and take the appropriate action.
 
To review, see Memorandums and Business Letters.
 

Unit 6 Vocabulary 

This vocabulary list includes the terms that you will need to know to successfully complete the final exam.

  • artificial intelligence (AI)
  • data cube
  • clarity
  • efficiency
  • exploratory graphics
  • linear storytelling
  • online analytical processing (OLAP)
  • precision
  • presentation graphics
  • random access storytelling
  • user-directed path storytelling
  • visualization
  • weak AI

Unit 7: Data Analysis Dashboards

7a. Explain the capabilities and limitations of dashboards for organizing and manipulating data and expressing analytic estimates 

  • How do dashboards help humans understand and organize data?
  • What kind of dashboards are there, and how does each support different levels of management?

BI uses computers to exploit data, but humans and computers understand data differently. Effective organizations ensure that their data is presented in ways that help their teams interact with it and use it to make decisions. BI dashboards help make information accessible and useful for many purposes, such as monitoring and evaluation, personnel and activity management, procurement and inventory, pricing, and more. The visualization delivered through dashboards allows large, otherwise overwhelming amounts of data to be easily digestible and understandable, and identifying what the data means allows for more informed and relevant business decisions. Dashboards provide a place for interacting, evaluating, connecting, and visualizing data from multiple sources. While they do provide greater visibility into the available information, they also have limitations: attempts to incorporate too much information without understanding the constraints, coupled with no predetermined rules for how the metrics should be used, can result in a clunky, unusable dashboard.
 
It is important to fit the correct type of dashboard to the requirements of a given managerial decision-maker. A dashboard is a tool used to manage information from a single access point. It helps managers and employees to keep track of the company's KPIs and utilizes business intelligence to help companies make data-driven decisions. Different dashboards are appropriate for different levels of the organization.
 
The types of dashboards include:

  1. Operational Dashboard – Shows shorter time frames and detail on processes
  2. Tactical Dashboard – Used by middle management to track performance
  3. Strategic Dashboard – Focused on long-term objectives and high-level metrics
  4. Analytic Dashboard – Contains large amounts of data to facilitate decision support and studies

Dashboards should be structured to report and communicate information to decision-makers; the idea is to convey the important KPIs rapidly and efficiently. Any dashboard should provide information quickly, but a good one also motivates managers to spend more time with the BI system by prompting thoughtful reflection on the data they are seeing; better decisions often require more time spent on reflection and analysis.
 
To review, see Universal Dashboards.
 

7b. Compare various dashboard designs to evaluate how effectively they present key performance indicators (KPIs) from sales and customer retention to recruitment to company financials 

  • Why do organizations develop key performance indicators (KPIs)?
  • What are the elements of effective KPIs?

Note the primary reason for creating KPIs: to measure success against strategic objectives. Such objectives are often difficult to measure, but without measures there is no way to obtain feedback. KPIs are the measures used to assess organizational performance against the critical success factors and targets developed by the organization as part of the strategic planning process.
 
Measures must be developed for KPIs, and this can be a challenge since many of them are intangible or lack easily obtainable data.
 
KPIs should be:

  1. Simple – A KPI should be as straightforward to measure as possible
  2. Relevant – Ensures that the right decision-makers are responsible for measuring specific KPIs
  3. Aligned – KPIs should support, and be derived from, the overall strategic goals of an organization
  4. Actionable – They should be easy to understand, and users should know what to do to achieve an effective outcome
  5. Measurable – KPIs should avoid generalized goals and provide specific insights into how the business is performing

Every organization is different, and so are its KPIs. To determine what is appropriate, link KPIs to strategy and objectives to hone your focus, and evaluate them constantly to ensure they remain the most relevant KPIs. When choosing your KPIs, focus on key metrics, but remember to capture both lagging and leading indicators, as illustrated in the sketch below.
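To make these properties concrete, the following minimal sketch computes two KPIs of the kind discussed in this outcome: a lagging indicator (customer retention rate) and a leading indicator (sales pipeline coverage). The formulas are standard; every figure is hypothetical.

    # Minimal sketches of a lagging and a leading KPI; all figures are hypothetical.

    def retention_rate(start, end, new):
        """Customer retention rate (%): customers kept, excluding new sign-ups."""
        return (end - new) / start * 100

    def pipeline_coverage(pipeline_value, quota):
        """Leading indicator: open pipeline value relative to the sales quota."""
        return pipeline_value / quota

    print(f"Retention: {retention_rate(start=1000, end=1050, new=120):.1f}%")  # 93.0%
    print(f"Pipeline coverage: {pipeline_coverage(1_500_000, 500_000):.1f}x")  # 3.0x

Retention rate is lagging because it measures what has already happened; pipeline coverage is leading because it signals whether future targets are likely to be met.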
 
To review, see 5 KPIs Every Business Must Consider.
 

7c. Describe the uses and differences among strategic, operational, and analytic dashboards 

  • What are the different needs of different levels of management?
  • How do dashboards support the different needs of different levels of management?

Recall that a dashboard is a tool for managing information from a single access point, helping managers and employees track the company's KPIs and make data-driven decisions, and that different dashboards are appropriate for different levels of the organization.
 
The level of decision-maker served by the dashboard will determine the requirements of the dashboard's scope and scale.
 
Dashboards can be designed, implemented, and deployed for every type of business, department, and function of a company, from recruitment to sales, product monitoring, customer service live chats, and other areas. How the data is visualized makes a big difference: charts in your dashboards, whether pie, line, or bar, can portray the same information yet still be misconstrued. Depending on what you wish to show, there is a chart type to suit your goal, but selecting the correct one means asking the right questions at the outset of your design process.
 
A temporal database stores results over time rather than just the current period's results. These data are critical to strategic decision-making and must be included in a strategic dashboard.
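Here is a minimal sketch of the idea using Python's built-in sqlite3 module (the table, metric, and figures are hypothetical): each period's result is appended rather than overwritten, so a strategic dashboard can query the whole trend.

    # Temporal storage sketch: append each period's result instead of
    # overwriting a single current value. Schema and data are hypothetical.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE kpi_history (metric TEXT, period TEXT, value REAL)")
    conn.executemany(
        "INSERT INTO kpi_history VALUES (?, ?, ?)",
        [("revenue_k", "2023-Q1", 410.0),
         ("revenue_k", "2023-Q2", 455.0),
         ("revenue_k", "2023-Q3", 430.0),
         ("revenue_k", "2023-Q4", 515.0)],
    )

    # A strategic dashboard queries the full history to show the trend.
    for period, value in conn.execute(
            "SELECT period, value FROM kpi_history "
            "WHERE metric = 'revenue_k' ORDER BY period"):
        print(period, value)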
 
Executive support is among the most important critical success factors because the executive makes everything else happen. Executives are responsible for enabling enterprise-wide, data-driven decision-making: making sure the right resources are allocated to the initiative, assigning the right people to the team, and obtaining commitments from departments across the organization. They are also responsible for helping the team work across departmental silos to access all of the organization's data.
 
To review, see Performance Dashboard Design.
 

Unit 7 Vocabulary 

This vocabulary list includes the terms that you will need to know to successfully complete the final exam.

  • analytic dashboard
  • dashboard
  • key performance indicator (KPI)
  • operational dashboard
  • strategic dashboard
  • temporal database

Unit 8: Project Management

8a. Evaluate how well an analysis aligns with a problem statement to ensure that requirements are appropriately addressed 

  • Why are requirements definition and management so critical in BI projects?
  • What process steps should be implemented to ensure that requirements are developed effectively?

Once analysts make their estimates or assessments, the management team has to determine how to turn the information into action. The organization's health and future depend upon its intelligence. There are various ways to manage the process, from the daily operations of the analytic team to information integration throughout the organization for optimal decision-making at all levels.
 
If a project is not well managed, it is likely to go off the rails. No matter how well the individual analysts or other team members perform, there will be no cohesion and no certainty that all requirements are fully met.
 
Requirements management establishes stakeholders' wants and needs and then reviews these to create a set of baseline requirements for solutions development and benefits management. Its goals are to ensure that all relevant stakeholders have the opportunity to express their wants and needs, to reconcile multiple stakeholder requirements into a single viable set of objectives, and to achieve stakeholder consensus on a baseline set of requirements.
 
A clear and agreed expression of requirements and their acceptance criteria is essential for the success of any project, program, or portfolio. Requirements may be expressed as physical deliverables, business benefits, aspirations, functions, or technical needs.
 
The designer of the BI system should be able to identify all relevant information and manage the process of working with users to develop scope definitions for the system. The developer needs to realize that the end-user may not always know exactly what they will need the system to do and should be able to fill in any gaps.
 
This chart illustrates the relationship between benefits, solutions, and requirements.


To review, see Requirements Management.
 

8b. Explain how an organization can ensure that BI reporting has value to its decision-making teams 

  • What are the characteristics of effective BI teams?
  • How can BI teams be managed?
  • What are the features of effective project report writing?

A key requirement for business intelligence systems and processes is supporting decision-making teams. Part of this is the effective management of analytic teams. Optimal analytic teams are diverse, with varying perspectives and skill sets and a healthy respect for each member's area and level of competence. The team will quickly learn which tasks naturally fall to which member, depending upon each member's competence. Eventually, the team itself will develop its own personality, which is a positive sign in a highly functional team. The key to developing the right team is to start with the right members. Success begins with recruitment – put the right people in the right seats on the bus and let them figure out where they need to drive.
 
Even with the best members, teams undergo a growth process. If well managed, they form by getting to know each other, establishing norms, and creating healthy work patterns; then they perform well, receiving positive feedback from the manager and the decision-maker that makes them stronger for the next project. Storming can occur at any point in the project, sometimes early, as members scope out their turf and become aware of others' strengths and weaknesses, and sometimes later, when project pressures increase. Research shows that when bonds appear to be fraying, the skillful manager gets "out of the way of the storm" so the team can work out its issues, then re-engages to ensure the team can proceed effectively.
 
The stages of team development are:

  1. Forming – individuals focus on defining and assigning tasks, establishing a schedule, organizing the team's work, and other start-up matters
  2. Storming – members begin to share ideas about what to do and how to do it that compete for consideration
  3. Norming – a period focused on developing shared values about how team members will work together
  4. Performing – team members work together easily on interdependent tasks and can communicate and coordinate effectively

Technical writing is the way that development teams report their progress. Technical writing is precise writing. Vague, overly general, hyperbolic, or subjective/ambiguous terms are not appropriate in this genre. You do not want to choose words and phrasing that could be interpreted in more than one way.
 
Technical reports should contain writing that is:

  1. Coherent – Ensures that the reader can easily follow your ideas and your train of thought
  2. Concise – Uses the least words possible to convey the most meaning while still maintaining clarity
  3. Concrete – Involves using specific, precise language to paint a picture for your readers so that they can more easily understand your ideas
  4. Complete – Includes all requested information and answers all relevant questions
  5. Courteous – Uses a respectful, professional tone that considers the reader's perspective and needs

To review, see Building Successful Teams.
 

8c. Evaluate what management approaches to business intelligence would be most effective for your business case 

  • What role does the project charter play in managing the project?
  • Why is risk identification and management an important part of project management?

The project charter is a high-level document describing the project and its purpose. It should be concise; specific details about the project belong in the project scope and plan documents. The charter should focus on the big-picture strategy of the project. The purpose, the justification, and the project's "why" are critical charter elements. They need not be repeated in the scope documents after the project has been approved, since interested people can always be referred back to the charter.
 
Variance analysis represents a best practice in managing project scope as the project proceeds through the execution phase. It consists of producing regular project status reports, identifying deviations from the project plan, and determining how to address those deviations.
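As a minimal sketch (the task names, hours, and tolerance are hypothetical), variance analysis reduces to comparing planned values against actuals and flagging deviations that exceed an agreed tolerance:

    # Variance-analysis sketch; tasks, hours, and tolerance are hypothetical.
    planned = {"requirements": 80, "ETL build": 200, "dashboard": 120}
    actual  = {"requirements": 95, "ETL build": 190, "dashboard": 160}
    TOLERANCE = 0.10  # flag deviations larger than 10% of plan

    for task, plan in planned.items():
        variance = actual[task] - plan  # positive = over plan
        pct = variance / plan
        flag = "  <-- investigate" if abs(pct) > TOLERANCE else ""
        print(f"{task:12s} plan={plan:4d}h actual={actual[task]:4d}h "
              f"variance={variance:+4d}h ({pct:+.0%}){flag}")

In a status report, the flagged rows are the deviations that require a decision: re-baseline the plan, add resources, or cut scope.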
 
Risk management is critical for several reasons. If your organization lacks an effective organizational or departmental plan for anticipating and mitigating IT risk, your tech tools could become unavailable at a critical time in your project execution. You may also discover that your platform is missing key features or add-ons, making your project more difficult or even impossible to complete as planned. It is therefore crucial to run through the collection, analysis, and reporting processes you intend to use and make sure you have the tools to conduct them efficiently. It is also important to ensure you have tools your personnel can use: not all data collection, warehousing, exploitation, and reporting tools have the same functionality and ease of use, and if your team is not fully acquainted with your system, you may face significant delays and a lack of output. These issues must be worked out with your organization or department before committing to the project.
 
Problems with data can also stem from a lack of IT tools, but they are more likely to relate to the availability and quality of your data. You may need a dataset your organization does not subscribe to, or you may obtain data only to discover it is incomplete or inaccurate. Once you have the data you think you need, it may require more cleaning and normalization than you anticipated, or it may not relate as directly to the specific requirement(s) of your project as you had expected.
 
To review, see Risk Management Planning.
 

Unit 8 Vocabulary

This vocabulary list includes the terms that you will need to know to successfully complete the final exam.

  • coherent writing
  • complete writing
  • compromise building
  • concise writing
  • concrete writing
  • conflict resolution
  • consensus formation
  • courteous writing
  • deliverable
  • forming
  • norming
  • performing
  • project charter
  • risk management
  • storming
  • variance analysis