
Archive for data governance

Successful Data Governance for NetForum Users

AUG Meeting – Educational Case Studies

Data Governance is the foundation for creating value and minimizing risks related to data, one of your association’s largest assets. It doesn’t have to be difficult! We’ll take the mystery out of creating the processes and policies for sustainable, continuously improving data governance. Join us to hear how the American Geophysical Union (AGU) implemented Data Governance around netFORUM and other key source systems to:

• Ensure high quality data to support business operations and analytics
• Protect from significant fines related to gaps in personal data protection
• Reduce operating costs related to a lack of trust in reports and data
• Minimize confusion related to conflicting reports

Karine Blaufuss, Director, Business Data and Intelligence, American Geophysical Union (AGU)
Julie Sciullo, CEO, Association Analytics


Start the conversation at your association with the Data Governance Jumpstart Guide including:

• Roles and Responsibilities
• Strategic planning
• Tactical decision making
• Process Management
• Stewardship
• Organization structure
• Data Governance Policy
• Risk / Issue management process
• Data Quality management process


Download Your Jumpstart Guide


Data Governance – Bachelorette Style!

AMS Fest, Washington, DC, November 1-2, 2018

Your association management software captures a ton of rich member data. Today there are a growing number of tools that empower your organization to transform data into assets to be leveraged for informed decisions and increased value for your members. Welcome to the data economy! In exchange for their data, members and customers expect security, privacy, and additional value in return. Are you prepared to take on the high level of trust, responsibility, and usage polices that accompany this trade?

During this session, we will unpack this topic with roses – which data element is the most attractive to you? Allow us to introduce you to six eligible bachelors — Mr. Defined, Mr. Clean, Mr. Secure, Mr. Private, Mr. Built (Architecture), and Mr. Organized.

The decisions you make will play an important role as you develop strong, yet flexible, data governance for transactions in this dat(e)-a economy.

Session Objectives:
• Building a business case for governance to gain executive support
• Data Inventory 101
• Managing data permissions
• Balancing privacy with personalization

Julie Sciullo, CEO, Association Analytics
Gretchen Steenstra, Strategic Consultant, Technology Management, DelCor Technology Solutions

About AMS Fest DC
Thought Leader Sessions. Case Studies. Tactical and Strategic Breakouts. Platform Previews. More!
Association TRENDS is hosting AMS Fest, bringing you elbow-to-elbow with other association executives, AMS consultants, and AMS vendors who all want to engage in discussions around nothing but AMS. You’ll be sure to hear talks related to any (and maybe all) of the following: the future of AMS, what’s trending, what’s not working and what is, what’s the most innovative, whether associations or technology vendors should be the guiding force in development, what new systems are out there (there are many!), CRM vs. AMS, whether we should expect cryptocurrency to affect our systems in the future, preparation for mergers and acquisitions, data privacy laws, and the list goes on… see the schedule for yourself.


An Approach to Analytics both Hamilton and Jefferson Could Embrace

Happy 4th of July!  What a great time to think about data independence, democratization, and governance for your association.  In this post we’ll talk about the balance between the central management of data by IT and data directly managed by association staff.
Leading analytics tools provide great capabilities to empower people to make data-guided decisions. The ability to analyze diverse data from a breadth of sources in a way association staff can use is a key feature of these tools. Examples include Power BI Content Packs and Tableau Data Connectors. These range from pre-built data sources based on specific applications such as Dynamics, Google Analytics, and Salesforce, to less common “NoSQL” sources such as JSON, MarkLogic, and Hadoop data. These tools rapidly make data from specific applications available in formats for easy reporting, but can still lead to data silos. Tools such as Power BI and Tableau provide dashboard and drill-through capabilities to help bring these different sources together.

Downstream Data Integration

This method of downstream integration is commonly described as “data blending” or “late binding”. One application of this approach is a data lake that brings all data into the environment but integrates specific parts only when needed for analysis. This approach does present some risks, as the external data sources are not pre-processed to enhance data quality and ensure conformance. In addition, business staff can misinterpret data relationships, which can lead to incorrect decisions. This makes formal training, adoption, and governance processes even more vital to analytics success.
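The late-binding idea above can be sketched in a few lines of Python: raw extracts stay untouched, and only the fields needed for one analysis are joined at query time. All field names here (member_id, campaign, and so on) are hypothetical, not a real schema.

```python
# A minimal sketch of "late binding": two raw source extracts stay
# separate, and we blend them on a shared key only when analysis needs it.

ams_registrations = [
    {"member_id": 1, "event": "Annual Meeting", "amount": 450.0},
    {"member_id": 2, "event": "Annual Meeting", "amount": 450.0},
]
email_stats = [
    {"member_id": 1, "campaign": "Early Bird", "opened": True},
    {"member_id": 3, "campaign": "Early Bird", "opened": False},
]

def blend(left, right, key):
    """Join two raw extracts on a shared key at analysis time."""
    index = {row[key]: row for row in right}
    return [
        {**row, **index[row[key]]}
        for row in left
        if row[key] in index  # unmatched rows surface a data-quality gap
    ]

blended = blend(ams_registrations, email_stats, "member_id")
```

Note the risk the paragraph describes: only one member appears in both extracts, and the unmatched rows silently drop out, which is exactly the kind of relationship a business user could misinterpret without governance.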

What about the Data Warehouse?

When should you consider using content packs and connectors, and how does this relate to a data warehouse at your association? The key is understanding that they do not replace a data warehouse; they are actually an extension of it. Let’s look at a few scenarios and approaches.

  • Key factors to consider when combining data are how closely the data is linked to common data points from other sources, the complexity of the data, and the uniqueness of the audience. For example, people throughout the association want profitability measures based on detailed cost data from Dynamics, while the finance group has reporting needs unique to their group. An optimal approach is to bring cost data into the data warehouse while linking content pack data by GL codes and dates. This enables finance staff to visualize data from multiple sources while drilling into detail as part of their analysis.
  • Another consideration is the timeliness of data needed to guide decisions. While the data warehouse may be refreshed daily or every few hours, staff may need the ability to review data in real time, such as meeting registrations, this morning’s email campaign, or why web content has just gone viral. This is like the traditional “HOLAP” (Hybrid Online Analytical Processing) approach, where data is pre-aggregated while direct links to detailed source data are preserved. It is important to note that analytical reporting should not access source systems directly on a regular basis, but it can be used for scenarios such as reviewing exceptions and individual transaction data.
  • In some cases, you might not be sure how business staff will use data and it is worthwhile for them to explore data prior to integration into the data warehouse. For example, marketing staff might want to compare basic web analytics measures from Google Analytics against other data sources over time. In the meantime, plans can be made to expand web analytics to capture individual engagement, align the information architecture with a taxonomy, and track external clicks through a sales funnel. As these features are completed, you can use a phased approach to better align web analytics and promote Google Analytics data into the data warehouse. This also helps with adoption as it rapidly provides business staff with priority data while introducing data discovery and visualizations based on actual association data.
  • Another important factor is preparing for advanced analytics. Most of what we’ve described involves interactive data discovery using visualizations. In the case of advanced analytics, the data must be in a tightly integrated environment such as a data warehouse to build predictive models and generate results to drive action.
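The first scenario above, linking content-pack data to warehouse facts by GL codes and dates, can be sketched like this. All figures and field names (gl_code, revenue, cost) are invented for illustration, not a real schema.

```python
# A hedged sketch: warehouse revenue facts are linked to content-pack cost
# data by (GL code, month), so finance staff can see profitability without
# the cost detail living in the warehouse itself.

warehouse_revenue = [
    {"gl_code": "4100", "month": "2018-06", "revenue": 12000.0},
    {"gl_code": "4200", "month": "2018-06", "revenue": 8000.0},
]
content_pack_costs = [
    {"gl_code": "4100", "month": "2018-06", "cost": 7000.0},
    {"gl_code": "4200", "month": "2018-06", "cost": 9500.0},
]

# Index the external cost data by the shared linking keys.
cost_index = {(c["gl_code"], c["month"]): c["cost"] for c in content_pack_costs}

profitability = [
    {
        "gl_code": r["gl_code"],
        "month": r["month"],
        "profit": r["revenue"] - cost_index.get((r["gl_code"], r["month"]), 0.0),
    }
    for r in warehouse_revenue
]
```

Here GL code 4100 nets a positive profit while 4200 runs at a loss, the kind of cross-source measure that would otherwise require bringing all cost detail into the warehouse.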

It’s not about the Tools

The common element is that using data from sources internal and external to your association requires accurate relationships between these sources, a common understanding of data, and confidence in data to drive data-guided decisions. This makes developing an analytics strategy with processes and governance even more important. As we’ve said on many occasions: it’s not about the tools, it’s the people.
Your association’s approach to data democratization doesn’t need to rise to the level of a constitutional convention or lead to overly passionate disputes.

How to Develop a Data Governance Policy

Do you have a data governance policy for your association? In this post, we’ll discuss why data governance is important and what your policy should include.

Why You Need a Data Governance Policy

Data is an asset, just like cash, buildings, and people. Just like other assets, data requires strong, consistent management.
When we neglect data, the results aren’t pretty – you get data quality issues, conflicting information, and confused staff who don’t know how to get answers. The result is lower quality data that causes mistrust and frustration. Staff turn to other means — intuition, politics, and tradition — when it’s not easy to make data-guided decisions and that can cost your organization.
Data governance is a cross-functional management activity that, at its core, recognizes data as an enterprise asset. A data governance policy will ensure that your association is treating data as an asset.

Developing a Data Governance Policy

Data governance policies can be authored by an internal team, although you may want to consider hiring an outside consultant if you have a large amount of data and many data systems, or would like a third-party partner to provide an objective perspective. Here’s a 10-step process for developing your own policy.

  1. Communicate the value of data governance internally to business users and leadership. If your organization doesn’t currently have data governance, you may need to establish a business case. Consider the cost of the current situation and also the possible savings if your organization had data governance.
  2. Build a Data Governance Team. An internal team can help manage data governance and help ensure cross-departmental support.
  3. Assess the current state of data governance within IT departments and business operations.
  4. Determine roles and responsibilities. A RACI chart could help you map out who is responsible, who is accountable, who needs to be consulted, and who should be kept informed about changes.
  5. Establish expectations, wants, and needs of key stakeholders through interviews, meetings, and informal conversations. This serves two purposes – you get valuable input but it’s also an opportunity to secure buy-in.
  6. Draft your policy and ask key stakeholders to review it and endorse it.
  7. Communicate the policy to all stakeholders. This could be a combination of group meetings and training, one-on-one conversations, recorded training videos, and written communication. Remember to consider others’ learning and communication preferences when selecting how to communicate.
  8. Establish performance metrics and a way to monitor adherence to the policy.
  9. Review performance regularly with your data governance team.
  10. Keep the policy fresh. Regularly review your data governance policy to make sure it reflects the current needs of the organization and stakeholders.
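The RACI chart from step 4 can be kept as plain data so it is easy to validate, for example that every data activity has exactly one Accountable owner. The roles and activities below are illustrative examples, not a prescribed chart.

```python
# A minimal sketch of a RACI chart as data, with one sanity check:
# each activity should have exactly one Accountable ("A") role.

raci = {
    "Define data quality standards": {
        "Data Owner": "A", "Data Steward": "R", "IT": "C", "Staff": "I",
    },
    "Run quality audits": {
        "Data Owner": "I", "Data Steward": "A", "IT": "R", "Staff": "I",
    },
}

def check_single_accountable(chart):
    """Return the activities that don't have exactly one 'A' assignment."""
    return [
        activity
        for activity, roles in chart.items()
        if list(roles.values()).count("A") != 1
    ]

problems = check_single_accountable(raci)
```

An empty result means every activity has one clear accountable owner, which is the property a RACI chart exists to guarantee.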

What Your Data Governance Policy Should Include

Unsure of what to include in your written Data Governance Policy? There are a number of factors to consider when developing a data governance policy and ensuring that data governance is adopted at a cultural level. Here’s an outline of a policy that you can adapt for your organization.

  • Goals – Establish overarching goals for each area below. Establish performance metrics so you can evaluate success.
  • People – Define the key data-related roles throughout the organization. For every data system, identify the data stewards who manage data as a corporate asset and focus on data quality; the data owners who have decision-making authority and define data quality standards; and the IT staff who provide technical support and monitor compliance. Consider using a RACI chart for this section.
  • Data Inventory – Inventory and document all data sources. Regularly review the inventory to include new sources and remove old sources.
  • Data Content Management – Identify purposes for which data are collected. Communicate purposes to staff and customers. Review collection policies regularly.
  • Data Records Management – Develop and adhere to policies that define how records should be created, maintained, and deleted.
  • Data Quality – Assign responsibility for data quality to appropriate staff. A data steward should perform regular audits to ensure quality.
  • Data Access – Define permissions and who has access to what systems.
  • Data Security – Define policies around data security, sharing of data, access to data. Include a risk assessment in this section that indicates the risk and the probability of risk occurrence.


How Can Associations Use SQL 2012 Data Quality Services (DQS)?


This is what bad data is like…

Data Quality

How valuable is the Ford Pinto brand?  How about a patent for a cat scuba suit?  Like other intangible assets, the value of data is rooted in its quality.  Nonprofits have an even greater motivation to maintain the integrity of their information because their organizational success depends on communicating effectively using membership data.  One of the data quality management tools we deploy at DSK Solutions is SQL Server 2012 Data Quality Services (DQS).  DQS is especially valuable because it offers an efficient, semi-automated means for associations to create a data quality foundation on which to build their enterprise analytics.  By identifying data attributes and functional dependencies, DQS can effectively correct bad entries (cleansing) and eliminate duplicate records (matching).

The Benefits of Data Quality Services

First, DQS has the powerful ability to automatically discover knowledge about your data.  Even with only a sample of the larger data set, DQS can identify inconsistent, incomplete, and invalid data.  For example, using Term-Based Relations (TBRs), DQS can identify strings that are inconsistent with the rest of the entries in that column.  So, if ninety-nine of your entries use “123 Oak St” as the street address and one uses “124 Oak St,” DQS will correct the odd entry to be consistent.  Additionally, developers can build domain rules that define the correct format or value.  For example, if a user email does not follow the pattern “something@something.com”, DQS can either mark the entry as invalid for later review or automatically update it with the missing characters.
Next, DQS can check for consistency throughout the record.  Using third-party reference tools or user-defined rules, associations can validate that data is logical.  For example, if an entry lists a member city as “Chicago” and member state as “DC,” DQS can identify the inconsistency and either mark it as invalid or correct it to “IL.”  Another valuable feature is that users can develop matching rules to determine duplicate entries.  For example, if two records are 95% similar (again, based on user-defined rules), DQS can eliminate duplicate rows and consolidate the data into one unique entry.
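The rule logic DQS applies, a domain rule for format and a cross-field consistency rule, can be sketched in plain Python. This is an illustration of the idea, not the DQS API; the reference table and field names are invented.

```python
# A sketch of two DQS-style rules: a domain rule (email format) and a
# cross-field consistency rule (city must agree with state), with a
# suggested correction where the reference data allows one.
import re

# Domain rule: flag emails that don't match a basic user@domain.tld shape.
EMAIL_RULE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

# Reference data for the consistency rule (illustrative, not exhaustive).
CITY_STATE = {"Chicago": "IL", "Washington": "DC"}

def validate(record):
    """Return rule violations as (field, problem, suggested_fix) tuples."""
    issues = []
    if not EMAIL_RULE.match(record.get("email", "")):
        issues.append(("email", "invalid", None))  # no automatic fix offered
    expected = CITY_STATE.get(record.get("city"))
    if expected and record.get("state") != expected:
        issues.append(("state", "inconsistent", expected))  # suggest correction
    return issues

record = {"email": "jdoe@example", "city": "Chicago", "state": "DC"}
issues = validate(record)
```

Run against the Chicago/DC example from the text, the checker flags the malformed email as invalid and suggests correcting the state to “IL”, mirroring DQS’s mark-or-correct behavior.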

Two types of data profiling

Finally, DQS has an effective user-interface for controlling the discovery and cleansing process.  A DQS project steps through mapping the fields to rule domains, creating results that rate data on completeness and accuracy, and managing the project results.
Unfortunately, DQS is not a magic bullet.  There are some challenges to implementing DQS for large databases.  For example, implementing DQS for an AMS/CRM involves many important steps.  First, analysts, like DSK Solutions, consolidate problem data into a single table or view (DQS transformations work on one table, not entire databases).  Next, associations cleanse and match the data using a combination of the DQS SSIS transformation and manual data verification.  Finally, data experts reintegrate the groomed data back into the original table structure (including considerations for timing, normalizing, and other SQL scripting).

DQS Implementation for netFORUM

To conclude, it is important to note that DQS is knowledge-driven, meaning it will take data-oriented managers to develop a strategy for the final asset.  As the nonprofit world embraces data, DQS will play a pivotal role in creating the level of quality necessary to build an effective business intelligence infrastructure.

Data is an Asset

Data is one of the most important assets an association has because it defines each association’s uniqueness. You have data on members and prospects, their interests and purchases, your events, speakers, your content, social media, press, your staff, budget, strategic plan, and much more. But is your data accurate and are you using it fully? Your data is an asset and should be carefully cultivated, managed and refined into information which will allow you to better serve your community and ensure you remain viable in today’s competitive landscape.
Although data is one of the most important ‘raw materials’ of the modern world, most organizations do not treat it that way. In fact, according to The Data Warehousing Institute, the cost of poor data quality in America is six hundred billion dollars every year. Data quality issues are also the cause of many failed IT projects.
Your data is talking to you, are you listening?
Associations have known for a long time that data is essential for market segmentation. However, there is so much more that can be done to harness data and use it as a strategic asset. Hidden within your data are stories about which members are at risk of not renewing, which prospects are likely to join, who might make a good speaker, where the best location for your next event is, the level to which you can raise rates without a decrease in member count, your best strategy for global expansion, and much more. We would be wise to listen to the stories our data is telling us, and to make sure the data on which they are based is accurate.
The insights you glean from your data are only as good as the underlying data itself. It’s obvious that if the input is flawed, the output will be misleading. When it comes to data, there is a direct correlation between the quality of the data and the accuracy of the analysis. I’m no longer surprised at the high number of duplicate records, and the high percentage of incomplete, inaccurate and inconsistent data we find when we begin to analyze an association’s data. Because it is difficult to quantify the value of data in the same way we can measure cash, buildings and people, the activities designed to manage and protect data as an asset are often low on the priority list. That is, until a business intelligence or analytics project is undertaken. Then suddenly data quality management (DQM) takes center stage.
DQM is a Partnership between Business and IT
Business responsibilities include: 1) determining and defining the business rules that govern the data, and 2) verifying data quality.  IT responsibilities include: 1) establishing the architecture, technical facilities, systems, and databases, and 2) managing the processes that acquire, maintain, and disseminate data.
DQM is a Program, Not a Project
DQM is not really a “project” because it doesn’t “end”. Think of DQM as a program consisting of the following activities:

  • Committing to and managing change – are the benefits clear and is everyone on board?
  • Describing the data quality requirements – what is the acceptable level of quality?
  • Documenting the technical requirements – how exactly will we clean it up and keep it clean?
  • Testing, validating, refining – is our DQM program working?  How can we make it better?

DQM is Proactive and Reactive
The proactive aspects of DQM include: establishing the structure of a DQM team, identifying the standard operating procedures (SOPs) that support the business, defining “acceptable quality”, and implementing a technical environment.  The reactive aspects include identifying and addressing existing data quality issues. This includes missing, inaccurate or duplicate data. For example:

  1. Important data may be missing because you have never collected it. The information you have on a member may allow you to send a renewal notice, but it’s not enough to determine their level of interest in the new programs you are rolling out in the coming year. Or the information you have on your publications is enough to sell them online, but because the content is not tagged in a way that matches customer interest codes, you can’t serve up recommendations as part of the value your association offers. Associations must have not only accurate data but also more data in order to fully understand the contextual landscape in which members and prospects operate.
  2. When organizations merge, data from the two separate organizations needs to be combined, and it can often be very time-consuming to determine which aspects of the record to retain and which to retire. A determination must also be made about how to handle the historical financial transactions of the merged company.
  3. With visitors able to create their own records online, duplicate records are on the rise. The Jon Smith who purchased a publication online is really the same Jonathan Smith who attended the last three events and whose membership is in the grace period. Because he used a different email address, a duplicate record was created and you missed the opportunity to remind him of the value of his membership when he registered for the event.
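The Jon Smith / Jonathan Smith scenario in item 3 can be sketched with a simple name-similarity rule. The threshold here is user-defined, as in the DQS matching rules discussed earlier; we use a lower one than 95% because these two names differ by several characters. Records and threshold are purely illustrative.

```python
# A sketch of duplicate detection: two records with different email
# addresses are matched on fuzzy name similarity instead.
from difflib import SequenceMatcher

def similarity(a, b):
    """Similarity ratio in [0, 1] between two normalized name strings."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

record_a = {"name": "Jon Smith", "email": "jon@home.example"}
record_b = {"name": "Jonathan Smith", "email": "jsmith@work.example"}

THRESHOLD = 0.75  # user-defined matching rule, chosen for this example
is_duplicate = similarity(record_a["name"], record_b["name"]) >= THRESHOLD
```

A match above the threshold would queue the pair for consolidation into one unique entry; in practice you would also compare address, phone, and other fields before merging.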

Sometimes it’s not until a data quality issue surfaces in a publicly embarrassing way that an organization decides to truly tackle the problem: a board report has erroneous data; the association cannot reply quickly to a request for statistics from an outside source, losing the PR opportunity; the CEO cannot determine the primary contact for an organization in the system. It’s usually only after several situations like these that DQM receives serious attention, and it is unfortunate that it often starts with a search for blame. This engenders fear, which threatens the success of a DQM initiative. It is essential that DQM programs begin with acceptance of the current state and commitment to a better future. A promise of “amnesty” with regard to what happened in the past can go a long way toward fostering buy-in for the program.
How do you Eat an Elephant?
The easiest way to start a DQM program is to start small. Identify an area that requires attention and focus first on that. In order to obtain support from key stakeholders, show how the program ties in with the association’s strategic plan.  After you identify the primary focus (for some it might not be company names, it might be demographics), set an initial timeframe (such as 3 months). Make the first project of the program manageable so you can obtain a relatively quick win and work the kinks out of your program.
Steps for your First DQM Initiative:

  1. Create a new position or assign primary DQM responsibilities to an individual
  2. Build a cross functional team and communicate the value of the program
  3. Decide how to measure quality (for example, the number of records reviewed and cleaned)
  4. Set a goal (# records)
  5. Reference the goal in the performance evaluation of the individuals on the team
  6. Evaluate progress
  7. Revise
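Steps 3, 4, and 6 above reduce to a simple progress metric: records cleaned against the goal at a review checkpoint. The numbers and the 60% checkpoint are invented for illustration.

```python
# A minimal sketch of tracking a first DQM initiative against its goal.

GOAL = 5000            # step 4: target number of records
records_cleaned = 3250 # step 3's measure, tallied at the review

def evaluate(cleaned, goal, checkpoint=0.60):
    """Step 6: compare progress against the goal at a review checkpoint."""
    progress = cleaned / goal
    status = "on track" if progress >= checkpoint else "needs attention"
    return progress, status

progress, status = evaluate(records_cleaned, GOAL)
```

Tying a concrete number like this to the team’s performance evaluations (step 5) is what keeps the program from quietly stalling after the initial three-month window.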

Data is an asset and is one of the most important assets an association has because it is unique in its detail and context and can be used strategically to ensure we remain relevant and viable. When analyzed and leveraged properly it can provide a competitive advantage in attracting and retaining members and creating new sources of non-dues revenue. It is important that the underlying data is accurate and complete and a well-organized DQM program should be considered essential. Worldwide attention is being given to the importance of making “data-driven decisions”. Private industry and government have been using data to guide their decisions for many years and now is the time for associations to recognize data as the valuable asset it is.

Balanced Scorecard and Business Intelligence

Sometimes when an organization begins a business intelligence (BI) initiative, they are so excited about data visualization and data transparency in the form of dashboards that the first thing they want to do is start measuring everything.  I believe that strategy comes before measures, and organizations that thoughtfully and purposefully align what they are measuring to their strategic plan achieve more meaningful long-term results from their BI initiative.

The Balanced Scorecard is a performance management system designed to align, measure, and communicate how well an organization’s activities are supporting the strategic vision and mission of the organization.
It was originated by Drs. Robert Kaplan (Harvard Business School) and David Norton as a performance measurement framework that added strategic non-financial performance measures to traditional financial metrics to create a more ‘balanced’ view of organizational performance.  Four strategic perspectives are addressed within the Balanced Scorecard framework:
  1. Customer 
  2. Financial 
  3. Internal Processes – commonly includes technology, systems, etc.
  4. Learning and Growth (aka “Organization Capacity”) – commonly includes people, training, etc.
Objectives (goals) are set for each perspective.  Measures (numbers) that represent things to be measured (such as sales, customers, or returns) are identified and can then be transformed into ratios or counts, which serve as Key Performance Indicators (KPIs).  Initiatives (projects) are undertaken in order to “move the needle” in a positive direction on the KPI gauge for that measure.
Balanced Scorecard dashboards include both leading and lagging indicators.  For example, customer and financial KPIs are traditionally lagging indicators – the numbers indicate what has already happened.  KPIs for the two perspectives of internal processes and learning/growth are leading indicators.  This is because positive results achieved with respect to internal processes and learning/growth initiatives should lead to a positive result in the customer and financial KPIs.
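The measure-to-KPI step above can be sketched as data: each measure carries its perspective and a leading/lagging tag, and its status is simply actual against target. All figures are invented for illustration.

```python
# A minimal sketch of Balanced Scorecard KPIs: lagging (customer,
# financial) and leading (process, learning) indicators read side by side.

measures = [
    {"name": "Member retention", "perspective": "Customer",
     "actual": 0.87, "target": 0.90, "indicator": "lagging"},
    {"name": "Staff trained on BI tools", "perspective": "Learning and Growth",
     "actual": 0.75, "target": 0.70, "indicator": "leading"},
]

def kpi_status(m):
    """A KPI 'moves the needle' positively when actual meets or beats target."""
    return "green" if m["actual"] >= m["target"] else "red"

statuses = {m["name"]: kpi_status(m) for m in measures}
```

Here the lagging retention KPI misses its target while the leading training KPI is green, the pattern the paragraph describes: positive leading indicators should pull the lagging customer and financial numbers up later.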
Gartner, a leader in the field of information technology research, organizes BI capabilities into three main categories: analysis, information delivery, and integration.  The concept of “scorecards” fits into the analysis category.  Gartner recognizes that tying the metrics displayed in a dashboard to an organization’s strategy map ensures that the most important things are being measured, because each measure on a scorecard is tied to the organization’s strategic plan.  Sounds obvious, right?  But it’s still relatively rare, and that’s a subject for another post.
