
Refreshing Tableau Public Data


Tableau Public is a free tool for sharing your visualizations with the world.  One question we often hear is how to refresh the data behind a Tableau Public visualization.  Tableau Public performs an automatic, nightly refresh of Google Sheets data with no extra work on your side.  However, a bit of additional work is required to automatically refresh the Google Sheets data itself.  In our example we pull data from an AWS Redshift database using Python 3.6.
Google Sheets Set-Up
This set-up will create the credentials necessary to refresh your Google Sheets data.

  • Create a project in Google Developers Console https://console.developers.google.com/flows/enableapi?apiid=drive. Create an account if you have not already done so.
  • Click on Library and search for the Google Drive API, and verify the API is enabled.
  • Next, click on Credentials. Then click on create credentials -> OAuth Client ID, choose “Other” as the application type and enter a name for your credentials.  You will need these credentials to connect to Google Sheets.
  • Now, download the client secret file for the client ID you just created. Move this file to your working directory and rename it client_secret.json.  For example, C:\xxxx\xxxx\.credentials
  • Go into your Google Drive account and create a blank google sheet and copy the File ID from the url. https://docs.google.com/spreadsheets/d/1kadJ8y1_VZnKGqPx6ohA3F-OdHtablmyzr2ZF6nuH18/edit#gid=0.  You will need this ID for your Python script.

Python Scripts

  • For this script you will need to install both the psycopg2 and google-api-python-client libraries.
  • Now create your Python script, which begins with getting your credentials. You can find a get_credentials function in Step 3 of the Google Sheets developers guide.
  • Once you have your get_credentials function in place, you will need to connect to your Redshift environment and insert your SQL query. You can connect to Redshift using the psycopg2 package.
  • We then use Python’s csv writer to write the query results to a CSV file.
  • Now overwrite your Google Sheets file with the new data. This requires the Google Drive API.
  • When you run your script, a browser window will open and ask you to authorize access. This ensures your code has rights to write to Google Sheets. It will only happen the first time the script is run.
  • Once you have this set up, you can schedule the Google Sheets refresh to run before the Tableau Public refresh, which occurs around 6 a.m. ET. A consolidated sketch of the script follows this list.
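
Here is a minimal sketch of the full script, assuming the get_credentials function from Step 3 of the Google developers guide is already in place. The connection details, file ID, and SQL query are placeholders you will need to replace with your own.

    import csv

    import psycopg2
    from googleapiclient.discovery import build
    from googleapiclient.http import MediaFileUpload

    SHEET_FILE_ID = "your-google-sheet-file-id"  # the File ID copied from the sheet's URL

    # Query Redshift and write the results to a CSV file
    conn = psycopg2.connect(
        host="your-cluster.redshift.amazonaws.com",  # placeholder
        port=5439,
        dbname="your_database",
        user="your_user",
        password="your_password",
    )
    with conn, conn.cursor() as cur:
        cur.execute("SELECT ...")  # your SQL query
        with open("refresh.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow([col[0] for col in cur.description])  # header row
            writer.writerows(cur.fetchall())

    # get_credentials() is the helper from Step 3 of the Google developers guide.
    # If your version of the guide returns oauth2client credentials, use
    # build("drive", "v3", http=credentials.authorize(httplib2.Http())) instead.
    credentials = get_credentials()
    drive = build("drive", "v3", credentials=credentials)

    # Overwrite the existing Google Sheet with the new CSV contents;
    # Drive converts the uploaded CSV back into spreadsheet format.
    media = MediaFileUpload("refresh.csv", mimetype="text/csv")
    drive.files().update(fileId=SHEET_FILE_ID, media_body=media).execute()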

I hope this helps you get started with Tableau Public’s built-in data refresh so you can share your visualizations with the world.
 

DAX the Way – with Power BI

DAX stands for Data Analysis eXpressions, and it is the formula language used in Power BI.  It’s similar to Excel formulas, and it’s an extremely powerful language for creating measures and calculated fields within your Power BI data model.  The downside is that it requires a bit of a learning curve to get up to speed.  There are a lot of great resources out there for learning DAX, and to help us along, the April release of Power BI Desktop included a preview feature called Quick measures.
Quick measures allow us to quickly and easily create common and powerful calculations.  A Quick measure generates the DAX for us based on the input provided in a dialog box, then presents the results in the report. Best of all, we can see the resulting DAX formula that will be executed, and use that to expand our knowledge of DAX.  Learning by example is always the best way in my book.

We create Quick measures by right-clicking a field in the Fields well, then selecting Quick measures from the menu that appears. Also, we can right-click any value in the Values pane of an existing visual. There are 6 categories of calculation types with a number of calculations for each.  As of July 2017, the Quick measure feature is still in “preview” mode, so to see it, we have to enable it.  Select File > Options and Settings > Options > Preview Features, then select the checkbox beside Quick measures. Restart Power BI Desktop after you make the selection.
When creating a Quick measure, a menu will pop up which guides us through the creation of the specific measure.  It’s a little confusing at first but after a little trial and error things start to make sense.
Another cool thing is that there’s a Quick Measures Gallery in the Power BI community site where DAX experts have shared some advanced DAX snippets.  These community contributions often make it into the Power BI Desktop Quick measures menu in future PBI Desktop releases.
So give Quick measures a try and soon we’ll all be singing DAX the way. Uh huh, uh huh. I like it!

An Approach to Analytics both Hamilton and Jefferson Could Embrace

Happy 4th of July!  What a great time to think about data independence, democratization, and governance for your association.  In this post we’ll talk about the balance between the central management of data by IT and data directly managed by association staff.
Leading analytics tools provide great capabilities to empower people to make data-guided decisions. A key feature of these tools is the ability to analyze diverse data from a breadth of sources in a way that is usable for association staff. Examples include Power BI Content Packs and Tableau Data Connectors. These range from pre-built data sources for specific applications such as Dynamics, Google Analytics, and Salesforce, to less common “NoSQL” sources such as JSON, MarkLogic, and Hadoop. These tools rapidly make data from specific applications available in formats for easy reporting, but can still lead to data silos. Tools such as Power BI and Tableau provide dashboard and drill-through capabilities to help bring these different sources together.

Downstream Data Integration

This method of downstream integration is commonly described as “data blending” or “late binding”. One application of this approach is a data lake, which brings all data into the environment but integrates specific parts of the data only when needed for analysis. This approach does present some risks: the external data sources are not pre-processed to enhance data quality and ensure conformance, and business staff can misinterpret data relationships, which can lead to incorrect decisions. This makes formal training, adoption, and governance processes even more vital to analytics success.

What about the Data Warehouse?

When should you consider using content packs and connectors, and how do they relate to a data warehouse at your association? The key is understanding that they do not replace a data warehouse but are actually an extension of it. Let’s look at a few scenarios and approaches.

  • Key factors to consider when combining data are how closely the data is linked to common data points from other sources, the complexity of the data, and the uniqueness of the audience. For example, people throughout the association want profitability measures based on detailed cost data from Dynamics, while the finance group has reporting needs unique to their group. An optimal approach is to bring cost data into the data warehouse while linking content pack data by GL codes and dates. This enables finance staff to visualize data from multiple sources while drilling into detail as part of their analysis.
  • Another consideration is the timeliness of data needed to guide decisions. While the data warehouse may be refreshed daily or every few hours, staff may need an immediate, real-time ability to review data such as meeting registrations, this morning’s email campaign, or why web content has just gone viral. This is like the traditional “HOLAP”, or Hybrid Online Analytical Processing, approach, where data is pre-aggregated while direct links to detailed source data are preserved. It is important to note that analytical reporting should not directly access source systems on a regular basis, but direct access can be used for scenarios such as reviewing exceptions and individual transaction data.
  • In some cases, you might not be sure how business staff will use data and it is worthwhile for them to explore data prior to integration into the data warehouse. For example, marketing staff might want to compare basic web analytics measures from Google Analytics against other data sources over time. In the meantime, plans can be made to expand web analytics to capture individual engagement, align the information architecture with a taxonomy, and track external clicks through a sales funnel. As these features are completed, you can use a phased approach to better align web analytics and promote Google Analytics data into the data warehouse. This also helps with adoption as it rapidly provides business staff with priority data while introducing data discovery and visualizations based on actual association data.
  • Another important factor is preparing for advanced analytics. Most of what we’ve described involves interactive data discovery using visualizations. In the case of advanced analytics, the data must be in a tightly integrated environment such as a data warehouse to build predictive models and generate results to drive action.

It’s not about the Tools

The common element is that using data from sources internal and external to your association requires accurate relationships between these sources, a common understanding of data, and confidence in data to drive data-guided decisions. This makes developing an analytics strategy with processes and governance even more important. As we’ve said on many occasions: it’s not about the tools, it’s the people.
Your association’s approach to data democratization doesn’t need to rise to the level of a constitutional convention or lead to overly passionate disputes.

How Do I Create an Analytics Strategy & Roadmap for My Association?

When I think about the many reasons an association should create an Analytics Strategy and Roadmap, I am reminded of Stephen Covey’s great advice: “Begin with the end in mind.”  Business intelligence and analytics implementations deliver the most value when the analytics strategy is connected to the overall strategy of the association.  After all, a strategic roadmap is a plan that defines where an organization is, where it wants to go, and how it will get there.  However, as the famous quote from Lewis Carroll reminds us, “If you don’t know where you want to go, any road will lead you there.”

8 Primary Goals for Analytics Strategy and Roadmap

During the 18 years we have been helping associations determine how to use data to make decisions, we have identified 8 primary goals for an Analytics Strategy & Roadmap process:

  1. Align analytics objectives with the organizational strategic plan.
  2. Recognize the importance of data as a key organizational asset that requires oversight and governance to ensure quality.
  3. Assess and document the current state of Data, Technology, Processes and Culture.
  4. Quantify the direct and indirect costs of the current situation (data that is not clean, accessible, understood, or consistently used for decisions).
  5. Identify achievable desired outcomes and understand their value.
  6. Prioritize these outcomes according to business impact, technical complexity, and organizational considerations.
  7. Educate executives and staff about what is possible and what to expect from an analytics initiative.
  8. Establish a high level plan for implementing the analytics strategy, including scope, cost and schedule.

How do I Create an Analytics Strategy and Roadmap?

There are many steps to do this well, and here are the key ones to get you started:

  1. Review your association’s strategic plan and identify measurable objectives and outcomes which can be achieved with the optimal use of data
  2. Create an Analytics Scorecard by honestly evaluating your association’s Data, Technology, Reporting and Organizational Culture
  3. Evaluate your current Data Governance Process, including quality, accountability, semantics, integration, etc.
  4. Assess each data source using an impact/complexity matrix
  5. Identify metrics, Key Performance Indicators (KPIs), and business questions that can be answered with data
  6. Create a risk matrix that maps the probabilities of each risk – including the cost of not taking action with analytics
  7. Review internal staff resources and identify staff augmentation and training needs
  8. Understand the options available for data warehousing and analytics tools such as Power BI or Tableau

The Ultimate Outcomes

Our experience has shown that the key outcomes an association will discover as part of an Analytics Strategy and Roadmap initiative are the clear need to establish and maintain a data governance program, implement a “single version of the truth”, and embrace a culture of analytics.  These things only happen when leadership recognizes data as an asset, key data sources are combined and consistently updated in a data warehouse, and staff are rewarded for adopting an analytical mindset.  The best way to determine if this is the case for your organization is to begin with the end in mind and undertake an Analytics Strategy and Roadmap project.

Are you asking the right questions about your email marketing?

[Figure: chart of email metrics]
Attend a webinar or perform a web search on the topic of email marketing effectiveness and you’re essentially guaranteed to receive standard information about KPIs such as click rate and open rate. You might even discover that, with proper email hyperlink formatting combined with Google Analytics web tracking, you can follow a user’s clicks through your website for a more complete picture of their interests. However, can you answer other basic questions about the emails you are sending besides their open and click rates, such as:

  • How many messages have been sent in the past X period?
  • Which campaigns (segmented groups) get the most messages sent, clicked, and opened?
  • Which specific links were clicked the most?
  • For the highest rated email links, did those click through users visit any other pages on the site?
  • How long is it from when a message is sent to when it is opened? What about by a specific member type or job title?
  • Which people and/or organizations had bounced messages, and why?
  • Who are the top and bottom individual email engagers based on opens or clicks?
  • What is the most popular day of the week for messages to be opened (sample shown below)?
  • What is the most popular time of day to open a message (sample shown below)?

[Figure: sample charts of email opens by day of week and time of day]
By including your email marketing statistics and your web traffic from Google Analytics in your data mart, you can answer these questions and more. Consider experimenting with A/B testing in your email marketing campaigns to see if the same message gets more opens when sent at a different time on the same day. Turn “spam” into perfectly timed and important content for your constituents.
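
As a taste of how approachable these answers become once the data is in one place, here is a toy sketch in Python (pandas), with made-up column names and data, that answers two of the questions above:

    import pandas as pd

    # Toy data: one row per message, with send and open timestamps
    events = pd.DataFrame({
        "sent_at": pd.to_datetime(["2017-05-01 08:00", "2017-05-02 09:30", "2017-05-02 10:00"]),
        "opened_at": pd.to_datetime(["2017-05-01 08:45", None, "2017-05-03 07:10"]),
    })

    opened = events.dropna(subset=["opened_at"])  # keep only opened messages

    # Most popular day of the week for opens
    opens_by_day = opened["opened_at"].dt.day_name().value_counts()

    # Average time from send to open
    avg_lag = (opened["opened_at"] - opened["sent_at"]).mean()

    print(opens_by_day)
    print("Average send-to-open lag:", avg_lag)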

Introduction to Data Augmentation

What is Data Augmentation?

Data analytics provides a complete understanding of your customers so you can provide better, more relevant products and services. However, the value of analytics is limited without rich data on customer characteristics, interests, purchases, and behaviors.
Associations typically have lots of data in their business systems. From membership data in your Association Management System to registrant data in your Event Management System, your critical business systems are flush with data.
Yet, there are instances when data is not complete or would be more meaningful if combined with an external data source. This is where data augmentation can help.
Data Augmentation is the process of improving base data with information from internal and external sources.

How to Augment Data

So what’s the process for data augmentation?

  1. Identify gaps in your data. Look for missing demographic and psychographic information. For example, do you know the job titles of your members? Do you know your members’ educational history? What are your members interested in?
  2. Prioritize the gaps. All activities should support the mission of your association, and data augmentation is no exception. Once you have identified missing data, prioritize the gaps based on how much that information would help advance the mission of your organization.
  3. Identify where you can get the missing data. You may need to conduct a survey of members. You may need to purchase a third-party data set. See our list of free data sources.
  4. Evaluate data sources. Assess potential external data sources for cost, completeness, and the level of complexity and effort required for integration.
  5. Plan for time and resources to obtain data. Third-party data sources, like Hoovers, may require an investment. Be sure to budget appropriately.
  6. Plan how you will collect the data. Evaluate the ROI of the data to ensure that you have a strong business case for collecting it, considering both the direct and indirect costs of surveying members and purchasing lists.
  7. Determine where data will be stored or appended. In some cases, it may make sense to add data to a field in your AMS or another system. There will also be instances where it doesn’t make sense to add data to the AMS. For example, U.S. Census data or prospect lists may not be appropriate to include in your system of record. (A small sketch of appending external data follows this list.)
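
To make that final step concrete, here is a minimal sketch in Python (pandas), with made-up data and column names, of appending fields from an external data set to member records:

    import pandas as pd

    # Member records with gaps we want to fill
    members = pd.DataFrame({
        "member_id": [101, 102, 103],
        "company": ["Acme Corp", "Globex", "Initech"],
        "job_title": [None, "Director", None],
    })

    # External list keyed by company name (e.g., a purchased data set)
    external = pd.DataFrame({
        "company": ["Acme Corp", "Initech"],
        "industry": ["Manufacturing", "Software"],
        "employee_count": [5200, 300],
    })

    # A left join keeps every member and appends external fields where matched
    augmented = members.merge(external, on="company", how="left")
    print(augmented)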

Before collecting data, it’s important to define how the data will be used and how (and if) it will be included in the data warehouse. This will help ensure the data is not only collected, but usable. Happy hunting!

Are You Ready for Cloud Analytics?

Cloud analytics is likely to be one of the biggest analytics trends in 2017. According to a recent survey published by the consulting firm Enterprise Management Associates, 70% of respondents said cloud analytics was of key importance and an essential part of their analytics strategy.

What is Cloud Analytics?

Techopedia offers a good definition of cloud analytics:
“Cloud analytics is a type of cloud service model where data analysis and related services are performed on a public or private cloud. These solutions and services are delivered through cloud models, such as hosted data warehouses.”

Benefits of Cloud Analytics

There are a number of benefits to hosted, cloud-based solutions, including:

  • Low Startup Costs – You don’t have to make large capital investments by purchasing on-premise servers.
  • Rapid Provisioning and Implementation – Implementation can take a matter of days, and new users or environments can be set up quickly.
  • Accessibility – Cloud-based solutions are accessible anywhere and do not require a VPN connection.
  • Reliability – Cloud infrastructure is more reliable than premise-based deployments.
  • Scalability and Flexibility – It is easier and more cost effective to add space in a hosted environment.
  • Risk Management – Most hosted solutions include risk management, redundancies, and disaster recovery, minimizing risk for the association.

Association Analytics® builds and maintains data warehouses in a virtual, private cloud on the industry-leading Amazon Web Services platform. Gartner recognized Amazon Web Services as a leader in the Magic Quadrant for Data Management Solutions for Analytics.
We use Amazon DynamoDB and Amazon Redshift. Amazon DynamoDB is a fast and flexible NoSQL database service and is a fully managed cloud database that supports both document and key-value store models. Amazon Redshift is optimized for data warehousing and more efficient when handling large quantities of data.

What Happens When You Have No Data Governance?

Association Analytics® CEO Debbie King recently co-presented a webinar on predictive analytics with Abila’s Carlos Restrepo. During the webinar, Debbie and Carlos asked the audience of association professionals if they had an effective data governance program. Of the 67 replies, only 12% said their association has a data governance program.


According to this informal poll, only 12% of associations have a data governance program.


 

Consequences of No Data Governance Program

Data governance is a holistic approach to the way you manage, collect, use, and store data. A good data governance program includes a cross-functional team, a defined set of procedures, and a plan to execute and monitor those procedures.
There are consequences to not having data governance.
In the absence of a data governance program, decisions about key data systems are made by the association staff who “own” each system. When different systems have different owners (which is very common), the result is inconsistencies in data availability, collection, usability, integrity, and security.

Example of the Consequences

Imagine an association with an Association Management System (AMS) and an event registration system. Each system is managed by a different department, and there is no overarching governance. Since there is no shared understanding of what data should be collected, the events team might collect mailing address information but permit addresses in a different format than what is in the AMS. This instantly causes data quality problems. It can also negatively impact the user experience if there is no integration or procedure to update the AMS: members may believe that by updating their address in one system, it will update in all others.
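
As a toy illustration in Python (the records and the normalization rule here are made up), a shared rule agreed on through governance is what lets the two systems’ data line up:

    # The same member's state stored differently in two systems
    ams_record = {"member_id": 42, "city": "Arlington", "state": "VA"}
    events_record = {"email": "jane@example.org", "city": "Arlington", "state": "Virginia"}

    # A naive comparison treats these as different addresses
    print(ams_record["state"] == events_record["state"])  # False: "VA" != "Virginia"

    # A shared normalization rule fixes the mismatch before comparing
    STATE_ABBREVIATIONS = {"virginia": "VA"}

    def normalize_state(value):
        value = value.strip()
        return STATE_ABBREVIATIONS.get(value.lower(), value.upper())

    print(normalize_state("Virginia") == normalize_state("VA"))  # True
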
If your association doesn’t have a data governance program, this scenario may be all too familiar.

Short-Term and Long-Term Impact

Without data governance, validation and business rules are applied inconsistently, which leads to inefficiencies and data quality issues. There are long-term impacts as well. Ultimately, poor-quality data erodes trust in the system, which in turn can lead to business users abandoning it.
Association Analytics® works with associations to develop sensible data governance policies. For assistance with your data governance, contact us at info@associationanalytics.com.

Output vs. Outcome: Metrics for Email Marketers

Email marketing is a primary way that associations engage with members. With the cost of email marketing systems and staff time, the investment in email marketing can be significant. So how do you measure the success of emails to ensure a return on investment?
There are two types of email metrics or key performance indicators (KPIs) – those focused on Output and those focused on Outcomes. What’s the difference? Output is the “how” — how you do something or, in this case, deliver an email. Harvard Business Review defines outcomes as “the difference made by the outputs” or “the benefit your customers receive.”

Output Email Metrics

Output Email Metrics focus on operational performance, which is generally measured by bounce rate and type of bounce. This is more about the quality of the list and a good barometer for data hygiene.

  • Bounce Rate = (Number of Bounces/Total Recipients of Email Campaign) * 100
  • Type of Bounce (e.g., Hard Bounce, Soft Bounce, etc.)
  • Spam Complaint Rate = (Number of Spam Complaints/Total Recipients of Email Campaign) * 100
  • Unsubscribe Rate = (Number of Unsubscribes/Total Recipients) * 100
  • Churn Rate = Percent change in list size after the unsubscribes, complaints, and hard bounces are taken into account

These are important metrics to track as they can help identify data quality and deliverability issues.

Outcome Email Metrics

Outcome Email Metrics focus more on engagement and the effectiveness of your email campaign at convincing readers to take action. A short sketch that computes several of these metrics follows the list below.

  • Conversion Rate = (Number of people who completed the desired action/Total Recipients) * 100. Conversion rate is the best metric for measuring outcomes of email campaigns. Here’s a tutorial on setting up Google Analytics to measure conversion rates from Smart Insights.
  • Open Rate = Emails Opened/(Emails Sent-Number of Bounces) * 100
  • Clickthrough Rate = (Total Clicks or Unique Clicks/Total Recipients) * 100
  • Email Sharing/Forwarding Rate
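
Here is a minimal sketch in Python of the formulas above, with made-up counts; in practice these numbers come from your email platform’s reports:

    # Illustrative raw counts from a single campaign
    sent = 10_000
    bounces = 220
    opens = 2_450
    unique_clicks = 610
    conversions = 85

    bounce_rate = bounces / sent * 100
    open_rate = opens / (sent - bounces) * 100  # opens over delivered
    clickthrough_rate = unique_clicks / sent * 100
    conversion_rate = conversions / sent * 100

    print(f"Bounce rate: {bounce_rate:.1f}%")
    print(f"Open rate: {open_rate:.1f}%")
    print(f"Clickthrough rate: {clickthrough_rate:.1f}%")
    print(f"Conversion rate: {conversion_rate:.1f}%")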

There are many email metrics that you can use, but, ultimately, metrics should be based on your strategy and unique goals.

Tame Your Data Analytics with Governance

As more and more associations invest in data analytics, it’s important to develop some policies that govern the use of data analytics. It might seem inconsequential, but an established governance policy for data analytics can bring order to your analytics efforts and help maximize your investment.

Signs that You Need Data Analytics Governance

It’s easier than ever to use data analytics. Many programs now come with some sort of analytics capability, and business tools like Tableau and Power BI make it simple to visualize data. Data analytics democratizes data, but that democratization can cause problems when there is no agreement on what data to use and how to use it.
Here are some signs that you may need data analytics governance:

  1. Staff is using different, conflicting data to create visualizations
  2. Staff is unsure which report, dashboard, or visualization to use to make decisions
  3. Old dashboards that use outdated data or key performance indicators are still in use

If any of those symptoms sound familiar, don’t be alarmed. This sort of confusion is common when you are dealing with a new technology and data analytics is still relatively new. The key is to start thinking about how you can improve communication and management of data analytics through better governance.

Developing a Data Analytics Governance Policy

Here are some steps to help you develop a data analytics governance policy.

  • Form a Cross-Departmental Team – People are more likely to support that which they helped create. Form a cross-departmental team to help develop the policy and monitor governance going forward.
  • Define Responsibilities – Consider developing a RACI matrix to help identify responsibility for creating, editing, and deleting visualizations and dashboards.
  • Identify Data Sources – Your data governance policy should have a complete data inventory, but it’s also important to document which data sources feed each dashboard or visualization. This provides context for the visualization, but also helps staff find data sources to correct data quality issues when they arise.
  • Determine Visualization Tools – There are a number of visualization tools available, but too many tools can cause confusion and slow adoption. Minimize the number of tools you use and carefully assess the value of visualization tools before introducing them.
  • Document and Track Analytics Projects – For new analytics projects, document the intended purpose, timeline, and measures for success. Make sure staff knows about past, current, and pending projects to avoid duplicate work and to increase usage.
  • Validate – Ask staff members to validate and test new dashboards and visualizations for accuracy and clarity.
  • Review – Regularly assess dashboards and visualizations for accuracy and relevancy. It’s important to check to make sure data visualizations are still accurate. Cross-check data against the data source to check for quality. When considering relevancy, think about the impact of the visualization. Has it added value to decision-making processes? Is it being used? Is it answering current business questions? If not, then you may need to retire the visualization.

Finally, make sure your data analytics governance policy is part of your larger data governance policy. And keep it simple! Most people have trouble remembering a lengthy set of rules, so consider developing a short, memorable, and meaningful list of policies.
Governance can be a little dry, but it’s key to maintaining a shared understanding of data and ensuring the ROI of your dashboards and visualizations.