
Archive for Business Intelligence – Page 2

A Detailed Explanation of Level of Detail Calculations

Level of detail expressions (sometimes also referred to as “LOD Expressions” or “LOD Calculations”) are an advanced analytics feature in Tableau. LOD expressions are useful for cohort analysis or for looking at averages or totals across segments.
When Tableau first released Level of Detail calculations, we provided an introduction to Fixed Level of Detail calculations. Now that Tableau has released version 10, I wanted to spend time going into the other types of LOD calculations and some updates that came with the upgrade.

Structure of a LOD Calculation

There are three types of LOD calculations:

  1. Fixed;
  2. Include;
  3. Exclude.

Regardless of which type you use, the syntax stays the same. Also, new in Tableau 10, you can use expressions on dimensions within the calculation. For example, previously, if you wanted to use the year from a date field, you had to create a separate calculated field to determine the year and then reference that calculated field within your LOD expression.
{(Fixed/Include/Exclude) <Dimension 1>, <Dimension 2>: <aggregation of measure>}
{FIXED [State], YEAR([Event Date]): COUNTD([Registration Key])}


With a Fixed LOD, you specify the dimensions, and the aggregation computes for those specific dimensions regardless of whether they are on your visualization. Using the formula above as an example, the distinct count of registrations will always be calculated per state, per year, even if you do not have state or event date on the visualization.
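To make the mechanics concrete, here is a minimal Python sketch (hypothetical rows, not Tableau itself) of what the FIXED expression above computes: a distinct count of registration keys per state and year, no matter which other fields are present in the view.

```python
# Sketch of {FIXED [State], YEAR([Event Date]): COUNTD([Registration Key])}:
# the distinct count is always taken per (state, year), regardless of any
# other fields (like topic) that happen to be in the view.
from collections import defaultdict

rows = [
    {"state": "VA", "year": 2016, "registration_key": "R1", "topic": "Finance"},
    {"state": "VA", "year": 2016, "registration_key": "R2", "topic": "Events"},
    {"state": "VA", "year": 2016, "registration_key": "R2", "topic": "Finance"},  # same key twice
    {"state": "MD", "year": 2016, "registration_key": "R3", "topic": "Events"},
]

keys_per_group = defaultdict(set)
for r in rows:
    keys_per_group[(r["state"], r["year"])].add(r["registration_key"])

fixed_countd = {group: len(keys) for group, keys in keys_per_group.items()}
print(fixed_countd)  # {('VA', 2016): 2, ('MD', 2016): 1}
```

Note that the duplicate registration key is counted once per group, which is what COUNTD does.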


With an Include LOD, you specify dimensions, in addition to the dimensions in the view, to aggregate your measures against.
{INCLUDE year([Event Date]),[State Province Code]: AVG([Amount])}
If Topic Name is on the view and you take an average of Amount, Tableau will calculate the average amount per topic name. The LOD calculation instead does an average per year, state, and topic name.


With an Exclude LOD, you specify which of the dimensions in the view to exclude when you aggregate your measures.
{EXCLUDE [Topic Name]: AVG([Amount])}
In this example, the LOD calculation will average per state and year and ignore the topic name.
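As a rough Python sketch (hypothetical rows and field names, not Tableau itself), excluding a dimension simply means leaving it out of the grouping key before aggregating:

```python
# Sketch of {EXCLUDE [Topic Name]: AVG([Amount])} with a view at the
# (state, year, topic) level: the average is taken per (state, year)
# only, because topic is dropped from the grouping key.
from collections import defaultdict

rows = [
    {"state": "VA", "year": 2016, "topic": "Finance", "amount": 100.0},
    {"state": "VA", "year": 2016, "topic": "Events",  "amount": 300.0},
    {"state": "MD", "year": 2016, "topic": "Finance", "amount": 50.0},
]

totals = defaultdict(lambda: [0.0, 0])   # group -> [running sum, row count]
for r in rows:
    key = (r["state"], r["year"])        # topic deliberately excluded
    totals[key][0] += r["amount"]
    totals[key][1] += 1

exclude_avg = {k: s / n for k, (s, n) in totals.items()}
print(exclude_avg)  # {('VA', 2016): 200.0, ('MD', 2016): 50.0}
```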
Level of Detail calculations are very powerful tools to use within Tableau. It is easy to get stuck in one way of doing your analysis and visualizations. I hope this helps you think of new ways to attack your analysis.

What Level of Social Media Analytics Are You At?

As associations strive to better understand their customers to provide excellent service and relevant products and services, social media strategy and data are growing sources of competitive advantage. Social media is a rich source of customer interests and preferences that complement other data sources, like your membership database, finance, and email marketing systems, to provide a 360° view of your customer.
Social Media Analytics is the process of collecting data from social sources, like Facebook, Twitter, or blogs, and analyzing that data to make decisions. There are three levels of social media analytics. Which level is your organization at?

Level 1. Social Media Management

Your organization likely has a basic strategy to engage on social media. You’re posting to popular platforms to build online presence and share content. You’re using a management tool such as Hootsuite or Sprout Social to connect multiple social media accounts and manage traffic in a single place. But is your association able to measure the effect of these interactions with customers and take action on this information?

Level 2. Social Media Monitoring

Also known as Social Listening, Social Media Monitoring (SMM) is the process of identifying and assessing what is being said about a brand or product on social media. Monitoring tools listen to online conversations for specific keywords.  Common keywords include your Twitter handle, hashtag, brand mentions, competitors, and links. Monitoring gives a high-level view of overall brand sentiment, as well as real-time keyword alerts.
SMM can be used for brand awareness campaigns, improving customer service, finding new customers, or joining online conversations with customers or influencers. Popular SMM tools include Brandwatch, Brand24, and Mention.

Level 3. Social Media Intelligence

Social Media Intelligence (SMI) is the process of analyzing social media data to inform business strategy and guide decision making. It takes monitoring to the next level with a broader reach and deeper analysis of markets, customers, and even competitors. This requires the use of more sophisticated tools that automatically mine millions of online conversations, using natural language processing and advanced algorithms to determine context well beyond a simple positive or negative.
SMI is being used to measure sentiment about individual products or services, map customer journeys, and detect risks and threats. Popular SMI tools include Crimson Hexagon, Sysomos, and Synthesio.
It’s important to note that social media data is one of many sources that are part of a comprehensive data strategy. Learn more about the power of combining data.

How to Succeed with Data Analytics

If you’ve ever participated in a data analytics implementation, you may be familiar with the indescribable excitement around the project. Who wouldn’t be eager for a solution that makes it easier and more efficient to understand and serve your customers?
But what happens after excitement of your initial implementation fades, the consultants have gone home, and your dashboards have lost their shiny, new appeal? How can you ensure a return on your investment?
Recently, Galina Kozachenko (Association for Financial Professionals) and Debbie King (Association Analytics) discussed the afterglow of data analytics as part of the weekly Association Chat series hosted by Kiki L’Italien. You can replay the recording here. Here are my top 5 takeaways for how to succeed with data analytics:

1. Align Analytics with Association Strategy

What gets measured, gets done. “Analytics and strategy need to live side by side,” said Galina. It’s important that for every strategy, you have a hypothesis that’s tested by measuring and tracking defined metrics.

2. Manage Your Scope

Don’t start too big. “We have seen the greatest success when an association starts out by analyzing one area at a time,” said Debbie King. Prioritize your business areas and ensure successful implementation of one area before moving on to others.

3. Establish and Enforce Data Governance

Data governance is elusive, but attainable if you treat data as an enterprise asset that is the responsibility of everyone. Galina recommended evaluating your data early in any analytics engagement to better understand what elements will need to be kept clean in the future. Read more about data governance.

4. Identify a Data Champion

One of the most important factors in successful adoption of a data strategy is having one (or more) data champions. These internal staff members are able — through influence, education, or example — to advance the cause of data throughout the organization.

5. Be Prepared to Manage Change

Data analytics is an exercise in change management and that change won’t happen overnight. “It’s not a one week journey,” said Galina, “but once the traction picks up, it will be like a self-propelling train.”
To help ensure adoption, you need the support and buy-in of leadership and staff. Communication throughout the project is key. Be prepared to continuously demonstrate value through “quick wins” and sharing success stories. Your data champion or analysts also will need to commit to spending time training, providing analysis, and working with both the early adopters and the risk-averse.
Debbie recommends publicizing and promoting internal data analytics work in much the same way you would promote an external benefit to members. Pick an area and provide a weekly summary to leadership about the “story in the data”. Encourage the analytical mindset by having a “Question of the week”. Look for and show examples of surprises in the data that defy intuition. Increase visibility and stimulate interest by placing a monitor in the kitchen or lobby that shows high-level visualizations that rotate each day.

Giving Up is the Only Sure Way to Fail

Ultimately, though, success isn’t even possible if you don’t try. “The only time you can really call an analytics initiative a failure is if you give up,” said Debbie. “It’s an iterative process and the most important thing is to get started where you are.”

Take Action: Using Web Analytics for Marketing Automation

This week we’re wrapping up work with a client that focused on using data analytics to facilitate marketing automation based on web traffic. For example, the client wanted to message only the group of members who visited a particular page or series of pages on their website. By visiting the page, they showed implicit interest in the topic, which means they might be interested in a related publication or event.
To do this, we took web analytics data from the data mart and put it back into the Association Management System (AMS).  The goal was to enable a particular set of data to be used in a more automated fashion for marketing.
This is a slightly unconventional approach, but here’s why we did it.
In order for the data set to be used in automated marketing it needed to be accessible to the marketing engine, in this case, RealMagnet. If you’re familiar with RealMagnet, you likely know that data can be brought into it manually or via an integration with your AMS.  The client wanted to automate the process so we had to duplicate the Google Analytics data into the AMS so it was accessible to RealMagnet.
The flow of the solution ended up being: Google Analytics → data mart → AMS → RealMagnet.
As you can see, the data we put into the AMS originated from Google Analytics.
It’s important to note that the Google Analytics API has daily limits. The data set we brought in isn’t currently close to the limit, but we did experience large spikes. Plus, other sets of Google Analytics data are extracted nightly, and these count against the daily limit. If we were to duplicate the extract directly to the AMS, we’d be closer to the imposed limit, which could constrain future growth.  Another downside to that approach would have been implementing the Google Analytics extract on the AMS server, which means maintaining it in two places on two separate VPNs.
Instead of going directly from Google Analytics to the AMS, we opted to use the API of the AMS to insert the data into a custom table once it was extracted nightly from the Google Analytics API. Aside from being a more straightforward approach, this enabled us to clean the data prior to insert. We removed any entries that weren’t associated with a logged-in user, whose customer key was tracked in a custom dimension in Google Analytics.  We also aggregated the data to sum total page views per user, per day before loading it back to the AMS.
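The cleanup step can be sketched in Python roughly like this (the field names are hypothetical; the real extract used the Google Analytics API and the AMS API):

```python
# Sketch of the cleanup: drop hits without an authenticated customer key,
# then sum page views per user per day before loading into the AMS.
from collections import defaultdict
from datetime import date

hits = [
    {"customer_key": "C1", "date": date(2016, 9, 1), "pageviews": 3},
    {"customer_key": None, "date": date(2016, 9, 1), "pageviews": 5},  # anonymous: dropped
    {"customer_key": "C1", "date": date(2016, 9, 1), "pageviews": 2},
    {"customer_key": "C2", "date": date(2016, 9, 2), "pageviews": 1},
]

daily_views = defaultdict(int)
for h in hits:
    if h["customer_key"]:                       # keep authenticated traffic only
        daily_views[(h["customer_key"], h["date"])] += h["pageviews"]

print(dict(daily_views))
# {('C1', datetime.date(2016, 9, 1)): 5, ('C2', datetime.date(2016, 9, 2)): 1}
```

Aggregating before the insert also keeps the custom AMS table small, which matters when the marketing engine queries it repeatedly.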
Once the data is in the AMS, it is a simple join to the individual record to get the email address and perform marketing automation by writing queries and flowing data into the RealMagnet engine.
At the beginning of this post I used the word “unconventional.” On the surface, moving data in an automated fashion from the data mart back into the AMS seems so, but once you get into the details it becomes clear it really is the optimal solution for associations that want to take action on their data.

Our Favorite Public Data Sources

We’ve demonstrated the importance of leveraging the data your association already has, along with extending beyond the walls of your organization to understand customer journeys.  Incorporating publicly available data provides many creative opportunities to extend your association analytics and drive data-guided decisions.
1. U.S. Census Bureau
The most commonly requested data is probably that provided by the U.S. Census Bureau.  Census data is often associated with basic population counts required for Congressional Apportionment, but it is much more than just counting people.  The American Community Survey is updated annually and details changes in local communities.  Along with a trove of economic data, other valuable Census data includes American FactFinder (AFF) with diverse areas such as E-Commerce sales, home-based business, and purchased services.
Here is a quick tip concerning Census data.  The common geographic information in your data is probably zip code, which is really intended for United States Postal Service logistics. Fortunately, the Census Bureau provides ZIP Code Tabulation Areas (ZCTAs), which are generalized representations of zip code areas, along with data describing geographic relationships.
2. Data.gov
Another great source is data.gov, which aggregates information from nearly 500 publishers, driven by the 2013 Federal Open Data Policy requiring that “newly-generated government data is required to be made available in open, machine-readable formats, while continuing to ensure privacy and security.”  The data.gov website includes a range of data as broad as your imagination.

3. National Oceanic and Atmospheric Administration
Eager to learn how weather impacts event registration?  The National Oceanic and Atmospheric Administration (NOAA) provides weather data including temperature and precipitation along with normal levels by the hour.
4. Centers for Medicare & Medicaid Services
How about the prevalence of health care topics and related data such as settings-of-care and insurance payments?  The Centers for Medicare & Medicaid Services (CMS) provides a range of health-related data.
5. Wikipedia
The most collaborative data source of all is crowd-sourced Wikipedia, which provides project and page view trends.
6. 990 Data
No discussion of association data sources can be complete without mentioning available data about associations themselves.  Annual IRS “990 data” provides details of organizations exempt under Sections 501(c)(3) through 501(c)(9) of the Internal Revenue Code.
Honorable Mentions
Various non-profit organizations and other NGOs contribute their valuable data to the public, including the National Opinion Research Center (NORC), the Pew Research Center, and the Sunlight Foundation, which promotes making government and politics more accountable and transparent.
A growing source of public data compiled by data scientists is provided by Kaggle, a young company mainly known for holding data science competitions.  Fascinating data can come from unexpected sources, such as private organizations that generate unique data as a result of their core business.  For example, the ADP National Employment Report has evolved into an eagerly awaited economic indicator.  Another example is the widely cited U-Haul National Migration Trend Report detailing population changes that occasionally surprise people.
Now What?
If you’re anything like me, you might not have managed to read this far, as exploring these data sources can rival the most addictive websites and video games.  A great feature of Tableau Desktop is the ability to quickly visualize diverse data.  Once you decide data should be consistently available throughout the organization, creating a sustainable data architecture ensures your association can flexibly explore all available data together while providing the foundation for even more opportunities using advanced analytics.

What will data do for my Association? Understanding Value in 3 Stages

Stages of Value
Previously, we’ve discussed how to kick start an analytics initiative with Strategy and Discovery, a process where you compare the current state to a desired future state for the purpose of creating an Analytics Roadmap.   This exercise can illustrate the value in a completed engagement.  But since analytics engagements are long processes of continuous improvement, it’s often important to understand what (and to some degree, when) results and impact to expect along the way.
For this, we can consider three stages of value that occur during a data analytics initiative.
Optimization – This means you’re doing what you already do, only better.  Often this value can be obtained just by using the base features of a product or an early iteration of a custom solution.  In the case of data analytics, solutions offer easier access to current data and rich visualizations.  Value from optimization is primarily at the individual level or within a single department.  For example, associations typically notice that, when compared to their previous systems, a new analytics solution is:

  • Faster – saves time compared to manual processes to collect data and create reports.
  • More Accurate – single version of the truth means consistent numbers and metrics. Less handling and summarizing increases accuracy.
  • Visually appealing – new visualizations are more attractive and accessible than tables and spreadsheets.

Process Improvement – When individuals are able to perform tasks faster and more accurately, this optimization can be leveraged to change how you do business.  This means improving processes that may span multiple departments, implementing new processes that were not possible before, or retiring processes that are redundant or just don’t make sense anymore.  Process Improvement value spans functions and business areas in order to increase productivity.  For example:

  • One source of data for analysis – no more pulling data from multiple systems/departments
  • Targeted communications – Membership and marketing messaging and content can be tailored to particular audiences, reducing waste and increasing effectiveness.
  • More personal service to customers and prospects increases engagement
  • Rich customer insight drives decisions on products and services
  • Implicit and explicit behaviors are better understood
  • Financial forecasts and comparisons use current data.

Strategy – this highest level of value occurs when the strategic objectives and the mission of the organization are advanced.  Examples include:

  • Increased dues and non-dues revenue
  • Improved Member satisfaction
  • Increased Membership
  • Finding New Audiences
  • Increased Relevance
  • Higher lifetime value of members

All this from analytics?
Strategic value is the cumulative effect of optimizations, process improvements, and various other factors.  Because of this, it’s naturally the most difficult to achieve or to attribute to a single cause.  Analytics is one important element to drive and measure strategic value.

A Beginner’s Guide to Analysis with R Part II

This is a continuation of our previous blog on a Guide to Analysis.  The previous blog covered defining your S.M.A.R.T. goal.  In this section we will discuss preparing and checking your data for analysis.
Our S.M.A.R.T. Goal: Determine what program changes will increase next year’s membership retention for first year members by 10% compared to the previous two years

  • Dependent Variable
    • Renewal (This would be a yes/no answer; did they renew or did they not?)
  • Independent Variables
    • Chapter Participation
    • University
    • Committee Participation
    • Gender
    • Age
    • Organization Type
    • Organization Size
    • Location
    • Number of Events Attended
    • What type of events did they attend?

Our dependent variable “Renewal” will need to be transformed so that it is a binary variable (0 or 1).  Our independent variables are a mix of qualitative and quantitative variables.  For the purposes of our analysis, we are going to transform our qualitative variables (Organization Type, Gender, etc.) into quantitative variables.  For example, for gender you would normally have the options Female, Male, and Other. For your data model, you will instead need a field for each option, with a 1 or 0. The new variables will be Gender – Male, Gender – Female, and Gender – Other.
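The transformation described above can be sketched as follows; the post uses R, but this Python snippet (with hypothetical member records) shows the same dummy-variable encoding of Gender and the binary Renewal flag:

```python
# Turn a qualitative variable (gender) into binary indicator columns,
# and the yes/no dependent variable (renewal) into a 0/1 flag.
members = [
    {"member_id": 1, "gender": "Female", "renewal": "Yes"},
    {"member_id": 2, "gender": "Male",   "renewal": "No"},
    {"member_id": 3, "gender": "Other",  "renewal": "Yes"},
]

levels = ["Male", "Female", "Other"]
for m in members:
    for level in levels:
        m[f"gender_{level.lower()}"] = 1 if m["gender"] == level else 0
    m["renewal_flag"] = 1 if m["renewal"] == "Yes" else 0  # binary dependent variable

print(members[0])
# {'member_id': 1, 'gender': 'Female', 'renewal': 'Yes',
#  'gender_male': 0, 'gender_female': 1, 'gender_other': 0, 'renewal_flag': 1}
```

Each member ends up with exactly one of the gender indicators set to 1, which is what a regression model expects in place of a text category.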
After initial data preparation, you will need to run descriptive statistics on all variables to better understand your data and identify places where you may have missing or poor data.  You can do this using RStudio to catch data issues before you begin your model.
RStudio

  1. Pull in the data file to analyze.  If you are using Excel, it is easiest to save it as CSV or text.  Please note that RStudio will automatically be able to access some locations on your computer; on mine it is the My Documents folder.  If the file is saved there, I do not need to put in the location.
    • CSV code: my_data <- read.csv("location/filename.csv", header = TRUE)
    • TXT code: my_data <- read.delim("location/filename.txt")
  2. Make sure you have the appropriate packages installed to do descriptive statistics; packages are collections of R functions, data, and compiled code.  For the purposes of our analysis, we will be using the Hmisc and psych packages.  You can install these by clicking Install on the Packages tab.
  3. Now we are going to do descriptive statistics on the file.
    • library(Hmisc)
    • library(psych)
    • describe(my_data)
    • summary(my_data)

Next time we will talk about what to look for in your descriptive statistic results and how to resolve any potential data issues.

Data Visualizations: Super Highway from the Eye to the Brain

Did you know that approximately 70% of the body’s sense receptors reside in the eye? Of all 5 senses, vision stands out dramatically as our primary and most powerful channel of input from the world around us! Not only that, but apparently, “the eye and visual cortex of the brain form a massively parallel processor that provides the highest-bandwidth channel into human cognitive centers.”* I envision a super highway from the eye to the brain. This helps explain why data visualization is so powerful.
In order to make good decisions from data, you’ll need to not only be able to see the data, but also successfully analyze it. There are experts who specialize in this area, but you may have the aptitude to successfully analyze data as well. Just as a sonographer may have years of experience and training in reviewing ultrasounds, with just a little direction, first time parents can see their baby’s image and delight in her features, especially when the sonographer helps out by positioning the wand to get the classic profile picture! What is needed is the skill to see meaningful patterns in data. This can be learned and developed with practice. It is definitely something I enjoy!
Take a look at the examples below, which show the same data in a table format and in a colored bar chart. Looking at the table, does anything stand out to you? I’ll give you a minute … Maybe the largest number of registrants in 2014? Anything else? What additional questions do you have?
Now, let’s take a look at the bar chart. Even without looking at labels or legends, our eye is drawn to the highest bar and the darkest color. What are those about? An analyst will be immediately curious about why the highest number of registrants and the most events are not happening in the same year. (It was as a result of a concerted effort to target market based on analytics data.) It is amazing how quickly we were able to go from looking at the data in a visual way to asking questions on the path to making informed decisions.
*Information Visualization: Perception for Design, Second Edition, Colin Ware, Morgan Kaufmann Publishers, 2004 

Put Tableau Content in its Place

Tableau Organization


I am the type of person who loves to organize just about anything, and I think I’m pretty good at it.  About 15 years ago I did some organizing in my parents’ basement.  Everything ended up in a logical place, off the floor, and with a label when appropriate.  It was magnificent, except for one thing — now, I’m the one who knows where everything is.  My parents weren’t able to figure out my brilliant system.  Here’s the funny part: I still get calls, even today, asking where things are.
The lesson I learned is that an effective organizational system must be intuitive enough that others can find things without assistance.  The same concept holds true for organizing your Tableau Server environment to manage both permissions and content.  Here are a few tips to help make sure everybody can find what they are looking for so time isn’t wasted gaining access to new insights.

  • Establish a common language dictionary for data and business terminology.
  • Organize using Projects
    • Create a project for each business area (Events, Membership, Finance, etc.)
    • Executive Dashboard(s) should have their own project and permissions.
    • Have a project that represents a development area with restricted permissions
    • Publish all data sources to a public project named “Data Sources” or something similar. This reduces duplication and aids permissions, especially when your association is using an open data policy.
  • Publish the fewest possible number of data sources
  • Create and add tags to help people receive accurate search results
  • Incorporate permissions into the content structure

Before you start creating projects and assigning permissions, there are a few helpful questions to ask and answer that will guide a maintainable structure.

  1. Are you going for an open data policy where there are few restrictions on access to cross functional data?
  2. Do you have security concerns across business areas? For example, is Finance data restricted by department or to only a few individuals?
  3. Is there information that must be kept confidential or restricted (e.g., donations) due to privacy or compliance concerns?
  4. Will you enable the web authoring feature?

Following these simple guidelines and taking into account special permissions and security issues will have you well on your way to implementing the perfect organizational system in Tableau Server for your association’s data.

Need a Data Champion? Start Searching Close to Home

One of the most important factors in successfully adopting a data strategy is the data champion. This person is able — through influence, education or example — to advance the cause of data throughout the organization. His or her role will vary between organizations, but should include some of the following:

  • Analytics strategy
  • Change management
  • Technical duties, including responsibility for analytics infrastructure and source systems
  • Advanced data analysis
  • Responsibility for data integrity and data governance
  • Business area domain expertise
  • Staff training and support

Often there are no obvious candidates for such a wide ranging and strategic role, leaving association executives to decide whether to hire an external candidate or develop one from existing staff.
For some associations, hiring from outside is the clear choice. It offers the advantages of bringing new skills and capabilities, a new perspective, and additional bandwidth to the team. The learning curve can be reduced by hiring someone who has specific experience in analytics and the selected tool set.
But hiring can be difficult and expensive. Be prepared to compete for hard-to-find talent in a market that heavily favors candidates. Whether you believe the U.S. job market has truly rebounded or we’re still in the grips of a jobless recovery, there is no disputing that some of the hottest and fastest growing careers are in data and analytics.
Data scientist topped the recent Glassdoor survey of the 25 best jobs in America, with analytics manager coming in at #11. In fact, the ability to understand and analyze data is recognized as a key success factor across occupations. Fast Company names understanding of analytics among the eight career skills you need to be competitive in 2016.
Association leadership may do better if it looks to its current staff for answers. A good choice for data champion may be:

  • A single senior leader with a broad knowledge of the association’s business and data requirements, and the influence to advance the analytical mindset throughout the organization.
  • A group of department-level subject matter experts who are the most familiar with the data and who lead by example, showing others how they can leverage analytics to perform their jobs more efficiently.
  • An experienced analyst, who is an expert in analytics tools and works across all departments providing support to department leads and staff.

Internal candidates know the organizational structure, the culture, and most importantly the data at a level that it will take an external hire months or longer to equal. Yes, there will be a learning curve in the new position, and backfilling a current position may be challenging, but creating an opportunity for an existing staff member through promotion or reorganization is often the safest bet.
