
Make a New Year’s Resolution to Make Data-Guided Decisions

In a world where the pace of change is always increasing and a failure to adapt can result in diminished value, we can’t rely on the old way of doing things.
The old way of doing things meant basing decisions solely on instinct, politics, or tradition. These factors are undoubtedly important to the decision-making process (and that’s not going to change). However, when intuition is supported by data, decisions are more reliable and effective.
There’s evidence to support this. According to MIT Sloan Review, top-performing organizations use analytics five times more than lower performers.
Data empowers leaders to make decisions with confidence. Leaders should be able to ask questions, to find evidence, and to make decisions based on what’s actually going on.
Data analytics – the process of transforming data into actionable information and insights – is the key to doing this.

Why don’t we use data to make decisions?

One reason is the technical complexity of accessing data. Traditionally, to get a question answered, business leaders had to go to IT and wait while the IT staff exported and prepared the data. This delay prevents the business leader from rapidly responding to issues, and it can cause frustration on both sides if there are follow-up questions or changes to the original request.
You can solve this by combining data from many different sources and putting it into a data warehouse. Then, you can layer visualizations on top of the data warehouse so you have a single version of the truth.
Another reason we don’t use data is not as easy to solve. Some people are drawn to data to make decisions and discover new opportunities. Individuals with an analytical mindset are able to analyze information, identify problems and trends, and solve complex problems. They are also curious. They ask “why” and they want to learn how to do things better, which improves the whole organization.
Not everyone has an analytical mindset, but fortunately it can be taught. You can also strategically hire for an analytical mindset and foster it within current staff.

Do you have an analytical mindset?

We use a scorecard to assess the current state of analytics in an organization. This information helps us develop an analytics strategy. Many of the questions we ask in the scorecard help us determine whether an organization has an analytical mindset.
Do you have an analytical mindset? See how you respond to the questions below to find out.

  1. Do you use data to guide decision-making?
  2. Are you able to effectively communicate observations and conclusions about data?
  3. Do you take action using data analytics?

Whether you have an analytical mindset or want to improve, we challenge you to consider how you can use data in 2017 to support decision-making and find new opportunities. Make a New Year’s resolution to:

  • Make evidence-based decisions
  • Leverage data as an asset
  • Consider untapped data sources
  • Look at combining multiple sources to gain new insights

Happy New Year!
Banner Designed by Freepik

Using Learning Analytics to Support Students

For many associations, education is more than just a source of revenue and a key member benefit. It is a cornerstone for the organization’s existence. Despite the importance of education, associations still struggle with how to measure the success of professional development programs. According to Tagoras, 87% of associations offer e-learning, but less than 30% use data to make decisions about their educational programs.
The problem is two-fold for associations.
First, there’s the question of what associations should be measuring to gauge the impact programs have on students.
Second, associations have to figure out how to measure that impact.

What to Measure in Learning Analytics

Recently, Debbie King, Association Analytics® CEO, presented a session about learning analytics at the American Society of Association Executives (ASAE) Spark Conference, an online conference about the art and science of adult learning. What is learning analytics and how can it help your association?
George Siemens, a noted expert on the topic, wrote that “learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.”
This is distinct from business analytics for educational programming, which focuses on operational or financial performance.
Most associations have access to business analytics for their educational programming. They can find revenue, expenses, demographics of registrants, or number of registrants for any given program. This is important information, but doesn’t necessarily provide insight into how to better engage students or how to improve student success.
To better understand the distinction between learning analytics and business analytics, it’s helpful to look at the difference between common key performance indicators in both areas.
[Image: table comparing learning analytics KPIs with business analytics KPIs]
See the difference? Learning analytics is focused on the student and their success. Business analytics for professional development is focused on the program and its success. The two are closely related but separate, and it’s worth considering both to ensure program success.

How to Measure Learning Analytics

One major obstacle to engaging in learning analytics is determining how you will measure professional development programs.
Where is the data needed to understand impact? How do you combine data from multiple, disparate data sources to easily analyze information in a single location?
After you define what you want to measure, you need to figure out where to get the information. Relevant data sources may include your Learning Management System (LMS), Association Management System (AMS), Event Management System (EMS), and even your web analytics program.
The data can then be integrated into a data warehouse, where it is combined and prepared for consumption by business users. You can visualize and interact with the data using a business intelligence tool like Power BI or Tableau. To learn more, see our post on data warehousing.
Banner Designed by Freepik

Columns Aren’t Just for Advice and Holding Up Buildings: They Can Also Help Your Analytics

Traditional databases, like an Association Management System, are designed to handle frequent transactions and store data. This is very different from dimensional data models, which are specifically designed for analysis and aligned with the analytical workflow.

Data, Files, and Blocks

“Cloud computing” is a bit of a misnomer. Data is still stored in files made up of blocks on a computer.
To increase efficiency, databases store entire table rows of data in the same block. For example, all of a customer’s attributes such as name, address, member type, and previous event attendance are stored in a single block for fast retrieval. In this scenario, each “row” represents an individual customer while each “column” represents their different attributes.
If you think about it, most analytics involves aggregating data – sums, counts, and averages that span many rows. Your exploration might eventually lead you to detailed individual records, but it will likely take several steps to identify them. This means that if you are looking at, say, average event revenue, the database needs to retrieve entire records from several blocks just to get the revenue field for the eventual calculation. Imagine having to individually navigate many shelves from left to right when you could just quickly create a stack of what you need!
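To make the contrast concrete, here is a minimal Python sketch (with invented event records, not a real database) comparing a row-oriented layout to a column-oriented one for an aggregate like average event revenue:

    # Row-oriented: each record keeps all attributes together,
    # like a database block that stores entire rows.
    rows = [
        {"name": "Ada", "member_type": "Regular", "event_revenue": 250.0},
        {"name": "Ben", "member_type": "Student", "event_revenue": 95.0},
        {"name": "Cai", "member_type": "Regular", "event_revenue": 310.0},
    ]

    # Averaging revenue means touching every full record
    # just to pull out one field.
    avg_from_rows = sum(r["event_revenue"] for r in rows) / len(rows)

    # Column-oriented: each attribute is stored contiguously,
    # so the aggregate reads a single "block" of values.
    columns = {
        "name": ["Ada", "Ben", "Cai"],
        "member_type": ["Regular", "Student", "Regular"],
        "event_revenue": [250.0, 95.0, 310.0],
    }

    revenue = columns["event_revenue"]
    avg_from_columns = sum(revenue) / len(revenue)  # same answer, one "stack"

Either way the answer is the same; the difference is how much data has to be read to get it.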

Columns and Rows

Similar to the goal of a dimensional data model, database technologies can further optimize analytics by primarily storing data in columns instead of rows. For this scenario involving average event revenue, the database simply accesses a single block with all of the data for the revenue column across all rows.
These columnar databases significantly improve performance and storage while providing several other key benefits.

  • Data compression: Since the values in a column are all of the same data type, compression methods best suited to that type can be applied to reduce needed storage (see the sketch after this list). In addition, aggregations can sometimes be performed directly on compressed data.
  • Advanced analytics: Many of the algorithms underlying advanced analytics leverage vector and matrix structures that are easily populated by single columns of data.
  • Change tracking: Some technologies track changes at the column level, so you can maintain granular history without having to unnecessarily repeat other data.
  • Sparse data storage: For columns that hold valuable but infrequently populated data, such as specific product purchases, traditional database technologies must store “NULL” values, while column-based databases avoid this storage.
  • Efficient distributed processing: Similar to managing file blocks, column-based technologies can distribute data across machines based on column to rapidly process data in parallel.
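
To illustrate the first benefit above, here is a toy run-length encoding in Python. Because a single column holds values of one type, often with long runs of repeats, it compresses far better than interleaved row data would; the member types below are invented for the example.

    from itertools import groupby

    # A member_type column with long runs of repeated values.
    member_type = ["Regular"] * 5 + ["Student"] * 3 + ["Retired"] * 2

    # Run-length encoding stores each value once, with a count.
    encoded = [(value, len(list(run))) for value, run in groupby(member_type)]
    print(encoded)  # [('Regular', 5), ('Student', 3), ('Retired', 2)]

    # Some aggregations can run directly on the compressed form:
    total_rows = sum(count for _, count in encoded)  # 10, without decoding

Real columnar engines use far more sophisticated schemes, but the principle is the same.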

Potential Options

Examples of columnar database technologies include Apache HBase, Google BigQuery, and Amazon Redshift. HBase is part of the open-source Hadoop ecosystem, BigQuery is a cloud-based service based on technology that served as a precursor to Hadoop, and Amazon Redshift is a cloud-based service that is part of the popular Amazon Web Services (AWS) offering.
Speaking of holding up buildings, our friends at the National Council of Architectural Registration Boards created some great visualizations based on Amazon Redshift using Tableau Public. Analytics tools such as Tableau and Microsoft Power BI offer native connectors to Amazon Redshift and other big data technologies. These technologies are another way to enhance your analytics, combining the data and tools you already have with cloud services to rapidly make data-guided decisions for your association.

The Importance of Tableau Server Backup (TSBAK)

What’s TSBAK?

If you are not a Tableau Server administrator, you might be thinking: great, another acronym. But this one is important. TSBAK stands for Tableau Server Backup, and it is the file type of a native Tableau Server backup.
Creating a TSBAK file is an important step that should not be skipped. Setting up a Tableau Server environment is so easy that this step is often overlooked, or administrators simply don’t know about it.
Quite often we see customers looking for assistance with a Tableau environment that already exists. In those cases it is highly likely that the server is backed up as a whole with a virtual image snapshot, but depending on why you need the backup, you may find that to be unhelpful.
If you need to restore, the only method officially supported by Tableau is restoring from a TSBAK file.

How to Create a Backup File

A backup file is easy to create and can be automated in a batch script. The steps are:

  1. Open a command prompt with administrative privilege.
  2. Navigate to the Tableau Server bin directory where the tabadmin file is located.
  3. Type the following: tabadmin backup filename.tsbak
  4. You may also specify the destination of the backup file by typing: tabadmin backup "D:\MyPath\Backups\filename.tsbak"

Tips for Setting up TSBAK

The only other criterion is to make sure the destination drive has plenty of free space. If you are setting up a regular backup schedule (and you should), here are some additional helpful tips.

  • We recommend creating a backup at least weekly.
  • If you are on a recent version of Tableau Server, then you do not need to stop the server to create a backup.
  • When including the backup in an automated script, it is helpful to also back up and trim the log files; this keeps disk space free and Tableau performing at its best. See the tabadmin ziplogs and tabadmin cleanup commands for more information (a sketch of such a script follows this list).
  • Visit the Tableau support site for additional examples and command switches available.
  • Having an automated script also makes upgrades easier, since you can run the script off schedule to get a pre-upgrade backup file.
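
As a sketch of what such an automated script might look like, here is a minimal Python wrapper that runs a timestamped backup and then archives and trims the logs. It assumes tabadmin is on the PATH (or that the script runs from the Tableau Server bin directory) with administrative privileges, and the destination path is only an example:

    import subprocess
    from datetime import datetime

    # Example destination - make sure this drive has plenty of free space.
    backup_file = r"D:\MyPath\Backups\tableau-" + datetime.now().strftime("%Y%m%d") + ".tsbak"

    # Create the backup, then zip and clean up the log files.
    subprocess.run(["tabadmin", "backup", backup_file], check=True)
    subprocess.run(["tabadmin", "ziplogs"], check=True)
    subprocess.run(["tabadmin", "cleanup"], check=True)

Because it is a standalone script, you can schedule it weekly and also run it on demand before an upgrade.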

If your association’s Tableau Server environment is not being backed up using this methodology (the only supported one), we encourage you to implement it immediately.

What Level of Social Media Analytics Are You At?

As associations strive to better understand their customers to provide excellent service and relevant products and services, social media strategy and data are growing sources of competitive advantage. Social media is a rich source of customer interests and preferences that complement other data sources, like your membership database, finance, and email marketing systems, to provide a 360° view of your customer.
Social Media Analytics is the process of collecting data from social sources, like Facebook, Twitter, or blogs, and analyzing that data to make decisions. There are three levels of social media analytics. Which level is your organization at?

Level 1. Social Media Management

Your organization likely has a basic strategy to engage on social media. You’re posting to popular platforms to build online presence and share content. You’re using a management tool such as Hootsuite or Sprout Social to connect multiple social media accounts and manage traffic in a single place. But is your association able to measure the effect of these interactions with customers and take action on this information?

Level 2. Social Media Monitoring

Also known as Social Listening, Social Media Monitoring (SMM) is the process of identifying and assessing what is being said about a brand or product on social media. Monitoring tools listen to online conversations for specific keywords. Common keywords include your Twitter handle, hashtags, brand mentions, competitors, and links. Monitoring gives a high-level view of overall brand sentiment, as well as real-time keyword alerts.
SMM can be used for brand awareness campaigns, improving customer service, finding new customers, or joining online conversations with customers or influencers. Popular SMM tools include Brandwatch, Brand24, and Mention.
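Conceptually, the heart of monitoring is keyword matching performed at enormous scale. Here is a hedged Python sketch with invented posts and keywords, standing in for the streams a real tool pulls from platform APIs:

    # Keywords to listen for: handle, hashtag, brand, competitor.
    keywords = ["@ourassociation", "#annualmeeting", "our association", "competitor x"]

    # In a real tool these posts would stream in from social platform APIs.
    posts = [
        "Loved the keynote at #AnnualMeeting this year!",
        "Has anyone compared Our Association with Competitor X?",
        "Unrelated post about lunch.",
    ]

    # Flag any post that mentions a tracked keyword (case-insensitive).
    for post in posts:
        hits = [k for k in keywords if k.lower() in post.lower()]
        if hits:
            print(hits, "->", post)

A real monitoring product adds sentiment scoring, alerting, and dashboards on top of this basic matching.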

Level 3. Social Media Intelligence

Social Media Intelligence (SMI) is the process of analyzing social media data to inform business strategy and guide decision making. It takes monitoring to the next level, with a broader reach and deeper analysis of markets, customers, and even competitors. This requires more sophisticated tools that automatically mine millions of online conversations, using natural language processing and advanced algorithms to determine context well beyond a simple positive or negative.
SMI is being used to measure sentiment about individual products or services, map customer journeys, and detect risks and threats. Popular SMI tools include Crimson Hexagon, Sysomos, and Synthesio.
It’s important to note that social media data is one of many sources that are part of a comprehensive data strategy. Learn more about the power of combining data.

How Your Association Can Implement Propensity Modeling

Last week, we introduced you to Propensity Modeling and how it can help your association make data-guided decisions while providing great value to your customers. We’ll now dig into some of the technical detail and steps to implement Propensity Modeling.

Step 1. Prepare Your Data

Consistent, complete, and accurate data is the foundation of predictive modeling. Your data should ultimately look like a set of very wide rows, each with a dependent variable of 1 or 0 indicating whether the business action was taken, along with a variety of independent variables holding their values as of the time of the transaction.
Categorical data should be converted to “dummy” variables, where each value becomes its own column, as opposed to the row-based form that is ideal for data exploration. Fortunately, the ability to quickly access high-quality, timely data regardless of source from an environment such as a dimensional data model makes this process much easier.
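As a small illustration, here is a pandas sketch with invented fields that converts a categorical member_type column into dummy variables alongside a 0/1 dependent variable such as whether the person registered for an event:

    import pandas as pd

    # Invented training data: one wide row per individual.
    df = pd.DataFrame({
        "member_type": ["Regular", "Student", "Regular", "Retired"],
        "years_member": [5, 1, 12, 20],
        "registered": [1, 0, 1, 0],  # dependent variable: action taken or not
    })

    # Each categorical value becomes its own 0/1 column.
    X = pd.get_dummies(df[["member_type", "years_member"]], columns=["member_type"])
    y = df["registered"]
    print(list(X.columns))
    # ['years_member', 'member_type_Regular', 'member_type_Retired', 'member_type_Student']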

Step 2. Select Your Variables

Incorporating the right mix of features is vital to the success of any predictive model. While it’s great to have many variables available as candidates, having too many can actually harm model accuracy.
Several automated stepwise techniques can propose variables by iterating through different combinations while considering measures such as significance and model error. Relying solely on automated processes is not recommended, though: statistics should be tempered with business expertise to weed out variables that are not meaningful and to pick between highly correlated variables. Another challenge is the potential for overfitting, meaning the variables selected from the sample data do not generalize to unseen data.
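As one example of an automated approach, scikit-learn’s recursive feature elimination with cross-validation proposes a variable subset while guarding against tuning to a single sample. The data below is randomly generated and purely illustrative; any shortlist should still be reviewed with business expertise:

    import numpy as np
    from sklearn.feature_selection import RFECV
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 12))  # 12 candidate variables
    # Only the first two variables actually drive the outcome.
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=500) > 0).astype(int)

    # Iteratively drop the weakest variables, cross-validating
    # so the selection is not overfit to one split of the data.
    selector = RFECV(LogisticRegression(max_iter=1000), cv=5)
    selector.fit(X, y)
    print("Selected variable indices:", np.flatnonzero(selector.support_))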

Step 3. Select Your Modeling Technique

Next, you will want to select a modeling technique. You will likely be deciding between a linear regression model and a logistic regression model.
Linear Regression models predict continuous outcomes with a nearly infinite range of values, such as time, money, or large counts. Propensity Modeling generally leverages Logistic Regression models instead, deriving probability-based scores in a fixed range of 0 to 1. The underlying algorithms used to fit the two kinds of models are very different as well.
Logistic Regression is often perceived as an approach to estimate binary outcomes by rounding to 0 or 1, but a score of .51 is very different from a score of .99. A common approach is to assign records to categories using deciles: 10 bins, each containing an equal share of the scored records.
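Here is a minimal sketch of that scoring-and-binning step, again on invented data, using scikit-learn to produce the probabilities and pandas to cut them into deciles:

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    X = rng.normal(size=(1000, 3))
    y = (X[:, 0] + rng.normal(size=1000) > 0).astype(int)

    model = LogisticRegression().fit(X, y)

    # Propensity scores between 0 and 1, not just rounded outcomes.
    scores = model.predict_proba(X)[:, 1]

    # Decile 10 holds the highest-propensity tenth of records.
    deciles = pd.qcut(scores, 10, labels=range(1, 11))
    print(pd.Series(deciles).value_counts().sort_index())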

Step 4. Determine If You Need to Use Any Other Analytic Techniques

You can use several other advanced analytic techniques to accomplish goals similar to Propensity Modeling.

  • Clustering is a form of unsupervised learning: the model is not based on a specific outcome or dependent variable, but simply groups records such as individuals (see the sketch after this list). The resulting groups can serve as customer segments that are ideal for certain products or marketing approaches.
  • Collaborative Filtering is based solely on the actions of groups of users, as opposed to individual characteristics. This is a common approach for recommendation systems based on actions such as purchases, product ratings, or web activity.
  • Decision Trees traverse a path of variables with branch “splits” based on the contributions of variables to ultimate outcomes. This technique can be effective when a very small set of variables leads to outcomes influenced by downstream groups of variables.
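
Here is the clustering sketch referenced above: a k-means model that groups individuals into segments with no dependent variable, using invented membership features:

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(2)
    # Invented features: years of membership and events attended.
    features = rng.normal(size=(300, 2)) * [3.0, 5.0] + [10.0, 8.0]

    # Unsupervised: no outcome variable, just natural groupings.
    kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
    segments = kmeans.fit_predict(features)
    print(np.bincount(segments))  # individuals per segment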

You can also combine models, feeding the results of one model into another to create an ensemble model.
The decile scores generally represent a range from “sure thing” to “lost cause”. You can use the different decile groups to guide approaches such as retention effort, pricing strategies, and marketing messages.

Step 5. Determine Measurement Approach

The Lift of a Propensity Model is the ratio of the response rate achieved by applying the model to the response rate among “random” individuals. An ideal way to derive this measure is to maintain a control group for comparison against a similar group targeted using the Propensity Model. It can be a difficult decision to risk potential revenue on a control group, so a common approach is to simply compare before-and-after results.
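The calculation itself is simple; here is a sketch with invented renewal counts for a model-targeted group and a random control group:

    # Invented results: renewals among individuals targeted using the
    # model's top deciles versus a randomly selected control group.
    model_renewals, model_targeted = 180, 1000      # 18% response rate
    control_renewals, control_targeted = 120, 1000  # 12% response rate

    model_rate = model_renewals / model_targeted
    control_rate = control_renewals / control_targeted

    lift = model_rate / control_rate
    print(f"Lift: {lift:.2f}x")  # 1.50x - the model beats random selection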

Step 6. Consider How You Will Take Action

Before using any analytics model, it’s a good idea to consider how you can take action on the information. What decisions will you make as a result of the information? Similarly, how will you measure the results of the action and use it to inform your model?
For example, you can use a propensity model to reduce expenses. Targeting individuals differently based on their propensity to take action can optimize costs in different ways. Costs might be direct costs, such as actual print mailings or list rentals, or costs can be indirect, such as many non-personalized emails that contribute to information overload. You will want to establish a baseline and a goal for cost reduction to measure success of the model.

Step 7. Identify Your Tool

A range of different options are available to implement Propensity Modeling.

  • R Programming: A popular open-source statistical programming language with many mature packages to perform the techniques underlying Propensity Modeling.
  • Alteryx Software: A software platform offering pre-built tools for different modeling techniques and business scenarios.
  • Amazon Machine Learning: A cloud-based service, part of the comprehensive Amazon Web Services environment, that provides visual wizards for performing Propensity Modeling.

This may seem like a lot of steps, but once your comprehensive data is easily accessible and a user-friendly tool is in place, all you need is your imagination to better understand your association’s customer journeys and make valuable data-guided decisions.

The Power of Combining Data from Multiple Sources

Super Charge Your Data

Combining or blending data happens when you connect two or more different data sources. It reminds me of one of my son’s favorite shows, Power Rangers. While all of the Rangers are committed to fighting evil, each one has a unique skill and weapon. When their enemy is too great to handle individually, they combine their unique powers to create a Megazord. A well-designed data mart is the ultimate Megazord, able to battle the evil of fragmented information.

Message Activity Analysis

Let me tell you what I mean. Information from your marketing system can be measured and analyzed. You are probably familiar with common marketing key performance indicators (KPIs) such as number of emails sent, delivery rate, bounce rate, and open rate. This is interesting in itself: you can see which messages get higher opens and clicks and which fall below average.
You might get something that looks like this:
[Image: standard message statistics]

Combine Powers

What makes this information really pop is combining it with your other data sources. Combining the messaging activity data with demographics from your AMS helps you evaluate the influence that things like job level, member type, age and/or gender have on your key messaging metrics.
[Chart: open rate by generation]
In this example, when we look at Open Rates by generation, we can see that members of “The Greatest Generation” have dramatically lower Open Rates than the other generations, while “Baby Boomers” have the highest. If we only looked at the overall average open rate, we would miss this distinction. What actions could you take knowing this information? Perhaps sending an extra mailing to your older members for important communications?
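As a rough sketch of the blend itself, here is how the join and the group-level rollup might look in pandas; the column names and values are invented:

    import pandas as pd

    # Messaging activity from the email marketing system.
    messages = pd.DataFrame({
        "member_id": [1, 2, 3, 4],
        "opened": [1, 0, 1, 1],
    })

    # Demographics from the AMS.
    members = pd.DataFrame({
        "member_id": [1, 2, 3, 4],
        "generation": ["Baby Boomer", "Greatest", "Baby Boomer", "Gen X"],
    })

    # Blend the two sources on the shared member key,
    # then compute open rate by generation.
    blended = messages.merge(members, on="member_id")
    print(blended.groupby("generation")["opened"].mean())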

How to Blend Data

Watch this advanced Tableau tutorial to learn how to blend data. For deep analysis and improved performance, we recommend investing in a data warehouse and data marts built on a dimensional data model. Learn more about our approach to data blending.

How to Succeed with Data Analytics

If you’ve ever participated in a data analytics implementation, you may be familiar with the indescribable excitement around the project. Who wouldn’t be eager for a solution that makes it easier and more efficient to understand and serve your customers?
But what happens after the excitement of your initial implementation fades, the consultants have gone home, and your dashboards have lost their shiny, new appeal? How can you ensure a return on your investment?
Recently, Galina Kozachenko (Association for Financial Professionals) and Debbie King (Association Analytics) discussed the afterglow of data analytics as part of the weekly Association Chat series hosted by Kiki L’Italien. You can replay the recording here. Here are my top 5 takeaways for how to succeed with data analytics:

1. Align Analytics with Association Strategy

What gets measured, gets done. “Analytics and strategy need to live side by side,” said Galina. It’s important that for every strategy, you have a hypothesis that’s tested by measuring and tracking defined metrics.

2. Manage Your Scope

Don’t start too big. “We have seen the greatest success when an association starts out by analyzing one area at a time,” said Debbie King. Prioritize your business areas and ensure successful implementation of one area before moving on to others.

3. Establish and Enforce Data Governance

Data governance is elusive, but attainable if you treat data as an enterprise asset that is the responsibility of everyone. Galina recommended evaluating your data early in any analytics engagement to better understand what elements will need to be kept clean in the future. Read more about data governance.

4. Identify a Data Champion

One of the most important factors in successful adoption of a data strategy is having one (or more) data champions. These internal staff members are able — through influence, education, or example — to advance the cause of data throughout the organization.

5. Be Prepared to Manage Change

Data analytics is an exercise in change management and that change won’t happen overnight. “It’s not a one week journey,” said Galina, “but once the traction picks up, it will be like a self-propelling train.”
To help ensure adoption, you need the support and buy-in of leadership and staff. Communication throughout the project is key. Be prepared to continuously demonstrate value through “quick wins” and shared success stories. Your data champion or analysts will also need to commit to spending time training, providing analysis, and working with both the early adopters and the risk-averse.
Debbie recommends publicizing and promoting internal data analytics work in much the same way you would promote an external benefit to members. Pick an area and provide a weekly summary to leadership about the “story in the data”. Encourage the analytical mindset with a “Question of the Week”. Look for and share examples of surprises in the data that defy intuition. Increase visibility and stimulate interest by placing a monitor in the kitchen or lobby that shows rotating high-level visualizations each day.

Giving Up is the Only Sure Way to Fail

Ultimately, though, success isn’t even possible if you don’t try. “The only time you can really call an analytics initiative a failure is if you give up,” said Debbie. “It’s an iterative process and the most important thing is to get started where you are.”

How to Hire for an Analytical Mindset

What is the Analytical Mindset?

In today’s fast-paced, data-driven world, high-performing organizations seek individuals with an analytical mindset. Individuals with an analytical mindset are able to analyze information, identify problems and trends, and solve complex problems. They are also curious. They ask “why” and they want to learn how to do things better, which improves the whole organization.

How to Hire for an Analytical Mindset

Many of our association clients ask us, “How do I hire someone with an analytical mindset?” Here are five tips your association can follow during the interview process to make sure your next hire has an analytical mindset.

Ask what they have learned recently.

To test for intellectual curiosity, ask candidates to tell you about something they learned in the last few months. Why did they choose that topic? What was their approach to learning about it? How did they use what they learned?
Remember, a truly curious person often learns things outside of their core area of expertise because they value learning for its own sake.

Give them a short assignment.

Look for individuals who can not only identify problems but quickly develop quality solutions. Before your second interview (after your initial screening), assign a short project and see how they respond. They should be eager to do it, and you can tell from the quality of the result whether they were genuinely curious or just gave it a surface treatment.
For example, at Association Analytics®, we ask applicants to create a dashboard with Tableau. They can use any data set and create any type of visualization. Then they walk us through their work and explain their findings. We listen carefully to understand their thought process – “I noticed x and wanted to find out why, so I then created z”. We also look for self-awareness and critical thinking. Can they look at their work and identify ways it could be improved?

Listen for quality questions during the interview. 

Curious candidates will have lots of quality questions, and they shouldn’t just be about the company benefits. We listen for questions about training and learning opportunities, questions about other team members’ skills, and questions about the nature of the analytical problems we solve for our clients.

Look for the intersection of business and technology.

Many people think that analytics is a purely technical field. But it’s not. A data analyst must understand the context – the business environment and its culture, processes, strategies, and tactics – in order to truly succeed. The analyst must understand why the findings matter in order to clearly communicate the meaning of the data to the business staff who make decisions. They have to be curious about cause and effect. Probe carefully for this understanding by asking candidates what their findings mean for the business.
Conversely, look for the capacity for data analysis in non-technical staff. They don’t need a background in statistics, but they should be able to review and understand data and be able to ask questions to learn more.

Validate your findings.

To find individuals with an analytical mindset and intellectual curiosity, employers must also be curious. Strive to learn as much as you can about a candidate. Validate your findings with outside assessments.
We ask candidates to complete the StrengthsFinder assessment. According to Gallup, people who use their strengths every day are six times more likely to be engaged on the job. What we have found with StrengthsFinder is that people with the strengths of “Learner”, “Achiever”, “Analytical”, or “Input” are all innately curious, so we look for and select for those strengths.

Derby Curse or Myth? Using Data to Challenge Instinct and Assumptions

The Major League Baseball All-Star festivities kick off tonight with the home run derby, where the leading power hitters face off to see who can hit the most home runs in a three-round contest. Like most sports all-star events, the derby struggles to maintain relevance and TV ratings, but there is no shortage of interest in the so-called Home Run Derby Curse. The debate over whether the curse is fact or fiction is a fun example of the importance of using data to challenge assumptions.
According to the curse, participating in the derby causes a major decline in home run production in the second half of the season. The theory is that all those big swings rob players of their energy or mess up their swing mechanics, causing big declines in second-half performance. There are many examples that seem to support this conclusion. In a famous case from 2005, Bobby Abreu of the Phillies came into the derby hitting .307 with 18 HR. He launched 41 homers in a record-setting derby win, then slumped through the second half of the season with a .260 batting average and only 6 HR, blaming the decline on residual effects from the derby.
Data analysts and baseball stats geeks have studied this subject exhaustively in recent years, proving the curse to be a myth using two main arguments:

  • Regression to the mean. Derby participants are very often enjoying extraordinary, career-best seasons, hitting home runs at a much higher rate than usual. It’s natural for them to regress in the second half toward their normal production.
  • There is no abnormal drop in production. Looking past some high-profile second-half collapses, you find:
    • 9 of the previous 16 winners actually increased their HR production in the second half (Yahoo Weekly Rotation).
    • The production of derby participants does decline, but less than that of those who did not participate (thebiglead).
    • Derby participants’ second-half production is low relative to their hot first half but still exceeds their own career averages (SABR).

Watch the derby for fun, and don’t be surprised if the curse is mentioned. In the office, use the same concepts to challenge your assumptions and beliefs about your business. Here are three tips to help you distinguish fact from fiction in your association.

  1. Don’t rely on instinct or anecdotal evidence. Those who perpetuate this myth are disregarding the data. Your association’s data is an asset. Don’t guess or assume when you can know.
  2. Compare results to baseline data. A trend in a particular segment is only significant to the extent it differs from the general population.
  3. Avoid confirmation bias. Believers in the derby myth can find a few cases every year to support their theory while ignoring the rest of the data. In business, errors like this can be costly. Even if a belief or assumption seems reasonable, be open to exploring alternatives.
