
How to Harness the Power of Recommendation

Taking a customer-focused approach to data analytics helps provide optimal value, enhance engagement and understand the overall customer journey. Individuals’ actions provide valuable information that goes further than what is collected with surveys and online profiles. Additionally, actions uncover hidden patterns that can be used to build a recommendation system to guide customers toward other interests.
Here are the most common approaches to creating recommendation systems:

  • Collaborative filtering. This is based on data about similar users or similar items. It includes these techniques:
    • Item-based: Recommends items that are most similar to the user’s activity
    • User-based: Recommends items that are liked by similar users
  • Content-based filtering: Makes suggestions based on user profiles and similar item characteristics
  • Hybrid filtering: Combines different techniques

Recommendation system results are similar to those on sites that suggest products and connections, like Amazon and LinkedIn. Collaborative filtering leads to more of a self-learning process, since it is based entirely on actual activity rather than data provided by users. There are scenarios where the other approaches are more appropriate, which we’ll address shortly.
Similarity between users or items is measured by “distance” calculations from those long-ago geometry and trigonometry classes. You can use the results with a visualization tool such as Tableau, creating a similarity matrix and quickly identifying relationships.
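The “distance” idea can be sketched in a few lines. The example below is a minimal, hypothetical sketch in Python (the R packages mentioned later work on the same principle): it computes cosine similarity between items from a binary user-activity matrix, producing the kind of similarity matrix you could then visualize in Tableau.

```python
from math import sqrt

# Rows = users, columns = items (1 = interacted, 0 = did not).
# Hypothetical activity data for illustration only.
activity = {
    "ann":  [1, 1, 0, 1],
    "bob":  [1, 0, 1, 1],
    "carl": [0, 1, 0, 1],
}

def item_vector(item_idx):
    """One column of the matrix: every user's activity for a single item."""
    return [row[item_idx] for row in activity.values()]

def cosine(a, b):
    """Cosine similarity: 1.0 = identical direction, 0.0 = no overlap."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Similarity matrix between all item pairs -- the same structure you
# could hand to a visualization tool to identify relationships quickly.
n_items = 4
sim = [[cosine(item_vector(i), item_vector(j)) for j in range(n_items)]
       for i in range(n_items)]
```

Item-based recommendation then reduces to looking up the rows of `sim` for items a user has already touched and surfacing the nearest neighbors they have not.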
It is sometimes helpful to group individuals and items into categories, which can be done by combining similarity scoring with data mining techniques like cluster analysis and decision trees.
Recommendation systems generally require data structured by columns instead of the row-based data that is best for interactive data discovery. As with text analytics, the items themselves — meetings, publications, donations, and content — become a large number of columns. Specialized R packages expect this structure for the recommendation system features described here.
These algorithms generally need binary values, like a “yes” if someone purchased an item and “no” if they did not. But if users can rate items on a scale of 1-5, what does a score of 3 mean? Normalizing scores based on individual and overall ratings is a good way to answer this question.
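One common way to normalize is mean-centering: subtract each user’s average rating, so a “3” from a tough grader and a “3” from a generous one are no longer treated the same. A minimal sketch with hypothetical data:

```python
# Ratings on a 1-5 scale; hypothetical data for illustration.
ratings = {
    "ann": {"workshop": 5, "webinar": 3},
    "bob": {"workshop": 3, "webinar": 1},
}

def normalize(user_ratings):
    """Mean-center one user's ratings: positive = above their own norm."""
    scores = list(user_ratings.values())
    mean = sum(scores) / len(scores)
    return {item: score - mean for item, score in user_ratings.items()}

normalized = {user: normalize(r) for user, r in ratings.items()}
# Ann's 3 for the webinar becomes -1.0 (below her average),
# while Bob's 3 for the workshop becomes +1.0 (above his average).
```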
The data requirements are really not as onerous as they may sound. Once data is in the right format for the R analysis tools, your imagination can take over to drive actionable association analytics. Content-based filtering works well for new users, and a hybrid approach can help prevent a “filter bubble” where some people get a too-narrow set of interests from similar recommendations.
Data from meeting registrations, membership history, donations, publication purchases, content interaction, web navigation, survey responses and profile characteristics can be used to guide association customers. Additionally, recommendations can bring people with common interests together. This new insight can be used to enhance all customer interactions, ranging from email marketing to dynamic website presentation to event sessions.

How Analyzing Social Media is Like Walking Across Bridges

How do your customers connect with one another? Social media mixed with a historic mathematical theory can help you find those patterns and bridge gaps.
Combining social media with other external data can help your association draw on a range of personal interactions and engagement to move toward a more customer-focused approach. Analyzing social networks is often done through the mathematical concepts of graph theory and network theory, showing relationships between individuals. Richard Brath and David Jonker described using these concepts for business in their book Graph Analysis and Visualization.
Technical analysis helps you identify people who are “connectors” linking several groups, “influencers” who help groups form, and cliques that would otherwise be difficult to detect. Representing people as a simple graph of shapes and lines lets you understand the network by:

  • Counting incoming and outgoing links between people.
  • Looking at the density of direct connections.
  • Examining the shortest and longest paths between people.
  • Considering how far the shapes are from one another.
  • Seeing how people tend to cluster together.

Graph analysis is based purely on activities and does not use other attributes, like demographics. You can supplement it with other data that you have, and you can use it to identify customers similar to the ones you found through social network analysis. It may be interesting to study, for example, how members’ interaction levels differ from those of non-members.
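The bulleted measures above correspond to standard graph metrics: degree (link counts), density, and shortest paths. A minimal pure-Python sketch on a hypothetical network (production work would use a graph library or the visualization approaches described below):

```python
from collections import deque

# Undirected "who interacts with whom" network (hypothetical data).
links = {
    "ann":  {"bob", "carl"},
    "bob":  {"ann"},
    "carl": {"ann", "dana"},
    "dana": {"carl"},
}

# Degree: raw count of connections -- a first cut at spotting "connectors".
degree = {person: len(neighbors) for person, neighbors in links.items()}

def shortest_path_length(start, goal):
    """Breadth-first search: number of hops between two people."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        person, dist = queue.popleft()
        if person == goal:
            return dist
        for other in links[person]:
            if other not in seen:
                seen.add(other)
                queue.append((other, dist + 1))
    return None  # the two people are not connected at all

# Density: actual links over possible links (n * (n - 1) / 2 pairs).
n = len(links)
n_edges = sum(len(v) for v in links.values()) // 2
density = n_edges / (n * (n - 1) / 2)
```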
To combine social media data with your other customer information, bring it all together in a data mart. You can also introduce text analytics to provide additional context. Different social media platforms make data available through application programming interfaces (APIs), each with its own technical integration options and data scopes.
While social media analysis has been getting more popular recently, the graph theory used to break it down is centuries old. Mathematician Leonhard Euler famously launched the field in the 18th century with the “Seven Bridges of Königsberg” problem, proving that no walk through the city could cross each of its seven bridges exactly once.
While crossing physical bridges may not be on your list of priorities, social network and graph analysis can help cross several metaphoric bridges in your association, including:

  • Segmenting customers: Grouping similar people, based on links and attributes.
  • Analyzing influence: Finding people with large numbers of connections and high activity.
  • Analyzing the market basket: Identifying items that are commonly purchased together.
  • Finding general correlations: Seeing relationships between people, products, events and other things.
  • Analyzing website visits: Determining which webpages are the most popular.

Visualization techniques can communicate the message in social network data through node size, node color, link weight, link colors, and labels. You can use a combination of visualization choices in Tableau to tell social network stories.

Words with (Association) Friends

Associations define the future through the exploration, analysis, and visualization of data. This generally involves using existing data to consistently describe key business events like event attendance, member engagement, training course popularity and website traffic. We can tell great stories with this data, but actual language can be the best form of communication. There are a lot of opportunities to use text analytics to help associations make even more confident data-guided decisions.

Taming Big Data

Text analytics is often viewed as within the realm of big data. This makes sense as it generally aligns with the volume, velocity, and variety characteristics commonly used to define big data.
Like other forms of data, text can be used to discover structure, meaning and relationships and provide context to other values. In the case of text, the data shows if a word is in content such as documents, comments and social media posts.
Picture a giant spreadsheet with one column for each of the nearly 10,000 commonly used words in the English language. That’s quite a bit of data for even the savviest Excel user or AMS application. Measures represented by the intersections between the rows and columns might include counts of words in documents and how close words are to one another.
Fortunately, several proven methods exist to make text data much more manageable. They include:

  • Removing “stop words” such as “a”, “and”, and “the” that are not likely to be studied.
  • Using frequency thresholds that include counts and how unique words are in the content.
  • Using stemming to group similar words with different suffixes, like “recommend,” “recommended” and “recommending.”
  • Applying the statistical technique of factor analysis that groups words by ideas and themes.

We don’t use these techniques just to address the volume challenges posed by big data. More concise data significantly improves the value of all advanced analytics.
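The reduction steps above can be sketched directly. The example below is a hypothetical Python illustration (R packages for text mining offer the same operations): it removes stop words, applies a deliberately crude suffix-stripping stemmer, and builds the per-document word counts that become rows of a document-term matrix.

```python
from collections import Counter

STOP_WORDS = {"a", "and", "the", "this", "was"}
SUFFIXES = ("ing", "ed", "s")  # crude stemmer for illustration only

def stem(word):
    """Strip a common suffix so 'recommended'/'recommending' group together."""
    for suffix in SUFFIXES:
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def tokenize(text):
    """Lowercase, drop stop words, and stem what remains."""
    words = text.lower().split()
    return [stem(w) for w in words if w not in STOP_WORDS]

docs = [
    "the speaker recommended this event",
    "recommending events was a great idea",
]
# One Counter per document: one row of the document-term matrix,
# with far fewer columns than the raw vocabulary would require.
dtm = [Counter(tokenize(doc)) for doc in docs]
```

After these steps, both documents share the stemmed terms “recommend” and “event”, which is exactly the kind of overlap that clustering and factor analysis exploit.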

Context is key

Text analytics data is much more valuable in conjunction with other internal and external information like index terms in documents, tags assigned to social content, survey questions accompanying free-form text comments, and characteristics of individuals generating content. Seemingly basic categorizations – like comments tagged as high quality by customers or those made by individuals with a high level of engagement – can significantly impact the analysis and help perform predictive analytics against new data.
You can also provide meaning to text through ontologies, which assign relationships similar to association business processes. For example, an “attendee” is associated with an “event.” They can be defined as part of the text analytics process, or obtained from third-party sources.

The usual models

Once our text is structured in a usable and manageable way, we can apply advanced analytics and statistical methods. We use techniques tailored to this form of data, including categorizing and grouping documents and words. These include:

  • Clustering – Grouping things determined to be similar, like words that often occur together or have similar meanings.
  • Classification Trees – Assigning documents to categories based on hierarchical rules. A document with the word “event” might be assigned a more detailed category, like “Detroit” or “annual conference.”
  • Graphs – Showing how variables are interconnected and influence one another. These are part of a broader category and are better used for scenarios such as modeling social networks.

There are two ways to categorize these approaches. A supervised approach means the goal is known, like assigning documents to a list of topics. In an unsupervised study, techniques like clustering are used to find similar documents – but without first identifying specific criteria. As with other types of advanced analytics, the modeling process is iterative and requires some manual validation.

What are they saying?

Sentiment analysis of social network content, or looking at positive and negative feelings, is a popular goal of text analytics. Deriving sentiment is more challenging than other applications of text analytics because of nuances in language and difficulty in understanding tone. Many suggested word lists are available to assist.
These two sentences both could indicate a person’s opinion about an event:

  • “I really got a lot of great information from this event!”
  • “There was much more great information presented at the prior events.”
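A simple word-list scorer shows the problem concretely: both sentences contain the positive word “great,” so a naive lexicon scores them identically even though the second is a veiled complaint. A hypothetical sketch with a tiny illustrative lexicon:

```python
# Tiny illustrative sentiment lexicon -- real word lists are far larger.
POSITIVE = {"great", "good", "excellent"}
NEGATIVE = {"bad", "poor", "terrible"}

def naive_sentiment(text):
    """Count positive minus negative words; ignores tone and context."""
    words = text.lower().replace("!", "").replace(".", "").split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

praise   = "I really got a lot of great information from this event!"
backhand = "There was much more great information presented at the prior events."

# Both sentences score +1: word counting alone cannot see that the
# second sentence is really a complaint about the current event.
scores = (naive_sentiment(praise), naive_sentiment(backhand))
```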

Another potential pitfall of sentiment analysis is from whom the data comes. Are individuals with negative experiences more likely to voice their opinion than those with positive feedback?
Sentiment analysis underscores the importance of making data-guided decisions, as observations should be investigated and measured over time before drawing definitive conclusions.

Applications for Associations

Associations can gain valuable information from a variety of common business scenarios.

  • Social media and collaborative platforms – Assigning categories and grouping similar comments.
  • Event surveys – Understanding specific feedback beyond discrete questions.
  • Meeting abstracts – Automatically assigning topics.
  • Document similarity – Recommending similar documents and identifying expertise.
  • Customer bios – Identifying individual areas of expertise.
  • Customer service contacts – Interpreting the reason for the contact.

A range of enterprise and other software tools, including the popular (and free) R programming language, are available to implement text analytics.  You can also visualize the results of text analytics using leading tools such as Tableau to create visualizations such as heat maps, document clusters, and word clouds.
Using these approaches and tools, which are part of our proven 5-step methodology, your association analytics can include the true customer conversations and engagement detail available from text.

Find the Business in Your Data

Recently I read the phrase, “find the business in your data.”  For years I have been saying, “your data is telling a story, and once you understand the story, you can change the ending.” Both ideas are similar and powerful: hidden within your data are the stories about what business your association should be in!  So often association business models are based on what we think our members and customers want, or what they said they wanted on a survey.  But we know that what people say is not the same as what they do.
The best way to understand and serve our customers is to combine what they say they are interested in (explicit interests) with what they actually demonstrate interest in (implicit interests).  For example, if an individual indicates on their profile or a survey that they are interested in governance and board effectiveness (explicit interests), but an analysis of their web activity shows they read articles on digital media and innovation (implicit interests), then we know that we want to engage with them about all of these topics.  The way to do this is to combine behavioral and social data with transactional data from the AMS or CRM in order to truly get a 360 degree view of a customer, their interests, and their engagement.  So how do associations find the business in their data?  Over the years, we have found four primary ways that analytics can do just that:
  • Performance Management/KPIs: What happened?
  • Data Discovery: Why did it happen?
  • Predictive Modeling: What will happen?
  • Social/Behavioral Data (Big Data): How can we make happen what matters most?
The worst thing in the world for an association is to experience a slow decline in its relevancy to its audience without understanding the reasons why.  This “boiling frog” syndrome is worse than a dramatic decline because it is easy to ignore or dismiss as unimportant, especially if only certain areas of the association’s business are declining while overall the organization is doing well.  The best way to find the business in our data is to start to understand the stories that are hidden there.  Progressive associations are making 2016 the year they invest in analytics as the best way to remain vibrant and grow.

Association Analytics Network Launches!

The first meeting of the Association Analytics Network, sponsored by DSK Solutions and held at ASAE, 1575 I Street, NW, Washington DC was held on Friday, December 11, 2015, from 10 am – 2 pm.  The focus of the meeting was an open conversation about the current state of association analytics and best practices for adoption.

  • Welcome and Introductions – Debbie King, CEO, DSK
  • Association Analytics: State of the Art and the Future – Reggie Henry, CIO, ASAE
  • Association Stories (short presentations) – Debbie Herrin, OSA; Karine Blaufuss, AGU; David Stephenson, Worldwide ERC
  • Working Lunch
  • Open discussions/Networking

 Discussion Topics

  • What’s working
  • What’s not
  • How to encourage the analytical mindset
  • Adoption acceleration strategies
  • Key Performance Indicators/What to measure
  • What’s next

Key Benefits:

  • Ideas and feedback from peers in an informal environment
  • Networking with other association analytics leaders
  • Concrete suggestions to improve your analytics operations
  • Stimulating conversations, good company, free lunch!

 Next Meeting:
The next meeting is scheduled for the week of March 21, 2016.

Enabling the Analytic Workflow

Just as associations have a treasure of diverse data waiting to have a conversation, Tableau offers a variety of options for interacting with your organization’s data to enable association analytics.

  • Live data source connection – serves as a pass-through to a data mart or source systems that submits queries during interaction
  • Published data source – contains connection information that is independent of any workbook and can be used by multiple workbooks
  • Packaged workbook – encompasses data source connection information associated with a specific workbook

Data extracts leverage Tableau’s high-performance data engine, based on VizQL, a technology that combines data querying and visualization, and they do not require completely loading data into memory.  This means that business staff can efficiently explore data with fast responsiveness.
Published data sources can be used by multiple workbooks to benefit from consistent customized folders, field-level customizations, data hierarchies, calculated fields, dimension/measure assignments, and data selection criteria.  This ensures that any changes, such as the assignment of business-friendly names, will automatically be available to all visualizations and dashboards using the data source.  The underlying data is automatically refreshed based on customizable schedules.  Published data sources can also be organized by project and contain keyword tags to facilitate discovery.  These and other benefits make published data sources the optimal option for association analytics.
Data to Match the Business
The process of creating and editing data sources involves interacting with databases or other data formats.  Source data should be optimized for analysis and structured in a way that matches business processes.  A process to align data organization with actual business events and analysis goals such as Modelstorming ensures valuable business staff engagement and future flexibility.  Adding a new attribute involves simply adding it to a descriptive dimension table, while new dimensions can be quickly created and aligned with fact tables representing measured business events.  Likewise, new business events can be rapidly linked to existing descriptive dimension tables.
More Data to More People
Business staff throughout the organization can create and edit visualizations leveraging published data sources using a browser-based Web Edit feature that is part of Tableau Server and Tableau Online.  This feature provides an optimal set of capabilities similar to Tableau Desktop and does not require additional licenses.   These features include the ability to create any visualization type from the same data sources as Tableau Desktop.  In addition, custom edit and view permissions can specify which groups can access data sources and create visualizations.


The Analytic Workflow
In addition to rapidly creating data sources for exploration, visualizations and dashboards should align with a process for analytic thinking.  A common scenario involves business staff reviewing and interacting with higher-level dashboards to guide focus and spawn additional questions.  Traditionally, the analyst would then need to review individual reports to address an initial set of questions.  Dashboard and worksheet actions enable context-specific navigation and filtering which matches the data discovery process.  For example, a chart might trigger curiosity about a specific product category, such as “How are the events included in this total distributed?  Did marketing campaigns contribute substantially to this total?  What are the geographic patterns?”
A menu of potential dimensions can be available that guides exploration towards visualizations automatically filtered by the context of the bar chart value.  The result is ongoing questions and answers about the data.  The opportunities are limitless and help foster individual curiosity and an analytic culture in your association.  The right implementation of these data exploration capabilities, with a data layer created specifically for the association, will liberate the data and enable data-guided decisions for your association.

Association Analytics – Begin with Why

Simon Sinek, in his famous TED Talk, “How Great Leaders Inspire Action”, explains that inspired leaders and organizations all think, act and communicate the same way.  They start with “why”. Only once they are clear about “why” they do what they do, do they tackle the “how” and “what”.  In other words, they think, act and communicate the opposite of everyone else – they start on the inside of the circle (“why”) and move outward (“how”, then “what”).   For me, this philosophy applies to association analytics as well.  When I am asked, “Where should we start?” my approach is to first get very clear on “why” you do what you do – what is your organization’s purpose from a strategic point of view?  If your primary mission is to educate your community of interest, then everything else that you do should be evaluated in that light.  How you do it, and what you do, come after the why.  Sounds simple, but in practice it’s not as easy as it sounds because we are not used to thinking this way.  And as important as it is to identify the “why”, it is every bit as important to measure the “what” and the “how” so that you know whether you are achieving the “why”!  That’s where analytics are essential.  For example:

  1. The primary mission of one of our association clients is to educate people (this is their “why”).  They have a strategic objective that is expressed this way (action + measurement + date):
    “Triple the number of professionals we educate within five years.”
  2. In order to achieve this aggressive objective, they identify what seems like a logical initiative (this is the “how” and the “what” and is the tactical approach to meeting the strategic objective), in this case:
    “Triple the number of courses we teach.”
  3. They then use analytics and data visualization to measure the results of their initiative and find that in one year alone they almost tripled the number of courses from 34 to 90 – however, they only educated 888 more people!  In fact, even though they did educate more professionals, they lost money because of the extra logistics required to teach so many courses, and it was not economically sustainable.  At this rate they would fail to achieve their strategic objective and their “why”.  In the visualization below, you can see the number of courses indicated by the height of the bar, and the number of participants indicated by the depth of the green.
  4. When they stepped back and used data discovery to analyze the data, they were able to clearly see the patterns of participation. Registration was primarily related to location.  So they formed a new initiative: “Locate and market our courses in the areas most densely populated by our target population.”
  5. Within five years, they were able to consolidate the number of courses and more than triple the number of registrants:

Analytics helped to clearly illustrate that the initial “what” (triple the number of courses) was the wrong approach.  With continued focus on the “why” – to educate more people – they were able to shift their initiative (concentrating courses based on location and target population) to exceed their strategic objective within five years.  There are many ways data analytics can help identify and measure initiatives you can undertake that will address the “what” and the “how” – but first start with “why”.

Association Analytics at Tableau Conference 2015

We had an amazing opportunity to attend the 8th annual Tableau Conference last week in Las Vegas.  For us, the highlight of the event was a great dinner where we celebrated our partnership with many of our association data rock stars.  We’re honored to have such amazing clients.   As we get back to the business of Association Analytics, we’d like to share some of the experience and knowledge from the conference.

Christian Chabot, the CEO and Co-founder of Tableau Software, opened the show and proudly described the growth of the conference which this year numbered over 10,000 participants.  He previewed the event and demonstrated Tableau’s commitment to the future and the mission of helping people see and understand data by noting that Tableau plans to invest more in R&D over the next two years than all the previous years combined.  Keeping with Tableau’s goal to empower people, the first event was “Developers on Stage” – the opportunity to learn about upcoming Tableau features directly from the development teams.

Developers on Stage TC15

The developers took the stage with enthusiasm rivaling the audience.
Individual development team leads demonstrated and described features including:

  • Inclusion of multiple databases in a data source
  • Ability to “Union All” objects in a data source
  • Automatic clustering
  • Easy application of worksheet filters to dashboards
  • Outlier detection
  • Option to exclude table totals from color logic
  • Several additional international zip codes
  • Availability of additional external maps and GIS formats
  • Feature to highlight data based on text search
  • Version control in Tableau Server
  • Personalized Tableau Server home pages
  • Creation of “visualizations within visualizations”
  • Application of global formatting
  • Rapid optimization of visualizations for mobile devices

Factions of the crowd roared with approval as the developers announced their favorite features.  We eagerly signed up for the opportunity to beta test new product releases and continue to keep an eye on the future while understanding the product roadmap as a Tableau Partner.

Breakout Sessions & Hands-on Training

Most of the conference time was dedicated to extremely valuable breakout sessions and hands-on training.  Session tracks included Analytics, Big Ideas, Customer Session, Data Storytelling, Developer, IT, Workshops, and Zen Master Session.  I focused on targeted technical areas (some “Jedi level”) that are important to our clients, involving advanced analytics, optimal performance, R integration, and cloud-based architectures, while my colleague Bill Conforti focused on the key areas of analytics culture, adoption, and training which are so vital to association analytics success.
Guided Analytics: A Guiding Light in a Data Desert
This session demonstrated various advanced techniques to align Tableau with diverse analytic workflows such as dashboard interactions, actions between worksheets, and dynamically displaying visualizations based on parameters.
Jedi Strategies Using R-Integration
Tableau provides flexible R integration using script tasks that interact with a server (RServe) and behave similar to calculated fields.  The speaker led a demonstration based on a scenario from a great Tableau-focused blog and included accurately displaying flight paths incorporating the curvature of the earth.  Other great information included basic design patterns and ways to optimize performance.
Revenge of the Nerds: Advanced Analytics and Tableau
Tableau provides a range of advanced analytics capabilities leveraging diverse features.  Examples demonstrated included visualizations based on multiple plots, what-if analysis using stories, forecasting, trend lines, and level-of-detail aggregations.
Programming Tableau: Introduction to APIs
Tableau application programming interfaces (APIs) include a software development kit (SDK) for creating extracts, a JavaScript API for working with views in a browser, a REST API for managing Tableau Server, and JavaScript to develop web data connectors.  The session included demonstrations of examples of each type of API that associations can leverage to integrate with external systems such as conference registration providers and more efficiently create incremental data source extracts.
Turn Your Data Pile into a Data Stack with Tableau Online and Tableau Data Server
Tableau accommodates various architectures and data sources.  The speaker described cloud-based data sources such as Amazon Redshift and built an integration to the Twitter API using Google BigQuery as the intermediate data source.
Getting Your Performance Up
The speakers presented various opportunities to enhance Tableau performance and declared that replicating reports is the #1 cause of performance degradation.  We reviewed the Performance Recorder available within the Help menu, along with the resulting log files, to identify performance bottlenecks.  The speakers listed several scenarios that negatively impact performance, including custom queries, filters on table calculations, failing to leverage “show relevant values,” blending data that could instead live in a single data source, and wide tables as opposed to tall tables.  They also noted that a good data warehouse can be better than an extract, and Tableau visualizations are only as fast as the live underlying database.
Analyzing Data: The Balance of Art and Analysis
This session focused on creatively using features of Tableau to identify opportunities to develop analysis and presentations that business staff may not have considered.

Keynote Speakers

The keynote speakers were incredible and a great fit for the conference as data analytics using Tableau drives data-guided cultures, spawns creativity, provides deep analysis, and transforms work structures.
Daniel Pink, Best-selling author of Drive & host of the TV show “Crowd Control”
Similar to his fascinating books, Daniel Pink described characteristics that drive successful work cultures.  He discussed the key factors of autonomy, mastery, and purpose.  He concluded that people do great work when they are engaged, and self-direction is the key to engagement.  He also recommended carving out a few islands of autonomy and introduced the idea of the Autonomy Audit.
Neil DeGrasse Tyson, Astrophysicist & host of the TV show “Cosmos”
Dr. Tyson was a highly anticipated speaker and was predictably extremely entertaining.  He appropriately spoke on Back to the Future Day and evaluated the accuracy of science-themed fiction movies.  Although it is expected that such films take creative liberty, he stressed examples where filmmakers paid attention to other extreme details while neglecting science.  His main theme was the importance of data to identify objective truths.  Dr. Tyson even spent a considerable amount of time taking questions from the audience.
Hannah Fry, Mathematician, University College London Centre for Advanced Spatial Analysis
Dr. Fry provided very insightful observations about the analytics process including “numbers can’t speak for themselves, we need to speak for them” and “deeper analysis is needed to ensure what your conclusions are telling you”.  She presented scenarios that exemplify the value of data exploration and visualization where initial conclusions are made based on aggregated data.  For example, outliers influenced a popular study involving public debt driving significant policy decisions by skewing averages.  Dr. Fry also demonstrated revealing map visualizations involving London bike share and other data.


Sir Ken Robinson, Best-selling author, internationally acclaimed expert on creativity & innovation
The dry, ironic sense of humor of Sir Ken Robinson was a great fit with the event and kept the audience very engaged as he provided fascinating observations about society, education, and innovation.  He described how traditional education conflicts with how life operates, being based on conformity rather than diversity.  He noted that advancements in technology are driving an educational revolution, as life is not linear, but organic.  His observations on the power of imagination, how imagination leads to creativity, and the importance of an environment that fosters creative potential were very inspirational.  Sir Ken Robinson closed by referring to a Tableau customer testimonial quoted by Christian Chabot earlier in the event, to the effect of “Tableau allows me to be creative, and I am not a creative person.”  He confidently noted that this is not correct – everyone has profound creativity in some way.
The Conference provided deep value and we look forward to next year in Austin at TC16!

How Association Leaders Make Good Decisions

Hal Varian, Google’s Chief Economist, is quoted as saying, “The ability to understand, visualize, and communicate data will be the most important skill in the next decade.”
Notice he didn’t say anything about technical skills like SQL or building a data warehouse – he’s talking about the ability to understand and communicate about data.  The reason this hard-to-find skill is so essential in today’s world is that, because of the accelerating pace of change, we all have more decisions to make and less time to make them.  For association staff this leads to greater risk of making a bad decision, or, even worse, making no decision – which is in itself a decision.
In our personal lives we use data to make decisions every day, and we know that it is both faster and less risky.  For example, to pick a book or movie, we check how many people have reviewed it and the average score; to decide which route to drive to work, we use our mobile GPS or Waze; to know how much exercise we need, we check the data on our FitBit.  But in association business the decisions have a higher profile, and both the questions we ask and the data we collect are more complex.  That’s why the ability to understand the story in the data and communicate it clearly to others is so important.
In fact, what we all really want is the ability to make decisions with confidence.  The way to do that is to have a single version of the truth, so that we can understand the story and have a conversation with the data – asking new questions interactively, at the speed of thought.  From a technical perspective, the best way to create a single version of the truth is to reduce complexity by transforming multiple sources of transactional data into a logical business layer stored in a data mart.  That way you don’t end up with different answers to the same questions because different people used different criteria for their queries.  But the technical part is not the most important aspect.  People think the field of analytics is about DATA – but I believe it’s really about PEOPLE.  People decide what to measure.  People explain the meaning to others, and people take action to get results.
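The data-mart idea above can be sketched in a few lines of Python.  Everything here is illustrative – the table names, fields, and figures are hypothetical, not from any real association system – but the principle is the one described: several transactional sources are merged once, by one agreed-upon rule, into a denormalized record set that every report reads from.

```python
# A minimal sketch of a "single version of the truth": merge transactional
# sources into one denormalized record set (a tiny data mart) so every
# report starts from the same rows and the same business rules.
members = [
    {"member_id": 1, "name": "Ada"},
    {"member_id": 2, "name": "Ben"},
    {"member_id": 3, "name": "Cy"},
]
orders = [
    {"member_id": 1, "amount": 450.0},
    {"member_id": 1, "amount": 99.0},
    {"member_id": 2, "amount": 450.0},
]

def build_mart(members, orders):
    """Left-join order totals onto members; members with no orders get 0."""
    totals = {}
    for o in orders:
        totals[o["member_id"]] = totals.get(o["member_id"], 0.0) + o["amount"]
    return [
        {"name": m["name"], "total_spend": totals.get(m["member_id"], 0.0)}
        for m in members
    ]

mart = build_mart(members, orders)
print(mart)  # downstream queries read from `mart`, not the raw tables
```

Because the join and the business rule ("members with no orders count as zero spend") live in one place, two staff members asking the same question can no longer get different answers by writing different queries.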
The ability to make decisions with confidence is a leadership trait.  To advance in our careers we must be able to do this.  This is what leaders do – they ask questions, make decisions, and influence others.  Those of us who understand the story in our data can make better decisions faster, and we can make those decisions with confidence because we have data as evidence.

More confidence + good decisions = advance your career!

When I started my career in associations 22 years ago, I was Director of IT for the National Institute of Governmental Purchasing.  During the 7 years I was there, I experienced first-hand how important data was.  We didn’t refer to it as “data” – we called it reports and queries – but I saw that leadership needed information to make decisions.  I was able to add value, get noticed, and get promoted by delivering and explaining the meaning of the data in ways the Board of Directors could understand.  I attribute most of my success to that single skill.  Years later, we’ve taken this same skill of understanding and communicating clearly about the stories we see in data and created an entire business model that focuses exclusively on serving the association market.
In the past there has been a history of miscommunication between IT and association leaders.  The leaders would ask for what they wanted, and IT would deliver what they thought they heard.  Often the leaders would have another question, or want different information, or want it grouped another way.  This took time and frustrated both IT and the business.  But it shouldn’t be a BAD thing when you have a better quality question!
The reason those of us in IT became the arbiters of all things data is in large part that the transactional databases containing the data were designed to optimize data storage, not data retrieval.  This meant that most association leaders or staff could not confidently get the data they wanted quickly and accurately without help from IT.  It became a technical exercise to correctly construct all the table joins and criteria.  And even after these database systems added good query tools, it was still difficult to translate business questions into queries, and it only took one or two episodes of retrieving the wrong information before confidence was lost in the “database”.  Enter the era of centralized reporting.  But this really wasn’t the solution association leaders and staff wanted – what they really wanted was to be able to have a conversation with the data themselves.
In today’s world, we want to encourage association leaders and staff to ask questions of the data themselves – at the speed of thought.  This is what is meant by a “conversation with the data”.  Ideas become fluid when we interact visually with data.  We understand images 3 times faster than text, and with visual data discovery, science and art mix, enabling new insights and helping us to see what we didn’t know that we didn’t know.  This is how we make decisions with confidence and enhance our leadership abilities.

Leave Manual Data Collection and Static Reports in the Past

Do you ever wonder why we’re still using the Nielsen ratings?  These are the ratings used for measuring TV show popularity and demographics.  They’re heavily criticized for being inaccurate, yet still relied on heavily by networks and advertisers.  I was surprised to learn the company still collects some of its ratings data by sending paper surveys to American households.  The recipient must document every minute of every show watched on any TV in the house for an entire week in a paper journal.  At the conclusion, the survey is mailed back to Nielsen, where someone goes through the journal and enters the data manually into a computer.  Once all the surveys are received and entered, the data is crunched and combined, and out comes a rating number.  These methods have been around for decades.  While they may have worked before computers and before households with multiple TVs, today they have some big disadvantages.  Compared to electronic collection, paper surveys are more subject to response bias (only collecting data from those who choose to respond, who may over- or under-report certain types of programming).  Since they report mainly on household habits rather than individuals, they miss key demographic information.  Worst of all, they lack the automation and efficiency of more modern collection and measurement methods.
Let me bring this back around to associations and dig into a specific example of where you might see this concept of outdated data collection and management in your organization.  A common manifestation is preparation for a board meeting.  It often takes a team of people working 2-4 weeks to collect and summarize all the data points desired for analysis in the meeting.  After all that effort, the data is already outdated and doesn’t provide any detail beyond the summary level.  Wouldn’t it be nice to collect data the same morning or, even better, view and interact directly with near real-time data on a visualization or dashboard during the meeting?
For many associations, replacing manually generated reports is the first tangible benefit of a data analytics initiative, saving hours of staff time for collecting and calculating data.  Other benefits include:

  • Updated information available on demand
  • Establishing a common language across the organization for complex concepts and formulas such as member retention and engagement
  • Consistent formulas for calculations, defined once and used repeatedly
  • Instant ability to ask/analyze new questions based on insights gained or patterns recognized
  • Visually compelling and interactive presentation
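As a small illustration of the “consistent formulas” point, a metric like member retention can be defined once in code and reused by every report and dashboard.  The definition below – renewed members divided by members who were up for renewal – is one common convention, not a formula taken from this article; the numbers are hypothetical.

```python
# Codify one shared definition of a metric so every report agrees.
# Convention assumed here: retention = renewed / up-for-renewal.
def retention_rate(up_for_renewal: int, renewed: int) -> float:
    """Share of members due to renew who actually renewed."""
    if up_for_renewal == 0:
        return 0.0  # avoid division by zero for an empty cohort
    return renewed / up_for_renewal

# Hypothetical cohort: 1,200 members due to renew, 1,020 renewed.
print(f"{retention_rate(1200, 1020):.1%}")  # 85.0%
```

Once a definition like this lives in one place, “retention” means the same thing in the membership report, the board deck, and the dashboard – the common-language benefit listed above.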

If you’re still using manual methods like the old Nielsen surveys, it is costing your team and your organization dearly.  Act now and start reaping the benefits of modernization in your association’s data collection and presentation.