Archive for predictive analytics

Partnership Provides Powerful Member Engagement Analytics

(Washington, DC—August 13, 2018) Association Analytics, the leading data analytics software and services company for associations, and Higher Logic, the industry leader in cloud-based engagement platforms, have announced their formal partnership.

Sharing the goal of innovating associations through technology, the two companies formalized their partnership on May 23, 2018. A case study detailing the impact of the new integration was released at the ASAE Annual Conference, featuring the predictive membership engagement and revenue indicators ASAE has captured.

Higher Logic’s Director of Partnerships, Bobby Kaighn, explained, “What may surprise association leaders is the potency of analytical trends when the Higher Logic community platform is integrated with Association Analytics’ Acumen data analytics platform. You can see the exact retention rate of members who engage in specific community activities, the relationship between community interactions and attendance at annual meetings, and the ratio of membership revenue to the frequency of online community participation.”

Associations rely on many source systems to store information on everything from finances to event registrations, and most of these sources are continually upgrading their advanced reporting functions to include visual analytics. Now imagine if an association could layer these data sets, such as marketing campaign data from one source, on top of member engagement statistics from another. This is where the synchronized connectivity of this partnership innovates the association industry.

Higher Logic’s data-driven approach helps organizations track and manage meaningful interactions along each stage of the member journey. Its expanded suite of engagement capabilities includes online communities and marketing automation, covering everything from the initial web visit to renewal and ongoing engagement.

Julie Sciullo, CEO of Association Analytics, said, “With Higher Logic as a standard Acumen platform integration, we’ve been able to quickly and easily import community data in real-time to interpret past and current data, bringing trends to light for strategic action plans.  The integration with Higher Logic empowers associations to overlay hundreds of data combinations to gauge performance and predict future outcomes.”

With the partnership between Association Analytics and Higher Logic already innovating associations, today’s announcement is shared in celebration of this ongoing and increasing benefit to the industry. For more, don’t miss the latest case study featuring ASAE’s actionable analytics on retention rates, the relationship between specific interactions and attendance at annual meetings, and the ratio of membership revenue to the frequency of online engagement.

Association Analytics (A2) is an industry leader in data analysis and management products, services, and training. We innovate associations by funneling your databases continuously into one powerful visual analytics dashboard for real-time interpretation and decision-making power. The platform is called Acumen, and it’s designed specifically for associations. This hassle-free, hosted analytics platform means you no longer need IT staff to run reports. Acumen is intuitive and easy to use, with out-of-the-box visualizations and reports that encourage cross-staff adoption. The flexible design allows you to choose only the modules you need and includes seamless built-in integrations with your AMS, LMS, email, and finance platforms. Modules include Membership Engagement, Events, Community, Finance, Membership, Sales/Orders, Email Marketing/Automation, and the Executive Dashboard feature. It’s time you had a single source of truth with a 360-degree view for better, faster decisions, enhanced member experiences, improved staff efficiency, and increased revenue. Learn more at www.associationanalytics.com

Higher Logic is an industry leader in cloud-based engagement platforms. Our data-driven approach gives organizations an expanded suite of engagement capabilities, including online communities and marketing automation. From the initial web visit to renewal and ongoing engagement, we help you track and manage interactions along each stage of the digital customer experience. Organizations worldwide use Higher Logic to bring people all together, by giving their community a home where they can interact, share ideas, answer questions, and stay connected. Everything we do – the tools and features in our software, our services, partnerships, best practices – drives our ultimate goal of making your organization successful. Visit www.higherlogic.com.

An Approach to Analytics both Hamilton and Jefferson Could Embrace

Happy 4th of July!  What a great time to think about data independence, democratization, and governance for your association.  In this post we’ll talk about the balance between the central management of data by IT and data directly managed by association staff.
Leading analytics tools provide great capabilities to empower people to make data-guided decisions. The ability to analyze diverse data from a breadth of sources in a way that is usable for association staff is a key feature of these tools. Examples include Power BI Content Packs and Tableau Data Connectors. These range from pre-built data sources based on specific applications such as Dynamics, Google Analytics, and Salesforce, to less common “NoSQL” sources such as JSON, MarkLogic, and Hadoop data. These tools rapidly make data from specific applications available in formats for easy reporting, but can still lead to data silos. Tools such as Power BI and Tableau provide dashboard and drill-through capabilities to help bring these different sources together.

Downstream Data Integration

This method of downstream integration is commonly described as “data blending” or “late binding”. An application of this approach is a data lake that brings all data into the environment but integrates only the specific parts needed for a given analysis. This approach does present some risks, as the external data sources are not pre-processed to enhance data quality and ensure conformance. In addition, business staff can misinterpret data relationships, which can lead to incorrect decisions. This makes formal training, adoption, and governance processes even more vital to analytics success.
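To make the late-binding idea concrete, here is a minimal sketch in Python with pandas. The file names and columns (ams_members.csv, community_activity.csv, member_id, logins, status) are hypothetical; the point is simply that the two sources are loaded as-is and joined only at the moment of analysis, as in a data lake.

```python
# Minimal late-binding "data blending" sketch (hypothetical files and columns).
import pandas as pd

# Load two independent sources without any upstream integration or cleansing.
members = pd.read_csv("ams_members.csv")           # e.g. member_id, join_date, status
activity = pd.read_csv("community_activity.csv")   # e.g. member_id, logins, posts

# Blend ("late bind") only the fields needed for this specific analysis.
blended = members.merge(activity, on="member_id", how="left")

# Because nothing was conformed upstream, basic quality checks happen here.
missing_activity = blended["logins"].isna().mean()
print(f"Members with no community activity record: {missing_activity:.1%}")

# Quick exploratory cut: average community logins by membership status.
print(blended.groupby("status")["logins"].mean())
```

Note that the quality check is part of the analysis itself rather than an upstream process, which is exactly the risk, and the governance need, described above.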

What about the Data Warehouse?

When should you consider using content packs and connectors, and how does this relate to a data warehouse and your association? The key is understanding that they do not replace a data warehouse, but are actually an extension of it. Let’s look at a few scenarios and approaches.

  • Key factors to consider when combining data are how closely the data is linked to common data points from other sources, the complexity of the data, and the uniqueness of the audience. For example, people throughout the association want profitability measures based on detailed cost data from Dynamics, while the finance group has reporting needs unique to their group. An optimal approach is to bring cost data into the data warehouse while linking content pack data by GL codes and dates (a minimal sketch of this follows the list). This enables finance staff to visualize data from multiple sources while drilling into specific details as part of their analysis.
  • Another consideration is the timeliness of data needed to guide decisions. While the data warehouse may be refreshed daily or every few hours, staff may need the immediate, real-time ability to review data such as meeting registrations, this morning’s email campaign, or why web content has just gone viral. This is like the traditional “HOLAP”, or Hybrid Online Analytical Processing, approach, where data is pre-aggregated while providing direct links to detailed source data. It is important to note that analytical reporting should not directly access source systems on a regular basis, but direct access can be useful for scenarios such as reviewing exceptions and individual transaction data.
  • In some cases, you might not be sure how business staff will use data and it is worthwhile for them to explore data prior to integration into the data warehouse. For example, marketing staff might want to compare basic web analytics measures from Google Analytics against other data sources over time. In the meantime, plans can be made to expand web analytics to capture individual engagement, align the information architecture with a taxonomy, and track external clicks through a sales funnel. As these features are completed, you can use a phased approach to better align web analytics and promote Google Analytics data into the data warehouse. This also helps with adoption as it rapidly provides business staff with priority data while introducing data discovery and visualizations based on actual association data.
  • Another important factor is preparing for advanced analytics. Most of what we’ve described involves interactive data discovery using visualizations. In the case of advanced analytics, the data must be in a tightly integrated environment such as a data warehouse to build predictive models and generate results to drive action.
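As a rough illustration of the first bullet’s approach, the sketch below (Python with pandas; the file names, GL codes, and columns are hypothetical) links detailed warehouse cost data to content-pack finance data by GL code and month:

```python
# Hypothetical sketch: link detailed warehouse cost data to content-pack GL
# summaries by GL code and month. Both sources have an "amount" column, which
# the merge suffixes distinguish.
import pandas as pd

warehouse_costs = pd.read_csv("warehouse_costs.csv", parse_dates=["posting_date"])
content_pack_gl = pd.read_csv("dynamics_gl_summary.csv", parse_dates=["period_start"])

# Conform the join keys: GL code plus year-month.
warehouse_costs["period"] = warehouse_costs["posting_date"].dt.to_period("M")
content_pack_gl["period"] = content_pack_gl["period_start"].dt.to_period("M")

linked = warehouse_costs.merge(
    content_pack_gl,
    on=["gl_code", "period"],
    how="inner",
    suffixes=("_detail", "_summary"),
)

# Finance staff can now drill from summary GL measures into detailed cost rows.
print(linked.groupby("gl_code")["amount_detail"].sum())
```

The design choice is the same one described in the bullet: keep the detailed cost rows in the warehouse, and let the lighter content-pack data attach to them through conformed keys rather than rebuilding it all upstream.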

It’s not about the Tools

The common element is that using data from sources internal and external to your association requires accurate relationships between these sources, a common understanding of data, and confidence in data to drive data-guided decisions. This makes developing an analytics strategy with processes and governance even more important. As we’ve said on many occasions: it’s not about the tools, it’s the people.
Your association’s approach to data democratization doesn’t need to rise to the level of a constitutional convention or lead to overly passionate disputes.

The Analytics Convergence

Data-guided decisions permeate our everyday lives as individuals, but how can you harness that power for your association and your members? The field of data analytics and big data is exploding with opportunity. Businesses are encroaching on the areas that used to be the private domain of associations – content, networking, events, etc. – because they are employing the power of analytics. But now, the process and tools needed to analyze and interpret data are much less expensive and easier to use than before. Are you ready to have a conversation with your data? Each level of business intelligence has a unique language to make your data speak!
The following presentation was created by Debbie King, CEO of DSK Solutions, Inc., and David DeLorenzo, CIO of the National League of Cities. This information was first featured at the 2013 ASAE Finance, HR, and Business Operations Conference.

Business Intelligence Trends 2013

Although the term “Business Intelligence” is so overused that it is almost meaningless, what is important to know is that advances in data science and analytics are affecting everyone – every day.  Tableau outlines important trends in this field for 2013. What do these trends mean to your association?

Data Visualization – More Than a Pretty Picture

It’s been said that a picture is worth a thousand numbers, and there is hard science to back this up. The human brain processes images 3X faster than text, and our brains excel at subtle pattern detection and interpretation of meaning. The visual system is extremely powerful: we process the visual field at once, and we can spot trends and outliers very quickly.
So what is it exactly about visual images that make data so much easier to understand?  It’s the mind’s ability to quickly see the most important information and to detect relationships. Pictures and colors allow us to take mental shortcuts.  We understand the red stop sign and the yellow warning symbols even before reading the words.  Remember how the light bulb went off in elementary school when we learned how relationships between sets could be represented visually with a Venn diagram? 
Today we have more data available than ever before, yet we have less information. We all yearn for an easy way to understand data quickly. Enter the era of data visualization.
Business Benefits of Data Visualization

  1. Visual patterns surface quickly
  2. Visualization is dense – many data points presented in a small space
  3. Accelerates time to insight
  4. Key to adoption and success of Business Intelligence (BI) projects

When we look at a table of numbers, our minds generally focus first on the largest numbers, then we mentally perform the task of converting the numbers into images that show how they relate to each other; for most people this is a vertical bar graph. In this way we “see” in our imagination how the largest number relates to the other numbers. But using a simple image such as a bar graph from the beginning frees the mind from the task of constructing the comparison and allows it to focus on the most important questions of all: what does it mean, and what should we do about it?
Here is a simple example. Which of these representations allows you to quickly determine the month with the highest total sales? Obviously the bar chart. But you can also immediately see the relationships between the sub-categories, months, and totals, which are lost in a table. With a quick glance it is possible to see there is a problem with March, which is not evident by looking at the table. Days later, chances are we’ll remember the image, not the numbers. The best data visualization tools also allow you to click on the colored bars and drill all the way down to the line item level.
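To illustrate the point, here is a minimal sketch (Python with matplotlib; the monthly sales figures are made up purely for illustration) that draws the same kind of numbers as bars, making the March dip visible at a glance:

```python
# Illustrative sketch: the same numbers that are hard to compare in a table
# stand out immediately when drawn as bars. All figures are made up.
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
sales = [42000, 45500, 21000, 44800, 47200, 49100]  # note the March dip

plt.bar(months, sales, color="steelblue")
plt.title("Total Sales by Month (illustrative data)")
plt.ylabel("Sales ($)")
plt.show()
```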
[Figure: Transactions Heatmap 2012]
Human beings have an incredibly refined ability to perceive patterns based on position, size, color and shape.  Data visualization allows us to see important patterns and correlations quickly without getting lost in the details.
Data Discovery
In the field of business intelligence, data discovery is the process by which we find the answers to the questions we have, namely the things we know that we don’t know. An important part of data discovery is the ability to present the data visually one way in order to simulate the mental conversation, allowing us to interact and “ask” meaningful follow-on questions of the data, and then “re-present” the data another way once more is known.
We are inspired when we see data visually and notice how it changes based upon the questions we ask. We can see the power as we interact with the data, and the analysis process becomes a natural extension of the activity of thought, enabling us to drill down, drill up, filter, bring in more data sources, or create multiple visual interpretations. Interactivity supports visual thinking. We can work with the data visually at the speed of thought, rather than writing queries against databases. We can learn and reach insights faster. When we interact with the data visually, we are participating in the data discovery process, and data becomes our ally, our partner in gleaning the information that becomes business intelligence.
Once we have tackled data discovery we can start using predictive analytics and time series analysis to move beyond what we know we don’t know – to uncover the patterns, correlations and opportunities hidden in the domain of “what we don’t know that we don’t know”.  Data visualization is the first step down the BI path because it is interesting, understandable and can deliver compelling results quickly, making us yearn for more.  This is an important point that bears repeating – because data visualization is interesting and engaging it can effectively speed the adoption and success of a business intelligence project.
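As a hint of what that next step can look like, here is a minimal predictive-analytics sketch in Python with scikit-learn. The warehouse extract, feature names, and renewal flag are all hypothetical; the only assumption is that the engagement data already lives in one integrated table.

```python
# Hypothetical sketch: predict member renewal from engagement features that
# have already been integrated into a single warehouse extract.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# One row per member, with a renewal flag (1 = renewed, 0 = lapsed).
data = pd.read_csv("warehouse_member_engagement.csv")
features = data[["logins", "event_attendance", "email_clicks", "years_member"]]
target = data["renewed"]

X_train, X_test, y_train, y_test = train_test_split(
    features, target, test_size=0.25, random_state=42
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Holdout accuracy: {model.score(X_test, y_test):.2f}")
```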
Big Data = Big Opportunity for Data Visualization
The term “big data” is used when referring to extremely large data sets that can change rapidly and include both structured and unstructured data. Big data is often characterized by the three dimensions of volume, velocity, and variety. Data visualization shines with big data, allowing us to quickly comprehend relationships we would never be able to grasp if presented as rows and tables, even in summary form. In fact it’s a whole new field: we used to have to be content with sampling large data sets, but with today’s tools we can analyze entire populations.
The example below is a simulation of a big data map where we have combined social media data with CRM data to show activity in the eastern region of the United States. The green dots represent members and the red dots represent nonmembers. The larger the circle, the more activity. Of particular note are the larger red dots, which represent an immediate opportunity for membership.
[Figure: Big Data Map]
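A map like the one described above could be mocked up along these lines (Python with matplotlib and NumPy; all coordinates and activity values below are synthetic, not real member data):

```python
# Synthetic "big data map" sketch: members in green, nonmembers in red,
# circle size proportional to activity level. All values are generated.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(7)
n = 500
lon = rng.uniform(-85.0, -70.0, n)   # rough eastern-US longitudes (synthetic)
lat = rng.uniform(30.0, 45.0, n)
activity = rng.integers(1, 100, n)
is_member = rng.random(n) < 0.6

plt.scatter(lon[is_member], lat[is_member], s=activity[is_member] * 3,
            c="green", alpha=0.5, label="Members")
plt.scatter(lon[~is_member], lat[~is_member], s=activity[~is_member] * 3,
            c="red", alpha=0.5, label="Nonmembers")
plt.legend()
plt.title("Member and Nonmember Activity (synthetic data)")
plt.show()
```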
Because of our ability to understand relationships quickly based on size, position, and spatial attributes, the eye can understand visual data much faster than what might otherwise require thousands of numbers and a long time to comprehend. Data visualization enables us to see and understand the stories our data is telling us. With the advent of big data and the pace of change, we need to be able to quickly interpret data so that we can take action and change our future for the better. That’s the power of data visualization.

Agile Business Intelligence

Business Intelligence (BI) projects that incorporate key aspects of Agile processes dramatically increase the probability of a successful outcome. 
I wonder why business intelligence (BI) projects have a reputation for being slow, painful, and ineffective, and why they often fail to deliver on the promise to improve data-driven decision-making. I believe part of the answer is in the approach: the waterfall, linear, command-and-control model of the traditional System Development Life Cycle (SDLC) that is still pervasive in most technology projects today. There is a better way!
One of the core principles of Agile is embracing a realistic attitude about the unknown. It is interesting that at the beginning of a traditional technology project, when the least amount is actually known about an organization and its business rules, environment, variables, players, questions, and requirements, the greatest amount of effort is made to lock in the scope, the cost, and the schedule. It’s understandable that we want to limit risk, but in reality the pressure to protect ourselves can lead to excessive time spent on analysis, which often still results in unclear requirements, leading to mismatched expectations, change orders, and cost overruns. This is a well-known phenomenon: at the very point where we have the least amount of information, we are trying to create the most rigid terms. See the “Cone of Uncertainty” concept.
I think part of the reason for this paradox stems from an intrinsic lack of trust. Stephen M. R. Covey explains in his book, “The Speed of Trust”, that trust has two components: character and competence. In each situation in which you are asked to trust, you must have both. For example, if your best friend is a CPA, you might trust them as a friend, have complete confidence in their character, and trust them to handle your taxes, but you will not trust their competency to perform surgery on a family member. It’s the same in business. We might have confidence in a vendor’s base software product, but not trust their ability to understand our needs or implement the solution well. And trust has to be earned. Once an organization has trust, the speed at which change can be communicated and accommodated dramatically increases. And this increase in speed translates into an improved outcome and a reduction in cost, both of which are by-products of the clear communication that is possible when trust is present.
What does all this have to do with business intelligence? I believe BI projects lend themselves to an agile, iterative approach, and this approach requires trust in order to work. I’m not a big fan of some of the Agile terminology, terms like “product backlog” (doesn’t “backlog” sound negative?) and “sprint” (is it a race?). But I do fully embrace the concepts of working solutions vs. endless analysis, communication and collaboration instead of rigid process enforcement, and responding to change vs. “hold your feet to the fire” denials of needed change requests. In general, it’s the concept of “outcome” vs. “output” that is so inspiring to me about Agile. I’ve seen examples where a technology project met all of the formal “outputs” specified in the contract, yet sadly failed to deliver the most important thing: the “outcome” the organization was really trying to achieve. For example, the CRM implementation that was delivered on time and on budget but that none of the staff would use, or the BI project that resulted in dashboards that measured the wrong things. These are not examples of successful projects, because the true desired outcome was not achieved.
How can Agile concepts be used in BI? 

  1. Identify an initial high profile “win” and complete a small but important aspect of the project to inspire the team, generate enthusiasm, engagement and feedback
  2. Facilitate data discovery: create a hypothesis -> investigate and experiment -> learn -> ask new questions and repeat the process
  3. Value the learning and the teamwork that is intrinsic to the process and which builds trust and speeds the ability to adapt to change

In a future post I’ll debunk some of the common myths that surround the topic of agile processes.