Lighthouse Blog


Organizations Have Accelerated Analytics, but the Paradigm Remains the Same

Analytics has become an important tool in helping businesses make informed decisions, operate more efficiently and uncover new opportunities. While most organizations today benefit from analytics, the road to get there has been long and winding. Over the years, analytics teams have responded to the growing and changing demands of their business users—from delivering transactional BI to standing up data warehouses to exploring big data and emerging analytics technologies.

As the complexity, speed and volume of data continue to increase, new methods and skillsets for analyzing data are pushing the competitive boundaries of what companies can and should do with their data. Organizations will have to address this new reality while continuing to support the diverse needs of business users—whether it’s transactional reporting, curated data warehouses or explorations in data science.

While the frontier may look new and daunting, as an industry we’ve been through a lot. As I look back at this history, I’m reminded of the old adage, “The more things change, the more they stay the same.” Let’s take a look back at how our shared past can help us be more successful today, and help us be ready for the next big thing in analytics. For a brief overview, download our infographic to see where we have been, where we are, and where we are headed.

Where we have been: The dawn of analytics

The analytics industry got its start by using transactional data sources to provide reports to business users. For most in the industry, analytics started on the application side. Data was extracted, connected, modeled and reported on from a standard source, whether it was an ERP system, a mainframe, an EHS or a core banking application.

These sources of information came with their own challenges. They were difficult to understand, difficult to extract and difficult to interpret. Business users needed insights they could act on, but they didn’t have the skills to get what they needed from these data sources. This gave rise to the programmer analyst, who developed skills in extracting and modeling data and presenting visualizations to business users. And for a time, all was good. A single solution had been found for organizations’ data troubles.

But, after a while, this approach created silos—of both skills and information—across the enterprise. That’s because each data source (i.e., transactional system) had a designated programmer analyst assigned to it, and there was no integrated source across all of the systems. Eventually, organizations recognized the need for a place to host data that was separate from the transactional systems. And thus, the data warehouse was born.

Where we are: The era of data warehouses and curated analytics

The data warehouse came in many flavors: data marts, data warehouses, operational data stores, information factories and others. It was created out of a need to integrate data from multiple sources and create an approachable source for user reporting and informational needs. Many data warehouses were attempted, many were built and many failed—or failed to be fully realized.

Those that succeeded typically solved a lot of the tougher integration questions of the past by cleansing, correcting and correlating data and delivering it to business users so they could use reports to help make decisions. To make reporting and access easier, significant modeling was performed to meet what different users wanted and needed.

This created the need for new roles in the analytics department, such as the business analyst, the information architect and the data warehousing architect. They set out to build and support systems to deliver curated and easy-to-use analytics for business users, who got the reporting capabilities that they needed as well as access to trusted data. However, this came at a cost to organizations because building and maintaining these systems required a whole lot of time, money and resources.

It came at a cost for business users as well. Analytics reporting cycles grew long—too long for some. Those who had not lived through the programming days grew impatient with these long cycle times. They began to “self-serve” and replicate what analysts had always done: each user began to source their own information sets, model them (now known as “wrangling data”) and create visualizations to consume their findings. And so, all was good again. Except there were now multiple team members providing the same numbers based on different logic, processes and methods.

Where we are headed: The open and self-service era

In recent years, as exponentially growing volumes of data have become available to enterprises, a new frontier has risen to the forefront: big data, along with the data scientists who can wrangle it. It represents a promising, and perhaps daunting, new territory for many organizations. As we’ve seen before, too often the approach in data and analytics has been that one solution will solve all—that the next new thing will be “the answer.” First it was the programmer analyst dealing with the mainframe, then it was the data warehousing era, and now, data science is where it’s at.

But in reality, each is highly relevant today, bringing its own opportunities and limitations. Each will be a big part of the modern enterprise going forward. With the collective knowledge we have gained as an industry, enterprises now have the capability to pursue all three approaches: programmatic, analytic and open. As it always has, our analytics journey keeps evolving and will continue to evolve. It is now time to deliver through multiple means and to streamline the redundancy and overlap among the approaches. Going forward, there will be important and evolving needs for:

  • Transactional reporting and system-to-system integration
  • Curated content that is governed, secure and trusted
  • A source for ad-hoc and self-service analytics

No single solution can meet all of the above needs, and it is now time to take all that we know and put it together to solve the information needs of organizations.

Putting it all together: Enabling enterprise analytics on all three fronts

The best path forward will be a balanced approach that smartly addresses the analytics needs of the enterprise on all three fronts. Here are a few things to consider to be successful in each of these areas:

Transactional sources: Organizations should model and map to transactional sources and limit the technical exposure to users. They should make limited queries (small result sets or single records) the default access for transactional sources, and remove direct access for citizen analysts. This will take the load off of the source and guide citizen analysts toward more open means. Organizations should also integrate between systems with hub, stream or data backbone methods. This access and integration will support users of transactional systems who need fast results in application usage (record counts, current status, scheduling and sequencing results).
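The guardrail described above—making limited queries the default so ad-hoc users never scan a whole transactional table—can be sketched as a thin access layer. This is a minimal, hypothetical example (the names `limited_query` and `MAX_ROWS` are illustrative, not from any specific product), using an in-memory SQLite table as a stand-in for a transactional source:

```python
import sqlite3

MAX_ROWS = 100  # default cap: ad-hoc access returns limited sets, never full scans

def limited_query(conn, sql, params=(), max_rows=MAX_ROWS):
    """Run a read query against a transactional source, returning at most
    max_rows rows to keep the reporting load off the system."""
    cur = conn.execute(sql, params)
    return cur.fetchmany(max_rows)

# Demo against an in-memory stand-in for a transactional table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, status TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i, "open") for i in range(1000)])

rows = limited_query(conn, "SELECT id, status FROM orders WHERE status = ?",
                     ("open",), max_rows=10)
print(len(rows))  # 10 — capped, regardless of table size
```

In practice the same cap could be enforced in a BI tool, an API gateway or a database role rather than application code; the point is that the default path for citizen analysts is bounded.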

Curated analytics: For most organizations, there are business users who need to know the score, see the dashboard and feel comfortable with a curated set of analytics. This requires extraction from sources, integration into a centralized space, modeling and visualization (the traditional data warehouse and reporting solution). These users are leading, running or controlling business outcomes for the organization. They need trusted data that is highly curated and will guide them to make better decisions. Most organizations will have this in place as some sort of data warehouse, ODS or mart with a traditional visualization solution. The data warehouse has to be offloaded from transactional purposes and from some of the self-service load.

Open and self-service analytics: Going forward, the new need in the industry is for data scientists, who will have a large appetite for data as they provide business users with new insights. In one sense, these data scientists are the new programmer analysts—different skills, tools and methods, but the same results. They will learn the sources of information, extract and manage (wrangle), model (prepare) and report on the data. We are back at the beginning with a whole lot of new knowledge, skills and methods. It will take time for organizations to manage the impact of this role and make it work well across the enterprise. While it seems new, in a sense we’ve been here before.
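The extract–wrangle–model–report loop described above can be sketched in a few lines. This is a toy illustration only—the column names and the sample extract are invented for the example, and real wrangling would involve far more cleansing logic:

```python
import csv
import io
import statistics

# Hypothetical raw extract from a source system (names are illustrative)
raw_extract = """region,revenue
east,100
west,not available
east,140
west,90
"""

# Extract: read the raw rows
rows = list(csv.DictReader(io.StringIO(raw_extract)))

# Wrangle: drop records that fail basic cleansing (non-numeric revenue)
clean = [r for r in rows if r["revenue"].replace(".", "", 1).isdigit()]

# Model (prepare): aggregate revenue per region
by_region = {}
for r in clean:
    by_region.setdefault(r["region"], []).append(float(r["revenue"]))

# Report: a per-region summary a business user could consume
report = {region: {"total": sum(vals), "mean": statistics.mean(vals)}
          for region, vals in by_region.items()}
print(report)
# {'east': {'total': 240.0, 'mean': 120.0}, 'west': {'total': 90.0, 'mean': 90.0}}
```

The shape of the work—source, wrangle, prepare, report—is exactly what the programmer analyst did decades ago, just with modern tools.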

Download our infographic

Let’s look at your analytics history

For many years, Lighthouse has helped organizations get results by making the most of their analytics capabilities and exploring new opportunities. We suggest that organizations look into their own analytics operations to take stock of where they are, how they got there and where they are headed. As we’ve seen, traditional sources, curated analytics and open analytics will each have a role going forward. The modern analytics organization will be finely tuned on all three fronts. Together, we can look at your current capabilities, your goals and your vision to help you succeed with analytics. Contact us today.
