About Analytics

Analytics is a commitment to making fact-based decisions. It is about understanding what has happened, predicting what is likely to happen, and figuring out how to ensure that the best things happen. It is all based on data, and businesses are collecting more and more data, sometimes more than they know how to handle.

People often think of analytics as "reports". Reports are important, but analytics goes much further than that.

It all starts with data

Analysis is only as good as the data it is based on, so having quality data is important. However, it is equally important not to get so concerned with the data that you forget to do something with it.

Just about every business tells us that their data is probably the worst we've seen, and each is surprised to hear that most businesses have similar problems with their data. Some industries are more advanced than others, but data management and analytics are still at an early stage of what is possible, where just getting started can still give a competitive advantage. In a few years, those who haven't started on the path will be at a severe competitive disadvantage.

Check out our strategic services offering to help decide whether your data is suitable for analytics now, or whether you should wait. One interim solution could be an analytical datamart. We can assist with gathering data from various parts of the business and integrating it into a form suitable for analysis. This allows you to start getting results immediately, without waiting for the enterprise data warehouse project to complete.

Reporting and business intelligence: looking back

Business Intelligence (BI) is about creating reports to understand what has happened and is currently happening. It is very important, and forms the basis of making fact-based decisions. A good BI capability should be a priority for any business. However, BI is about looking back, and it's not really what we do.

Advanced analytics: looking forward

The type of analytics we do is forward-looking: trying to figure out what will happen, and what is the best that can happen. This involves using techniques beyond querying and reporting, and we call this advanced analytics.

Most analytical projects involve blending many techniques, with a careful focus on the goals we are trying to achieve. However, the fanciest model in the world isn’t any good if it can’t be validated or trusted. Advanced analytics can be broadly grouped into the following categories.


Forecasting

Forecasting is about predicting what will happen in the future, based on what has happened in the past. It involves using a combination of statistical modelling algorithms (from us) and expert domain knowledge (from you).

Typically the starting point for forecasting is a time series of historical data for the quantity being forecast, at a regular interval. For example, units sold per week or peak energy demand per month. In addition, there might be other variables that we suspect influence the quantity being forecast, such as price, GDP or temperature. The statistical forecasting algorithms, guided by expert knowledge, are able to find the best model for that particular time series.

With the right software and approach, we are able to forecast hundreds of thousands (or more) different time series, reconciled up and down a hierarchy if necessary (viewing forecasts at a national, state, per-store, or per-product level, as required). Robust forecasting projects can also consider the effect of events, scenario modelling, and Monte Carlo simulation to get a distribution of possible outcomes.
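As a minimal sketch of the starting point (a real engagement would use richer models with trend, seasonality, and explanatory variables, such as ETS or ARIMA), here is simple exponential smoothing applied to an invented weekly sales series:

```python
def ses_forecast(series, alpha=0.3, horizon=4):
    """Simple exponential smoothing: each observation updates a
    smoothed 'level'; the forecast is the final level, held flat."""
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return [level] * horizon

weekly_units = [120, 132, 118, 135, 141, 129, 150, 144]  # invented history
print(ses_forecast(weekly_units, alpha=0.3, horizon=3))
```

A higher alpha weights recent weeks more heavily; choosing it well, and extending the model with trend and seasonal terms, is where the statistical algorithms and domain knowledge come in.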

Predictive modelling

Predictive modelling is the name given to a collection of techniques that use relationships in the data we have to estimate things about the data we don't have. It is about predicting what will happen under various circumstances. A few examples are described below.

Segmentation and clustering are techniques that group entities based on their similarity in known variables. For example, segmenting customers based on their mobile calling patterns and data usage. Or, clustering stores based on the types of products they sell most often. By grouping things together, we can develop separate strategies to deal with each group, and can more easily conceptualise the value and trajectory of these groups over time.
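To show the mechanics of clustering, here is a naive k-means sketch on invented customer figures; production work would use a proper library, careful feature scaling, and far more variables:

```python
import random

def kmeans(points, k, iters=20, seed=1):
    """Naive k-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its members; repeat."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[j].append(p)
        for j, members in enumerate(clusters):
            if members:  # keep the old centroid if a cluster empties out
                centroids[j] = tuple(sum(d) / len(members) for d in zip(*members))
    return centroids, clusters

# Invented customers: (monthly call minutes, data usage in GB)
customers = [(50, 1), (55, 2), (60, 1), (400, 20), (420, 25), (390, 22)]
centroids, clusters = kmeans(customers, k=2)
```

On this toy data the algorithm separates the light users from the heavy users, and each group can then be given its own strategy.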

Propensity modelling assigns a probability that a customer will respond a particular way, using analysis of customers who have responded that way in the past, and how similar this customer is to them. This technique is an ideal way to select the best customers for a campaign, based on their likelihood of responding.
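As an illustration of the idea only, here is a toy propensity model: a tiny logistic regression fitted by gradient descent on an invented campaign history. The features and figures are made up for the example:

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def train_propensity(X, y, lr=0.1, epochs=1000):
    """Tiny logistic regression fitted by stochastic gradient descent."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for x, target in zip(X, y):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - target
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def score(x, w, b):
    """Propensity: estimated probability that this customer responds."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# Invented history: (months since last purchase, past purchases) -> responded?
X = [(1, 5), (2, 4), (1, 3), (6, 1), (8, 0), (7, 1)]
y = [1, 1, 1, 0, 0, 0]
w, b = train_propensity(X, y)
```

Ranking every customer by score() and taking the top of the list is exactly the campaign-selection use described above.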

Classification is about determining the likelihood that an entity belongs to a particular group, based on the known characteristics of the entity and the known characteristics of others in the group. It uses many of the same techniques as propensity modelling: regression, decision trees, neural networks, etc.
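One deliberately simple classification technique, sketched here with invented store profiles, is k-nearest neighbours: an entity is assigned to whichever group most of its closest known neighbours belong to:

```python
def knn_classify(x, labelled, k=3):
    """Majority vote among the k nearest labelled neighbours."""
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    nearest = sorted(labelled, key=lambda item: dist(x, item[0]))[:k]
    votes = {}
    for _, label in nearest:
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)

# Invented store profiles: (basket size, trade-customer share) -> store type
stores = [((1, 1), "retail"), ((2, 1), "retail"), ((1, 2), "retail"),
          ((8, 9), "wholesale"), ((9, 8), "wholesale"), ((9, 9), "wholesale")]
print(knn_classify((2, 2), stores))  # → retail
```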


Optimization

When you have thousands of possible decision states, analytical optimization finds the best possible decision, to maximize or minimize some desired outcome.

For example, choosing the best combination of prices for all your products to maximize profit, given price elasticities and constraints on how many products can be on promotion at a given time. Or, choosing the best roster allocation of staff to shifts, given leave requests, roster stability, fair allocation of hours, etc. Or, the best delivery routes through a city in order to make all the deliveries within certain time windows, and minimize distance travelled.

The possibilities are endless. Any time you have lots of possible solutions to a problem, a formula that calculates which solution is better than another, and rules for deciding whether a solution is allowed, optimization can be used to find the best possible decision. It uses advanced mathematical algorithms to sort through the millions of combinations on a computer, and typically finds solutions significantly better than humans can find.
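At toy scale the idea can be shown by exhaustive search: score every allowed combination and keep the best. The products, uplifts, and space constraint below are invented; real problems need proper solvers, because the number of combinations explodes:

```python
from itertools import combinations

# Invented figures: extra profit from promoting each product, and the
# shelf space each promotion consumes.
uplift = {"tea": 120, "coffee": 200, "biscuits": 90, "milk": 150}
space = {"tea": 2, "coffee": 3, "biscuits": 1, "milk": 2}

def best_promotion_set(uplift, space, max_space):
    """Score every feasible subset and keep the best; workable here
    only because 4 products give just 2**4 = 16 candidate solutions."""
    best, best_profit = frozenset(), 0
    products = list(uplift)
    for r in range(len(products) + 1):
        for subset in combinations(products, r):
            if sum(space[p] for p in subset) <= max_space:  # the "rules"
                profit = sum(uplift[p] for p in subset)     # the "formula"
                if profit > best_profit:
                    best, best_profit = frozenset(subset), profit
    return best, best_profit

chosen, profit = best_promotion_set(uplift, space, max_space=5)
print(sorted(chosen), profit)  # → ['biscuits', 'milk', 'tea'] 360
```

Note that the greedy-looking choice (promote coffee, the biggest single uplift) is not in the best answer; this is exactly why optimization beats intuition as problems grow.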

Optimization techniques have been around since World War II, when the Allies used them to maximize the output of their factories, but they are still underutilized by most businesses. They are an area where it is possible to get a huge increase in profits, or decrease in costs. We would be happy to talk to you about whether there are any opportunities for optimization to be used in your business.

Validation and verification

Validation and verification are the most important aspects of any analytical modelling exercise, but they are so often overlooked. We constantly see complicated models in businesses that might as well be complicated random guesses.

Many modelling techniques, such as regression, have a tendency to be "overfit". This means that the model's parameters match the data it was built on very well, but don't generalise to other data, which is the whole point of having the model! Carefully calibrating a model so that it is not overfit is more important, and harder (since some software doesn't allow for it), than building the model in the first place.
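The standard guard is to hold some data back and compare models on it. The sketch below, with invented numbers, pits a straight-line fit against a model that simply memorises its training points: the memoriser looks perfect in-sample and falls apart on the holdout data:

```python
def fit_line(xs, ys):
    """Least-squares straight line y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def mse(model, xs, ys):
    """Mean squared error of a model over a dataset."""
    return sum((model(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

# Invented, roughly linear data, split into training and holdout sets.
train_x, train_y = [1, 2, 3, 4, 5], [2.1, 3.9, 6.2, 7.8, 10.1]
test_x, test_y = [6, 7, 8], [12.0, 14.1, 15.9]

a, b = fit_line(train_x, train_y)
line = lambda x: a * x + b

# An extreme "overfit" model: memorise the training points exactly,
# and answer with the nearest memorised point everywhere else.
memory = dict(zip(train_x, train_y))
overfit = lambda x: memory[min(memory, key=lambda k: abs(k - x))]

print(mse(overfit, train_x, train_y) < mse(line, train_x, train_y))  # True: perfect in-sample
print(mse(line, test_x, test_y) < mse(overfit, test_x, test_y))      # True: only the line generalises
```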

All models are wrong, but some are useful.
--George Box

The other side of things is that models go "stale". They constantly need to be refreshed with more recent data, and often rebuilt from scratch as the environment changes and their assumptions no longer hold. A good analytical programme tracks the performance of a model over time, to identify when it goes outside its expected tolerances.
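One simple way to track this, sketched with invented forecast-versus-actual numbers, is a rolling-error monitor that flags the model once its recent average error exceeds an agreed tolerance:

```python
from collections import deque

def drift_monitor(tolerance, window=4):
    """Returns an observer that keeps the last `window` absolute errors
    and flags the model once their rolling mean exceeds the tolerance."""
    errors = deque(maxlen=window)
    def observe(predicted, actual):
        errors.append(abs(predicted - actual))
        return sum(errors) / len(errors) > tolerance
    return observe

observe = drift_monitor(tolerance=5.0)
readings = [(100, 102), (110, 108), (120, 131), (125, 140)]  # invented data
print([observe(p, a) for p, a in readings])  # → [False, False, False, True]
```

When the flag trips, that is the cue to refresh or rebuild the model rather than keep trusting its output.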

The lack of verification and validation is the single biggest reason why organisations don't get the level of results they want from their analytical activities.

We are always happy to chat

If you think analytics could help you, or you have some data that you don’t think you’re getting maximum value from, we can help. We’re happy to come and have a chat about the challenges you are facing, and give you some ideas about how we, or analytics, might be able to help.