One brewery in Australia has tapped into the power of big data analytics to improve its flagship beverage further. Source: Shutterstock

How one brewery tapped into big data to improve its ale

DIGITAL transformation is bringing massive changes to businesses across all industries, providing technology-driven solutions to traditional problems faced by enterprises.

It should come as no surprise, then, that non-technology-focused industries are also hopping on the digital transformation bandwagon to optimize their operations and business models.

One such company did just that when it decided to tap into the power of big data analytics to improve and maintain the consistency of its proprietary ale recipe.

Faced with seasonal variation in its ingredients, Coopers Brewery often struggles to produce its flagship ale with the required uniformity.

So the brewery, with the support of the Australian government, partnered with the Data to Decisions Cooperative Research Centre (D2D CRC) to use big data analytics to measure the impact of ingredient inconsistencies on the final quality of its ale, according to one report.

“The bottom line is, we need to understand, manage and control the inputs and processes so as to ensure not just the quality of the product, but that it is always the same, day-in-day-out, year-in-year-out.”

“The challenge for all manufacturers is how to do this when there will always be ingredient seasonality and variations to the brewing process,” Coopers supply chain manager Dr. Jon Meneses was quoted as saying.

To overcome the problem, Meneses’ team identified ten metrics that affect the quality of Coopers’ brew, both positively and negatively, to gain valuable insight into the process.

“Getting the data is easy; turning it into knowledge and deliverables is the big challenge,” Meneses added.

According to D2D CRC’s lead data scientist Dennis Horton, the complexity of the brewing process makes it hard to attribute subtle differences in the end product to variations in the ingredients used to make it.

It took Horton’s team eight weeks to develop an algorithm to fine-tune the brewing process and guarantee a quality product, including a firm, long-lasting head of foam.

Developing the algorithm required combining and analyzing data collected from the various stages of brewing the ale, to gain a deeper understanding of the production process.
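
Neither Coopers nor D2D CRC has published details of the underlying data pipeline, but conceptually this first step amounts to joining per-batch records from each brewing stage into a single table. Below is a minimal sketch in Python with pandas; every stage name, column, and value is a hypothetical stand-in, not real Coopers data:

```python
import pandas as pd

# Hypothetical per-batch measurements from two brewing stages,
# plus a final quality score. Real stage names and metrics are unknown.
malting = pd.DataFrame({
    "batch_id": [101, 102, 103],
    "grain_moisture_pct": [4.1, 4.6, 4.3],
})
fermentation = pd.DataFrame({
    "batch_id": [101, 102, 103],
    "yeast_viability_pct": [92.0, 88.5, 90.1],
    "fermentation_temp_c": [20.1, 21.4, 20.8],
})
quality = pd.DataFrame({
    "batch_id": [101, 102, 103],
    "head_retention_s": [210, 175, 198],  # e.g. how long the foam head lasts
})

# Join the stages on batch_id so each row describes one brew end to end.
batches = malting.merge(fermentation, on="batch_id").merge(quality, on="batch_id")
print(batches)
```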

The algorithm ultimately managed to identify the ten most significant factors out of hundreds of different brewing conditions and components.
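
The report does not say which technique D2D CRC used to rank the candidate factors. As an illustration only, one common way to surface the most significant of hundreds of measured conditions is to fit a tree ensemble to historical batch data and inspect its feature importances; the data below is synthetic and the factor names are invented:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in: 500 brews, 300 measured conditions/components.
n_batches, n_features = 500, 300
X = pd.DataFrame(
    rng.normal(size=(n_batches, n_features)),
    columns=[f"factor_{i}" for i in range(n_features)],
)
# Synthetic quality score driven by a handful of the factors plus noise.
y = (
    2.0 * X["factor_3"]
    - 1.5 * X["factor_42"]
    + 0.5 * X["factor_7"]
    + rng.normal(scale=0.5, size=n_batches)
)

# Fit a forest and rank factors by how much each one reduces prediction error.
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
top_ten = (
    pd.Series(model.feature_importances_, index=X.columns)
    .sort_values(ascending=False)
    .head(10)
)
print(top_ten)
```

On real brewery data, such a ranked list would be a starting point for process experiments rather than a final answer.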

“These results provide valuable insights into the brewing process and can help Coopers manage the quality of the final product despite the complex nature of brew settings and ingredients.

“The results both confirmed their previous ideas of the important components as well as revealed other factors that impact on final quality,” Horton said.

Though Coopers has not permanently adopted the solution across its broader operations, the brewery is keen to tap into the promise of big data analytics to further transform its offerings.

“While we have a great product, we are always looking for ways to improve and do things better,” Meneses said.

In recent times, the food and beverage industry has embraced emerging technologies in significant ways, adopting big data analytics in particular to optimize productivity and improve product quality.

Since recipe repeatability is key to maintaining brand loyalty and repeat business, data analytics, complemented by technologies such as AI, can provide the intelligence and insights enterprises need to develop more robust recipes and methods.