Reconsidering Analytic Investment in a Recessionary Economic Cycle

Yellowbrick | Hyoun Park
5 Min Read

I recently had the pleasure of kicking off Yellowbrick’s Tiki Talks, where I spoke with Yellowbrick Chief Operating Officer Jason Snodgress and veteran Silicon Valley CFO Adriel Lares about the key financial and economic challenges companies are facing in 2023 and how technology fits into the CFO and executive business perspective. We covered a wide variety of strategic topics in the Bay Area, including the challenges of analytics, the importance of predictable growth and rapid payback periods for new investments, the end of “easy mode” for financial growth, why IT cost management is a team sport, the importance of seeing headcount as talent, and the need for the CFO office to reduce costs. It is a wide-ranging discussion that will help you plan for the year.

In that discussion, one of the biggest topics that came up was how to help the CFO save money in their tech portfolio. This topic was especially timely, given recent tech layoffs and hints of recessionary economic activity. At first glance, the concept seems fairly straightforward, as one would expect the CFO to want to cut costs wherever possible. But one of Lares’ important points was that, as an executive, he is always looking across all aspects of the company: for a cost-cutting opportunity to matter, it must be significant to the business’s bottom line and reinforced with a credible list of actions based on existing budget, resources, and subject matter expertise.

As a result, we spoke about three tiers of savings opportunities. First, when companies talk about tweaks that cut 3-5% off an existing bill, those reductions are appreciated but are typically seen as simply doing your job, not something that gets the C-suite excited. Cuts that require operational change or dedicated resources must deliver significantly larger savings to justify the effort.

I brought up Amalgam Insights’ IT Rule of 30, which states that any unmanaged IT spend category typically carries 30% waste that can be optimized through a combination of invoice management, payment management, service optimization, and contract negotiations. Although this was seen as an interesting opportunity in our conversations, the multi-departmental coordination and timeframes needed to fully capture that 30% in savings were seen as potential challenges. Applications running in the cloud are also typically limited to roughly 30% savings, as public cloud services are often built with a 30% gross profit margin in mind. To achieve greater savings, businesses need to make underlying changes to the technical architecture, such as software replacements, cloud migration, or data policy changes.
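
To put rough numbers on these tiers, here is a minimal back-of-the-envelope sketch in Python. The $2 million annual spend is a purely hypothetical figure; the percentages are the ranges discussed in this piece, including the legacy-replacement tier covered in the next section.

```python
# Illustrative savings math for the three tiers discussed in this article.
# The $2M annual spend is hypothetical; the percentages come from the text.

annual_spend = 2_000_000  # hypothetical annual spend in one IT category, USD

tiers = {
    "Tier 1: billing tweaks (3-5%)": 0.04,
    "Tier 2: IT Rule of 30 optimization (up to 30%)": 0.30,
    "Tier 3: legacy replacement (70-80%)": 0.75,
}

for label, rate in tiers.items():
    print(f"{label}: roughly ${annual_spend * rate:,.0f} per year")
```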

Moving Legacy Business Technologies to the Cloud

For organizations running legacy technologies, especially those that are fully depreciated or no longer under formal vendor support, the cloud offers an option for future-proofing maintenance and upgrades. In this Tiki Talk, we spoke about the Big Hairy Audacious Goal of IT cost reduction: the 70-80% savings that can come from replacing obsolete legacy technology solutions. From a high-level perspective, it can be easy to look at IT areas such as analytics, compute, and storage, assume that a gigabyte is a gigabyte, and leave it at that. This is especially true if the legacy solution is a fully depreciated, on-premises software solution.

However, platforms that are poorly suited to an organization’s usage patterns, whether because the technology is outdated or because the business has transformed its data usage and analytic access patterns, can carry a punishing cost structure. When companies are thinking about analytics and data technologies that could be replaced to eliminate a majority of those costs, they should look at the following aspects as a starting point.

First, is there an internal technical resource who understands the technology well enough to describe the workloads and data being supported? And is this person able to translate technical requirements into the percentage change and dollar change that CFOs and other executives can recognize? To find big savings in IT, finance and IT executives need to find each other and work together.

Second, what are the hardware requirements for both the legacy solution and the new cloud solution? Legacy solutions may require hardware in discrete units far larger than the workload demands, rather than allowing the organization to pay to scale up at the smallest measurable unit of server or node. Paying on a per-server, per-node, or per-transaction basis, depending on the use case, may let the organization spend far less on hardware and infrastructure upkeep than the current analytics environment requires.
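
As an illustration of the discrete-unit problem, here is a short Python sketch comparing coarse and granular provisioning; the workload size, unit increments, and per-vCPU cost below are assumptions for demonstration, not vendor benchmarks.

```python
# Sketch: provisioning in coarse discrete units vs. granular units.
# All sizes and prices are hypothetical.
import math

workload_vcpus = 10        # what the workload actually needs
cost_per_vcpu_year = 500   # hypothetical fully loaded cost per vCPU per year

legacy_unit = 32           # legacy servers sold only in 32-vCPU increments
granular_unit = 2          # cloud nodes scale in 2-vCPU increments

legacy_vcpus = math.ceil(workload_vcpus / legacy_unit) * legacy_unit        # 32
granular_vcpus = math.ceil(workload_vcpus / granular_unit) * granular_unit  # 10

print(f"Legacy: {legacy_vcpus} vCPUs provisioned -> ${legacy_vcpus * cost_per_vcpu_year:,} per year")
print(f"Granular: {granular_vcpus} vCPUs provisioned -> ${granular_vcpus * cost_per_vcpu_year:,} per year")
```

Even at identical unit pricing, the coarse increment more than triples the provisioned footprint for the same workload.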

As companies look at changing their pricing options for analytic technologies, they need to be aware that the cost governance and cost tiers associated with the technology expense may differ from their existing spend patterns. For instance, something like “1.2 terabytes” may be priced very differently on a linear per-gigabyte basis than on a node basis, where crossing 1 terabyte may suddenly double your cost. The variability of cost tiers and the differences between best-case and worst-case scenarios require due diligence both in evaluating the current TCO of existing solutions and in direct vendor comparisons for net-new or replacement solutions.
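
A quick worked example shows how that same 1.2 terabytes prices out under the two models. The rates and node size below are hypothetical and deliberately chosen so the models cost the same at exactly 1 terabyte:

```python
# Sketch: "1.2 TB" under two hypothetical billing models.
import math

data_tb = 1.2

# Model A: linear per-gigabyte pricing
per_gb_month = 1.80                     # hypothetical $/GB per month
linear_cost = data_tb * 1_000 * per_gb_month

# Model B: node-based pricing, where each node covers 1 TB
node_capacity_tb = 1.0
node_price_month = 1_800                # hypothetical node cost per month
nodes_needed = math.ceil(data_tb / node_capacity_tb)  # 1.2 TB -> 2 nodes
node_cost = nodes_needed * node_price_month

print(f"Linear per-GB: ${linear_cost:,.0f} per month")
print(f"Node-based ({nodes_needed} nodes): ${node_cost:,.0f} per month")
```

At exactly 1 terabyte the two models match; at 1.2 terabytes the node model already costs roughly 67% more, which is exactly the kind of cliff this due diligence should surface.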

Third, what is the cost per query for the analytics solution? When it comes down to supporting end-user requests, is the query running on efficient code, and how does the solution request or cache data to serve those requests? This is often a “hidden” cost for analytic solutions, as analytics can be supported at multiple compute layers with different coding languages or libraries that seem obscure to the non-technical manager but can result in expensive bills for data-savvy organizations. To avoid these hidden costs, check whether your current analytic investments require a middle layer of caching or transformation that drives the total cost of ownership above and beyond the direct cost of a query.
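
The gap between the direct and the true cost per query can be sketched in a few lines of Python; the monthly figures below are hypothetical placeholders for a warehouse, a caching tier, and the transformation pipeline feeding it.

```python
# Sketch: effective cost per query once hidden middle layers are included.
# All monthly figures are hypothetical.

queries_per_month = 500_000

warehouse_cost = 15_000   # direct monthly compute cost of the analytics engine
cache_layer_cost = 6_000  # middle caching/transformation tier, often unbudgeted
pipeline_cost = 4_000     # transformation jobs feeding that tier

direct_cpq = warehouse_cost / queries_per_month
true_cpq = (warehouse_cost + cache_layer_cost + pipeline_cost) / queries_per_month

print(f"Direct cost per query: ${direct_cpq:.4f}")
print(f"True cost per query:   ${true_cpq:.4f} ({true_cpq / direct_cpq - 1:.0%} higher)")
```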

Fourth, what level of operational flexibility does the current analytic solution provide? Is it wedded to a specific type of hardware or to a proprietary language that is rarely taught in current degree and certificate programs? Can it be moved to the public cloud in emergency or peak situations? What kinds of technical debt does your current solution carry, and is that debt consuming time and resources at a moment when the business is being asked to be leaner and more productive?

My firm, Amalgam Insights, finds that this due diligence and teamwork frequently uncover in-house technology solutions that have outworn their functional value and whose current total cost of ownership can be cut by 50-80%. In addition to TCO reductions, analytic modernization can also open up data monetization and data productization opportunities, as data and analytic outputs can be delivered more easily to internal departments, external partners and channels, and direct customers across end-user devices and applications. These changes can be made in time to affect the bottom line in 2023, but they require a commitment to modernize rather than simply accept the status quo. In light of current economic and financial trends, it is time to review data and analytics environments and use this crib sheet to identify potential opportunities to improve.
