Building AI Strategies for Businesses


Background – Spearheading AI Strategy at an Established Bank

Prior to founding Data-Centric Solutions, I spearheaded the development, execution, and implementation of the AI strategy at an established bank. Although relatively green in delivering at such scale, I accepted the job with enthusiasm. Dispelling the initial feelings of imposter syndrome, I reassured myself that the worst-case scenario would involve making a few mistakes, learning some lessons, and growing professionally.

It became abundantly clear that AI strategy was significantly more intricate than I had initially perceived. Throughout my career, I have had the good fortune of being involved with both successful and unsuccessful strategies. I want to share some of the valuable insights I gained, and hope this can serve as a foundation for others to build on.


Inspired by Wardley Mapping

I initially discovered Wardley maps on the recommendation of a trusted mentor. For those unfamiliar, Wardley maps are an approach to system mapping. As the name suggests, the method was developed by Simon Wardley.

Initially, I was dubious, but the more I delved into it, the more I recognised the value. I saw that they were not only a fantastic way to articulate strategy but also an indispensable tool.

What convinced me was Wardley's chess analogy. As an avid chess enthusiast, playing whenever I can steal a moment, I found the analogy striking. Chess, while much simpler than business, is a strategic game. Wardley highlighted that the ability to formulate and implement strategies in chess comes from the presence of a map, which is the board itself!

If you wish, you can read Wardley's chess analogy in full; like me, you may find yourself persuaded to pursue the approach!


What Constitutes a Map?

According to Wardley, these six components are essential for any map to be useful:

  1. Maps are visual, which is undeniably the case with a chessboard.
  2. Maps are contextual. We acknowledge that the chessboard pertains to the game of chess.
  3. Maps enable comprehension of position. We arrange the board in a way that allows us to understand the placement of the pieces.
  4. Maps reference an anchor. The anchor is intrinsic to the board, providing us with a sense of direction based on the board's position relative to the players and the pieces.
  5. Maps allow comprehension of movement. The design pattern of the chessboard gives us an understanding of movement.
  6. Maps contain components. In the chess analogy the pieces are components.

Understanding Wardley Maps

Before we delve into the strategy maps I have created, I want to give the reader a basic intuition for Wardley Maps.

A brief definition

A Wardley Map should be interpreted as a strategic visualisation of a business's value chain or service, highlighting the maturity and evolution of its components. It aids in understanding dependencies, identifying opportunities, and making informed decisions about where to focus innovation, resources, and development efforts.

Let's detail the attributes of a Wardley map (a small illustrative sketch pulling them together follows at the end of this section):

Anchor: In each of our maps, the anchor is the customer. The position of all the components on a map is relative to the customer.

Components: Components represent the technology, practices, or activities carried out to serve customer needs. Linked components depend on each other in some way, primarily through enablement. Capital, data, and risk can flow between linked components.

For the purposes of AI strategies, I have defined the following components:

  • Intelligent App: The AI-powered application served to the customer. I have kept this general, but the reader can imagine any type of AI-powered application. This is the most visible part of the value chain and is therefore closest to the customer on the map.
  • Machine Learning Operations (MLOps): The practices, staff, and technologies that enable the management of machine learning/AI models in production. For a more detailed explanation read this.
  • Research & Development (R&D): The practice of discovering and prototyping novel AI/ML models. These activities are often conducted in a lab or ‘sandpit' environment.
  • Data Management: Data management involves the organisation, storage, retrieval, and maintenance of data in an efficient, secure manner to ensure accessibility, reliability, and timeliness for its users.
  • Infrastructure: Refers to the technical IT infrastructure that all other components are built on.

Visibility: The position of a component is determined by the y-axis, which represents visibility along the value chain. The closer a component is to the customer, the more visible it is.

Evolution: Movement happens along the x-axis, which represents the stage of evolution (maturity) of a given component on the map. Components can and will evolve over time due to external forces.

There are four stages of evolution for any component:

  • Genesis: Novel, unique components or technologies are created.
  • Custom-Built: Components are better understood but still require customisation for each use.
  • Product: Components become standardised, turning into widely-distributed products or services.
  • Utility: Components are highly commoditised and delivered as utility services focused on efficiency and cost-reduction.
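
To pull these attributes together, here is a minimal, illustrative sketch in Python of the information a Wardley map encodes: each component has a name, a visibility relative to the customer anchor, an evolution stage, and links to the components it depends on. The component names mirror the ones defined above; the numeric positions and dependency links are my own assumptions for illustration, not something prescribed by Wardley mapping itself.

```python
from dataclasses import dataclass, field
from enum import IntEnum
from typing import List


class Evolution(IntEnum):
    """The four stages of evolution along the x-axis."""
    GENESIS = 1
    CUSTOM_BUILT = 2
    PRODUCT = 3
    UTILITY = 4


@dataclass
class Component:
    """A single component on the map, positioned relative to the customer (the anchor)."""
    name: str
    visibility: float     # y-axis: 1.0 = most visible to the customer, 0.0 = least visible
    evolution: Evolution  # x-axis: stage of maturity
    depends_on: List[str] = field(default_factory=list)  # linked components it relies on


# A rough phase 0 layout: the intelligent app is most visible, infrastructure least.
phase_0_map = [
    Component("Intelligent App", 0.9, Evolution.GENESIS, ["MLOps"]),
    Component("MLOps", 0.7, Evolution.GENESIS, ["R&D", "Data Management"]),
    Component("R&D", 0.6, Evolution.GENESIS, ["Data Management"]),
    Component("Data Management", 0.4, Evolution.CUSTOM_BUILT, ["Infrastructure"]),
    Component("Infrastructure", 0.2, Evolution.PRODUCT),
]
```

In practice you draw the map rather than code it, but thinking of it as structured data like this makes it easier to compare how the same components sit at different phases.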

Now that we have a basic understanding of the maps, let's explore the AI landscape together!

Note: This is a simplified take on Wardley mapping, but it's sufficient to understand the maps I have created for this article. If you're curious to learn more, I urge you to read the full blog.


AI at Phase 0: The Land of the Dinosaurs

I was recruited into a phase 0 business to construct an AI strategy. We had a plethora of legacy infrastructure, limited access to a modern tech stack, and scant in-house expertise for building intelligent applications. I suspect many readers can relate to phase 0.

Please take some time to study the map for this phase 0 company.

Infrastructure: There was an abundance of legacy, on-premises, tech infrastructure. These were products that had been constructed by third parties decades ago, most of which were no longer supported except by a select few very expensive experts. Nobody really understood them, and people were hesitant to interfere with them for fear of causing significant problems. They were practically safeguarded by IT to prevent a "meltdown". The legacy infrastructure was reliable, static, and decidedly not conducive to innovation.

Data Management: The data management systems were built on top of the legacy infrastructure. Generations of analysts and engineers had constructed custom data pipelines, table views, dashboards, and reporting layers that were mostly undocumented. Our on-premises data warehouse was satirically referred to as "the data swamp". It was pretty much the wild west of data: a dashboard you had built could fail at random because someone had decided to alter a pipeline without notifying anyone. The swamp data could not be trusted.

Every so often, I'll hear people describe free-form text fields or CSV files as messy data. On the scale of messiness, what I'm describing here is akin to an oil spill; by comparison, free-form text entry is spilt milk.

The Phase 0 AI strategy

We recognised that R&D, MLOps, and intelligent apps could not outpace the evolution of our data management. To get an AI strategy off the ground, we first needed to address this.

As a Data Science function, we needed more control over the data we required for prototyping. We leveraged an off-the-shelf analytics tool that allowed us to connect to the disparate data sources feeding the data swamp. Limited storage and compute meant we had to use statistics to sample the data sources appropriately, and data quality checks were implemented with open-source profiling tools. Effectively, we created a custom data management solution for R&D purposes.
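
As a rough illustration of that makeshift approach, here is a minimal sketch, assuming pandas and a made-up extract file, of the kind of sampling and profiling checks we relied on. It is not the actual tooling we used at the bank, just the shape of the idea.

```python
import pandas as pd


def sample_source(df: pd.DataFrame, n: int = 50_000, seed: int = 42) -> pd.DataFrame:
    """Take a reproducible random sample so limited storage and compute aren't overwhelmed."""
    return df.sample(n=min(n, len(df)), random_state=seed)


def basic_profile(df: pd.DataFrame) -> pd.DataFrame:
    """A crude data quality profile: null rate, cardinality, and dtype per column."""
    return pd.DataFrame({
        "null_rate": df.isna().mean(),
        "n_unique": df.nunique(),
        "dtype": df.dtypes.astype(str),
    })


# Hypothetical usage against an extract from one of the upstream sources:
# raw = pd.read_csv("upstream_extract.csv")
# sample = sample_source(raw)
# profile = basic_profile(sample)
# assert (profile["null_rate"] < 0.2).all(), "Too many missing values to trust this source"
```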

R&D began as discovery sessions where we spent time speaking to our customers, trying to understand where intelligent apps could benefit them. We developed frameworks to assess feasibility versus value, giving us a way to prioritise development initiatives.
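
The framework itself was lightweight; a minimal sketch of how feasibility-versus-value scoring can rank initiatives is below. The initiative names and scores are invented purely to illustrate the mechanics.

```python
# Hypothetical initiatives scored 1-5 on feasibility and value during discovery sessions.
initiatives = [
    {"name": "Churn early-warning model", "feasibility": 4, "value": 5},
    {"name": "Document classification for operations", "feasibility": 5, "value": 3},
    {"name": "Real-time fraud scoring", "feasibility": 2, "value": 5},
]

# Rank by the product of feasibility and value, breaking ties in favour of the quicker win.
ranked = sorted(
    initiatives,
    key=lambda i: (i["feasibility"] * i["value"], i["feasibility"]),
    reverse=True,
)

for item in ranked:
    print(f"{item['name']}: score = {item['feasibility'] * item['value']}")
```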

With our makeshift data management approach and data labs, we managed to develop a few prototype applications. However, from an AI strategy perspective, we had hit a bottleneck. We had exhausted all our avenues; the next logical step was to evolve beyond phase 0.


Evolution of Infrastructure & Data Management

I would like you to imagine for a moment that you're the Chief Technology Officer (CTO) of a phase 0 business. How would external development in technology impact your business?

Point 1 – The first obvious one revolves around technical infrastructure, which has been commoditised by cloud providers. Compute and storage are available as a utility.

Point 2 – Consequently, tools and approaches for data management have also evolved. At the time of writing, there are popular cloud-native data lakes that can accommodate a wide range of data management requirements, although these aren't quite commoditised yet.

Taking this into consideration, a savvy CTO could represent the future state of their Phase 0 business as displayed in Figure 3.

As the CTO keeping an eye on competitors, you would be able to anticipate that smaller, more agile companies may already be cloud native. This would give them the ability to invest in R&D activities and MLOps, as the more evolved state of their data management and infrastructure makes this possible and potentially lucrative. The map would allow you to anticipate how your competition could steal your market share by developing intelligent apps at a faster pace than you can. You could make a case for why investment in infrastructure and data management is critical.

You might also anticipate that there could be some resistance to evolution, be it due to internal friction or previous success with your own custom-built solutions. In my phase 0 experience, the data swamp was indeed beloved by some. There were political forces that would slow down migration to a more efficient data management approach.

The map displayed in Figure 3 helps you see that, in order to advance your AI strategy, you should be prioritising the evolution of your technical infrastructure and data management. This is exactly what happened at my phase 0 bank: the strategy moved from experimental AI to maturing the technical infrastructure and data management approaches.


AI at a Phase 1 Business: Tech Savvy, Chaotic, & Enthusiastic

I had the good fortune to be part of a strategy for a phase 1 bank. I feel this is where most data scientists fresh out of their PhDs or master's courses envision themselves landing. This bank had already outsourced their technical infrastructure to cloud providers. They had also leveraged "productised", cloud-based data management solutions.

No company is perfect, but this honestly was worlds apart from the phase 0 bank, and a lot of fun for me personally. There were hundreds of data scientists employed to explore novel areas of ML and AI. I lost count of how many times I was in awe at what I saw in the innovation sessions. However, there were some big problems on the horizon.

AI, just an expensive science experiment?

I remember sitting in a meeting with senior members of the innovation team. Each of us was scratching our heads, trying to figure out how we could get all these fantastic models to production. All these prototypes were brilliant, but honestly they were delivering no value to customers. We might as well have been creating expensive science experiments. At this point, I want you to imagine you're the CTO or head of innovation. How would you tackle the production problem?


Wardley Map: Evolution of AI – Phase 2

At the time of writing, cloud providers, scale-ups, and start-ups have been working on the problem of productionising AI. Some providers promise to handle all of the complexity involved with deploying machine learning models, while others promise to handle the full model lifecycle from deployment, to monitoring, to retraining.

Taking this into consideration, the CTO of the phase 1 bank understood the need for evolution. The CTO opted to partner with a cloud provider to build a productised MLOps service. This would really help differentiate them from the competition as not many established banks were delivering intelligent applications at scale. But it wasn't as simple as building new tech.

From a strategic perspective, running a phase 2 business requires organising teams in a way that is completely different from the phase 1 business. The types of people required across the business are quite different. Let's take a slight detour to understand this.

A Brief Detour to Team Attitudes

Wardley mapping gives us the concept of attitudes. Attitudes help us conceptualise how teams should work across different stages of evolution. Here's a quick rundown of the three attitudes:

  • Pioneers: Pioneers are inventive individuals who explore uncharted territory and make future success possible through core research, despite frequent failures.
  • Settlers: Settlers are practical visionaries who transform prototypes into profitable products, bridging the gap between the possible and the actual through applied research and differentiation.
  • Town Planners: Town Planners are the efficiency experts who industrialise products, maximising economies of scale and making innovations accessible, reliable, and affordable for the masses through industrial research.

Leading a Phase 2 Business

At phase 1, the bank had two types of employees: the pioneers and the town planners. Pioneers were the data scientists conducting R&D and building cool prototypes. Town Planners managed the IT infrastructure. During my time with this bank, the discussions were very much about how to integrate the two camps. How might our Wardley map suggest we resolve this issue?

Bring in the Settlers

In the context of building intelligent apps, settlers are effectively your machine learning and data engineers. They are incentivised to turn research outputs into value. Being diligent CTOs that pay attention to external factors, we can see the need for moving from our custom-built MLOps platforms to a more standardised approach.

More standardised approaches require a shift from the pioneer to the settler. Once we understood this, we adapted our hiring approach to bring in the settlers. Where a lot of companies go wrong is in asking their R&D data scientists to become settlers. A successful transition needs both, and these attitudes tend not to exist in the same person.


AI at Phase 3: Pioneering the Unknown in AI Applications

Figure 7 maps out the evolution from a phase 2 to a phase 3 business. As CTOs, we could leverage the mapping approach to anticipate changes and refocus our strategies.

Point 1 – As the entire landscape of MLOps becomes utility-like, you could imagine a situation where models are consumed via third-party APIs.

Deployment: Instead of businesses having to develop their own infrastructure for deploying ML models, MLOps utility providers would offer seamless, scalable deployment solutions. This could include serving real-time predictions, batch scoring, or embedding models into applications.

Monitoring: ML models need to be monitored for drift in their performance over time or changes in input data. As part of the utility service, providers would track these metrics, identify issues, and alert businesses when models need to be adjusted or retrained.

Retraining: Models may require retraining as new data becomes available or if their performance degrades over time. MLOps utility providers could handle this process, retraining models on new data and updating deployed models with minimal disruption to the business operations.
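
To make the monitoring step concrete, here is a minimal sketch of the kind of drift check a utility provider might run behind the scenes. The use of a two-sample Kolmogorov–Smirnov test, the threshold, and the synthetic data are my own illustrative assumptions, not a description of any particular provider's service.

```python
import numpy as np
from scipy.stats import ks_2samp


def feature_has_drifted(reference: np.ndarray, live: np.ndarray, alpha: float = 0.01) -> bool:
    """Flag drift when the live feature distribution differs significantly
    from the reference (training-time) distribution."""
    _statistic, p_value = ks_2samp(reference, live)
    return p_value < alpha


# Synthetic example: live traffic has shifted upwards relative to the training data.
rng = np.random.default_rng(0)
training_feature = rng.normal(loc=0.0, scale=1.0, size=5_000)
live_feature = rng.normal(loc=0.4, scale=1.0, size=5_000)

if feature_has_drifted(training_feature, live_feature):
    print("Drift detected: alert the business and consider retraining.")
```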

Point 2 – Data management as a utility treats data handling like a service, akin to water or electricity. Companies subscribe to this service, which may include data storage, security, processing, and analysis. It allows businesses to focus on their core competencies, while the utility provider ensures quality data management, offering scalability, cost-effectiveness, and reliability.

The Role of Pioneers Revisited

Point 3 – At phase 3, the role of pioneers (data scientists) would once again come to the forefront. However, this time, they would be better equipped. Their explorations into unknown territories of AI would no longer be constrained by the need to simultaneously develop robust operational processes. Rather, they would be free to create and innovate. Capital would flow to this new area of value due to its high potential upside.

New markets and expertise

Point 4 – As AI applications become more pervasive and start affecting various aspects of society more deeply, the importance of AI ethics will also grow. It would become crucial to ensure that the AI applications being developed are fair, transparent, and respect privacy. Organisations would need to embed AI ethics into their innovation processes, from the design and development stage right through to deployment and monitoring. We might even see the creation of new customer needs and markets around AI ethics and safety.


Final Thoughts

As MLOps becomes a utility, the AI landscape is likely to experience significant shifts. The democratisation of AI, coupled with a focus on the development of innovative AI applications and the growing importance of AI ethics, points towards a future where AI is increasingly integrated into our daily lives. As we look forward to this future, the task for businesses is clear: prepare for the era of utility MLOps by fostering an innovation-oriented mindset, building robust teams of pioneers and settlers, and keeping an eye on the evolving landscape of AI ethics.

Finally, I would like to give credit to Simon Wardley for sharing this approach with the world.

Thanks for reading.


Follow me on LinkedIn

Subscribe to Medium to get more insights from me:

Join Medium with my referral link – John Adeojo

Should you be interested in integrating AI or data science into your business operations, we invite you to schedule a complimentary initial consultation with us:

Book Online | Data-Centric Solutions
