Solving an AI enigma: How defence and national security can unlock the AI ecosystem

By James Matthews, Clare Warren, Chris Miles

Wherever AI is finding uses, industry, academia, and governments are racing to keep up with the unprecedented technical and ethical challenges it brings. Nowhere is this more evident than in defence and national security departments – guardians of the state’s most extensive powers.

These organisations need to change rapidly, both to take advantage of AI’s potential and to combat adversaries who are unhampered by bureaucracy or supply chains, and who may operate within different legal and ethical frameworks. The emerging AI ecosystem is at a pivotal moment.

Initiatives like those led by Team Protect are already showing this in action. The PA-led consortium brings together the defence sector and industry to deliver the UK Ministry of Defence’s (MOD) CRENIC programme. The programme, which will provide the next generation of capabilities to counter threats posed by radio-controlled improvised explosive devices, illustrates how forward planning using ecosystems could enable greater integration of emerging tech like AI.

Lessons are already being learned, but which issues need to be prioritised? To seek answers, we recently hosted a networking event for stakeholders from the MOD, the national security community, and academia, along with subject matter experts, organised in association with the Alan Turing Institute as part of the 2024 AI UK summit in London. While many initiatives, programmes, and organisations support the uptake of AI within the sector, here are some of the key themes and solutions, not often highlighted, that emerged from our discussions.

Reframe relationships to level the playing field

Switching from a competitive to a collaborative way of working requires looking at how organisations are perceived, and reframing the value they bring. Academia isn’t just a source of research: academics can offer new perspectives on old problems, bring different and creative ways of working, and foster a sense of community between organisations.

Similarly, there are organisations that don’t fit into the traditional defence and security supply chains and partnerships, from tech giants like Google and Microsoft to AI start-ups and research institutes. Historically, many non-traditional organisations have seen defence and security as an inaccessible sector – one in which it is too difficult to sell their services, connect with the client’s stakeholders, and secure consistent funding.

Defence, industry, and academia must all reframe the value each side brings to the table to enable more innovative partnerships and ways of working. To encourage this kind of reframing, as well as make the ecosystem more attractive to newcomers, all players need to:

  • Demonstrate the unique value of their involvement aligned to outcomes
  • Provide consistent direction to build trust
  • Maintain clear and attractive routes to market
  • Develop a shared sense of purpose.

Listening to the needs of individual organisations, and responding dynamically to the opportunities they can provide, is key to success. Traditional defence and security companies have a role to play in providing a safe and collaborative environment to ensure the best ideas are leveraged to mutual benefit. In addition, the people influencing policy at the highest level should be pushing for trusted partners to be in the room as policy evolves, even if they aren’t yet well-known names within defence and security.

Accept how important it is to share data

Any system based on AI or machine learning can only ever be as good as the data it’s been trained on. So, an important part of reframing relationships will involve achieving a consensus on how closely data needs to be protected. A significant gap currently exists between the government’s and the MOD’s policies for protecting security-related information and industry’s desire to put as much of it as possible to use, as quickly as possible.

The keystone here is to break down barriers by changing from a ‘need to know’ approach to a ‘need to share’ one. This means starting a conversation between suppliers and users that goes in two directions – addressing how secure data really needs to be based on an informed risk assessment, while at the same time providing reassurance about how big the threat is to shared data.

The same conversation should address the fact that developers want to be able to experiment using data that is as close as possible to the real thing. Defence and security applications of AI can be very different to more mainstream machine learning models, so a secure, shared space where different stakeholders can work with real or representative data is essential.

Sourcing financially viable hardware capable of running platforms that demand massive computing power is also difficult. Parts like graphics processing units, which are essential to implementing the latest AI tools, are hard to acquire from UK suppliers. The dilemma at present is where to obtain them without relinquishing sovereignty over the system in which they are incorporated. Government needs to look at ways it can boost not just applications of AI in defence and other sectors, but domestic production of the equipment on which it operates.

Look beyond Defence and National Security

The MOD’s Defence AI Playbook has been designed to help industry collaborate with defence on AI development by illustrating the breadth of opportunities to realise AI’s potential benefits. It includes examples of AI currently in use or under development, and highlights some of the common challenges the defence industry faces in delivering these new capabilities.

The next step here is to recognise that defence is one component of a complicated national security matrix, which encompasses the security services and domestic law enforcement. It is important that where a challenge arises in one area, it is shared amongst this broad group, so a collaborative solution can be found that benefits multiple services. Establishing regular cross-service communications will also mean that work is not duplicated elsewhere. An example of this kind of collaborative problem-solving could be around approaches to talent management and creating career pathways between organisations, recognising the value this diversity brings.

Build and maintain an AI skills pipeline

UK industry has faced a perennial problem with finding people who possess the technical skills it needs. The broader skills and training issues are particularly significant within defence and security, where options to recruit from outside the country are constrained for good reasons.

Developing AI will require a new approach to nurturing skills within education and industry to widen the limited talent pool. While government and business can work together on this, organisations have to look at opportunities for reskilling within their existing workforce.

It’s not as easy as just upskilling the workforce with a three-month AI course. World-leading experts are needed to address complex challenges, and the defence and security industry must find a way to attract the best talent with deep technical expertise. While part of the solution does lie in learning and development offers, attractive career pathways, and competitive salaries, purpose and brand need to take on a bigger role. Looking at how the private sector uses brand purpose and positioning to attract talent can help defence and security services become a more sought-after AI career destination.

The MOD could also benefit from being more flexible about how it engages with industry to rotate the workforce around different roles. Secondments between the public and private sectors, which help stakeholders better understand the challenges and potential solutions, are already established in many instances, but they are not always as structured as they could be, resulting in missed opportunities and lost value.

Careers have to be attractive, not just in terms of big projects but in how AI is used in day-to-day business. The defence sector has to show how it is maturing its approach to responsible AI, assuring the existing workforce, and those it hopes to attract, that its AI capabilities align with their personal values. Policies on safe and responsible AI must be echoed by stakeholders across the industry, demonstrating that they will be adhered to and building trust that the outcomes of people’s work will be used in the way they were intended.

Learn from experience in other sectors

The speed with which the UK healthcare sector responded to the Covid-19 pandemic, particularly in developing and deploying effective vaccines at pace, shows how agile and adaptable industry and government can be when faced with a rapidly emerging challenge.

In the same way, the AI ecosystem can learn from how other industries in countries across the world are addressing urgent challenges. Whether it is tackling climate change and achieving net zero carbon emissions, or developing sustainable agricultural systems, issues that on the face of it may seem very different can provide valuable lessons on how to create effective partnerships.

Getting this right is about more than national security

AI is just one element of the digital transformation needed within the sector. As an ecosystem, we all have a role in ensuring the effective and appropriate use of AI, and together we need to develop the infrastructure to maximise the power of AI when it is the most suitable tool to use.

Participants in this discussion will have been aware of how fitting the association with the Turing Institute was. When it comes to using pioneering technology to tackle an unprecedented threat to national security, Alan Turing and his fellow Bletchley Park codebreakers are among the best-known figures in the public mind.

As well as solving an urgent problem, that wartime work provided one of the cornerstones for an AI industry which promises benefits far beyond just defence. With the right approach to collaboration, the partnership that we see emerging now can do the same for future generations.

About the authors

James Matthews, PA defence and security expert
Clare Warren, PA defence and security expert
Chris Miles, PA defence and security expert
