Our world is experiencing major-power conflict at a level not seen in the post-Cold War era. In response, European states have significantly increased their defence budgets. While the international security environment may have changed, at Epworth our ethical commitments remain the same. We will not invest in arms companies or in companies with significant exposure to the defence sector.[1] Where investee companies have only limited military exposure, we will engage in dialogue where necessary. Such engagement now appears increasingly important in the technology and AI sector.
The ‘Conflict’ pillar of Epworth’s ethical policies is currently under review, at a time when high technology and AI have assumed a major role in the conduct of military campaigns. The capabilities of large technology companies are in demand from governments and militaries to an unprecedented degree: household names such as Microsoft, Amazon and Google are being courted for their data-management and AI platforms. AI provider Anthropic recently hit the headlines following its disagreement with the Pentagon, after which the US administration reportedly attempted to designate the company as a ‘supply chain risk’. Meanwhile, several Epworth clients are engaged in advocacy on autonomous weapons and AI in conflict. What role can investors play in this developing area?
Before turning to the investor response, it is worth briefly outlining the current state of international norm development in this field. AI is increasingly being applied to:
- military surveillance
- target generation
- overall command and control
- lethal autonomous weapons systems (LAWS)
In recent years there have been useful intergovernmental dialogues on AI in the military domain, alongside multistakeholder discussions involving AI companies. One phrase to emerge from these discussions is the need for “context-appropriate human judgement and control.” However, unless such norms are backed by agreed regulation, they are unlikely to translate into meaningful constraints on action. To that end, in October 2025 the Interfaith Center on Corporate Responsibility (ICCR) in New York hosted the launch of an investor statement on autonomous weapons.[2] The statement endorses the UN Secretary-General’s call for a new international instrument to be negotiated through the United Nations.

Unfortunately, although perhaps unsurprisingly, governments have been slow to move towards international agreement. Accountability is not always politically convenient: human judgement and control make public accountability in the use of weapons systems possible, but they do not in themselves guarantee it. The tension is illustrated by the stance of the US Department of Defense in its dealings with Anthropic; some states would prefer companies simply to hand over powerful AI capabilities and leave the interpretation of guidelines to the military.
In the longer term, solutions must lie in embedding government accountability both in international agreements and in the doctrines of individual armed forces. In the meantime, corporations providing AI services to the defence sector must not opt out of responsibility for how these powerful capabilities are used.
Investors, as the ultimate owners and beneficiaries of leading AI providers, should voice their concerns, even where the issues are complex. At Epworth, we will continue to be active in this area through:
- direct engagement with companies on the moral, legal and reputational risks of military AI
- support for investor-led statements calling for public accountability in the military use of AI
- dialogue with our client base on these issues
We are committed to working with like-minded investors to ensure that, even when the international political environment appears difficult, our investee companies maintain their commitment to human rights and the avoidance of harm.
[1] The tolerance threshold for exposure to combat weapons is 5% of revenue. There is a zero-tolerance policy in relation to controversial weapons, including nuclear weapons.
[2] ICCR Fall Conference 2025: Grave New World