The Devil’s Advocate (DA) in Dutch military intelligence serves as an institutionalized form of critical reflection and quality control. The concept was introduced within the Defence Intelligence and Security Service (NLD DISS) in 2008 to enhance analytical rigor and counter groupthink by critically evaluating analytic products and providing contrarian perspectives. It has operated independently, reporting directly to the Director, while being closely connected to operational and analytical departments. Over time, its role has expanded from reviewing intelligence products to assessing organizational processes and analytical methodologies across the intelligence cycle.
The concept of a “devil’s advocate” has its origins in the Catholic Church, where the Advocatus Diaboli critically examined the evidence presented in canonization cases. This tradition of structured dissent served to prevent bias and ensure balanced judgment. Within Dutch military intelligence, the DA and his team fulfill a similar purpose by challenging assumptions, testing reasoning, and exposing weaknesses in analysis. When applied with care to avoid “contrarian fatigue” or outright resistance, its strength lies in encouraging alternative perspectives and reducing cognitive errors.
Academic research has convincingly shown that so-called authentic dissent – i.e. genuine critique based upon thorough investigation rather than staged opposition – stimulates creativity and better decision-making. The DA’s goal is not to prove an assessment wrong, but to test its logic and consistency. The DA helps balance the risks of false positives (seeing links that do not exist) and false negatives (missing weak but real signals) by ensuring analytical conclusions are robust.
From its inception in 2008, the DA and his team – small, autonomous, with full information access – reviewed finished intelligence products, emphasizing transparency and learning over punishment: DA reports are discussed with analysts to strengthen analytical reasoning. Initially, the DA conducted dozens of reviews annually, often applying additional methods such as scenario exercises, contrarian analyses, and the introduction of competing hypotheses. These efforts aimed to instill a culture of reflective professionalism, thereby reducing groupthink and enhancing the quality of intelligence analysis.
After this initial phase, the DA’s scope widened. It began assessing the organization’s overall self-reliance – the extent to which its intelligence products relied on information provided by foreign allies – the effectiveness of the analytical methods in use, and organizational processes. The office also contributed to internal training programs and helped establish an academic intelligence curriculum at the Netherlands Defence Academy. From 2012, as NLD DISS faced budget constraints, the DA was tasked with designing a system that linked (budgetary) resources to intelligence requirements. A “quantification matrix” and a customer feedback cycle allowed leadership to align input (including its quality and usefulness), throughput, and output – closing the loop between what was needed, produced, and delivered. This also supported decision-making on prioritization.
As the DA expanded its scope into organizational assessment, tensions arose. Some departments viewed its findings as management oversight rather than peer review. Despite this, consistent support from senior leadership safeguarded its existence and effectiveness. By the mid-2010s, the DA had evolved into a recognized means of quality assurance within NLD DISS. Its main challenge became retaining access to data (systems) and maintaining its relevance in an era of increasing data complexity.
Intelligence processes today depend heavily on the automated processing of huge data streams. Traditional DA reviews – focused on written assessments – are insufficient for evaluating the ICT systems and algorithmic tooling that analyze vast datasets. The “black box” nature of AI introduces new risks of bias, false correlations, and misplaced confidence in machine outputs. Therefore, in addition to reviewing intelligence products and processes, the DA has started to scrutinize data inputs, data models, and algorithms.
Team composition and leadership play a central role in maintaining DA-quality. Cognitive and disciplinary diversity is valued for strengthening critical review and avoiding analytical tunnel vision. Leadership is facilitative rather than directive. Team members are expected to work autonomously while maintaining collective accountability – a balance that allows for creativity.
Communication is a crucial part of the DA’s effectiveness. The team’s work continues after the completion of an investigation: presenting findings, engaging with analysts, and ensuring that conclusions are understood and used are essential steps. Dialogue with analysts increases transparency and helps prevent resistance to critique. Formal briefings, ‘roadshows’, and personal discussions complement written reports. Keeping a “paper trail” supports institutional learning and accountability while also demonstrating that challenges are evidence-based and professional. Successful engagement depends on credibility, openness, and the ability to balance independence with collaboration. Transparency about methodology and criteria strengthens legitimacy and reduces defensiveness among colleagues.
Since 2008, the DA concept has evolved from reviewing human judgment to overseeing hybrid analytical ecosystems in which human reasoning and machine algorithms interact. The current and future DA will question not only what intelligence says but also how it was produced. In a world dominated by automation and information overload, its critical role – as a guardian of analytical integrity – remains vital.
The Dutch DA’s development illustrates how institutionalized dissent enhances the credibility and resilience of intelligence work. By systematically questioning assumptions, it helps prevent analytical complacency and strengthens decision-makers’ confidence in intelligence outputs. However, its long-term value depends on adaptability, e.g. by acquiring technical literacy to review complex, data-driven systems. This poses significant new challenges for the DA.
The Dutch experience demonstrates that dissent, when institutionalized constructively, is a sign of strength rather than disunity. By combining professionalism, transparency and independence, the Devil’s Advocate system has become an enduring mechanism for learning, adaptation, and trust within Dutch military intelligence.
Alexander Claver
Dr., Devil’s Advocate
Defence Intelligence and Security Service (NLD DISS)
The Netherlands
