Abstract

Background

This article outlines the methods being used to evaluate a community-based public health intervention. The evaluation approach recognizes that not only is the intervention, Healthy Families NZ, complex, but so too are the social systems within which it is being implemented.

Methods

To address challenges related to complexity, we discuss three developing areas within evaluation theory and apply them to an evaluation case example. The example, Healthy Families NZ, aims to strengthen the prevention system in Aotearoa/New Zealand to prevent chronic disease, and is being implemented in 10 different geographic areas. Central to the evaluation design is the comparative case method, which recognizes that emergent outcomes are the result of 'configurations of causes'. 'Thick', mixed-data case studies are developed, with each case considered a view of a complex system. Qualitative Comparative Analysis is the analytical approach used to systematically compare the cases over time.

Conclusions

This article describes an approach to evaluating a community-based public health intervention that considers the social systems in which the initiative is being implemented to be complex. The evaluation case example provides a unique opportunity to operationalize and test these methods, while extending their more frequent use within other fields to the field of public health.

Background

This article describes the methods being used to evaluate a community-based public health intervention. The approach described recognizes that not only is the intervention, Healthy Families NZ, complex, but so too are the social systems within which it is being implemented. This article contributes to the argument that this distinction, about where the complexity lies, is significant for selecting the most appropriate evaluation methods to use. 1–3

'Complexity theory' underpins the understanding of social complexity conveyed in this article. Complexity theory was developed during the 1980s. Although new as coherent theories, complexity theory and its predecessor, chaos theory, were the culmination of more than a century of scientific ruminations, experimentation and theorizing on the behaviour and organization of matter. 4–6 Since the 1990s, complexity theory has increasingly influenced thinking across a range of social sciences. 7,8 This influence has been separate from, but mutually reinforcing with, various systems theories, which have a longer history of influence through operational research and systems design. 9

Within public health a trickle of interest in the 1990s has turned into a river—with the theory and its utility becoming more widely accepted. 10,11 Some have described these recent developments as the 'complexity turn', and argue that the theory offers the social sciences a valid framing for empirical studies into diversity and social change. 12–15 The basics of complexity theory as applied to social systems are now well described in multiple publications. 8,16–19 Fundamentally, complexity theory sees social phenomena, such as increasing rates of childhood obesity, 20 as the 'emergent' result of interacting elements within a social system. 8 The emergent social phenomena are real in that they have an impact on people, the systems that generated them, and other interacting systems. 21 For public health evaluation some implications of the theory are that it: offers a useful set of conceptual tools; 13,21 provides a valid way to theorize systems; 12 allows for contextual or specific understanding; 12,22 recognizes processes of emergence and different social levels; 12 and facilitates an understanding of both individual agency and social structure. 12–15,21

For public health interventions, arguments are made about whether the focus should be on the complexity of the intervention, or the complexity of the social system within which an intervention is implemented. 1 Such arguments are not purely academic. Recent research suggests that when evaluators perceive complexity as residing within the intervention (rather than in the surrounding systems), they consider a more limited role for complexity considerations within evaluation design. 23 The existing UK Medical Research Council guidance on evaluating complex interventions, while providing valuable guidance on evaluation, does not draw explicitly on an understanding of complex social systems. 24 Moreover, the authors have expressed scepticism that the theory of complex systems will prove to be useful, noting that there are few examples of successfully completed evaluations drawing explicitly upon complex systems theory that could provide guidance for evaluation design. 25

Experience from other evaluations of community-based public health interventions illustrates the limitations of their design or highlights their discontinuation before completion. Nonetheless, lessons have been drawn from substantial evaluation attempts. Reflections on the Health Action Zones (HAZ) in the UK, for example, suggest that policy failure led to their demise, and that a lack of appropriate planning to ensure comparable data across the HAZ sites made the evaluation less useful than it could have been. 26 Another example is New Zealand's obesity prevention strategy Healthy Eating Healthy Action (HEHA), which was abruptly ended by an incoming government, and with it the potential for learning from its evaluation. 27 In Australia, Healthy Together Victoria (HTV) and its evaluation suffered a similar fate. 28

Taking the perspective that public health interventions are located within complex social systems makes explicit some important challenges for the design of their evaluation. First is recognizing that evidence is contextual, 29 meaning factors such as history, resources and other features of the organization of the system need to be considered. A second challenge is the recognition that outcomes or impacts are emergent—resulting from non-linear interactions between systems and the components that make up the system. A third challenge, often articulated as a feature of 'wicked' problems 30,31 is that there will be differing perspectives on evaluative judgements that are made, including the criteria that should be applied. 32 To address these challenges, which are exposed by viewing social systems as complex systems, we discuss three important areas which are developing within evaluation theory and we apply them to an evaluation (of Healthy Families NZ) case example described below.

The first development draws upon both complexity theory and realist evaluation: Westhorp 33 considers that the theory of complex social systems can be used as a base upon which particular intervention theories are layered. This allows interventions that are not themselves complex to legitimately call upon theories of complex systems to inform their evaluation, without losing a focus on the theory of the intervention itself. The second development is the evolution of case study methods to research complex social systems, where causation is viewed as complex and contingent, and configurational approaches to causation are emphasized. 34–36 Case study methods also allow for context to be integral within the evaluation design, while configurational approaches to causation allow for the interaction of context, process and outcomes, rather than treating these as separate evaluation activities. Finally, the continued development of participatory approaches to evaluation brings to the fore the need to recognize and account for different perspectives within evaluative judgements. 37,38

Next, the evaluation design of the Healthy Families NZ initiative is outlined as a practical example of a complexity-informed approach to public health evaluation.

Case example—Healthy Families NZ

Healthy Families NZ is a government-funded initiative which aims to strengthen the prevention system in Aotearoa/New Zealand (Aotearoa/NZ) in order to prevent chronic disease. It was adapted from Healthy Together Victoria (HTV) in Australia and is focussed on bringing community leadership together to improve people's health where they 'live, learn, work and play'. Informed by systems thinking, the initiative aims to change systems and settings—particularly as they relate to the risk factors of harmful alcohol consumption, tobacco use and inadequate physical activity and nutrition—that influence health. Healthy Families NZ is being carried out in 10 different communities around Aotearoa/NZ. The significant components of the initiative in each community are: investment in a dedicated systems-thinking and acting health promoting workforce; activating local leadership through bringing together partnerships of key organizations and individuals who can influence transformational change; and building on existing action underway in the community to create an integrated, community-wide 'prevention system' for good health. Each Healthy Families NZ location team is housed within an existing local organization or collective. These 'lead providers' comprise local councils, sports trusts and iwi organizations (Iwi, or tribal groups, are traditionally the largest social units in Māori society. Each iwi generally has a recognized territory. Statistics New Zealand describes iwi as 'the focal economic and political unit of the traditional Māori descent and kinship based hierarchy'), selected through a competitive tender process.

The 10 communities involved in the Healthy Families NZ initiative are spread over the length of the country (Fig. 1). Together, the population potentially exposed to the Healthy Families NZ initiative is close to 1 million people. The 10 locations are, in general, in areas with higher than average rates of preventable chronic diseases, higher than average rates of risk factors for these diseases and/or high levels of deprivation. Figure 1 shows the locations and the name of the contracted lead providers (contracts were signed between the Ministry of Health and the lead providers in 2014).

Fig. 1 Healthy Families NZ locations and lead providers. Source: Massey University Evaluation Team. (2017) Interim Evaluation Report: Healthy Families NZ. Massey University, Wellington.

The practical challenges of evaluating Healthy Families NZ related to complexity include that each of the 10 locations is different in many ways (e.g. urban/rural, geographical spread, size of population and history of health promotion initiatives); that there are influences from the wider social/political environment on local activities, practices and policies; and that the main goal of the Healthy Families NZ initiative, preventing chronic diseases, is a long-term one, meaning the timeframe expected for change largely falls outside the contracted evaluation period (3.5 years).

The evaluation design

The design of the evaluation is summarized in Fig. 2. Central to the design is the comparative case method, which recognizes that emergent outcomes are the result of 'configurations of causes' 14,34,39 that interact in any direction and incorporate many elements. The strength of the comparative case study method is in being able to tease out configurations of causes in relation to identified outcomes. Comparison allows these elements and, most crucially, their interactions, to be explored across situations and contexts. 40 Furthermore, building case studies provides a way to group data and information in recognition that there will be relationships between the sources of data, while being explicit about the boundaries of what is being observed.

Fig. 2 Evaluation of Healthy Families NZ—design. Source: Massey University Evaluation Team. (2017) Interim Evaluation Report: Healthy Families NZ. Massey University, Wellington.

The first step in the evaluation, as shown in Fig. 2, is to determine baseline information (first view) through a process of case building. Overall, 10 descriptive case studies representing the 10 Healthy Families NZ locations are being constructed, along with a National perspective case study.

Two years later, descriptive case studies are again developed (second view). Comparison is initially made within locations between the two points in time (first view and second view), followed by comparison across the case studies using Qualitative Comparative Analysis (QCA). 38,39,41,42

The case study development

The descriptive case building of first view and second view utilizes a number of data sources and collection methods, outlined in Table 1. Appropriate ethics processes have been followed and approvals obtained for the use of these named data sources.

Table 1 Data sources and collection methods

Demographic data collection: Data for selected demographic themes (including population size and structure, ethnicity, household composition, employment, education, deprivation and household economic status) are extracted from the 2013 New Zealand Census for each location to provide a socio-demographic profile. A composite socio-economic indicator, the New Zealand Index of Deprivation (NZDep2013), is extracted at small area (meshblock) level within each Healthy Families NZ location.

Quantitative data: A range of existing quantitative data from survey and routine administrative datasets are used to develop the descriptive case studies and indicators. Key data sources include the New Zealand Health Survey (NZHS) and B4 School Check administrative data.

Community stocktake (provided by location workforce): Standardized information relating to networks; organizations involved in each local 'prevention system'; key community stakeholders/'movers and shakers'; the number of settings (schools, workplaces, marae, sports clubs, places of worship) and health-related programmes within settings; local information and evidence available; and health-related programmes, projects, initiatives and policies.

Qualitative data: Relevant documents reviewed and analysed to inform the case studies include implementation roadmaps, activities, performance monitoring reports, contracts, policy documents and other support materials. Semi-structured interviews are undertaken with key informants who are pivotal to the implementation of the initiative in each location.

Survey of stakeholders across the prevention system: Network structures, and change in networks, can be a contextual variable to aid understanding of factors that contribute to increased capacity, greater collective impact and changes in health behaviour. A web-based survey asks respondents within identified organizations about working relationships between organizations.
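
The web-based stakeholder survey described in Table 1 lends itself to simple network measures. The sketch below is a minimal, hypothetical illustration (the organization names, and the choice of density and degree as summary measures, are assumptions for illustration, not the evaluation team's actual procedure) of how reported working relationships could be summarized:

```python
# A minimal, hypothetical sketch of turning survey responses about working relationships
# between organizations into simple network measures (density and degree) that could
# serve as a contextual condition. Names and records are illustrative only.
from itertools import combinations

# Each record: (responding organization, organization it reports a working relationship with)
survey_responses = [
    ("Lead Provider", "Local Council"),
    ("Lead Provider", "Sports Trust"),
    ("Local Council", "Sports Trust"),
    ("School Cluster", "Lead Provider"),
]

# Treat relationships as undirected ties between organizations.
ties = {frozenset(pair) for pair in survey_responses if pair[0] != pair[1]}
organizations = {org for pair in ties for org in pair}

# Network density: observed ties divided by the number of possible ties.
possible_ties = len(list(combinations(organizations, 2)))
density = len(ties) / possible_ties if possible_ties else 0.0

# Degree: number of ties each organization holds (a crude indicator of connectedness).
degree = {org: sum(org in pair for pair in ties) for org in organizations}

print(f"Organizations: {len(organizations)}, ties: {len(ties)}, density: {density:.2f}")
print("Degree by organization:", degree)
```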

Two important features of the evaluation design are the inclusion of a National perspective case study and the inclusion of sensemaking processes, as ways to incorporate elements of context as well as wider systems influences. The National perspective provides important information on interactions between national-level influences and those at the local level. Several authors, utilizing a complexity frame, have considered the importance of understanding the national/local interaction when evaluating how initiatives have adapted within local settings. 43–45 'Sensemaking' is where data and findings are taken back to participants and discussed in a structured way to improve interpretation and applicability to local context. 46 A process of sensemaking is being undertaken with each location and with the Ministry of Health (the funder) to refine the descriptive case studies; to collaboratively develop indicators (for the QCA described below); and to interpret findings.

Qualitative comparative analysis

QCA enables identification of combinations of factors associated with prioritized outcomes, using formal tools and with a specific conception of cases as complex systems. 40 Each case is considered as a complex combination of properties: a 'whole' that should not be lost sight of in the course of analysis. The factors included within the analysis are referred to as 'conditions' and the indicators to be developed (discussed below) represent these conditions.

Conditions can include features of case context (e.g. stability of workforce and strength of networks); process (e.g. reach into settings); and outcomes (e.g. changes in health behaviours). QCA identifies combinations of conditions associated with types of outcomes. The assumption is that multiple configurations may lead to similar outcomes, or that divergent outcomes may have similar configurations. The strength of the method is its ability to identify configurations across 'near neighbour' cases to inform both initiative design and evaluative judgements of effectiveness. 39 While QCA provides a guide on what configurations are associated with particular outcomes, understanding why relies on going back into the detailed case study and the 'thick' qualitative description of cases. 38
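
As a purely illustrative sketch, and not data from the evaluation, the excerpt below shows how cases can be represented as crisp-set configurations of the kinds of conditions named above, with 'near neighbour' cases (those sharing a configuration) grouped for closer comparison; all case names and values are hypothetical:

```python
# Each case is a configuration of crisp-set conditions (1 = present, 0 = absent)
# plus an outcome; names and values are illustrative, not evaluation findings.
from collections import defaultdict

cases = {
    "Location A": {"conditions": (1, 1, 1), "outcome": 1},
    "Location B": {"conditions": (1, 0, 1), "outcome": 1},
    "Location C": {"conditions": (1, 1, 1), "outcome": 1},
    "Location D": {"conditions": (0, 0, 1), "outcome": 0},
}
CONDITION_NAMES = ("stable_workforce", "strong_networks", "reach_into_settings")

# Group cases sharing the same configuration: such 'near neighbour' cases can then be
# compared in depth using the thick, qualitative case-study material.
by_configuration = defaultdict(list)
for name, case in cases.items():
    by_configuration[case["conditions"]].append((name, case["outcome"]))

for configuration, members in sorted(by_configuration.items()):
    labels = dict(zip(CONDITION_NAMES, configuration))
    print(labels, "->", members)
```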

Development of indicators for QCA

Crisp-set QCA, used here, requires the development of dichotomous (either/or) condition indicators. For each condition, criteria are defined against which to judge its dichotomous state, i.e. whether there is change or not. The timeframe of 2 years between first view and second view case studies, and the relatively small population of each Healthy Families NZ location, mean that statistical significance is unlikely to be achieved from data measuring any one risk factor or chronic disease indicator. Instead, condition variables are developed that draw upon multiple data sources. Informed by the practice of evaluative rubrics, 47 the criteria used to allocate each condition to an either/or state are a form of evaluative criteria and are developed in consultation with the initiative funder and representatives of the Healthy Families NZ locations.
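
A minimal sketch of how such a dichotomous condition indicator might be constructed is given below. The indicator names, data sources and the 'at least two of three sources show improvement' criterion are hypothetical assumptions for illustration; the actual criteria are those agreed through the rubric process described above:

```python
# A minimal sketch, using hypothetical indicator names and thresholds, of combining
# several data sources into a single dichotomous (either/or) condition following
# agreed rubric-style criteria. Not the evaluation's actual criteria.

def physical_activity_condition(indicators: dict) -> int:
    """Return 1 if the illustrative criteria judge a positive change, else 0.

    `indicators` holds direction-of-change signals drawn from multiple sources,
    e.g. survey trends, stocktake counts and key-informant assessments.
    """
    # Hypothetical criterion: at least two of the three sources must show improvement.
    signals = [
        indicators.get("nzhs_activity_trend") == "improving",
        indicators.get("settings_with_activity_programmes_increased") is True,
        indicators.get("informant_assessment") == "positive",
    ]
    return 1 if sum(signals) >= 2 else 0

example = {
    "nzhs_activity_trend": "improving",
    "settings_with_activity_programmes_increased": True,
    "informant_assessment": "mixed",
}
print(physical_activity_condition(example))  # -> 1 under these illustrative criteria
```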

Analytic process

QCA has three distinct phases: the production of a data table showing, for each case, its outcome and specific combination of conditions; Boolean minimization; and an explanatory analysis that draws upon the full case study, theory and previous research to explain why particular combinations of conditions (or their absence) contribute to the outcome. 41 As illustrated in Fig. 3 below, the research process with QCA is iterative, usually involving several rounds of within-case analysis and cross-case comparisons, where the results inform further within-case analyses. By combining the cross-case comparison with a deep qualitative understanding of the cases, gathered through the case building, identification of causal configurations is possible.
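
The Boolean minimization step can be illustrated with a toy example. The sketch below applies the core reduction rule of crisp-set QCA (if two configurations with the same outcome differ on exactly one condition, that condition is redundant) in a simplified, single-pass form; dedicated QCA software applies the rule iteratively and also handles contradictory rows and logical remainders. The condition names and truth-table rows are hypothetical:

```python
# Simplified, single-pass Boolean minimization over hypothetical truth-table rows.
from itertools import combinations

CONDITIONS = ("stable_workforce", "strong_networks", "reach_into_settings")

# Truth-table rows for configurations observed to produce the outcome (outcome = 1).
positive_rows = [
    (1, 1, 1),
    (1, 0, 1),
]

def reduce_pair(row_a, row_b):
    """Merge two rows differing on exactly one condition, marking it '-' (don't care)."""
    differences = [i for i, (a, b) in enumerate(zip(row_a, row_b)) if a != b]
    if len(differences) != 1:
        return None
    merged = list(row_a)
    merged[differences[0]] = "-"
    return tuple(merged)

reduced = set()
for row_a, row_b in combinations(positive_rows, 2):
    merged = reduce_pair(row_a, row_b)
    if merged:
        reduced.add(merged)

# (1, 1, 1) and (1, 0, 1) merge to (1, '-', 1): in this toy example the outcome is
# associated with a stable workforce AND reach into settings, regardless of network strength.
for row in reduced or positive_rows:
    terms = [name if value == 1 else f"not {name}"
             for name, value in zip(CONDITIONS, row) if value != "-"]
    print(" AND ".join(terms))
```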

Fig. 3 QCA case process.

Discussion

What is already known

Theories of social complexity challenge the use of evaluation methods that exclude context and process from analysis and interpretation. There are, however, limited examples of such theories being successfully applied within public health evaluation. This article describes an evaluation of a community-based public health initiative which takes an approach that considers the 'complexity' of the social systems in which the intervention is being implemented. Whereas many methods attempt to reduce complexity (e.g. by focussing on individual variables) in order to isolate the 'parts' or 'control' for context, a complexity frame instead 'accounts' for context and acknowledges interdependence. 48 Indeed, there are enduring critiques of the hierarchy of methods for determining whether there are causal relationships between social variables. 49,50 In a review of impact evaluations, Stern and colleagues conclude that for complex development interventions there is an over-reliance on experimental designs that ask the question 'did the intervention work?'. They argue that a more appropriate question is 'did the intervention make a difference?', which allows greater room for understanding combinations of causal factors within, and outside, the intervention. 35

Strengths and limitations

The case example given in this article provides a practical example of an evaluation design being used for a community-based public health initiative. The methods outlined combine developments in the application of complex systems theories to evaluation. The strengths of the approach are: the ability to explicitly layer complex systems theories with intervention theory; the emphasis on the participation and perspectives of stakeholders; and the gathering of in-depth, context-rich information about each case. The systematic comparison (using QCA) across the cases also allows for some level of generalizability. One limitation is that treating each community as a case study makes it difficult to obtain quantitative data with sufficient numbers for statistical significance. Instead, we are constructing indicators from multiple sources of data to provide indications of the direction of change (if any). The time and resources involved in gaining an understanding of 'context' also present challenges, in that substantial, rich data must be collected while also gathering systematic, comparable data across the case studies. Also required is a significant degree of consistency in research procedures across time, as well as an in-depth, iterative relationship with the data.

What this article adds

This article contributes to the argument that framing social systems as 'complex' is useful for shaping the selection of the most appropriate methods for evaluating public health interventions. The practical evaluation case example provided in this article takes the theoretical position that the social systems within which an intervention is being implemented are complex. While using existing methods, their combination within a frame of social complexity offers an innovative approach to public health evaluation. At the time of writing, the evaluation of Healthy Families NZ is ongoing, and further publications detailing both methods and findings will follow. This evaluation provides a unique opportunity to operationalize and test the methods described, while extending their more frequent use within other fields to the field of public health.

Acknowledgements

Many thanks to the national Healthy Families NZ team at the Ministry of Health for a positive, collaborative relationship, as well as significant input into the evaluation design and data collection methods. We also greatly appreciate the time and effort given to the evaluation from all members of the Healthy Families NZ workforce and other participants across the locations.

Conflicts of interest statement

The authors declare that they have no competing interests.

Authors' contributions

AM and MW have contributed equally to this article; they have led the design and implementation of the evaluation and have jointly led the development and writing of this article. RG has contributed to the process and protocols for the case study development. NW has advised on the evaluation implementation and responsiveness to Māori and the Treaty of Waitangi. KL, MS, CF and BB have contributed to the collection and management of the quantitative data sources and the development of the process and protocols for analysis. All authors read and approved the final manuscript.

Ethics approval and consent to participate

In line with Massey University processes for ethical conduct of research and evaluation this project was assessed by peer review to be low risk. Consequently, it was not reviewed by one of the University's Human Ethics Committees. The evaluators are responsible for the ethical conduct of the research. Standard ethical processes have been followed.

Approval for the use of named secondary data sources has been obtained from the Ministry of Health.

Funding

The Ministry of Health, New Zealand is funding the evaluation of Healthy Families NZ.

References

1. Shiell A, Hawe P, Gold L. Complex interventions or complex systems? Implications for health economic evaluation. Br Med J 2008;336(7656):1281–83.
2. Hawe P. Lessons from complex interventions to improve health. Annu Rev Public Health 2015;36(1):307–23.
3. Matheson A. Reducing social inequalities in obesity: complexity and power relationships. J Public Health (Bangkok) 2016;38(4):826–9.
4. Waldrop M. Complexity: The Emerging Science at the Edge of Order and Chaos. New York: Simon & Schuster, 1992.
5. Capra F. Complexity and life. Theory Cult Soc 2005;22(5):33–44.
6. Gare A. Systems theory and complexity introduction. Democracy Nat 2000;6(3):327–39.
7. Medd W. What is complexity science? Toward an 'Ecology of Ignorance'. Emergence 2001;3(1):43–60.
8. Rickles D, Hawe P, Shiell A. A simple guide to chaos and complexity. J Epidemiol Community Health 2007;61:933–37.
9. Phelan SE. A note on the correspondance between complexity and systems theory. Syst Pract Action Res 1999;12(3):237–46.
10. Chughtai S, Blanchet K. Systems thinking in public health: a bibliographic contribution to a meta-narrative review. Health Policy Plan 2017;32(4):585–94.
11. Rutter H, Savona N, Glonti K et al. The need for a complex systems model of evidence for public health. Lancet 2017. doi:10.1016/S0140-6736(17)31267-9.
12. Complexity theory, globalisation and diversity. British Sociological Conference, 2003 April; University of York.
13. Gatrell A. Complexity theory and geographies of health: a critical assessment. Soc Sci Med 2005;60:2661–71.
14. Urry J. The complexity turn. Theory Cult Soc 2005;22(5):1–14.
15. Smith J, Jenks C. Qualitative Complexity: Ecology, Cognitive Processes and Re-emergence of Structures in Post-Humanist Social Theory. New York: Routledge, 2006.
16. Byrne D, Callaghan G. Complexity Theory and the Social Sciences: The State of the Art. Oxon: Routledge, 2014.
17. Eppel E, Matheson A, Walton M. Applying complexity theory to New Zealand public policy: principles for practice. Policy Q 2011;7(1):48–55.
18. Patton MQ. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. New York: The Guilford Press, 2011.
19. Room G. Complexity, Institutions and Public Policy. Cheltenham: Edward Elgar Publishing, 2011.
20. Anderson PM, Butcher KE. Childhood obesity: trends and potential causes. Future Child 2006;16(1):19–45.
21. Byrne D. Complexity Theory and the Social Sciences: An Introduction. New York: Routledge, 1998.
22. Nowotny H. The increase of complexity and its reduction: emergent interfaces between the natural sciences, humanities and social sciences. Theory Cult Soc 2005;22(5):15–31.
23. Walton M. Expert views on applying complexity theory in evaluation: opportunities and barriers. Evaluation 2016;22(4):410–23.
24. Craig P, Dieppe P, Macintyre S et al. Developing and evaluating complex interventions: new guidance. Medical Research Council, 2008.
25. Craig P, Dieppe P, Macintyre S et al. Developing and evaluating complex interventions: an introduction to the new Medical Research Council guidance. In: Killoran A, Kelly MP (eds). Evidence-Based Public Health. Oxford: Oxford University Press, 2010:185–202.
26. Judge K, Bauld L. Learning from policy failure? Health Action Zones in England. Eur J Public Health 2006;16(4):341–44.
27. McLean R, Hoek J, Buckley S et al. 'Healthy Eating Healthy Action': evaluating New Zealand's obesity prevention strategy. BMC Public Health 2009;9:452.
28. Healthy Together Mildura. Healthy Together Mildura Evaluation Paper 2012–2016. Mildura: Mildura Rural City Council and Sunraysia Community Health Services, 2016.
29. Walton M. Setting the context for using complexity theory in evaluation: boundaries, governance and utilisation. Evid Policy 2015.
30. Rittel HWJ, Webber MM. Dilemmas in a general theory of planning. Policy Sci 1973;4(2):155–69.
31. Blackman T, Hunter D, Marks L et al. Wicked comparisons: reflections on cross-national research about health inequalities in the UK. Evaluation 2010;16(1):43–57.
32. Walton M. Applying complexity theory: a review to inform evaluation design. Eval Program Plann 2014;45:119–26.
33. Westhorp G. Using complexity-consistent theory for evaluating complex systems. Evaluation 2012;18(4):405–20.
34. Byrne D, Uprichard E. Useful complex causality. In: Kincaid H (ed). The Oxford Handbook of Philosophy of Social Science. Oxford: Oxford University Press, 2012:109–29.
35. Stern E, Stame N, Mayne J et al. Broadening the Range of Designs and Methods for Impact Evaluations. London: Department for International Development, 2012.
36. Byrne D, Ragin CC (eds). The SAGE Handbook of Case-Based Methods. London: Sage, 2009.
37. Boyko JA, Kothari A, Wathen CN. Moving knowledge about family violence into public health policy and practice: a mixed method study of a deliberative dialogue. Health Res Policy Syst 2016;14(1).
38. Blackman T, Wistow J, Byrne D. A Qualitative Comparative Analysis of factors associated with trends in narrowing health inequalities in England. Soc Sci Med 2011;72(12):1965–74.
39. Byrne D. Evaluating complex social interventions in a complex world. Evaluation 2013;19(3):217–28.
40. Bennett A, Elman C. Complex causal relations and case study methods: the example of path dependence. Polit Anal 2006;14:250–67.
41. Rihoux B, Marx A, Ragin CC et al. QCA, 25 years after 'The Comparative Method': mapping, challenges, and innovations-mini-symposium. Polit Res Q 2013;66(1):167–235.
42. Befani B. Between complexity and generalization: addressing evaluation challenges with QCA. Evaluation 2013;19(3):269–83.
43. Matheson A, Dew K, Cumming J. Complexity, evaluation and the effectiveness of community-based interventions to reduce health inequalities. Health Promot J Austr 2009;20(3):221–26.
44. Trenholm S, Ferlie E. Using complexity theory to analyse the organisational response to resurgent tuberculosis across London. Soc Sci Med 2013;93:229–37.
45. Barnes M, Matka E, Sullivan H. Evidence, understanding and complexity: evaluation in non-linear systems. Evaluation 2003;9(3):265–84.
46. Maitlis S, Christianson M. Sensemaking in organizations: taking stock and moving forward. Acad Manag Ann 2013;8(1):57–125.
47. King J, McKegg K, Oakden J et al. Evaluative Rubrics: a method for surfacing values and improving the credibiilty of evaluation. J MultiDisciplin Eval 2013;9(21):11–20.
48. Shoveller J, Viehbeck S, Ruggiero ED et al. A critical examination of context within research on population health interventions. Crit Public Health 2016;26(5):487–500.
49. Kaptchuk T. The double-blind, randomized, placebo-controlled trial: gold standard or golden calf? J Clin Epidemiol 2001;54(6):541–9.
50. Byrne D. Interpreting Quantitative Data. London: Sage Publications, 2002.

Author notes

Dr Anna Matheson and Dr Mat Walton have contributed equally to this article.