Final Exam Flashcards

(72 cards)

1
Q

what is a behaviour change technique? (BCT)

A

a replicable component of an intervention designed to alter or redirect causal processes that regulate behaviour
- “active ingredient”
- usually selected on the basis of theoretical constructs

2
Q

mechanism of action definition (MoA)

A

constructs specified in behaviour change theories that can be seen to “mediate” intervention effects e.g., beliefs about capabilities, knowledge, behavioural regulation

3
Q

benefits of mapping BCTs to MoA

A
  • makes hypothesised mechanisms testable, e.g., in an RCT (if no mediation effects are found, this raises questions about how the intervention works) - see the mediation sketch below
  • sheds light on the “black box” that behavioural interventions can be
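A minimal, illustrative Python sketch (not from the course materials) of how mapping a BCT to an MoA can be made testable: with simulated data, the mediated (indirect) effect is estimated as the product of the BCT-to-MoA path and the MoA-to-behaviour path. All variable names and effect sizes are invented for illustration.

```python
# Illustrative mediation sketch: BCT -> MoA (e.g., self-efficacy) -> behaviour.
# Simulated data with made-up effect sizes; product-of-coefficients estimate.
import numpy as np

rng = np.random.default_rng(0)
n = 500
bct = rng.integers(0, 2, n).astype(float)      # 1 = received the BCT, 0 = control
moa = 0.5 * bct + rng.normal(size=n)           # hypothesised mediator
behaviour = 0.4 * moa + 0.1 * bct + rng.normal(size=n)

def ols(y, X):
    """Least-squares coefficients for y ~ X, with an intercept column added."""
    X = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols(moa, bct)[1]                                 # path a: BCT -> MoA
b = ols(behaviour, np.column_stack([moa, bct]))[1]   # path b: MoA -> behaviour, adjusting for BCT
print("estimated indirect (mediated) effect a*b =", round(a * b, 3))
```

If the indirect effect is near zero in a well-powered trial, the BCT may not be working through the hypothesised MoA.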
4
Q

intervention optimisation

A
  • iterative and experimental process to answer mechanistic questions
  • precise
  • slow process - systematic program of work
  • mapping BCTs to MoA and conducting mediation analyses
  • qualitative evaluations
  • lack granular info on which parts were effective, which were unnecessary, or whether some parts worked against each other
5
Q

real-life intervention analysis

A
  • based on the best available evidence and is evidence-inspired
  • fast process - what we can do now based on what we know
6
Q

Multi-phase Optimisation Framework

A
  1. preparation
  2. optimisation
  3. evaluation
7
Q

preparation stage (or screening phase) in multi-phase optimisation framework

A
  • lay the groundwork for optimisation
    activities:
  • derive/revise conceptual model
  • identify candidate components
  • conduct pilot tests
  • co-design (engage with different stakeholders)
8
Q

optimisation stage (or refining phase) in multi-phase optimisation framework

A
  • build optimised interventions
    activities:
  • conduct optimisation trial(s) e.g., factorial experiment, SMART (sequential multiple assignment randomised trial), micro-randomised trial, system identification
  • identify the intervention that meets the optimisation criterion (if it does not, go back to the preparation phase; if it does, go to evaluation)
9
Q

evaluation stage (or confirming phase) of multi-phase optimisation framework

A
  • confirm effectiveness of the optimised intervention
    activity:
  • RCT
10
Q

example of multi-phase optimisation using Drink Less App

A
  • 5 modules containing multiple BCTs (self-monitoring + feedback; action planning; normative feedback; cognitive bias retraining; identity change)
  • modules based on review of BCTs in alcohol interventions, expert consensus study, content analysis
  • between-subjects full factorial RCT run: enhanced experimental versions of each module contained the ‘active ingredients’ hypothesised to be effective; control versions of each module were designed to provide support to participants for ethical reasons but excluded the active ingredients (see the design sketch below)
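An illustrative sketch of how a 2^5 between-subjects full factorial design like this one can be laid out: every combination of enhanced vs control versions of the five modules is one arm. The module names come from the card above; the function and assignment logic are hypothetical.

```python
# Illustrative 2^5 full factorial layout and random assignment (hypothetical code).
from itertools import product
import random

modules = ["self_monitoring_feedback", "action_planning", "normative_feedback",
           "cognitive_bias_retraining", "identity_change"]

# Every combination of enhanced (1) vs control (0) module versions = one arm
conditions = list(product([0, 1], repeat=len(modules)))
print(len(conditions), "arms")  # 2**5 = 32

def assign(participant_id: int) -> dict:
    """Randomise a participant to one arm of the factorial design."""
    arm = random.choice(conditions)
    return {"id": participant_id, **dict(zip(modules, arm))}

print(assign(1))
# Analysis would then estimate each module's main effect and the two-way
# interactions on the drinking outcomes.
```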
11
Q

findings from Drink Less App full factorial

A
  • no main effects of each intervention module on change in weekly alcohol consumption
  • significant two-way interactions between normative feedback and cognitive bias re-training on weekly alcohol consumption, and between self-monitoring and feedback and action planning on change in the Alcohol Use Disorders Identification Test (AUDIT) score
  • no definitive evidence for single components, but some modules show promise.
12
Q

next steps from findings of full factorial in Drink Less App

A
  • optimise app using user feedback (improve acceptability and feasibility)
  • conduct a definitive RCT with long-term outcomes
13
Q

goal setting as a BCT

A
  • one of the most widely used BCTs
  • helps link intention to action
  • SMART goals
14
Q

open goals

A

nonspecific and phrased as exploratory, with measurable parameters, producing graded outcomes
e.g., how quickly can…
- exploratory rather than fixed
- goals are task-focused and should be context-sensitive

15
Q

benefits of open-goal setting

A
  • can’t fail in the same way you can fail to meet a SMART goal
  • encourages exploration and autonomy
  • good for meeting people “where they are at”
  • effective for increasing self-efficacy
  • advantageous in early stages of learning/behaviour change
16
Q

SMART goals

A
  • may be better for people later on in their behaviour change process
  • structured
  • can induce pressure/failure
17
Q

why do we evaluate?

A
  • need to know what works and what doesn’t to improve interventions
  • helps policy makers determine where they should allocate scarce resources
  • accountability for money - is it being spent appropriately and effectively?
  • developing/building an evidence-base
17
Q

What is evaluation?

A
  • systematic collection of information about the activities, characteristics, and outcomes of an intervention to make judgements about the intervention, improve intervention effectiveness, and/or inform decisions about future intervention development
18
Q

types of evaluation (2)

A
  • process evaluation (i.e., implementation)
  • outcome evaluation (i.e., effectiveness)
19
Q

process evaluation

A
  • focuses on whether an intervention and its activities are operating and being implemented as planned
  • identifies intervention strengths, weaknesses, and areas for improvement
  • done in collaboration with stakeholders
20
Q

things you can assess in process evaluation

A
  • fidelity (administered as intended)
  • dose delivered
  • dose received
  • reach
  • recruitment
  • context
21
Q

methods of evaluation in process evaluation

A
  • activity records
  • participation rates
  • budgets
  • reach
  • impressions
  • clicks (intentional)
  • engagement (comments, shares)
22
Q

outcome evaluation

A
  • investigates whether, and to what extent, changes occur for participants in a program and if these changes are associated w the program
  • done in collaboration with stakeholders
  • outcomes assessed in an evaluation depend on the objectives of the program
23
Q

outcomes assessed in outcome evaluation

A
  • attitudes
  • knowledge
  • skills
  • behavioural intentions
  • behaviour/behaviour change
24
Q

outcome evaluation must abide by SMART

A
  • Specific
  • Measurable
  • Achievable
  • Relevant
  • Timely
25
Q

experimental outcome evaluations

A
  • participants are randomised to a treatment group, which receives the program, or to a control group, which does not
  • allows conclusions to be drawn about attribution e.g., RCT
26
Q

quasi-experimental outcome evaluations

A
  • involve assessing outcomes over time (pre/post program exposure) for a single group and comparing to a similar population, comparison group, or national data
27
Q

before and after study in outcome evaluations

A
  • involves assessing outcomes over time (pre/post program) for a single group but does not compare to a control group
  • changes could be due to other variables, but there is no way to measure this (see the sketch below)
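An illustrative sketch (all numbers invented) contrasting the two designs above: a before-and-after estimate uses only the program group's change, while a quasi-experimental comparison-group estimate subtracts the change seen in a similar group that did not receive the program (a simple difference-in-differences).

```python
# Illustrative comparison of a before-and-after estimate vs a
# comparison-group (difference-in-differences) estimate. Invented data.
import numpy as np

rng = np.random.default_rng(1)

program_pre = rng.normal(50, 10, 200)
program_post = rng.normal(57, 10, 200)       # change partly due to the program
comparison_pre = rng.normal(50, 10, 200)
comparison_post = rng.normal(53, 10, 200)    # background change only

# Before-and-after: change in the program group alone (other variables not accounted for)
before_after = program_post.mean() - program_pre.mean()

# Quasi-experimental: subtract the change that happened anyway in the comparison group
diff_in_diff = before_after - (comparison_post.mean() - comparison_pre.mean())

print(f"before-and-after estimate: {before_after:.1f}")
print(f"difference-in-differences estimate: {diff_in_diff:.1f}")
```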
28
Q

methods adopted by outcome evaluations

A
  • qualitative: focus groups, interviews, diaries
  • quantitative: surveys/questionnaires (include validated measures)
  • observation
  • a mixed-methods approach is best practice
29
Q

when to evaluate/collect data in outcome evaluations

A
  • before implementation (baseline)
  • pilot-testing stage (ensure outcomes chosen for evaluation are relevant; early evidence for success)
  • after roll-out (effectiveness of the program demonstrated to funders + makes the case for its continued functioning)
30
Q

logic models

A
  • systematic way of demonstrating relationships between resources used, the activities those resources facilitate, and the intended outcomes
  • inputs --> activities --> outputs --> short-term outcomes --> medium-term outcomes --> long-term outcomes (see the sketch below)
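A minimal sketch of the logic-model chain as a plain data structure, using example entries drawn from the cards that follow; the class name and fields are illustrative only.

```python
# Illustrative representation of a logic model's chain of components.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    inputs: list = field(default_factory=list)
    activities: list = field(default_factory=list)
    outputs: list = field(default_factory=list)
    short_term_outcomes: list = field(default_factory=list)
    medium_term_outcomes: list = field(default_factory=list)
    long_term_outcomes: list = field(default_factory=list)

model = LogicModel(
    inputs=["staff", "volunteers", "funding", "equipment"],
    activities=["seminars"],
    outputs=["number of seminars delivered", "number of participants"],
    short_term_outcomes=["increased knowledge", "increased self-efficacy"],
    medium_term_outcomes=["behaviour change"],
    long_term_outcomes=["reduced morbidity and mortality"],
)
print(model.long_term_outcomes)
```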
31
Q

logic model inputs

A
  • resources used to plan, implement, and provide a program/intervention and the services it delivers e.g., staff, volunteers, funders, equipment, resources
32
Q

logic model activities

A
  • the activities/services the inputs support e.g., seminars, and content within seminars
33
Q

logic model outputs

A
  • productivity of activities e.g., no. of seminars delivered, people expressing interest in participation, people who have participated, people sent a particular resource
34
Q

short-term outcomes - logic models

A
  • stepping-stones to medium- and longer-term goals e.g., increases in knowledge, self-efficacy, and motivation
35
Q

medium-term & long-term outcomes - logic models

A
  • outcomes that take longer to achieve e.g., behaviour change, maintenance of behaviour change, reduction in morbidity and mortality
36
Q

cost-effectiveness evaluation

A
  • economic evaluation allows us to determine whether one intervention represents a better use of resources than another (or doing nothing)
  • "value for money" and possible cost savings
  • cost-benefit analysis (costs of the program are weighed against the costs of not running the program)
  • quality-adjusted life year (QALY) (see the sketch below)
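An illustrative sketch (all figures invented) of the arithmetic behind a QALY-based comparison: the incremental cost-effectiveness ratio (ICER) is the extra cost per extra QALY gained by the new intervention relative to a comparator.

```python
# Illustrative ICER calculation: extra cost per extra QALY gained.
def icer(cost_new: float, qaly_new: float, cost_old: float, qaly_old: float) -> float:
    """Incremental cost-effectiveness ratio of the new intervention vs the comparator."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# e.g., a program costing $1,200 per person yielding 2.10 QALYs,
# vs usual care costing $400 per person yielding 2.05 QALYs
print(f"ICER = ${icer(1200, 2.10, 400, 2.05):,.0f} per QALY gained")  # $16,000 per QALY
```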
37
Q

checklist for program evaluation

A
  • start the evaluation process at the beginning of program implementation
  • determine the aim of the evaluation and develop an evaluation framework
  • clearly define the target population, place, and time
  • develop and test instruments for data collection, ensuring consistency in training and measurement
  • collect and analyse data
  • write and disseminate the evaluation report, feeding back into various aspects of the program
38
Q

why may a program fail to succeed?

A
  • poor program design
  • poor or incomplete program implementation
  • failure to reach sufficient numbers of the target audience
39
Q

formative use of process-evaluation

A
  • involves using process-evaluation data to fine-tune the program
40
Q

summative uses of process-evaluation

A
  • involve making a judgement about the extent to which an intervention was implemented as planned and reached intended participants
41
Q

steps for developing a process-evaluation plan (6)

A
  1. describe the program (logic model)
  2. describe complete and acceptable delivery of the program (specific strategies, goal, fidelity, dose, reach)
  3. develop a list of potential process-evaluation questions (can be organised by intervention component)
  4. determine methods for process evaluation (methods to answer each question, issues, quantitative + qualitative)
  5. consider program resources and program characteristics and context (resources to answer each question, availability of staff, time)
  6. finalise the process-evaluation plan
42
Q

steps to design and evaluate a behaviour change intervention (4)

A
  1. identify levers of change (for a specific behaviour)
  2. intervention design
  3. translation to real-world applications (co-design)
  4. develop an evaluation plan
43
Q

what is public health?

A
  • WHO: all organised measures (public or private) to prevent disease, promote health, and prolong life among the population as a whole
  • activities aim to provide conditions in which people can be healthy
  • entire-population focus
44
Q

what are we influencing in public health?

A
  • individual (knowledge, attitudes, skills, behaviours)
  • interpersonal (friends, family, social)
  • institutional (organisations, schools, workplaces)
  • community (cities, neighbourhoods, resources, norms)
  • policy (federal, state, local legislation)
45
Q

centre for behavioural research in cancer

A
  • production of high-quality evidence to optimise cancer control initiatives
46
Q

goals, processes and tools - centre for behavioural research in cancer (4)

A
  1. identify population progress & needs (monitor and synthesise data and evidence)
  2. identify, develop and test potential initiatives (conduct strategic and formative research)
  3. determine the value of initiatives (conduct evaluation studies)
  4. inform and influence policy and practice (feedback & dissemination)
  • COMMUNICATE & COLLABORATE throughout
47
Q

Monitor & synthesise data and evidence (step 1)

A
  • population monitoring e.g., smoking rates
  • media monitoring (e.g., trends in news coverage, portrayal in magazines)
  • industry/market monitoring
48
Q

conduct strategic and formative research (step 2)

A
  • hierarchy of effects model
  • content analysis of existing prevention ads/interventions
  • studies
  • experimental studies
49
Q

hierarchy of effects model

A
  • posits that proximal variables are causally linked to distal outcomes through a series of intermediate measures
  • proximal variables: exposure
  • intermediate variables: understanding + knowledge (e.g., of campaign message); attitudes and social norms; self-efficacy and intention
  • distal variables: behaviour
50
Q

tobacco - new tobacco control measures

A
  • 10 new Health Promotion Inserts (HPIs)
  • 8 new On Product Health Messages
  • 10 new graphic health warnings
51
Q

screening and early detection process (5 stages)

A
  1. recruitment - targeted population encouraged to participate in screening
  2. screening - targeted population who participate
  3. assessment - screened population who require further assessment
  4. diagnosis - assessed participants diagnosed with the disease/condition
  5. outcomes - reduced morbidity and mortality from the disease
52
Q

I-SAM: Integrated Screening Action Model

A
  • participant influences (motivation - automatic, reflective; capability - psychological, physical) influence the screening behaviour process
  • environmental influences (opportunity - social, physical) also influence the screening behaviour process (diagram in week 9 slides)
  • screening behaviour is shaped by the interaction between participant and environmental influences (COM-B)
  • top-down process
53
Q

screening behaviour process

A
  • unaware --> unengaged --> undecided --> decided to act (can also decide not to act) --> acting
54
Q

what will work to encourage people to screen?

A
  • different interventions target different points in the screening behaviour process
55
Q

theory of change

A
  • mapping/tool to describe the need you are trying to address, the changes you want to make (outcomes), and what you plan to do (activities)
  • if X (activity), then Y (link); if Y (link), then Z (outcome)
56
Q

theory of change - example

A
  • short-term outcomes (engagement, training, etc.)
  • medium-term outcomes (modelling, supports)
  • long-term outcomes (culture changes, broader understandings)
  • assumptions
57
Q

data triangulation

A
  • multiple sources of data (e.g., managers, instructors, learners)
  • multiple types of data analysis (e.g., grounded theory, narrative, discourse)
  • multiple methods of data collection (e.g., focus groups, surveys, interviews)
  • combined, these form robust findings
58
Q

knowledge translation

A
  • synthesis, exchange, and application of knowledge
  • within this is implementation science
59
Q

implementation science

A
  • scientific study of methods to promote the uptake of evidence into practice
60
Q

Orygen

A
  • Australia's leading youth mental health not-for-profit and home to the world's largest mental health research institute
  • dedicated to positive change for 12-25 year olds with mental ill-health
61
Q

self-directed learning cycle

A
  • trigger --> uncover knowledge gaps --> formulate learning goals --> use resources --> apply knowledge --> evaluate learning (and the cycle repeats)
62
Q

Orygen findings - barriers to self-directed learning

A
  • time (in the workday)
  • access (to learning resources)
63
Q

Orygen findings - enablers to self-directed learning

A
  • personal drive and efficacy
  • coworkers/management
  • encouragement from leadership to engage in professional development as part of the role
64
Q

knowledge translation - recommendations for mental health professionals

A
  • utilise supervision and other forms of feedback to enhance self-assessment of skill gaps + identify learning goals
  • refer to professional competencies + practice guidelines when assessing skills and formulating learning goals
  • select appropriate interventions for knowledge and skill gaps
  • prioritise developing and expanding skills throughout career
65
Q

knowledge translation - recommendations for services

A
  • allocate protected time and funding for learning
  • provide regular structured supervision to discuss learning needs and goals against competency frameworks
  • use diverse feedback sources
  • offer discretionary PD funding
  • foster a learning culture
66
Q

Youth Risk of Cognitive Impairment Toolkit (Youth ROCIT)

A
  • designed to help practitioners identify cognitive difficulties in young people
67
Q

defining the problem in behavioural terms

A
  be specific about:
  • the behaviour itself
  • the target individual, group or population involved in the behaviour
  • the location
68
Q

workshops in creating the Youth ROCIT

A
  • workshop 1: discussion-based (understand goals)
  • workshops 2 & 3: education, demonstration, and discussion (facilitating training in the ROCIT; building capability and confidence)
  • workshop 4: implementation action planning (how and when the Youth ROCIT is to be used; methods of successful implementation discussed)
69
Q

community engagement spectrum

A
  • inform --> engage --> consult --> involve --> co-design --> co-produce --> citizen led
70
Q

different guidelines - what did they involve in the community engagement spectrum?

A
  • Expert Guidelines: Inform + Educate
  • New Federal Guidelines: Consult + Involve
  • Our Guidelines: Co-design
71
Co-design workshop example structure - Co-designing behavioural guidelines for politicians
1. introduction to the team (leaders, topic background info) 2. code of care for session (be present, respectful, agree to disagree...) - make people feel safe 3. warm-up activity (breakout rooms + report back) 4. input into draft guidelines (feedback - look over parts of the guidelines and what to clarify, change, improve on) 5. final group discussion