
Week 9 Discussion: Reporting a Process Evaluation
Just as in needs assessments, interviews and focus groups are common tools for
obtaining information about the processes involved in the implementation of programs.
A process evaluation should specify its purpose, the questions that the
evaluation will address, and the methods that social workers will use to
conduct it.

SOCW 6311, Walden University
There is a proverb that reminds us that "the road to hell is paved with good intentions."
Sometimes, the best-laid plans do not deliver; the most carefully prepared
activities turn out differently than we expected, and our good intentions do not
lead to our desired outcomes, even with the best efforts of everyone involved.
This week, we continue our examination of evaluation procedures by studying methods
used in process evaluations (also called implementation evaluations, performance
monitoring, and program monitoring) and in reporting results from process evaluations.
The focus of process evaluation includes "determining what practitioners really do
when intervening and what clients…experience during an intervention" (Begun, n.d.).
A process evaluation examines how the program plans are implemented and what
obstacles emerge when the services are provided.
Process evaluations may also determine whether funds are spent appropriately
according to agreements made between the program managers and funding sources,
but more often these questions are answered through separate audits and financial
evaluations.
If a process evaluation is conducted while the program is underway, the
evaluation may be termed program monitoring. Program monitoring involves evaluating
the processes involved in the program as they are carried out – often as soon as
the program launches – for these purposes:

a. To ensure that the services or interventions are being performed as
planned
b. To deal with any barriers that have arisen or could be expected to arise
that would interfere with delivering services as planned

______________________________________________________________________
I. What to READ this week
1. Becker, L. A. (1999). Statistical and clinical significance. Read sections 1 and 2 –
Statistical Significance and Clinical Significance. There is a link to the paper in the
learning resources.
2. Riemann, B. L., & Lininger, M. (2015). Statistical primer for athletic trainers:
The difference between statistical and clinical meaningfulness. Journal of Athletic
Training, 50(12), 1223-1225. Replace the term "athletic trainer" with "social
worker" for an easy-to-understand overview of the most important differences
between these two concepts with a minimum of statistical jargon.

3. Bliss, M. J., & Emshoff, J. G. (2002). Workbook for designing a process
evaluation. Atlanta, GA: Georgia State
University. https://brainmass.com/file/1580335/WorkbookProcessEvaluation.pdf
4. Dudley, J. R., & Herman-Smith, R. (2020). Chapter 8: Improving how
programs and practice work. In J. R. Dudley (Ed.), Social work evaluation:
Enhancing what we do (3rd ed., pp. 167-210).
5. The Social Work Research Qualitative Groups case study, located in this
week's resources and in Plummer, S.-B., Makris, S., & Brocksen, S. (Eds.).
(2014b). Social work case studies: Concentration year (pp. 68-69). Baltimore,
MD: Laureate International Universities Publishing. [Vital Source e-reader]. This
case study will be used in the assignment this week.
6. Optional, but recommended: Boyce, C., & Neale, P. (2006). Conducting
in-depth interviews: A guide for designing and conducting in-depth interviews
for evaluation input [PDF]. Pathfinder International Tool Series: Monitoring
and Evaluation. https://www.measureevaluation.org/resources/training/capacity-building-resources/data-quality-portuguese/m_e_tool_series_indepth_interviews.pdf
____________________________________________________________________________________
II. Our learning GOALS for this week: We will do a deep dive into process evaluations
and learn about the procedures used to evaluate services that are provided in
programs.
1. We’ll explore how process evaluations improve programs.
2. We’ll review the questions that are considered in a process evaluation.
3. We’ll make the connection between the logic model and the process
evaluation.
4. We’ll see how different types of purposes and questions may require
different data collection techniques.
5. We’ll examine some examples of results from process evaluations.
6. We’ll practice writing some elements of a process evaluation.
______________________________________________________________________
III. What to DO this week: We will practice writing elements of a process evaluation from
the descriptions of program scenarios in Dudley (2020; pp. 172-206). They are in the
grey boxes throughout this section. The first step will be to complete a data-gathering
table (Table 1, below) as if you are gathering the data for the process
evaluation yourself. Your second step is to summarize the information you have
obtained or can reasonably infer from your scenario. Third, you will ask your fellow
scholars about the questions that they have chosen for their evaluations.
By Day 3: Your task for this discussion is to complete a table with information for
a process evaluation and to create a summary statement. Please label your post
“Original post”. If you choose to revise your post, indicate the revision by adding
"revised post" or "corrected post" to the subject line.
A. Select one of the scenarios describing a program in Chapter 8 of Dudley (2020; pp. 172-206). The
scenarios are the examples in the grey boxes in this section.

B. Locate the challenge linked to the selected program. Challenges appear as
subheadings dividing the pages and are illustrated by the scenarios that follow them within each
section. The challenges include these situations:
• link the intervention to the clients' problems (p. 172)
• implement the intervention as proposed (p. 175)
• adopt and promote evidence-based interventions (p. 179)
• focus on staff members (p. 184)
• accessibility of the intervention (p. 189)
• program quality (p. 194)
• client satisfaction (p. 196)
• evaluating practice processes (p. 204)
C. To ensure a wide range of examples, only 1-2 students may use the same scenario. Skim through
the list of already posted discussions, and ensure that you are the first or second student to use
the scenario for your post. Post your choice as soon as you decide upon it. You can return to the
post to complete it later.

1. After you have chosen your scenario, enter the name of the scenario, the
page(s) where it is found in Dudley (2020), and the reason that
the PROCESS evaluation was conducted at the top of Table 1. If a purpose is
not provided for the process evaluation, infer it from the description. (Be careful
not to do a product (= outcome) evaluation!)
2. Locate four questions in Table 1 (below) that you can answer about your
selected program, either from information provided or from information you can
reasonably infer about it. Delete the rest of the questions so your table does not
get too large.
3. Identify the types of individuals who provided the information or data for the
evaluation in the scenario. If no respondents are described, suggest several
who could reasonably provide the best answer to your selected question (one
informant per question). You may select feasible candidates for informants from
the table of stakeholders presented in the Discussion for Week 6.
List the actual or suggested informants for the process evaluation by their roles
(staff, leaders, participants, etc.) and insert them in Column 2 in Table 1 below.
If the evaluation has not been conducted, list at least two types of informants
who might provide useful information for the evaluation and their roles, and
indicate that these roles are suggestions for the evaluation.
4. Post the data collection method used to obtain responses to the selected
questions and the responses obtained in the table below. If data collection
methods are not provided, propose data collection methods you could use and
the types of responses you might expect, based on the description of evaluation
activities for your scenario.
Table 1: Process Evaluation

Selected scenario from Dudley (2020) and page number:
Purpose of the evaluation (what problem or outcome is being examined with the evaluation?):

The table has three columns:
• Column 1: The process evaluation questions (select 4; delete the rest).
• Column 2: The role of the informants who provided an answer to the selected question, OR a suggested informant or stakeholder who could provide valid answers for the evaluation (brief word or phrase only).
• Column 3: The information obtained in answer to the questions you selected (1-3 sentences). Infer information from the scenario description if information was not provided, but indicate that you are estimating the response.

Process evaluation questions (Column 1):

1. What group did the program developers intend to reach through this program? Was the program successful in reaching this group?
2. What are/were the characteristics of actual program participants? Are/were these people in the group that the program was designed to assist?
3. How much did the program participants use the program compared to the amount that the planners intended?
4. What are the interventions that were planned for the program, and what are the intended goals of the activities?
5. "[Do] the intervention[s] address the causes of the clients' problems that are the addressed concern?" (Dudley, 2014, p. 168). If not, what changes are needed?
6. "Are the proposed interventions supported by evidence (treatment manuals, evaluations, and/or clinical practice) that [indicates] that the intervention[s] work?" (Dudley, 2014, p. 168).
7. Where and when were the program interventions conducted? Were the times and locations adequate?
8. How closely did the program staff follow the program activities and interventions as outlined in the logic model and promised to funders, participants, and other stakeholders?
9. What problems or barriers arose to prevent delivery of program components as planned, if any? Was it possible to reduce or work around the obstacles?
10. Which planned activities worked well? Which were more workable or satisfying than expected?
11. Are/were the services producing positive results that justify the time, effort, and costs expended in providing them?
12. Did the program deliver on its promises? Are/were the funds spent appropriately as agreed with funding sources or in compliance with legal requirements? (Questions about finances are often addressed in a separate audit or accounting evaluation.) Did the program policies and staff comply with safety and other legal requirements?
13. What is the participants' level of satisfaction with the program?
14. Were goals for cultural inclusiveness among staff, in program activities, and in recruitment of participants met?
15. How are staff recruited or assigned to the program? Do they have the right mix of knowledge, organizational skills, experience, competence, and interpersonal skills to interact with the intended participants and accomplish the program activities?
16. Are staff provided with the training, resources, time, and support to provide the services competently without undue stress? What was the perception of the staff (including volunteers) regarding the quality of the program services, the adequacy of resources, and their treatment by program administrators, participants, and stakeholders?
17. How well did staff meet expectations for performance? Was the staffing appropriate (the right number of competent personnel with the right mix of skills; neither too many staff, staff with the wrong skill sets, nor too few staff to ensure that the services provided were of high quality)?
18. Were the resources appropriate for the program? Were more needed? Were any resources less useful than expected? Did the program adhere to the budget?
19. Was a recommendation given in the description of the program about continuing, repeating, or discontinuing the program? Were any specific changes recommended for the tasks and methods used in the program?

5. Write a summary of the information obtained in the evaluation. Include the following
information (4-10 sentences):
5a. Briefly describe the scenario and the purpose for the evaluation. If there was a
problem with the program, describe it. (1-2 sentences)
5b. Name 1-2 strengths and 1-2 possible weaknesses of your selected program as
described in the scenario (1-3 sentences).
5c. Explain whether the selected program appeared to be implemented as planned,
given the description of the program by Dudley (2020). If the program was not
implemented as planned, provide an explanation for the deviation from the program
plan, if one is given (1-3 sentences).
If the scenario lacks a description of whether the program was implemented as
planned (which would be a red flag for most auditors, since they would want to
know why this information is lacking), note this lack of information and explain
why a description of program fidelity is important in a process evaluation. (2-4
sentences)
5d. Summarize any information regarding client perceptions of the program (1-3
sentences). If there are none, state that none were provided.
6. Explain how the method of obtaining the answers to the questions (review of records,
interviews and questionnaires with open and/or closed questions, focus groups, phone
calls, etc.) might affect the results obtained for the evaluation. (See the data collection
methods handout for ideas, if necessary.) What methods could you use to ensure that
the findings and responses of informants were valid? (Maximum: 2 sentences)

SAMPLE COMPLETED ASSIGNMENT

A. Implementing an Evidence-Based Manual in a Domestic Violence
Intervention Program (Dudley, 2020, p. 181).
B. Adopt and Promote Evidence-Based Interventions.
C. 1.

Table 1: Process Evaluation

Selected scenario from Dudley (2020) and page number: Implementing an Evidence-Based Manual in a Domestic Violence Intervention Program (p. 181).
Purpose of the evaluation: To assess agency staff’s concerns about evidence-based interventions and how they have addressed these concerns in the past.

Question 1: What group did the program developers intend to reach through this program? Was the program successful in reaching this group?
Informant and method: Psychotherapist (staff); face-to-face interview.
Information obtained: The program’s intervention targeted children younger than six years old who had witnessed domestic violence. The Child-Parent Psychotherapy intervention was implemented and provoked several concerns, implying that the program’s intervention successfully reached the target group (Dudley, 2020, p. 181).

Question 2: What are the interventions that were planned for the program, and what are the intended goals of the activities?
Informant and method: Program leader (staff); face-to-face interview.
Information obtained: The program employed a Child-Parent Psychotherapy (CPP) intervention to address emotional and behavioral problems experienced by the children (Dudley, 2020, p. 181).

Question 3: "[Do] the intervention[s] address the causes of the clients' problems that are the addressed concern?" (Dudley, 2014, p. 168). If not, what changes are needed?
Informant and method: Social worker (staff); telephone interview.
Information obtained: The intervention does not address the causes of the clients’ problems. There is a need to assess mothers’ emotional readiness to process past violence and to accept that their children were affected by the violence. There is also a need to assess whether the victims [the children] are still experiencing domestic violence (Dudley, 2020, p. 181).

Question 4: "Are the proposed interventions supported by evidence (treatment manuals, evaluations, and/or clinical practice) that [indicates] that the intervention[s] work?" (Dudley, 2014, p. 168).
Informant and method: Social worker (participant); face-to-face interview.
Information obtained: The proposed CPP intervention is evidence-based, and its efficacy is supported by two randomized controlled clinical trials (Lieberman & Van Horn, 2008; Lieberman, Van Horn, & Ippen, 2005).

5. Summary
a. Scenario: Implementing an Evidence-Based Manual in a Domestic Violence Intervention
Program (Dudley, 2020, p. 181).
Purpose: To assess agency staff’s concerns about evidence-based interventions and how
they have addressed these concerns in the past.
Problem: The program made assumptions that failed to address two vital promoters of the
intervention’s success. It failed to assess mothers’ readiness to process past violence
and to accept that their children were affected by the violence. It also failed to assess
whether the victims [the children] were still experiencing domestic violence (Dudley, 2020, p. 181).

b. Strengths
– The model’s efficacy is supported by two randomized controlled clinical trials (Lieberman
& Van Horn, 2008; Lieberman, Van Horn, & Ippen, 2005).
– Competent staff were able to identify and raise critical concerns.
Weaknesses
– It was assumed that participants were no longer experiencing violence at home.
– It was assumed that mothers were emotionally ready to process past
violence and to accept that their children were affected by the violence.
c. The program was implemented as planned. Its implementation led staff to raise several
concerns.
d. No client perceptions were provided.
e. Interviews, reviews of records, and focus groups provide credible information. Using
these methods in combination can help ensure that the findings and responses of
informants are valid (Community Tool Box, n.d., Chapter 3, Section 2: Understanding
and describing the community).