PERL is going to AEA!

Yes, PERL is going to Atlanta this October to present a panel of papers at the annual meeting of the American Evaluation Association. Our proposal for this panel was accepted after peer review.

Here’s a summary of the presentations, to be given by a mix of current and former graduate students at American University, all of whom were (and are) members of PERL. The panel will be chaired by me, Brian Yates, your PERL Director; I’ll also serve as Discussant.


The resource → activity → process → outcome analysis (RAPOA) model offers a flexible yet comprehensive design for programs, for evaluation of programs, and for communication of the findings of program evaluation (a generic illustration of the RAPOA chain follows the list below). The papers in this panel explore the degree to which the RAPOA integration of program, evaluation, and information design actually works in the real worlds of health and human services. Together the papers illustrate the challenges of using one design for cost-inclusive evaluation of diverse programs, including:

1. a multi-state suicide prevention hotline,

2. 13 programs for substance abuse,

3. a randomized clinical trial of a psychological versus a medical program for seasonal affective disorder, and

4. a quasi-experimental comparison of providing a parent-child interaction training program in homes versus at a centralized clinic.
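
To make the model concrete, here is a generic RAPOA chain. The entries are illustrative only, not drawn from any of the papers:

    resources (staff time, facilities, funding) → activities (e.g., sessions delivered) → processes (e.g., working alliance formed) → outcomes (e.g., symptoms reduced, costs to society avoided)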


Dr. Katheryn Ryan

Evaluation of the suicide prevention hotline was challenged by the ethical impossibility of assigning callers to a comparison service. The hotline evaluator constructed a quantitative comparison group that represented the demographic mix expected of a randomly selected comparison group, enabling a cost-benefit analysis of hotline activities as well as calculation of the cost per quality-adjusted life year (QALY) added by the hotline. Both analyses considered the effects of different assumptions regarding control conditions.
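
For readers new to this metric: in general terms (this is the standard formulation, not the specific calculation in Dr. Ryan's paper), cost per QALY divides the added cost of a program by the QALYs it adds:

    cost per QALY = (cost of hotline − cost of comparison condition) ÷ (QALYs with hotline − QALYs under comparison condition)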

Dr. Sarah Hornack

A large-scale cost-effectiveness and cost-benefit analysis of intensive inpatient substance abuse treatment programs had neither a randomly selected control group nor systematic variation in the one program component in which the funder was particularly interested: gender sensitivity of program design. Data from visits to each of 13 different sites revealed four naturally occurring groupings of gender sensitivity. The evaluation quantified the contributions of gender sensitivity, gender, and parenthood to the cost-effectiveness and cost-benefit of treatment. This more intensive and objective design used individual data on health care, income support, criminal justice involvement, and income earned from over 14,000 consumers for each of the 24 months preceding and the 24 months following treatment. It produced entirely different cost-benefit findings from those reported by previous, smaller-sample evaluations that relied on extrapolations of retrospective consumer self-reports of health care and criminal justice involvement.
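
As a quick reference (general definitions, not specifics of this study): cost-effectiveness analysis reports cost per unit of outcome, for example cost per consumer abstinent at follow-up, whereas cost-benefit analysis monetizes outcomes so that benefits and costs can be compared directly, as net benefit (benefits − costs) or as a benefit-cost ratio (benefits ÷ costs).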

near-Dr. Lana Wald (Ross)

The luxury of accessing data on program costs, activities, processes, and health-care outcomes from a longitudinally designed randomized clinical trial of alternative treatments for seasonal affective disorder was counterbalanced by problems in the distributions of health care utilization costs and in measure design. Different approaches to assessing monetary outcomes following completion of the assigned treatment took into account subsequent use of similar treatments or of the alternative treatment, resulting in different cost-benefit and cost-utility findings.

Alexis French

A demonstration program for preventing child abuse and neglect and promoting child development was designed to reach consumers in the home: consumers less likely to begin and continue participation in programs at central clinics. This multi-site program was evaluated for costs and cost-effectiveness from consumer, provider, administrative, and community perspectives. Adding elements to the cost evaluation that represented these often un- or underrepresented stakeholders was challenging, but it yielded different findings than an evaluation that included only effectiveness.
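
A generic example of what a change of perspective can do (not a finding from this evaluation): the time and transportation that parents contribute in order to participate are real costs from the consumer and community perspectives, yet they appear in no provider or administrator budget.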


© Brian T. Yates 2017