Presentation Abstracts
2015 Conference and Annual Meeting
Society for Benefit-Cost Analysis Conference 2015: Advancing the Policy Frontier
Contents
Session 1 - Thursday, March 19, 9:00 - 10:30
A.1: What Should Policy Makers (and the Public) Know about Interpreting Regulatory BCA? (Marvin 309)
B.1: Water Resources Management (Marvin 307)
C.1: Estimation of Cumulative Benefits and Costs of Regulation Using the "RegData" Database (Marvin 308)
D.1: Use of BCA in Setting Homeland Security Policy (Marvin 413-414)
E.1: Social Policy BCA: Assessing Child Welfare and Justice Programs (Marvin 310)

Session 2 - Thursday, March 19, 10:45 - 12:15
A.2: Benefit-Cost Analysis and Health Care: A Conversation with David Cutler and Sherry Glied (Marvin 309)
B.2: Decision Tools for Analyzing Uncertain Futures (Marvin 307)
C.2: Transportation: Program and Project Assessments (Marvin 308)
D.2: The Nexus between Health Effects Studies and Benefits (Marvin 413-414)
E.2: Perspectives on Implementing Benefit Cost Analysis in Climate Assessment (Marvin 310)

Session 3 - Thursday, March 19, 2:00 - 3:30
A.3: Skills for the Next Generation: A Conversation between Senior Government Economists and Public Policy School Leaders (Marvin 309)
B.3: Assessing Benefits for Policies that Reduce Health Risks (Marvin 307)
C.3: Development and Miscellaneous Regulatory Issues (Marvin 308)
D.3: The Valuation of Ecological Goods and Services in Support of Benefit-Cost Analysis (Marvin 413-414)
E.3: Climate Policy Benefits Issues (Marvin 310)

Session 4 - Thursday, March 19, 3:45 - 5:15
A.4: Estimating the Benefits of Policies that Address Addictive Goods (Marvin 309)
B.4: Benefits, Costs and Labor Markets (Marvin 307)
C.4: Benefit-Cost Practices and Discounting Issues (Marvin 308)
D.4: Electricity Sector Optimization (Marvin 413-414)
E.4: Non-Market Recreational Welfare Effects of Changes in the Diversity and Abundance of Species (Marvin 310)

Session 5 - Friday, March 20, 9:00 - 10:30
A.5: Financial Benefits of Remediation (Marvin 309)
B.5: Economics Frontiers and Benefit-Cost Analysis (Marvin 307)
C.5: Finance Issues (Marvin 308)
D.5: Economic Evaluation of Medical Interventions (Marvin 413-414)
E.5: Methodological Approaches to Benefit-Cost Analysis (Marvin 310)

Session 6 - Friday, March 20, 10:45 - 12:15
A.6: Valuing Reductions in Morbidity and Mortality (Marvin 309)
B.6: Food and Water Issues (Marvin 307)
C.6: Local Policy Issues (Marvin 308)
D.6: Real Option Value and Federal Offshore Leasing (Marvin 413-414)
E.6: International Benefit-Cost Analysis Issues (Marvin 310)

Session 7 - Friday, March 20, 2:00 - 3:30
A.7: Retrospective BCA of Federal Rules (Marvin 309)
B.7: Law and Economics Perspectives on Benefit-Cost Analysis (Marvin 307)
C.7: Preliminary Recommendations from the 2nd Panel on Cost-Effectiveness in Health and Medicine (Marvin 308)
D.7: Cost-Effective Air Quality Strategies (Marvin 413-414)
E.7: Valuing Outcomes and Performing BCA for Social Policy Intervention (Marvin 310)

Session 8 - Friday, March 20, 3:45-5:15 p.m.
A.8: Challenges and Opportunities for Economic Analysis of Risk Regulations (Marvin 309)
B.8: Retrospective Review of Federal Regulations (Marvin 307)
C.8: State and Local Benefit-Cost Issues (Marvin 308)
D.8: Assessing Benefits in Consumer Protection Regulation (Marvin 302)
E.8: The Effectiveness of Policies Involving Health Warning Labels and Signage: Cigarettes, e-Cigarettes, and Alcohol (Marvin 310)
Session 1 - Thursday, March 19, 9:00 - 10:30
 A.1: What Should Policy Makers (and the Public) Know about
Interpreting Regulatory BCA? (Marvin 309)
Chair: Susan Dudley (sdudley@gwu.edu), The George Washington University
Panelists to Include:
1. Richard Belzer (regcheck@mac.com), Regulatory Checkbook
2. Glenn Blomquist (gcblom@email.uky.edu), University of Kentucky
3. Chris Carrigan (ccarrigan@gwu.edu), The George Washington University
4. Tony Cox (tcoxdenver@aol.com), Cox Associates
5. Peter Linquiti (linquiti@gwu.edu), The George Washington University
6. Brian Mannix (BMannix@aol.com), The George Washington University
Participants in a January 2015 discussion will consider what policy makers need to know to
understand and interpret regulatory impact analyses (RIAs).
 B.1: Water Resources Management (Marvin 307)
Chair: William Wheeler (wheeler.william@epa.gov), U.S. Environmental Protection Agency
Presentations:
1. Economic Assessment of Climate Change Adaptation Pilot Studies in the Great Lakes
Region, Tess Forsell* (tess.forsell@erg.com), Eastern Research Group; National Oceanic and
Atmospheric Administration's Coastal Service Center; Horsley Witten Group, Inc.
The economic effects of flooding from extreme precipitation events are being experienced
throughout the Great Lakes region. The purpose of this study was to assess the economic costs and
benefits of green infrastructure (GI) as a method of reducing the negative effects of flooding in
Duluth, Minnesota, and Toledo, Ohio. A secondary purpose of the study was to develop an analytical
framework that can be applied in other communities to 1) assess how their community may be
impacted by flooding with increased precipitation, 2) consider the range of available green
infrastructure and land use policy options to reduce flooding, and 3) identify the benefits that can be
realized by implementing GI. Flooding modeled under current and future precipitation scenarios was
coupled with current and future land use conditions to account for increased impervious surfaces
that can further increase stormwater runoff volumes and peak flows. Next, flooding under current
and future scenarios was modeled and associated damages were estimated using assumptions
about additional flood storage that could be provided through the implementation of GI. The amount
of reduced damages associated with flood mitigation strategies is represented as “benefits” (i.e., the
difference between the economic impact of flooding without flood mitigation and the economic
impact with the implementation of flood mitigation infrastructure). Monetized benefits include:
reduced building damages; increased recreational use; reduced flood damaged land restoration
costs; and reduced storm sewer infrastructure costs. The total present value and annualized benefits
are monetized over 20 years and 50 years. In Duluth, the community where more benefits could be
monetized, the 50-year effects are estimated to be $4.17 million in costs and $4.68 million in
benefits. In this comparison, benefits exceed costs, providing evidence in favor of implementing the
GI project.
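
For readers who want the arithmetic behind "present value" and "annualized" figures of this kind, a minimal sketch in Python; the 3% discount rate is an illustrative assumption, while the $4.68 million and $4.17 million present values are the 50-year Duluth figures quoted above.

# Capital-recovery arithmetic for present-value and annualized benefits.
# The discount rate is illustrative; the two present values come from the
# abstract's 50-year Duluth estimates.
def annualized(pv, rate, years):
    """Equivalent constant annual amount for a given present value."""
    return pv * rate / (1 - (1 + rate) ** -years)

pv_benefits, pv_costs = 4_680_000, 4_170_000
rate, years = 0.03, 50

print(f"Net present value: ${pv_benefits - pv_costs:,.0f}")
print(f"Benefit-cost ratio: {pv_benefits / pv_costs:.2f}")
print(f"Annualized benefits at {rate:.0%}: ${annualized(pv_benefits, rate, years):,.0f} per year")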
2. Joint Effects of Storm Surge and Sea-Level Rise on U.S. Coasts, Lindsay Ludwig*
(lludwig@indecon.com) and James Neumann, Industrial Economics; Kerry Emanuel and Sai Ravela,
WindRisk Tech and MIT; Paul Kirshen, University of New Hampshire; Kirk Bosma, Woods Hole
Group; and Jeremy Martinich, U.S. Environmental Protection Agency
Recent literature, the US Global Change Research Program’s National Climate Assessment, and
recent events, such as Hurricane Sandy, highlight the need to take better account of both storm
surge and sea-level rise (SLR) in assessing coastal risks of climate change. This study combines
three models – a tropical cyclone simulation model; a storm surge model; and a model for economic
impact and adaptation – to estimate the joint effects of storm surge and SLR for the US coast
through 2100. The model is tested using multiple SLR scenarios, including those incorporating
estimates of dynamic ice-sheet melting, two global greenhouse gas (GHG) mitigation policy
scenarios, and multiple general circulation model climate sensitivities. The results illustrate that a
large area of coastal land and property is at risk of damage from storm surge today; that land area
and economic value at risk expands over time as seas rise and as storms become more intense;
that adaptation is a cost-effective response to this risk, but residual impacts remain after adaptation
measures are in place; that incorporating site-specific episodic storm surge increases national
damage estimates by a factor of two relative to SLR-only estimates, with greater impact on the East
and Gulf coasts; and that mitigation of GHGs contributes to significant lessening of damages. For a
mid-range climate-sensitivity scenario that incorporates dynamic ice sheet melting, the approach
yields national estimates of the impacts of storm surge and SLR of $990 billion through 2100 (net of
adaptation, cumulative undiscounted 2005$); GHG mitigation policy reduces the impacts of the mid-range climate-sensitivity estimates by $84 to $100 billion.
3. Assessing the Distributional Consequences of Premium and Claims Payments in the
National Flood Insurance Program, Okmyung Bin* (bino@ecu.edu) and John A. Bishop, East
Carolina University; Carolyn Kousky, Resources for the Future
This study examines the redistributional effects of the National Flood Insurance Program (NFIP), i.e.
who benefits and who bears the costs of the NFIP, using a national database of premium, coverage,
and claim payments at the zip code level between 2001 and 2009. Some argue the program
provides an important benefit to low income households living in low lying areas in communities like
those along the Mississippi River system, while others believe that it acts as a subsidy to the wealthy
owners of beach homes. A recent study, based on more than 25 years of the NFIP premium and
claims data to determine how the program’s price and payouts correlate to per-capita county
income, finds no evidence that the NFIP disproportionally advantages richer counties (Bin, Bishop,
and Kousky, Public Finance Review 2012). Although such a finding is a useful first-order assessment,
more detailed analysis is warranted since claims payments tend to be concentrated on a few policies,
such as repetitive-loss properties. Our findings, based on more disaggregated data, should help
insurance practitioners and policy makers make informed policy decisions regarding the flood
insurance program.
4. Economic Evaluation of Community Water Fluoridation: A Community Guide Systematic
Review, Tao Ran* (xgy2@cdc.gov), Sajal Chattopadhyay and Randy Elder, U.S. Centers for
Disease Control and Prevention
A previous systematic review of the effectiveness of community water fluoridation (CWF) showed that
it reduced dental caries across populations, and a 2002 economic review found from a societal
perspective that CWF saved money. However, the effectiveness of CWF has decreased from
around 50% in the 1970s to around 25% in the 1990s. Re-examining the benefits and costs of CWF
is therefore necessary. Using methods developed for the Guide to Community Preventive Services
economic reviews, 564 papers were identified from January 1995 to November 2013. Ten studies
were included in the current review, with four covering intervention benefits only and another six
providing both cost and benefit information. Additionally, two of the six studies analyzed the cost-effectiveness of CWF. For all four benefit-only studies, dental treatments in various forms decreased
with the presence of CWF. For the remaining six studies, per capita annual intervention cost ranged
from $0.11 to $4.89 in 2013 U.S. dollars (without an outlier). Variation in costs was mainly caused by
community population size, with decreasing cost associated with increasing community population.
Per capita annual benefits in the six studies ranged from $5.45 to $139.78. Variation in benefits was
mainly due to the numbers and types of benefit components. The benefit-cost ratio ranged from 1.12:1 to 135:1 and was positively associated with community population size; in all studies, the economic benefit of CWF exceeded the intervention cost.
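
To make the link between the per capita figures and the reported ratios concrete, a minimal sketch; the $1.00 cost and $20.00 benefit below are hypothetical values chosen inside the reported ranges, not a pairing taken from any single study.

# Hypothetical per capita annual cost and benefit for one community, chosen
# within the reported ranges ($0.11-$4.89 cost, $5.45-$139.78 benefit, 2013 USD).
per_capita_cost = 1.00
per_capita_benefit = 20.00

bcr = per_capita_benefit / per_capita_cost
print(f"Benefit-cost ratio: {bcr:.1f}:1")
print(f"Net benefit per person per year: ${per_capita_benefit - per_capita_cost:.2f}")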
 C.1: Estimation of Cumulative Benefits and Costs of Regulation Using
the "RegData" Database (Marvin 308)
Chair: James Broughel (jbroughel@mercatus.gmu.edu), George Mason University
Presentations:
1. RegData: A Numerical Database on Industry-Specific Regulations for All U.S. Industries
and Federal Regulations, 1997-2012, Patrick McLaughlin* (pmclaughlin@mercatus.gmu.edu) and
Omar Al-Ubaydli, George Mason University
We introduce RegData, formerly known as the Industry-specific Regulatory Constraint Database.
RegData annually quantifies federal regulations by industry and by regulatory agency for all federal
regulations from 1997 to 2012. The quantification of regulations at the industry level for all industries
is without precedent. RegData measures regulation for industries at the two-, three-, and four-digit
North American Industry Classification System (NAICS) levels. We created this database using text
analysis to count binding constraints in the wording of regulations, as codified in the Code of Federal
Regulations, and to measure the applicability of regulatory text to different industries. We validate
our measures of regulation by examining known episodes of regulatory growth and deregulation as
well as comparing our measures to an existing, cross-sectional measure of regulation. We then
demonstrate several plausible relations between industry regulation and variables of economic
interest. Researchers can use this database to study the determinants of industry regulations and to
study regulations’ effects on a massive array of dependent variables, both across industries and
across time.
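
To make the text-analysis step concrete, a minimal sketch of restriction counting of the kind RegData performs on Code of Federal Regulations text; the sample sentence is invented, and RegData's industry-applicability measurement is not shown.

import re
from collections import Counter

# Binding-constraint terms of the kind counted in regulatory text.
RESTRICTION_TERMS = ["shall", "must", "may not", "required", "prohibited"]

def count_restrictions(text):
    """Count each restriction term in a block of regulatory text."""
    text = text.lower()
    return Counter({term: len(re.findall(r"\b" + re.escape(term) + r"\b", text))
                    for term in RESTRICTION_TERMS})

sample = ("Each operator shall maintain records and must submit an annual report. "
          "Discharges are prohibited except as required by the permit.")
counts = count_restrictions(sample)
print(dict(counts))
print("Total restrictions:", sum(counts.values()))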
2. Estimating Industry- and Agency-Specific Cumulative Costs and Benefits with RegData,
Antony Davies* (antony@antolin-davies.com), Duquesne University; Patrick McLaughlin, George
Mason University
Using RegData 2.0, we exploit variation in cumulated regulation across industries, agencies, and
time to compare the cost of regulatory accumulation, in terms of lost productivity, to its benefits,
in terms of outcomes achieved. While agencies are sometimes required to estimate costs and
benefits of proposed regulations prior to those regulations being enacted, we are unaware of any ex-post analyses that look at the costs and benefits that actually accrued from regulatory accumulation.
We examine a set of major agencies for which the likely desired outcome of regulations is generally
known. For example, the paper will compare an estimate of the lost productivity due to OSHA
regulations to the likely desired outcome--improvements in workplace safety, as reflected in
workplace illness, injury, and fatality data. The results will inform both retrospective and prospective
review efforts by presenting a credible, empirical methodology for estimating the cumulative impact
of regulation (positive or negative).
3. The Aggregate Cost of Regulations: A Structural Estimation of a Tractable Multi-Sector
Endogenous Growth Model, Bentley Coffey* (bentleygcoffey@gmail.com), University of South
Carolina; Patrick McLaughlin; Pietro Peretto, Duke University
We estimate the effects of federal regulation on industry-specific value-added to GDP using
RegData 2.0 for a panel of 42 industries over 35 years (1977 – 2011). Our estimation is performed
within the structure of a Schumpeterian model of endogenous growth, which produces closed-form
solutions despite the complications inherent in its multi-sector dynamic general equilibrium structure.
To capture the effect of regulations on firms, we treat regulations as constraints in firms’ production
processes that raise fixed costs and decrease the firm’s productivity. We then estimate the
parameters of this model using national and sector-specific macroeconomic data joined with
RegData 2.0, which measures the incidence of regulations on industries based on text analysis of
federal regulatory code. With estimates of the model’s parameters fitted to real data, we can
confidently conduct counter-factual experiments on alternative regulatory environments and discuss
the policy implications of our findings.
4. Does Regulation Enhance or Inhibit Turnover of Firms by Industry? Thomas Stratmann*
(tstratma@gmu.edu), Matt Mitchell and Patrick McLaughlin, George Mason University
A large body of research suggests that churn—the turnover of top firms within an industry—is the
mark of a competitive, dynamic, and healthy economy. Among other things, churn has been linked to
technological innovation, competitive pricing, and economic growth. The economic theory of
regulation offers ambiguous predictions about the relationship between government regulation and
churn. Regulation may be a disruptive force, breaking up firms and discouraging integration (Posner
1971). Or, it may be a monopolizing force, erecting barriers to entry (Stigler 1971). We employ
RegData 2.0, a new dataset tracking regulatory trends by industry and agency over time, to test the
relationship between regulation and churn across 211 U.S. industries over the time period 1997-2011. We show that, on average, the accumulation of regulation specific to an industry reduces the
churn of that industry—implying a hidden but substantial economic cost of regulatory
accumulation. Our results are consistent with the ideas that regulations create barriers to entry,
protect incumbent firms, and are disproportionately costly on small entities such as new firms and
start-ups.
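
A minimal sketch of the kind of industry-by-year panel regression such a test might use; the toy data, column names, and the two-way fixed-effects specification are illustrative assumptions, not the authors' actual model.

import pandas as pd
import statsmodels.formula.api as smf

# Toy industry-by-year panel: a churn measure and a RegData-style count of
# industry-specific regulatory restrictions (all values invented).
df = pd.DataFrame({
    "industry": ["3361"] * 3 + ["3254"] * 3 + ["5415"] * 3,
    "year": [1997, 1998, 1999] * 3,
    "churn": [0.21, 0.18, 0.17, 0.30, 0.29, 0.27, 0.25, 0.24, 0.22],
    "restrictions": [12000, 12400, 12900, 8000, 8050, 8300, 3000, 3600, 3650],
})

# Industry and year dummies absorb time-invariant industry traits and
# economy-wide shocks; the coefficient of interest is on restrictions.
model = smf.ols("churn ~ restrictions + C(industry) + C(year)", data=df).fit()
print(model.params["restrictions"])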
 D.1: Use of BCA in Setting Homeland Security Policy (Marvin 413-414)
Chair: Tony Homan (Anthony.homan@dot.gov), U.S. Department of Transportation
Presentations:
1. The Social Value of Cybersecurity, Daniela Silitra* (dsilitra@mitre.org) and Haeme Nam,
MITRE Corporation
Cybersecurity has been a prominent topic in recent years, and its broad scope has prompted studies in many areas. Measuring social value has likewise become an increasingly accepted practice, especially in a period of unprecedented budget cuts. This paper attempts to measure the social value of cybersecurity in two areas using a value-added measurement methodology: the national economy and corporate America. Value-measurement criteria, such as the number of cybersecurity jobs created or revenue losses avoided from cyber-attacks, will be identified for each area and then assessed accordingly. An extension of the study will assess the social value of cybersecurity to the public at large. Phase I assesses the social value of cybersecurity to the national economy and corporate America; Phase II, which assesses its social value to the public at large, will be presented next year.
2. A Literature Review and Proposed Method of Measuring a Reduction in Vulnerability,
Alex Moscoso* (Alex.Moscoso@tsa.dhs.gov), U.S. Transportation Security Administration
According to the National Infrastructure Protection Plan 2013, risk is defined as a function of threat,
vulnerability, and consequence. This relationship is used by Federal government agencies within
the Department of Homeland Security to assess risks associated with certain terrorist attack
scenarios, including in benefit-cost analyses for rulemaking. In developing a standard methodology to
quantify risk, it is necessary to quantify the threat to a target, its vulnerability, and the consequence
of a successful attack. While estimating the economic consequences of a successful attack can be
done by using the standard value of statistical life, costs of injuries, and property damage, accurate
quantitative measurements for threat and vulnerability are more elusive. With regard to rulemaking,
economists seek to measure the reduction in vulnerability from the introduction of certain mitigation
measures. Quantifying the effectiveness of a specific mitigation measure used to protect the
homeland is a difficult task for many reasons: (1) an individual measure is usually part of a vast
security system where changes to one component affects other interconnecting components to
varying degrees; (2) measuring individual components’ cascading effects through a layered security
system proves challenging; and (3) while technology effectiveness can be tested, tracing the impacts
of a policy is difficult. This research will present findings from a literature review on the current
methods of measuring vulnerability. It will also present a working method of measuring vulnerability
using available data, fault tree analysis to map large security systems, and Monte Carlo simulations
to replicate terrorist attack scenarios. This research will further the discussion on quantifying
vulnerability reduction by examining what has already been accomplished in the field and proposing
a method based on those accomplishments.
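
As a rough illustration of the fault-tree and Monte Carlo machinery the abstract points to, a minimal sketch; the layer names, detection probabilities, and the series structure of the "fault tree" are hypothetical, not drawn from TSA data.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layered security system: an attack succeeds only if it evades
# every layer (a simple series fault tree). Probabilities are placeholders.
layers_baseline = {"perimeter": 0.30, "screening": 0.60, "onboard": 0.20}
layers_with_rule = {"perimeter": 0.30, "screening": 0.75, "onboard": 0.20}

def vulnerability(layers, n_trials=100_000):
    """Estimate P(attack evades all layers) by Monte Carlo simulation."""
    detect = np.array(list(layers.values()))
    evaded = rng.random((n_trials, detect.size)) > detect   # one row per simulated attack
    return evaded.all(axis=1).mean()

v0, v1 = vulnerability(layers_baseline), vulnerability(layers_with_rule)
print(f"Baseline vulnerability: {v0:.3f}")
print(f"With mitigation measure: {v1:.3f}")
print(f"Estimated reduction in vulnerability: {v0 - v1:.3f}")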
3. Estimating Benefits of Maritime Safety Training Programs, Ali Gungor*
(ali.gungor@uscg.mil), U.S. Coast Guard
U.S. merchant mariners and their employers spend significant time and money on safety training
programs each year due to international conventions or simply following best industry practices. In
2013, the United States Coast Guard (USCG) published a final rule that brings additional training
requirements with significant costs to the mariners and the industry overall following the international
standards set by the International Maritime Organization’s Standards of Training, Certification and
Watchkeeping (STCW) Convention and their amendments of 1995 and 2010. Against significant
annual costs that had already been incurred since 1997 and more to be incurred after the publication
of this final rule, however, USCG did not estimate any quantifiable or monetized benefits that could
be attributed to maritime safety training. Rather, the regulatory impact analyses since 2011 have relied on detailed break-even analyses, transfer benefits, and qualitative benefits, among other benefit-estimation methods. This presentation discusses the challenges of estimating the benefits of maritime safety training programs over the last two decades. In particular, subject matter experts have attempted to answer this question: “does safety training save lives?” and the follow-up question: “if yes or maybe, how do you quantify or monetize them?”
4. Estimating the Cumulative Impact of Coast Guard Regulations Under Executive Order
13563, Rosemarie Odom* (rosemarie.a.odom@uscg.mil), Paul Large and Ali Gungor, U.S. Coast
Guard
In Executive Order 13563 (January 18, 2011), agencies are directed to tailor regulations to take into
account (to the extent practicable) cumulative costs of regulations. As a tool to inform the analysis
of cumulative costs of regulation, the Coast Guard has developed a Cumulative Impacts Database (CID),
which contains cost and benefit information on the final regulatory actions the Coast Guard has
promulgated since 1993. The CID allows the Coast Guard to aggregate the estimated costs of its
regulations by diverse factors. The aggregated data indicates that 92% of the cost of Coast Guard
regulations over the past 20 years results from two statutory mandates: the Oil Pollution Act of 1990
and the Marine Transportation Security Act. This presentation describes and provides examples of
the contents of the Cumulative Impacts Database and summarizes some of the key findings. The
presentation also discusses the limitations of using the results from the database, particularly the
challenge of applying the aggregate results to individual owners and operators.
 E.1: Social Policy BCA: Assessing Child Welfare and Justice
Programs (Marvin 310)
Chair: Stuart Shapiro (stuartsh@rutgers.edu), Rutgers University
Discussant: Brian Bumbarger (bkb10@psu.edu), Pennsylvania State University
Presentations:
1. A Cost-Benefit Analysis of the 2009 Reform of the Rockefeller Drug Laws in New York
City, Joshua Rinaldi* (jrinaldi@vera.org), Vera Institute of Justice
The 2009 drug law reforms (DLR) in New York State changed how drug crimes were processed in
the New York City criminal justice system by removing mandatory minimum sentences for
defendants facing a range of felony drug and property charges. It also created new options to divert
defendants to drug treatment as an alternative to incarceration. In addition to implementation and
impact evaluations, the Vera Institute of Justice conducted a cost-benefit analysis (CBA) to explore
the economic implications of DLR in New York City.
The CBA computes costs and benefits based on Vera’s impact evaluation, which used
administrative records from multiple city and state agencies to track outcomes for cases disposed
during two equivalent time periods, pre- and post-DLR. Propensity Score Matching (PSM) was used
to select comparable samples, controlling for baseline differences in case and individual level
characteristics.
Costs and benefits were measured from the perspectives of taxpayers and victims for a three-year
follow-up period post arrest. Taxpayer costs include law enforcement, courts, jail, prison, probation,
parole, and drug treatment. Victimization costs were measured by calculating the reform’s impact on
the tangible and intangible costs of crime. This presentation will describe the cost-benefit
methodology, results, and implications for policy-making.
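
For readers unfamiliar with the matching step, a minimal sketch of one-to-one propensity score matching of the general kind described; the covariates, the logistic-regression score model, and the nearest-neighbor rule are illustrative assumptions, not Vera's actual specification.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def propensity_match(X, treated):
    """One-to-one nearest-neighbor matching on the estimated propensity score.
    X: (n, k) baseline covariates; treated: boolean array of length n.
    Returns the index of the matched comparison case for each treated case."""
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    nn = NearestNeighbors(n_neighbors=1).fit(ps[~treated].reshape(-1, 1))
    _, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
    return np.flatnonzero(~treated)[idx.ravel()]

# Toy example: 200 synthetic cases with two baseline covariates.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
treated = rng.random(200) < 1 / (1 + np.exp(-X[:, 0]))
matches = propensity_match(X, treated)
print(f"{treated.sum()} post-reform cases matched to {len(matches)} comparison cases")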
2. Doing Well, While Doing Good: A Benefit Cost Analysis of Private Foundation
Investment in a Social Bond Impact Program to Reduce Recidivism, Joseph Cordes*
(cordes@gwu.edu), The George Washington University; William Winfrey, HCM Strategists/EPI
Nonprofit foundations are increasingly turning to benefit-cost analysis as a means of evaluating the
impact of their grants. A large nonprofit foundation located in the Northeastern United States has
participated in a large social impact bond program, investing $1.5 million out of a total of $27 million
in a project intended to reduce recidivism.
Our paper uses benefit cost analysis, undertaken from the perspective of the foundation, to evaluate
the impact of the foundation’s investment in the social impact bond experiment. The first step in the
evaluation is to undertake a benefit cost analysis of the intervention itself. Although the analysis
draws on standard practices for defining and measuring the social costs and benefits of the
intervention, the issue of which discount rate to use in the analysis is less well-defined. The options
include: the social discount rate used to evaluate the program if it were undertaken in the public
sector; a discount rate based on the time preference of the foundation; or a discount rate reflecting
the opportunity cost to the foundation of investing its funds in the social impact bond program. We
explore this question by formulating a simple model of a foundation’s “social investment
problem”. We show that the appropriate discount rate will depend on: (a) the foundation’s objective
(social welfare) function, as defined by its mission; (b) the financial return to the foundation’s
endowment, and (c) tax rules governing foundation payout.
An additional issue that needs to be addressed is that of estimating the impact of the specific
foundation’s participation as one of several entities providing funds for the intervention. Our
approach builds on that suggested by Brest et al. (2004). The results of our base case analysis
indicate that participating in the social impact bond experiment is justified, even at discount rates that
are higher than would be used to evaluate an equivalent project financed with public funds. We also
undertake a Monte Carlo simulation to explore the robustness of these results to different values for
components of social benefits and costs.
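
To illustrate the sensitivity the authors examine, a minimal sketch; the up-front $1.5 million matches the investment mentioned above, but the benefit flows, candidate discount rates, and the uniform uncertainty range are hypothetical, not values from the paper.

import numpy as np

def npv(cash_flows, rate):
    """Net present value of a cash-flow stream, with the year-0 outlay first."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical profile: $1.5M invested up front, five annual benefit flows.
cash_flows = [-1_500_000, 300_000, 350_000, 400_000, 400_000, 400_000]

# Candidate discount rates: a social rate, a time-preference rate, and an
# opportunity-cost-of-endowment rate (all illustrative).
for label, r in [("social", 0.03), ("time preference", 0.05), ("endowment return", 0.07)]:
    print(f"NPV at {label} rate ({r:.0%}): ${npv(cash_flows, r):,.0f}")

# Simple Monte Carlo on the benefit flows to probe robustness of the sign of NPV.
rng = np.random.default_rng(42)
draws = [npv([cash_flows[0]] + list(rng.uniform(0.7, 1.3, 5) * cash_flows[1:]), 0.05)
         for _ in range(10_000)]
print(f"P(NPV > 0 at 5%): {np.mean(np.array(draws) > 0):.2f}")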
3. Cost-Benefit Analysis of Supportive Housing for Child Welfare Involved Families, Josh
Leopold* (jleopold@urban.org), Mary Cunningham and Mike Pergamit, Urban Institute
This presentation will focus on practical challenges of designing and implementing a benefit-cost
analysis for a housing intervention targeting high-needs families with involvement in multiple
systems. The Supportive Housing for Child-Welfare Involved Families: a Research Partnership
(SHARP) evaluation is a randomized controlled trial in five demonstration sites: Memphis, TN, Cedar
Rapids, IA, Broward County, FL, San Francisco, CA, and Connecticut. The demonstration, funded
by the Children’s Bureau, a division of the U.S. Department of Health and Human Services, provides
supportive housing (permanent housing paired with case management and voluntary services) to
families with a history of homelessness and child welfare involvement. The costs of the intervention
are expected to be higher than the services provided through usual care by the child welfare system.
However, if the program works as intended it is expected to reduce utilization of homeless and child
welfare services and produce long-term benefits in child and adult well-being and productivity. The
Urban Institute, with support from Dr. Bob Plotnick at the Evans School of Public Affairs, is
conducting a benefit-cost analysis of the demonstration to determine whether, and under what
conditions, the benefits of the intervention outweigh the costs of producing them. The analysis will
distinguish between costs and benefits that accrue directly to families as well as publicly-funded
systems (local, state, and federal) and society at-large. For the primary cost domains—
homelessness, child welfare, and supportive housing—the evaluators will use the “ingredients
method” to estimate actual unit costs through direct data collection. For secondary domains, such as
public benefits, health care, and education, the evaluators will rely on the literature to estimate unit
costs. During this presentation, the evaluators will outline the methods being used for the benefitcost analysis to determine unit costs and utilization and solicit participant feedback on how to
address anticipated challenges.
Session 2 - Thursday, March 19, 10:45 - 12:15
 A.2: Benefit-Cost Analysis and Health Care: A Conversation with
David Cutler and Sherry Glied (Marvin 309)
Chair: Amber Jessup (Amber.Jessup@HHS.GOV), U.S. Department of Health and Human Services
Panelists to Include:
1. David Cutler (dcutler@harvard.edu), Harvard University
2. Sherry Glied (sherry.glied@nyu.edu), New York University
Health care expenditures account for almost 20 percent of the U.S. gross national product and are
among the fastest growing components of the Federal budget. Yet we rarely see benefit-cost
analysis used to support related policy decisions or decisions to subsidize particular treatments. This
session involves a moderated discussion with two leading health care experts on the role of benefit-cost analysis in health care, including both its advantages and limitations in this context.
 B.2: Decision Tools for Analyzing Uncertain Futures (Marvin 307)
Chair: Susan Dudley (sdudley@gwu.edu), The George Washington University
Panelists to Include:
1. Chris Carrigan (ccarrigan@gwu.edu), The George Washington University
2. Tony Cox (tcoxdenver@aol.com), Cox Associates
3. Heidi King (heidi.king@mac.com), GE Capital
4. Peter Linquiti (linquiti@gwu.edu), The George Washington University
5. Brian Mannix (BMannix@aol.com), The George Washington University
6. Anne Smith (anne.smith@nera.com), NERA
This interdisciplinary panel will explore the key issues and best practices for understanding and
responding to uncertain, distant, global events. What types of events/risk should policy makers be
concerned about? Can benefit-cost analysis (BCA) be improved as a decision-making tool applied to
potentially significant, global future risks to wellbeing? What are best practices for addressing
uncertainty and understanding risk? How should policy makers think about risk management? What
are the challenges and possible techniques for discounting different outcomes?
 C.2: Transportation: Program and Project Assessments (Marvin 308)
Chair: Doug Schleffler (Douglas.W.Scheffler@uscg.mil), U.S. Coast Guard
Discussant: Ryan Endorf (ryan.endorf@dot.gov), U.S. Department of Transportation
Presentations:
1. Social Welfare Analysis of Investment Public-Private Partnership Approaches for
Transportation Projects, R. Richard Geddes* (rrg24@cornell.edu), Omid Rouhani, and H. Oliver
Gao, Cornell University; Germà Bel, University of Barcelona
This paper has two objectives: (1) to introduce a new approach to gaining widespread support for
comprehensive road pricing; and (2) to develop a detailed social welfare analysis for road pricing
schemes. We first describe a new approach to garnering support for system-wide road pricing, which
we refer to as an investment public-private partnership, or IP3. This approach returns a significant
portion of the economic value created by road pricing back to its citizen-owners. Next, we present a
social welfare framework that estimates the benefits and costs of using the IP3 approach on an
urban transportation network. Policy makers typically evaluate public-private partnership (P3)
projects using Value for Money (VfM) analysis. However, a P3 project’s impact on overall social
welfare provides a more comprehensive evaluation criterion. Apart from several theoretical studies,
a detailed social welfare analysis that includes all major P3 project stakeholders is lacking. Using
Fresno City’s transportation system as our case study, we show that system-optimal tolling
scenarios favor average users, but that government—and consequently taxpayers—would pay for
costly tolling systems. In contrast, unlimited profit–maximizing tolls raise substantial profits for
government, for the infrastructure’s citizen-owners, and for the private sector, but the average user is
worse off. From a social welfare perspective, one should search for a Pareto-improvement under
which all major stakeholders are better off. Our estimates indicate that a mixed private and public
tolling scheme offers such an improvement. A mixed scheme results in the highest social welfare
among all scenarios unless the weight placed on motorists’ (i.e., transportation users’) welfare is
very low or the weight placed on residents’ welfare is very high relative to the weight of other
stakeholders.
2. Marine Transportation Delays: Lock-Closure Case Study Meta-Analysis, Kathryn Connelly*
(Kathryn.A.Connelly@uscg.mil), Rolling Bay; Michael Trombley, U.S. Coast Guard
Starting in 2005, the United States Army Corps of Engineers’ Navigation Economic Technologies
Program compiled a series of reports detailing shipper and carrier responses to maintenance-related
lock closures on shipping channels. Under a meta-analysis framework, we evaluated four of these
case studies with an emphasis on information that would be useful in a cost benefit analysis
environment. We used the information gathered from the case studies' industry surveys to put
together the estimated costs to industry of lock closures based on the environment of the lock and
the circumstance of closure. In an attempt to both understand and elucidate the underlying common
structure of delay and industry-related costs, we then constructed a framework that we sought
to apply across the case studies. Furthermore, we used our findings to compare the costs across
incidents to garner insight into the industry’s daily activities with respect to efficiency of freight
movement and the optimization of resource allocation. Through our analysis we discovered that one
of the most influential factors on cost of delay was preparation and anticipation of a closure.
Understanding industry's preparation for and anticipation of a lock closure can help predict how industry will react to vessel incidents that close waterways; these results can inform safety policies and how the USACE and industry mitigate impacts from lock closures.
3. Use of Economic Evaluation Methods for Transportation Project Appraisal: Application
of a 3-Dimensional Space-Time Framework, Glen Weisbrod* (gweisbrod@edrgroup.com),
Economic Development Research Group, Inc.
The systematic analysis of economic benefits, costs and impacts has an important role to play in
infrastructure planning. But all too frequently, decision-making relies upon less systematic judgment
calls, in part because of confusion regarding what seem to be competing approaches featuring
benefit-cost, economic impact and financial analysis methods. Transportation planners see
proponents advocating for specific techniques, even reaching to show how each analysis technique
can address issues usually in the domain of the other – such as the inclusion of non-pecuniary social
welfare impacts in macroeconomic models, or the inclusion of economic geography shifts in benefit-cost studies. However, the single-method approaches are ultimately seen as inadequate because
they cannot address all of the multi-faceted information requirements of various stages of the
planning process. This presentation (and an accompanying paper) will present a critical review of
the application of economic analysis techniques – benefit-cost analysis, economic impact analysis,
and financial cash flow analysis – for transportation infrastructure decision-making. It will do so by
presenting a formal framework for viewing these various economic analysis techniques in terms of
how they differently cover a three dimensional universe of space, time and impact elements. It will
then show how the various analysis techniques can be matched to the stages of planning, prioritizing
and funding projects, along with their associated stakeholder issues and spatial and temporal
information requirements. There are also implications of this framework for the valuation of economic
benefits and costs, as they are affected by the breadth of study areas, time periods and impact
elements to be covered. Actual cases from statewide transportation planning studies will be used to
illustrate these points. The presentation will end by showing examples of how a unified analysis
framework can reinforce the complementary aspects of benefit-cost, economic impact and financial
analysis for effective transportation decision-making.
 D.2: The Nexus between Health Effects Studies and Benefits (Marvin
413-414)
Chair: Randall Lutter (randall.lutter@virginia.edu), Resources for the Future and University of
Virginia
Discussant: Arnold Harberger (harberger@econ.ucla.edu), UCLA
Presentations:
1. An Objective and BCA-Compliant Definition of 'Adverse Effect', Richard Belzer*
(regcheck@mac.com), Regulatory Checkbook
In virtually every case, benefit-cost practitioners in environmental health rely on the outputs of risk
assessment as inputs for benefit (and sometimes cost) estimation. This practice has numerous
deficiencies, most notably that human health risk assessments are incompatible with BCA because
they are neither intended nor performed for the purpose of generating unbiased risk estimates. As a
2004 USEPA staff paper put it, “EPA seeks to adequately protect public and environmental health by
ensuring that risk is not likely to be underestimated” (emphasis in original) and therefore, “EPA’s
policy is that risk assessments should not knowingly underestimate or grossly overestimate risks.”
This means every EPA risk assessment is intentionally biased, and furthermore, risks cannot be
compared because the amount of embedded bias is variable and unknown. This paper extends the
literature by identifying, and proposing a science-based correction for, an even more fundamental
deficiency in human health risk assessment: the definition of adverse effect. Virtually every human
health risk assessment includes at least one biological endpoint that its authors have defined as
adverse. However, neither the technical fields of toxicology and epidemiology nor the practical field
of risk assessment has an objective definition of adverse effect. While some endpoints are
unambiguously adverse (e.g., mortality, cancer), at the margin adversity is determined subjectively
by the scientists and practitioners who perform risk assessments. These determinations often are
highly controversial because they incorporate the personal policy judgments of scientists and risk
assessment practitioners, or of the institutions for which they work. A fairly simple remedy is available,
one that grounds the definition of adverse effect in science and conveniently is also compatible with
benefit-cost analysis. An adverse effect should be defined as any human health condition that a
consumer is willing to pay to avoid. Two corollaries are self-evident: (1) any effect a consumer would pay to experience is per se beneficial; (2) an effect a consumer would pay nothing to experience or avoid is neither beneficial nor adverse. This presentation will explain
the logic behind the proposal and identify several technical and practical challenges that must be
addressed to implement it.
2. On the Importance of Discarded Inter-Maneuver Variance in the Estimation of Benefits
from Reduced Exposure to Ambient Air Pollutants, R. Jeffrey Lewis*
(r.jeffrey.lewis@exxonmobil.com), ExxonMobil Biomedical Sciences; Richard Belzer, Regulatory
Checkbook
Estimates of the health benefits of air pollution regulations depend substantially on quantitative risk
assessments. These risk assessments, in turn, often depend on forced vital capacity (FVC) and
forced expiratory volume (FEV1) data obtained in chamber studies or from observational
epidemiology. All studies in the field that regulatory authorities consider reliable use a research
protocol developed by the American Thoracic Society (e.g., Miller et al., 2005). That protocol calls
for obtaining three to eight clinically acceptable FVC and FEV1 measurements (called “maneuvers”).
However, published studies report and analyze only a single value (often the mean) representing
these multiple clinically acceptable maneuvers. In short, the inherent variability across maneuvers
within each FVC/FEV1 test is routinely discarded. This paper explores the consequences of
discarding this variability. It is hypothesized that statistical comparisons of individual differences in
pulmonary function, across chamber tests or ambient concentrations, would be substantially different
if within-test inter-maneuver variability were retained and included in the data analysis. Ceteris
paribus, discarding variability is expected to artificially increase calculated statistical significance,
resulting in non-significant differences in pulmonary function being incorrectly characterized as
statistically significant. Non-significant population-level differences in pulmonary function also will be
incorrectly described as statistically significant. If these hypotheses are confirmed, scientific
expressions of confidence that observed associations are causal would have to be downgraded.
That would have important consequences for the estimation of health benefits resulting from small
changes in ambient pollutant concentrations.
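
A minimal simulation of the statistical point at issue; the number of subjects, maneuvers per test, and variance components are arbitrary illustrative choices. It shows how maneuver-level noise propagates into per-test means even when there is no true change, information that is invisible once only the means are reported.

import numpy as np

rng = np.random.default_rng(7)
n_subjects, n_maneuvers = 30, 3
sigma_between, sigma_within = 0.40, 0.15   # liters; illustrative variance components

true_fev1 = rng.normal(4.0, sigma_between, n_subjects)

def session():
    """One test session: each subject performs n_maneuvers acceptable maneuvers."""
    return true_fev1[:, None] + rng.normal(0, sigma_within, (n_subjects, n_maneuvers))

# Two sessions per subject with NO true exposure effect.
pre, post = session(), session()

# Conventional analysis keeps only the per-session mean of the maneuvers.
change_in_means = post.mean(axis=1) - pre.mean(axis=1)

# With a zero true effect, the spread of these "changes" is pure maneuver noise,
# whose expected standard deviation is sqrt(2 * sigma_within**2 / n_maneuvers).
print(f"Observed SD of per-subject change: {change_in_means.std(ddof=1):.3f} L")
print(f"SD implied by maneuver noise alone: {np.sqrt(2 * sigma_within**2 / n_maneuvers):.3f} L")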
3. Valuing Mortality Risk Reductions from Traffic Accidents and Air Pollution
Simultaneously, Luis Cifuentes* (luisabdoncifuentes@gmail.com), L. Rizzi, C. Cabrera, M. Browne,
and P. Iglesias, Universidad Católica de Chile
Since 1974, all public infrastructure investment decisions in Chile have been subject to a formal cost-benefit
analysis. Costs are quantified using social prices. Non-monetary benefits have historically
considered mainly travel time saved by the population. The responsibility for conducting these
analyses and proposing the methods and key parameters (such as the social discount rate) rests with
the Ministry of Social Development. In 1994, the General Environment Law was enacted,
establishing the basic environmental management instruments (environmental quality standards,
emission standards, pollution abatement plans, among others) available to the authority. The law
requires that the application of any of these instruments undergo an assessment of their economic
and social impacts. Though not explicitly required, many of these assessments have included a cost-benefit analysis. The Ministry of Environment performs these assessments. A key element for the
results of both types of analyses is the social willingness to pay for reductions of premature mortality
risks arising from reduction in fatal accidents, and from ambient air pollution reduction. This
presentation reports the design, application and results of a discrete choice survey to elicit WTP for
reductions in mortality risks from traffic accidents and from cardiovascular disease attributable to air
pollution. The survey was applied to a representative sample of the Chilean population. In pilot tests,
low-education segments of the population were found to have limited capacity to understand the risks
themselves, much less their reduction, so the instrument had to be simplified. Since
the study was developed under the auspices of the Ministry of the Environment in coordination with
the Ministry of Social Development, it is expected that the results of the survey will help public
regulators better evaluate decisions regarding infrastructure and environmental quality.
 E.2: Perspectives on Implementing Benefit Cost Analysis in Climate
Assessment (Marvin 310)
Chair: Fran Sussman (fsussman@rcn.com), ICF International
Presentations:
1. State of the Literature on Economic Impacts and Adaptation at the Sectoral Level in the
U.S., James Neumann* (jneumann@indecon.com), Industrial Economics; Kenneth Strzepek,
Massachusetts Institute of Technology
This paper discusses the current literature on impacts and adaptation costs at the sectoral
level. The focus is primarily the U.S., but includes examples on international approaches where they
highlight key differences or other relevant demonstrations of method and data use. The paper
provides an overall framework that addresses the components of economic impacts, including
definitions of impacts, adaptation costs and residual damages. The paper then focuses on
understanding the current breadth and depth of the literature that exists to characterize what we
know about each key sector (agriculture, coastal resources, water resources, infrastructure, forestry,
health, recreation, energy, urban resources, and ecosystems), what is the geographic coverage,
how do methodologies differ, what are the gaps and challenges, and a sense of the impacts at the
U.S. national level. A new generation of impact studies, including the USEPA’s ongoing Climate
Impacts Risk Assessment (CIRA) project; the new IPCC AR5 Working Group II report; the US
National Climate Assessment; and the “Risky Business” effort led by the Next Generation
Foundation, provide the motivation for this review. These efforts, taken together, have advanced
the state of US economic impact assessment work along two critical frontiers, both of which support
benefit-cost analyses of climate change: assessment of the risk and economic consequences of
extreme climatic events; and assessment of ecosystem effects. Yet the latest work also highlights gaps in what should be comprehensive sectoral coverage, the need for more complete incorporation of adaptation opportunities in impact assessment, and critical cross- and multi-sectoral effects that remain poorly understood.
2. Improving the Practice of Economic Analysis of Climate Change Adaptation, Jia Li*
(li.jia@epa.gov), U.S. Environmental Protection Agency; Michael Mullan, Organisation for Economic Co-operation and Development; Jennifer Helgeson, Organisation for Economic Co-operation and Development and London School of Economics and Political Science
The development of national and sectoral climate change adaptation strategies is burgeoning in the
US and elsewhere in response to damages from extreme events and projected future risks from
climate change. Increasingly, decision makers are requesting information on the economic damages
of climate change as well as costs, benefits, and tradeoffs of alternative actions to inform climate
adaptation decisions. This paper provides a practical view of the applications of economic analysis
to aid climate change adaptation decision making, with a focus on benefit-cost analysis (BCA). We
review the recent developments and applications of BCA with implications for climate risk
management and adaptation decision making, both in the US and other Organisation for Economic
Co-operation and Development countries. We found that BCA is still in the early stages of development
for evaluating adaptation decisions, and to date is mostly being applied to investment project-based
appraisals. Moreover, the best practices of economic analysis are not fully reflected in the BCAs of
climate adaptation-relevant decisions. The diversity of adaptation measures and decision-making
contexts suggests that evaluation of adaptation measures may require multiple analytical methods.
The economic tools and information would need to be transparent, accessible, and matched to the
decision contexts to be effective in enhancing decision making. Based on the current evidence, a set
of analytical considerations is proposed for improving economic analysis of climate adaptation that
includes the need to better address uncertainty and to understand the cross-sector and general
equilibrium effects of sectoral and national adaptation policy.
3. Rationales for a “Dashboard” Approach to Inform Climate Change Assessment, Michael
Toman* (mtoman@worldbank.org), World Bank
Three interrelated challenges are identified for using cost-benefit analysis to evaluate climate
change risks and responses. Each challenge stems in part from the characteristics of climate
change risks as involving potentially large and irreversible as well as highly uncertain impacts. First,
many critics of economic analysis disagree with its underlying behavioral assumptions and its focus
on an economic summary statistic. The second challenge relates to how society evaluates moral
aspects of climate change risks and responses, in particular the implications for intergenerational
equity. Third, climate change is subject to a large degree of Knightian uncertainty, with implications
for how individuals perceive and evaluate climate change risks. Addressing these challenges
requires retaining a strong focus on assessing the potential costs of climate change and the potential
benefits of policy responses, but it also calls for providing several different types of information to
decision makers – a “dashboard” approach.
4. Incorporating Benefit-Cost Analysis into Other Decision-Making Frameworks, Robert
Lempert* (lempert@rand.org), RAND Corporation
Benefit cost analysis (BCA) aims to help people make better decisions. But BCA does not always
serve this role as well as intended. In particular, BCA's aim of aggregating all attributes of concern to
decision makers into a single, best-estimate metric can conflict with the differing world views and
values that may be an inherent characteristics of many climate-related decisions. This paper argues
that the new approaches exist that can help reduce the tension between the benefits of providing
useful, scientifically-based information to decision makers and the costs of aggregating uncertainty
and differing values into single best-estimates. Enabled by new information technology, these
approaches can summarize decision-relevant information in new ways. Viewed in this light, many
limitations on BCA lie not in the approach itself, but in the way it is used. In particular, we will argue
that the problem lies in a process that begins by first assigning agreed-upon values to all the
relevant inputs and then using BCA to rank the desirability of alternative decision options. In
contrast, BCA can be used as part of a process that begins by acknowledging a wide range of
ethical and epistemological views, examines which combinations of views are most important in
affecting the ranking among proposed decision options, and uses this information to identify and
seek consensus on actions that are robust over a wide range of such views.
Session 3 - Thursday, March 19, 2:00 - 3:30
 A.3: Skills for the Next Generation: A Conversation between Senior
Government Economists and Public Policy School Leaders (Marvin
309)
Chair: David Weimer (weimer@lafollette.wisc.edu), University of Wisconsin-Madison
Panelists to Include:
1. Sherry Glied (sherry.glied@nyu.edu), Wagner Graduate School of Public Service, New York
University
2. Kathryn Newcomer (newcomer@gwu.edu), Trachtenberg School of Public Policy and Public
Administration, George Washington University
3. Sarah Stafford (slstaf@wm.edu), Jefferson Program in Public Policy, College of William and
Mary
4. Amber Jessup (Amber.Jessup@HHS.GOV), U.S. Department of Health and Human Services
5. Al McGartland (McGartland.al@epa.gov), U.S. Environmental Protection Agency
6. Clark Nardinelli (clark.nardinelli@fda.hhs.gov), U.S. Food and Drug Administration
7. Jack Wells (jackwells1@mac.com), U.S. Department of Transportation - retired
As the baby boomers head towards retirement, Federal agencies are faced with the difficult task of
replacing experienced economists with recent graduates of public policy schools and other
programs. This panel will address the skills related to benefit-cost analysis that are needed by
different agencies and how the leaders of several public policy schools are responding to these
challenges.
 B.3: Assessing Benefits for Policies that Reduce Health Risks (Marvin
307)
Chair: Thomas J. Kniesner (Thomas.Kniesner@cgu.edu), Claremont Graduate University
Presentations:
1. Believe Only Half of What You See: The Role of Preference Heterogeneity in Contingent
Valuation, Daniel Herrera* (danielherreraa@gmail.com), Toulouse School of Economics; James K.
Hammitt, Harvard University
The paper shows, under an expected-utility framework, the exact theoretical relationship between
willingness to pay (WTP) to reduce small mortality risks, risk reduction, baseline risk, and income.
We propose a scope-revealing value of statistical life (SR-VSL) that accounts for any lack of scope
sensitivity. Using a French stated-preference survey fielded to a large, nationally
representative internet panel, we explore by how much, and why, respondents depart from the
expected-utility predictions. We find that only 40% of our respondents behave as predicted by
expected-utility theory. High concern for environmental risks to health, low education, and less time
spent completing the survey are good predictors of deviant answers. Our preferred value of
statistical life estimates range from 2.2 to 3.4 million euros for adults and over 6 million euros for
children. No differences are found for disease-specific WTP; in particular, we find no evidence of a
premium for cancer.
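For readers less familiar with the convention, the standard relationship that underlies scope tests of this kind can be sketched as follows (a generic expected-utility result, not the authors' exact derivation):

```latex
% Under expected utility, the value per statistical life is the marginal rate of
% substitution between wealth w and survival probability p along an indifference
% curve, so for a small risk reduction \Delta p stated WTP should be roughly
% proportional to the size of the risk reduction. Departures from this
% near-proportionality are what a scope-revealing adjustment (such as the SR-VSL
% proposed above) is designed to capture.
\[
  \mathrm{VSL} = -\left.\frac{dw}{dp}\right|_{\bar u},
  \qquad
  \mathrm{WTP}(\Delta p) \approx \mathrm{VSL}\cdot\Delta p .
\]
```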
2. Willingness to Pay for Mortality Risk Reduction in Chinese Cities, Sandra Hoffmann*
(shoffmann@ers.usda.gov), U.S. Department of Agriculture; Alan Krupnick, Resources for the
Future; Ping Qin, Peking University
Willingness to pay for contemporaneous and future mortality risk reductions is estimated for
residents of Shanghai, Jiujiang and Nanning, China using a stated preference survey. An innovative
computerized payment card elicitation vehicle is used that allows analysis of the impact of anchoring
on willingness to pay (WTP). Results are compared to those from administration of a dichotomous
choice version of the survey in China and in several other countries. The overall VSL for a
contemporaneous reduction in annual mortality risk of 5 in 10,000 is about 1.47 million
yuan ($430,000 U.S.). WTP as a percentage of household income is similar to that in other
countries. Tradeoffs between current and future risk reductions are far smaller than in other countries
(i.e., closer to one). We find some evidence of a “senior discount” and income elasticities in the 0.2
to 0.25 range. Results for the “payment screen” version of the survey pass the external scope test,
while those of a previous dichotomous choice version do not. WTP estimates are higher in the
dichotomous choice than the “payment screen” version of the survey. We find evidence of
statistically significant, but numerically small, anchoring effects in the payment screen version. The
computerized payment (or payment screen) allows random assignment of the initial cursor point and
tracking of cursor movements by respondents. This allows us to create quantitative measures of
starting point bias, a concern raised with conventional payment card instruments. More non-movers
were observed in the study than would be expected with completely random behavior, and we find
some evidence of starting point bias influencing WTP. But the effect is small. We found sizable
percentages of respondents choosing round numbers (such as 100 and 1,000). To counter this
tendency, payment cards should be constructed without round numbers.
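As a back-of-the-envelope check on how the headline VSL maps to elicited WTP (an illustration implied by the figures above, not a number reported by the authors):

```latex
\[
  \mathrm{WTP} \approx \mathrm{VSL}\times\Delta p
  \;=\; 1{,}470{,}000\ \text{yuan}\times\frac{5}{10{,}000}
  \;\approx\; 735\ \text{yuan per respondent per year}.
\]
```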
3. Valuing Reductions in Risks of Fatal Illness: Implications of Recent Research, Lisa A.
Robinson* (Lisa.A.Robinson@comcast.net) and James K. Hammitt, Harvard Center for Risk
Analysis
The value of mortality risk reductions, conventionally expressed as the value per statistical life (VSL),
is an important determinant of the net benefits of many government policies. As a result, regulatory
agencies and researchers have devoted substantial attention to identifying the values to be used.
Historically, the values applied by U.S. regulators have been based on studies of fatal injuries,
raising questions about whether different values might be appropriate for risks associated with fatal
illness. Our review suggests that, despite the substantial expansion of the research base in recent
years, few U.S. studies of illness-related risks meet quality criteria for use in regulatory analysis.
Those that do yield values that are similar to the values for injury-related risks. Given this result,
combining the findings of these few studies with the findings of the more robust literature on injury-related risks appears to provide a reasonable range of estimates for application in regulatory
analysis. Although the studies we identify differ from those that underlie the values currently used by
Federal agencies, the resulting estimates are remarkably similar, suggesting that there is substantial
consensus emerging on the values applicable to the general U.S. population.
4. The Role of Publication Selection Bias in Estimates of the Value of a Statistical Life, W.
Kip Viscusi* (kip.viscusi@vanderbilt.edu), Vanderbilt University
Meta-regression estimates of the value of a statistical life (VSL) controlling for publication selection
bias often yield bias-corrected estimates of VSL that are substantially below the mean VSL
estimates. Labor market studies using the more recent Census of Fatal Occupational Injuries (CFOI)
data are subject to less measurement error and also yield higher bias-corrected estimates than do
studies based on earlier fatality rate measures. These results are borne out by the findings for a
large sample of all VSL estimates based on labor market studies using CFOI data and for four meta-analysis data sets consisting of the authors’ best estimates of VSL. The confidence intervals of the
publication bias-corrected estimates of VSL based on the CFOI data include the values that are
currently used by government agencies, which are in line with the most precisely estimated values in
the literature.
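Publication-bias corrections of the kind described above are often implemented as a funnel-asymmetry/precision-effect (FAT-PET) meta-regression of estimates on their standard errors; a minimal sketch with hypothetical numbers (not the author's data or exact specification) is:

```python
# Generic FAT-PET publication-bias correction, a common approach in this
# literature; not necessarily the specification used in the paper.
# Each study i reports a VSL estimate vsl[i] with standard error se[i].
# Regressing estimates on their standard errors, weighted by precision,
# gives a selection (asymmetry) term and a "bias-corrected" mean.
import numpy as np
import statsmodels.api as sm

# Hypothetical illustrative data (millions of dollars); not from the paper.
vsl = np.array([9.5, 11.2, 6.8, 14.0, 8.1, 10.3])
se = np.array([2.0, 3.5, 1.2, 4.8, 1.9, 2.7])

X = sm.add_constant(se)                 # regressors: [1, SE_i]
w = 1.0 / se**2                         # precision weights
fat_pet = sm.WLS(vsl, X, weights=w).fit()

beta0, beta1 = fat_pet.params           # beta0 = bias-corrected VSL, beta1 = selection term
print(f"Bias-corrected VSL: {beta0:.2f}m; selection term: {beta1:.2f}")
```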
 C.3: Development and Miscellaneous Regulatory Issues (Marvin 308)
Chair: Tony Homan (Anthony.homan@dot.gov), U.S. Department of Transportation
Presentations:
1. Integrated Investment Appraisal of a Critical Post-Conflict Development Investment:
Milk Processing Plant, Ethiopia, Glenn P. Jenkins* (Jenkins@cri-world.com), Queen’s University;
Mikhail Miklyaev, Cambridge Resources International
In recent years the Somali region of Ethiopia has been affected by political unrest in the Federal
Republic of Somalia. As a consequence, the level of private investment in the region has been very
low. After some degree of political stabilization in Somalia, post-conflict development in the
Somali Region of Ethiopia has begun. This study is a cost-benefit analysis of a critical private
investment in the milk value chain for dairy production in the region.
The investment is in the construction of the first milk processing plant in the region. The plant will
process both cows’ and camels’ milk and will be the first plant in Ethiopia processing camels’ milk.
The pasteurized milk will mainly target consumers in the Somali Region, with some output delivered
to Addis Ababa. In addition, approximately 40% of the pasteurized camel’s milk will target the Somali
export market and will be shipped using refrigerated trucks. Pastoralists in Somalia culturally prefer
camel’s milk to any other, and the limited supply of this milk has resulted in high prices. These prices
will generate some early high private returns required to compensate for the risk associated with a
first-mover investment in an unstable environment.
Recognizing the importance of the success of such investments for the sustainable development of
the cross-border regions, the United States Agency for International Development has decided to
support the private entrepreneur with a limited grant and technical assistance. It is also expected that the
lowering of prices over time, as the supply of milk products increases, may generate consumer
surplus. In addition, access to pasteurized milk with a longer shelf life than the raw-milk alternative,
together with existing preferences for consuming milk throughout the day, implies significant positive
anticipated health impacts. The study primarily focuses on the estimation of the financial and economic
returns of the investment and stakeholder analysis of the value chain intervention.
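Financial and economic appraisals of this kind ultimately rest on discounted cash-flow arithmetic; a minimal sketch with entirely hypothetical numbers (not figures from the study) is:

```python
# Minimal discounted cash-flow sketch for an integrated (financial + economic)
# appraisal. All numbers are hypothetical placeholders, not results of the study.

def npv(cash_flows, rate):
    """Net present value of a stream of end-of-year cash flows."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

investment = -2_000_000                 # year-0 outlay (US$)
financial_net = [350_000] * 10          # net cash flow to the owner, years 1-10
# The economic analysis revalues the same project at shadow prices, e.g. adding
# consumer surplus from cheaper milk and health benefits and removing transfers.
economic_net = [450_000] * 10

print("Financial NPV @ 12%:", round(investment + npv(financial_net, 0.12)))
print("Economic NPV @ 12%: ", round(investment + npv(economic_net, 0.12)))
```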
2. Efficient Microlending without Joint Liability, Can Sever* (sever@econ.umd.edu),
University of Maryland; Ahmet Altinok, Boğaziçi University
Peer-group mechanisms have been widely used by micro-credit institutions to minimize default risk.
However, there are costs associated with establishing and maintaining liability groups. For the case
in which output is fully observable, we propose a dynamic individual lending mechanism. Assuming that
risky borrowers discount future costs and benefits more heavily, our mechanism performs
equally well in terms of repayment rates and distinguishes safe from risky borrowers through differentiated interest
rates and payment schedules. Our mechanism eliminates the adverse selection problem and
reaches the first-best outcome. We also identify welfare-maximizing payment schedules. Individual
lending further avoids the internal costs of group formation, so our mechanism achieves a net
welfare-superior outcome relative to costly joint-liability schemes, and relaxing the group requirement
broadens the fraction of society that microfinance institutions can reach. We also
introduce history-dependent success probabilities and show the existence of efficient individual
contracts in that environment. Finally, when outputs are private information, we propose a
refinancing scheme that gives borrowers an incentive to report their outputs truthfully.
3. The Impact of Past Landscapes Memory on Land Holders’ Willingness to Participate in
Payments Policy Instrument in Highlands of the Blue Nile Basin, Ethiopia, Befikadu Alemayehu
Legesse* (befikealeme2000@yahoo.com) and Frank Wätzold, Brandenburg University of
Technology
The Blue Nile Basin is severely threatened by natural degradation and economic activities;
hence, it is worthwhile to implement alternative policy instruments. In this regard, payment schemes
have become a prominent policy instrument for preserving biodiversity and ecosystem services worldwide. In
developing countries, payment schemes are often referred to as payments for ecosystem services
(PES), and in developed countries they are widespread in the context of agricultural policy as agri-environment
schemes. Participation in a PES intervention is voluntary, which raises the question of
what factors affect landowners’ decisions to participate in PES interventions. Identifying influencing factors
may help to predict participation rates ex ante, to predict spatial allocation if factors are spatially
clustered, and to target efforts to increase participation at specific groups. This study aims to analyze
factors affecting landholders’ willingness to participate in a PES intervention. In addition, this study
investigates the impact of past landscape memory on land holders’ willingness-to-participate in a
PES scheme so as to provide possible policy implications.
For this study, demographic, socioeconomic and institutional data were collected from primary and
secondary sources of information. The primary data were collected from sampled farmers through a
structured questionnaire. We employed a multistage probability sampling procedure to select the
respondents. A total of 301 upstream respondents were interviewed by experienced
enumerators under close supervision. In addition, data were also collected through group discussions.
 D.3: The Valuation of Ecological Goods and Services in Support of
Benefit-Cost Analysis (Marvin 413-414)
Chair: Ann Cavlovic (Ann.Cavlovic@ec.gc.ca), Environment Canada
Discussant: Randall Lutter (lutter@rff.org), University of Virginia and Resources for the Future
Presentations:
1. Environmental Benefit Transfer for Decision-Making, Jean-Michel Larivière* (JeanMichel.Lariviere@ec.gc.ca), Environment Canada
Monetizing the value of the environmental impacts associated with governmental policies or
development projects is increasingly becoming a requirement for benefit-cost analysis. However,
constraints around timelines, funding, or analytical resources may prevent detailed primary research
on these impacts from being undertaken. As such, the valuation of environmental impacts may be
imprecise or reduced to qualitative statements. When facing such limitations, transferring values
from a previous study to a similar analytical context (benefit transfer) becomes a cost-effective
approach to generate defensible estimates of environmental values to inform decision-making.
However, practitioners would still need to spend significant time searching the literature to find
appropriate studies to transfer values from. The Environmental Valuation Reference Inventory
(EVRI) was developed in the late 1990s by Environment Canada and the U.S. Environmental
Protection Agency as a cost-effective tool to help analysts apply benefit transfer techniques via
better access to valuation studies. As the world’s largest online searchable storehouse of
environmental valuation studies, EVRI facilitates the literature review process by providing
summaries (“captures”) of the key variables contained within each of these studies, e.g. study
location, valuation technique, survey instrument, type of environmental asset, etc. This presentation
will highlight how EVRI can support the work of benefit-cost analysis practitioners and illustrate how
it has been used in Canada.
2. Improving the Valuation of Water Quality, William Wheeler* (wheeler.william@epa.gov), U.S.
Environmental Protection Agency
For environmental quality improvements, particularly those involving air pollution controls, major
investments have been made in data collection and modeling methods. As a result, decision makers have well-developed tools available to inform their understanding of air pollution policies.
America’s water resources are also at risk of degradation from pollutants such as nutrients,
sediment, pathogens, and toxins; however, the ability to estimate the benefits of water quality
improvements has lagged behind analogous work for air quality. The U.S. Environmental Protection
Agency’s (EPA’s) Office of Policy, Office of Research and Development, and Office of Water have
formed a collaborative team of economists, ecologists and water quality modelers to develop a
national water quality benefits modeling framework to support greatly improved quantification of the
economic benefits of improved water quality. This effort represents a significant investment of
resources. A focus will be on expanding the scope of benefits estimates beyond freshwater lakes
and rivers (which have been the primary target of the majority of existing valuation studies) to
estuaries, coastal areas, freshwater small streams, and the Great Lakes. Another focus will be on
expanding the range of benefits that can be considered; for example, there is a dearth of studies
valuing recreational swimming in comparison to those valuing recreational fishing. This presentation
will describe EPA’s effort in more detail, including outreach, results so far, and future tasks.
3. Measuring Ecosystem Service Benefits for Benefit-Cost Analysis, Kimberly Rollins*
(Kimberly.Rollins@ec.gc.ca), University of Nevada, Reno and Environment Canada
Estimating changes in benefits can be challenging in cases where a policy change in an affected
ecosystem could change the probability of crossing an irreversible ecological threshold from one
dynamic steady state system to another steady state system, which is less desirable for
society. System dynamics within each state may result in predictable variations in ecosystem goods
and services over time, but a shift between states can lead to a very different set of flows of goods
and services. In this case, a primary focus is on how the policy change affects the probability of an
irreversible transition across the threshold between states. There is no economic concept that is
directly comparable to an ecological threshold. Indeed, the point at which changes in costs and
benefits from ecosystem disturbance would indicate a change in economic decision-making is likely
to occur before an ecological threshold is reached. Practical examples of such situations have
arisen in previous cases where policy changes have affected (1) the likelihood of introduction and
diffusion of invasive exotic species, such as with Bromus annual grasses on rangelands in the
intermountain West; and (2) the probability of catastrophic fires, through fuel reduction and
restoration treatments in forests where ecosystem dynamics have been disturbed by heavy fuel
accumulations from decades of over-suppression of fire. This presentation will consider decisions
that can alter costs and benefits through altering the probability of crossing irreversible ecological
thresholds, using arid rangeland ecosystems and forested systems as examples. We demonstrate
how models that incorporate economics and ecological dynamics can be designed and repeatedly
used when benefits measures are needed for different contexts in large landscape-level
systems. We argue that close collaboration with system ecologists is necessary in order to design
models that produce reliable benefits measures.
 E.3: Climate Policy Benefits Issues (Marvin 310)
Chair: Richard Belzer (regcheck@mac.com), Regulatory Checkbook
Presentations:
1. Examining the Energy Efficiency Gap in EPA's Benefit-Cost Analysis of Vehicle
Greenhouse Gas Regulations, Gloria Helfand* (helfand.gloria@epa.gov), U.S. Environmental
Protection Agency; Reid Dorsey-Palmateer, University of Michigan
Recent federal regulations require new light-duty vehicles to have lower greenhouse gas emissions
and better fuel economy. This paper presents the reasoning used by the U.S. Environmental
Protection Agency (EPA) in its benefit-cost analysis of the standards. According to EPA, many
available technologies could achieve these goals without affecting other vehicle qualities, and fuel
savings would pay for the increased technology costs with short payback periods. Yet these
technologies had not been widely adopted before the standards. This lack of
market adoption of cost-effective energy-saving technologies has been termed the energy efficiency
gap or energy efficiency paradox. It suggests either that there are additional costs, such as changes
in vehicle qualities, not considered in cost estimates, or that markets for energy-saving technologies
are not achieving all cost-effective savings. EPA argued that, even if consumers do not accurately
consider expected future fuel savings when buying new vehicles, consumers are still projected to
receive those savings, and that measure of realized savings should reflect the impacts of the rule on fuel
expenditures. On the cost side, without evidence of adverse effects on other vehicle characteristics,
EPA argued that its measure of technology costs accounts for or overestimates the compensating
variation associated with the costs of meeting the goals and is therefore a reasonable measure of
social costs.
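The "short payback period" claim at the center of this argument is simple arithmetic; a sketch with hypothetical per-vehicle numbers (not EPA's regulatory estimates) is:

```python
# Illustrative payback calculation for a fuel-saving vehicle technology.
# All inputs are hypothetical, not EPA's estimates.
tech_cost = 1_800            # $ increase in vehicle price from added technology
annual_fuel_savings = 450    # $ per year in avoided fuel expenditures
discount_rate = 0.05

simple_payback_years = tech_cost / annual_fuel_savings

# Discounted fuel savings over an assumed 15-year vehicle life.
pv_savings = sum(annual_fuel_savings / (1 + discount_rate) ** t for t in range(1, 16))

print(f"Simple payback: {simple_payback_years:.1f} years")
print(f"PV of lifetime fuel savings: ${pv_savings:,.0f} vs. technology cost ${tech_cost:,}")
```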
2. Loaded DICE: Refining the Meta-analysis Approach to Calibrating Climate Damage
Functions, Peter Howard* (HowardP@exchange.law.nyu.edu), NYU School of Law; Thomas
Sterner, University of Gothenburg
Climate change is one of the preeminent policy issues of our day, and the social cost of carbon
(SCC) is one of the foremost tools for determining the socially optimal policy response. The SCC is
estimated using Integrated Assessment Models, of which Nordhaus’ DICE is the first and one of the
best respected. While accuracy at each of the steps of these climate-economic models is necessary
to precisely estimate the SCC, correctly calibrating the climate damage function, which translates a
temperature change into a percentage change in GDP, is critical. Calibration of the damage
function determines which climate damages are included and excluded from the cost of carbon.
Traditionally, Nordhaus calibrated the DICE damage function using a global damage estimate
calculated by aggregating a series of region-sector specific damage estimates. However, in his latest
version, Nordhaus moved to calibrating the DICE damage function using a meta-analysis at the
global scale. This paper critiques this meta-analysis approach on several grounds, and re-estimates
the DICE-2013 damage function using up-to-date techniques to more accurately reflect climate
damages and the uncertainty underlying them. The effect of this updated damage function on the
resulting SCC estimate is determined.
This paper improves an important economic model of climate damages by ensuring that the
estimation of its damage function meets current standards for meta-analysis. First, I improve
the data in the underlying study by including additional estimates and correcting previous estimates.
Second, I improve the econometric specification by including key control variables currently omitted.
Third, I employ up-to-date meta-analysis techniques. Fourth, I conduct a sensitivity analysis to test
the robustness of my results. Last, after estimating a new damage function and the resulting SCC, I
conclude with a critique of meta-analysis at the global scale.
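For reference, the damage function being recalibrated here takes, schematically, the quadratic form used in recent DICE versions; the coefficients shown symbolically below are exactly what the calibration is meant to pin down:

```latex
% Schematic DICE-style damage function: damages as a fraction of gross output
% are quadratic in the global mean temperature increase T_t, and net output is
% gross output scaled down by that fraction. The coefficients \psi_1, \psi_2
% are the calibration targets discussed above.
\[
  \frac{D_t}{Y_t^{\text{gross}}} \;=\; \psi_1 T_t + \psi_2 T_t^{2},
  \qquad
  Y_t^{\text{net}} \;=\; \bigl(1 - \psi_1 T_t - \psi_2 T_t^{2}\bigr)\, Y_t^{\text{gross}} .
\]
```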
3. Think Global, Benefit Local: How the United States Benefits from Calculating the Global
Costs of Its Carbon Emissions, Jason Schwartz* (jason.schwartz@nyu.edu) and Peter Howard,
NYU School of Law
The massive benefits to the US from global greenhouse gas emission reductions provide a powerful
incentive for the US to strategize ways to encourage a global response to climate change. One
prudent strategy is for the US to continue calculating the global costs of its own greenhouse gas
emissions, and to use that value to set economically efficient policies. Theories and experiments on
negotiation strategy suggest that such efforts could successfully stimulate cooperative international
action. The Obama Administration has adopted a global calculation of the Social Cost of Carbon
(SCC) as part of its climate negotiation strategy. Charged by the US Constitution with managing
foreign affairs and coordinating executive branch activities, President Obama deserves political and
judicial deference on his choice to calculate the global benefits of US climate regulations. Binding
legal obligations further counsel in favor of the US using a global SCC value. In short, to safeguard
its own national interests and maximize benefits locally, the US should continue to think globally.
This paper provides an extensive discussion of the legal and economic argument for using a global
SCC. First, we calculate the benefits that the US has already received and will receive from foreign countries
taking action on climate change. By demonstrating that US citizens benefit from foreign emission
reductions, we argue that the US should work to continue and expand these benefits using various
tools, specifically a global SCC. Second, we review economic models of strategic behavior. These
models suggest that using the global SCC value in US policy could help induce international
cooperation on climate change. Finally, we explore the legal authority (as well as some
requirements) to use the global SCC in analyzing and setting US policy.
4. Estimating the Social Benefits of a State-Wide Carbon Tax: A Case Study in Washington
State, Alison Saperstein* (atarbox@uw.edu), Andrew Martin and Kate Delavan*
(kate.delavan@gmail.com), University of Washington
Proponents of state government legislation to restrict greenhouse gas emissions claim that such
policies at the state level could encourage and inform future national legislation to mitigate global
climate change. Given the uncertainty of that political outcome, we asked, what local economic
benefits can we expect from a state-wide carbon tax? As a case study, we estimate the social
benefit of a proposed policy for a revenue-neutral tax on carbon in the state of Washington,
developed by Carbon Washington. The proposed legislation could become a ballot initiative in 2016
and includes a $25 per metric ton tax on carbon and three additional, off-setting tax reforms: (1) a
reduction in sales tax, (2) a tax rebate to low income households, and (3) elimination of the state B &
O tax for manufacturers. We estimate the potential impacts of this policy in carbon-related markets
and associated secondary markets. We also estimate the non-market value of the resulting
reduction in greenhouse gases and other emissions and the local willingness to pay to contribute to
a solution to the global problem of climate change. We discuss future research needed in order for
analysts to more accurately evaluate state government decisions of this nature.
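The non-market valuation step described above typically reduces to multiplying projected emission reductions by a per-ton social cost value; a minimal sketch with hypothetical inputs (not the study's estimates) is:

```python
# Hypothetical illustration of valuing a state-level emission reduction with a
# social cost of carbon (SCC); the numbers are placeholders, not study results.
annual_reduction_tons = 5_000_000     # metric tons CO2 avoided per year
scc_per_ton = 40                      # $ per metric ton (illustrative SCC value)
discount_rate, horizon = 0.03, 20     # years over which reductions persist

pv_benefit = sum(annual_reduction_tons * scc_per_ton / (1 + discount_rate) ** t
                 for t in range(1, horizon + 1))
print(f"Present value of climate benefits: ${pv_benefit / 1e9:.1f} billion")
```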
Session 4 - Thursday, March 19, 3:45 - 5:15
 A.4: Estimating the Benefits of Policies that Address Addictive Goods
(Marvin 309)
Chair: Lisa A. Robinson (Lisa.A.Robinson@comcast.net), Harvard Center for Risk Analysis
Panelists to Include:
1. David Cutler (dcutler@harvard.edu), Harvard University
2. Sherry Glied (sherry.glied@nyu.edu), New York University
3. James K. Hammitt (jkh@harvard.edu), Harvard University
4. Donald Kenkel (dsk10@cornell.edu), Cornell University
It is unusual to find benefit-cost analysis featured both in Doonesbury and on the front page of the
New York Times. Yet that is what happened with the U.S. Department of Health and Human
Services’ (HHS’) approach for estimating the benefits of tobacco regulations. The core question is
whether and how to account for the “pleasures” of tobacco use; i.e., the losses in consumer surplus
that accrue when government policy curtails or discourages behavior that individuals prefer.
Typically, we assume that individuals’ preferences are revealed by their behavior and respect these
preferences. If an individual understands the likelihood of addiction, the associated health risks, and
the difficulty of quitting, yet continues to smoke, presumably the utility associated with smoking
outweighs the value of the related health risks. However, an individual’s behavior may diverge from
his or her “true” preferences due to self-control problems, misperception of risks, present bias, mis-forecasting of future consequences, and other factors. The underlying issues are profound and raise
several questions related to how we assess the benefits of numerous policies that correct “individual
failures” in addition to or instead of market failures. When might it be appropriate to move away from
reliance on revealed preferences? Whenever addiction is involved? What about other cases where
misunderstanding or decision-making anomalies affect behavior? What would then be the basis for
valuing policy outcomes? This symposium will bring together a group of experts to discuss specific
approaches for estimating benefits in the context of tobacco control; the panelists will also discuss general
concepts and the broader policy relevance of the approaches.
 B.4: Benefits, Costs and Labor Markets (Marvin 307)
Chair: Joseph Cordes (cordes@gwu.edu), The George Washington University
Presentations:
1. Public Policy-Induced Changes in Employment: Valuation Issues for Benefit-Cost
Analysis, David Weimer* (weimer@lafollette.wisc.edu) and Robert Haveman, University of
Wisconsin-Madison
We explore the economic welfare effects of direct and indirect government-induced changes in
employment under varying market conditions. We begin with a discussion of those policy-induced
employment changes that seamlessly reshuffle workers among jobs in an efficient (i.e., full-employment, full-information) economy; generally such changes create few, if any, net changes in
economic welfare not captured in changes in wage bills. We then turn to the effects of policy-induced
employment changes in economies with two market distortions: 1) involuntary unemployment during
periods of deficient aggregate demand for labor resulting from inflexible wages set by law or custom,
and 2) illiquidity resulting from imperfect capital markets that prevent people from borrowing against
future earnings. Induced employment changes in these circumstances impose real net social costs
or generate real net social benefits beyond changes in the wage bill. We also assess the likely
magnitude of the social opportunity cost of labor in the case of involuntary unemployment and
imperfect liquidity, and address how the welfare effects of such employment changes should be
valued. Based on currently available empirical research, we develop estimates of the opportunity
gains or costs of hiring or releasing an employee during periods of high unemployment with and
without other market distortions. In contrast to conventional benefit-cost analysis practice, which
treats releasing workers as having a negative opportunity cost, we estimate an opportunity cost for
firing that is positive and equal to about 45 percent of pre-firing compensation, primarily because of
the “scarring effect” of unemployment. Also in contrast to conventional practice, we estimate an
opportunity cost for hiring an unemployed worker that is less than the worker’s opportunity cost of
time.
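To make the headline figure concrete (an illustration of the stated percentage, using a hypothetical compensation level):

```latex
% If pre-firing compensation is, say, \$50{,}000 per year, an opportunity cost of
% firing equal to 45\% of compensation implies a real social cost of roughly
% \$22{,}500 per released worker, rather than the cost saving assumed in
% conventional practice.
\[
  C_{\text{firing}} \;\approx\; 0.45 \times w_{\text{pre-firing}}
  \;=\; 0.45 \times \$50{,}000 \;=\; \$22{,}500 .
\]
```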
2. Income Growth, Inequality and Happiness, Richard Zerbe* (zerbe@uw.edu), University of
Washington
Per capita real income has increased in Europe and in the world since about the year 1000. The
great increase has occurred since about 1815, not only in the West but throughout the world. We ask first,
does this increase also increase happiness? If so, why? Since about 1976, after a substantial
period of decline, the income share of the top 1%, and to a lesser extent the top 10%, has increased
substantially in the US, in much of Europe, and in Canada and Australia. Nevertheless, the
real income of the bottom part of the income distribution has made absolute gains, while losing
ground relatively. Why is this, and what has it done to happiness? This paper proposes answers, or at
least suggestive answers, to these questions.
3. The Social and Economic Effects of Wage Violations: Estimates for California and New
York, Kelly Haverstick* (Kelly.Haverstick@erg.com), Calvin Franz, Tess Forsell and Lou Nadeau,
Eastern Research Group
This project estimated compliance with labor laws and the costs of non-compliance. The Fair Labor
Standards Act (FLSA) sets national standards for a minimum hourly wage, maximum hours worked
per week at the regular rate of pay, and premium pay if the weekly standard is exceeded (overtime
pay). State governments can implement labor laws that provide higher wage floors, more restrictive
overtime laws, or stricter exemption requirements. Both California and New York have enacted
stricter rules. Accounting for variation in state labor law adds to the complexity of evaluating
compliance with labor laws. The study focused on minimum wage and overtime pay violations in
California and New York. Using two large nationally representative datasets, the Current Population
Survey (CPS) and the Survey of Income and Program Participation (SIPP), the first step in
estimating the costs and benefits of non-compliance was to evaluate which workers are covered by
these labor laws. Lost wages to workers, the primary cost, were then estimated. Using the
CPS, an estimated $22.5 million in wages were lost by workers in California and $10.2 million in
wages were lost by workers in New York in 2011 due to minimum wage violations. However, failure
to comply with the FLSA and state labor laws has implications beyond the dollar amount of unpaid
wages. Additional costs were estimated including government impacts of lower tax revenue (due to
lower employment and income tax payments by employees) and higher expenditures on social
support programs due to the lack of compliance with the FLSA and state minimum wage laws.
Specifically, the analysis included estimates of any change in eligibility and benefits for several state
and federal social assistance programs (such as the Supplemental Nutrition Assistance Program)
based on the estimated loss in earnings due to minimum wage violations.
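The core lost-wages calculation behind these estimates can be sketched as follows (hypothetical worker records and a hypothetical wage floor, not the CPS/SIPP data or the study's exact algorithm):

```python
# Sketch of a minimum-wage violation calculation for covered workers.
# Worker records and the state minimum wage are illustrative placeholders.
state_minimum_wage = 8.00   # $/hour (illustrative)

workers = [
    {"hourly_wage": 7.25, "weekly_hours": 35, "covered": True},
    {"hourly_wage": 8.50, "weekly_hours": 40, "covered": True},
    {"hourly_wage": 6.90, "weekly_hours": 20, "covered": False},  # exempt worker
]

weekly_lost_wages = sum(
    (state_minimum_wage - w["hourly_wage"]) * w["weekly_hours"]
    for w in workers
    if w["covered"] and w["hourly_wage"] < state_minimum_wage
)
print(f"Estimated lost wages per week: ${weekly_lost_wages:.2f}")
```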
4. Social Enterprises and the Disadvantaged: The Costs and Benefits of Stabilizing Lives
through Transitional Employment, Dana Rotz* (drotz@mathematica-mpr.com) and Nan Maxwell,
Mathematica Policy Research
This research examined participant experiences in and operations of six social enterprises. Social
enterprises were structured to be financially viable businesses that intentionally employ four groups
of individuals facing severe employment barriers (those with mental health disabilities, formerly
homeless individuals, parolees and ex-offenders, and young adults who are neither enrolled in
school nor participating in the labor market) as a means to increase these workers’ economic self-sufficiency. Our research collected both participant- and organization-level data with the ultimate
goal of performing a cost-benefit analysis of (1) developing a social enterprise and (2) transforming
an existing profit-driven business into a social enterprise. Our pre-post analyses drew information
from social enterprise workers in all six social enterprises and our quasi-experimentally designed
(QED) study drew information from a single, case-study social enterprise. Across all enterprises,
benefits were measured using (1) organization balance sheets (revenues), (2) a fixed-effects analysis
of changes in outcomes for social enterprise employees after one year in five domains:
employment, income (self-sufficiency), housing, criminal activity, and health, and (3) a difference-in-differences
analysis in the QED as a sensitivity analysis in each of the five domains. Cost data were
drawn from the organizations’ balance sheets. We estimated that social enterprise employment was
associated with a return on investment of 131 percent across all social enterprises and 41 percent
for the case study enterprise. These returns are largely driven by gains to taxpayers from lower
costs of housing employees (who move out of homelessness) and reduced rates of recidivism of
employees. Additionally, we found social returns to converting profit-driven businesses into social
enterprises in excess of 100 percent.
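Under the usual definition of return on investment, the headline figures above translate as follows (a reminder of the convention, not an additional result):

```latex
\[
  \mathrm{ROI} \;=\; \frac{\text{benefits} - \text{costs}}{\text{costs}},
  \qquad
  \mathrm{ROI} = 131\% \;\Longleftrightarrow\; \text{benefits} \approx 2.31 \times \text{costs}.
\]
```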
 C.4: Benefit-Cost Practices and Discounting Issues (Marvin 308)
Chair: Jack Knetsch (knetsch@sfu.ca), Simon Fraser University
Presentations:
1. Kaldor, Hicks, and Discounting, Daniel Wilmoth* (daniel.wilmoth@hhs.gov), U.S. Department
of Health and Human Services
Time preference discounting and opportunity cost discounting are alternatives that have each
aroused ardent support among economists. The extensive analyses developed for the discounting
debate have typically assessed these options by exploring their implications for a representative
consumer. In practice, the impacts of government policies vary across consumers, with some
bearing costs and others enjoying benefits. When impacts are heterogeneous, policies are typically
assessed using some version of the criterion introduced by Kaldor, under which a policy is deemed
desirable if the resulting productivity improvements would allow the winners to remain better off even
after compensating the losers. Because the analysis of costs and benefits is used to evaluate
satisfaction of that criterion, the relationship between discounting and that criterion is a fundamental
determinant of the appropriate discounting scheme. I argue that the criterion is ambiguous when
impacts span multiple periods. Two methods for implementing the criterion are considered, and
each method is shown to correspond to one of the two popular discounting schemes. I argue that
only one of the methods succeeds in identifying policies that improve productivity. That method
corresponds to opportunity cost discounting, and opportunity cost discounting should therefore be
used to evaluate government policies.
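The practical stakes of the choice between the two discounting schemes are easy to see with a stylized example (hypothetical rates and cash flow, not figures from the paper):

```latex
% A benefit of \$100 received in 30 years, discounted at a consumption
% (time-preference) rate of 3\% versus an opportunity-cost-of-capital rate of 7\%:
\[
  \frac{100}{(1.03)^{30}} \approx 41.2,
  \qquad
  \frac{100}{(1.07)^{30}} \approx 13.1,
\]
% so the same future benefit is worth roughly three times as much under
% time-preference discounting as under opportunity-cost discounting.
```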
2. The Social Discount in Developing Countries, Missaka Warusawitharana*
(m1mnw00@frb.gov), Federal Reserve Board
The social discount rate is the interest rate used to evaluate infrastructure and other public
projects. As seen from the discussion on the Stern report on climate change (see Stern, 2007, and
Nordhaus, 2007), differences in the social discount rate can have substantial implications for
evaluating the costs and benefits of public projects. This note proposes a heuristic approach to
deriving the social discount rate for developing countries based on the sovereign borrowing rate.
This note argues for using real sovereign borrowing rates as the social discount rate for evaluating
public projects in developing countries. Compared to current practice, such an approach would
result in a lower discount rate, potentially leading to greater public
infrastructure investments in developing countries. If carried out wisely, such investments may help
boost living standards for many people in these countries.
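The heuristic can be illustrated with a simple Fisher-equation adjustment (hypothetical rates, not values from the note):

```latex
% Converting a nominal sovereign borrowing rate i into a real social discount
% rate r given expected inflation \pi:
\[
  r \;=\; \frac{1+i}{1+\pi} - 1 \;\approx\; i - \pi,
  \qquad\text{e.g.}\quad
  i = 8\%,\ \pi = 5\% \;\Rightarrow\; r \approx \frac{1.08}{1.05}-1 \approx 2.9\%.
\]
```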
3. Does Benefit-Cost Analysis Solve the Right Problem?, Timothy Brennan*
(brennan@umbc.edu), University of Maryland, Baltimore County
Benefit cost analysis (BCA) is the standard tool for assessing whether a public policy is worth
doing. A general if not uniform justification for BCA is that it indicates when the winners from the
implementation of a public policy could, in principle, compensate the losers. A second defense of
BCA as a rule is that on average everyone will eventually gain if one implements only regulations
with positive net benefits. BCA merits a stronger role to the extent that economic efficiency is a
paramount consideration in setting public policies. However, the attraction of BCA goes beyond
that. In some respects, BCA is an important regulatory practice because it imposes an impersonal
discipline on the regulatory process. The key terms are the related concepts of “impersonal” and
“discipline”. “Impersonal” refers to the idea that the net merit of a regulation stands apart from the
particular preferences of those with the authority to determine whether a policy is
implemented. “Discipline” refers to ensuring that the regulatory process leads to only those
outcomes meeting that impersonal standard. BCA can promote impersonal discipline, but it will fail
to do so in contexts where a benefit-cost standard does not necessarily apply. Examples include rules
nominally instituted to promote competition, redistribute wealth, or promote civil, constitutional or
political rights. A key consideration is the burden of proof and who bears it. Informed by recent
experience in communications regulation, the analysis will look at whether the Administrative
Procedure Act addresses this concern and how “Chevron deference” evolved from giving EPA the
authority to define a polluting “source” to allowing agencies to institute regulations based merely on
their judgment. The problem facing the regulatory process may be too big for BCA alone to solve.
4. Formality and Informality in Cost-Benefit Analysis, Amy Sinden*
(Amy.Sinden@temple.edu), Temple University
Cost-benefit analysis (CBA) is usually treated as a monolith. In fact, the term can refer to a broad
variety of decisionmaking practices, ranging from a qualitative comparison of pros and cons to a
highly formalized and technical method grounded in economic theory that monetizes both costs and
benefits, discounts to net present value, and locates the point at which the marginal benefits curve
crosses the marginal costs curve. This article develops a typology that helps to conceptualize and
analyze the multiple varieties of CBA along the formality-informality spectrum. It then uses this
typology to analyze the treatment of CBA by the academic community and the three branches of the
federal government. In academic and policy circles, the formal end of this spectrum generates far
more controversy than the informal end. Additionally, the law (federal environmental statutes and
federal case law) seems to favor informal over formal varieties of CBA. Nonetheless, the executive
branch appears to be moving toward the formal end of the spectrum. Executive Orders and
guidance documents direct agencies to conduct a highly formal mode of CBA. And anecdotal
evidence suggests that agencies often go out of their way to give their CBAs the trappings of
formality, sometimes in ways that lead to irrational results. I argue that 1) failing to distinguish
between formal and informal CBA, and the many varieties in between, has led to muddled thinking
and to misuses of CBA; and 2) the trend toward formality in the executive branch is a bad
development, in part because it can, and often does, lead to what I call “false formality”—a
corruption of CBA that can occur when agencies fail to clearly and consistently define where on the
formality-informality spectrum a particular CBA falls.
 D.4: Electricity Sector Optimization (Marvin 413-414)
Chair: Anne Smith (anne.smith@nera.com), NERA
Presentations:
1. Analyzing the Costs and Benefits of Microgrids, Brian Morrison* (bgm@indecon.com),
Nadav Tanners, Claire Santoro, and Christopher Smith, Industrial Economics, Incorporated
In 2014, Governor Andrew Cuomo announced a $17 billion strategy to transform the State of New
York’s infrastructure, enhancing its ability to withstand severe weather events. An important element
of the strategy is developing a more resilient energy system. This includes soliciting proposals to
fund the development of microgrids in selected communities throughout the state. Microgrids are
electric distribution systems that can operate when connected to the larger grid, but can also
disconnect from it and operate independently during an emergency. This capability enables them to
support the delivery of critical services – i.e., services that are essential for public safety and health –
when the conventional grid is down. The anticipated benefits of each microgrid will be an important
consideration in evaluating community proposals for state funding. This presentation describes a
model developed to assist the New York State Energy Research and Development Authority
(NYSERDA) in evaluating these benefits, as well as each project’s costs. The model analyzes a
broad range of costs and benefits, enabling NYSERDA to identify investments that are likely to be
cost-effective from a social welfare perspective. A key consideration in evaluating the benefits of a
microgrid is the frequency of major power outages, which can be difficult to predict. Concern that
climate change is likely to increase the frequency of such outages is high in New York, which
suffered nine presidentially declared weather disasters, including Hurricane Sandy, from 2010
through 2013. To address this concern, the model relies on breakeven analysis to determine how
often lengthy outages would need to occur for the benefits of a microgrid to equal its costs. The
results can be compared with historical data on the frequency of major outages to determine
whether development of a microgrid is likely to be cost-effective.
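A breakeven-style calculation in the spirit of the approach described above can be sketched as follows (a simplified, hypothetical sketch, not the NYSERDA model itself); it asks how many major-outage days per year would make a microgrid's benefits equal its costs:

```python
# Simplified breakeven sketch for a microgrid investment (hypothetical inputs).

def annualize(capital_cost, rate, years):
    """Convert an up-front capital cost to an equivalent annual cost."""
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)  # capital recovery factor
    return capital_cost * crf

# Hypothetical inputs
capital_cost = 12_000_000          # $ up-front microgrid investment
fixed_om = 250_000                 # $/year operations and maintenance
benefit_per_outage_day = 900_000   # $/day of avoided losses to critical services
discount_rate, lifetime = 0.05, 20

annual_cost = annualize(capital_cost, discount_rate, lifetime) + fixed_om
breakeven_days = annual_cost / benefit_per_outage_day
print(f"Breakeven: about {breakeven_days:.1f} major-outage days per year")
```

The resulting breakeven frequency can then be compared with historical outage records, as the abstract describes.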
2. An Economic Analysis of Power Generation Options for Nigeria, Ijeoma L. Eziyi*
(eziyi.ijeoma@yahoo.com) and Omotola M. Awojobi, Eastern Mediterranean University
Over the past two decades, electricity prices have been highly subsidized for consumers in Nigeria.
Because of this, investment in the power sector has not been competitive, leaving the economy with
an energy-deficient power sector. Nigeria’s power crisis has been further compounded by the lack of an
enabling environment for private investors due to regulated prices, corruption, inflation and the high cost
of capital. As of December 2013, the country had a total of 8,664 MW of installed capacity, of which
only 4,842 MW was available to meet peak demand. With an estimated peak demand for electricity in
the country of about 11,230 MW, there is a huge investment gap in the power sector which could
generate substantial welfare advantage for consumers, and more surplus to the producers and other
stakeholders in the system.
This study estimates the cost of power shortages and further examines the various power generation
options available to Nigeria. Using a cost-benefit analysis approach, we show the marginal
benefits of implementing policies that promote new investments in the power sector. We estimate
the net economic benefits of the various options that can be explored by the power utility in Nigeria. This
analysis takes into consideration the high level of dependence of consumers on self-generation in the
country.
Our findings show that the country could realize net economic resource savings (discounted)
ranging between US$4.9 billion and US$29 billion over a study period of 25 years. The benefits
estimated in this study are equivalent to the cost savings from self-generation. These benefits
would perhaps be much higher if there were no captive generators presently meeting the
energy needs of consumers. This range in the estimated net benefits provides a strong rationale
for performing a sensitivity analysis on each viable power generation option. Our results are
quite sensitive to future fuel prices.
3. Cost-Benefit Analysis of Fuel-Flexibility in Thermal Power Generation, Bahman Kashi*
(bkashi@econ.queensu.ca) and Glenn Jenkins, Queen's University
This study provides an empirical framework for deterministic and probabilistic cost-benefit analysis
(CBA) of investment in fuel flexibility in the thermal generation of electricity. Natural gas has become
the fuel of choice for new thermal electricity generation plants across the globe. While every country
has access to imported or domestic sources of natural gas, instabilities in the price, availability and
quality of natural gas have resulted in suboptimal operation of many thermal power plants. This has
resulted in an increased interest in investments in fuel-flexible power plants. When needed, such
plants can operate on alternative fuels such as the abundantly available, but more expensive, light
crude oil. Operators of such power plants can switch to the alternative source when natural gas is
not available. Countries can also benefit from such operational flexibility when faced with volatile fuel
prices, or when there is a prospect of cheap domestic supply of natural gas in the future. This
analysis looks at the trade-off between the greater maintenance and capital costs of building
power plants with fuel flexibility and the benefits of lower fuel costs and increased reliability. In
addition, it considers the conditions necessary for it to be worthwhile to convert a single-fuel
generation plant into one that is dual-fired.
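A stripped-down version of the probabilistic comparison described above might look like this (hypothetical parameters and distributions, not the authors' model); it compares the expected discounted gain from a dual-fuel plant, which keeps running on a costlier backup fuel when gas is unavailable, against the extra capital cost:

```python
# Simplified probabilistic CBA sketch of fuel flexibility (hypothetical inputs).
import numpy as np

rng = np.random.default_rng(0)
years, rate = 20, 0.10
disc = 1.0 / (1 + rate) ** np.arange(1, years + 1)

energy = 1_500_000            # MWh/year of expected delivery
value_per_mwh = 80            # $/MWh economic value of delivered electricity
gas_cost, oil_cost = 30, 70   # $/MWh fuel cost on gas vs. backup fuel
extra_capex = 40_000_000      # $ extra capital cost of the dual-fuel design

n_draws = 5_000
gains = np.empty(n_draws)
for i in range(n_draws):
    # Uncertain fraction of each year in which natural gas is unavailable.
    outage_share = rng.beta(2, 8, size=years)
    # Single-fuel plant loses output when gas is unavailable;
    # dual-fuel plant keeps running on the costlier backup fuel.
    single = energy * (1 - outage_share) * (value_per_mwh - gas_cost)
    dual = single + energy * outage_share * (value_per_mwh - oil_cost)
    gains[i] = np.sum((dual - single) * disc) - extra_capex

print(f"Expected net gain from fuel flexibility: ${gains.mean():,.0f}")
print(f"Probability the flexibility pays off:   {(gains > 0).mean():.0%}")
```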
4. Implications of Technology Availability on Clean Power Plan Compliance Costs, Scott
Bloomberg* (scott.bloomberg@nera.com), NERA
In this presentation, I will explore the electric sector compliance costs (and other relevant cost
measures) associated with the EPA’s proposed Clean Power Plan based on different assumptions
regarding the availability of selected technologies. In particular, I will explore the potential for new
nuclear generation to assist in meeting state emission rate requirements. I will also evaluate the
impacts associated with different levels and costs of energy efficiency.
 E.4: Non-Market Recreational Welfare Effects of Changes in the
Diversity and Abundance of Species (Marvin 310)
Chair: Trudy Ann Cameron (cameron@uoregon.edu), University of Oregon
Presentations:
1. A Combined Revealed/Stated Preference Model for Projecting the Impact of Aquatic
Nuisance Species on Recreational Angling in the Great Lakes, Upper Mississippi, and Ohio
River Basins, Gregory L. Poe* (glp2@cornell.edu), Cornell University; Richard C. Ready,
Pennsylvania State University / Montana State University; T. Bruce Lauber, N.A. Connelly, R.C
Stedman and S. Creamer, Cornell University
With the objective of preventing interbasin transfer of Aquatic Nuisance Species (ANS) or invasive species
between the Great Lakes and the Mississippi and Ohio River Basins, Congress mandated that the U.S. Army
Corps of Engineers (USACE) conduct a Great Lakes and Mississippi River Interbasin Study
(GLMRIS) of the benefits and costs of alternative ANS controls, including severing all hydrological
connections between the two basins. The estimated investment costs of alternate strategies with
differential expected success rates range from negligible to over $18 billion.
As part of the GLMRIS project the co-authors conducted a combined revealed preference-stated
preference survey to predict how recreational anglers in these basins would respond to potential
decreases in recreational catch rates that could occur as a result of ANS transfer. Using a recall
survey conducted during March through May 2012, travel cost data for the 2011 fishing season was
collected from over 3,500 recreational anglers (response rate = 46%) in MN, IA, MO, WI, IL, IN, KY,
MI, OH, WV, PA and NY.
A repeated site-choice, three-level, nested-logit model was estimated using the 2011 travel cost
data. For each choice occasion, anglers were assumed to choose whether to go fishing
(Participation decision), what fishing type to engage in (Fishing Type decision), and where to fish
(Fishing Site decision). Overall, the estimated site-choice model is consistent with underlying
economic theory. The revealed behavior model was augmented with stated preference (contingent
behavior) data by eliciting how anglers would change their behavior in response to decreases in
catch rates. The resulting stated preference estimates of participation can be incorporated into the
travel cost choice framework to project changes in fishing frequency and consumer surplus with
spatially explicit changes in catch rates for seven different types of fishing.
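Schematically, the three-level nesting described above factors each choice occasion's probability as follows (standard nested-logit bookkeeping, not the authors' exact parameterization):

```latex
\[
  P(\text{trip to site } j \text{ of fishing type } m)
  \;=\;
  P(\text{go fishing})\times P(m \mid \text{go fishing})\times P(j \mid m),
\]
% where each conditional probability takes a logit form and the levels are linked
% through inclusive-value (log-sum) terms, so that a fall in expected catch at a
% site propagates up to the fishing-type and participation decisions.
```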
2. The Value of Water Quality to Fishermen in the Chesapeake Bay, Matt Massey*
(massey.matt@epa.gov), U.S. Environmental Protection Agency; Steve Newbold, U.S.
Environmental Protection Agency
This analysis estimates the value of water quality changes (measured by nitrogen, phosphorus, and
sediment) to recreational anglers in the Chesapeake Bay. Water quality changes resulting from
reductions in pollutant loads are assumed to affect fishermen in two ways. First, water quality
indirectly affects fishermen by influencing the abundance of fish and therefore fishermen’s expected
catch rates. Second, water quality may directly affect fishermen by influencing the tangible aesthetic
qualities of the fishing sites, such as water clarity. These water quality driven changes in species
abundance and site characteristics may cause changes in fishermen’s per trip utility, the numbers of
trips taken, or both.
Using historical data from NOAA’s National Marine Fisheries Service’s Marine Recreational
Fisheries Statistics Survey (MRFSS) on recreational fishing trips to and catch rates at sites located on
the Chesapeake Bay, we estimate a random utility maximization (RUM) site choice travel cost
model. Following Murdock (2006), we estimate a complete set of alternative specific constants
(ASCs) and then, in a second stage, regress the estimated ASCs on site characteristics such as catch
rate and site water quality. Results indicate that water quality has a significant impact on
anglers’ utility per trip and on the number of trips they take per season. Results also suggest that
most fishermen care about the total numbers of fish caught although some fishermen care strongly
about what types of fish they are catching. In order to capture changes in the number of trips taken
in response to the water quality changes we link the results of the RUM model to a negative binomial
participation model. Results indicate that increased water quality does cause more trips to be taken.
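The second-stage step described above (regressing the estimated alternative-specific constants on site attributes) can be sketched as follows; the ASCs and site attributes shown are hypothetical, and the first-stage site-choice estimation that would produce them is omitted:

```python
# Second stage of a Murdock (2006)-style two-step approach (illustrative only).
import numpy as np
import statsmodels.api as sm

# Pretend these ASCs were recovered from a first-stage site-choice (RUM) model,
# one per fishing site, along with observed site attributes.
asc = np.array([0.42, -0.10, 0.95, 0.31, -0.55, 0.12])        # estimated constants
catch_rate = np.array([1.8, 0.9, 2.6, 1.5, 0.4, 1.1])          # fish per trip
water_quality = np.array([0.7, 0.5, 0.9, 0.6, 0.3, 0.55])      # e.g., clarity index

X = sm.add_constant(np.column_stack([catch_rate, water_quality]))
second_stage = sm.OLS(asc, X).fit()
print(second_stage.params)  # how catch rates and water quality explain site attractiveness
```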
3. Joint Estimation of Revealed and Stated Preference Data from the Alaska Saltwater
Sportfishing Economic Survey, John C. Whitehead* (whiteheadjc@appstate.edu), Appalachian
State University; Daniel K. Lew, Alaska Fisheries Science Center, National Marine Fisheries Service
One of the major policy issues in fisheries management is the efficient allocation of catch quota
across commercial and recreational sectors. The benefit of a reallocation toward one sector is the
value of additional catch. The cost of the reallocation is the value forgone in the other sector.
Accurate measurement of the value of catch in both sectors is essential for an economically-efficient
allocation. In this paper we develop econometric models to jointly estimate revealed preference (RP)
and stated preference (SP) models of recreational fishing behavior and preferences using survey
data from Alaska. The RP data are from site choice survey questions and the SP data are from a
choice experiment. Models using only the RP data are likely to estimate the effect of cost on site
selection well, but the marginal values associated with catch rates may not be reflected well in the
benefits of the trip as perceived by anglers. The SP models are likely to estimate the effects of trip
characteristics well but give less attention to the cost variable. The combination and joint
estimation of revealed and stated preference data seeks to exploit the contrasting strengths of RP
and SP data. We find that there are significant gains in econometric efficiency by combining data
sources, and differences between RP and SP willingness to pay estimates are mitigated by joint
estimation. The nested logit trick model fails to account for the panel nature of the data and is less
preferred than scaled, random parameter and generalized mixed logit models that account for the
panel data and scale differences. We find scale differences across data sources in only one
candidate model, the error components random parameters logit. While this model outperformed the
standard random parameters logit model, willingness to pay estimates do not differ across these two
models.
4. The Value to Birders of Species Biodiversity: A Random Utility Model of Site Choice by
eBird Participants, Sonja Kolstoe* (skolstoe@uoregon.edu) and Trudy Ann Cameron, University of
Oregon
The eBird database is a product of a huge citizen science project at the Cornell Laboratory of
Ornithology. Members report their birding excursions, including their destinations and the numbers
and types of birds they observe on each trip. Based on home address information, we calculate
round-trip road distance and travel time and use this information to construct travel cost measures
for outings to competing destinations. We focus our analysis on birders in the Pacific Northwest U.S.
(Washington and Oregon states). Each possible destination (birding hotspot) can be characterized
by a number of alternative measures of expected species diversity and abundance in that month by
combining the eBird data with additional data from Birdlife International. Travel costs allow us to
estimate the marginal utility of other consumption. We allow for heterogeneity in the marginal utility
per species in each of four different groups, allowing variously for a time trend and seasonal effects,
as well as random variation in key utility parameters. We allow for habit formation and variety-seeking. Inferred total WTP estimates for a birding outing depend upon a number of features of the
destination site in addition to species diversity (including ecological region and management
regime). The main goal of the research is to quantify the potential effects of land-use changes or
climate-change-induced alterations in species’ ranges on the value of birding opportunities to this
population of birders.
Session 5 - Friday, March 20, 9:00 - 10:30
A.5: Financial Benefits of Remediation (Marvin 309)
Chair: Heidi King (heidi.king@mac.com), GE Capital
Presentations:
1. The Value of Brownfield Remediation, Lala Ma* (lala.ma@uky.edu), University of Kentucky;
Kevin Haninger, U.S. Environmental Protection Agency; Christopher Timmins, Duke University
The U.S. Environmental Protection Agency Brownfields Program awards grants to redevelop
contaminated lands known as brownfields. This paper estimates cleanup benefits based on a
nationally representative sample of brownfields using a variety of quasi-experimental techniques. We
advance the existing body of work on two important fronts. To our knowledge, this is the first paper
that combines non-public EPA administrative records with high-resolution, high-frequency housing
data to estimate the effects of brownfield cleanup across the entire federal Brownfields Program.
Next, only under certain conditions can the capitalization of disamenities into local housing markets
be given a welfare interpretation. We utilize different sources of variation available in our unique data
to estimate cleanup benefits without relying on those assumptions, which makes our estimates
particularly useful for cost-benefit analysis. We find increases in property values accompanying
cleanup, ranging from 4.9% to 9.3%; for a welfare interpretation that does not rely on the
intertemporal stability of the hedonic price function, a double-difference nearest-neighbor matching
estimator finds even larger effects of up to 15.6%. Our various specifications lead to the common
conclusion that Brownfields Program cleanups yield a positive, statistically significant, but highly localized effect on housing prices.
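As a point of reference for the double-difference language used above, the following sketch shows a plain difference-in-differences calculation on synthetic housing data. It is illustrative only; the paper's nearest-neighbor matching estimator and data are far more elaborate, and every number below is made up.
```python
# Plain difference-in-differences on synthetic log sale prices.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "near_cleanup": rng.integers(0, 2, n),   # 1 = near a remediated brownfield
    "post": rng.integers(0, 2, n),           # 1 = sale occurred after cleanup
})
# Synthetic log prices with a built-in 7% effect for nearby, post-cleanup sales.
df["log_price"] = (12.0 + 0.05 * df["near_cleanup"] + 0.03 * df["post"]
                   + 0.07 * df["near_cleanup"] * df["post"]
                   + rng.normal(0, 0.2, n))

cell_means = df.groupby(["near_cleanup", "post"])["log_price"].mean()
did = ((cell_means.loc[(1, 1)] - cell_means.loc[(1, 0)])
       - (cell_means.loc[(0, 1)] - cell_means.loc[(0, 0)]))
print(f"DiD estimate of the cleanup effect on log prices: {did:.3f} (~{100 * did:.1f}%)")
```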
2. The Labor Market Impacts of the 2010 Deepwater Horizon Oil Spill and Offshore Oil
Drilling Moratorium, Joseph E. Aldy (joseph_aldy@hks.harvard.edu), Harvard Kennedy School
In 2010, the Gulf Coast experienced the largest oil spill, the greatest mobilization of spill response
resources, and the first Gulf-wide deepwater drilling moratorium in U.S. history. Taking advantage of
the unexpected nature of the spill and drilling moratorium, I estimate the net effects of these events
on Gulf Coast employment and wages. Despite predictions of major job losses in Louisiana —
resulting from the spill and the drilling moratorium — I find that Louisiana coastal parishes, and oil-intensive parishes in particular, experienced a net increase in employment and wages. In contrast,
Gulf Coast Florida counties, especially those south of the Panhandle, experienced a decline in
employment. Analysis of accommodation industry employment and wage, business establishment count, sales tax, and commercial air arrival data likewise shows positive economic activity impacts in the oil-intensive coastal parishes of Louisiana and reduced economic activity along the non-Panhandle Florida Gulf Coast.
3. Welfare Impacts of Labor Market Changes Induced by Regulations, Ann E. Ferris*
(ferris.ann@epa.gov) and Alex Marten, U.S. Environmental Protection Agency
The welfare impacts of labor market changes induced by regulations are multi-faceted, include both
private and social components, and are difficult to identify and measure empirically. There is an
ongoing debate on whether the costs and benefits of labor market changes induced by regulations
should be included with welfare measures in benefit-cost analysis (BCA), or analyzed separately as
economic impacts. Cass Sunstein has called this a “frontiers question”. Whether included directly or
not, the monetized value of these labor market changes may be of value to policy makers when
assessing regulatory options. However, there are theoretical disagreements and significant difficulties
in practice, in terms of monetizing labor market changes, which the available literature has yet to
address.
This paper first reviews the arguments in the current debate about incorporating the potential welfare
impacts of labor market changes in applied BCA of regulations. In addition to the theoretical and
political considerations that frame the debate, there is a practical dimension that should also inform
the conversation but is often left out of the debate. We examine the techniques brought forth in the
empirical literature to monetize the numerous facets of this problem, carefully considering their
applicability, data requirements, and uncertainty with regards to analyzing national level regulations.
This review provides grounds to inform the debate about the incorporation of labor market impacts in
BCA by highlighting the potential information available to practitioners relative to the resources
required to develop credible analytic estimates.
4. From Controversy to Consensus: Decommissioning California's Offshore Oil Platforms,
Max Henrion*(henrion@lumina.com), Lumina Decision Systems
The 27 oil platforms in California's coastal waters are many decades old and reaching the end of
their productive life. The original leases required the platform operators to dismantle and remove
them entirely after they cease production. The platforms are massive structures, up to 1200 feet
deep. They are now encrusted with marine organisms, providing a rich habitat for economically
valuable rock fish. They have become popular with sea lions, recreational human divers, and many
other marine life forms. Removal would be costly, likely over a billion dollars, and have substantial
environmental impacts, including emissions to air and water, as well as destruction of this habitat.
The California Ocean Science Trust (OST) commissioned an interdisciplinary team to provide a
comprehensive survey of the relevant scientific, engineering, economic, and legal issues, with a
decision analysis of the key decommissioning options. The study had to consider the concerns of a
full range of stakeholders, including environmental groups, oil companies, commercial and
recreational fishermen, divers, and a range of state and federal agencies. These concerns include
the cost of removal, environmental impacts on biological productivity, marine mammals and birds, air
and water quality, and the seabed, ocean access, and compliance with the leases. A key insight
from the study was that the partial removal or "rigs to reefs" option reduced both costs and environmental impacts relative to complete removal. Technical contributions included the first
quantitative models of fish production and air emissions, and the first probabilistic analysis of
decommissioning costs. A key option for partial removal was to cut platforms off at 85 feet below sea
level to avoid interfering with shipping. An additional option to sweeten partial removal for
environmental advocates was to share the cost savings from partial removal (potentially over $500
million) between the oil companies and an Ocean Conservation Fund. With this addition, almost all
stakeholders supported the rigs-to-reefs option. The California state legislature passed an enabling law, AB
2503, almost unanimously. Governor Arnold Schwarzenegger signed it in September 2010.
Decommissioning of the first platforms is expected to start in 2015.
B.5: Economics Frontiers and Benefit-Cost Analysis (Marvin 307)
Chair: Sandra Hoffmann (shoffmann@ers.usda.gov), U.S. Department of Agriculture
Presentations:
1. WTP or WTA: Determining the Appropriate Measure When Preferences Are Reference
Dependent, Phumsith Mahasuweerachai* (phumosu@gmail.com), Khon Kaen University; Jack
Knetsch, Simon Fraser University
Many recent studies provide strong evidence that people often value losses substantially more than
gains, and value changes not in terms of final outcomes as assumed in standard theory and
common practice, but in terms of changes from reference states. Positive changes may, therefore,
often be reductions of losses rather than gains, with WTA measures then more accurately assessing
welfare improvements than WTP (with analogous implications for negative changes). It is, therefore,
important to have objective means of determining which measure will provide a more accurate
assessment in particular cases. One such means of discriminating is presented here, based on
observed differences and similarities of positive and negative changes within the domains of gains
and of losses. The method is illustrated and tested with the results from an extensive study of Thai
farmers’ valuations of changes in river erosion control works. Its applicability is further tested with
choices of measure to value other entitlement changes. If further tests provide similar results, this
suggested means may offer considerable potential to deal with the likely bias of current assessment
practice of seriously understating the value of many losses and reductions of losses.
2. More Accurate Estimation of Program Benefits in a Hedonic Setting, Thomas J. Kniesner*
(Thomas.Kniesner@cgu.edu), Claremont Graduate University; Chris Rohlfs, Morgan Stanley; Ryan
Sullivan, Naval Postgraduate School;
Hedonic estimation involving the marginal willingness to pay for goods’ and services’ attributes is a
vital tool for measuring the benefits of public policies that improve safety or environmental, school, or
health care quality. Our research extends the hedonic estimation approach to include three
important, but generally ignored, aspects of markets for heterogeneous goods or services.
Specifically, first we consider that many attributes are endogenous and change in response to
exogenous shocks. Second, we consider that heterogeneous goods and services can have
substitutes and complements and exogenous shocks to a market of interest may affect the markets
for the other products. Third, aggregate quantities supplied may change in response to an
exogenous policy-induced shock. For all three reasons just mentioned, the benefits of an exogenous policy-induced shock to one product attribute will be incompletely capitalized into the price of that
product, and traditional hedonic estimators will produce biased estimates of policy benefits.
The focal point of our research is the presentation of new experimental and quasi-experimental
estimators that avoid the just-mentioned biases and are consistent estimators in a general setting.
The new estimators are, of course, more demanding in their data requirements and need for
exogenous sources of variation than the currently familiar hedonic estimators. A variety of our new
estimators are presented for situations where different acceptable sources of data are available.
We conclude with an application of one of the improved estimators of the total demand for an
attribute to measure the value that military recruits place on funds to be used eventually for their
higher education. We find that contributions to an educational spending account are valued at about
$0.25 to $0.50 per $1 of benefits and that direct tuition and stipend support are valued at about $0 to
$0.10 per $1 of benefits provided.
3. Challenges in BCA from Implementing the Many Concepts of Risk and Uncertainty, Scott
Farrow* (farrow@umbc.edu), University of Maryland, Baltimore County
Benefit-cost analyses are often used to inform investments or policy choices about risks. The
applied analyst suddenly finds that there are multiple aspects of modeling and quantifying risk. This
paper seeks to clarify the several ways in which risk enters typical environmental or health
analyses. The different aspects of risk, each requiring different theory and different quantitative methods, include: a) whether there is a distinction between risk and uncertainty and, if so, how to separate their modeling; b) the use of probabilistic methods to model statistical distributions of inputs and the resulting distribution of outputs; c) the importance of conditional distributions, which may shift risk; d) the risk preferences of individuals, including behavioral approaches, such that valuations differ by those preferences; and e) decision rules under uncertainty. This paper synthesizes the issues
and identifies currently available empirical tools.
4. A Study in the Use of Game Theory in the Regulatory Process, Jose Davalos*
(jad8793@gmail.com), U.S. Coast Guard
The use of game theory, which focuses on countering an adversary intent on doing harm, seems a well-motivated and appropriate fit for security applications. However, game theory is not limited to security; it can also be applied fruitfully to non-security regulations. Game theory can prove useful in estimating the benefits of regulation, largely by estimating how the regulated industry and others will respond to the language of the regulation and its enforcement. We have conducted a study that examines the use of game-theoretic approaches to develop an assessment of benefits for regulatory analysis of security regulation. The study has resulted in recommended game theory methodologies that can be used to examine regulatory benefits in cases where benefits are not quantitatively measurable. We found that security regulations can be reduced to an extensive-form game with perfect information. We also found that a limited set of actions for a would-be adversary can be expressed and compared
efficiently in a strategic form. In addition, game theory shifts the question of which regulatory actions
have the least burden of proving net benefits under assumed adversary actions to a question of
which alternative has the greatest net benefit under assumed adversary resources for any desired
action. This presentation discusses the use of game theory to realize the benefits and best
alternative when developing and analyzing security-related regulations.
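The strategic-form comparison described in this abstract can be made concrete with a small, entirely hypothetical payoff table: for each regulatory alternative, compute the net benefit under each assumed adversary action and compare the alternatives on their worst-case net benefit. The sketch below is an assumed illustration, not the study's games or numbers.
```python
# Hypothetical strategic-form comparison of regulatory alternatives.
import numpy as np

reg_alternatives = ["no action", "screening rule", "hardening rule"]
adversary_actions = ["no attack", "attack port", "attack vessel"]

# Net benefit to society ($ millions) for each (alternative, adversary action) pair.
net_benefit = np.array([
    [  0.0, -120.0, -80.0],   # no action: no cost, large losses if attacked
    [ -5.0,  -40.0, -30.0],   # screening rule: small cost, sizable avoided losses
    [-12.0,  -25.0, -20.0],   # hardening rule: larger cost, larger avoided losses
])

# Conservative comparison: worst-case net benefit over the adversary's actions.
worst_case = net_benefit.min(axis=1)
best = int(worst_case.argmax())
for alt, w in zip(reg_alternatives, worst_case):
    print(f"{alt:15s} worst-case net benefit: {w:7.1f}")
print("Preferred alternative under the worst-case comparison:", reg_alternatives[best])
```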
C.5: Finance Issues (Marvin 308)
Chair: Ali Gungor (ali.gungor@uscg.mil), U.S. Coast Guard
Presentations:
1. Accounting for Market Distortions in an Integrated Investment Appraisal Framework,
Kemal Bagzibagli* (kemal.bagzibagli@emu.edu.tr), Eastern Mediterranean University; Glenn P.
Jenkins, Queen’s University/Eastern Mediterranean University; Octave Semwaga, Ministry of
Finance of Rwanda
Public investments are key policy instruments used by governments in pursuing their overall
development goals and strategies. Given the limited resources available to an economy, the chosen
projects should fit into the overall development strategy, which usually concerns many stakeholder
groups. Despite this fact, in practice the appraisal of most investment projects carried out by
governments, multilateral financial institutions and consultants has tended to be basically a
financial analysis with only a partial, if any, economic evaluation. The stated constraints are largely
the time frame in which these appraisals are to be prepared, and the lack of data for carrying out a
professionally adequate economic appraisal. This paper reports on an effort in Rwanda that, we
believe, has successfully addressed both of these constraints. Our paper first presents the
adjustments required to convert the financial values of investment projects into their corresponding
economic values in a manner that meets a high standard of professionalism. The paper also
describes the comprehensive framework and practical approaches to the estimation of the economic
prices and Commodity Specific Conversion Factors (CSCFs) for project inputs and outputs. The
paper applies the framework to tradable and non-tradable goods and services in Rwanda, and
estimates their CSCFs to be used in the economic appraisal of investment projects in the country.
These analytical frameworks have then been used to develop a web based database of CSCFs for
Rwanda, containing more than 5,000 tradable commodities, and non-tradable goods and services
such as transportation, construction, electricity, and telecommunication. The database provides easy
access from anywhere in the world for project appraisal specialists involved in the formulation,
evaluation and implementation of projects, and allows them to conduct an up-to-date economic
appraisal of investment projects in a professionally satisfactory manner.
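To illustrate the mechanics of the conversion factors discussed above, the sketch below multiplies hypothetical financial line items by assumed CSCFs to obtain economic values. None of the figures come from the Rwanda database; they are placeholders.
```python
# Hypothetical financial-to-economic conversion using assumed CSCFs.
financial_costs = {              # financial values, local currency millions
    "imported equipment": 420.0,
    "construction": 310.0,
    "electricity": 55.0,
    "unskilled labor": 90.0,
}
cscf = {                         # assumed commodity-specific conversion factors
    "imported equipment": 0.92,  # strips out import duties and taxes
    "construction": 1.05,
    "electricity": 1.10,
    "unskilled labor": 0.70,     # shadow wage below the market wage
}
economic_costs = {item: value * cscf[item] for item, value in financial_costs.items()}
print("Total financial cost:", sum(financial_costs.values()))
print("Total economic cost: ", round(sum(economic_costs.values()), 1))
```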
2. A Review of and Lessons Learned from Federal Research and Development Facility
Capital Budgeting Practices, Vanessa Pena* (vanessa.pena@gmail.com) and Susanna Howieson
(showieso@ida.org), IDA Science and Technology Policy Institute
Researchers at the IDA Science and Technology Policy Institute (STPI) conducted various studies
on the planning and prioritization processes and evaluation frameworks for Federal research and
development (R&D) facilities. This presentation provides an overview of these past studies related to
this topic. The analyses included input from interviews with Federal agency senior executives,
budget analysts, laboratory directors, and facilities managers, among others across the Federal R&D
enterprise to understand the variety of capital budgeting processes across the Federal Government.
STPI researchers also performed an extensive literature review of planning and assessment of
Federal R&D facilities and reviewed agency planning and programmatic documents, such as agency
and laboratory strategic plans and budgets.
The studies exposed various challenges and best practices relevant to (1) assessing Federal R&D
facility condition, value and overall benefits to the agency and (2) using the outcomes of those
assessments to inform capital budgeting prioritization and investment decisions. Based on these
findings, STPI researchers propose several strategies to improve agency capital budgeting practices
for R&D facilities: (1) encourage interagency benchmarking, data-sharing efforts, and exchange of
best practices, such as the use of long-term modeling tools and integrated metrics, (2) standardize
Federal facility data and metrics definitions to facilitate peer review, (3) develop strong prioritization
frameworks and criteria when metrics are lacking to effectively capture and communicate the R&D
facility’s impact on agency missions. The studies indicate that longer-term interagency efforts may
be necessary to address these issues and implement the strategies for improving the Federal capital
budgeting process for R&D facilities.
3. Identifying a Suitable Control Group Based on Microeconomic Theory: The Case of
Escrows in the Subprime Market, Xiaoling Ang* (xiaoling.ang@cfpb.gov) and Alexei Alexandrov,
Consumer Financial Protection Bureau
We analyze the effect of a Federal Reserve Board subprime mortgage regulation requiring escrows on the availability of mortgage credit. Because all mortgage originators were affected by the regulation, there is no natural control group for affected markets. We use the assumption of profit
maximization to construct a control group. Applying a difference-in-difference strategy to a dataset
constructed using Home Mortgage Disclosure Act loan level data and USDA Rural Atlas county level
data, we find no statistically or economically significant impact on the loan origination markets across
the U.S., despite 202 institutions exiting the subprime mortgage market in over 200 counties. These
results, along with other evidence presented in the paper, strongly suggest that consumers were
able to switch to similar loans originated by the exiting creditors’ competitors.
4. Escape from the Silos: Stakeholder Analysis, William McCarten* (wmccarten@gmail.com),
World Bank - retired
This paper proposes to advance the policy-institutional practice frontier by first asking how relatively
homogenous groups of stakeholders are currently integrated in benefit-cost work undertaken by
international financial institutions (IFIs) for their clients and then by posing and answering normative
questions about how operational practice can be improved to enhance infrastructure investment
outcomes and mitigate the losses of adversely affected groups. The paper asks whether utilizing
evaluation shortcuts in the form of “add-ons” and “carve outs” in the interest of operational simplicity
will potentially lead to project option ranking reversals or the neglect of key insights that might
motivate project design changes to mitigate risk and reduce stakeholder losses. Finally, it asks if
there is scope for advancing a multi-IFI consensus on a minimum set of stakeholder categories and
analytical conventions, such as a willingness to accept criterion for involuntary resettlements, that
would improve the support of IFI client governments, stakeholder advocacy groups, and capital
market partners by enhancing political acceptability and reducing project execution risk. Improved
support is desired as a potential key to improving infrastructure investment climates.
The main tool of analysis will be a review of selected actual projects sponsored by IFIs to catalogue
the ways in which stakeholders have been identified and integrated into benefit-cost modules. The
author will consult project appraisal documents (PADs), project completion reports and independent
evaluation reports produced by affiliated evaluation agencies. These findings will be benchmarked
with the methodological guidelines for stakeholder-inclusive appraisal work advocated by experts
from selected academic clusters to measure the degree of alignment between lending preparation
practice and academic models of best practice. The reemergence of interest in formal cost-benefit
analysis of investment projects has responded to an increased emphasis on project-driven impacts
on the poor, the quantification of environmental externalities, the identification of differential gender
impacts, and the identification of differential behavioral responses from different categories of
stakeholders. In the Harberger tradition, stakeholder analysis focuses on a small set of project
externalities, while sharply contrasting ideas of stakeholder integration emerge when analysts adopt
sociological or political-economy-driven frameworks to disaggregate affected groups. Both
traditions will be reviewed in benchmarking IFI practice.
D.5: Economic Evaluation of Medical Interventions (Marvin 413-414)
Chair: Don Kenkel (dsk10@cornell.edu), Cornell University
Presentations:
1. Saving Lives with Stem Cell Transplants, Damien Sheehan-Connor*
(dsheehanconn@wesleyan.edu), Wesleyan University; Theodore Bergstrom, University of California
Santa Barbara; Rodney Garratt, Federal Reserve Bank of New York
Blood stem cell transplants can be life saving for some patients, but the chances of finding a
matching donor are small unless a large number of potential donors are evaluated. Many nations
maintain large registries of potential donors who have offered to donate stem cells if they are the
best available match for a patient needing a transplant. An alternative source of stem cells, umbilical
cord blood, is stored in banks. Everyone faces a small probability of needing a transplant that will
increase their likelihood of survival. The registries and cord blood banks are thus an interesting
example of a pure public good with widely dispersed benefits. This paper explores the gains in
survival probability that arise from increased registry and bank sizes and uses "value of statistical life" methods to estimate benefits and compare them to costs. Our results suggest that for the United
States and for the world as a whole, the sum of marginal benefits of an increase in either the adult
registry or the cord blood bank exceeds marginal costs. However, marginal benefit-cost ratios for the
adult registry are much greater than those for the cord blood banks, which suggests that to the
extent that these two sources of life saving compete for public funds it may be preferable to prioritize
expansion of the adult registry over cord blood banks.
2. The Use of Economic Evaluation to Inform Newborn Screening Policy Decisions: The
Washington State Experience, Scott D. Grosse* (sgrosse@cdc.gov), U.S. Centers for Disease
Control and Prevention; John D. Thompson and Michael Glass, Washington State Department of
Health
Context: The use of economic evaluation methods to inform newborn screening (NBS) policy
decisions has received little attention. This paper documents the use of benefit-cost analysis (BCA)
models to inform policy decisions in Washington State since 2001. The experience of the
Washington State Department of Health in developing analyses of expected costs and outcomes
can help other political jurisdictions to decide whether to take a similar approach.
Methods: Discussions with experts involved in the NBS policy process in Washington State were
combined with an analysis of internal reports and spreadsheet files to summarize and discuss BCA
models for two disorders: MCAD deficiency and cystic fibrosis completed in 2003 and 2005. Avoided
deaths were valued using societal willingness-to-pay estimates of the value of a statistical life (VSL)
of $4 million. The assumptions and findings were compared with subsequently published findings in
the peer-reviewed scientific literature to assess the accuracy of the estimates of outcomes.
Findings: The Department of Health prepared spreadsheet models demonstrating positive expected
net benefit of adding MCAD deficiency and cystic fibrosis to the state NBS panel. The primary
benefit in both models was from the expected reduction in infant deaths resulting from early
detection and treatment. These models provided the needed information to complete the policy
making process. The findings of the models are consistent with other information on the health
outcomes and costs of screening for the selected disorders.
Conclusions: Public health NBS programs can develop the capacity to project the expected costs
and benefits of expansion of NBS panels. The Department of Health has continued to use BCA to
inform NBS policies, including a 2013 analysis of screening for severe combined immune deficiency
(SCID) that used VSL estimates of $6.1-9.1 million and led to the addition of SCID to the state NBS
panel.
3. Retrospective Benefit-Cost Analysis Review of Bar Code Labeling for Human Drug and
Biological Products, Nellie Lew* (nellie.lew@fda.hhs.gov), Clark Nardinelli, and Andreas Schick,
U.S. Food and Drug Administration; Elizabeth Ashley, Office of Management and Budget
With an objective to reduce the number of medication administration errors that occur in hospitals
and other healthcare settings each year, FDA published a final regulation in 2004 that requires
pharmaceutical manufacturers to place linear bar codes on certain human drug and biological
products. At a minimum, the linear barcode must contain the drug’s National Drug Code (NDC)
number, which represents the product’s identifying information. The intent was that bar codes would
be part of a system where healthcare professionals would use bar code scanning equipment and
software to electronically verify against a patient’s medication regimen that the correct medication is
being given to the patient before it is administered. By requiring commercial drug product packages
and unit-dose blister packages to carry bar code labels, it was anticipated that the rule would
stimulate widespread adoption of bar code medication administration (BCMA) technology among
hospitals and other facilities, thereby generating public health benefits in the form of averted
medication errors. We use the 2004 prospective regulatory impact analysis as the basis to reassess
the costs and benefits of the bar code rule. Employing the most recent data available on actual
adoption rates of bar code medication administration technology since 2004 and other key
determinants of the costs and benefits, we examine the impacts of the bar code rule since its
implementation and identify changes in technology that have occurred. In this retrospective study,
we also use alternative models of health information technology diffusion to improve estimates of
counterfactual scenarios against which we compare the effects of the bar code rule.
4. Benefit-Cost Analysis of Universal Newborn Screening for Severe Combined
Immunodeficiency (SCID): A Policy Tool for State Health, Yao Ding* (yao.ding@aphl.org),
Ruhiyyih Degeberg, Jelili Ojodu, Association of Public Health Laboratories; Scott D. Grosse, Centers
for Disease Control and Prevention; John D. Thompson, Washington State Department of Health
Context: Severe combined immunodeficiency (SCID) is a genetic disorder that results in profound
T-lymphocyte deficiencies. Infants with SCID typically are diagnosed after the onset of recurrent
severe infections around 6 months of age. Among infants with SCID who receive hematopoietic
stem cell transplantation, 94% of those transplanted within 3.5 months survive, compared to 50-70%
for those transplanted later. The T-cell receptor excision circle (TREC) assay using dried blood spots
collected for newborn screening (NBS) can identify infants with T-lymphocyte deficiencies, and this
screening test has been successfully integrated into more than 20 state NBS programs. Other states
still need to assess the costs of adding screening for SCID relative to potential benefits.
Methods: We constructed a customizable decision tree model as a spreadsheet tool to estimate the
potential cost benefit of universal NBS for SCID compared to no screening. Probabilities and costs
were derived from published literature and recommendations from expert panels, with averted
deaths valued at $9 million each.
Findings: For a birth cohort of 86,600 babies, annual costs of screening, including laboratory tests,
short-term follow-up, diagnostic testing of false-positive cases, clinical care and diagnostic testing of
non-SCID T-cell lymphopenia cases were estimated to total $747,008. Total benefit, consisting of the value of infant deaths averted and treatment costs avoided by SCID screening, was estimated at $3.99 million, resulting in a net benefit of $3.24 million and a benefit-cost ratio of 5.35. If no monetary
value is assigned to deaths, the incremental cost-effectiveness ratio is approximately $30,000 per
life-year saved (using 3% discount rate).
Conclusions: Universal NBS for SCID provides the possibility of early diagnosis, improved
treatments, and lives saved for babies with SCID. This spreadsheet tool can help state officials to
estimate SCID screening costs and net benefits.
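A quick arithmetic check of the figures reported in this abstract, using only the numbers stated above (annual screening costs of $747,008 and total benefits of $3.99 million):
```python
# Arithmetic check using only the figures stated in the abstract.
screening_cost = 747_008        # annual screening, follow-up, and diagnostic costs ($)
total_benefit = 3_990_000       # value of deaths averted plus treatment costs avoided ($)

net_benefit = total_benefit - screening_cost
bcr = total_benefit / screening_cost
print(f"Net benefit: ${net_benefit:,.0f}")     # roughly $3.24 million, as reported
print(f"Benefit-cost ratio: {bcr:.2f}")        # roughly 5.3, close to the reported 5.35
```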
E.5: Methodological Approaches to Benefit-Cost Analysis (Marvin 310)
Chair: Kerry Krutilla (krutilla@indiana.edu), Indiana University
Presentations:
1. Regulatory Compliance Learning Costs: An Assessment of Agencies’ Practices, Ronald
Bird* (rbird@uschamber.com), U.S. Chamber of Commerce
A potentially significant element of the compliance cost of a new or revised regulation encompasses
the time, effort and expense that affected entities incur to learn whether the rule imposes liabilities
on them and, if so, what the dimensions of the compliance obligation are. This is a cost element that
is antecedent to the labor and capital costs of actually complying with the rule, and it is a cost
element that potentially impacts to some degree those who are exempt from the scope of the
regulation as well as those who are its intended targets. There is no osmosis by which I learn
without opportunity cost that a new requirement exists and whether or not it applies to me. I must
know the rules that impact my choices and action. When new rules are announced, I must learn
whether or how constraints on my choices and actions are altered. Learning requires time, and time
use implies opportunity cost. This paper surveys existing research literature regarding regulatory
learning costs, and it reviews employment, occupational safety and health, hazardous waste
disposal, land/water use, and other regulations issued anew or revised over the past ten years to
determine the frequency and detail of analysis by regulatory agencies of regulatory learning
costs. Illustrations of aggregate learning costs for selected regulations are calculated. Strategies
that agencies have used or could use to reduce learning costs are identified and evaluated. The
paper also addresses the fundamental question of whether learning cost is proper to include in a
benefit cost analysis of a regulation designed to correct a significant economic inefficiency arising
from market failure. Could consideration of short-term learning costs create an inefficient barrier to
adoption of standards that yield long-term economic efficiency or equity benefits?
2. The Mercatus Center’s Regulatory Cost Calculator: A Survey Tool to Capture the Full
Opportunity Costs of Regulation, Jerry Ellig* (jellig@mercatus.gmu.edu), George Mason
University
Federal agencies’ Regulatory Impact Analyses often measure expenditures necessary to comply
with proposed regulations. But they frequently fail to assess the full opportunity costs of regulations,
including lost business opportunities and forgone consumer surplus. This presentation will document
these claims with data from an evaluation of 108 RIAs for economically significant regulations
proposed between 2008 and 2012. It will then propose a solution that would give federal agencies
access to better data on the full range of anticipated costs of individual regulations. The Mercatus
Center at George Mason University has developed a survey tool that can be administered to
regulated entities by trade associations, independent researchers, or government regulatory
analysts. Costs of regulation captured by the survey include:
• Direct expenditures on compliance, such as paperwork, new equipment, or employee training;
• The value of owner, manager, or employee time diverted to regulatory compliance;
• Profit forgone on investments the business no longer makes as it is forced to divert resources to regulatory compliance;
• The profit businesses lose and the value consumers lose from the price increases, quality changes, or other sales-reducing behavioral responses induced by regulation;
• The costs of resources that businesses and trade associations expend to influence regulation.
These costs are not always explicit or obvious, because some of them involve lost opportunities
rather than expenditures. The purpose of the Cost Calculator is to provide better information on the
full cost of regulation by identifying both direct and indirect costs and eliciting information that
economists can use to estimate the cost of regulation to customers and our broader society.
3. Spatial Analysis in a Benefit-Cost Context, Erik Gomez* (erik.gomez@uscg.mil) and
Douglas Scheffler, U.S. Coast Guard
The typical tools of benefit-cost studies are the analysis, modeling, and display of data in
spreadsheets or specialized statistical software. Many benefit-cost analyses involve data that have a
location reference, such as street addresses of homes and businesses, latitude longitude
coordinates of accidents, or at a larger scale names of cities and towns. This type of data is called
“geo-referenced or spatial data” and are capable of analysis by Geographic Information Systems
(GIS). GIS is used to create, manage, analyze, and exhibit spatial information. Specifically, spatial
statistics have the ability to analyze spatial distributions, patterns, events, and relationships. These
features add to conventional statistics in that they incorporate spatial elements into their
computations. This allows users to identify statistically significant phenomena, assess overall
clustering patterns, and explore spatial connections. The proposed presentation will provide an
overview of GIS, review the tools that can be used in the analyses of benefits and costs, discuss
how the Coast Guard has added GIS to its cost-benefit analysis toolkit, and provide a case study of
a Coast Guard regulatory analysis project that used GIS to complement the traditional spreadsheet-based calculations. We also will provide an overview of the GIS market.
4. Multi-Period Benefit-Cost Analysis, Andrew Schmitz* (aschmitz@ufl.edu) and Dwayne J. Haynes, University of Florida; Troy G. Schmitz, Arizona State University
We discuss the link between classical welfare economics and benefit-cost analysis (BCA) by
conducting a welfare analysis of the 2004 Fair and Equitable Tobacco Reform Act (buyout). Using
both a one-period and multi-period partial-equilibrium model, we compare and contrast several
possible choices of benefit-cost ratios (BCRs) that could be used to assess the impact of the buyout.
These BCRs differ depending upon, for example, whether one includes foreign consumers in the
model. We also include a model accounting for distributional considerations, by explicitly using
distributional welfare weights. A strong conclusion is that one-period BCRs are not affected by the
choice of social discount rate, because the present value calculation associated with a one-period
BCR requires the discounting of both the benefits in the numerator and the costs in the denominator.
We extend the analysis to include multiple time periods and show the conditions under which the
choice of social discount rate alters the underlying present values associated with the BCRs. We find
that the BCRs are only slightly affected by the choice of social discount rate. The relative length of
the different periods under consideration has more of an effect on the BCRs than the choice of the
discount rate.
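The discounting point in this abstract can be verified numerically: with a single period, the same discount factor multiplies both the numerator and denominator of the BCR and cancels, whereas with multiple periods it generally does not. The cash flows below are hypothetical.
```python
# Hypothetical cash flows illustrating the discounting point above.
def bcr(benefits, costs, r):
    pv = lambda flows: sum(x / (1 + r) ** t for t, x in enumerate(flows))
    return pv(benefits) / pv(costs)

one_period_b, one_period_c = [0, 150], [0, 100]        # everything occurs in year 1
multi_b, multi_c = [0, 40, 60, 80], [100, 10, 10, 10]  # costs up front, benefits later

for r in (0.03, 0.07):
    print(f"r = {r:.0%}: one-period BCR = {bcr(one_period_b, one_period_c, r):.3f}, "
          f"multi-period BCR = {bcr(multi_b, multi_c, r):.3f}")
```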
Session 6 - Friday, March 20, 10:45 - 12:15
A.6: Valuing Reductions in Morbidity and Mortality (Marvin 309)
Chair: Joseph E. Aldy (joseph_aldy@hks.harvard.edu), Harvard University
Presentations:
1. The Morbidity and Mortality Components of the Value of Statistical Life, Elissa Philip*
(elissa.philip@vanderbilt.edu) and W. Kip Viscusi, Vanderbilt University
Although government agencies generally rely on a uniform value of statistical life (VSL), numerous
studies have documented substantial heterogeneity in these values. Most of the research to date on
the heterogeneity of the VSL has focused on differences based on individual characteristics, such as
age, and on long-term illnesses, such as cancer. This article uses labor market data to estimate the
mortality and morbidity components of the VSL for acute accidents. The individual fatality rate data
from the BLS Census of Fatal Occupational Injuries (CFOI) make it possible to construct measures
of the mortality and morbidity components of fatality risks. Our labor market estimates of morbidity
effects are positive, even for fatalities that are predominantly caused by traumatic injuries. However,
the fatality risk, rather than the morbidity loss associated with the fatal accident event, is the principal
contributor to the VSL.
2. Implications for Regulatory Analyses of Different Approaches to Estimating the Value of
a Statistical Life, Ines Havet* (ines.havet@ontario.ca), Ontario Ministry of the Environment
The Value of a Statistical Life (VSL) estimates used in benefit-cost analyses have been discussed
extensively and a large number of studies have been developed estimating values with increasing
refinements over time to the statistical methods applied. This article is concerned with the theoretical
foundations of the VSL estimates recommended by regulatory agencies in regulatory benefit-cost
analyses and their implications in terms of the findings put forward to decision makers. First, the
difference between willingness-to-pay (WTP) and willingness-to-accept-compensation (WTAC) will
be reviewed. While in theory, VSL estimates based on either approach should be equal, in practice
estimates tend to be higher when measuring WTAC. We argue that WTAC measures are not
appropriate in analyses of regulatory proposals that are typically intended to reduce the risk of
premature death rather than provide compensation for a loss. Second, the recommended VSL
values used by regulatory agencies in benefit-cost analyses will be reviewed, including whether they
are based on WTAC or WTP studies and the estimates from published regulatory analyses. The
recommended VSL estimates are derived either exclusively from WTAC or WTP studies or some
combination of the two. Where possible, the results of these analyses will be reassessed by applying
WTP estimates. We propose that where WTAC values were used, the conclusions of analyses may
be biased toward accepting the proposal.
3. Eliciting Fatal and Non-Fatal Risk Trade-Offs: An Experimental Approach Using an
Incentivized Learning Experiment, Hugh Metcalf* (hugh.metcalf@ncl.ac.uk), Susan Chilton and
Jytte Seeted Nielsen, Newcastle University
In general, stated preferences are elicited to be used in a Cost Benefit Analysis to inform an
allocative decision process. The Risk-Risk Trade-Off (RTO) has been used to elicit the relative
trade-off between changes in morbidity and mortality risk (Viscusi et al., 1991); however, only a very few studies examine potential methodological issues affecting this method. An experimental design will allow
us to explore whether it is possible to ameliorate two general problems within stated preference
surveys that may affect the RTO approach. The first problem is a general lack of sensitivity in
surveys to changes in characteristics that economic theory would predict should matter to
respondents. This includes “scope insensitivity” which has been found in the willingness-to-pay
literature. We refer to this generic problem as “stickiness”. “Stickiness” might manifest itself in two
ways in the RTO method: firstly, in people indicating indifference between increasing the risk of two
very different health outcomes, or, secondly, in an unwillingness to take any risk increase in more
severe outcomes. The second general problem is framing effects and the violation of the
assumption of procedural invariance (Tversky & Thaler 1990). In the RTO literature, the risk change
has either been presented as a marginal change to the current situation (Chilton et al., 2006) or as
the total risk in the changed situation (Viscusi et al. 1991). In this paper we compare the responses
from these two frames. We also explore the impact on responses of a pre-survey experiment that
invokes the spirit of “rationality spillover” (Cherry et al., 2003), in which subjects make (incentivised)
risky choices as in a standard laboratory experiment. We find that the RTO is affected by the
“stickiness” problems but the impact on central tendency measures can be reduced by the learning
experiment and by a change in risk frame.
4. Preferences for Life-Expectancy Gains: Sooner or Later? James K. Hammitt*
(jkh@harvard.edu) and Tuba Tuncel, Harvard University/Toulouse School of Economics
We assess individuals’ preferences for time paths of reductions in mortality risk yielding a life-expectancy gain of about one month. Using data from a survey of more than 1000 French residents,
we find substantial heterogeneity. We elicit pairwise preferences between three primary
perturbations of age- and gender-specific survival curves: transient (reduce hazard for next ten
years), additive (reduce hazard in all years by an additive constant) and proportional (reduce hazard
in all years by a multiplicative constant). The preference order implied by these pairwise responses
is transitive for 85 percent of respondents. The most common preference orders, accounting for over
53 percent of respondents, are strict indifference, proportional ≻ additive ≻ transient, and the exact
opposite ranking. These preference orders are consistent with globally risk-neutral, risk-seeking, and
risk-averse preferences toward longevity, respectively. Choices between one of these scenarios and
a latent version that provides no risk reduction for the first 10 or 20 years are consistent with these
risk postures. Preferences toward the time path of mortality-risk reduction are not strongly
associated with individual characteristics, although respondents who are younger, higher-income, or
exhibit higher consumption-discount rates tend to exhibit less longevity-risk aversion.
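The three survival-curve perturbations compared in this abstract (transient, additive, proportional) can be illustrated with a simple discrete-time survival calculation. The baseline hazard below is an assumed Gompertz-style curve, not the French life tables used in the study, and the perturbation sizes are arbitrary rather than calibrated to a one-month gain.
```python
# Illustrative only: life-expectancy gains for a 40-year-old under the three
# perturbation types named above, with an assumed baseline hazard.
import numpy as np

ages = np.arange(40, 111)
base_hazard = 0.0004 * np.exp(0.095 * (ages - 40))   # assumed annual death hazard

def life_expectancy(hazard):
    # Discrete-time survival: S(t) = prod_{s<=t} (1 - h(s)); LE is approximated by sum of S(t).
    survival = np.cumprod(1 - np.clip(hazard, 0.0, 1.0))
    return survival.sum()

le_base = life_expectancy(base_hazard)

transient = base_hazard.copy()
transient[:10] -= 0.0003                  # lower hazard for the next ten years only
additive = base_hazard - 0.0002           # lower hazard by a constant in all years
proportional = base_hazard * 0.99         # lower hazard by a constant proportion

for name, hazard in [("transient", transient), ("additive", additive),
                     ("proportional", proportional)]:
    gain_months = 12 * (life_expectancy(hazard) - le_base)
    print(f"{name:12s} perturbation: life-expectancy gain of about {gain_months:.1f} months")
```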
B.6: Food and Water Issues (Marvin 307)
Chair: James Neumann (jneumann@indecon.com), Industrial Economics
Discussant: Linda Abbott (LABBOTT@OCE.USDA.GOV), U.S. Department of Agriculture
Presentations:
1. Valuing an Ounce of Prevention: the Social Cost-Effectiveness Analysis of Alternative
Strategies to Secure Groundwater Quality, Ted Horbulyk* (horbulyk@ucalgary.ca), University of
Calgary
This analysis characterizes and compares alternative strategies to secure groundwater supplies
under threat of nitrate contamination from agriculture. These approaches are to be viewed as a
social investment to be made by a local water authority acting in the public interest. The authority’s
objective function consists of delivering a future supply of municipal drinking water that meets or
exceeds specified targets for nitrate concentration at the lowest social cost.
The present analysis is structured as a case study of the choices being faced by Oxford County
in Ontario, Canada, where its ability to provide a future supply of safe drinking water to municipal
users is threatened by nitrate contamination. There is a relatively large set of management and
policy approaches upon which it can draw. For instance, there is an option to encourage or to
require the adoption of so-called beneficial management practices by local farmers, such as the
reduction of application rates of nitrate fertilizers. Remediation approaches include treating the
water for nitrates either before (in situ cross-injection scheme) or after (ion-exchange process) the
water has reached the wellhead. Alternatively, it may be preferable to develop new groundwater or
surface water sources, or to construct new conveyance inter-ties to existing sources elsewhere
within the county. Demand-side management approaches might be needed to reduce the future rate
of demand growth, thereby delaying the need to make specific capital investments.
The contribution of this analysis is to bring together all of these alternatives in a social cost-effectiveness framework that makes explicit the nature of the planning and management tradeoffs
faced by these water managers when acting on society’s behalf. The careful and standardized
comparison of these management approaches on a present value basis can make clear that, in
many cases, an ounce of prevention can be an extremely valuable investment indeed.
2. Consumer Valuation of Organic Egg Characteristics and its Implications for Public
Policy, Eliza M. Mojduszka* (emojduszka@oce.usda.gov), U.S. Department of Agriculture
We develop and estimate a discrete choice model of product choice for organic eggs in the US
market in the period from 2010 to 2014. The model links observable and latent consumer
characteristics to observable and latent product attributes and allows us to obtain consumer
preference parameters and demand elasticities with regard to product attributes.
The objectives of this paper are twofold. First, we build a comprehensive understanding of the
relative importance of different determinants of consumer organic egg choices from 2010 to 2014.
Second, we make a significant contribution to demand analysis by basing this understanding on the
use of uniquely comprehensive data sets and theoretical/modeling techniques that evaluate demand
on the brand/product level. The overall goal is to analyze what is driving consumer choices of
organic eggs and the implications of these drivers for public policy (including animal welfare policy)
in the United States. A particular focus is the relative importance of organic egg attributes, consumer
attitudes and perceptions, as well as marketing and product development strategies in determining
consumer demand for organic eggs.
We define individuals' perceptions as latent variables corresponding to the product attributes. More
specifically, perceptions are individuals’ estimates or beliefs of the levels of attributes of product alternatives. We also define individuals’ attitudes as latent variables corresponding to the characteristics of the decision maker. Attitudes reflect an individual’s taste, awareness, needs, values,
and capabilities. Thus, in our model, the distribution of consumer utilities depends on both
observable and latent (unobservable) individual characteristics. These determine preferences for
product attributes (some of which are unobservable) and hence determine demand.
The database utilized in our estimations integrates product level IRI scanner purchase data, product
attributes data (including information on the level of animal care provided by organic egg producers
and rated by the Cornucopia Institute in the form of Organic Egg Scorecards), consumer
characteristics data, as well as producers’ marketing efforts data for products and brands in the
organic eggs category. By integrating all of our data sources, we are able to obtain more precise
estimates of the demand parameters that are crucial for the design of more effective public policy
(including animal welfare policy) and for the marketing and promotion of food products by producers
and distributors.
3. Benefit Cost Analysis of Water, Smita Bhatia* (smita.bhatia@specstra.ca), Specstra
Consulting Inc.
Water is fundamental to human life. Unlike fossil fuels, there is no substitution for water. The
valuation of water will become increasingly relevant as the water demand increases across the
globe. Water management is complex and needs to be addressed now to avert a future global crisis.
Private companies have suggested commodification of water. While this is an interesting proposition,
it underscores important issues such as access, ownership and pricing. Water is viewed as a public
good and a unique feature of water management is the role of individuals. This increases the
complexity in determining the demand, supply, storage and transportation of water. Water
management requires intelligent and thoughtful foresight in policies. Good policies require sound
benefit cost analysis. How should water be valued in the face of scarcity and unequal distribution of
water resources globally? As a commodity, is water different than other natural resources? Do
existing economic models adequately address the complexities in water management? This
presentation will canvass these issues from a benefit-cost analysis perspective. It will examine the
utility of existing benefit cost analysis models for natural resources to manage water. At a minimum,
water economics needs to be broader than the neoclassical focus on efficient allocation of
resources.
C.6: Local Policy Issues (Marvin 308)
Chair: Richard Zerbe (zerbe@uw.edu), University of Washington
Discussant: Stuart Shapiro (stuartsh@rutgers.edu), Rutgers University
Presentations:
1. How to Deal with a Public University’s Funding Cut: An Analysis Using Benefit-Cost
Principles, Qingqing Sun* (qingqs@uw.edu), Daniel Ahn and Michael McBride, University of
Washington
The Office of Financial Management (OFM) in Washington State plans to cut 15% from higher education funding next year. We assume that Western Washington University's (WWU) response to the potential cuts in state funding is to increase tuition substantially. We want to estimate the impact of the tuition increase on WWU itself and on the local community as a whole, and whether this decision will maximize social net benefit while overcoming budget shortfalls.
First, we measure the impact of raising tuition on the primary market: 1) the quantity and quality of WWU's utilities, 2) the quality of undergraduate education and academic reputation, and 3) WWU's freshman enrollment. We then measure the impact of raising tuition on secondary markets: 1) freshman enrollments at community colleges and other universities in Washington State, 2) the economic impact of WWU's tuition increase on the local economy, and 3) WWU students' financial burdens. We then calculate and discount the social net welfare of WWU's tuition increase at a 7% discount rate, the rate recommended by the Office of Management and Budget.
Based on our research, we estimate that the tuition elasticity of total headcount at WWU is -0.094. We find that the tuition increase decreases WWU's freshman enrollment slightly and does not noticeably improve WWU's academic reputation or infrastructure. In addition, the tuition increase does not have obvious economic impacts on the local economy or on WWU students' financial burden. However, it does increase enrollments at community colleges and other universities in Washington State that have lower tuition and higher admission rates than WWU. Finally, we also conduct sensitivity analysis to test the validity of our study.
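A back-of-the-envelope application of the reported elasticity (-0.094); the tuition increase and baseline headcount below are assumed for illustration and are not figures from the study:
```python
# Illustrative inputs; only the elasticity comes from the abstract.
elasticity = -0.094            # tuition elasticity of total headcount at WWU
tuition_increase = 0.15        # assumed 15% tuition increase
baseline_headcount = 15_000    # assumed total enrollment

pct_change = elasticity * tuition_increase
print(f"Predicted enrollment change: {pct_change:.2%} "
      f"(about {abs(pct_change) * baseline_headcount:.0f} students)")
```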
2. The Benefits and Costs of an Earthquake Early Warning System in Washington State, Eli
Lieberman* (lieberman8228@gmail.com) and Andrew Calkins, University of Washington
Earthquake Early Warning Systems have the potential to help mitigate loss of life, severity and
number of human casualties, damage to infrastructure, and negative impacts to an array of
businesses by providing advance warnings that range from a few seconds to multiple minutes.
Washington State is a high-risk area for seismic activity, including being susceptible to the damages
associated with a magnitude nine earthquake along the Cascadia Subduction Zone off the
Washington State coast. This paper presents a preliminary attempt at conducting a cost-benefit
analysis of implementing an Earthquake Early Warning System for Washington State. Specifically,
this paper examines the Shake Alert system proposed by the United States Geological Survey and
several West Coast universities including the University of Washington. The study develops a
casualties-avoided model for assessing whether such a system is worthwhile, given extreme
variability and uncertainty in earthquake prediction models and early warning technology. Impacts
are limited to the Shake Alert’s potential to mitigate loss of human life and the severity and number
of casualties. A Monte Carlo analysis is used to provide a more robust finding owing to variability of
the study’s variables. We believe we lay a solid framework from which a more comprehensive
analysis can be built.
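A minimal Monte Carlo sketch of a casualties-avoided benefit calculation in the spirit of the analysis described above; every distribution and parameter value below is a hypothetical placeholder rather than an input from the study:
```python
# Hypothetical Monte Carlo sketch of a casualties-avoided benefit calculation.
import numpy as np

rng = np.random.default_rng(3)
n_sims = 100_000

annual_quake_prob = rng.uniform(0.005, 0.02, n_sims)    # chance of a damaging quake
fatalities_if_quake = rng.lognormal(mean=6.0, sigma=1.0, size=n_sims)
fraction_avoided = rng.uniform(0.01, 0.10, n_sims)      # lives saved by early warning
vsl = 9.0e6                                              # assumed $ per statistical life

annual_benefit = annual_quake_prob * fatalities_if_quake * fraction_avoided * vsl
print(f"Mean annual benefit:  ${annual_benefit.mean():,.0f}")
print(f"5th-95th percentile:  ${np.percentile(annual_benefit, 5):,.0f} to "
      f"${np.percentile(annual_benefit, 95):,.0f}")
```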
3. Ex Ante Benefit-Cost Analysis of New Stadium Construction: Theory and Application,
Bryan Roberts* (broberts@econometricainc.com), Econometrica; Daniel Greene, Alden Street
Consulting
Since 1990, dozens of states, municipalities, and local governments have spent nearly $19 billion in taxpayer dollars to construct new sports stadiums in hopes of fostering positive economic growth.
Yet, in justifying the use of such large taxpayer subsidies, many governments fail to use cost-benefit
analysis and, instead, rely on distorted and often unreliable economic impact analysis. As a result,
new stadium construction has become an important issue for benefit-cost analysis. We contribute to
the current literature on the economic benefits of sports stadiums by developing a methodology for
analyzing the benefits of new stadium construction in a willingness-to-pay framework that takes into
account increased welfare of event attendees due to improved facility quality. This approach
requires careful specification of the market structure for event attendance at the new stadium. We
also identify benefits related to secondary-market effects that can be legitimately incorporated into a
benefit-cost analysis. We then apply our framework to the construction of a new stadium for the
Atlanta Braves baseball team. This example involves a shift of stadium location from the City of
Atlanta to Cobb County, which is in the Atlanta metropolitan region. We evaluate benefits and costs
both from the broader perspective of the Atlanta metropolitan region and the narrower perspective of
the Cobb County government. A Kaldor-Hicks table of benefits and costs for specific stakeholders is
presented for each perspective.
 D.6: Real Option Value and Federal Offshore Leasing (Marvin 413-414)
Chair: Michael Livermore (Livermore@exchange.law.nyu.edu), NYU Institute for Policy Integrity
Panelists to Include:
1.
Michael Hanemann (hanemann@are.berkeley.edu), UC Berkeley and Arizona State University
This presentation will describe the concept of option value and how it arises in the context of natural
resources extraction. The real option value of a discrete irreversible investment in the face of
uncertainty is the value of information conditional on not taking on the investment in the present.
This information comes from learning more about the state of the world: the effect of drilling on the
environment or reducing the uncertainty over the price of environmental services. The real option
value can be broken down into two components: the pure postponement value (PPV), which does
not depend on the chance of getting new information, and the pure informational quasi-option value,
which depends on society’s ability to learn information in the future. Professor Hanemann will
discuss the Arrow-Fisher-Hanemann-Henry (AFHH) option value, also called the quasi-option value,
and its applications.
2.
Scott Farrow (farrow@umbc.edu), University of Maryland, Baltimore County
Many government agencies account for the first part of the real option value (PPV) in their benefit-cost analyses, and may partially account for the second part (pure informational quasi-option value).
Professor Farrow will discuss recent and potentially new applications of options value, including
merging risk assessment, benefit-cost analysis, and real options. For example, valuing flood
protection may involve a comparison of multi-state expected damages, option price and cumulative
prospect measures. Professor Farrow will also discuss other potential applications of option value to
federal natural resources management.
3.
Jayni Hein (jayni.hein@nyu.edu), New York University
In 2014, Policy Integrity challenged the legality of the Department of the Interior’s five-year offshore
leasing plan, for failure to use or consider option value when determining the timing and location of
offshore leases. A decision by the U.S. Court of Appeals for the District of Columbia Circuit is
expected by early 2015. We will provide an overview of the case, focusing on the issues we raised
with respect to option value. We argued that the Department of the Interior should use an option
value approach in order to fully value the American people’s offshore resources—including the value
attached to the option to delay extraction of those resources. Framing leasing decisions as now-or-never choices made within a single five-year period will systematically lead to inefficient overexploitation of natural resources. Further, different offshore regions face different uncertainties and,
therefore, have different option values. In line with its statutory mandate under the Outer Continental
Shelf Lands Act, the Department of the Interior should transparently disclose regional differences in
costs—including environmental and social costs—and consider the uncertain environmental
sensitivities and uncertain drilling and remediation technologies that certain regions (such as the
Arctic) face. By failing to account for the informational value of delay with respect to environmental
and social uncertainties, Interior exposes the public to suboptimal levels of environmental risk, as
well as suboptimal returns on the sale of its nonrenewable offshore resources.
4.
Peter Howard (HowardP@exchange.law.nyu.edu), New York University
The fourth presentation will review methods for estimating real option and quasi-option values, in the
context of leasing in the Arctic region of the United States. There are several well-established
methodologies that agencies can use to estimate quasi-option value. First, an agency could use
contingent valuation techniques by surveying various regulators involved in oil-environmental
planning decisions to determine the value that they place on waiting (Fisher and Hanemann, 1990).
Second, an “engineering-economic approach” could be applied where the theoretical model is
parameterized using studies from the literature, additional analysis, and surveys of experts (Fisher
and Hanemann, 1990). Third, a programming approach can be applied. While similar to the previous
method, this method consists of running numerical simulations (Mahul and Gohin, 1999, and HaDuong, 1998) that allow for multiple runs using different future scenarios (e.g., low and high drilling
cost scenarios). Calculating the real option value would require only one more step in which the
value of the additional information can be calculated by comparing the results of these simulations
that are run under certainty to those that are run under uncertainty using the formulas established in
the literature. If some of the parameters for such models (e.g., probabilities of various scenarios)
cannot be determined, Monte Carlo simulations, which are frequently used in the physical sciences
and in finance when there is significant uncertainty, can be used. Last, though not specifically
calculating quasi-option value, the agency could expand its current calculation of the hurdle price,
which captures the value of the option to delay from market price uncertainty, to a social hurdle
price. The benefits and costs of these various approaches, as they apply to the government’s drilling
decisions are discussed.
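As a stylized illustration of the simulation comparison described above (not the agency’s calculation), the sketch below contrasts a commit-now decision with a wait-and-learn decision under two invented cost scenarios; the difference is the informational quasi-option value of delay. Discounting and the pure postponement component are omitted for brevity.

```python
# Invented two-scenario example: drilling now is irreversible, while waiting lets the
# agency observe which scenario occurs before deciding. Payoffs are illustrative NPVs.
probs = {"low_cost": 0.5, "high_cost": 0.5}
payoff_if_drill = {"low_cost": 120.0, "high_cost": -40.0}
payoff_if_preserve = 0.0

# Decide today without learning: pick the single action with the best expected payoff.
ev_drill_now = sum(p * payoff_if_drill[s] for s, p in probs.items())
value_commit_now = max(ev_drill_now, payoff_if_preserve)

# Wait, learn which scenario holds, then choose the best action scenario by scenario.
value_wait_and_learn = sum(p * max(payoff_if_drill[s], payoff_if_preserve)
                           for s, p in probs.items())

quasi_option_value = value_wait_and_learn - value_commit_now
print(quasi_option_value)  # 20.0 in this example: the value of information from delay
```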
 E.6: International Benefit-Cost Analysis Issues (Marvin 310)
Chair: Timothy Brennan (brennan@umbc.edu), University of Maryland, Baltimore County
Discussant: William McCarten (wmccarten@gmail.com), World Bank - retired
Presentations:
1. Should BCA Be Harmonized? Leo Dobes* (Leo.Dobes@anu.edu.au), Australian National
University; George Argyrous, Australia and New Zealand School of Government; Joanne Leung,
New Zealand Ministry of Transport
Governments invariably require expenditure proposals to be supported by a Benefit-Cost Analysis
(BCA), or a similar approach. In theory, this allows decision-makers to compare proposals and to
choose those with the highest net social benefits. The results of BCAs depend on the specific
methodology adopted: a production function approach is likely to yield a different result to an
analysis based on stated preference methods. However, results will also depend on the values of
variables and parameters used as inputs to Net Present Value (NPV) calculations. Discount rates
are the usual focus of those promoting consistency of analytical inputs. But NPVs can also be highly
sensitive to key variables such as travel time, the marginal excess tax burden used (or not used), or
externalities such as congestion or greenhouse gas emissions. Without consistency of methodology
and input variables, BCAs will not provide directly comparable options for decision makers, even if
individual analyses are highly sophisticated. A base case response might be to simply present
decision-makers with unadjusted studies, provided that they meet fundamental analytical
standards. A ‘work-around’ used by some central agencies is to partly recalculate or adjust BCA
results to increase the degree of comparability. A third potential solution is to standardize the
variable and parameter values to be used by analysts as a means of increasing consistency. This
paper presents a taxonomy of current practice in the various Australian and New Zealand
jurisdictions and examines the arguments for and against greater harmonization of methodology and
input variables in BCA in a federal system.
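The sensitivity of NPV results to jurisdiction-specific inputs can be illustrated with a deliberately simple example; the project, the parameter values, and the jurisdictions below are hypothetical.

```python
# Hypothetical transport project appraised under two jurisdictions' parameter sets.
def npv(annual_benefit, annual_cost, rate, years=30):
    """Present value of a constant net benefit stream over `years`, discounted at `rate`."""
    return sum((annual_benefit - annual_cost) / (1 + rate) ** t for t in range(1, years + 1))

hours_saved_per_year = 1.0e6   # assumed annual travel time savings (hours)
annualized_cost = 12.0e6       # assumed annualized project cost ($)

jurisdictions = {
    "Jurisdiction A": {"rate": 0.04, "value_of_time": 15.0},
    "Jurisdiction B": {"rate": 0.07, "value_of_time": 12.0},
}
for name, p in jurisdictions.items():
    benefit = hours_saved_per_year * p["value_of_time"]
    print(name, f"NPV = ${npv(benefit, annualized_cost, p['rate']) / 1e6:,.1f}M")
```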
2. The Social Cost-Benefit Analysis of Research Infrastructure: An Exploratory Framework,
Massimo Florio* (massimo.florio@unimi.it), University of Milan; Emanuela Sirtori, CSIL Centre for
Industrial Studies
Decision-makers considering fundamental research infrastructures, such as astronomical
observatories, nano-electronic laboratories, oceanographic vessels and particle accelerator facilities
(just to mention some examples), are faced with this question: what is the net social benefit of these
costly scientific ventures? The answer is often given qualitatively, or even rhetorically, by scientists
and other stake-holders in these projects. But can we go beyond anecdotal evidence, narratives and
ad hoc studies and try a structured ex-ante and ex-post social cost-benefit analysis (CBA) of
infrastructure for pure research? This paper explores some of the methodological issues involved
when valuing the costs and benefits of capital-intensive scientific projects. The paper has been
drafted in the context of the research project “Cost/Benefit Analysis in the Research Development
and Innovation Sector” sponsored by the European Investment Bank University Research
Sponsorship program. After a brief overview of the earlier literature on the evaluation of fundamental
research infrastructures, the authors propose a conceptual model and suggest empirical approaches
for estimating the quantities and shadow prices of cost aggregates and different categories of
economic benefits. Benefits associated with fundamental research include: the non-use value (i.e.,
existence and quasi-option values) of discoveries, the creation and diffusion of knowledge in the
form of papers authored by scientists, technological spillovers to the supply chain and other external
users, human capital formation, and cultural effects attributable to outreach activities. In discussing
possible approaches for assessing costs and benefits in this challenging field, attention is also given
to how to deal with uncertainty and risk affecting the analysis. Directions for future research are
sketched in the concluding section, along with some policy implications for national and supranational institutions and funding agencies.
3. New Trends in CBA of public investments in France, Emile Quinet* (quinet@enpc.fr), Paris
School of Economics-Ecole des ponts Paris Tech
Cost-benefit assessment of investments is an ongoing preoccupation for public authorities in France, as in many other countries. Long enshrined in legislation concerning transportation, this requirement has recently been extended by law to all public civil investments.
France has a long tradition in this field. On several occasions, commissions met to define and
improve evaluation procedures. Their findings were then converted into guidelines by the competent
authorities.
The report presented in this communication revises the recommendations of previous reports issued
around 10 years ago. The main trends are:
 Updating the monetary values, the result being an increase in the values of amenities.
 Assessing the wider effects, especially the effects on space, the increase in productivity due to agglomeration externalities, and market power.
 Taking into consideration systemic risk through procedures akin to financial methods (a risk-free discount rate plus a risk premium depending on the correlation between the project’s benefits and economic growth).
 Setting the rules for the ranking of projects.
 Taking into account long-term issues.
 Improving the governance of projects.
These points will be developed using the transport sector as the primary example, since economic calculations are most widely used in that sector, though the energy and health sectors will also be considered. An example of the implementation of these new procedures is presented for a large project, the “Grand Paris Express”, a ring metro around the Paris area that is now under scrutiny.
Session 7 - Friday, March 20, 2:00 - 3:30
 A.7: Retrospective BCA of Federal Rules (Marvin 309)
Chair: Dick Morgenstern (morgenst@rff.org), Resources for the Future
Presentations:
1. A Retrospective Assessment of EPA’s Air Toxics Rules, Art Fraas* (fraas@rff.org),
Resources for the Future
2. A Retrospective Assessment of Federal Efforts to Reduce Foodborne Illness: Shell Eggs
and Salmonella Enteritidis, Randall Lutter* (lutter@rff.org), Resources for the Future and
University of Virginia
3. A Retrospective Evaluation of the Cluster Rule: The Pulp and Paper Industry, Ron
Shadbegian* (Shadbegian.ron@epa.gov), U.S. Environmental Protection Agency; Wayne Gray,
Clark University
Discussant: Al McGartland (McGartland.al@epa.gov), U.S. Environmental Protection Agency
 B.7: Law and Economics Perspectives on Benefit-Cost Analysis
(Marvin 307)
Chair: Jack Knetsch (knetsch@sfu.ca), Simon Fraser University
Presentations:
1. Is Every Regulation Potentially Cost-Benefit Justified?: Methodological Pluralism and
the Estimation of Regulatory Benefits, Jason Scott Johnston* (jjohnston@virginia.edu), University
of Virginia
Recent work by Acs and Cameron (2013) reports that the requirement of regulatory impact analysis
(RIA) has had no statistically significant impact in lowering the rate at which regulations are
promulgated. As some such impact was clearly among the objectives of requiring RIA, this is a
phenomenon in need of explanation. This paper argues that the reason that RIA does not constrain
regulation is because there is so much discretion in the estimation of regulatory benefits. This
argument is made by first surveying recent estimates of regulatory benefits since 1990 prepared and
published by the Office of Information and Regulatory Affairs (OIRA). It reports, as have previous
studies, that the vast majority of quantified benefits for regulations promulgated since 2008 come
from the reduction in fine particulate air pollution. It also notes the increasingly widespread use of
contingent valuation in estimating regulatory benefits. The paper then critiques the methodological
basis for recent regulatory benefit estimations. It is argued that surveys, especially contingent
valuation surveys, are unlikely to generate reliable and valid measures of actual benefits. It is also
argued that epidemiological evidence alone is highly unreliable as a measure of actual health
impacts. This point is made with a detailed analysis of estimates of the impact of fine particulates on
excess mortality. The data show that particulates have their biggest impact on cardiovascular
mortality among the elderly in the winter months. However, the medical literature reveals a variety of
mechanisms that account for the heightened risk of death from cardiovascular causes among the
elderly during the winter, and these mechanisms do not involve exposure to elevated levels of fine
particulates. If researchers look statistically at only one particular factor – fine particulates – while
ignoring others, then estimates are subject to omitted variables bias. A better approach is to look
first to identify potential causal mechanisms so that all potential factors are controlled for in statistical
studies of the relationship between exposure levels and excess mortality rates.
2. Judicial Review of Agency Benefit-Cost Analysis, Caroline Cecot*
(caroline.cecot@vanderbilt.edu) and W. Kip Viscusi, Vanderbilt University
This Article evaluates judicial review of agency benefit-cost analysis (“BCA”) by examining a
substantial sample of thirty-eight judicial decisions on agency actions that implicate BCA.
Essentially, the Administrative Procedure Act tasks federal courts with ensuring that federal agency
action is reasonable. As more agencies use BCA to justify their rulemakings, the court’s duty often
requires judges to evaluate the reasonableness of agency BCAs. In this Article, we discuss the
challenges that trigger judicial review of agency BCAs and the standards that govern the review. We
then present specific examples of how courts analyze BCAs. Overall, we find many examples of
courts promoting high-quality and transparent BCA. Courts have been willing to question BCA
methodology and assumptions and request more transparency on these issues. As agencies rely
more on BCA in their decision making, judicial review of BCA will be increasingly important. The
stakes are high. Additional judicial oversight can be valuable—but bolstering any oversight effort to
provide a policy check can also impose societal costs if desirable policies are delayed or left
unimplemented. Ideally, efforts to foster greater judicial review should be structured so that the
enhanced role of the judiciary itself passes a benefit-cost test. Armed with this Article’s examination
of the state of judicial review of BCA, scholars can more effectively evaluate the impact of judicial
checkpoints on the use of BCA in agency decision making and assess whether shifting more
regulatory oversight authority to the courts would be an effective approach to fostering more welfare-enhancing policies.
3. Cost-Benefit Analysis, Distributional Weights, and Institutions, Matthew Adler*
(adler@law.duke.edu), Duke University
Distributional weights are a natural way to incorporate distributional considerations into cost-benefit
analysis (CBA). CBA with appropriate weights can mimic either a utilitarian social welfare function
(which is averse to inequality in income), or even a more egalitarian social welfare function.
However, important institutional objections have been raised against distributional weighting. First,
isn’t it better to handle distributional considerations through the tax system, with regulatory CBA
undertaken in its traditional unweighted form? Second, doesn’t the specification of distributional
weights involve contestable value choices? Is it normatively legitimate for unelected regulators to
make such choices? This presentation will briefly present the theory of weighting, and then address
the institutional questions.
4. Using Kaldor-Hicks Tableaus for Distributional Accounting in Regulatory Impact
Assessment, Kerry Krutilla* (krutilla@indiana.edu), Gabriel Piña, and Yu Zhang, Indiana University
The OMB recommends that agencies provide a separate description of the distributional effects of
regulations in their regulatory impact assessments (RIAs). However, a review of recent RIAs shows
that agencies rarely follow this analytical guidance. Our research assesses the feasibility of
improving the representation of distributional effects in RIAs using the Kaldor-Hicks tableau (KHT)
display format. In concept, a KHT disaggregates total benefits and costs and allocates them to
stakeholders, and also records between-stakeholder financial transfers. The format provides a
conceptually consistent display of distributional effects at a chosen level of stakeholder
representation, revealing the effects on “public stakeholders” as a fiscal impact assessment. To
operationalize this concept, five final RIAs completed from 2011-2013 are chosen for detailed
analysis, one from each of the DOT, EPA, DOL, HHS, and DHS. KHTs are constructed based on
information presented in the regulatory impact assessments themselves, and assumptions about the
tax status of the identified industrial sector subjected to the regulation. We show that it is feasible to
construct KHTs for regulatory impact assessments from the data that is usually collected to produce
them, and that this approach provides better insight about the distributional effects of regulations
than current practice. Moreover, revealing the fiscal impact of regulations is relevant for the
efficiency analysis, given the marginal value of public funds.
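As a schematic illustration of the display format (not one of the five RIAs analyzed), the following sketch builds a tiny Kaldor-Hicks tableau with invented entries, showing how transfers net to zero across stakeholders while benefits and costs determine the social total.

```python
# Minimal, invented Kaldor-Hicks tableau in millions of dollars.
import pandas as pd

kht = pd.DataFrame(
    {
        "benefits":  [40.0,   0.0,  0.0],   # monetized benefits to households
        "costs":     [ 0.0, -25.0, -3.0],   # compliance costs to industry, admin costs to government
        "transfers": [ 0.0,  -5.0,  5.0],   # e.g., fees paid by industry to government
    },
    index=["households", "regulated industry", "government"],
)
kht["net"] = kht.sum(axis=1)

assert abs(kht["transfers"].sum()) < 1e-9   # transfers cancel across stakeholders
print(kht)
print("Social net benefits:", kht["benefits"].sum() + kht["costs"].sum())  # 12.0
```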
 C.7: Preliminary Recommendations from the 2nd Panel on Cost-Effectiveness in Health and Medicine (Marvin 308)
Chair: Theodore Ganiats (tganiats@mac.com), UC San Diego
Panelists to Include:
Scott D. Grosse (sgrosse@cdc.gov), Centers for Disease Control and Prevention
James K. Hammitt (jkh@harvard.edu), Harvard University
 D.7: Cost-Effective Air Quality Strategies (Marvin 413-414)
Chair: Anne Smith (anne.smith@nera.com), NERA
Presentations:
1. A Critical Review of the Development and Use of Externality Costs for Air Quality,
Elisabeth Gilmore* (gilmore@umd.edu), University of Maryland
Externality costs are needed for regulatory and policy analysis. For exposure to adverse air quality,
an impact pathway approach is generally used to estimate the externality costs. This entails
converting the emissions to ambient concentrations, translating the concentrations to their equivalent
health and ecosystem effects, and applying willingness to pay estimates to avoid these outcomes.
Since this approach can be time consuming, estimates from the literature are frequently used
instead of conducting a full impact pathway approach. There is, however, limited guidance for
selecting literature values for a specific policy under consideration. Here, I conduct a critical review
of the available estimates for the externality costs from air quality. I then use these values to develop
a set of characteristics to consider when selecting literature values, namely physical features of the
sources, exposed population, and air quality chemistry and meteorology. For pollutants that undergo
significant chemical transformations after release, it can be challenging yet critical to account for
differences in the temporal profile of the emissions, concentrations of other pollutants and
meteorological conditions. For policies where this is likely to matter, it may be more appropriate to
conduct a bounding analysis of the magnitude of the air quality externality rather than selecting a
value from literature.
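As a purely illustrative sketch of the impact-pathway chain just described, the following uses invented coefficients (concentration change per ton, exposed population, concentration-response slope, and willingness to pay) to back out an externality cost per ton; none of these values comes from the review.

```python
# A stylized impact-pathway calculation; every coefficient below is an invented
# placeholder, not a value from the literature review.
tons_emitted = 1_000.0              # annual emissions of the pollutant
conc_per_ton = 1.0e-4               # ambient concentration increase per ton (ug/m3)
exposed_population = 5.0e6          # people exposed
mortality_risk_per_unit = 1.0e-5    # excess deaths per person-year per ug/m3
vsl = 9.0e6                         # willingness to pay per statistical life ($)

delta_concentration = tons_emitted * conc_per_ton
excess_deaths = delta_concentration * exposed_population * mortality_risk_per_unit
externality_cost_per_ton = excess_deaths * vsl / tons_emitted
print(f"Implied externality cost: ${externality_cost_per_ton:,.0f} per ton")
```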
2. A Mixed Integer Programming Model for National Ambient Air Quality Standards
(NAAQS) Attainment Strategy Analysis, Alexander Macpherson* (macpherson.alex@epa.gov),
Heather Simon, David Misenheimer, Charles Fulcher, Bryan Hubbell and Robin Langdon, U.S.
Environmental Protection Agency
The United States Environmental Protection Agency (EPA) is currently reviewing the National
Ambient Air Quality Standard (NAAQS) for ozone. States with areas designated as nonattainment
with the standards are required to develop State Implementation Plans (SIPs) to demonstrate how
pollution levels will be reduced to meet the standard. Historically, many states have developed SIPs
independently. However, for ozone, some states have at times recognized the important role of
regional transport (for example the Ozone Transport Commission which addresses ozone air quality
in the Northeast) and have developed regional agreements to control ozone-forming emissions.
These types of regional air quality management approaches have the potential for improved
pollution control efficiency if states collaboratively determine the least-cost controls within or across
regions. We present a Mixed Integer Programming model for devising least cost control strategies
that recognize the potential for interstate transport of ozone. While linear programming models have
been used to assess regional ozone control strategies, this model applies the framework nationally
to identify efficiencies from reducing regional transport. Air quality is characterized by a source-receptor matrix estimating the impact of regional emissions reductions on ozone concentrations at
monitors. Least cost control strategies are determined by decisions about using specific control
technologies on emissions sources. This tool allows user-defined policy constraints about which
ozone precursors and emissions locations to consider in identifying the least cost attainment
strategy. A case study is presented using information from a series of emissions sensitivity air quality
model simulations along with current emissions abatement supply information. The model holds
promise for evaluating alternative scenarios, testing the role of transport in compliance strategies,
and identifying monitors exerting disproportionate influence on attainment strategies. The case study
is a proof of concept but is limited by the specificity of the source-receptor matrix. As additional air
quality simulations are performed, more refined information about the response of ozone to
emissions reductions in specific locations will improve the accuracy of the model.
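A toy version of this kind of least-cost attainment problem can be written down with an off-the-shelf solver; the sketch below uses the open-source PuLP library, and the sources, control technologies, costs, emissions reductions, source-receptor coefficients, and required ozone reduction are all invented placeholders rather than inputs to the EPA model.

```python
# Minimal least-cost attainment sketch (binary control decisions), not the EPA model.
import pulp

sources = ["S1", "S2"]
controls = ["low_NOx_burner", "SCR"]
monitors = ["M1"]

cost = {("S1", "low_NOx_burner"): 2.0, ("S1", "SCR"): 5.0,
        ("S2", "low_NOx_burner"): 1.5, ("S2", "SCR"): 4.0}       # $ million per control
reduction = {("S1", "low_NOx_burner"): 100, ("S1", "SCR"): 300,
             ("S2", "low_NOx_burner"): 80, ("S2", "SCR"): 250}   # tons of NOx reduced
sr = {("S1", "M1"): 0.004, ("S2", "M1"): 0.006}                  # ppb ozone per ton reduced
needed = {"M1": 2.5}                                             # ppb reduction required at monitor

prob = pulp.LpProblem("least_cost_attainment", pulp.LpMinimize)
x = pulp.LpVariable.dicts("apply", (sources, controls), cat="Binary")

# Objective: minimize total control cost.
prob += pulp.lpSum(cost[s, c] * x[s][c] for s in sources for c in controls)

# At most one control technology per source.
for s in sources:
    prob += pulp.lpSum(x[s][c] for c in controls) <= 1

# Each monitor must achieve the required ozone reduction.
for m in monitors:
    prob += pulp.lpSum(sr[s, m] * reduction[s, c] * x[s][c]
                       for s in sources for c in controls) >= needed[m]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for s in sources:
    for c in controls:
        if x[s][c].value() == 1:
            print(f"Apply {c} at {s}")
print("Total cost ($ million):", pulp.value(prob.objective))
```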
3. Benefit-Cost Analysis of Phasing Out Coal in Power Plants and for Residential Use,
Henrik Andersson* (henrik.andersson@tse-fr.eu), Professor, Toulouse School of Economics
(LERNA); Yana Jin and Shiqiu Zhang, Peking University
To cope with the serious air pollution situation, the Chinese State Council issued the National Action
Plan on Air Pollution Prevention and Control. Coal consumption reduction was chosen as a key
strategy and three key regions were targeted to achieve negative growth of total coal consumption
by 2017. One of these regions was the Beijing-Tianjin-Hebei region (BTH). In this study we focus on Beijing and its proposed prioritization of coal reduction options, which has since been followed by many other local governments: substituting gas-fired power plants and large boilers for coal-fired ones is emphasized, while switching residential cooking and heating from coal to cleaner energy sources is given the least emphasis. Although this prioritization can achieve rapid coal reduction, the substitution of gas-fired plants has been remarkably costly, and the efficiency of the pollution reduction and health protection achieved by such interventions is under debate.
This study estimates the benefits and costs of interventions for phasing out coal in power plants and
among residential users in Beijing. Our study helps to evaluate the conventional claim that coal-fired power plants contribute more to ambient air-quality-related health damage, and takes both ambient and household air pollution from coal combustion into consideration. The results suggest that substituting for coal in the residential sector can yield net social benefits, while in the power plant sector it actually imposes net social costs. This analysis indicates that the current coal consumption reduction plan in Beijing, with its focus on coal-fired power plants rather than the residential sector, will realize limited health and environmental benefits and may induce large social costs in the long run. Given the current trend of scaling up coal reduction interventions in China, this study is timely for the reevaluation and adjustment of current policy.
4. Benefit-Cost Analysis of Efficient Environmental Damage Emission Pricing in the Power
Sector, Daniel Shawhan* (DLS77@cornell.edu), Resources for the Future; Biao Mao, Rensselaer
Polytechnic Institute; Ray Zimmerman, Charles Marquet, and Jubo Yan, Cornell University; Yujia
Zhu, Arizona State University
We estimate the benefits and costs of charging each commercial electricity generator in the US and
Canada for the estimated value of the premature mortality caused by its sulfur dioxide (SO2) and
nitrogen oxide (NOX) emissions and the climate changes caused by its carbon dioxide (CO2)
emissions. We use, and describe, several new methods useful in benefit-cost analysis of electricity
policies and transmission and generation investments. Our representation of the power grid retains
all of the thousands of high-voltage transmission line segments. The power flows in our model are
based on physics, which causes flows to be largely uncontrollable except by changing where the
power is generated. This is important because avoiding the overloading of each transmission
segment can affect how a policy or investment will change the pattern of generation, and hence the
costs and benefits. We have also developed a method of modeling price-responsive demand while
keeping the model linear for tractability. We use an air pollution fate-and-transport model in order to
estimate the mortality impact of each generator, which depends on its location and effective
smokestack height. We know the economic and environmental characteristics of each of the 19,000
generators from creating an unprecedented combination of twelve data sets. We simulate Pigouvian
pricing of just SO2 and NOX, just CO2, and both together. We calculate the consumer, producer,
environmental, government, and congestion surplus of each policy. The SO2 and NOX pricing has a
net benefit almost twice as large as that of the CO2 pricing and reduces CO2 by almost as much,
while increasing the average electricity price by less. Both kinds of pricing together reduce
estimated premature deaths by 12,000 per year and produce an estimated net benefit similar to the
current direct variable cost of electricity.
 E.7: Valuing Outcomes and Performing BCA for Social Policy
Intervention (Marvin 310)
Chair: Francisco Perez-Arce (fperezar@rand.org), RAND Corporation
Presentations:
1. Benefit-Cost Analysis of Communities That Care: Social Policy Implications in
Washington State, Margaret R. Kuklinski* (mrk63@uw.edu), University of Washington
Social policymaking increasingly considers the economic implications of various policy alternatives. In
Washington state, E2SHB 2536, passed in 2012, mandated that “prevention and intervention
services delivered to children and juveniles in the areas of mental health, child welfare, and juvenile
justice be primarily evidence-based and research-based.” In response, the Washington State
Institute for Public Policy (WSIPP) and the University of Washington Evidence-Based Practice
Institute, both independent bodies, were asked by the legislature to create an inventory of evidence-based practices and services that met criteria for receiving legislative funding and support. To
receive the most favorable “evidence-based” designation, programs needed to generate positive net
present value when subjected to benefit-cost analysis, among other criteria.
In this presentation, we examine the policy from the lens of one evidence-based program included in
the inventory, Communities That Care (CTC). CTC is a coalition-based prevention system shown in
a rigorous community-randomized trial to have sustained preventive effects through Grade 12 on
youth delinquency and substance use. CTC has also been subjected to a benefit-cost analysis with
WSIPP’s benefit-cost analysis software tool, also used to evaluate programs for inclusion in the
E2SHB inventory. The tool can monetize benefits across a number of policy and program areas,
incorporates Monte Carlo methods for capturing uncertainty, and produces several policy-relevant
outcomes, including net present value, costs and benefits to various stakeholders, investment risk,
and cash flows.
Using CTC as a case study, this presentation shows how substance use and delinquency outcomes
are monetized under the model. It also illustrates the sensitivity of benefit-cost conclusions to
alternative assumptions about costs, effect sizes underlying benefits estimates, and time frame for
estimating benefits. Results point to the need for principles and standards in economic analysis of
prevention programs, particularly when conclusions are used to inform policy decisions.
2. Investigations of the Links Between Early Non-cognitive Skills and Future Adult
Outcomes: Relevance for Economic Assessment of Programs for Children, Damon Jones*
(dej10@psu.edu) and Mark Greenberg, Pennsylvania State University
In recent years, much focus has been directed toward understanding the link between non-cognitive
traits in children and the likelihood of healthy personal development and eventual adult well-being. Such traits play an important role both independently of and in conjunction with cognitive
traits (such as IQ) in influencing long-term outcomes. From an economic perspective, non-cognitive
skills are worth studying given their potential role in influencing future labor force outcomes or
reducing the likelihood for future problems that require societal resources (crime, substance abuse,
etc.). An additional feature of non-cognitive traits is that they may be more malleable than cognitive
skills, and thus may be effective targets for prevention or intervention programming. A challenge lies
in effectively assessing children’s competencies at an early enough age when such efforts might be
introduced but also when such skills can be measured.
Our study investigated how well adult outcomes can be predicted from ratings of children’s social-emotional (SE) skills, one indicator of non-cognitive ability, measured many years earlier in
elementary school. We examined how early SE skills predict late adolescent and adult outcomes in
participants from lower SES neighborhoods in both urban and rural areas. We utilized data from
three intervention projects. Our analytic models assessed the link between indicators in early
elementary school and economically-relevant outcomes 13-19 years later. Models included a large
number of control variables enabling us to explore the unique determination of featured
predictors. Results indicate how early non-cognitive skills are uniquely predictive of adult outcomes
across multiple domains such as crime, employment and future need for public assistance. Such
data that are rich in coverage of early non-cognitive skills as well as adult outcomes many years
later can provide important information that may facilitate development of shadow prices for use in
economic assessment of programs for children.
3. Valuing Outcomes of Social Programs: The RAND Database of Shadow Prices for
Benefit-Cost Analysis of Social Programs, Lynn A. Karoly* (karoly@rand.org), RAND Corporation
In conducting benefit-cost analyses (BCAs) of social programs, the analyst requires estimates of the
economic value, or “shadow prices,” associated with the various short- and long-term outcomes that
may be affected by a particular program. Indeed, many important benefits that flow from well-designed and -implemented social programs are rarely, if ever, captured in monetary terms in the
associated BCAs, or such BCAs are not performed, in part because shadow prices (i.e., the
economic values) are not readily available to express the outcomes that the programs affect in
monetary terms. In other cases, BCAs are performed but are not comparable with BCAs in the same
or other areas of social policy because different shadow prices are used. The lack of valid shadow
price measures or the inconsistent use of such measures across BCAs currently constrains the set
of social programs for which benefit-cost studies are conducted and limits the comparability of those
BCAs that are prepared. This presentation will feature the Valuing Outcomes of Social Programs
(VOSP) database, a new resource developed by researchers at the RAND Corporation, which
serves as a centralized web-based repository of shadow prices for use by the research and policy
communities. The primary objective for the database is to reduce the “transaction costs” associated
with conducting BCAs of social programs by providing researchers with an objective, well-documented resource. A second goal is to support standardization in the shadow prices used by the
research community. The presentation will outline the range of shadow prices covered in the
database, describe the methods used to populate the database, and demonstrate the output from
the tool.
4. Strengthening Benefit-Cost Analyses of Behavioral Prevention: Report on Progress of
the Society for Prevention Research’s Task Force on Economic Analyses of Prevention, Max
Crowley* (maxcrowley@gmail.com), Duke University
Increasingly, behavioral researchers are recognizing the importance of understanding the costs and
benefits of their interventions. Yet, best practices for conducting benefit-cost analyses of programs
that intervene in psychosocial functioning remain limited. For instance, infrastructure for delivering
preventive services continues to be inadequate. As a result, cost analyses of these programs must
include essential, but hard to measure, capacity building resources (e.g., home visiting, substance
abuse prevention). Further, prevention programs that intervene early in life are known to accrue
benefits across decades. Such expanded time-horizons make direct measurement of benefits
difficult (e.g., Perry Preschool, Abecedarian). Guidance on such issues is needed to support future
benefit-cost evaluations. This presentation will provide an overview of efforts by the Society for
Prevention Research’s Task Force on Economic Analyses of Prevention to provide such
guidance. This includes outlining a working paper developed by the taskforce. In particular, this
work explores issues around how to integrate behavioral research with economic and public finance
methodologies. Efforts to find consensus around best practice for quantifying resources and valuing
benefits will be shared. The struggle to balance the need for consistent estimates with the need for a flexible methodology will be highlighted. Feedback will be solicited from the SBCA membership. A new NIH-supported research network studying the science of investing in healthy development and
employing best practices identified by the SPR task force will be introduced.
Session 8 - Friday, March 20, 3:45-5:15 p.m.
 A.8: Challenges and Opportunities for Economic Analysis of Risk
Regulations (Marvin 309)
Chair: Jennifer Baxter (jbaxter@indecon.com), Industrial Economics, Incorporated
Panelists to Include:
1.
Elizabeth Ashley (Elizabeth_M_Ashley@omb.eop.gov), U.S. Office of Management and Budget
2. Tony Cheesebrough (acheesebrough@gmail.com), National Protection and Programs
Directorate, U.S. Department of Homeland Security
3. Sandra Hoffmann (shoffmann@ers.usda.gov), Economic Research Service, U.S. Department
of Agriculture
4.
Amber Jessup (Amber.Jessup@HHS.GOV), U.S. Department of Health and Human Services
5.
Clark Nardinelli (clark.nardinelli@fda.hhs.gov), U.S. Food and Drug Administration
6.
Rosemarie Odom (rosemarie.a.odom@uscg.mil), U.S. Coast Guard
7.
Al McGartland (McGartland.al@epa.gov), U.S. Environmental Protection Agency
8.
Jack Wells (jackwells1@mac.com), U.S. Department of Transportation (retired)
Regulatory impact analysis (RIA) is an important tool for improving the quality of regulation and
motivates the development of methods that are useful in other risk policy contexts. However, U.S.
federal agencies face analytic challenges both common across the government and unique to their
agencies. Developing and applying better tools both improves the quality of regulatory analysis and
offers potential for improving agency decision making in other contexts. This panel brings together
regulatory and other economists from across the government to share the particular
analytic challenges and opportunities they observe in using these tools to promote evidence-based
decision making.
 B.8: Retrospective Review of Federal Regulations (Marvin 307)
Chair: Maeve Carey (MCAREY@crs.loc.gov), Congressional Research Service
Discussant: James Broughel (jbroughel@mercatus.gmu.edu), George Mason University
Presentations:
1. Learning from Experience: An Assessment of the Retrospective Review of Agency Rules
and the Evidence for Improving the Design and Implementation of Regulatory Policy, Joseph
E. Aldy* (joseph_aldy@hks.harvard.edu), Harvard University
A well-functioning regulatory program makes the American people better off by promoting
innovation; encouraging competition; protecting the air we breathe, water we drink, and food we eat;
and improving the safety of our workplaces and the goods we buy. Determining if society is realizing
the most out of its regulatory program requires rigorous analysis. Despite a long track record of
prospective analysis for proposed regulations, the Federal government has a mixed record on
retrospective review of existing regulations. Every administration dating back to the Carter
Administration in 1978 has implemented some form of regulatory look-back. The ad hoc nature of
the presidentially mandated reviews, the apparent need for every administration to implement such a
retrospective review, and the heterogeneity in approaches to retrospective review by agencies
suggest that efforts to enhance and institutionalize retrospective review are merited. This paper
evaluates the practice of the Obama Administration's retrospective review and places it in the
context of the academic literature and past administrations' efforts at retrospective review. In
particular, I analyze the processes and products of agency review plans, agencies' initial tranche of
identified rules for review, and every economically significant rule finalized in fiscal years 2012 and
2013 in my examination of retrospective review under Executive Orders 13563, 13579, and 13610.
This assessment identifies best practices among agencies, describes key lessons learned from the
ongoing and past retrospective review efforts, and makes recommendations for ways to improve
retrospective review.
2. Looking Back at Retrospective Review: How Did Agencies Measure Up in 2014? Sofie E.
Miller* (sofiemiller@gwu.edu), The George Washington University Regulatory Studies Center
Through a series of Executive Orders, President Obama has encouraged federal regulatory
agencies to review existing regulations “that may be outmoded, ineffective, insufficient, or
excessively burdensome, and to modify, streamline, expand, or repeal them in accordance with what
has been learned.” Evaluating whether the intended outcomes of regulations are met ex post can be
challenging, so multiple government guidelines instruct agencies to incorporate retrospective review
plans into their proposals during the rulemaking process. To support this effort, the George
Washington University Regulatory Studies Center examined significant regulations proposed in 2014
to assess whether they included plans for retrospective review, and provided recommendations for
how best to do so. This paper examines how often agencies successfully incorporate plans for ex
post analysis into their rules and provides agencies with five recommendations to facilitate
transparency, public accountability, and measurement of their rules’ success.
3. A Framework for Evaluating Regulatory Outcomes, Kathryn Newcomer*
(newcomer@gwu.edu), Susan Dudley, and Estelle Raimondo, George Washington University
Benefit-cost analysis is a key component of the regulatory impact analysis that agencies are
required to conduct before introducing new regulations. However, the U.S. government has
traditionally placed much less emphasis on ex-post evaluation of regulatory outcomes after they are
in effect. This emphasis on ex-ante rather than ex-post regulatory impact analysis differs from the
practice in other government policy tools, where evaluation of program outcomes is routine. Better
ex-post measurement and evaluation of regulatory benefits and costs is essential for evidence-based decision making, and could be helpful not only for gauging regulatory effectiveness, but also
for improving ex-ante predictions of regulatory impact. This paper will attempt to bring together the
techniques of program evaluation and regulatory benefit-cost analysis, which are each well-established in their own right but have not generally been used to inform each other. We start with
the premise that evaluating the impact of regulations is a process analogous to evaluating the impact
of public programs, which depends on proper modeling and valid and reliable measurement of
inputs, outputs and outcomes. We hope to develop a general framework for measuring regulatory
impacts and test it with a case study.
 C.8: State and Local Benefit-Cost Issues (Marvin 308)
Chair: Ronald Bird (rbird@uschamber.com), U.S. Chamber of Commerce
Discussant: Jerry Ellig (jellig@mercatus.gmu.edu), George Mason University
Presentations:
1. Evidence-Based Policymaking: Integrating Cost-Benefit Analysis into a Broader
Approach to Fund What Works in State and Local Government, Darcy White*
(dwhite@pewtrusts.org) and Torey Silloway* (tsilloway@pewtrusts.org), The Pew Charitable Trusts
Governments make policy and budget choices each year that have long-term impacts on their fiscal and
social outcomes. Currently, governments often make these decisions based on inertia, anecdotes,
and political and ideological factors, but they can achieve substantially better outcomes by using
rigorous evidence to guide their choices. This approach, called evidence-based policymaking, can
enable policymakers to strategically fund and operate their programs, and benefit-cost analysis plays
a key role in this process. To date, most states and local governments have made limited use of
benefit-cost analysis and evidence-based policymaking, and there has been no comprehensive
roadmap to help guide them on this approach. Since 2011, the Pew-MacArthur Results First
Initiative (Results First) has worked in a growing number of states to help them customize and use a
benefit-cost analysis model initially developed by the Washington State Institute for Public
Policy. Results First has also developed an integrated framework of steps that governments can
take to build and support a system of evidence-based policymaking. This framework, built on an
extensive review of research and discussions with public officials, practitioners, and academic
experts, includes five key components – program assessment, budget development,
implementation, oversight, and evaluation. The framework will be published as a Pew report in the
fall of 2014. This presentation will present the framework and highlight the role of benefit-cost
analysis in helping governments to think holistically about using research and benefit-cost analysis
to inform their budget and policy choices. It will emphasize that while identifying and investing in
cost-effective programs and practices is an essential step toward achieving strong outcomes for
citizens, it must be done in conjunction with implementation fidelity and outcome monitoring. It will
also discuss how Results First states have used benefit-cost analysis to drive evidence-based
policymaking in their states and the results they have seen to date.
2. Why Look Back? An Analysis of State Government Decisions to Analyze Existing
Regulations, Stuart Shapiro* (stuartsh@rutgers.edu), Debra Borie-Holtz and Ian Markey, Rutgers
University
Regulatory agencies have long been required to analyze the costs and benefits of their prospective
regulations. Yet once a regulation is in place, discussions of its effectiveness and its costs and
benefits are, if they are examined at all, the province of academic analyses. Rarely do agencies
revisit earlier regulatory decisions.
In recent years “retrospective review” has started to become more common. At the federal level,
President Obama’s Executive Order 13563 required agencies to conduct some reviews of existing
regulations. Among the 50 states, exactly half have also implemented retrospective review
requirements on regulatory agencies. For proponents of retrospective review (also known as “look-backs”), this trend is an encouraging sign that governments are beginning to examine the benefits
and costs of previous regulatory decisions, and hopefully revise policies that have not been
successful. However, good policy is only one possible motivation for this change.
We examine the recent wave of retrospective review in the 50 states in an attempt to determine the
motivation for states to pursue this policy reform. We have collected data on the use of retrospective
reviews in each state as well as political and economic variables that may have influenced the
decision by governors and legislatures to undertake a look-back at older regulations. Using survival
analysis, we model the decision to undergo retrospective review of regulations as a function of these
independent variables. The results of this analysis are then used to draw conclusions about the
likely fate of retrospective review as political and economic conditions change.
3. Ex-Ante Cost-Benefit Analyses of Community-based DRR Interventions in the Caribbean,
Meenakshi Jerath* (mjera001@fiu.edu) and Juan Pablo Sarmiento, Florida International University
The evidence for the effectiveness and impact of disaster risk reduction (DRR) interventions
is of concern to humanitarian organizations, donor agencies, and communities alike. The cost-benefit
analysis (CBA) of DRR projects can demonstrate the attractiveness of these interventions by
enumerating the benefits of lower costs to donor agencies and reduced damages to beneficiaries.
This paper presents an approach to analyze community-based DRR interventions through the
findings of CBA within the larger context of project management. The study aims to assist leaders
and practitioners in the humanitarian community in analyzing a DRR project within its institutional
and community setting, with a focus on capacity building of personnel for evidence-based decision-making. We conducted an ex ante CBA of several DRR interventions to improve climate change
resilience in Caribbean communities by estimating the costs incurred by the society and the benefits
accrued to the community in general. The currently accepted rate of each country’s central bank
was selected as the discount rate for the analysis. The effect of varying discount rates (3–10%) on the
benefit cost ratio (BCR) was tested. All the DRR interventions analyzed, including safer shelters,
water and health micro-projects, obtained a BCR above 1, justifying the implementation of all the
interventions. For interventions other than safer shelters, the BCR ranged from 2.6 to 215. The study
revealed several areas for policy consideration within humanitarian organizations: the need to
comply with standards of a rigorous CBA, capacity building in the areas of CBA and project
management, need for timely integration of economic analysis of DRR projects within the project
cycle, importance of data collection and record keeping, building informational sources, and use of
more simplified related forms of CBA such as cost effectiveness and least cost analysis.
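The discount-rate sensitivity test mentioned above can be illustrated for a single hypothetical micro-project; the upfront cost, annual avoided damages, and time horizon in the sketch below are invented rather than taken from the study.

```python
# Benefit-cost ratio of a hypothetical DRR micro-project under alternative discount rates.
def bcr(upfront_cost, annual_avoided_damages, rate, years=20):
    pv_benefits = sum(annual_avoided_damages / (1 + rate) ** t for t in range(1, years + 1))
    return pv_benefits / upfront_cost

for rate in (0.03, 0.05, 0.07, 0.10):
    print(f"discount rate {rate:.0%}: BCR = {bcr(50_000, 9_000, rate):.2f}")
```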
 D.8: Assessing Benefits in Consumer Protection Regulation (Marvin
302)
Chair: Lisa A. Robinson (Lisa.A.Robinson@comcast.net), Harvard Center for Risk Analysis
Panelists to Include:
1.
Howell Jackson (hjackson@law.harvard.edu), Harvard Law School
2.
Paul Rothstein (paul.rothstein@cfpb.gov), Consumer Financial Protection Bureau
3.
Michael Livermore (Livermore@exchange.law.nyu.edu), University of Virginia
4.
Art Fraas (fraas@rff.org), Resources for the Future
5.
Dick Morgenstern (morgenst@rff.org), Resources for the Future
In the United States, consumer protection regulation has not traditionally been the focus of extensive
benefit-cost analysis. But recent decisions of the Federal Court of Appeals for the District of
Columbia, new statutory mandates under the Dodd-Frank Act of 2010, and a flurry of law review
articles have brought the topic to the fore. Drawing on a new survey of seventy-two recent
consumer protection rulemakings, this panel will review current approaches to benefit analysis in
consumer finance and analogous areas of consumer protection. The presentation will offer an
overview of how federal agencies are currently analyzing and quantifying the benefits of consumer
protection regulation, including statistics about the incidence and intensity of benefit analysis across
the survey sample and selected subsamples. The presentation will also include a more in-depth
discussion of eighteen “exemplars” of benefit analysis across nine different benefit types. Together
these exemplars could be said to represent the current state of the art in benefit analysis for consumer
protection regulation. Finally, we hope to propose a handful of specific future research projects that
could improve the quality of benefit analysis in consumer protection regulation and explore
institutional arrangements to improve the quality of benefit analysis in federal agencies.
 E.8: The Effectiveness of Policies Involving Health Warning Labels
and Signage: Cigarettes, e-Cigarettes, and Alcohol (Marvin 310)
Chair: Trudy Ann Cameron (cameron@uoregon.edu), University of Oregon
Discussant: Joseph Cordes (cordes@gwu.edu), The George Washington University
Presentations:
1. Will New Warning Labels Encourage or Discourage the Use of Electronic Cigarettes?
Evidence from Experimental Markets, Don Kenkel* (dsk10@cornell.edu), Cornell University
Electronic cigarettes are battery-powered devices that deliver various concentrations of nicotine in
an aerosol mist. The use of e-cigarettes is rising rapidly and is expected to reach $3 billion in US
sales in 2015. Emerging research suggests that e-cigarettes are much less harmful than regular
cigarettes and hold promise as smoking cessation devices, although they are not approved as
medical devices. E-cigarettes are also not currently required to carry warning labels. However, the
2009 Family Smoking Prevention and Tobacco Control Act gave the FDA regulatory authority over
tobacco products. The FDA recently announced that it intends to use that authority to require e-cigarettes to carry a warning label stating that they contain nicotine, which is derived from tobacco and is an
addictive chemical. Also recently, some e-cigarette manufacturers have voluntarily adopted
somewhat stronger warning labels. Under another regulatory scenario, e-cigarette manufacturers
could apply to market their products as a modified risk tobacco product that could carry a more
favorable label. We conduct discrete choice experiments to estimate the impact of alternative
warning labels on adult smokers’ use of e-cigarettes. The data are from on-line surveys that present
smokers with choices between their current tobacco product, an e-cigarette, and a smoking
cessation product. The e-cigarette warning label is randomly assigned to vary across different choice
scenarios. The econometric discrete choice model yields estimates of the impact of the different
warning labels on the probability that smokers choose e-cigarettes. In addition, the model yields an
estimate of smokers’ willingness to pay for the different warning labels.
2. What are the Potential Benefits of Graphic Warning Labels? Richard M. Peck*
(rmpeck@uic.edu) and John A. Tauras, University of Illinois at Chicago
Huang et al. (2013) estimate that in the United States, graphic warning labels (GWL) would result in
5.3 to 8.6 million fewer smokers in 2013, reducing smoking prevalence by 12.6 percent to 20.4
percent. Using the framework of Murphy and Topel (2006), we find that the annual gross benefits of
GWL arising from lower levels of premature death would be about $32 billion to $52 billion
annually. This is higher than the FDA’s estimates of $225 million to $675 million annually; their
approach is similar but there are important methodological differences (page 36723 of Federal
Register, V. 76, No 120, 2011). Their estimates of the efficacy of GWL are also much lower. If we
use recently proposed adjustments for loss of consumer surplus, then the net benefits of GWL are
$10 to $16 billion annually, for premature death alone. An adjustment is also made to account for the fact that current smokers have lower-than-average incomes, so the value of a statistical life used to calibrate the model’s parameters is lower ($4.2 million). Initial prevalence rates vary by state, gender, and age, and the population numbers, from the Census Bureau, are for 2011. The model is linear, so the consequences of adjusting wages or the calibrating value of a statistical life are straightforward to compute.
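Since the model is linear, rescaling with an alternative calibration is a one-line computation; the sketch below uses the $4.2 million VSL and the low-end $32 billion figure reported above, while any alternative VSL plugged in is purely hypothetical.

```python
# Linear rescaling of annual gross benefits with the calibrating value of a statistical life.
baseline_vsl = 4.2e6       # VSL used to calibrate the model (from the abstract)
baseline_benefit = 32e9    # low-end annual gross benefit estimate (from the abstract)

def rescaled_benefit(new_vsl):
    """In a linear model, benefits scale proportionally with the VSL."""
    return baseline_benefit * new_vsl / baseline_vsl

print(f"${rescaled_benefit(6.0e6) / 1e9:.1f} billion")  # if a hypothetical $6.0M VSL were used
```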
3. The Effects of Posted Point-of-Sale Warnings on Alcohol Consumption during
Pregnancy and on Birth Outcomes, Gulcan Cil* (gcil@uoregon.edu), University of Oregon
In 23 states and in the District of Columbia, all alcohol retailers are required by law to post signs that
warn against the risks of drinking during pregnancy. This study examines the effects of these point-of-sale alcohol warning signs (AWS) on alcohol consumption during pregnancy and on birth
outcomes using the variation in the timing of AWS legislation across states. The contributions of this
study are two-fold. First, this research is an investigation of the effectiveness of AWS as a policy tool
in reducing prenatal alcohol use. Moreover, using a quasi-experimental setting that results from a
plausibly exogenous policy change that affects all pregnant women, this study aims to establish a
causal link between prenatal alcohol exposure and birth outcomes. Using the National Vital Statistics
data and a specification that accounts for state and year fixed effects, state-specific time trends, and
individual- and state-level covariates, I find a statistically significant reduction in the odds of drinking
during pregnancy in response to AWS laws. This finding is supported by the results obtained using
the Behavioral Risk Factor Surveillance System data in a model that compares the change in alcohol
consumption before and after AWS laws among pregnant women with that of non-pregnant women.
In the reduced form regressions for birth outcomes, I find that AWS laws are associated with
decreases in the odds of very low birth weight and very pre-term births.
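The fixed-effects specification described above can be illustrated on simulated data; in the sketch below the dataset, adoption pattern, and effect size are all invented, and state-specific trends and individual- and state-level covariates are omitted for brevity.

```python
# Synthetic illustration of a logit with state and year fixed effects; not the author's data or code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5_000
df = pd.DataFrame({
    "state": rng.choice(["WA", "OR", "CA", "TX"], n),
    "year": rng.choice(np.arange(1990, 2000), n),
})
# Hypothetical staggered adoption: two states adopt AWS laws in 1995.
df["aws_in_effect"] = (df["state"].isin(["WA", "CA"]) & (df["year"] >= 1995)).astype(int)
# Simulate drinking during pregnancy with a small negative AWS effect (invented).
p = 0.15 - 0.03 * df["aws_in_effect"]
df["drank_during_pregnancy"] = rng.binomial(1, p)

result = smf.logit(
    "drank_during_pregnancy ~ aws_in_effect + C(state) + C(year)", data=df
).fit(disp=0)
print(result.params["aws_in_effect"])  # expected to be negative (lower odds of drinking)
```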