PowerPoint - Oxford Uehiro Centre for Practical Ethics

Transcription

Faculty of Philosophy
Oxford
March 1st 2013
Peter Taylor
Jerry Ravetz
The Value of Uncertainty
• Perceived need to eliminate uncertainty
– Confusing science with removing uncertainty
– Delusional certitude
– Wilful blindness
• Recognition of uncertainty can have value
– Appreciation of possibilities
– Adaptation to circumstances
– Better decisions
The Ethics of Uncertainty
• Uncertainty mostly seen as undesirable, yet
– False certainties of “Useless Arithmetic”
– Consequences of authority metrics
• Is recognising uncertainty good or bad?
– Prevents or delays crucial action?
(overuse of the “precautionary principle” on the one
hand, or tobacco-style “manufacturing doubt” on the other)
– Confuses the public or causes loss of trust?
– Makes us feel uncomfortable?
– Surely someone must know the “truth”?
• What if the answer isn’t at the back of the
book?
The March of Science
• Error -> very bad
• Uncertainty -> bad
• Accuracy -> good
• Precision -> very good
but recall Aristotle on appropriate precision:
“It is the mark of an educated man to look for precision in each
class of things just so far as the nature of the subject admits”
Kelvin Science
“The task of science is to add another decimal point”
Diagram: the march of physical theories - Newtonian Physics, Thermodynamics, Statistical Mechanics, Relativity, General Relativity, Quantum Theory, Quantum Electrodynamics
Delusional Certitude
“Everything is vague to a degree you do not
realise till you have tried to make it precise.”
Bertrand Russell
Diagram: “Our first model looks like this” -> “We’d like this” -> “So we refine the model to show this” -> “But what if it’s really like this?”
Diagram Source: Charles E. Kay. 2010. The Art and Science of Counting Deer. Muley Crazy Magazine, March/April 2010, Vol 10(2):11-18
“It is hard to overstate the damage done in
the recent past by people who thought they
knew more about the world than they
really did.”
John Kay in “Obliquity” 2010
“Understanding the models, particularly their
limitations and sensitivity to assumptions, is the
new task we face.
“Many of the banking and financial institution
problems and failures of the past decade can be
directly tied to model failure or overly optimistic
judgements in the setting of assumptions or the
parameterization of a model.”
Tad Montross, 2010, Chairman and CEO of GenRe in “Model Mania”
“ ― Seek transparency and ease of interrogation of
any model, with clear expression of the
provenance of assumptions.
― Communicate the estimates with humility,
communicate the uncertainty with confidence.
― Fully acknowledge the role of judgement.”
D. J. Spiegelhalter and H. Riesch in “Don’t know, can’t know: embracing deeper
uncertainties when analysing risks” Phil. Trans. R. Soc. A (2011) 369, 4730–4750
Useless Arithmetic
Useless Arithmetic: Why Environmental Scientists Can't Predict the Future
Orrin Pilkey and Linda Pilkey-Jarvis (2009)
• Maximum Sustainable Yield
• Bruun Rule for beach erosion
• Open pit mine pollution
“Admit to uncertainties and complexities, yet in the
end ignore them and recommend the modeling
approach. It is as though admission of fatal flaws
somehow erases them.”
Earthquake
The models were all convenient, but all were wrong and had bad consequences.
Tools For Judgement
• Blobograms
• Decision Portraits
• Nomograms
Blobograms
Iron Fertilisation
green blobs for:
• Mt Pinatubo
• Sequestration
• LOHAFEX
• Redfield Ratio
• James Martin’s …
red blobs for:
• Blooms
• Deep Oxygen Depletion
• Ecosystem
Decision Portraits
Issues for a building site
Could encounter costly delays if the
planners rejected it
• green blob for increased profit
• red blob for the rejection risk
Using low-lying land runs the risk of
some houses being uninsurable (and
hence unsaleable) if there are severe
floods before completion
• green blob for extra houses
• red blob for the insurability risk.
Nomograms
Source: Wikipedia
Beyond crisp numbers
• The combination of these tools will enable
us to reason rigorously about uncertain
quantities
• In particular, the use of blobs with
nomograms would enable the
identification of models that are strictly
nonsensical: GIGO
• That is, where uncertainties in inputs must
be suppressed lest outputs become
indeterminate
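The GIGO check described above can be sketched as interval propagation: carry honest input ranges through a model and see whether the output interval is still narrow enough to support a decision. A minimal sketch, assuming a toy loss model and invented ranges (none of this is from the talk):

```python
# Sketch: propagate input uncertainty as intervals through a toy model.
# If the output interval spans the whole decision range, the model is
# GIGO: it only looks decisive when input uncertainty is suppressed.

def interval_mul(a, b):
    """Multiply two intervals (lo, hi)."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

# Hypothetical model: projected loss = exposure * hazard_rate
exposure = (80.0, 120.0)     # honest range, $m
hazard_rate = (0.001, 0.05)  # honest range: a 50x spread

loss = interval_mul(exposure, hazard_rate)
print(loss)  # spans from ~0.08 to ~6.0 $m: nearly two orders of magnitude

# A "crisp" run that suppresses the uncertainty hides this entirely:
crisp = 100.0 * 0.01
print(crisp)
```

The point is not the arithmetic but the contrast: the crisp number invites a decision the honest interval cannot support.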
Tools for risk management
• Second-order Probability
• Exploring Outcome Space
• The Logic of Failure
On the Quantitative Definition of Risk
Kaplan and Garrick, Risk Analysis Vol 1 No 1 1981
Klibanoff, Marinacci, Mukerji, Econometrica, Vol. 73, No. 6 (November 2005), 1849–1892
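Kaplan and Garrick's quantitative definition, cited above, represents risk as a set of triplets (scenario, likelihood, consequence) rather than one collapsed number. A minimal sketch of the idea; the scenarios and figures are invented for illustration:

```python
# Risk as a set of triplets <scenario, probability, consequence>,
# after Kaplan & Garrick (1981), rather than one collapsed number.

risk = [
    # (scenario,           annual probability, loss $m)  -- illustrative
    ("minor storm damage", 0.20,               5.0),
    ("major flood",        0.01,               150.0),
    ("levee failure",      0.001,              2000.0),
]

# Expected loss collapses the triplets to a single number...
expected_loss = sum(p * x for _, p, x in risk)
print(f"expected annual loss: ${expected_loss:.1f}m")

# ...while the triplet view keeps rare, severe scenarios visible:
for name, p, x in risk:
    print(f"{name}: 1-in-{1 / p:.0f} years, ${x:.0f}m")
```

Two portfolios with the same expected loss can have very different triplet sets, which is exactly what the single number hides.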
To understand the next slides
• Underwriting decisions are made nowadays
on the basis of an “EP curve”, from which
three statistics are typically used:
– Mean (usually small)
– Standard Deviation
– 1 in 200 year VaR (usually huge)
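The three EP-curve statistics can be sketched from a set of simulated annual losses. A minimal sketch; the heavy-tailed loss distribution and all figures are invented, not the talk's data:

```python
import random

random.seed(1)

# Simulated annual losses ($m) from a toy catastrophe model --
# purely illustrative numbers.
years = 10_000
losses = [random.paretovariate(2.5) - 1 for _ in range(years)]

mean = sum(losses) / years
var = sum((x - mean) ** 2 for x in losses) / (years - 1)
std = var ** 0.5

# 1-in-200-year VaR: the loss exceeded in only 0.5% of simulated years.
ordered = sorted(losses)
var_200 = ordered[int(years * (1 - 1 / 200))]

print(f"mean: {mean:.2f}  std: {std:.2f}  1-in-200 VaR: {var_200:.2f}")
```

With a heavy tail the mean comes out small and the 1-in-200-year VaR many times larger, which is the "usually small" / "usually huge" contrast on the slide.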
Underlying uncertainty
Chart: the single blue line in this chart of the chance of annual loss is the loss distribution, plotted out to a 1000-yr return period. Occupancy - Industrial; L.A., 1991, 2-story, Level 0 for all subtypes; 50 simulations, with a 50,000-yr walkthrough each. Construction mix: TU 40%, BSW_C 20%, BSW_M 20%, W_E 10%, CBF 7%, LS 3%.
Probabilities of Probabilities
Source: AIR
Source: Managing Catastrophe Model Uncertainty - Guy Carpenter 2011
Chart: loss ($m, from 150 to 400) against return period (10 to 1,000 yrs); Full Uncertainty Mean = $251m and Mean of EP Curves = $251m. Inset: histogram of losses at the 250 Yr RP ($226m to $283m, frequencies 0% to 25%), showing a 2% chance of $283m.
Source: Oasis project
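The "probabilities of probabilities" picture can be sketched as second-order simulation: draw uncertain model parameters many times, build one EP curve per draw, and compare the spread across curves with their mean. All distributions and numbers below are invented for illustration:

```python
import random

random.seed(7)

N_CURVES, N_YEARS = 50, 2000

def ep_curve(severity_scale):
    """One simulated set of annual losses under one parameter draw."""
    return sorted(random.expovariate(1 / severity_scale) for _ in range(N_YEARS))

# Second-order uncertainty: the severity scale itself is uncertain.
curves = [ep_curve(random.uniform(80, 120)) for _ in range(N_CURVES)]

# Averaging the per-curve means gives a single well-behaved mean...
mean_of_curves = sum(sum(c) / N_YEARS for c in curves) / N_CURVES

# ...but a given return-period loss still varies widely across curves:
rp250 = [c[int(N_YEARS * (1 - 1 / 250))] for c in curves]
print(f"mean: {mean_of_curves:.0f}")
print(f"250-yr loss ranges from {min(rp250):.0f} to {max(rp250):.0f} across curves")
```

This is the slide's point in miniature: the two "means" can agree while the tail statistic a decision hangs on remains a distribution, not a number.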
Exploring Outcome Space
Source: Dawn of War
Exploring Outcome Space
Characteristic Events
• NE Wind
• New Orleans Hurricane
Karen Clark & Company “RiskInsight Open”
Exploring Outcome Space
Model Generated PML
Karen Clark & Company “RiskInsight Open”
The Logic of Failure
Source: Dietrich Dörner
The Logic of Failure
• Dörner is a cognitive psychologist
interested in human decision-making
• He runs participants through computer
simulation models of various complex settings
Losers:
• Acted without prior analysis
• Didn’t test against evidence
• Assumed absence of negative
meant correct decisions made
• Blind to emerging circumstances
• Focused on the local not the
global
• Avoided uncertainty
‘good participants differed from the
bad ones … in how often they tested
their hypotheses. The bad
participants failed to do this. For
them, to propose a hypothesis was
to understand reality; testing that
hypothesis was unnecessary.
Instead of generating hypotheses,
they generated “truths” ’
“ … we do not feel it is generally appropriate to
respond to limitations in formal analysis by
increasing the complexity of the modelling. Instead,
we feel a need to have an external perspective on
the adequacy and robustness of the whole
modelling endeavour, rather than relying on within-model calculations of uncertainties, which are
inevitably contingent on yet more assumptions that
may turn out to be misguided. ”
D. J. Spiegelhalter and H. Riesch in “Don’t know, can’t know: embracing deeper
uncertainties when analysing risks” Phil. Trans. R. Soc. A (2011) 369, 4730–4750
Implications - Business
• Operational -> Greater spread of price
• Management -> Judgement on portfolio
• Regulators -> Ask different questions
Implications - General
Whilst uncertainty is not to be glorified
• We should not disguise our ignorance with
delusionally certain models
• We can take advantage of the greater
scope uncertainty offers
• Tools are needed to support judgement
peter.taylor@philosophy.ox.ac.uk
jerome.ravetz@gmail.com