Deep Learning: Principles and Use

Transcription

Nº 94
March 2016

Contents

06 - Deep Learning: Principles and Use
Rafael S. Calsaverini

16 - Challenges of the Companies in View of the Anticorruption Law
Franklin Mendes Thame

23 - Increased Capital Allocation to Operational Risk and its Implications for Local Financial Institutions
Marcelo Petroni Caldas | Oswaldo Pelaes Filho | Frederico Turolla
06
Deep Learning: Principles and Use
Rafael S. Calsaverini
This review article presents the concept of Deep Learning and some of its recent results. Deep Learning algorithms allow learning vector representations, at different levels of abstraction, of data whose nature is difficult to deal with mathematically (images, text, audio, etc.). In addition to a short review of the Deep Learning literature, the article presents the essential theoretical principles and a view on the use of this technology in the research areas of the DataLabs at Serasa Experian.
16
Challenges of the Companies in
View of the Anticorruption Law
Franklin Mendes Thame
The Brazilian Anticorruption Law established the strict civil and administrative liability of companies for harmful acts against the public administration, filling an important gap in Brazil's legal order for punishing corrupters. Companies are thus called to reflect and act so as to protect themselves from punishment for wrongdoing involving the public sector. Law 12.846/2013 aims to fight impunity in business with the public sector, especially in economic segments more susceptible to corrupt practices.
23
Increased Capital Allocation to
Operational Risk and its Implications
for Local Financial Institutions
Marcelo Petroni Caldas
Oswaldo Pelaes Filho
Frederico Turolla
This article explores the impacts of the changes in capital allocation to operational risk currently being studied by the Basel Committee on local (i.e., Brazilian) banks. Our studies show a material impact, as the change may lead to allocations three times greater than current levels. This significant impact reaches the macroeconomy through a lower credit supply, with consequences for business firms and households. The new capital burden must therefore be aligned with the reality of the economic cycle at hand; otherwise, a pari passu implementation might be proposed to blunt the relevant impacts that the new rule may have on the local and regional economy.
From the Editors
The first issue of 2016 contains three up-to-date articles that aim to inform, clarify and guide experts in the economic and financial areas on new techniques and their use. They address the Anticorruption Law, capital allocation to cover operational risk, and Deep Learning.
Our cover article highlights the work of Data Scientist Rafael Calsaverini on Deep Learning, noting that technological improvements have enabled a growing capacity for storing and processing data, resulting in improved businesses and the appearance of new opportunities. Alongside this new computational capacity, new algorithms and modeling techniques have appeared that can recognize ever more complex patterns. Those algorithms and techniques derive from research in several areas – Computational Statistics, Computer Science, Optimization, Control Engineering, among others – that have been united in a field called Machine Learning. Machine Learning is thus the field that studies the development of computational algorithms that can learn from a set of data and then make predictions. Those algorithms are embedded in computational applications that take decisions about their environment through patterns learned from the data, instead of predetermined rules in a static program. Calsaverini clarifies that there is a fine and blurry dividing line between Machine Learning and Computational Statistics, and that there is a trend to group these two fields, along with a few other techniques from Computer Science, into a new area called Data Science. One approach would be to say that Computational Statistics is the area concerned with obtaining efficient algorithms for models that are mainly statistical in nature, while Machine Learning also draws on models based on optimization (such as Compressed Sensing, Support Vector Machines, Neural Networks, etc.) or models that are algorithmic in nature (such as Decision Trees); however, this distinction becomes ever finer as algorithmic and optimization models acquire statistical interpretations and typical computational models are developed with a clear statistical intuition.
Franklin Mendes Thame presents a timely work on Anticorruption Law No. 12.846/2013, whose objective is to fight impunity in business with the public sector whenever corrupt practices are evidenced. The author reports that Brazil loses the frightening amount of R$ 50 billion to R$ 80 billion to corruption every year. Corruption has become the country's main problem, since it destroys confidence among economic agents, impedes investments, destabilizes the economy, reduces tax collection and withdraws fundamental rights from all Brazilians. “When the Anticorruption Law establishes the strict liability, within the civil and administrative spheres, of companies that practice harmful acts against the public administration, it fills a relevant legal gap in Brazil and punishes the corrupters. Thus, companies are called to reflect and to protect themselves from punishment for wrongdoing before the public sector”, points out Thame. In Thame's view, the Anticorruption Law will induce organizations toward a new way of doing business with the public sector, requiring the adoption of good corporate governance practices: transparency at all hierarchical levels, strict legal compliance, segregation of duties, outsourcing policies, a program for defining specific risks, and internal controls and audits. Anticorruption actions and strategies in companies should reach all hierarchical levels and all administrative areas. Actions to face corruption are based on ethics, fair competition, efficiency and merit, and are part of the best corporate governance practices, such as integrated risk management, a compliance structure, human capital development and the administration of suppliers and partners. The anticorruption policy should be part of a company's strategic policy, aiming to preserve its highest value: its reputation.
Marcelo Petroni Caldas, Oswaldo Pelaes Filho and Frederico Turolla address the increase in capital allocation to cover operational risk and its implications for financial institutions. Based on the Basel Committee's studies, they analyze the impacts of the changes in capital allocation on Brazilian financial institutions. The studies show that this extremely significant impact will affect the macroeconomy, since it will reduce the credit supply. The authors indicate the need to align the new capital burden with the current economic cycle, or to adopt a pari passu implementation, so as to reduce the relevant impacts that the new rule may cause to the local and regional economy. They believe that the next steps toward an accurate view of the impacts, which will also include small and medium-size financial institutions, should be to repeat the study once the rule is actually validated and published. The authors recall that the 2008 crisis resulted in a worldwide movement towards a new form of regulation of Financial Institutions (FIs), in view of the concern with their solvency; however, operational risk was little addressed in that context of global changes, since the focus was on improving the quality of capital and deepening the requirements for liquidity risk. Studies and reflections by the Basel Committee resulted in a new proposal for capital allocation to cover operational risk, making it more risk-sensitive insofar as it considers new aspects in calculating the capital requirement, such as volumes and business transacted, among other items. The studies indicate a trend toward a considerable impact, and the banking industry has been acting to present the specific characteristics of the process to regulators and to the Committee. The theme is recent and the literature is scarce from the International Business perspective; addressing it contributes to understanding the concepts of Financial Institution internationalization, home-host regulation, psychic distance, risk management and internal controls of the financial system.
CREDIT TECHNOLOGY
YEAR XIII - Nº 94 - ISSN 2177-6032
Published quarterly by Serasa Experian

President - Brazil
José Luiz Rossi

Business Units Presidents/Superintendents
Mariana Pinheiro, Steven Wagner and Vander Nagata

Directors
Amador Rodriguez, Guilherme Cavalieri, Lisias Lauretti,
Manzar Feres, Paulo Melo, Sergio Fernandes and Valdemir Bertolo

Responsible Editor
Rosina I. M. D'Angina (MTb 8251) - rdangina@gmail.com

Cover, Desktop Publishing and Illustration
Gerson Lezak

Translation
Allan Hastings and Exman Fucks Traduções

Correspondence
Serasa Experian - Comunicação & Branding
Alameda dos Quinimuras, 187 - CEP 04068-900 - São Paulo - SP
www.serasaexperian.com.br
The concepts expressed in the signed articles are the responsibility of the authors and do not necessarily reflect the point of view of Serasa Experian or the Editorial Council. Total or partial reproduction of the articles published herein is strictly forbidden.
Deep Learning:
Introduction and Review
Rafael S. Calsaverini
Summary
Recent developments in the use of neural networks and deep models in Machine Learning have led to a quick succession of results surpassing the state of the art in several modeling challenges in different areas, such as object recognition, handwritten character recognition, face recognition and sentiment analysis, among others. These methods have been quickly adopted to solve tasks involving highly complex datasets, where the numerical representation required by traditional algorithms is itself a challenge. Deep Learning algorithms allow learning vector representations, at different levels of abstraction, of data whose nature is difficult to deal with mathematically (images, text, audio, etc.). This review article presents the concept of Deep Learning (LeCun, Bengio, and Hinton 2015) and some of its recent results.
1. Introduction
The boom in the quantity of data in recent decades, combined with technological improvements enabling a growing capacity for storing and processing such data, allowed the appearance of new businesses and the transformation of existing ones through the use of those data for several purposes. Along with this abundance of data and computational resources, new algorithms and modeling techniques appeared that can recognize ever more complex patterns. Those algorithms and techniques derive from research in several areas – Computational Statistics, Computer Science, Optimization, Control Engineering, among others – that have been united in a field called Machine Learning. Machine Learning is the field that studies the development of computational algorithms that can learn from a set of data and then make predictions. Those algorithms are embedded in computational applications that take decisions about their environment through patterns learned from the data, instead of predetermined rules in a static program.
There is a fine and blurry dividing line between Machine Learning and Computational Statistics, and there is a trend to group both fields, along with a few other techniques from Computer Science, into a new area called Data Science (Michael I. Jordan 2014). One approach would be to say that Computational Statistics is the area concerned with obtaining efficient algorithms for models that are mainly statistical in nature, while Machine Learning also draws on models based on optimization (such as Compressed Sensing, Support Vector Machines, Neural Networks, etc.) or algorithmic models (such as Decision Trees); however, this distinction becomes ever finer as algorithmic and optimization models acquire statistical interpretations and typical computational models are developed with a clear statistical intuition (MacKay 1992; Michael I. Jordan 1994; G. E. Hinton, Osindero, and Teh 2006; Mezard and Montanari 2009; Koller and Friedman 2009; Neal 2012; Kingma et al. 2014; Gal and Ghahramani 2015; Gal 2015).
A particularly interesting topic in Machine Learning since the mid-2000s has been the use of Neural Networks, as well as other algorithms suitable to be represented as stacked layers, in an approach that has been denominated Deep Learning. This type of algorithm has established state-of-the-art performance records in several types of artificial intelligence tasks dealing with large sets of complex data, and has produced useful techniques and results in an extensive range of problems – from the recognition of handwritten characters to the analysis of particle accelerator data (Baldi, Sadowski, and Whiteson 2014), drug discovery and the analysis of biomolecules (Unterthiner et al. 2015). With this article we intend to present a short review of the Deep Learning literature, its essential theoretical principles and a view on the use of this technology in the research areas of the DataLabs at Serasa Experian.
2. Machine Learning and Deep Learning
Machine Learning algorithms are typically trained; that is, their operational parameters are adjusted on a specific dataset to carry out specific tasks. In some tasks the purpose is, given a series of examples, i.e., known pairs of features and targets, to obtain an algorithm that estimates new targets for a new population of features. Such tasks are known as Supervised Learning; typical examples are regression and classification tasks. In other kinds of tasks there is no clear target to be predicted, but a pattern recognition task to be carried out. Such tasks are called Unsupervised Learning. Typical examples are the clustering of the points of a dataset into natural groups, the discovery of a hierarchical tree of clusters (hierarchical clustering), or modeling the distribution of a dataset.
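To make the distinction concrete, consider a minimal sketch (ours, not from the original article) in Python using the scikit-learn library on synthetic data: the supervised model is fit on known feature/target pairs, while the clustering algorithm receives no targets at all.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))            # features
    y = (X[:, 0] + X[:, 1] > 0).astype(int)  # known targets (supervised case only)

    # Supervised learning: fit on known (feature, target) pairs,
    # then estimate targets for a new population of features.
    clf = LogisticRegression().fit(X, y)
    print(clf.predict(rng.normal(size=(5, 2))))

    # Unsupervised learning: no targets; discover natural groups in the data.
    km = KMeans(n_clusters=2, n_init=10).fit(X)
    print(km.labels_[:10])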
The traditional Machine Learning approach to a new dataset is feature engineering and extraction: specialists and modelers must initially design and extract from the raw data, manually or semi-manually, the features¹ expected to be good predictors of the target variable or of the patterns that the algorithm should learn.

These features can be simple combinations of the raw data – for example, polynomial compositions of the independent variables – or the product of expensive processes, such as automatically extracting from a large text corpus the expressions that seem to indicate entity names (companies, persons, countries, etc.). After a feature vector is designed and extracted, it is fed into generic algorithms capable of performing the designated task, such as random forests or support vector machines, among others.
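A hedged sketch of this traditional pipeline (the synthetic dataset and the degree-2 polynomial choice below are illustrative assumptions, not from the article): the modeler manually decides on feature combinations and only then hands the engineered feature vector to a generic algorithm.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.preprocessing import PolynomialFeatures

    rng = np.random.default_rng(1)
    X_raw = rng.normal(size=(300, 2))
    y = (X_raw[:, 0] ** 2 + X_raw[:, 1] ** 2 < 1).astype(int)  # non-linear target

    # Manual step: the modeler decides that degree-2 polynomial combinations
    # of the raw variables should be good predictors of the target.
    X_feat = PolynomialFeatures(degree=2, include_bias=False).fit_transform(X_raw)

    # The engineered feature vector is then fed into a generic algorithm.
    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_feat, y)
    print(clf.score(X_feat, y))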
Typically these tasks are difficult and expensive, and it can take years to discover an extremely specialized set of good features for specific tasks, such as SIFT (Scale-Invariant Feature Transform) features (Lowe 1999) or Viola-Jones Haar-like descriptors (Viola and Jones 2004) for object recognition in Computer Vision. Since the mid-2000s (LeCun, Bengio, and Hinton 2015), research groups have obtained very important results using a different approach: Feature Learning, or Representation Learning.
The fundamental principle of this approach consists in replacing the long feature engineering and extraction effort with algorithms capable of learning features from raw data while simultaneously learning to solve the task using those features. For example, in a handwritten character recognition task, the algorithm should learn which compositions of the original bytes of the image are important for identifying the character during the same process in which it learns to predict the character. Ideally, there should be no difference between the two processes: in both, the parameters of a model are being adjusted to obtain the desired result (minimizing a cost function, for example), so they should be part of the same process. At the end of the process the algorithm has learned not only to solve a task related to those data but also a representation of the data as a feature vector, which can be useful for multiple tasks.
Typically, the adequate features for the execution of a complex task are highly non-linear transformations of the raw data. Modeling highly non-linear functions is a difficult task; however, it is feasible using deep models. A deep model is a model in which several small model instances – called layers – are stacked so that the output of one layer becomes the input of the next. Each of these layers learns a simple non-linear map of the previous layer's output. By stacking layers that learn non-linear maps, the model composes ever more complex concepts at each new layer, eventually producing highly complex maps of the input variables.
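A minimal sketch of this idea in Python/NumPy (our illustration; real deep models are trained, whereas the weights here are random): each layer is a linear map followed by a simple non-linearity, and stacking the layers composes ever more complex non-linear maps of the input.

    import numpy as np

    def relu(z):
        # a common non-linear activation
        return np.maximum(0.0, z)

    def forward(x, layers):
        # Each layer applies a linear map followed by a simple non-linearity
        # to the previous layer's output; stacking them composes ever more
        # complex non-linear maps of the original input.
        h = x
        for W, b in layers:
            h = relu(W @ h + b)
        return h

    rng = np.random.default_rng(0)
    sizes = [10, 8, 6, 4]  # input dimension followed by three stacked layers
    layers = [(rng.normal(size=(m, n)), np.zeros(m)) for n, m in zip(sizes, sizes[1:])]
    print(forward(rng.normal(size=10), layers))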
The typical example of this capability of deep models is the detection of handwritten characters. In a model with several layers trained to identify which character is represented in an image, the first layer takes the image of a handwritten character as input and learns, for example, to detect horizontal lines, vertical lines and other simple geometric primitives. The second layer, using the output signal of the first layer as input, composes such primitives into more complex signals: oblique lines, angles and soft curves. The third layer composes those signals into more sophisticated detectors capable of identifying circles and sharp curves and, finally, the last layer composes these more sophisticated geometries into complete character-shape detectors. This process of constructing ever more complex and non-linear signals in the deeper layers of the model is what gives Deep Learning models the ability to learn highly non-linear maps and solve complex tasks: the model learns, from the rawest data, which features are most predictive of the target, without the need for manual feature engineering.
3. Neural Networks
The main class of models used in deep architectures is artificial neural networks. An artificial neural network is a set of small units, typically called neurons or perceptrons in analogy with biological systems. Each unit receives inputs and produces a single output, which consists of a linear transformation of the inputs followed by the application of a non-linear map called the activation function. Units can be grouped in layers in a feed-forward network and work as a deep model.
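In code, a single unit reduces to one line of arithmetic. The NumPy sketch below is a generic textbook illustration, not from the article; the input and weight values are arbitrary, and the logistic sigmoid is used as the activation function.

    import numpy as np

    def unit(x, w, b):
        # One "neuron": a linear transformation of the inputs followed by a
        # non-linear activation function (here, the logistic sigmoid).
        return 1.0 / (1.0 + np.exp(-(np.dot(w, x) + b)))

    x = np.array([0.5, -1.2, 3.0])   # inputs arriving at the unit
    w = np.array([0.1, 0.4, -0.2])   # weights learned during training
    print(unit(x, w, b=0.05))        # single output in (0, 1)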
Neural networks are not recent algorithms. Between their early inspiration in the study of biological structures in the 1940s (McCulloch and Pitts 1943; Hebb 2005), the first supervised learning models such as the Perceptron (Rosenblatt 1957), and the creation of the training algorithm that allows training deep networks, the Backpropagation algorithm (Rumelhart, Hinton, and Williams 1986; Parker 1982; Werbos 1974), decades of development have passed; in spite of relative success in some areas, there was a dormant period in research in this area at the end of the 1990s and the beginning of the 2000s. Neural networks present many theoretical and engineering challenges, and their application was more expensive than that of other algorithms which enjoyed more success at the time, such as Support Vector Machines and Random Forests.
However, by the mid-2000s (LeCun, Bengio, and Hinton 2015) there was a conjunction of conditions that allowed the rebirth of neural networks as algorithms capable of reaching and advancing the state of the art in several modeling tasks. Besides a series of discoveries of new techniques and algorithms for training neural networks, the main new resources enabling this rebirth were the availability of much bigger training datasets and progress in specialized computational hardware for vector processing, capable of accelerating by several orders of magnitude the time needed to train a neural network. Today it is possible, on an average personal computer with a GPU (graphics processing unit) of the kind typically used for computer graphics, to train, in minutes, a neural network capable of recognizing handwritten digits with accuracy higher than human operators (>99%). This enabled a rebirth in the use of neural networks in different types of modeling tasks, starting in the fields of Computer Vision and Voice Recognition but soon spreading to other areas of science and engineering. Today a large share of the state-of-the-art results in several tasks are achieved by neural-network-based systems.
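The article's >99% figure refers to GPU-trained networks on large digit benchmarks. As a self-contained stand-in that runs in seconds on a CPU (our sketch; the layer sizes are assumptions and the accuracy figure is indicative only), one can train a small multilayer network on scikit-learn's bundled 8x8 digits dataset:

    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    # Bundled 8x8 digit images: no download and no GPU required.
    X, y = load_digits(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X / 16.0, y, random_state=0)

    net = MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=300, random_state=0)
    net.fit(X_tr, y_tr)
    print(net.score(X_te, y_te))  # typically around 0.97 on this small benchmark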
4. Unsupervised Learning
Neural networks are naturally oriented to supervised tasks, in which there is a dataset with well-defined inputs and outputs. However, research on unsupervised techniques using deep neural networks presents a promising horizon of algorithms that can be used in the analysis of unstructured data, particularly for learning features to analyze text, sequential data, time series, images and other types of data that lack a clear tabular structure or for which the engineering and extraction of features for traditional algorithms is challenging.

The typical training algorithms for neural networks, however, are dedicated to supervised learning, and the solution of unsupervised problems using neural networks is still a young, active and promising research area. Two strategies are emerging in this direction:

(1) creating auxiliary supervised tasks which, when solved, make the neural network learn vector representations of the data of interest, thus generating inputs for other types of algorithms (such as clustering); and

(2) creating generative models which try to learn the probability distribution of the data and generate, in the process, knowledge about it – for example, latent factor vectors, classifications and clusterings.
Both strategies are based on the concept of representation learning: each layer of a deep neural network learns how to detect or represent ever more abstract concepts about the original dataset. Eventually, the intermediate representations created by the neural network from the data, in order to solve a specific task, contain summarized information about the data itself, which can be useful in an exploratory investigation or even in other related tasks (Yosinski et al. 2014).
An example of a neural network that solves an auxiliary supervised task to learn unsupervised representations of a dataset of interest is the Autoencoder (Bengio 2009). An autoencoder is a neural network composed of two sub-networks: an encoder, whose task is to generate a low-dimensional representation of a complex object, such as an image or a text; and a decoder, whose function is to recover the original object from the representation generated by the encoder. The objective of training an autoencoder is, by forcing the compression of information into a low-dimensional representation, to obtain a feature vector that represents an image or a text, for example, summarizing the information contained in the complex object and allowing exploratory analyses or use in other classification, clustering or regression tasks. Thus an autoencoder can be trained, for example, to create vector representations of texts that will later be used to train a sentiment analysis model with another algorithm – a Logistic Regression or a Support Vector Machine, for example.
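A minimal autoencoder sketch in PyTorch (ours; the dimensions and the random stand-in data are illustrative assumptions): the encoder and decoder are trained jointly to minimize reconstruction error, and the encoder's output is then reused as a learned feature vector.

    import torch
    import torch.nn as nn

    # Encoder: compresses a 64-dimensional input (e.g. a flattened 8x8 image)
    # into a 2-dimensional code. Decoder: reconstructs the input from the code.
    encoder = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 2))
    decoder = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 64))
    params = list(encoder.parameters()) + list(decoder.parameters())
    opt = torch.optim.Adam(params, lr=1e-3)

    X = torch.rand(500, 64)  # stand-in data; in practice, images or text vectors
    for step in range(200):
        recon = decoder(encoder(X))
        loss = nn.functional.mse_loss(recon, X)  # reconstruction error
        opt.zero_grad()
        loss.backward()
        opt.step()

    codes = encoder(X).detach()  # learned low-dimensional feature vectors
    print(codes.shape)           # torch.Size([500, 2])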
Generative models with deep neural networks are a more recent development and an active area of research. A generative model is a model that tries to capture the process that generated the observed data by modeling its probability distribution. This type of modeling is already widely known in Machine Learning, with models such as Gaussian Mixtures for clustering or Latent Dirichlet Allocation for topic analysis. However, very complex probabilistic generative models are typically intractable by analytic techniques and depend on variational approximations or Monte Carlo simulations to be fitted. Recently, progress has been achieved by delegating the complexity of the probabilistic models to neural networks and developing techniques that take advantage of algorithms such as Backpropagation to solve part of the problem. An example is the Generative Adversarial Network (Goodfellow et al. 2014; Makhzani et al. 2015; Denton et al. 2015), in which a neural network learns the probability distribution of a set of images so well that it becomes capable of generating new images that are practically indistinguishable, visually, from real ones. The intermediate representations learned by those generative models frequently contain a lot of information about the original objects, making it possible, for example, to extract information about the style of an image (Makhzani et al. 2015) and apply it to another (Gatys, Ecker, and Bethge 2015), or to recreate the same image from another point of view (Radford, Metz, and Chintala 2015).
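The adversarial idea fits in a short training loop: a generator G maps noise to candidate samples, a discriminator D is trained to separate real from generated samples, and G is trained to fool D. The PyTorch sketch below is ours and uses a toy one-dimensional distribution rather than images, purely to show the alternating updates.

    import torch
    import torch.nn as nn

    def real_batch(n):
        # toy "real data": samples from a normal distribution centered at 3
        return torch.randn(n, 1) + 3.0

    G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
    D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())
    opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
    opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
    bce = nn.BCELoss()
    ones, zeros = torch.ones(64, 1), torch.zeros(64, 1)

    for step in range(2000):
        # 1) train the discriminator to separate real from generated samples
        loss_d = bce(D(real_batch(64)), ones) + \
                 bce(D(G(torch.randn(64, 8)).detach()), zeros)
        opt_d.zero_grad(); loss_d.backward(); opt_d.step()

        # 2) train the generator to fool the discriminator
        loss_g = bce(D(G(torch.randn(64, 8))), ones)
        opt_g.zero_grad(); loss_g.backward(); opt_g.step()

    print(G(torch.randn(5, 8)).detach().flatten())  # samples cluster near 3.0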
5. Conclusions
Deep neural networks have shown themselves to be an extremely valuable tool for modeling big datasets with complex structure, particularly text and image data. This type of deep architecture is capable of producing end-to-end models without the need for expensive initial feature engineering. Besides, the representations learned by the neural networks keep information about the original dataset and can serve as input for other algorithms and analyses. Given the richness of textual datasets that are difficult to explore using traditional techniques, unsupervised techniques using neural networks can generate new opportunities by allowing the use of raw data sources, eliminating an uncertain and expensive feature engineering process.
6. Note

1. Features. In Machine Learning jargon, covariates, control variables, explanatory variables or independent variables are typically called features. It is the same concept – they are the variables that will be input into the model to predict a dependent variable or target.

Author

Rafael S. Calsaverini
Rafael S. Calsaverini holds a Ph.D. in Physics from the University of São Paulo, with work in statistical modeling and machine learning. He has worked in the private market since 2012 as a Data Scientist, developing applications based on mathematical models and machine learning. In 2015 he joined the Brazilian unit of Experian DataLabs at Serasa Experian. E-mail: Rafael.Calsaverini@br.experian.com
References
BALDI, Pierre; SADOWSKI, Peter; WHITESON, Daniel. 2014. “Searching for Exotic Particles in
High-Energy Physics with Deep Learning.” Nature Communications 5. Nature Publishing Group.
DENTON, Emily L.; CHINTALA, Soumith; FERGUS, Rob; et al. 2015. “Deep Generative Image
Models Using a Laplacian Pyramid of Adversarial Networks.” In Advances in Neural Information
Processing Systems, 1486–94.
GATYS, Leon A.; ECKER, Alexander S.; BETHGE, Matthias. 2015. “A Neural Algorithm of Artistic
Style.” ArXiv Preprint ArXiv:1508.06576.
GOODFELLOW, Ian; POUGET-ABADIE, Jean; MIRZA, Mehdi; XU, Bing; WARDE-FARLEY, David;
OZAIR, Sherjil; COURVILLE, Aaron; BENGIO, Yoshua. 2014. “Generative Adversarial Nets.” In
Advances in Neural Information Processing Systems, 2672–80.
HEBB, Donald. 2005. The Organization of Behavior: a Neuropsychological Theory. Psychology Press.
HINTON, Geoffrey E.; OSINDERO, Simon; TEH, Yee-Whye. 2006. “A Fast Learning Algorithm for
Deep Belief Nets.” Neural Computation 18 (7). MIT Press: 1527–54.
JORDAN, Michael I. 1994. “A Statistical Approach to Decision Tree Modeling.” In Proceedings of
the Seventh Annual Conference on Computational Learning Theory, 13–20. ACM.
JORDAN, Michael I. 2014. “AMA: Michael I. Jordan.” Reddit. /r/MachineLearning/. https://www.reddit.com/r/MachineLearning/comments/2fxi6v/ama_michael_i_jordan/ckelmtt?context=3.
KINGMA, Diederik P.; MOHAMED, Shakir; REZENDE, Danilo J.; WELLING, Max. 2014. “Semi-Supervised Learning with Deep Generative Models.” In Advances in Neural Information Processing
Systems, 3581–89.
KOLLER, Daphne; FRIEDMAN, Nir. 2009. Probabilistic Graphical Models: Principles and Techniques.
MIT press.
LECUN, Yann; BENGIO, Yoshua; HINTON, Geoffrey. 2015. “Deep Learning.” Nature 521 (7553).
Nature Publishing Group: 436–44.
LOWE, David G. 1999. “Object Recognition from Local Scale-Invariant Features.” In Computer
Vision, 1999. the Proceedings of the Seventh IEEE International Conference on, 2:1150–57. IEEE.
MACKAY, David J. C. 1992. “A Practical Bayesian Framework for Backpropagation Networks.”
Neural Computation 4 (3). MIT Press: 448–72.
MAKHZANI, Alireza; SHLENS, Jonathon; JAITLY, Navdeep; GOODFELLOW, Ian. 2015.
“Adversarial Autoencoders.” ArXiv Preprint ArXiv:1511.05644.
MCCULLOCH, Warren S.; PITTS, Walter. 1943. “A Logical Calculus of the Ideas Immanent in
Nervous Activity.” The Bulletin of Mathematical Biophysics 5 (4). Springer: 115–33.
MEZARD, Marc; MONTANARI, Andrea. 2009. Information, Physics, and Computation. Oxford
University Press.
NEAL, Radford M. 2012. Bayesian Learning for Neural Networks. Vol. 118. Springer Science &
Business Media.
PARKER, David B. 1982. “Learning Logic. Invention Report S81-64, File 1, Office of Technology
Licensing.” October, Stanford University.
RADFORD, Alec; METZ, Luke; CHINTALA, Soumith. 2015. “Unsupervised Representation
Learning with Deep Convolutional Generative Adversarial Networks.” ArXiv Preprint
ArXiv:1511.06434.
ROSENBLATT, Frank. 1957. The Perceptron, a Perceiving and Recognizing Automaton Project Para.
Cornell Aeronautical Laboratory.
RUMELHART, David E.; HINTON, Geoffrey E.; WILLIAMS, Ronald J. 1986. “Learning Representations by Back-Propagating Errors.” Nature 323: 533–36.
UNTERTHINER, Thomas; MAYR, Andreas; KLAMBAUER, Günter; HOCHREITER, Sepp. 2015.
“Toxicity Prediction Using Deep Learning.” ArXiv Preprint ArXiv:1503.01445.
VIOLA, Paul; JONES, Michael J. 2004. “Robust Real-Time Face Detection.” International Journal of
Computer Vision 57 (2). Springer: 137–54.
WERBOS, Paul. 1974. “Beyond Regression: New Tools for Prediction and Analysis in the
Behavioral Sciences.”
GAL, Yarin. 2015. “What My Deep Model Doesn’t Know...” Blog. Yarin Gal. http://mlg.eng.cam.ac.uk/yarin/blog_3d801aa532c1ce.html.
GAL, Yarin; GHAHRAMANI, Zoubin. 2015. “Dropout as a Bayesian Approximation: Insights and
Applications.” In Deep Learning Workshop, ICML.
BENGIO, Yoshua. 2009. “Learning Deep Architectures for AI.” Foundations and Trends in Machine
Learning 2 (1). Now Publishers Inc.: 1–127.
YOSINSKI, Jason; CLUNE, Jeff; BENGIO, Yoshua; LIPSON, Hod. 2014. “How Transferable Are
Features in Deep Neural Networks?” In Advances in Neural Information Processing Systems, 3320–28.
Challenges of the Companies in
view of the Anticorruption Law
Franklin Mendes Thame
Summary
The Anticorruption Law – Law No. 12.846/2013 – has improved the country's regulatory framework for the fight against corruption. Such regulation is an essential instrument to correct market flaws, mould conduct, change relationship standards, improve prevention practices and control companies' behavior. The law aims to fight impunity in business with the public sector, especially in economic segments more susceptible to corrupt practices.
1. Introduction
Anticorruption Law 12.846/2013 established the strict civil and administrative liability of companies that practice harmful acts against the public administration, filling an important gap in Brazil's legal order and punishing corrupters. Companies are thus called to reflect and act so as to protect themselves from punishment for wrongdoing before the public sector.
In view of the possibility of falling within the framework of the Law, companies that deal with the public sector and regulatory agencies must now take greater care when they:
• participate in bids and concessions;
• request licensing, authorizations, approvals and permits;
• contract partners, third parties and suppliers;
• request subsidized loans from official financial institutions; and
• apply for authorizations to commercialize new products and services, among other actions.
Companies' awareness of the need to protect themselves from sanctions deriving from crimes in their relationship with the public sector implies a cultural change regarding practices aimed at obtaining advantages when acting as suppliers and service providers to that sector. Companies therefore need to structure specific governance for the integrated management of risks and adopt compliance policies, since certain behaviors justified by the bureaucratic procedures of public bodies (excess of laws, regulations, and requirements for permits and licenses) may fall within the scope of the Anticorruption Law.
Anticorruption actions and strategies in companies shall be transverse, reaching all hierarchical levels and all administrative areas. Actions to face corruption are based on ethics, fair competition, efficiency and merit, and are included in the best corporate governance practices, such as integrated risk management, a compliance structure, human capital development and the management of suppliers and partners. The anticorruption policy must be part of a company's strategic policy in order to preserve its greatest value: its reputation.
2. Self-regulation
A company shall define its policies and self-regulation actions evidencing its commitment to the guiding principles of ethics, attributing responsibilities to the areas involved in managing the risks of possible impacts deriving from its activities with the public sector.

The scope and contents of a company's prevention policy in relation to the Anticorruption Law are defined according to its:
• size;
• nature of activity;
• peculiarities of its business lines; and
• forms of action before the public sector.
Initially, the company must map its dealings with the public sector, from the mere obtaining of a license or permit to a complex official authorization for launching a new product. Once these dealings are mapped, the company must identify the risks, classify them according to their relevance and proportionality, and define the areas in charge.

Subsequently, the company must identify the employees and outsourced parties involved (CNPJs, i.e., corporate taxpayer registrations, and CPFs, i.e., individual taxpayer registrations) and create a “risk matrix” with information obtained about them from information companies such as Serasa Experian, from the media, from public websites containing legal and procedural information in the civil, labor and criminal spheres, and from public registries and lists of those whose conduct has been deemed disreputable.
The company must further establish standards of prudential conduct and actions capable of preventing crimes, so as to mitigate eventual judicial sanctions. The procedures adopted must enable defense allegations in judicial actions, provide comfort and legal security, and demonstrate that the company was diligent in its business with the public sector. An important reference is the existence of a compliance program consistently demonstrating the strategies and actions intended to avoid acts that could be considered corrupt. The program must ensure that the principles of prevention and precaution are in place to face any and all acts of corruption.
The company's internal standards should reflect the anticorruption legislation. Fulfilling such standards shows that the company has started to weigh ethical and moral factors together with economic and financial ones. Such standards will be part of the integrated risk management that protects the company from operational and reputational risks, crimes, punishments and joint and several liability, among others.
3. Clean Company
Anticorruption Law No. 12.846/2013, known as the Clean Company Law, therefore has the purpose of regulating the conduct, and the punishment, of companies for corrupt acts practiced by their direct or outsourced representatives against the public administration.

The law can be applied against companies that corrupt public agents with graft and bribery, defraud public bids or contracts, violate – by means of adjustment or agreement – the competitive character of a bidding procedure, or obtain benefits through corrupt acts.
This is a case of strict liability: the company may be held liable in cases of corruption regardless of evidence of fault. The company will answer before judicial authorities for corrupt acts practiced by its employees, by an outsourced company or by an outsourced employee, even without the direct involvement of its owners. The company will therefore be punished if it obtained benefits by means of a corrupt act, including those resulting from the administrative improbity of public employees.
The fine imposed by judicial authorities may vary from 0.1% to 20% of the gross revenue of the fiscal year preceding the filing of the administrative proceeding or, whenever such calculation is not possible, may reach R$ 60 million. Another punishment, directly affecting the company's reputation, is the publication of the condemning judgment in the media.
The law prohibits new companies organized by partners of disreputable companies included in the National Registry of Disreputable and Suspended Companies (CEIS), whether in their own names or in a hidden manner, from contracting with the public administration. Controlling companies, controlled companies, affiliates and, within the sphere of the respective contract, consortium members will be held jointly and severally liable for paying the fine and fully redressing the damage caused. In the event of consolidation or merger, the successor's liability will be restricted to the obligation to pay the fine and to fully redress the damage caused, up to the limit of the transferred assets.
The Federal Government, the States, the Federal District, the Municipalities and the Public Prosecution Office may file actions seeking to apply the following sanctions to infringing legal entities: I. loss of assets, rights or values; II. suspension or partial interdiction of their activities; III. mandatory dissolution of the legal entity; and IV. prohibition from receiving incentives, subsidies, subventions or donations from public entities and public financial institutions for a term of 1 to 5 years.
In accordance with article 7 of Law No. 12.846, the following items will be considered in the imposition of sanctions:
I. the seriousness of the infringement;
II. the advantage obtained or intended by the infringing party;
III. the completion or not of the infringement;
IV. the level of injury or danger of injury;
V. the negative effect caused by the infringement;
VI. the economic status of the infringing party;
VII. the cooperation of the legal entity in investigating the infringements;
VIII. the existence of internal mechanisms and procedures concerning integrity, auditing and incentives for reporting irregularities, and the effective application of ethics and conduct codes within the legal entity;
IX. the value of the contracts maintained by the legal entity with the injured public body or entity.
4. Decree No. 8.420/15
Decree No. 8.420/15, which regulates Law No. 12.846/2013, provides for the administrative liability of legal entities for acts against the public administration and establishes that an effective Compliance Program must contain:
I. ethics and conduct codes;
II. commitment from upper management;
III. training for employees and outsourced parties;
IV. monitoring and periodic audits;
V. a communication channel for guidance and the reporting of irregularities;
VI. an investigation policy, including corrective actions;
VII. a policy for contracting outsourced parties, and other mechanisms specific to each segment.
The company's Compliance area is responsible for ensuring that officers, employees, business partners, outsourced parties and suppliers fulfill the company's policies, code of ethics, laws and regulations.

The existence of a Compliance Program will be a mitigating factor in the imposition of administrative sanctions if the company is prosecuted for wrongdoing.
Many companies in Brazil have already adhered to such practices, since they follow the conduct codes of their countries of origin, where there are advanced laws in this regard: in the United States, the FCPA (Foreign Corrupt Practices Act) and the SOA (Sarbanes-Oxley Act); and in the United Kingdom, the UKBA (UK Bribery Act).
5. Conclusion
It is estimated that Brazil loses the frightening amount of R$ 50 billion to R$ 80 billion every year to corruption. Corruption has become the country's main problem, since it destroys confidence among economic agents, hinders investments, destabilizes the economy, reduces tax collection and withdraws fundamental rights from all Brazilians.

Currently, 355 bills on corruption initiated by deputies and 173 initiated by senators, totaling 528 proposals, are under consideration in the National Congress. It is thus unlikely that any type of corruption has been left out of these bills.
The Brazilian Anticorruption Law established the strict civil and administrative liability of companies for harmful acts against the public administration, filling an important gap in Brazil's legal order for punishing corrupters. Companies are thus called to reflect and act so as to protect themselves from punishment for wrongdoing before the public sector.
The Anticorruption Law will direct organizations toward a new way of doing business with the public sector, requiring the adoption of good corporate governance practices: transparency at all hierarchical levels, strict legal compliance, segregation of duties, outsourcing policies, a program for defining specific risks, and internal controls and audits.
The importance of the Compliance Program should be pointed out: it defines conduct that inhibits fraud, money laundering, the exercise of undue influence and conflicts of interest involving employees, partners, outsourced parties and suppliers, thus making business with the public sector more ethical, transparent, fair and balanced.

The inclusion of the Anticorruption Law's requirements in a company's strategic management subjects the company's activity to morality and legality. The company's ethical attitudes thus create sustainable value, multiply, and start to influence the whole of society.
Laws and condemnations aimed at punishing crimes such as corruption, tax fraud and money laundering have added to the efforts to develop the Brazilian productive sector towards sustainability.

Author
Franklin Mendes Thame
Agronomic Engineer and Business Administrator, specialized in agribusiness, the development of financial products, and sustainability. He worked for 35 years at Banco Noroeste and Banco Santander, and served as Rural Credit Officer and as an Officer at FEBRABAN (the Brazilian Bank Federation). At present, he is a Product Manager at Serasa Experian. E-mail: Franklin.Thame@br.serasaexperiam.com
Increased Capital Allocation
to Operational Risk and
its Implications for Local
Financial Institutions
Marcelo Petroni Caldas
Oswaldo Pelaes Filho
Frederico Turolla
Abstract
The 2008 crisis triggered a worldwide shift towards a
new way of regulating Financial Institutions (FIs), as a result of
concerns over their solvency. Operational risk, however, was little addressed within the context of global changes, as the focus
was on improving capital quality and deepening requirements
associated with liquidity risk. After many studies and much reflection, the Basel Committee published a draft covering a new
proposed allocation of capital for operational risk. The proposal aims to make the operational risk charge more risk-sensitive by considering additional aspects for required capital calculation purposes, such as volumes and transactions conducted, among
other elements. Studies show that the impact tends to be significant and the banking industry has been moving to show regulators and the Committee the idiosyncrasies present in the
process. The topic is recent and literature is scarce from the
International Business angle, and addressing it contributes to
understanding the concepts of Financial Institution internationalization, home-host regulation, psychic distance, risk management, and internal financial system controls. This article researched secondary data, including documentary analysis, to
determine the impact of these regulatory measures on local
and global Financial Institutions in Brazil. It finds that the five
banks included in the analysis will experience material impacts
to their capital in the absence of changes since the study’s base
date. Given this finding, the alternative lies in reinforcing better-quality capital and reducing business in certain niches.
1. Introduction
Operational risk is a reasonably new subject compared to credit and market risks. Understanding it requires extensive knowledge of financial and non-financial firms, as it may arise in any line of business. The issue therefore relates not only to understanding this risk, but also to measuring it and calculating a capital tranche that properly reflects its allocation. Ever since operational risk was first introduced in Basel II¹, in 2004, the discussion has led to heated debates between advocates and critics of the method, and evolved little until the release of the October 2014 draft from the Basel Committee², intended to modify the standard calculation model for operational risk capital and make it more risk-sensitive.
The purpose of this article is to assess the impact that the Basel Committee's new capital allocation regulation may have on the required capital of the Financial Institutions at hand. We address the theoretical concepts based on the analysis of documents produced by the Basel Committee and of the risk-management reports disclosed by Financial Institutions on their websites, which reveal the capital allocated to operational risk.
2. Regulatory Development and Operational Risk
As is common knowledge, the first capital accord focused primarily on credit risk, which is defined as:

The risk that the counterparty to a transaction could default before the final settlement of the transaction's cash flows. An economic loss would occur if the transactions or portfolio of transactions with the counterparty has a positive economic value at the time of default⁴.
The first accord also disseminated an indicator referred to as the “Basel Index” (BIS, International Convergence of Capital Measurement and Capital Standards, July 1988), which is in constant use and represents a relevant solvency driver for financial institutions. The goal was for only internationally active Financial Institutions to use the metric, but the simplicity of the formula led to the concept's massification and its consequent implementation even at banks with no global operations. After the publication of the first accord, an amendment came out that specifically addressed market risk⁵, an emerging risk that required distinctive treatment. Market Risk⁶ is defined as:
The risk of losses in on and off-balance-sheet positions arising from movements
in market prices. The risks subject to this requirement are:
• The risks pertaining to interest rate related instruments and equities in the trading book;
• Foreign exchange risk and commodities risk throughout the bank.
Finally, the New Basel Capital Accord (or simply Basel II) consolidated
and improved upon the concepts of credit risk and market risk, and implemented
the concept of operational risk. It was a novelty at the time, as this was a risk that the market perceived but that had not yet been addressed by a supranational entity; Basel II provided guidance on how to address this risk from the governance viewpoint and laid out three ways to calculate capital for allocation.
governance viewpoint and laid out three ways to calculate capital for allocation.
It was a revolution at the time, and it is worth emphasizing Basel II’s concept for
Operational Risk:
The risk of loss resulting from inadequate or failed internal processes, people and
systems or from external events. This definition includes legal risk, but excludes
strategic and reputational risk.
The concept was developed locally as well, and in 2006 Brazilian regulators chose to publish a standard – Resolution #3380⁷ – that is deemed the local best practice, requiring the implementation of operational-risk management frameworks at Financial Institutions and describing concepts, entry methods, responsibilities and scope of application for the entities authorized to operate under the oversight of the Central Bank of Brazil. It is worth mentioning that only in April 2008 was the standard governing the allocation of capital to operational risk implemented in Brazil: Circular #3383/08⁸ laid out procedures to calculate the operational risk tranche using three methods:
Method                         Calculation                                        Coefficient

1. Basic Indicator             Gross income, with no segregation by business     15% of gross income
                               lines; average of the past three years

2. Alternative Standardized    Gross income segregated into 8 business lines,    12%-18% of gross income, applied
                               where 2 business lines (commercial and retail)    to the 8 business lines
                               use the credit portfolio instead of gross
                               income; average of the past 3 years

3. Simplified Alternative      Gross income segregated into 2 business lines;    15% and 18% of gross income,
   Standardized                average of the past 3 years                       applied to the 2 business lines

Source: Circular #3383/08
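As a worked illustration of the first row of the table (a sketch with hypothetical figures; regulatory refinements, such as the treatment of years with negative gross income, are omitted):

    # Hypothetical gross income figures for the past three years (R$ million).
    gross_income = [1_200.0, 1_350.0, 1_500.0]

    # Basic Indicator method, as summarized in the table: 15% of the
    # three-year average of gross income (assumed here as a simple average).
    capital = 0.15 * sum(gross_income) / len(gross_income)
    print(capital)  # 202.5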
Financial Institutions have since then had to choose their method for compliance. Banks deemed large locally have, according to their respective publications, chosen the Alternative Standardized Approach (ASA). This has to do with a smaller impact on capital allocation, as two business lines (commercial and retail) use an average spread of 3.5%⁹ (as per Basel II and as adopted by the Brazilian supervisory authority in Circular #3383/08). This is quite below practice in
the Brazilian financial system, according to the Credit Outlook Report (Relatório
do Panorama de Crédito) published by industry association Febraban for the base
date of October 2015, showing an average consolidated spread of 15.5% in October 2013-October 2015. It is worth emphasizing that in 2010-2014, average capital
allocation to operational risk was R$ 13.8 billion for the five largest banks in Brazil
(Bradesco, Itaú, Caixa Econômica Federal, Banco do Brasil and Santander) which represents, on average, 5% of the average reference equity of the same Financial Institutions in that period.
Circular #3383/08 gave way to Circular #3640/13¹⁰, which laid out procedures for the standardized approach's calculation of the share of risk-weighted assets associated with the required capital for operational risk. In effect, it meant alignment with the international standards for operational risk calculations, whereby the amount of the allocation obtained by means of the concepts described earlier in this article must be leveraged by 11%. Finally, regardless of the regulation's number, we find that standardized models are not helpful for measuring the risk itself, as their rationale is that the greater the revenue, the bigger the institution's risk. This is not logical insofar as the method does not account for control and business environment factors, whose influence can prevent income increases from necessarily increasing risk and, consequently, the potential losses arising from the additional risk stemming from increased business.
We stress that a local regulation (Circular #3647/13¹¹) was published in 2013 to lay out minimum requirements for the use of the advanced internal-model approach in calculating the tranche for operational risk. Indeed, by publishing that regulation, regulators excessively limited local banks' use of the advanced model. The document allows no margin for partial use of the model, offers no incentives for introducing systemic improvements in tandem with the implementation of the model, and does not accommodate processes that are not fully automated. These are the characteristics of a perfect environment, which makes implementation very difficult at complex firms active in varied business lines, as is the case in Brazil, where certain financial institutions own large insurance companies with millions of customers that are subject to Solvency II (“Basel for insurers”). We believe the matter can be explored in a different thread of research, as it provides a broad field for reflection on operational risk management.
4. The Advent of the 2008-2009 Economic Crisis
As defined in issue 45 of the information and debates review of the Applied Economic Research Institute (Instituto de Pesquisa Econômica Aplicada – IPEA):
Subprimes were characterized as higher risk, or bottom-shelf mortgages. The
international marketplace’s excess liquidity in recent years led US banks and
financial companies to finance home purchases at low interest rates for obligors with poor credit histories, using the property itself as the sole collateral. Then came the drop in property prices and banks were threatened by the
prospect of not getting their loans repaid.
The 2008 crisis, stemming from the collapse of subprime operations in the US market, triggered a series of reflections on systemic liquidity risk and the need for better-quality capital at Financial Institutions. As widely publicized, the consequence was a series of regulations affecting the financial market, notably Basel III¹² (A Global Regulatory Framework for More Resilient Banks and Banking Systems) and the Dodd-Frank Wall Street Reform and Consumer Protection Act¹³, to mention only the two most important regulatory standards to emerge in response to the crisis. This worldwide stance was led by the G20, the group formed by the world's 20 most relevant economies, which was concerned with preventing a global-scale economic depression and, therefore, with increasing the regulation of the global financial system. The FSB (Financial Stability Board) was created to coordinate global regulation and check for the implementation of its policies.
Although all eyes were on the risks mentioned earlier, operational risk reemerged in the discussions due to concerns over mortgages and, consequently, over the allocation of capital to operational risk. In connection with mortgages – since it is unclear to what extent fraud and process failures were the main cause of the problem – loan write-offs might in fact reflect operational risk. On the capital allocation side, the red flag went up for regulators, because capital allocation under the standardized methods is calculated from the average of the three latest years' gross income. The consequence emerged years after the crisis, with the finding that bank incomes had dropped because of it, with the immediate effect of reducing allocated capital. While Basel already intended to review the method, this fact was the missing ingredient to trigger the proposed change.
5. Post-Crisis Actions
In reality, Basel took some time to publish a response to the crisis that addressed the deficiencies found in the concept of capital allocation to operational risk. After impact studies involving financial institutions and several analyses, it published Operational Risk – Revisions to the simpler approaches14 in October 2014. With this document, Basel embraced an emphasis on simplicity and greater sensitivity to risk, and the three existing allocation methods were replaced with a single method. However, the increased risk sensitivity is still wanting, as most of the vectors remain unchanged.
The document finds that gross income must give way to a new variable called the BI (business indicator), intended to capture the major banking businesses through three components: an interest component, a services component and a financial component. The new calibration aims to increase allocation and, consequently, become more sensitive to risk. It is worth noting that countries with high net interest margins (NIM) were also considered in the Basel draft, but the treatment for this particular situation is still under analysis. Please note that a change in the spread used for capital allocation calculations (from 3.5% to a greater figure) would mean a marked change in the capital approach for Brazilian banks, and its impact would materially affect the leverage levels of the financial industry as a whole. It is important to correlate this with macroeconomic effects: something that shifts capital allocation upwards will result in less credit available for clients and/or end consumers to purchase goods and services. In sum, after several changes in standards and regulations, financial institutions have been experiencing something that is increasingly a part of managers' everyday lives: the scarcity of a resource called "capital".
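For illustration only, the sketch below (in Python) assembles a business indicator from the three components named above. The account values and the simplified component definitions are our own assumptions, not the Basel Committee's exact formulas:

    # Illustrative sketch of the proposed Business Indicator (BI).
    # The component formulas are simplified readings of the Basel
    # consultative document; the actual definitions are more detailed.

    def business_indicator(interest_income, interest_expense,
                           fee_income, fee_expense,
                           net_trading_pl, net_banking_pl):
        interest_component = abs(interest_income - interest_expense)
        services_component = fee_income + fee_expense
        financial_component = abs(net_trading_pl) + abs(net_banking_pl)
        return interest_component + services_component + financial_component

    # Hypothetical figures (same monetary unit throughout):
    bi = business_indicator(interest_income=900, interest_expense=400,
                            fee_income=250, fee_expense=80,
                            net_trading_pl=-60, net_banking_pl=30)
    print(bi)  # 500 + 330 + 90 = 920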
The new formulation for capital allocation in the Basel draft, based on the business indicator, considers coefficients ranging from 10% to 30% and, consequently, increases the allocation of institutions' capital, particularly compared with the previous 8%–12% range over 8 business lines. Application of the formula is extremely simple: it distributes the business indicator across the selected ranges, as shown next:
BI (billion €)       Coefficient    Coefficient by range
0 – 100              10%            10%
>100 – 1,000         13%            10% – 12.7%
>1,000 – 3,000       17%            12.7% – 15.57%
>3,000 – 30,000      22%            15.57% – 21.36%
>30,000              30%            21.36% – 30%
Source: Operational Risk – Revisions to the simpler approaches
Therefore, one can use the Basel document to simulate how the distribution would take place under the newly proposed formulation:
Bank    BI         Calculations                                 Allocated Capital
A       $ 80       $80 × 10%                                    $ 8
B       $ 800      $100 × 10% + $700 × 13%                      $ 101
C       $ 2,000    $100 × 10% + $900 × 13% + $1,000 × 17%       $ 297
Source: Operational Risk – Revisions to the simpler approaches
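The layered application of the coefficients can be reproduced in a few lines of code. The following Python sketch assumes only the thresholds and coefficients of the table above and reproduces the three example banks:

    # Marginal ("layered") application of the proposed coefficients:
    # each slice of the Business Indicator is charged at its own rate.

    BUCKETS = [  # (upper bound of slice, coefficient)
        (100, 0.10),
        (1_000, 0.13),
        (3_000, 0.17),
        (30_000, 0.22),
        (float("inf"), 0.30),
    ]

    def allocated_capital(bi):
        capital, lower = 0.0, 0.0
        for upper, coeff in BUCKETS:
            if bi > lower:
                capital += (min(bi, upper) - lower) * coeff
            lower = upper
        return capital

    for bank, bi in (("A", 80), ("B", 800), ("C", 2_000)):
        print(bank, round(allocated_capital(bi), 2))  # A: 8.0, B: 101.0, C: 297.0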
6. Quantitative Analysis
The quantitative analysis sought to check for the presence of a correlation between the Basel Index (BI; not to be confused with the business indicator above) and certain selected indicators for the Brazilian banks Bradesco, Itaú, CEF – Caixa Econômica Federal and Banco do Brasil, and for the foreign bank Santander. The indicators tested were Reference Equity (RE), Required Reference Equity (RRE), the tranches for Operational Risk (OR), Credit Risk (CR) and Market Risk (MR), and Return on Equity (ROE). To this end, we collected the relevant data for the 2010–2014 period.
Taking into account comments from the financial market on the potential change in the rules for allocating capital to operational risk and the impact such a change might cause, we calculated the Operational Risk tranche multiplied by 2 (ORx2) and by 3 (ORx3), as the market estimates the impact for financial institutions at a two- to three-fold increase in the capital allocated to operational risk. It is important to emphasize that the Basel Index was recalculated using ORx2 and renamed BI2x, and the Basel Index derived from ORx3 was renamed BI3x.
The next table lists the columns of data that provided the basis for calculating the correlations:
Data Table (R$ million) – columns: Reference Equity (RE), RRE, RREx2, RREx3, OR, ORx2, ORx3, CR, MR, Basel Index, Basel Index x2, Basel Index x3, ROE (values omitted).
Source: Risk Management Reports – Banks’ Websites
Columns RREx2, RREx3, ORx2 and ORx3 developed by the authors
All correlation results were obtained using Excel's data analysis tools. The correlations were initially calculated based on the banks' average data for each year analyzed, and the correlation between the BI and the variables at hand over 2010–2014 yielded the following:
Correlation Results
Correlation    BI_2010    BI_2011    BI_2012    BI_2013    BI_2014
OR             -2%        74%        74%        26%        51%
RRE            14%        71%        52%        28%        51%
RE             -15%       80%        84%        42%        58%
CR             13%        66%        43%        25%        48%
MR             69%        15%        53%        60%        59%
ROE            7%         -21%       -41%       37%        94%

Correlation    BI2_2010   BI2_2011   BI2_2012   BI2_2013   BI2_2014
OR x2          -13%       5%         37%        65%        85%
RRE x2         -44%       -7%        9%         51%        75%
RE             -91%       -49%       47%        60%        74%
CR             -46%       -2%        -7%        49%        72%
MR             -67%       -58%       76%        22%        26%
ROE            49%        82%        -82%       12%        67%

Correlation    BI3_2010   BI3_2011   BI3_2012   BI3_2013   BI3_2014
OR x3          -13%       5%         37%        65%        85%
RREx3          -43%       -6%        10%        52%        75%
RE             -91%       -48%       46%        60%        74%
CR             -46%       -2%        -7%        49%        72%
MR             -67%       -58%       75%        22%        26%
ROE            49%        82%        -82%       12%        67%
Source: The authors
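As a minimal sketch, the same year-by-year correlations could be reproduced outside Excel. The snippet below assumes the bank-level data were exported to a CSV file; the file name and column names are hypothetical:

    # Minimal sketch: Pearson correlations between the Basel Index and
    # the other indicators, computed per year across the five banks.
    # The CSV layout and column names are hypothetical assumptions.
    import pandas as pd

    df = pd.read_csv("bank_indicators.csv")
    # expected columns: bank, year, basel_index, OR, RRE, RE, CR, MR, ROE

    indicators = ["OR", "RRE", "RE", "CR", "MR", "ROE"]
    for year, group in df.groupby("year"):
        corrs = {ind: group["basel_index"].corr(group[ind]) for ind in indicators}
        print(year, corrs)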
The variables analyzed may show correlations from “very weak” to “very
strong”, as shown next:
ρ value (+ or -)    Interpretation
0.00 – 0.19         Very weakly correlated
0.20 – 0.39         Weakly correlated
0.40 – 0.69         Moderately correlated
0.70 – 0.89         Strongly correlated
0.90 – 1.00         Very strongly correlated
The conclusions of the correlation analysis are presented in the summary tables of the analyzed correlations:
Summary of the Analysis of Correlations with BI

Year    Very Strong    Strong             Moderate
2010    -              -                  MR
2011    -              OR, RRE, RE        CR
2012    -              OR, RE             RRE, CR, MR, ROE
2013    -              -                  RE, MR
2014    ROE            -                  OR, RRE, RE, CR, MR

Summary of the Analysis of Correlations with BI2x

Year    Very Strong    Strong             Moderate
2010    RE             -                  RRE, CR, MR, ROE
2011    -              ROE                RE, MR
2012    -              MR, ROE            RE
2013    -              -                  OR, RRE, RE, CR
2014    -              OR, RRE, RE, CR    ROE

Summary of the Analysis of Correlations with BI3x

Year    Very Strong    Strong             Moderate
2010    RE             -                  RRE, CR, MR, ROE
2011    -              ROE                RE, MR
2012    -              MR, ROE            RE
2013    -              -                  OR, RRE, RE, CR
2014    -              OR, RRE, RE, CR    ROE
Source: The authors
We may conclude that there appears to be no consistently strong correlation between the variables analyzed and the BI, BI2x and BI3x Basel Indices. The only variable with a larger number of strong correlations is ROE, but the correlation is positive in some years and negative in others, so that no trend can be identified between the two variables. One reason for this lack of a trend, in spite of the strong correlation, may be the relationship between the Basel indices and other variables that also relate to ROE and have not been addressed in this study.
7. Suggestions and Limitations of the Quantitative Analysis
No regression-based statistical test was carried out, as the historical data were too scarce for good adherence of the data to reality: this study only used information available since the implementation of Basel III. For future research, we suggest using a more comprehensive database, that is, panel data with a larger number of observations and variables, which may enable the use of regressions to analyze the data, as well as an understanding of the correlations between the variables affecting the analyzed indicators.
8. Impacts of the Increased Capital Allocation
It is important to point out that another aspect this study addresses is the impact that a potential increase in the allocation of capital to operational risk may cause in the economy through the availability of credit to obligors.
As mentioned, market rumors speculate that the new method for capital allocation to operational risk tends to increase the current tranche by two (ORx2) to three (ORx3) times. Using average amounts to calculate the impact on the five financial institutions over the five years, the following emerges:
Figure 1 – all variables evolve regularly as published
Average RE           RRE                  Basel Index
R$ 418.9 billion     R$ 280.2 billion     16.42%

Figure 2 – all variables evolve regularly as published, but the allocation to operational risk multiplies by 2
Average RE           RRE (x2)             Basel Index
R$ 418.9 billion     R$ 294.1 billion     10.48%

Figure 3 – all variables evolve regularly as published, but the allocation to operational risk multiplies by 3
Average RE           RRE (x3)             Basel Index
R$ 418.9 billion     R$ 308.0 billion     10.50%
Source: The authors
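The arithmetic behind the three scenarios can be sketched as follows. The average operational risk tranche of R$ 13.9 billion is not published directly above; it is implied by the difference between the RRE figures:

    # Sketch of the scenario arithmetic. The average OR tranche is
    # implied by the published RRE figures (294.1 - 280.2 = 13.9).
    avg_rre = 280.2   # R$ billion, base scenario
    avg_or = 13.9     # R$ billion, implied average OR tranche

    rre_x2 = avg_rre + avg_or       # 294.1: OR tranche doubled
    rre_x3 = avg_rre + 2 * avg_or   # 308.0: OR tranche tripled

    extra_capital = rre_x3 - avg_rre
    print(round(extra_capital, 1))  # 27.8, roughly the R$ 27.7 billion cited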
The estimates allow us to conclude that the potential increase in the allocation of capital to operational risk would remove from financial institutions the ability to generate loans in the amount of R$ 27.7 billion. In addition, we have not considered the potential leverage that this amount represents for financial institutions' lending capacity. Something that Basel and the local supervisor might consider is phased implementation by layers, such as: only 50% of the calculated amount in year 1, only 70% in year 2, and so forth until full allocation is reached. We understand that something along these lines may blunt the impact of the increased allocation on banks and therefore not excessively harm the real economy.
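Such a schedule is straightforward to express. In the sketch below, only the 50% and 70% steps come from the text; the remaining steps are illustrative assumptions:

    # Illustrative phase-in of the extra capital requirement.
    # Only the first two steps (50%, 70%) come from the text; the
    # remaining steps are assumptions to complete the schedule.
    extra_capital = 27.7  # R$ billion
    schedule = [0.50, 0.70, 0.85, 1.00]

    for year, share in enumerate(schedule, start=1):
        print(f"year {year}: R$ {extra_capital * share:.1f} billion allocated")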
9. Closing Statements
This article has explored the impacts of the changes in capital allocation to operational risk currently being studied by the Basel Committee, and their effect on local (i.e., Brazilian) banks. Our studies show a material impact, as the change may lead to allocations three times greater than the current ones. This significant impact affects the macroeconomy through a lower credit supply and its consequent effects on business firms and households. The new capital burden must therefore be aligned with the reality of the economic cycle at hand; otherwise, pari passu implementation might be proposed to blunt the relevant impacts that the new rule may have on the local and regional economy. We believe that the next step in researching these impacts is to repeat the study once the rule is effectively validated and published in its final form, in order to gain a precise sense of the impact, including for small and medium-sized financial institutions.
10. Notes:
1 Basel II: International Convergence of Capital Measurement and Capital Standards: a Revised Framework
2 The Basel Committee on Banking Supervision is a committee of banking supervision authorities formed by the chairmen of the central banks of the G10 members in 1975. The committee has senior representatives from the central banks of Belgium, Canada, France, Germany, Italy, Japan, Luxembourg, the Netherlands, Spain, Sweden, Switzerland, the United Kingdom and the United States. It usually meets at the Bank for International Settlements in Basel, where its permanent Secretariat lies.
3 Operational Risk – Revisions to the simpler approaches – consultative document
4 Basel II: International Convergence of Capital Measurement and Capital Standards: a Revised Framework – page 19; footnote 16 - http://www.bis.org/publ/bcbs128.pdf
5 Amendment to the capital accord to incorporate market risk, January 1996
6 Basel II: International Convergence of Capital Measurement and Capital Standards: a Revised Framework – page 157 - http://www.bis.org/publ/bcbs128.pdf
7 Resolution #3380/06, Central Bank of Brazil http://www.bcb.gov.br/pre/normativos/busca/downloadNormativo.asp?arquivo=/Lists/Normativos/Attachments/48239/Res_3380_v3_P.pdf
8 Circular #3383/08, Central Bank of Brazil http://www.bcb.gov.br/pre/normativos/busca/downloadNormativo.asp?arquivo=/Lists/Normativos/Attachments/47919/Circ_3383_v3_P.pdf
9 Basel II: International Convergence of Capital Measurement and Capital Standards: a Revised Framework – page 146; footnote 104 - http://www.bis.org/publ/bcbs128.pdf
10 Circular #3640/13 – Lays out procedures to calculate risk-weighted assets (RWA) relative to calculating required capital for operational risk according to the standardized approach (RWAOPAD) - http://www.bcb.gov.br/pre/normativos/busca/downloadNormativo.asp?arquivo=/Lists/Normativos/Attachments/48997/Circ_3640_v4_P.pdf
11 Circular #3647/13 – Lays out minimum requirements for the advanced approach to operational risk http://www.bcb.gov.br/pre/normativos/busca/downloadNormativo.asp?arquivo=/Lists/Normativos/Attachments/48990/Circ_3647_v2_P.pdf
12 Basel III: A Global Regulatory Framework for more resilient Banks and Banking systems - http://www.bis.org/publ/bcbs189.pdf
13 Wall Street Reform and Consumer Protection Act - http://www.cftc.gov/idc/groups/public/@swaps/documents/file/hr4173_enrolledbill.pdf
14 Operational Risk – Revisions to the simpler approaches - http://www.bis.org/publ/bcbs291.pdf
Authors
Marcelo Petroni Caldas
Bachelor of Business Administration and Accounting from Universidade Presbiteriana Mackenzie, post-graduate degree in Accounting and Financial Management from FAAP, MBA in Financial Management and Risk from Fipecafi – USP, and Master's candidate in International Business at ESPM. E-mail: m.petronic@terra.com.br
Oswaldo Pelaes Filho
Bachelor of Civil Engineering from FESP (Faculdade de Engenharia São Paulo) and of Business Administration from Universidade Presbiteriana Mackenzie, post-graduate degree in Accounting and Financial Management from FAAP, Master of Administration from PUC/SP, and doctoral candidate in International Management at ESPM. E-mail: oswaldo@espm.br
Frederico Turolla
Bachelor of Economics from Universidade Federal de Juiz de Fora, Master of Economics from
Brandeis International Business School, and Doctor of Economics from Fundação Getúlio Vargas (FGV). E-mail: fredturolla@pezco.com.br
References
ACHARYA, V. A Theory of Systemic Risk and Design of Prudential Bank Regulation. Journal of Financial Stability, Vol. 5, No. 3 (Sep. 2009), pp. 224–255.
AHARONY, J.; SWARY, I. Contagion Effects of Bank Failures: Evidence from Capital Markets. The Journal of Business, Vol. 56, No. 3 (Jul. 1983), pp. 305–322.
ALLEN, F.; CARLETTI, E.; GU, X. The role of banking in financial systems. In BERGER, A. N.; MOLYNEUX, P.; WILSON, J. O. S. (eds.), The Oxford Handbook of Banking. Second edition. Oxford: Oxford University Press, pp. 27–46, 2015.
BANCO CENTRAL DO BRASIL. 50 maiores bancos e o consolidado do Sistema Financeiro Nacional. http://www4.bcb.gov.br/fis/TOP50/port/Top50P.asp
BANK FOR INTERNATIONAL SETTLEMENTS. Basel Committee on Banking Supervision. International Convergence of Capital Measurement and Capital Standards: a Revised Framework. Switzerland, 2005.
BANK FOR INTERNATIONAL SETTLEMENTS. Global systemically important banks: updated assessment methodology and the higher loss absorbency requirement. http://www.bis.org/publ/bcbs255.pdf
BANK FOR INTERNATIONAL SETTLEMENTS. Sound Practices for the Management and Supervision of Operational Risk. Basel, 2003.
BLUNDELL-WIGNALL, A.; ATKINSON, P. Deleveraging, Traditional versus Capital Markets Banking and the Urgent Need to Separate and Recapitalise G-SIFI Banks. OECD Journal: Financial Market Trends, Vol. 2012/1, 2012.
FINANCIAL STABILITY BOARD. http://www.financialstabilityboard.org/publications/r_091107c.pdf
FINANCIAL STABILITY BOARD. Thematic Review on Supervisory Frameworks and Approaches for SIBs. http://www.financialstabilityboard.org/wp-content/uploads/Thematic-Review-on-Supervisory-Approachesto-SIBs.pdf
FURFINE, C. Interbank Exposures: Quantifying the Risk of Contagion. Journal of Money, Credit and Banking, Vol. 35, No. 1 (Feb. 2003), pp. 111–128.
GENNAIOLI, N.; SHLEIFER, A.; VISHNY, R. A Model of Shadow Banking. The Journal of Finance, July 2013.
LASTRA, R. M. Systemic risk, SIFIs and financial stability. Capital Markets Law Journal, 2011.
STERN, G. H.; FELDMAN, R. J. Too Big to Fail: The Hazards of Bank Bailouts, 2004.