Tuesday, September 20, 2016
19/9/16: Big Data Biases?
A very interesting and, compared to our more statistics-specific discussions in MBAG 8541A, broad topic is touched on in this book review: http://time.com/4477557/big-data-biases/. The basic point is that data analytics (from basic descriptive statistics, to inferential statistics, to econometrics and big data analysis) is subject to all the normal human biases the analysts themselves might possess. The problem, however, is that big data now leads the charge toward behaviour and choice automation.
The book review focuses on the ethical dimensions of this problem. There is also a remedial cost dimension: with automated behaviour based on biased algorithms, corrective action cannot take place at the moment of the automated choice itself, but only ex ante (by restricting the algorithms before deployment) or ex post (after the algorithm-enabled action has taken place). This, of course, magnifies the costs associated with controlling for biases.
One way or the other - the concept of biased algorithmic models certainly presents some food for thought!
Sunday, May 22, 2016
22/5/16: Lying and Making an Effort at It
The paper by Nadja Dwenger and Tim Lohse, “Do Individuals Put Effort into Lying? Evidence from a Compliance Experiment” (March 10, 2016, CESifo Working Paper Series No. 5805: http://ssrn.com/abstract=2764121), looks at “…whether individuals in a face-to-face situation can successfully exert some lying effort to delude others.”
The authors use a laboratory experiment in which “participants were asked to assess videotaped statements as being rather truthful or untruthful. The statements are face-to-face tax declarations. The video clips feature each subject twice making the same declaration. But one time the subject is reporting truthfully, the other time willingly untruthfully. This allows us to investigate within-subject differences in trustworthiness.”
What the authors found is rather interesting: “a subject is perceived as more trustworthy if she deceives than if she reports truthfully. It is particularly individuals with dishonest appearance who manage to increase their perceived trustworthiness by up to 15 percent. This is evidence of individuals successfully exerting lying effort.”
So you are more likely to buy a lemon from a lemon-selling dealer, than a real thing from an honest one... doh...
Some more ‘beef’ from the study:
“To deceive or not to deceive is a question that arises in basically all spheres of life. Sometimes the stakes involved are small and coming up with a lie is hardly worth it. But sometimes putting effort into lying might be rewarding, provided the deception is not detected.”
However, “whether or not a lie is detected is a matter of how trustworthy the individual is perceived to be. When interacting face-to-face two aspects determine the perceived trustworthiness:
- First, an individual’s general appearance, and
- Second, the level of some kind of effort the individual may choose when trying to make the lie appear truthful.”
The authors ask a non-trivial question: “do we really perceive individuals who tell the truth as more trustworthy than individuals who deceive?”
“Despite its importance for social life, the literature has remained surprisingly silent on the issue of lying effort. This paper is the first to shed light on this issue.”
The study actually uses two types of data from two types of experiments: “An experiment with room for deception which was framed as a tax compliance experiment and a deception-assessment experiment. In the compliance experiment subjects had to declare income in face-to-face situations vis-a-vis an officer, comparable to the situation at customs. They could report honestly or try to evade taxes by deceiving. Some subjects received an audit and the audit probabilities were influenced by the tax officer, based on his impression of the subject. The compliance interviews were videotaped and some of these video clips were the basis for our deception-assessment experiment: For each subject we selected two videos both showing the same low income declaration, but once when telling the truth and once when lying. A different set of participants was asked to watch the video clips and assess whether the recorded subject was truthfully reporting her income or whether she was lying. These assessments were incentivised. Based on more than 18,000 assessments we are able to generate a trustworthiness score for each video clip (number of times the video is rated "rather truthful" divided by the total number of assessments). As each individual is assessed in two different video clips, we can exploit within-subject differences in trustworthiness. …Any difference in trustworthiness scores between situations of honesty and dishonesty can thus be traced back to the effort exerted by an individual when lying. In addition, we also investigate whether subjects appear less trustworthy if they were audited and had been caught lying shortly before. …the individuals who had to assess the trustworthiness of a tax declarer did not receive any information on previous audits.”
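The trustworthiness score described in the quote is a simple ratio, and the within-subject design just compares each subject's two clips. A minimal sketch of how the score and the within-subject difference would be computed (the assessment counts below are made up for illustration, not the paper's data):

```python
# Hypothetical assessment counts per video clip, as
# (times rated "rather truthful", total assessments).
assessments = {
    "subject_1": {"truthful_clip": (55, 100), "deceptive_clip": (63, 100)},
    "subject_2": {"truthful_clip": (70, 100), "deceptive_clip": (68, 100)},
}

def trustworthiness_score(rated_truthful, total):
    """Share of assessors rating the clip 'rather truthful'."""
    return rated_truthful / total

# Within-subject difference: score when lying minus score when honest.
deltas = {}
for subject, clips in assessments.items():
    honest = trustworthiness_score(*clips["truthful_clip"])
    lying = trustworthiness_score(*clips["deceptive_clip"])
    deltas[subject] = lying - honest
```

A positive difference for a subject means the deceptive clip was rated ‘rather truthful’ more often than the honest one, i.e. the lying effort paid off.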
The main results are as follows:
- “Subjects appear as more trustworthy in compliance interviews in which they underreport than in compliance interviews in which they report truthfully. When categorizing individuals in subjects with a genuine dishonest or honest appearance, it becomes obvious that it is mainly individuals of the former category who appear more trustworthy when deceiving.”
- “These individuals with a dishonest appearance are able to increase their perceived trustworthiness by up to 15 percent. This finding is in line with the hypothesis that players with a comparably dishonest appearance, when lying, expend effort to appear truthful.”
- “We also find that an individual’s trustworthiness is affected by previous audit experiences. Individuals who were caught cheating in the previous period, appear significantly less trustworthy, compared to individuals who were either not audited or who reported truthfully. This effect is exacerbated for individuals with a dishonest appearance if the individual is again underreporting but is lessened if the individual is reporting truthfully.”
21/5/16: Manipulating Markets in Everything: Social Media, China, Europe
So, the Chinese Government swamps critical analysis with ‘positive’ social media posts, per this Bloomberg report: http://www.bloomberg.com/news/articles/2016-05-19/china-seen-faking-488-million-internet-posts-to-divert-criticism.
As the story notes: “stopping an argument is best done by distraction and changing the subject rather than more argument”.
So now, consider what the EU and European Governments (including Irish Government) have been doing since the start of the Global Financial Crisis.
They have hired scores of (mostly) mid-educated economists to write what effectively amounts to repetitive reports on the state of the economy, all endlessly cheering the state of ‘recovery’.
In several cases, we now have statistics agencies splitting data that was previously available in a single release across two separate releases, providing an opportunity to up-talk the figures for the media. Example: the Irish CSO release of the Live Register stats. In another example, the same data previously available in 3 files - the Irish Exchequer results - is now reported and released through numerous channels and replicated across a number of official agencies.
The result: any critical opinion is now drowned in scores of officially sanctioned presentations, statements, releases and claims, accompanied by puff pieces from complicit media and professional analysts (e.g. sell-side analysts and bond placing desks).
Chinese manipulating social media, my eye… take a mirror and add lights: everyone’s holding the proverbial bag…
21/5/16: Banks Deposit Insurance: Got Candy, Mate?…
Since the end of the [acute phase of the] Global Financial Crisis, European banking regulators have been pushing forward the idea that dealing with any future [of course never to be labelled ‘systemic’] banking crises will require a new, strengthened regime based on three pillars of regulatory and balance sheet measures:
- Pillar 1: Harmonized regulatory supervision and oversight over banking institutions (micro-prudential oversight);
- Pillar 2: Stronger capital buffers (in quantity and quality) alongside pre-prescribed ordering of bailable capital (Tier 1, intermediate, and deposits bail-ins), buffered by harmonized depositor insurance schemes (also covered under micro-prudential oversight); and
- Pillar 3: Harmonized risk monitoring and management (macro-prudential oversight)
All of this forms the core idea behind the European System of Financial Supervision. Per the EU Parliament (http://www.europarl.europa.eu/atyourservice/en/displayFtu.html?ftuId=FTU_3.2.5.html): “The objectives of the ESFS include developing a common supervisory culture and facilitating a single European financial market.”
Theory aside, the above Pillars are bogus, and I have commented on them on this blog and elsewhere. If anything, they represent a singular, infinitely deep confidence trap whereby policymakers, supervisors, banks and banks’ clients are likely to place even more confidence in the hands of the no-wiser regulators and supervisors who cluelessly slept through the 2000-2007 build-up of massive banking sector imbalances. And there is plenty of criticism of the architecture and the very philosophical foundations of the ESFS around.
Sugar buzz!...
However, generally, there is at least a strong consensus on the desirability of deposit insurance schemes, a consensus that stretches across all sides of the political spectrum. Here’s what the EU has to say about the scheme: “DGSs are closely linked to the recovery and resolution procedure of credit institutions and provide an important safeguard for financial stability.”
But what about the evidence to support this assertion? Why, there is a fresh study, with the ink still drying on it, via NBER (see details below) that looks into the matter.
Per NBER authors: “Economic theories posit that bank liability insurance is designed as serving the public interest by mitigating systemic risk in the banking system through liquidity risk reduction. Political theories see liability insurance as serving the private interests of banks, bank borrowers, and depositors, potentially at the expense of the public interest.” So at the very least, there is a theoretical conflict implied in a general deposit insurance concept. Under the economic theory, deposits insurance is an important driver for risk reduction in the banking system, inducing systemic stability. Under the political theory - it is itself a source of risk and thus can result in a systemic risk amplification.
“Empirical evidence – both historical and contemporary – supports the private-interest approach as liability insurance generally has been associated with increases, rather than decreases, in systemic risk.” Wait, but the EU says deposit insurance will “provide an important safeguard for financial stability”. Maybe the EU knows a trick or two to resolve that empirical regularity?
Unlikely, according to the NBER study: “Exceptions to this rule are rare, and reflect design features that prevent moral hazard and adverse selection. Prudential regulation of insured banks has generally not been a very effective tool in limiting the systemic risk increases associated with liability insurance. This likely reflects purposeful failures in regulation; if liability insurance is motivated by private interests, then there would be little point to removing the subsidies it creates through strict regulation. That same logic explains why more effective policies for addressing systemic risk are not employed in place of liability insurance.”
Aha: the EU would have to become apolitical when it comes to banking sector regulation, supervision, policies and incentives, subsidies and market supports and interventions in order to have a chance (not even a guarantee) that the deposit insurance mechanism will work to reduce systemic risk rather than increase it. Any bets on our chances of achieving such depoliticisation? Yeah, right, nor would I give that anything above 10 percent.
Worse, NBER research argues that “the politics of liability insurance also should not be construed narrowly to encompass only the vested interests of bankers. Indeed, in many countries, it has been installed as a pass-through subsidy targeted to particular classes of bank borrowers.”
So in basic terms, deposit insurance is a subsidy; it is in fact a politically targeted subsidy to favour some borrowers at the expense of system stability, and it is a perverse incentive for the banks to take on more risk. Back to those three pillars, folks - still think there won’t be any [thou shalt not call them ‘systemic’] crises with bail-ins and taxpayers’ hits in the GloriEUs Future?…
Full paper: Calomiris, Charles W. and Jaremski, Matthew, “Deposit Insurance: Theories and Facts” (May 2016, NBER Working Paper No. w22223: http://ssrn.com/abstract=2777311)
21/5/16: Voters selection biases and political outcomes
A recent study based on data from Austria looked at the impact of compulsory voting laws on voter quality.
Based on state and national elections data from 1949-2010, the authors “show that compulsory voting laws with weakly enforced fines increase turnout by roughly 10 percentage points. However, we find no evidence that this change in turnout affected government spending patterns (in levels or composition) or electoral outcomes. Individual-level data on turnout and political preferences suggest these results occur because individuals swayed to vote due to compulsory voting are more likely to be non-partisan, have low interest in politics, and be uninformed.”
In other words, it looks like there is a selection bias being triggered by compulsory voting: lower-quality voters enter the process, but due to their lower quality, these voters do not induce a bias away from the status quo. Whatever the merit of increasing voter turnout via compulsory voting requirements may be, it does not appear to bring about more enlightened choices in policies.
Full study is available here: Hoffman, Mitchell and León, Gianmarco and Lombardi, María, “Compulsory Voting, Turnout, and Government Spending: Evidence from Austria” (May 2016, NBER Working Paper No. w22221: http://ssrn.com/abstract=2777309)
So can you 'vote out' stupidity?..
Saturday, May 21, 2016
20/5/16: Business Owners: Not Great With Counterfactuals
A recent paper, based on a “survey of participants in a large-scale business plan competition experiment, [in Nigeria] in which winners received an average of US$50,000 each, is used to elicit beliefs about what the outcomes would have been in the alternative treatment status.”
So what exactly was done? Business owners were basically asked what would have happened to their business had an alternative business investment process taken place, as opposed to the one that took place under the competition outcome. “Winners in the treatment group are asked subjective expectations questions about what would have happened to their business should they have lost, and non‐winners in the control group asked similar questions about what would have happened should they have won.”
“Ex ante one can think of several possibilities as to the likely accuracy of the counterfactuals”:
- “…business owners are not systematically wrong about the impact of the program, so that the average treatment impact estimated using the counterfactuals should be similar to the experimental treatment effect. One potential reason to think this is that in applying for the competition the business owners had spent four days learning how to develop a business plan… outlining how they would use the grant to develop their business. The control group [competition losers] have therefore all had to previously make projections and plans for business growth based on what would happen if they won, so that we are asking about a counterfactual they have spent time thinking about.”
- ”…behavioral factors lead to systematic biases in how individuals think of these counterfactuals. For example, the treatment group may wish to attribute their success to their own hard work and talent rather than to winning the program, in which case they would underestimate the program effect. Conversely they may fail to take account of the progress they would have made anyway, attributing all their growth to the program and overstating the effect. The control group might want to make themselves feel better about missing out on the program by understating its impact (...not winning does not matter that much). Conversely they may want to make themselves feel better about their current level of business success by overstating the impact of the program (saying to themselves I may be small today, but it is only because I did not win and if I had that grant I would be very successful).”
The actual results show that business owners “do not provide accurate counterfactuals” even in this case, where the competition awards (and thus the intervention, or shock) were very large.
- The authors found that “both the control and treatment groups systematically overestimate how important winning the program would be for firm growth…
- “…the control group thinks they would grow more had they won than the treatment group actually grew”
- “…the treatment group thinks they would grow less had they lost than the control group actually grew”
Or in other words: losers overestimate benefits of winning, winners overestimate the adverse impact from losing... and no one is capable of correctly analysing own counterfactuals.
Full paper is available here: McKenzie, David J., Can Business Owners Form Accurate Counterfactuals? Eliciting Treatment and Control Beliefs About Their Outcomes in the Alternative Treatment Status (May 10, 2016, World Bank Policy Research Working Paper No. 7668: http://ssrn.com/abstract=2779364)
Saturday, December 19, 2015
19/12/15: Another Un-glamour Moment for Economics
Much of the current fascination with behavioural economics is well deserved - the field is a tremendously important merger of psychology and economics, bringing economic research and analysis down to the granular level of human behaviour. However, much of it is also a fad - behavioural economics provides a convenient avenue for advertising companies, digital marketing agencies, digital platform providers and aggregators, as well as congestion-pricing and Gig-Economy firms, to milk revenue-raising strategies that are anchored in common sense. In other words, much of the use of behavioural economics in real business (and in Government) is about conveniently plucking out strategy-confirming results. It is marketing, not analysis.
A lot of this plucking relies on empirically-derived insights from behavioural economics, which, in turn, often rely on experimental evidence. Now, experimental evidence in economics is very often dodgy by design: you can’t compel people to act, so you have to incentivise them; you can’t quite select a representative group, so you assemble a ‘proximate’ group, and so on. Imagine you want to study intervention effects on a group of C-level executives. Good luck getting actual executives to participate in your study and good luck getting selection biases sorted out in analysing the results. Still, experimental economics continues to gain prominence as a backing for behavioural economics. And still, companies and governments spend millions on funding such research.
Now, not all experiments are poorly structured and not all evidence derived from them is dodgy. So to alleviate the nagging suspicion as to how much error is carried in experiments, a recent paper by Alwyn Young of the London School of Economics, titled “Channelling Fisher: Randomization Tests and the Statistical Insignificance of Seemingly Significant Experimental Results” (http://personal.lse.ac.uk/YoungA/ChannellingFisher.pdf), used “randomization statistical inference to test the null hypothesis of no treatment effect in a comprehensive sample of 2003 regressions in 53 experimental papers drawn from the journals of the American Economic Association.”
The attempt is pretty darn good. The study uses robust methodology to test a statistically valid hypothesis: has there been a statistically significant result derived in these studies arising from the experimental treatment or not? The paper tests a large sample of studies published (having gone through peer and editorial reviews) in perhaps the most reputable economics journals. This is the creme-de-la-creme of economics studies.
The findings, to put this scientifically: “Randomization tests reduce the number of regression specifications with statistically significant treatment effects by 30 to 40 percent. An omnibus randomization test of overall experimental significance that incorporates all of the regressions in each paper finds that only 25 to 50 percent of experimental papers, depending upon the significance level and test, are able to reject the null of no treatment effect whatsoever. Bootstrap methods support and confirm these results. “
In other words, in the majority of studies claiming to have achieved statistically significant results from experimental evidence, such results were not in fact statistically significantly attributable to the experiments.
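The randomization (Fisher permutation) inference Young applies can be illustrated with a toy sketch: under the sharp null of no treatment effect, treatment labels are exchangeable, so we reshuffle the labels many times and ask how often a difference in means at least as extreme as the observed one arises by pure chance. This is a generic illustration of the technique, not Young's actual code or data:

```python
import random

def randomization_test(treated, control, n_perm=10000, seed=42):
    """Fisher randomization test of the sharp null of no treatment
    effect: permute treatment labels, recompute the difference in
    means, and report the share of permutations at least as extreme
    as the observed difference (a two-sided p-value)."""
    rng = random.Random(seed)
    n_t = len(treated)
    observed = sum(treated) / n_t - sum(control) / len(control)
    pooled = list(treated) + list(control)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = sum(pooled[:n_t]) / n_t - sum(pooled[n_t:]) / (len(pooled) - n_t)
        if abs(diff) >= abs(observed):
            extreme += 1
    return extreme / n_perm

# Toy data with no genuine treatment effect: the permutation
# p-value should come out large, i.e. no significance to report.
treated = [1.2, 0.8, 1.1, 0.9, 1.0, 1.3]
control = [1.0, 1.1, 0.9, 1.2, 0.8, 1.0]
p_value = randomization_test(treated, control)
```

Unlike a conventional t-test, the p-value here rests only on the random assignment itself, which is the sense in which randomization inference avoids the distributional assumptions that, per Young, inflate significance in standard regression-based analyses.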
Now, the author is cautious in his conclusions. “Notwithstanding its results, this paper confirms the value of randomized experiments. The methods used by authors of experimental papers are standard in the profession and present throughout its journals. Randomized statistical inference provides a solution to the problems and biases identified in this paper. While, to date, it rarely appears in experimental papers, which generally rely upon traditional econometric methods, it can easily be incorporated into their analysis. Thus, randomized experiments can solve both the problem of identification and the problem of accurate statistical inference, making them doubly reliable as an investigative tool. “
But this is hogwash. The results of the study effectively tell us that a large (huge) proportion of papers on experimental economics published in the most reputable journals have claimed significant results attributable to experiments where no such significance really was present. Worse, the methods that delivered these false significance results “are standard in the profession”.
Now, consider the even more obvious: these are academic papers, written by highly skilled (in econometrics, data collection and experiment design) authors. Imagine what drivel passes for experimental analysis coming out of marketing and surveying companies? Imagine what passes for policy analysis coming out of public sector outfits? Without peer reviews and without cross-checks like those performed by Young?
Sunday, October 4, 2015
4/10/15: Data is not the end of it all, it’s just one tool...
Recently, I spoke at a very interesting Predict conference, covering the issues of philosophy and macro-implications of data analytics in our economy and society. I posted slides from my presentations earlier here.
Here is a quick interview recorded by the Silicon Republic covering some of the themes discussed at the conference: https://www.siliconrepublic.com/video/data-is-not-the-end-of-it-all-its-just-one-tool-dr-constantin-gurdgiev.
Thursday, September 17, 2015
17/9/15: Predict Conference: Data Analytics in the Age of Higher Complexity
This week I spoke at the Predict Conference on the future of data analytics and predictive models. Here are my slides from the presentation:
Key takeaways:
- Analytics are being shaped by dramatic changes in demand (consumer side of data supply), changing environment of macroeconomic and microeconomic uncertainty (risks complexity and dynamics); and technological innovation (on supply side - via enablement that new technology delivers to the field of analytics, especially in qualitative and small data areas, on demand side - via increased speed and uncertainty that new technologies generate)
- On the demand side: consumer behaviour is complex and understanding even the 'simpler truths' requires more than simple data insight; consumer demand is now being shaped by the growing gap between consumer typologies and the behavioural environment;
- On the micro uncertainty side, consumers and other economic agents are operating in an environment of exponentially increasing volatility, including income uncertainty, timing variability (lumpiness) of income streams and decisions, a highly uncertain environment concerning life cycle incomes and wealth, etc. This implies the growing importance of non-Gaussian distributions in statistical analysis of consumer behaviour, and, simultaneously, an increasing need for qualitative and small data analytics.
- On the macro uncertainty side, interactions between domestic financial, fiscal, economic and monetary systems are growing more complex, and systems interdependencies imply growing fragility. Beyond this, international systems are now tightly connected to domestic systems, and the generation and propagation of systemic shocks is no longer contained within national / regional or even super-regional borders. Macro uncertainty is now directly influencing micro uncertainty and is shaping consumer behaviour in the long run.
- Technology, that is traditionally viewed as the enabler of robust systems responses to risks and uncertainty is now acting to generate greater uncertainty and increase shocks propagation through economic systems (speed and complexity).
- The majority of mechanisms for crisis resolution deployed in recent years have contributed to increasing systems fragility by enhancing the over-confidence bias through excessive reliance on systems consolidation, centralisation and technocratic responses that decrease the systems distribution necessary to address the 'unknown unknowns' nature of systemic uncertainty. Excessive reliance, within business analytics (and policy formation), on Big Data is reducing our visibility of smaller risks and creates a false perception of safety in centralised regulatory and supervisory regimes.
- Instead, fragility-reducing solutions require greater reliance on highly distributed and dispersed systems of regulation, linked to strong supervision, to simultaneously allow greater rate of risk / response discovery and control the downside of such discovery processes. Big Data has to be complemented by more robust and extensive deployment of the 'craft' of small data analytics and interpretation. Small events and low amplitude signals cannot be ignored in the world of highly interconnected systems.
- Overall, predictive data analytics will have to evolve toward enabling a shift in our behavioural systems from simple nudging toward behavioural enablement (via automation of routine decisions: e.g. compliance with medical procedures) and behavioural activation (actively responsive behavioural systems that help modify human responses).
Friday, October 24, 2014
24/10/2014: Behavioural Political Economy
A very interesting survey paper on the topic of behavioural drivers of political economy by Schnellenbach, Jan and Schubert, Christian, titled "Behavioral Political Economy: A Survey" (September 30, 2014. CESifo Working Paper Series No. 4988. http://ssrn.com/abstract=2507739).
From the abstract:
"Explaining individual behavior in politics should rely on the same motivational assumptions as explaining behavior in the market: That’s what Political Economy, understood as the application of economics to the study of political processes, is all about." So far - fine.
However, there is a problem: "In its standard variant, those who played the game of politics should also be considered rational and self-interested, unlike the benevolent despot of earlier models. History repeats itself with the rise of behavioral economics: Assuming cognitive biases to be present in the market, but not in politics, behavioral economists often call for government to intervene in a “benevolent” way. Recently, however, political economists have started to apply behavioral economics insights to the study of political processes, thereby re-establishing a unified methodology. This paper surveys the current state of the emerging field of “Behavioral Political Economy” and considers the scope for further research."
It is a lengthy and solid review, covering some 41 pages. Dense. But absolutely a great read as an introduction into the subject.
Thursday, April 3, 2014
3/4/2014: Reforming Economics? Try Politics First...
This is an unedited version of my article for Village Magazine, February 2014
The Global Financial Crisis and the Great Recession are actively reshaping the public discourse about the ways in which we analyse social phenomena, and how our analysis is shaping public policies choices.
In many ways, these changes in our attitudes to social inquiry have been positive. For example, a more critical re-appraisal of the rational expectations-based models in macroeconomics and finance has enriched the traditional policy analysts' toolkits and advanced our understanding of choices made by various economic agents and governments. A shift in econometric tools away from those based on restrictive assumptions concerning underlying probability distributions and toward new methods based on more direct integration of the actual data properties is also underway. The result is improved analytical abilities and a more streamlined translation of data insights into policy frameworks. The launching of public debates about how we teach economics in schools and universities and how economic parameters reflect social and cultural values (as evidenced by the ongoing debate at the OECD and other institutions about introducing measures of quality of life and social well-being into economic policy toolkits) are yet more examples of longer-term positive change. Absent such discussions, the entire discipline of social sciences risks sliding into complacency and statism.
However, in many areas, changes in our approaches to social studies have been superficial at best, and occasionally regressive. And these changes are not limited to economics alone, spanning instead the entire range of social sciences and related disciplines.
For the sake of brevity, let me focus on some comparatives between economic analysis and one other field of social policies formation: environmental policy. The same arguments, however, hold in the case of other social policy disciplines.
Prior to the crisis, environmental sciences largely existed in the world of mathematical modeling, with core forecasts of emissions paths and their effects on the environment relying on virtually zero behavioural inputs. These technocratic models influenced both public opinion and policies. The proverbial representative agent responsible for the production of emissions was not a human being requiring age-, gender-, family-, income- and otherwise-differentiated supplies of energy, goods and services. In a way, therefore, environmental policy was further removed from the realities of human and social behaviour than, say, finance, monetary or macro economics. Where economists are acutely aware of the above differences as drivers of demand, supply and valuations of various goods and services, environmental policy analysts are focused on purely aggregate targets at the expense of realism and social and economic awareness.
The same remains true today. Over recent years, the thrust of environmental policies has drifted away from local considerations of the impact of pollution on quality of life and the economic environment. As a result, environmental policies and programmes, such as, for example, wind energy development or localised incineration of waste, are becoming more orthogonal, if not outright antagonistic, to the interests of consumers. The rhetoric surrounding these environmental policy considerations is also becoming more detached from the demos. For example, Ireland's attempt to make a play at European wind energy generation markets, replete with massive wind farms and miles of pylons, is now pitting our imagined (or mathematically-derived) export potential, fuelled by nothing more than massive subsidies and consumer rip-off pricing for electricity, against all those interested in preserving the countryside's natural amenities, cultural heritage and other economically and socially meaningful resources.
While behaviourally rich analysis is now moving into the mainstream in finance and is starting to show up in macroeconomic models, it is still wanting in environmental policy research. The result is a distortion of public responses and a reshaping of the political landscape around environmental movements.
In the most basic terms, there are three core problems with the current state of social sciences and policy formation mechanisms. None of these problems are new to the post-crisis world or unique to economics. In fact, in many cases economics as a discipline of inquiry is years ahead of other social sciences in dealing with these shortfalls. In summary, the core problems are: insufficient modelling tools, poor data, and politically captive analytics and decision-making.
The first problem is the lack of rigorous modelling tools capable of handling behavioural anomalies. Put differently, we know that people often make non-rational choices, and we occasionally know how to represent these choices using mathematical models. But we are far from being able to incorporate these individual choice models into macro-level models of aggregate behaviour. For example, we know that individuals often frame their choices in the broader context of their own and collective past experiences, even when such framing leads to undesirable or suboptimal outcomes. Yet we have few means of reflecting this reality in economic models, although we are getting better at capturing it empirically. We can model habitual and reference-dependent behaviour of individual agents, and we can even extend these models to a macroeconomic setting, but we have trouble incorporating this behaviour into explicit policy analysis. We also face mathematical constraints on our ability to handle the more advanced and more accurate model extensions.
The problem of insufficient tools is often compounded by the over-reliance on technocratic analysis that marks our policy formation. Put simply, we live in a world dominated by policymaking that targets aggregate performance metrics (such as global emissions levels or nation-wide GDP growth rates). This implies that we often aim to create policies expected to deliver specific and homogeneous outcomes across a number of vastly heterogeneous geographies: physical, cultural, political, social and economic systems, nations and societies. The only feasible approach to such policymaking is via technocratic reliance on 'hard' targets, often with little immediate connection to everyday life, and prescriptive policy designs. The core pitfall of this approach is that when a harmonised policy fails, it fails across all heterogeneous locations and environments. There is nothing more erroneous from a risk management perspective than attempting to introduce a harmonised response to such systemic failures. Yet this is exactly what policymakers strove to achieve in the setting of the euro area crisis. The more reliance we place on technical, model-driven solutions being right all of the time in all locations, and the more harmonised and coordinated our responses to shocks are, the higher the probability that a policy failure will be systemic rather than localised.
The only alternative to this fallacy of reliance on technical analysis and hard-targets-based modelling is to permit local innovation and differentiation. This historically validated approach, however, is not en vogue in a world where global institutions and aspirations dominate local objectives and systems, and where a pseudo-scientific fetish for technical knowledge dominates social sciences and policymaking.
Beyond the technocratic fallacy of over-reliance on mathematical models and the shortage of some key tools looms an even larger problem.
Consider the most recent example: the systemic failure of the economics profession to predict the current financial crisis. Rather than a shortfall of tools, this failure rests with the capture of analysis and policy by political and economic interest groups that first determine the agenda for policy analysis and research, then define the parameters and scope of such research and, finally, set the bounds for measuring, monitoring and acting on data about policy outcomes.
With the onset of the financial crisis, economists working outside regulatory offices, ministries and central banks have gained much greater access to data than ever before. Still, even with data in the public domain, analytical resources come at a cost premium, as anyone attempting to compete with, say, the Department of Finance finds out very quickly. In the time it takes an independent analyst to compile and analyse data, the Department of Finance can deploy dozens of staff to flood the media and public domain with its own reports and papers. The asymmetry of resources drives an asymmetry of power in analysis, and this fuels an asymmetry in policymakers' perception of data insights. For example, lone voices of dissent or single pieces of contrarian analysis are pushed aside by the sheer magnitude of consensus, often representing little more than one agency replicating the insights of another.
We might be able to produce better insights into the workings and risks of the banking sector today than before the crisis, but this does not mean that the actions of regulators and Governments are going to be any better informed or better tailored.
Even when independent analysis and scrutiny are available, regulatory and policy responses largely ignore empirical insights. In a recent study, a co-author and I looked at asset prices across a number of advanced economies prior to and after the crisis. Using a very simple econometric model, we showed that data prior to 2006 was providing clear and loud signals of the emergence of a number of crisis-level risks. However, to derive this result we had to calibrate the model using a parameter set at ten times the levels assumed to be likely by banking regulators. Thus, by regulation, and by their own governance and remuneration standards, our public servants simply were not required to do this analysis. As a result, regulators around the world sleepwalked the entire financial system into the latest crisis and found themselves utterly unprepared for the fallout.
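The study's model itself is not described here, so purely as a hypothetical sketch of the kind of crude early-warning rule the passage alludes to, one could flag periods where asset price growth deviates sharply from its recent history, with the sensitivity parameter `k` standing in for the calibration choice regulators were never required to make. Every name and number below is an illustrative assumption, not the study's actual specification:

```python
import math
import statistics

def crisis_signal(prices, window=8, k=2.0):
    """Flag periods where log price growth deviates from its recent mean
    by more than k standard deviations: a crude early-warning rule."""
    growth = [math.log(b / a) for a, b in zip(prices, prices[1:])]
    flags = []
    for t in range(window, len(growth)):
        hist = growth[t - window:t]
        mu = statistics.fmean(hist)
        sigma = statistics.pstdev(hist)
        # Guard against a zero-variance history with a tiny floor on sigma.
        flags.append(abs(growth[t] - mu) > k * max(sigma, 1e-9))
    return flags

# Synthetic series: steady 1% growth, then a bubble-like 8% acceleration.
prices = [100 * 1.01 ** t for t in range(20)] + [125 * 1.08 ** t for t in range(5)]
signals = crisis_signal(prices)
```

On this synthetic series, the rule stays quiet through the steady-growth phase and fires once the growth regime breaks; the point of the passage is that whether it fires at all depends entirely on how `k` is calibrated.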
Our study's conclusions are not unique. Back in 2005-2006, inside the Irish civil service, there were several senior voices raising concerns over the direction of our economy. These were echoed by a number of research papers and analysts' warnings coming from the ranks of independent and academic economists. They were ignored not because they lacked an empirical basis, but because policymakers were captive to a consensus view aligned with their own political objectives.
Nobel prize winners Robert Shiller (2013) and Edmund Phelps (2006), and economists such as Nouriel Roubini, Roman Frydman and Michael D. Goldberg, repeatedly warned about systemic problems in the US property and financial markets back in 2004-2007. The NYU Stern School of Business research centre did the same for the banking sector. Last, but not least, in academic economics, research into non-rational, non-representative-agent models has been ongoing since the start of the 1990s, largely unknown to the general public and politicians. In fact, since the mid-1990s, the majority of the Nobel Memorial Prize awards in economics have gone to researchers who pushed out the bounds of rational expectations and/or representative agent frameworks.
Still, the problem of policy capture by the often poorly informed adherents to specific schools of thought is hardly unique to economics. Let's take two examples of policies that have seized public imagination and policymakers' attention, while sporting only tenuous empirical foundations.
One is wind and wave energy. Although there appears to be a near-consensus in academic and policy circles that these two sources of energy offer preferred alternatives to traditional fuels, in reality such consensus can and should be questioned. The latent energy stored in water and wind is huge. However, wind energy harvesting is subject to its own externalities. One is the transfer of the cost of pollution abatement from the commons relating to energy production and use to the commons relating to land and natural amenities use. This externality was already mentioned above, and credit for its discovery goes to economics, not to environmental sciences. Another is the transfer of the cost of energy-related pollution to consumers. In the real world, different consumers access energy through different channels. Some channels offer energy users a subsidy over others. Some channels come with a choice the consumer can make to substitute between different service providers based on environmental and economic cost considerations; other channels do not. Again, credit for pointing this out goes to economists; environmentalists all too often simply opt to ignore these realities in pursuit of aggregate emissions targets, over and above any consideration of their feasibility or their effectiveness in the face of social, cultural, political and economic realities.
For example, state-owned public transport is commonly priced differently from privately-owned public transport, and both are priced distinctly from private transport. Unless the use of energy is explicitly and uniformly priced across all modes of transport, and unless all modes of transport are perfectly substitutable, some consumers of public transport will receive subsidies at the expense of others, and the majority will be subsidised relative to private transport users. Thus, a suburban family is likely to pay a higher price for pollution per mile travelled than an urban one. The fact that in many cases a suburban family might have been forced (by planning, zoning, pricing and other systems operating in heavily distorted markets) to make the choice of living outside areas densely covered by transport alternatives does not enter into the determination of pollution-linked taxes and prices. Any decent economist can be expected to understand this much. Yet the simplified worldview that public transport subsidies and private transport taxes are always good persists among our policymakers and within the environmental lobby.
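To make the per-mile asymmetry concrete, here is a toy calculation. All figures are hypothetical (none are actual Irish charges); the point is only to show how uneven pricing channels translate into different effective pollution charges per mile travelled:

```python
# All figures are hypothetical, chosen only to illustrate the asymmetry.
fuel_tax_per_litre = 0.60    # carbon/excise component of the pump price, EUR
litres_per_mile = 0.08       # private car fuel consumption
implicit_fare_charge = 0.02  # pollution charge embedded in a subsidised fare, EUR/mile

# Car-dependent suburban commuter versus transit-served urban commuter.
suburban_cost = fuel_tax_per_litre * litres_per_mile
urban_cost = implicit_fare_charge

print(f"suburban: {suburban_cost:.3f} EUR/mile, urban: {urban_cost:.3f} EUR/mile")
```

Under these (invented) rates, the car-dependent household pays more than twice the effective pollution charge per mile, even though its location choice may itself have been a product of distorted planning and pricing.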
Another example of a policy that is empirically shoddy, yet heavily supported politically, is the electrification of transport. Recent research shows that in the US, even if electric vehicles made up over 40 percent of passenger vehicles, there would be little or no reduction in the emission of key air pollutants. Now consider the case of Ireland, where the ESB has been running a multi-billion euro investment programme aimed at developing EV networks since the early days of the financial crisis. Just as the value of private sector investment shot through the roof, the Irish semi-state sector, encouraged by policymakers and subsidised by high prices imposed on consumers, launched into a major investment programme based on questionable benefits to the economy and society at large. The Government of the day even announced back in April 2010 (with the country rapidly hurtling toward an IMF-led bailout) a EUR5,000 grant to EV buyers. That Ireland's electricity supply comes from environmentally damaging sources does not faze the environmental policy advocates.
The debates about the current state of economics and the social sciences in general are a welcome departure from the pre-crisis status quo, where such discussions took place primarily in the marbled halls of academia, beyond the scrutiny of public attention. However, it is worth remembering that the core problems faced by social policy analysts today are the age-old problems of insufficient modelling tools, poor data, and politically captive analytics and decision-making. We might be able, with time and effort, to fix the first one. Fixing the other two will require a paradigm shift in the ways we collect and publish data, and in the ways our political and public service elites approach policy formation. Two thirds of the problems in economics and the social sciences are political, not scientific.
Thursday, January 2, 2014
2/1/2014: Markets, Invisible Hand & Social Ethics
A fascinating paper on the role that markets may play in influencing our social values and shaping our social ethics. Bartling, Björn and Weber, Roberto A., "Do Markets Erode Social Responsibility?" (November 29, 2013, CESifo Working Paper Series No. 4491, available at http://ssrn.com/abstract=2363888) "studies the stability of socially responsible behavior in markets".
The authors develop a laboratory experimental approach "in which low-cost production creates a negative externality for third parties"; in other words, the choice of low-cost production is associated with the imposition of a cost on a third party. However, the experimental mechanism also includes an alternative production technology "with higher costs" to the main parties, but one which "entirely mitigates the externality". So, in other words, the contracting parties have a choice:
1) Opt for a lower-cost technology at a cost to a third party, or
2) Opt for a higher-cost technology that imposes no additional cost on a third party.
Obviously, choice of (2) implies stronger social values, whilst choice of (1) implies rational optimisation in a normal market setting.
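The tension between the two choices can be sketched as a simple payoff calculation. The numbers below are hypothetical, not the paper's actual parameters: the low-cost product maximises the contracting parties' joint surplus, while the "fair" product maximises total welfare once the third party is counted.

```python
# Stylised payoffs with hypothetical numbers: the buyer values
# the product at 14 regardless of how it was produced.
def payoffs(price, cost, value, externality):
    seller = price - cost
    buyer = value - price
    total = seller + buyer - externality  # welfare including the third party
    return seller, buyer, total

low_cost = payoffs(price=10, cost=4, value=14, externality=6)
fair = payoffs(price=12, cost=7, value=14, externality=0)
# Contracting parties jointly earn more from the low-cost product
# (6 + 4 = 10 versus 5 + 2 = 7), but total welfare is higher for
# the fair product (7 versus 4) once the externality is counted.
```

This is exactly the wedge the experiment probes: whether market participants will pay the price premium that closes the gap between private surplus and social welfare.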
"Our data reveal a robust and persistent preference for avoiding negative social impact in the market, reflected both in the composition of product types and in a price premium for socially responsible products. Socially responsible behavior in the market is generally robust to varying market characteristics, such as increased seller competition and limited consumer information. Fair behavior in the market is slightly lower than that measured in comparable individual decisions."
This is really, really interesting. More specifically, the core findings are:
"In our market, competition does drive down overall prices"; in other words, it delivers an economically efficient outcome, "thus yielding greater relative surplus for consumers at the expense of firms".
However, "there is no detrimental effect of increased competition on the degree of concern exhibited toward externality-bearing parties outside of the market. In fact, the market share of products that yield no externality increases slightly under increased firm competition, relative to our market baseline, as does the price premium for the socially responsible product. Thus, instead of decreasing the expression of social responsibility, increased market competition in this case seems to have, if anything, the opposite effect."
The puzzling bit is why this outcome arises in a setting where the parties know they face a higher cost by accepting the need for concern for third parties. "One possible interpretation for this finding is that, as competition yields increased surplus for consumers, they become more willing to bear the costs associated with mitigating the externality for third parties." In a sense, greater efficiency funds 'purchases' or 'consumption' of social justice.
And what about giving parties more information in an attempt to steer their decisions in the (presumably socially) desired direction?
"…we consider the possibility that consumers may have limited information about the degree of externality produced by available products, but have the ability to learn about such product characteristics. This reflects the fact that many consumers do not know which firms’ products are, for example, environmentally or socially harmful, but that such information is often available if a consumer chooses to acquire it. We study both a case in which the information is free to consumers and one in which acquiring it involves the consumer incurring a small cost. In both cases, we find that the need for consumers to actively acquire product information regarding social impact has only a small effect—though slightly larger when acquiring information is costly—on the expression of social responsibility in the market."
The invisible hand of the market, it seems, is rather kind to ethical concerns, once markets reach at least some prior level of efficiency...
Thursday, November 7, 2013
7/11/2013: Taxation and Human Capital: Blundell's Thoughts
A recent paper from Richard Blundell, titled “Taxation of Earnings: the impact on labor supply and human capital” (Becker Friedman Institute, 27th September 2013, available at: http://bfi.uchicago.edu/sites/default/files/research/Blundell_BFI%20_September_27_2013.pdf) argues that the tax system can be reformed “to generate the levels of revenue required to fund public goods while reducing the overall level of distortions implicit in the system”.
“The discussion in this paper draws on the work in the [Mirrlees Review (2011)] and concerns the taxation of labour earnings as well as relevant aspects of the welfare benefit and tax credit systems.” The core focus here is “on the empirical foundations for tax reform” in favour of “placing the analysis of earnings taxation in a lifetime setting, recognising the importance of human capital investments.”
Summary
Per Blundell, earnings taxation:
1) Raises revenue for public goods
2) Acts as the main source for funding redistribution of “resources from richer to poorer households”
3) “… From a more dynamic perspective, it ‘insures’ individuals and families against adverse events such as job loss and disability.”
“Not surprisingly, it occupies a special place in debates about levels and structure of taxation.”
Several other important aspects that Blundell fails to consider are:
- Earnings taxation represents an opportunity cost of public goods provision in terms of reduced availability of funding for investment in enterprise creation and entrepreneurship; and
- Earnings taxation levies a charge on that part of personal income that is linked directly to individual effort and investments in human capital.
“One central question in the policy debate on earnings tax reform is whether, and to what degree, ‘supply side’ reforms can be used to relieve the pressure from ageing populations.” Thus, the question is: “How best to increase employment and earnings over the working life?” Per Blundell, evidence suggests that “the key to using tax policy for improving the trends in employment, hours and earnings in the longer-run will be to focus on”:
1) labor market entry (“Enhancing the flow into work for those leaving education and for returning mothers after childbirth”)
2) retirement (“maintaining work among those in their late 50s and 60s”) and
3) human capital (“Understanding the implicit incentives (or disincentives) created in the tax and welfare system for human capital investments .... Encouraging human capital improves the pay-off to work and ensures earnings grow, and hold up longer, throughout the working life.”)
Tax reforms accounting for human behavior
Key here is that “Reform of the tax system as it impacts on labor supply and human capital is not simply about increasing life-time earnings”. In addition to levels of earnings consideration, we must also account for “many other aspects of human welfare, including the utility from consuming goods, from home production, from reducing risks, etc.”
Thus, taxes on earnings “should be seen as part of the whole ‘tax system’. In terms of an overall reform package, it is important to view corporate and personal taxation together as there are many aspects where they overlap: not every tax needs to be progressive for the tax system to be progressive; not every tax needs to be ‘green’ for the tax system to provide the right incentives for environmental protection.” In other words, “we still need to be aware of the interactions with capital, savings and environmental taxes.”
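The point that not every tax needs to be progressive for the system as a whole to be progressive can be shown with a toy calculation, using purely hypothetical rates: a flat levy combined with an allowance-based income tax still yields a rising average rate.

```python
# Hypothetical rates: a 10% flat levy on all income plus a 40%
# income tax charged only above a tax-free allowance.
def average_total_rate(income, flat_rate=0.10, allowance=15_000, marginal=0.40):
    income_tax = max(0, income - allowance) * marginal
    flat_levy = income * flat_rate  # regressive-looking on its own
    return (income_tax + flat_levy) / income

low_earner = average_total_rate(20_000)   # (2000 + 2000) / 20000 = 0.20
high_earner = average_total_rate(80_000)  # (26000 + 8000) / 80000 = 0.425
```

The flat levy taken alone is not progressive, yet the combined average rate rises with income, which is exactly the "system, not each tax" argument quoted above.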
All of the above suggests that the Irish Government approach to tax policy, based on the explicitly defined premise that no matter what, the corporate tax system rests outside the scope of any tax reforms consideration, is not and cannot ever be a good practice.
Complexity avoidance is real
Another major point raised by Blundell is that “In most developed economies, the schedule of tax rates on earned income is rather complex. This may not always be apparent from the income tax schedule itself, but note that what really matters is the total amount of earnings taken in tax and withdrawn benefits—the effective tax rate. The schedule of effective tax rates is made complicated by the many interactions between income taxes, earnings-related social security contributions by employers, welfare benefits, and tax credits.” In other words, Blundell clearly states that the total burden, whether via direct or indirect taxes, matters. This is something the Irish Government simply refuses to recognize.
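Blundell's effective-rate point can be made concrete with a small calculation: a modest headline income tax rate combined with a steep benefit withdrawal taper produces a much higher effective marginal tax rate. The rates below are hypothetical, for illustration only:

```python
def effective_marginal_rate(extra_earnings, income_tax_rate, benefit_taper):
    """Share of an extra euro of earnings lost to tax plus withdrawn benefits."""
    tax = extra_earnings * income_tax_rate
    withdrawn = extra_earnings * benefit_taper
    return (tax + withdrawn) / extra_earnings

# A 20% headline rate plus a 60% benefit withdrawal taper:
# the worker keeps only 20 cents of each extra euro earned.
emtr = effective_marginal_rate(100, income_tax_rate=0.20, benefit_taper=0.60)
```

This is why, as Blundell stresses, the published income tax schedule alone says little about the incentives a low-earning household actually faces.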
However, in criticism of Blundell, I would also add that it is too simplistic to look at the effective macro-level (economy-wide, or average/median) rate of taxation. We have to recognize that many benefits paid out in the economy do not apply or are not available to all participants. Thus, for example, farmers' transfers are not available to non-farmers, youth support schemes relating to training and education are not available to older adults, unemployment benefits are not accessible to entrepreneurs, and so on.
Taxes and labour supply
“At a very high level, some of the main points that emerge from this evidence are that substitution effects are generally larger than income effects: taxes reduce labour supply. Especially for low earners, responses are larger at the extensive margin—employment—than at the intensive margin—hours of work. Responses at both the intensive and extensive margins (and both substitution effects and income effects) are largest for women with school-age children and for those aged over 55.”
There is much, much more in Blundell's insights, so do not for a second think the above summary is a substitute for reading the whole paper.