Sex, Genes, Politics and Company Law:
Can Capitalist Democracy coexist with Human Survival?
Chris King Aug 2013 - Feb 2020 Genotype 1.1.38
Fig 1: Mortal combat between the Bear and Bull stock market overlaid by the triple witching hour instability 2011.
Fireworks over Lady Liberty, Ellis Island, NY.
The twin pillars of Western civilization are capitalism and democracy, but are these consistent with surviving on a living planet? Are they beneficent foundations of individual freedom and prosperity, or are they malign forces doomed to boom and bust instabilities that will carry us past tipping points into a hard landing for human survival? Could they, in the face of natural abundance, bring about economic collapse in a tragedy of the commons due to environmental disruption caused by the human impacts they have set in motion?
While capitalism’s money-driven investment economy has a controversial reputation - a Jekyll and Hyde character whose shade of black or white depends on one's vantage point in the ever-smouldering left-right political divide - democracy is almost universally anointed a saintly role in the preservation of civil liberties and freedom of choice of the people – the bedrock of Western civilization.
But is this a little naïve? As Winston Churchill once famously said: "Democracy is the worst form of government, except for all those other forms that have been tried from time to time." Of course he said this ironically, after being defeated in a rebound election, having won the Second World War, but his point is not just about the electoral vagaries of democracy for aspiring leaders, but the intrinsic paradoxes of social government. George Bernard Shaw highlighted the skeleton in the democratic spin closet we have to be astute enough to see through: "Democracy is a device that ensures we shall be governed no better than we deserve."
This article investigates whether electoral democracy and corporate activity, lacking any real genetic stability and prone to winner-take-all exploitation, particularly of natural resources; able to change from a shark to a tiger by a simple act of recapitalization; and often short-lived in a predatory merry-go-round of relentless takeover and corporate cannibalism; driven only by the profit imperative and the vagaries of the free market, can provide any basis for long-term economic and ecological sustainability. We shall also examine how both capitalism and democracy are manifestly social products of the male gender to the exclusion of the immortal sex men live their lives to fertilize, unearth the inevitable Machiavellian strategies of deceit that coexist in any climax society and seek the keys to an ecological completion of the economic quest for a life of natural abundance.
We need first to take a step back and expose the sexual underpinnings of this entire process. In "Sexual Paradox: Complementarity, Reproductive Conflict and Human Emergence", we articulate the thesis that the emergence of human culture, super-intelligence and social complexity has come about through an irresolvable red-queen race of sexual selection, in which neither sex has had the upper hand, leading to strategic paradox and the runaway selection of genes favouring both male genius and female social and sexual astuteness, enhanced by mammalian XY sexual chromosome genetics.
Fig 2: Human sexual dimorphism at the cellular and organismic levels.
Central to this idea is the primary role of female reproductive choice, rather than the 'Flintstones' - man the hunter - view of male chauvinist warriors clubbing prospective partners, or abducting them, as still occurs commonly in some Amazonian warrior cultures such as the Yanomamo, and in Central Asian countries such as Kyrgyzstan. Intriguingly, both the evolution of humanity through sexual selection and the dominant role of women in choosing mating partners were recognized by Charles Darwin in his second work ‘The Descent of Man, and Selection in Relation to Sex’, written after ‘On the Origin of Species’.
Although humans have strong pair-bonding partnerships associated with the long child-rearing times of humans compared with other primates, female reproductive choice remains paramount among mammals because females bear live young and engage in lactation, giving them a primary parenting reproductive investment while males make a primarily sexual fertilizing investment. The evolution of female orgasm, concealed ovulation counterpointed by menstruation, lunar menstrual synchrony, perpetual sexual receptiveness associated with coitus as a social means of family bonding, and the loss of the penis bone and penile spines in favour of a large tumescent penis providing a genuine indicator of genetic fitness in men, all point to a line of human evolution in which female reproductive choice has had a central role - a role in driving human language and cultural emergence, in which women gatherers, chatting together in the field about their relationships, provided the central bulk of the subsistence diet, while male hunters' meat from the kill, gained in silent vigil, was a dietary supplement traded for sexual favours.
Fig 3: Clockwise from top left: Nisa, Deep trench in mitochondrial DNA extending to 140,000 years (Behar et al), Menarche rite (Fulton’s Cave), actual eland dance for the menarche, cave with 70,000 year old paintings (Tsolido Hills) evidence of cosmetics and shell jewelry (Blombos Cave 75,000).
The reproductive investments of the two human sexes are diametrically opposed, with males investing primarily in sexual fertilization by many means, from cut and run through faithful husbanding to the alpha male harems of Udayama and Genghis Khan. Women on the other hand have a major investment in parenting and have to spread their investment over the relatively few children they can give birth to. Only three percent of mammals are monogamous because of the major polarization that internal fertilization, live birth and lactation precipitate, and humans stand at an extreme among mammals because of the massive nature of human pregnancy, the increased risk to the mother due to the large human head, and the long period of vulnerability a young lactating mother faces protecting her family.
Thus women's choices have been driven towards resource-bearing men who are also intelligent providers, gleaned through the social filters of good hunting, musical and artistic ability, and good jokes and storytelling around the fires during long discussions in the night about the affairs of the human grapevine, while occasionally outsiring on the sly to a stud with desirable genes, as an insurance against putting all her eggs in one man's basket - posing the mortal threat of paternity uncertainty that all men face but no woman does.
Humans, like several ape species, are commonly female-exogamous, with females moving to live with their male partner’s kin. Most cultures have had patrilineal kinship, rather than the matrilineal patterns of temporary sexual partnerships, or 'walking marriages', with uncles helping rear their sisters' offspring. Nevertheless, the social traditions of founding human cultures, such as the bushmen, show a pattern respecting a young woman's first pregnancy and delivery being with the maternal family, honouring the power of menarche as a sacred rite of passage upon which the fertility of the people depends, rather than regarding women as ‘unclean’, and allowing a degree of female choice about partnerships surprisingly similar to the more recent achievements of modern Western cultures after centuries of male dominance, as illustrated in Marjorie Shostak's "Nisa".
As the gatherer-hunter way of life gave way to a combination of agriculture, invented by women gatherers, and animal husbandry and herding discovered by the men, the rise of great urban cultures was accompanied by a massive transfer of power over reproductive choice to male dominant coalitions. We can see in Sumeria the delicate association of planter Queen and shepherd King founding one of the greatest cultural flowerings in human history, ultimately giving way to male dominance, in favour of the trinity of male gods An, Enlil and Enki masturbating into the primal waters.
Fig 5: Marduk slaying Tiamat the primal chaos Goddess illustrates the rise of patriarchal dominance
over the old order of the planter Queen.
We can see an even more tortured version of this transition in Genesis, where Jacob rejects the matrilineal pattern of Laban and after seven years escapes with his flocks as the arch founding patriarch along with his two wives, one hiding the house gods, or teraphim, of the maternal family ominously under her menstrual skirts. This, combined with the avowedly male invocation to "go forth and multiply", celebrated in the ritual circumcision of the male member as a token of fertility, confirms the patriarchal relationship with God, consecrated in the sin arising from Eve's cavorting with the serpent, dooming women to be in bondage to their husbands as men were in subservience to God.
This transition is sealed in a dire warning in Judges that matrilineal patterns were giving way to a staunch patriliny. The concubine of Bethlehem-Judah is accused of 'whoring' by going back to live with her father for four months. When the Levite returns to claim her, the father-in-law keeps saying to stay a little longer, for six days, nigh on a week. When the couple leave and turn in at Gibeah of the Benjaminites, men of Belial ask to "know the man within". In an attempt to avoid sodomy, the host offers his daughter, which they refuse. The Levite then offers his concubine. She is raped and abused all night and dies on the doorstep, while her master sleeps peacefully. He then cuts her in twelve pieces and sends them to all the coasts of Israel, setting off the Benjaminite wars. These are finally resolved by moving four hundred virgins of Jabesh-Gilead to their husbands' homes, capped by the abduction of the daughters of Shiloh, dancing at a festival, to satisfy the remaining Benjaminite men.
Invocations against female reproductive choice are likewise enforced by dire rulings on stoning for adultery, or for losing the tokens of virginity and not crying out loud enough for someone to hear, steeped everywhere in the themes of Israel guilty of whoring against Jehovah by worshipping Asherah the goddess of natural fertility represented by a tree in the Temple and the other gods and goddesses of the Nations, on every high hill and under every green tree. Note however that Jewish inheritance still unilaterally comes through the mother, in acknowledgement of the incontrovertible truth that the child of a Jewish woman is a Jew while the child of a Jewish man could be a pretentious bastard.
We see this pattern confirmed across a swathe of urban cultures, from Assyria through Persia, Greece and Rome, also manifest in all great religions, from Judaism, through Christianity and Islam to Vaishnavite Hinduism, with occasional appearances of feminine deities, from Inanna through Cybele to Kali. The essential feature is the rise of male military coalitions, generally headed by a tribal warlord, given mythical or divine status by degrees, supported by a rank and file of soldiers receiving menial rewards in rape and pillage by comparison with the reproductive harems of their lords and masters.
Democracy originates from ancient Greece, established in 508/7 BC by Cleisthenes, an Athenian noble, in response to the endless struggles between conflicting tyrants of the noble families, themselves the strong men of family clans tracing their origins back into mythological antiquity. Democracy arose as a compensating antidote to these patriarchal clan struggles, in the form of an electoral coalition of all the Athenian men of fighting age. Throughout most of antiquity, the benefit of citizenship was tied to the obligation to fight war campaigns. Women, slaves and foreigners were specifically excluded, meaning only about one in ten Athenians were citizens, but it was still a fundamental innovation, resulting in the most direct form of electoral democracy in history, in which the citizens decided all policy matters directly, rather than electing representatives to form a government, and in which officials were chosen from the citizens by random lot - both being devices to bypass the back-room corruption rife in clan dealings.
Fig 4: Democracy is a complex social dynamical system invented by Cleisthenes (lower right) as an egalitarian parliament of male citizens of fighting age (upper right) directly making the decisions of government through discourse. The democratic system does not just involve the popular assembly, but law courts, tribunals, clan structures, social agencies and the military, just as modern electoral democracy depends on the rule of law and political accountability to function. (Upper left): Zeus abducts his great-grandson Ganymede in an incestuous homosexual act of paedophilia to become his lover and cup bearer on Olympus. 470 BC Temple of Zeus, Olympia. (Lower left) Priapos (god Bes) c500 BC from a brothel in Ephesus. (Top center) Appenzell, the last canton to give women the vote in 1990, and then only when compelled by the federal government, is one of the last two cantons still operating the Landsgemeinde or "cantonal assembly", one of the oldest forms of direct democracy, dating from the middle ages. Eligible citizens of the canton meet on a certain day in the open air to decide on laws and expenditures by the council. Everyone can debate a question. Voting is by those in favour of a motion raising their hands. Until the admission of women, the only proof of citizenship necessary for men to enter the voting area, as in the 1971 meeting above (National Geographic), was to show their ceremonial sword or Swiss military bayonet. This gave proof that you were a freeman allowed to bear arms and to vote, pretty much exactly as in the Athenian male-military coalition model.
Notably, Athenian society was one which extolled the virtues of men above women, as noted in Eva Keuls' “The Reign of the Phallus: Sexual Politics in Ancient Athens” and Gerda Lerner’s “The Creation of Patriarchy”. Greece was a patriarchal, class-driven society with slavery, in which women were excluded from political life and were lifelong minors under the guardianship of a male.
When Zeus the male high god at the centre of the pantheon overthrows Kronos he swallows his wife Metis thus preventing her bearing a son, in the same process, assimilating to himself her power of procreativity. He is thus able to give birth to Athena. We thus see not just woman but the very capacity of women to contribute to the nature of the offspring unravelled by the patriarchy.
Woman becomes an empty vessel for male procreativity:
"The mother is not the true source of life.
We call her the mother, but she is more the nurse,
The furrow where the seed is thrust.
The thruster, the father is the true parent:
The woman but tends the growing plant".
Apollo in Aeschylus' "Eumenides" or "Furies"
The idea that only the male was procreative spilled over into an excessive absorption with male sexuality, in men loving men and 'passing on one's manhood' to under-age boys. Pederasty was an institution sanctioned by the Olympian gods and mythical heroes. Zeus, Apollo, Poseidon and Heracles all had pederastic experiences, as did many of the most illustrious real-life Greeks, including Solon, Pythagoras, Socrates and Plato. The act was part of the foundation of an elitist, military culture that elevated the idea of the penis beyond biology and religion to the rarefied heights of philosophy and art. The pederastic act was the culmination of a one-on-one mentoring aimed at passing on arete, a set of manly virtues including courage, strength, fairness and honesty. Following Anaxagoras, who taught that male offspring sprang from the right testicle, men in a bid to father only sons even had their left testicle removed.
Essentially, democracy was a patriarchal trade-off between fighting men to balance the alpha male dominance of feudal tyranny arising from tribal clan warlords who later became the urban ‘nobility’. Women began to gain the democratic vote a full two and a half millennia later, when New Zealand gave women the vote in 1893. France had instituted universal male suffrage in 1792, abolishing all property requirements for men to vote, yet women didn't get the vote there until 1945; in Switzerland women gained the national vote only in 1971, and the cantonal vote only in 1990, underscoring how deep and long the association between patriarchal dominance and democracy has been.
But neither have individual woman leaders in this avowedly patriarchal tradition necessarily been willing or able to transform the situation for the better when in power, with leaders from Margaret Thatcher to Indira Gandhi taking the authoritarian path of the extreme right. For a time New Zealand had two alternative female leaders, both of whom became prime minister. Jennifer Shipley was renowned for divisive new right policies such as dissolving family trusts to make elderly people pay for their health care, despite a national health service. Helen Clark led a labour government for three terms, which was socially conscious, but did so by holding her cabinet in line with strict alpha leadership discipline, leaving no strong contender to fill her shoes when she moved on, after losing her fourth term election, to head the UN Development Program.
It is this alpha male pyramid that Athenian democracy sought to mediate, by instituting direct democracy among the citizens of fighting age, upon whom the independence of the city state depended. Little wonder then that democracy has been until the last few decades a purely male affair, and that in its first-past-the-post adversarial two-party form it is a veritable male reproductive combat ritual for the winner-take-all complete spoils of the feminine partner in the piece, the blind lady justice of the voting population.
Laying bare how central male reproductive combat is to democratic electoral systems, Klofstad et al. (2015) found that the deeper the voice of a contestant of either sex, the more popular they were, with the deeper voices gaining between 60 and 80% of the vote, indicating markers of testosterone dominance are more influential than a candidate's policies and trustworthiness.
In reflection of the tendency to adversarial positions in male combat there has emerged a major polarization in democratic politics between the right, which in its harder forms leans to a set of patriarchal beliefs in individual enterprise for winner-take-all gains, veering towards Fascist dictatorship of the strong leader, in opposition to the left and its 'nanny society' of the welfare state, more reminiscent of shared parenting notions. On the extreme left we find again a totalitarian tendency, turning social equality into a big brother society maintained through the illusion of class warfare run by nepotistic cliques of one party state officials.
In first past the post form, democracy is prone to a tyranny of the majority, in which the policies and legislation of the winning party can act to protect the interests only of those supporting the government in power, to the exclusion, or outright detriment, of the opposing minority. The Muslim Brotherhood in Egypt is not the only democratically elected government that has been accused of failing to act in the interests of its country as a whole. The problem is endemic to all first past the post democracies.
Fig 6: (Left) Gerrymandering and the proportional paradox (Stewart, New Scientist). (Right) An engraving from the 1772 edition of the Encyclopédie; Truth, in the top center, is surrounded by light (Wikipedia).
Democracy in its modern forms is in many senses both a product and a facilitator of the age of enlightenment. The age induced examination of the standards by which people ruled and gave validation to the idea of human rights. Democracy in turn became a catalyst of free-thinking ideas and opportunities in society and commerce.
To compensate for this glaring pattern of perpetual conflict, many societies have sought to modify the simplicity of adversarial democracy. The US federal government, for example, has a written constitution and three elected bodies - President, Senate and House of Representatives - intended to provide a set of checks and balances against the potential tyranny of any one branch. However, in practice, this seems to create a very expensive, cumbersome, top-heavy governmental system prone to intractable conflicts of government, and an opacity more easily served by business interests and professional lobby groups than by the average citizen.
Other countries have sought to dilute the manifest reproductive combat scenario of first past the post with various forms of proportional representation, such as STV (single transferable vote) and particularly MMP (mixed member proportional), in which each party adds list members to its elected members to give it proportional membership in parliament, leading to coalitions of smaller parties and more representative forms of government - although this advantage is parried by a rise in back-room deals involving unelected list candidates and a tendency to unstable alliances. MMP does serve to provide a more ecosystemic form of democratic process, which has a greater probability of serving the interests of diverse minorities.
Electoral theory shows that changes in the electoral system can produce almost any outcome in a closely fought election. In “Electoral dysfunction: Why democracy is always unfair”, the mathematician Ian Stewart shows that virtually all voting systems lead to paradoxes of one sort or another. First past the post ranks well in stability and accountability, but is a dud in fairness: with several candidates, a candidate can win without even getting a majority, so most votes are literally wasted. A runoff doesn’t solve this either, because the two highest-polling candidates may come from the same side of the political spectrum if the opposing side fielded a multiplicity of candidates. Preferential voting can lead to a paradox in which everyone wins, because the preferential order of the voters chases its tail. MMP avoids such paradoxes but leads to list candidates and unstable governing coalitions, so it is fairer but less stable and sometimes less accountable.
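Stewart's tail-chasing paradox is the classic Condorcet cycle, which can be verified in a few lines. The three-voter profile below is the standard textbook illustration, not an example taken from this article:

```python
from itertools import combinations

# Three voters with cyclically rotated preferences (Condorcet's classic case).
ballots = [
    ["A", "B", "C"],  # voter 1: A > B > C
    ["B", "C", "A"],  # voter 2: B > C > A
    ["C", "A", "B"],  # voter 3: C > A > B
]

def pairwise_winner(x, y, ballots):
    """Return whichever of x and y a majority of ballots ranks higher."""
    x_wins = sum(b.index(x) < b.index(y) for b in ballots)
    return x if x_wins > len(ballots) / 2 else y

for x, y in combinations("ABC", 2):
    print(f"{x} vs {y}: majority prefers {pairwise_winner(x, y, ballots)}")
# A beats B and B beats C, yet C beats A: the collective preference
# "chases its tail", so no candidate is a stable majority winner.
```

Each pairwise contest is decided two votes to one, so every candidate both wins and loses a head-to-head majority, even though every individual voter's ranking is perfectly consistent.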
As of 2018, New Zealand has an MMP government formed from three parties - Labour, New Zealand First and the Greens - with a formal coalition between the first two, in which the Greens provide confidence and supply, after NZ First rejected a coalition with National, the largest party with 45% of the vote, due to loss of trust between that party and NZ First. This arrangement appears to be working well, although naysayers would claim the largest party was robbed of the right to govern. It demonstrates a refreshing counterpoint to Trump’s divisive politics of deceit and abuse, particularly when the Prime Minister brings her newborn child to the UN, as both a leader and a nursing mother.
Although elections to the US House of Representatives use a first-past-the-post voting system, the constitution requires that seats be "apportioned among the several states according to their respective numbers" - that is, divvied up proportionally. In 1880, the chief clerk of the US Census Bureau, Charles Seaton, discovered that Alabama would get eight seats in a 299-seat House, but only seven in a 300-seat House. In this proportional paradox, increasing the total number of seats can reduce the representation of an individual state even if its population stays the same, because each state's quota is rounded down and the leftover seats are awarded by comparing the fractional remainders, and a larger House can shift that balance so a state loses representation.
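Seaton's discovery, now known as the Alabama paradox, falls out of the largest-remainder (Hamilton) apportionment method then in use. The sketch below uses tiny hypothetical populations chosen to trigger the paradox, not the 1880 census figures:

```python
from math import floor

def hamilton(populations, seats):
    """Largest-remainder (Hamilton) apportionment of seats to states."""
    total = sum(populations.values())
    quotas = {s: seats * p / total for s, p in populations.items()}
    alloc = {s: floor(q) for s, q in quotas.items()}  # round each quota down
    leftover = seats - sum(alloc.values())
    # Award the remaining seats to the states with the largest fractional remainders.
    for s in sorted(quotas, key=lambda s: quotas[s] - alloc[s], reverse=True)[:leftover]:
        alloc[s] += 1
    return alloc

pops = {"A": 6, "B": 6, "C": 2}  # hypothetical state populations
print(hamilton(pops, 10))  # {'A': 4, 'B': 4, 'C': 2}
print(hamilton(pops, 11))  # {'A': 5, 'B': 5, 'C': 1} - C loses a seat as the House grows
```

Going from 10 to 11 seats shrinks C's fractional remainder relative to A's and B's, so the compensating seats change hands even though no population changed - exactly the rounding effect behind Seaton's observation.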
Central to all these systems is the allocation of electorate boundaries, because some elections, such as GW Bush’s first against Al Gore, are won with less than half the popular vote. Gerrymandering, choosing election boundaries to favour a candidate or party, is named after a 19th-century governor of Massachusetts, Elbridge Gerry, who created an electoral division to bias the vote whose shape was so odd it reminded a local newspaper editor of a salamander.
Economist Kenneth Arrow discovered one of the most fundamental paradoxes of voting. He set out four general attributes of an idealised fair voting system - (1) that voters should be able to express a complete set of their preferences; (2) no single voter should be allowed to dictate the outcome of an election; (3) if every voter prefers one candidate to another, the final ranking should reflect that and (4) if a voter prefers one candidate to a second, introducing a third candidate should not reverse that preference. However Arrow and others went on to prove that no conceivable voting system could satisfy all four conditions. In particular, there will always be the possibility that one voter, simply by changing their vote, can change the overall preference of the whole electorate. In many ways Winston Churchill's comment thus remains true.
In response, given the access to information flows facilitated by the internet age, some advocates seek a return to forms of direct democracy not mediated by elected, or unelected, party representatives.
In a position piece on such developments in Europe, Nathan Gardels (2018) comments:
For the first time, an Internet-based movement has come to power in a major country, Italy, under the slogan “Participate, don’t delegate!” All of the Five Star Movement’s parliamentarians, who rule the country in a coalition with the far-right League party, were nominated and elected to stand for office online. And they have appointed the world’s first minister for direct democracy, Riccardo Fraccaro. “Referenda, public petitions and the citizens’ ballot initiative are nothing other than the direct means available for the citizenry to submit laws that political parties are not willing to propose or to reject rules approved by political parties that are not welcome by the people. Our aim, therefore, is to establish the principles and practices of direct democracy alongside the system of representative government in order to give real, authentic sovereignty to the citizens.”
Another participatory tool being used around the world, from Iceland to India, is “crowdlaw” - “a form of crowdsourcing that uses novel collective intelligence platforms and processes to help governments engage with citizens”. In Taiwan, the new Referendum Act that took effect in January 2018 means the public has “more say than ever in the country’s future.”
Running against this trend, the idea of direct democracy has retrenched instead of advanced in the Netherlands. After a non-binding 2016 referendum that expressed euroskeptic sentiment, the Dutch Parliament abolished the referendum law, worried that it would lead to populism.
The difficulty with government by referenda is that there are few safeguards against absolute tyranny of the majority, even when it is razor-thin, or achieved through a campaign of misinformation and foreign interference, as with Brexit. The process is also prone to populist sentiments, as there are no constitutional or institutional safeguards of accountability for the decisions made, and no moderating influence of a governing track record to establish trust in the proposed agenda, which could become irreversibly repressive of cultural diversity, causing diverse minorities to suffer disproportionately. However, an informed process of direct democracy could serve to complement and enrich representative government.
Many features of venture capitalism, particularly those that lead towards a tragedy of the commons and a tipping point into climatic crisis, also display graphic features of human reproductive imperatives - distinctly those of the spermatogenetic reproductive strategy of males in the absence of a countervailing and complementing female long-term out-front parenting investment strategy spread across several offspring and multiple generations.
Firstly, capitalism is based on monetary resources, just as male reproductive investment has a major component of the resource-bearing male securing the sexual commitment of one or more female partners. Classically, a majority of ethnic societies are polygynous, with a man able to secure sufficient income to support two wives frequently doing so. Thus the proportion of men in polygynous marriages in such societies is around one in eight, or 1/2³, reflecting the inverse cube power law noted in the distribution of capital in human societies. Thus the distribution of financial wealth in capitalist societies is closely tied to the human male reproductive imperative. Nowhere in natural ecosystems do we find one individual possessing a million or a billion times the resources of another member of the same species, except in terms of male reproductive imperatives, where an alpha male bearing the right resources in bulk, display, fighting prowess or monetary or military capital can capture 100% of the reproductive resources of all the females he can command.
Other features of capitalist investment, including winner-take-all intellectual property rights, the tendency to short term boom and bust investment at the expense of long-term sustainability, the reckless risk-taking preparedness to pass irreversible tipping points unless damage is exhaustively proven in advance, all reflect the male reproductive imperative's venture risk strategy - preparedness to die to secure immortality - as illustrated in Matty Groves facing death to fertilize Lady Arlen in the hope of giving birth to a prince who can have riches, the choice of many eligible women and thus many offspring. No woman can afford to risk her life to reproduce because she can only give birth if she IS alive.
Another patriarchal feature of capitalist economics is an obsession with exponential growth to the exclusion of any understanding of how to benefit long-term from the inevitable cyclic changes and non-linear feedbacks that arise in natural systems. An exponentiating resource, by its very nature, is unsustainable long term in any finite environment such as a planetary biosphere. While natural growth is a feature of all living systems, the universal application of exponentials to the economic condition, in terms of expectations of an endlessly increasing gross national product as an indicator of health, has parallels only with male reproductive resource seeking. The health of an economy is not measured by exponential growth, but by long-term robustness and the quality of human life it can sustain. An exponentiating economy, like the population explosion, is a long-term threat to our survival through habitat destruction, resource depletion and an unsustainable dynamic that has no day after tomorrow. Economics also needs to be able to model itself on the full suite of transcendental functions, including periodic functions such as sines and cosines, and to incorporate non-linear feedback principles, to be able to respond to fluctuating market and natural conditions.
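The contrast can be made concrete with the simplest non-linear feedback model, the logistic equation, in which the same growth rate is damped by a finite carrying capacity. The parameter values below are purely illustrative:

```python
# Exponential growth dx/dt = r*x versus logistic growth dx/dt = r*x*(1 - x/K),
# iterated in discrete steps. K models a finite environment's carrying capacity.
r, K = 0.05, 100.0  # 5% growth per step; illustrative carrying capacity
exp_x = log_x = 1.0

for _ in range(400):
    exp_x += r * exp_x                    # unbounded compound growth
    log_x += r * log_x * (1 - log_x / K)  # growth damped as x approaches K

print(f"exponential after 400 steps: {exp_x:.3e}")  # hundreds of millions
print(f"logistic after 400 steps:    {log_x:.1f}")  # saturates at the capacity K
```

The exponential trajectory diverges without limit, while the logistic one grows at nearly the same rate early on and then levels off at the carrying capacity - the kind of bounded, feedback-governed behaviour a finite biosphere imposes.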
Steady state economists such as Herman Daly, Richard Heinberg and Brian Czech try to make clear that growth pursued over and above what the natural environment and non-renewable resources can sustain is unsustainable bad economics, which may benefit the perpetrator but overall reduces our long-term collective wealth. However nature itself is stable amid climax diversity and natural fluctuation. This leads to ecological economics in which the priority is modelling our economic system on natural principles to enable society to coexist over time with the biosphere on which we depend for our survival. Penetratingly John Stuart Mill, one of the founders of economics, both hypothesized that the "stationary state" of an economy was the desirable condition, and at the same time in his work "The Subjection of Women" claimed that society and gender construction was holding women back and that the oppression of women was one of the few remaining relics from ancient times, a set of prejudices that severely impeded the progress of humanity.
Fig 6b: Introduction to "The Subjection of Women" John Stuart Mill 1869
The prisoners’ dilemma is a classic paradox of game theory, in which two prisoners are each tempted to defect, exposing the other and passing on the blame to get off, rather than both cooperating by staying silent and incurring only a light or moderate sentence. The temptation leads to mutual jeopardy, because mutual defection, each ratting on the other, earns both a long prison term. Virtually all strategic social encounters in which the players endure are prisoners’ dilemma games of competition between defection and cooperation, in which temptation into mutual defection is a tragic outcome.
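The defection trap can be made concrete with the standard payoff ordering. Below is a minimal sketch in Python; the sentence lengths are hypothetical, chosen only to satisfy the dilemma's structure (mutual cooperation beats mutual defection, yet defection is always the individually better reply):

```python
# Prisoner's dilemma with illustrative sentence lengths in years (lower is better).
# The specific values are hypothetical; only their ordering matters.
SENTENCES = {
    ("cooperate", "cooperate"): (1, 1),   # both stay silent: light sentences
    ("cooperate", "defect"):    (5, 0),   # the silent one takes the blame
    ("defect",    "cooperate"): (0, 5),
    ("defect",    "defect"):    (3, 3),   # mutual betrayal: both serve longer
}

def best_reply(other_move):
    """Return the move that minimizes my own sentence, given the other's move."""
    return min(["cooperate", "defect"],
               key=lambda my: SENTENCES[(my, other_move)][0])

# Defection dominates against either move...
assert best_reply("cooperate") == "defect"
assert best_reply("defect") == "defect"
# ...yet mutual defection (3, 3) is worse for both than mutual cooperation (1, 1).
```

Because each player's best reply is to defect regardless of the other's choice, the rational pair slides into the (3, 3) outcome both would prefer to avoid.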
The classic Tragedy of the Commons, drawn attention to by Garrett Hardin - a laissez-faire prisoners' dilemma of mutual economic disaster in which it serves everyone who can to pillage the commons to its extinction, because if they don't someone else will - and the notion of the "Rape of the Planet" - a male sexual crime against Mother Nature - are both manifestations of patriarchal venture capitalism lacking a balancing long-term feminine reproductive nurturing strategy to maintain the viability of the closing circle of the biosphere. It is this balancing strategy we need to find if capitalism is not to threaten our economic and biological viability.
The patriarchal, competitive, winner-take-all investment environment of the electronic age, reflecting male spermatogenetic investment, leads to an ever-sharpening set of instabilities, in which instruments such as futures, originally intended as arbitrage to mediate commodity price fluctuations, themselves become heightened-volatility instruments of rapid trade. This produces instability, especially at volatile times such as the triple witching hour - the last hour of the stock market trading session on the third Friday of March, June, September and December, when three kinds of securities expire together: stock market index futures, stock market index options and stock options.
Edmundo Braverman's blog "Go get Somebody Pregnant" has an interesting insight into the sexually charged relationship between sowing wild oats, ramped-up personal debt and a competitively hungry trading drive. Commenting on an associate's incipient fatherhood honing his approach to business he notes: "It instantly brought me back to my old stockbroker days because, believe it or not, management encouraged knocking someone up all the time for this very reason. Management understood the correlation between outside pressure and increased production, and you never saw this more pronounced than when one of the guys had some girl pull up pregnant. That's [also] why they wholeheartedly supported guys getting into enormous amounts of consumer debt. You'd see a level of motivation and resourcefulness come out in them that you hadn't seen before. There was certainly no altruistic intent behind my old firm encouraging guys to have kids, but the end result was the same: increased production."
And there is clear evidence, from a 2008 research study in PNAS (Bryner), of sexually physiological responses in male stock traders, whose testosterone levels soar on days they make above-average profitable trades. This has led to concern that they may then become susceptible to excessive risk-taking due to a continuing rise in hormones - the "winner effect", useful for ongoing sexual conquests, but which in a trading situation could lead to whiplash losses. Exposure to market volatility and the resulting stress, however, elevates cortisol levels, potentially leading to a psychological state known as "learned helplessness", in which risk aversion may lead to stasis. In men these opposing forces could lead to sentiment-driven boom-bust instability. Paradoxically, another study (New Scientist 24 March 2011) has shown that both high and very low testosterone lead to heightened risk-taking, possibly for opposite reasons: men on the bottom of the social heap need to take risks to reproduce because they have no other choice.
By contrast, a more recent survey by Rothstein Kass found that the few hedge funds led by women far outperformed the global hedge fund index in 2012. The reasons attributed to this success were that women are more averse to high-risk strategies than men, making them potentially "better able to escape market downturns and volatility" and thus better long-term investors. LouAnn Lofton, author of "Warren Buffett Invests Like a Girl", says women "trade less and their investments perform better, that they are more realistic, that they are more consistent investors, and that they tend to engage in more thorough research and ignore peer pressure". Notably, when it comes to the higher-risk venture capital arena, women are harder to find - just three women ranked, at positions 36, 47 and 82, in the latest top-hundred 'Midas list' in Forbes magazine.
However the presence of women in prominent positions in major corporations doesn't necessarily lead to a change in the capitalist zeitgeist any more than it has in adversarial politics. Business as usual is built on a complex edifice of patriarchal institutions, from company and corporate structures, through banks, commercial law and regulatory regimes all designed to keep the flow of business as usual operating. We cannot thus expect the appointment of Ginni Rometty as CEO of IBM, Sheryl Sandberg to Facebook, or Marissa Mayer at Yahoo to result in iconic qualitative changes in the way these corporations operate in the competitive business environment.
There are also poignant lessons to be learned from the history of radical feminism, which show that unbridled female-only strategies can have every bit as disquieting outcomes. Susan Faludi notes in the New Yorker that virtually every feminist who founded a radical movement was subsequently banished from the group. Shulamith Firestone was forced out of New York Radical Women after she and two associates were accused of being ‘defensive’ and ‘unsisterly’. Marilyn Webb was forced out of Off Our Backs - because she was the only one with journalistic experience, being told 'You can’t write at all; you have to help other people' - and banned from accepting public-speaking engagements. Jo Freeman was ostracized by members of the group Westside, which she had helped found. “There were dark hints about my ‘male’ ambitions — such as going to graduate school,” she said. Carol Giardina was ousted from her Florida group by “moon goddess” worshippers who accused her of being “too male-identified.” “I don’t know anyone who founded a group and did early organizing who wasn’t thrown out. It was just a disaster, a total disaster.” The emerging lesbian wing browbeat Kate Millett into revealing that she was bisexual, and then denounced her for not having revealed it earlier; Millett had a breakdown and was committed to a mental hospital. Firestone, who had been denounced by feminists for violating the “We’re all equals” ethic by accepting a small book advance when the women wanted to collectively own the copyright, and for appearing on “The David Susskind Show”, developed schizophrenia and eventually died alone in her apartment, apparently from starvation.
Anselma Dell’Olio, founder of the New Feminist Theatre, in New York warned that women’s “rage, masquerading as a pseudo-egalitarian radicalism under the ‘pro-woman’ banner,” was turning into “frighteningly vicious anti-intellectual fascism of the left.” After Ti-Grace Atkinson resigned from the Feminists, a group she had founded in New York, she declared, “Sisterhood is powerful. It kills. Mostly sisters."
In 2015, research into the comparative population diversity of maternal mitochondrial DNA and the male Y-chromosome led to an astounding contrast. Around 10,000 years ago, corresponding to the birth of agriculture, the diversity of the Y-chromosome underwent a collapse across vast areas of the human-colonized planet.
Fig 6b: Top: a bottleneck in Y-chromosome diversity, absent in mitochondrial DNA inherited through the female line, shows the reproductive sex ratio in diverse cultures went from an evolutionary 1 man to 2 women to 1:17 between 10,000 and 5,000 years ago. Bottom: the rise of the Gini coefficient over time since the advent of horticulture shows a steep rise in inequality in the Old World associated with the use of livestock.
There is no evidence this was a result of direct biological or genetic factors as there were no differences between differing Y-clades. The conclusion is that the effect was driven by cultural changes associated with agriculture in which powerful men were able to reproductively exploit large numbers of women and transmit their reproductive success on to their male heirs, squeezing the majority of males out of the reproductive race. Estimates of this phase of extreme reproductive polygyny suggest that for every reproducing male there were 17 reproductive females effectively making harems the predominant form of sexual relationship (Karmin et al. 2015).
This comes as an ironic twist, since it is assumed that agriculture was an invention of women, arising from their role as gatherers in gatherer-hunter societies, and it provides a new perspective on the societies of the planter queens, where female deities appear to have been worshipped at the same time as this extreme form of male reproductive elitism. The other really stunning thing about this effect is that it was repeated widely across disparate world cultures, from China through the Near East to Europe and even pre-Columbian America.
In parallel with this, is evidence for increasing inequality in large old-world urban civilizations (Kohler et al. 2017). The team worked with archaeologists around the world to collect data from 62 sites in North America and Eurasia dating from before 8000 B.C.E. to about 1750 C.E. (They also included one modern hunter-gatherer group, the !Kung San in Africa.) From the distribution of house sizes, they calculated each site's Gini coefficient, a standard measure of inequality, discussed below.
Gini coefficients range from zero, indicating that each person has exactly the same amount of wealth, to one, representing a society in which a single person has all the wealth. The researchers found that inequality tended to gradually increase as societies transitioned from hunting and gathering to farming, supporting long-held hypotheses about how agriculture intensified social hierarchies. About 2500 years after the first appearance of domesticated plants in each region, average inequality in both the Old World and the New World hovered around a Gini coefficient of about 0.35.
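The definition can be computed directly from a list of holdings: the Gini coefficient is the mean absolute difference between all pairs, relative to twice the mean. A minimal sketch (the wealth vectors are made-up toy data); note that for n people with a single owner of everything, G = (n−1)/n, which approaches 1 as n grows:

```python
def gini(wealth):
    """Gini coefficient as the mean absolute difference between all pairs of
    holdings, divided by twice the mean (0 = equality, ->1 = total concentration)."""
    n = len(wealth)
    mean = sum(wealth) / n
    diff_sum = sum(abs(a - b) for a in wealth for b in wealth)  # all ordered pairs
    return diff_sum / (2 * n * n * mean)

print(gini([1, 1, 1, 1]))    # 0.0  : everyone holds the same amount
print(gini([0, 0, 0, 100]))  # 0.75 : one of four holds everything, (n-1)/n
```

This pairwise form is equivalent to the area-based Lorenz-curve definition used in the text, but needs no sorting or curve construction.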
This figure stayed more or less steady in North America and Mesoamerica. But in the Middle East, China, Europe, and Egypt, inequality kept climbing over time, topping out at an average Gini coefficient of about 0.6, roughly 6000 years after the start of agriculture at Pompeii in ancient Rome and Kahun in ancient Egypt. The authors propose that domestic animals may explain the difference between the New World and the Old World: Whereas North American and Mesoamerican societies depended on human labour, Old World societies had oxen and cattle to plough fields and horses to carry goods and people.
Consistent with these findings, another study demonstrates that, in contrast to men, rigorous manual labor was a more important component of prehistoric women's behavior than terrestrial mobility through thousands of years of European agriculture, at levels far exceeding those of modern women: their humeral rigidity exceeded that of living athletes for the first ~5500 years of farming, with loading intensity biased heavily toward the upper limb. Interlimb strength proportions among Neolithic, Bronze Age and Iron Age women were most similar to those of living semi-elite rowers (Macintosh, Pinhasi & Stock 2017 "Prehistoric women's manual labor exceeded that of athletes through the first 5500 years of farming in Central Europe" Sci. Adv. 3:eaao3893).
A simple rule of thumb is that wealth is distributed roughly as the inverse cube, so that if 1 man in 8 = 2³ has enough income to support 2 wives, then α = 3 in the power law distribution:

p(x) = C\,x^{-\alpha}, \qquad \alpha = 3.
A wide range of physical, social and mathematical processes are governed by power laws. When the power, or exponent, is non-integer, they can also lead to fractal processes such as the Koch flake shown below, where each side is endlessly replaced by 4 of 1/3 the size. The power law model shows very good fit with actual data right up to the super-rich hundred or so worldwide.
It is usual to normalize the distribution so that the total probability over the population is unity:

\int_{x_{min}}^{\infty} C\,x^{-\alpha}\,dx = 1 \;\Rightarrow\; p(x) = (\alpha-1)\,x_{min}^{\alpha-1}\,x^{-\alpha}.
We can plot this either as a linear, or a log-log plot to show the slope has a constant power factor, as shown below:
However, as we can see, the tail of this distribution becomes randomly erratic due to the very small number of samples in the higher-value histogram bins.
A better way of plotting is to simply plot the cumulative probability distribution as follows:

P(X > x) = \left(\frac{x}{x_{min}}\right)^{-(\alpha-1)}.
As can be seen this has a slope exponent shift, but gives a more accurate picture of the process.
We can model such distributions using a random generating function, derived as the inverse function of the integral of the distribution, where r is uniformly distributed between 0 and 1:

x = x_{min}\,(1-r)^{-1/(\alpha-1)}.
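Assuming the standard inverse-transform generator for a power law, x = x_min(1−r)^(−1/(α−1)) with r uniform on [0, 1), such distributions can be simulated and the tail checked against the cumulative exponent α−1:

```python
import random

def sample_power_law(alpha, x_min, n, seed=0):
    """Inverse-transform sampling: x = x_min * (1 - r)^(-1/(alpha - 1)),
    with r uniform on [0, 1), draws from the density p(x) ~ x^(-alpha)."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    return [x_min * (1 - rng.random()) ** (-1 / (alpha - 1)) for _ in range(n)]

incomes = sample_power_law(alpha=3.0, x_min=1.0, n=100_000)

# For alpha = 3 the cumulative tail is P(X > x) = x^(-2), so about 1 in 4
# samples should exceed 2*x_min and about 1 in 100 should exceed 10*x_min.
print(sum(x > 2 for x in incomes) / len(incomes))   # ~0.25
print(sum(x > 10 for x in incomes) / len(incomes))  # ~0.01
```

The erratic high-value tail noted above is visible here too: with 100,000 samples only about a thousand land beyond 10·x_min, which is why the cumulative plot is the more reliable way to read off the exponent.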
To investigate the proportion of total income above a certain level x_q, we proceed as follows:

W(q) = \frac{\int_{x_q}^{\infty} x\,p(x)\,dx}{\int_{x_{min}}^{\infty} x\,p(x)\,dx} = \left(\frac{x_q}{x_{min}}\right)^{-(\alpha-2)} = q^{(\alpha-2)/(\alpha-1)},

where q = (x_q/x_{min})^{-(\alpha-1)} is the proportion of the population with income above x_q.
This is called a Lorenz curve after its inventor Max Lorenz, and now has a positive exponent, since (\alpha-2)/(\alpha-1) > 0 for \alpha > 2.
To estimate inequality, a widely used measure is the Gini coefficient, which ranges from 0 to 1 and measures the proportion of total income attributable to the inequality of the distribution. If all people earn the same amount, the cumulative distribution is the diagonal, so the proportion G is given by:

G = 2\int_0^1 \left(W(q) - q\right)dq = 2\int_0^1 \left(q^{(\alpha-2)/(\alpha-1)} - q\right)dq = \frac{1}{2\alpha-3}.
For our original inverse cubic example, G = 0.33. Gini coefficients have tracked from a gatherer-hunter value of 0.18, through 0.3 in horticulture, up to 0.6 with the advent of urban civilizations, and according to recent research have reached 0.8 and 0.73 in the US and China respectively (Wade 2017).
The way we have drawn the Lorenz curve is a little unusual: it is usually flipped over so the high incomes are on the right. But we can immediately see, for α = 2.5, that the top 10% have (0.1)^{(2-2.5)/(1-2.5)} = 0.46, or 46% of the wealth, and the top 1% have (0.01)^{(2-2.5)/(1-2.5)} = 0.215, or 21.5% of the wealth, quite close to the figures in fig 7.
We can also invert the relationship, so that given such a top 1% share s, we can calculate α and hence retrieve G, as follows:

\alpha = \frac{2-k}{1-k}, \qquad k = \frac{\ln s}{\ln 0.01}, \qquad G = \frac{1}{2\alpha-3}.
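This inversion can be sketched numerically. The formulas below - the top-fraction share s = f^((α−2)/(α−1)) and G = 1/(2α−3) for a pure power law - are inferred from the worked examples in the surrounding text:

```python
import math

def alpha_from_top_share(f, s):
    """Given that the top fraction f of earners receives share s of income,
    solve s = f^((alpha - 2)/(alpha - 1)) for the power-law exponent alpha."""
    k = math.log(s) / math.log(f)
    return (2 - k) / (1 - k)

def gini_from_alpha(alpha):
    """Gini coefficient of a pure power-law income distribution (alpha > 1.5)."""
    return 1 / (2 * alpha - 3)

# If the top 1% earn half of all income:
a = alpha_from_top_share(0.01, 0.5)
print(round(a, 2), round(gini_from_alpha(a), 2))  # 2.18 0.74
# (The text quotes ~2.17 and 0.75, the small difference being where one rounds.)
```

The same pair of functions reproduces the other α/G pairings quoted later in the text, which is a useful sanity check on the reconstruction.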
Gini values remain somewhat inconsistent, as can be seen by comparing lower local values from the 2016 US census https://www.census.gov/content/dam/Census/library/publications/2017/acs/acsbr16-02.pdf with the higher world ones in Wikipedia https://en.wikipedia.org/wiki/Gini_coefficient.
Part of the explanation for this is that the developed capitalist economies are not the most excessively unequal, as the world map in fig 6c indicates. In fact, on this data China exceeds the US, Southern Africa is the most unequal, and many countries, from Europe and Canada to India, have much lower values.
Fig 6c: World Gini coefficients as of 2014 with some figures several years out of date
Part of the reason many developed countries appear to have lower Gini coefficients is that they have better social welfare systems, which mean the lowest income brackets are not overpopulated; the income spectrum thus doesn't fit a power law distribution at the lower end, although the power law model does hold good for higher and even the highest incomes.
For example, in 2014 Oxfam released figures stating that the top 62 income earners planet-wide earned as much as the bottom half put together, and that the richest 1% earned as much as the rest put together. We can easily analyse these results by applying the previous formulae. At a power law α = 2.15, with a Gini coefficient of 0.77, the top 62 of 7.2 billion will earn 0.0887 of the total and the upper half will gain 0.9136, leaving 0.0864 for the lower half - very nearly equal to the top 62's share, as Oxfam claimed. We can also use the last formula to directly calculate that if the top 1% earn half the total income, then α = 2.17 with a Gini coefficient of 0.75.
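The Oxfam arithmetic can be checked with the top-share relation s = f^((α−2)/(α−1)) used in the worked examples above; this is a sketch under that power-law assumption:

```python
def top_share(f, alpha):
    """Income share of the richest fraction f of earners under a pure power law
    with density exponent alpha: s = f^((alpha - 2)/(alpha - 1))."""
    return f ** ((alpha - 2) / (alpha - 1))

alpha = 2.15  # the exponent quoted in the text, Gini ~0.77
print(top_share(62 / 7.2e9, alpha))  # ~0.0887: share of the 62 richest people
print(top_share(0.5, alpha))         # ~0.9136: share of the upper half
# The lower half is left with 1 - 0.9136 = 0.0864, close to the top 62's share.
```

So under this model the richest 62 people and the poorest 3.6 billion really do hold roughly matching slices, which is the content of the Oxfam claim.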
Other figures for the top US earners appear to be a little softer. In 2000 the top 400 earned an average of $173,192,000. Given a population of 282.2m and a median income of $42,148 and a slightly higher mean of around $50,000, we arrive at α= 2.65 and G = 0.434. In 2007 the top 0.01% earned 5% of the total equating to α= 2.48 and G = 0.51.
Fig 6d: The picture emerging from US Census figures.
One definitive source of US income data is the US Census (https://www.census.gov/topics/income-poverty/income/data/tables.html). Fig 6d shows the household income distribution as recorded for the latest year on record, 2016. This data is incomplete for the very high-end trends, which have been lumped together in a single pool, but it also demonstrates that income does not follow anything like a power law at the low end, due to social factors, including social welfare even in strongly capitalist economies such as the US, which limit the number of very poor people, unlike developing countries such as Botswana.
In the second row of figures, taking the higher end, excluding the two highest lumped bins, from $95,000 to $195,000, plotting the number of people in each bracket against mean income and performing logarithmic regression, well approximates a power law fit. This gives a picture of a hardening exponent, running from 2.94 in 2009 to 2.27 in 2016, with the corresponding trend in the Gini running from 0.35 to 0.65, as illustrated in black, bottom left.
We can also make a rough estimate based on the top earners using the last formula above, which arrives at a much softer value of the Gini at around 0.3 throughout. This is probably an inaccurately low measure of inequality due to the crude lumping of high earners, the small sample size of 120,000 people and the fact that the survey concentrates on household incomes and poverty.
But there is another very simple way to calculate a Gini directly: construct a cumulative income distribution by adding up the total income in each bracket (number of people × mean income) from the top down, plot the graph, calculate the area A under it, and arrive at G = 2(A – 1/2). This results in a middle-ground G, trending from 0.468 in 2009 to 0.48 in 2016, as shown in blue in the bottom row of figures. Notice also that the empirical Lorenz curve in blue for 2016 does not have the skewed profile of the power law curve of higher incomes in black, which actually crosses the empirical plot, due to the reduced number of very low incomes we have seen in the census data at the top conflicting with the hardening power law in higher incomes. This demonstrates both changing economic climates and social policies, and the fact that G is independent of any particular model. This Gini calculation is consistent with the census's own figures and has been arrived at entirely empirically, without any model assumptions such as a power law distribution.
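This empirical construction is easy to sketch in code. The bracket data below is hypothetical (the actual census bins are not reproduced in the text); the method - cumulative income from the top down, trapezoidal area A, then G = 2(A − 1/2) - follows the description above:

```python
def gini_from_brackets(brackets):
    """Empirical Gini via top-down accumulation: sort brackets richest-first,
    accumulate cumulative income share W against cumulative population share q,
    take the area A under the (q, W) curve by the trapezoidal rule, and
    return G = 2*(A - 1/2). Each bracket is (number of people, mean income)."""
    brackets = sorted(brackets, key=lambda b: b[1], reverse=True)  # richest first
    total_people = sum(n for n, _ in brackets)
    total_income = sum(n * m for n, m in brackets)
    points, q, w = [(0.0, 0.0)], 0.0, 0.0
    for n, m in brackets:
        q += n / total_people
        w += n * m / total_income
        points.append((q, w))
    area = sum((q2 - q1) * (w1 + w2) / 2
               for (q1, w1), (q2, w2) in zip(points, points[1:]))
    return 2 * (area - 0.5)

# Illustrative, made-up brackets: (number of households, mean income).
brackets = [(40, 15_000), (35, 45_000), (20, 95_000), (5, 400_000)]
print(round(gini_from_brackets(brackets), 3))  # 0.511
```

Because it works straight from the binned counts, no power-law (or any other) model is assumed, which is exactly the point made in the text.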
Fig 6e: Trends in top earners share in the US.
Fig 6(e) shows independent data on trends of the top 1% and 0.1% of income earners in the US. The top 1% earning 20% and 10% tallies to α = 2.53 and 3, with G = 0.48 and 0.33 respectively, and the top 0.1% earning 10% and 3% to α = 2.5 and 3.03, with G = 0.5 and 0.32. This definitely shows hardening trends, reaching historically high levels in the 2000s, seen before only in the 1920s. Again, however, the Gini estimates are not severe at face value and should be seen only as indicators of relative trends, which may actually be more severe than the sampled data, for the reasons outlined below.
Gini figures are subject to instability, both because of grainy low levels of sampling among the highest income earners and because of ambiguities of how to tally negative income at the bottom end due to debt. Both ends also tend to misreport their figures, leading to the use of relative measures, such as the Palma ratio, of the richest 10% share divided by the poorest 40% share, on the basis that middle class incomes tend to represent about half of gross national income, while the other half is split between the richest 10% and poorest 40%.
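The Palma ratio mentioned above can be sketched for a pure power law. The top-share formula s = f^((α−2)/(α−1)) is inferred from the worked examples earlier in the text; the poorest 40% then get whatever the top 60% leave behind:

```python
def top_share(f, alpha):
    """Income share of the richest fraction f under a power law exponent alpha."""
    return f ** ((alpha - 2) / (alpha - 1))

def palma(alpha):
    """Palma ratio: richest 10% share divided by poorest 40% share, where the
    poorest 40% share is 1 minus the top 60% share."""
    return top_share(0.10, alpha) / (1 - top_share(0.60, alpha))

# For the alpha = 2.5 example used earlier (top 10% take ~46% of income):
print(round(palma(2.5), 2))  # 2.96
```

Unlike the Gini, this ratio ignores the middle of the distribution entirely, which is why it is less sensitive to the grainy sampling and misreporting problems at the two ends.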
Half of England is owned by less than 1% of the population. Guy Shrubsole, author of the book in which the figures are revealed, Who Owns England?, argues that the findings show a picture that has not changed for centuries. 'Land ownership in England is astonishingly unequal, heavily concentrated in the hands of a tiny elite.'
However all the above estimates are probably too soft, because the richest are extremely adroit at hiding their incomes in convoluted corporate dealings, opaque trusts and offshore tax havens, as the Panama and Paradise papers aptly demonstrate. The true figure is thus likely to correspond more closely with the harder G figures quoted by Chinese research, Oxfam and the UN for world values of 0.7 – 0.8, despite efforts to reduce extreme poverty in developing countries.
This brings us back to the testy relationship between capitalism and democracy. According to the Oxford English Dictionary, the term capitalism was first used by William Makepeace Thackeray in 1854 in "The Newcomes". The initial usage of the term in its modern sense has been attributed to Louis Blanc in 1850. Karl Marx and Friedrich Engels referred to the capitalistic system and to the capitalist mode of production in "Das Kapital" in 1867. Thus capitalism was coined by social commentators and its detractors before being adopted by its free-market champions.
However, the term capitalist has an earlier usage, and the phenomenon of capital investment emerged from earlier forms of mercantilism, illustrated by city states from Phoenicia to Venice, and later by the colonial expansion of European nations across the globe, from India to the Americas, in trade-driven expansions sometimes enforced by nationalistic protectionism. During the Industrial Revolution, the industrialist replaced the merchant as the dominant actor. At the same time, the surplus generated by the rise of commercial agriculture encouraged increased mechanization of agriculture. Industrial capitalism marked the development of the factory system of manufacturing, characterized by a complex division of labor, and established the global domination of the capitalist mode of production.
The rise of democracy in Europe in parallel with industrialization led to a notion that the two social processes were causally interconnected. The rise of communism and the Cold War came to entrench capitalism as a social complement and opponent of the totalitarian manifestations of communism, as if capitalism was the bulwark of freedom and democracy, but later developments have shown that capitalism is equally a bedfellow of a variety of political systems, from monarchies to dictatorships, and for example is the driving force for the rise of China as a world superpower, despite the continued autocratic rule of the communist party and repression of calls for democratic freedoms in China.
Fig 7: (a) The graphical data from “Capitalism and Democracy” 2003, (b) 2007 NY Times data on income distributions in the US (Johnston), (c) Public opposition to the US Supreme Court decision allowing unlimited anonymous funding of political parties by corporations and unions, (d) extreme L-curve (Lorenz curve) of income distribution Price 2003, (e) this curve remains extreme even when the log of the income is used, showing a super-exponential distribution.
At the same time, capitalism has evolved from Keynesian economics to monetarism. In the tradition of Keynesian economics, economic output is seen to be strongly influenced by demand which does not necessarily equal productive capacity and can be influenced erratically by a host of factors, affecting production, employment, and inflation, requiring social adjustment. Monetarism is an alternative economic view that variation in the money supply has major influences on national output in the short run and the price level over longer periods, which is more compatible with laissez-faire attitudes. As the twentieth century progressed public and political interest began shifting away from the collectivist concerns of Keynes's managed capitalism to a focus on individual choice.
Increasing globalization and the formation of transnational corporations have acted to increase the mobility of people and capital since the last quarter of the 20th century. Combined with the cementing of legally binding trade agreements, this has circumscribed the room of individual states to manoeuvre in choosing national policies and avenues of development, leading to the notion of capitalism as a global phenomenon constraining the democratic political freedom of choice of sovereign nations, while claiming to facilitate an era of greater efficiency, prosperity and opportunity, free of barriers to trade.
In a 2003 article entitled “Capitalism and Democracy”, celebrating the 160th anniversary of The Economist, then editor Bill Emmott sets out a libertarian case for the claim that capitalism, its freedom of enterprise and the ensuing globalization have improved living standards, reduced poverty and enhanced productivity across the planet.
Two of his central claims supported by graphical data are the power of liberal trade to raise living standards, complemented by laissez-faire globalization, citing the increase in trade surpassing GNP growth and higher growth rates in more globalized economies (a1,2 in fig 7). He then goes on to note that the beneficent effect of libertarian capitalism has also been accompanied by a rise in the number of countries having nominally democratic governments, albeit compromised to varying degrees by graft and corruption – lack of an independent judiciary, equality before a well-enforced rule of law, and constitutional limits on the abuse of political power.
He also makes the claim that these processes have a causal role in alleviating world poverty: “Measured by the benchmark favoured by the World Bank of income of $2 a day or less, adjusted to cater for differences in purchasing power, the proportion of the world's population in poverty dropped from 56% in 1980 to 23% in 2000, on Indian economist Surjit Bhalla's calculations [with the absolute numbers declining by a lesser 42% due to population growth]. Before 1980, the absolute numbers were rising. That date roughly coincides with the spread of trade and internal-market liberalisation to many poor countries. … The truth about market liberalisation and economic growth is not that it increases inequality, nor that it hurts the poor: just the opposite.”
While it is well and good that outright poverty has decreased, it is not clear whether the relationship with laissez-faire capitalism is causal, coincidental or even 'in spite of'. One test of this idea is to examine how income distributions have changed, for example in some of the most laissez-faire economies such as the US. Emmott provides only a chart entitled "bulging toward equality", claiming middle incomes have risen steadily, if marginally, over the period, despite periods of financial crisis and economic setback. This data is very insensitive to both low and high incomes and presents only a marginal growth of the middle classes of around 3% per annum.
However, in 2007 David Johnston in the New York Times showed that the income gap is widening: the lowest 90% saw a net decline in share from 64% to 52% over the same period, while the top 1% gained from 10% to over 20% by 2005, and the top 0.1% and 0.01% saw even more extreme increases between 2002 and 2005 (fig 7b).
Income inequality grew significantly in 2005, with the top 1% of Americans receiving their largest share of national income since 1928. The top 10% also reached a level not seen since before the Depression, collecting 48.5% of all reported income in 2005, up from roughly 33% in the late 1970s; the peak for this group was 49.3% in 1928. The top 1% received 21.8% of all reported income in 2005, more than double their share in 1980; the peak was in 1928, when the top 1% reported 23.9% of all income.
While total reported income in the United States increased almost 9% in 2005, average incomes for the bottom 90%, at $28,666, dipped 0.6% compared with the year before. The gains went largely to the top 1%, who earned $1.1 million each, an increase of 14%. The top 0.1% and top 0.01% recorded even bigger gains in 2005 over the previous year, of around 20%, largely because of the rising stock market and increased business profits. The 0.1% reported an average income of $5.6 million, up 19.3%, while the top 0.01% had an average income of $25.7 million, up 20.6%.
Per person, the top 300,000 Americans collectively enjoyed almost as much income as the bottom 150 million, receiving 440 times as much as the average person in the bottom half earned, nearly doubling the gap from 1980. Professor Saez, who analysed the IRS data, said: “If the economy is growing but only a few are enjoying the benefits, it goes to our sense of fairness. It can have important political consequences.”
These disparities were confirmed in 2012 by Alan Dunn’s “Average America vs the One Percent” in Forbes magazine: “If the Occupy movement does nothing else, it has at least introduced a new set of terms into the American vocabulary to talk about the distribution of wealth in America. Until recently, most average people had no idea how wealth was distributed in the country; most people had a vague idea of a wealthy minority, but they rarely grasped the full extent of income disparity between classes. Unequal wealth distribution is hardly a new or uniquely American problem. In fact, it’s been prevalent throughout society since humans first built civilizations: A small minority of aristocrats has always wielded the most power throughout history. In modern times, America lags behind nearly every other first-world nation in closing the gap between the classes. In fact, we’re making it worse.”
As of his 2012 data, the average annual income of the top 1% was $717,000, compared to an average income of around $51,000 for the rest. In net value, the 1% were worth about $8.4 million, or 70 times the worth of the lower classes. Within this group was an even smaller and wealthier subset, 1% of the top, or 0.01% of the entire nation, with incomes of over $27 million, roughly 540 times the national average. Altogether, the top 1% controlled 43% of the wealth; the next 4% controlled an additional 29%. Income disparity between the top 1% and the other 99% was nothing compared to that within the top earners. At the bottom of the 1% was a living wage of around $300,000; at the top, people made around $5.2 to $7.5 million, with some making closer to a billion. This 0.1% of the country paid closer to 23% in taxes, while the top 400 highest earners in the country paid only 18% personal income tax. Between 2007 and 2009, Wall Street profits swelled by 720%, while unemployment rates doubled and home equity dropped by 35%. Since 1979, the bottom 90% has consistently lost money while the upper classes have gained. If the average person’s wages had kept pace with the economy since the 70s, most people would be making $92,000 per annum by the time of writing. It is historically common for a powerful minority to control a majority of finances, but Americans haven’t seen a disparity this wide since before the Great Depression - and it keeps growing.
Dunn comments further: “A common complaint against the Occupy Wall Street movement is that it relies on “zero sum” thinking. Opponents will argue that the wealth of the upper class should have no effect on the lower classes, but in practice it doesn’t usually work that way. The fact is that while wealth can be generated, money generally flows from one side of a population to the other. While money often works its way to the upper classes, it very rarely flows back the other way. The fact is that the upper classes really are taking money from the poor in a very real and concrete way. The so-called trickle-down economy has never worked, despite the protestations of conservatives. Most extremely rich people do not spend enough money to stimulate the economy; they save or invest their money rather than spending it. “
“Turning to the grip of capitalism on US democratic processes it is impossible to ignore the fact that 57 members of Congress, or roughly 11%, are members of the financial elite. Overall, 250 members of Congress are millionaires, and their median net worth accounts for roughly nine times that of the average American. Asking politicians to enact changes that would reduce the wealth of the upper classes is a conflict of interests. It’s little wonder that tax cuts for the wealthy are repeatedly enacted while the reverse is so rarely true. In order to be elected, politicians of all levels require backing, and that backing generally comes from corporations. It’s impossible to deny the link between politicians and corporations, and the link is consistent regardless of a person’s political leanings. With the majority of tax income coming in from the middle class - despite the progressive tax - and the government’s interests clearly mingling with the upper class, is there any question of why the income disparity continues to grow?”
This income disparity results in a super-exponential L-shaped curve for personal income, unparalleled in nature with the exception of male reproductive strategies (Price in fig 7(d)), which remains L-shaped even when plotted against the log of the income (e). Oxfam in a 2017 report provides similar L-shaped statistics for New Zealand, which according to the OECD in 2014 had a moderate Gini index of 0.333. In 2017 the top 1% owned 20% of the country's wealth, equating to a power law α = 2.537 and G = 0.48. But when we take the two top earners out of a population of 4.69 million, we get α = 2.23 and G = 0.685, again super-exponential, although both these billionaires manage trans-national corporations.
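The quoted α and G values follow from the standard Pareto (power-law) formulas: for an income density proportional to x^(-α), the richest fraction p holds a share s = p^((α-2)/(α-1)) of total wealth, and the Gini coefficient is G = 1/(2α-3). A short sketch checks the figures, under the simplifying assumption of a pure power law (real income data only approximate one):

```python
import math

def alpha_from_top_share(p, s):
    """Pareto density exponent alpha for a distribution in which the
    richest fraction p holds share s of total wealth.
    With tail index a = alpha - 1, s = p**((a-1)/a)."""
    a = math.log(p) / (math.log(p) - math.log(s))
    return a + 1

def gini_from_alpha(alpha):
    """Gini coefficient of a Pareto distribution: G = 1/(2*alpha - 3)."""
    return 1.0 / (2 * alpha - 3)

alpha = alpha_from_top_share(0.01, 0.20)   # top 1% own 20% of wealth
print(round(alpha, 3))                     # ≈ 2.537
print(round(gini_from_alpha(alpha), 2))    # ≈ 0.48
print(round(gini_from_alpha(2.23), 3))     # ≈ 0.685, the value quoted above
```

This confirms that the α and G pairs quoted in the text are internally consistent under the power-law model.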
Keith Payne in "The Broken Ladder" (2017) explains that a central part of the problem posed by inequality is the negative effect it has on the social climate, public health, reproductive health and economic efficiency. People on the lower rungs of the hierarchy react stressfully, trying to compensate for their underlying fear and anxiety by seeking riskier short-term gains and making less constructive decisions, such as forsaking educational opportunities which would otherwise both help them up the ladder and produce a skilled, efficient population, perceiving them to be valueless because they appear to lead only to unachievable outcomes. Health statistics correlate tightly with inequality but less so with poverty. The factors are multiple: more self-destructive lifestyles, a poorer diet, and less exercise, because constructive engagement with life is perceived as unattainable. Inequality also affects neurotransmitters and hormones in ways that lead towards socially disruptive and counterproductive outcomes. These trends correlate much more tightly with inequality than with poverty because it is not so much the direct effects of poverty on people, but the perception that players further up the ladder are reaping unfairly greater rewards and that there is no rational solution to the un-level playing field of inequality of opportunity.
"When people feel that tomorrow's uncertain or the resources today are scarce, they tend to focus on immediate rewards because who knows if it will still be there tomorrow?" Keith Payne/
These collateral effects can have long-term reproductive consequences. According to a study in the 1990s, people in poor suburbs, where lives are harder and shorter, have children earlier. As life expectancy decreased, so did women's age when they started having children. And the process is in positive feedback: women who grow up under stress or duress begin ovulating earlier.
Payne's hope is that democracies recognise the implications of income inequality and make real moves to correct it. He believes it should be treated as a public-health problem and addressed via policies that raise the minimum wage, expand early-childhood education and paid parental leave, cap executive pay and strengthen unions.
Fig 7b: The extent to which so much global wealth has become corralled by a virtual handful of the so-called 'global elite' is exposed in a 2014 report from Oxfam. The richest 85 people on the globe control $1tn - as much wealth as the poorest half of the 3.5 bn global population put together. The wealth of the 1% richest people in the world amounts to $110tn, 65 times as much as the poorest half of the world. See: http://topincomes.g-mond.parisschoolofeconomics.eu/ Bottom: In 2015 the richest 62 people earned as much as the bottom half of the world population, and the richest 1%, with cash and assets of over $760,000, earn as much as the rest put together. Both of these statistics equate to a power exponent of 2.15-2.17 and a Gini factor of 0.74-0.77, consistent with the higher estimates above, and demonstrating how consistently top incomes fit the power-law model. Critics say this accounting includes first-world people such as students with large debt, but the world map above tells a different story. Nevertheless, those in the middle and bottom of the world income distribution have all had pay rises of around 40% between 1988 and 2008. Global inequality of life expectancy and height is narrowing too - showing better nutrition and better healthcare where it matters most.
When we turn to the question of how this extreme disparity of personal income, combined with massive corporate earnings, influences US elections, we get into even more stormy political ground.
The dysfunctional presidency of Donald Trump is only the latest phenomenon of, and partly a product of, the very inequalities that his policies, in so far as he has been able to establish them, and those of the fractious Republican party he has manipulated, both tend and intend to accentuate. His maverick presidential bid, aided by shadowy behind-the-scenes actors such as the data-analytics firm Cambridge Analytica, enabled a narcissistic product of the world of privilege to falsely represent himself as a populist voice of the under-privileged, in a strategic finesse using the rust-belt electorates to achieve an electoral college win despite losing the popular vote by some 3 million votes. He then instituted policies from health care to tax reform that benefit only the rich, at the cost of the very voters who elected him, while seeking to unwind the measures protecting the planetary future by decapitating the EPA and reneging on the Paris accords, setting the US on a course of selfish isolation. This has been held in check only by the economic decline of the coal industry in the face of technological change, and by the concerted opposition of a coalition of US states, large corporate businesses and major cities - constituting over half the economic and energy powerhouse of the US as a whole - who seek to protect America's and the planet's future. It is an ongoing situation of systematic deceit, driven by a litany of lies, amid concerted attacks on the free press as purveyors of the very "fake news" these lies are the central expression of. But this gerrymandering of political realities has a longer, deeper history.
In 2010 the United States Supreme Court, in Citizens United v. Federal Election Commission, ruled in a split 5-4 decision that the First Amendment prohibits the government from restricting politically independent expenditures by corporations, associations, or labor unions. The conservative lobbying group Citizens United wanted to air a film critical of Hillary Clinton and to advertise the film during television broadcasts in apparent violation of the 2002 Bipartisan Campaign Reform Act.
The case did not involve the federal ban on direct contributions from corporations or unions to candidate campaigns or political parties, which remain illegal in races for federal office. However, an ABC–Washington Post poll conducted February 4–8, 2010, showed that 80% of those surveyed opposed (and 65% strongly opposed) the Citizens United ruling, which the poll described as saying "corporations and unions can spend as much money as they want to help political candidates win elections". Additionally, 72% supported "an effort by Congress to reinstate limits on corporate and union spending on election campaigns". The poll showed large majority support from Democrats, Republicans and independents (fig 7c).
Despite polls and comments from conservatives claiming support for the measure, the New York Times stated: "The Supreme Court has handed lobbyists a new weapon. A lobbyist can now tell any elected official: if you vote wrong, my company, labor union or interest group will spend unlimited sums explicitly advertising against your re-election." Journalist Jonathan Alter called it the "most serious threat to American democracy in a generation". The Christian Science Monitor wrote that the Court had declared "outright that corporate expenditures cannot corrupt elected officials, that influence over lawmakers is not corruption, and that appearance of influence will not undermine public faith in our democracy".
This brings into focus the lack of full independence of the US Supreme Court from political influence. Adam Liptak, in the New York Times article "Court Under Roberts Is Most Conservative in Decades", points out that Chief Justice Roberts's leadership of the Court, along with appointments such as Justice Alito - both nominated by George W Bush - has produced the politically conservative carry-on effect for which the Supreme Court is renowned.
In a swingeing aside in the HBO drama "The Newsroom", anchorman Will McAvoy comments that the next thing the conservatives will do, noting that unions can also contribute money, will be to wipe the unions out, citing the anti-union budget repair bill of the Republican Governor of Wisconsin, who won election over his Democratic rival with just 52% of the vote. Ironically, just as this article was being written, the unelected Republican National Committee Chairman Reince Priebus told NBC and CNN to ditch planned series potentially favourable to Hillary Clinton, after speculation mounted that she might run in the 2016 presidential election, warning the channels that Republicans could refuse to hold debates on the two networks.
Mathematical ecologist Peter Turchin (MacKenzie 2013) has developed a cyclical complex-systems model of human economic society which explains this increasing disparity between rich and poor and the development of increasingly polarized government, as characterized by the impasse shutting down the US government in 2013 and threatening a debt-ceiling default. He finds that a mathematical model combining economic output per person, the balance of labour demand and supply, and changes in attitudes towards redistributing wealth generates a curve that matches the change in real wages since 1930, including the complex rises and falls since 1980. Emerging from a new social 'deal' in an epoch of relative harmony, as population grows, workers start to outnumber available jobs, driving down wages. The wealthy elite then end up with an even greater share of the economic pie, and inequality soars. This process also creates new avenues - such as increased access to higher education - that allow a few workers to join the elite, swelling their ranks. Eventually this results in "elite overproduction" - more people in the elite than there are top jobs. The richest continue to become richer, as existing advantage feeds back positively to create yet more. The picture becomes increasingly predatory as the rest of the elite fight it out ever more fiercely. Such political acrimony is paralleled by rising discontent among workers and increasing state bankruptcy, as spending by the elite - who control the government coffers and don't wish to pay their share of taxes - spirals. Ultimately the situation gets so bad that order cannot be maintained and the state collapses. A new cycle begins.
In elections decided not by the total number of votes but by the number of electoral districts won, or their share of the electoral college vote, there is a massive incentive for incumbent politicians to cheat democracy by gerrymandering the electorate boundaries.
Gerrymanderers rig maps by “packing” and “cracking” their opponents. In packing, you cram many of the opposing party’s supporters into a handful of districts, where they’ll win by a much larger margin than they need. In cracking, you spread your opponent’s remaining supporters across many districts, where they won’t muster enough votes to win.
Fig 7c: Packing and cracking and electoral boundary measures (Arnold 2017 Klarreich 2017).
In 1986, the Supreme Court ruled extreme partisan gerrymanders unconstitutional. But without a reliable test for identifying unfair district maps, the Supreme Court historically has not intervened, as long as districts meet four criteria: they are contiguous; they are compact; they contain roughly the same number of people; and they give minority groups a chance to elect their own representatives in accordance with the Voting Rights Act of 1965.
To pick up the right combination of voters, cartographers may design districts that meander bizarrely, as was the case with the “salamander”-shaped district signed into law in 1812 by Massachusetts governor Elbridge Gerry. In an assortment of racial gerrymandering cases, the Supreme Court has acknowledged that crazy-looking shapes are an indicator of bad intent.
Many states require that districts should be reasonably “compact” wherever possible, but there’s no one mathematical measure of compactness that fully captures what these shapes should look like – some focus on a shape’s perimeter, others on how close the shape’s area is to that of the smallest circle around it, and still others on things like the average distance between residents. The Supreme Court justices have found it impossible on these bases to decide what shapes are too bad.
Mattingly and Graves developed a compactness score calculated as the length of a district's perimeter squared divided by its area - the reciprocal (up to the constant 4π) of what's known as the Polsby–Popper measure. A circle has the lowest possible ratio of squared perimeter to area, but as borders meander to include and exclude specific areas, the perimeter expands, giving a higher ratio (Arnold 2017).
The Supreme Court did rule in 2017 that North Carolina's 1st and 12th districts were products of racial gerrymandering. For about a decade, the state had had a relatively even split in its 13 electoral districts. Sometimes Democrats took six seats, sometimes seven. But Republican redistricting before the 2012 election packed Democrats into three districts, putting the party at a severe disadvantage (Mattingly & Vaughn 2014). Even though its candidates won 50.3% of the votes, the party captured only four seats.
In the summer of 2016, a bipartisan panel of retired judges met to see whether they could create a more representative set of voting districts for North Carolina. The judges' map, Mattingly found, was less gerrymandered than 75% of the computer-generated maps - a sign of a well-drawn, representative map. By comparison, every one of the 24,000 computer-drawn maps was less gerrymandered than either the 2012 or 2016 voting districts drawn by state legislators (Bangia et al. 2017).
There are also legitimate reasons why some districts are not compact: In many states, district maps are supposed to try to preserve natural boundaries such as rivers and county lines as well as “communities of interest,” and they must also comply with the Voting Rights Act’s protections for racial minorities. These requirements can lead to strange-looking districts — and can give cartographers latitude to gerrymander under the cover of satisfying these other constraints.
Neither do compact districts give any guarantee that the resulting map will be fair. Chen & Rodden (2013) demonstrate that even when districts are required to be compact, drawing biased maps is often easy, and sometimes almost unavoidable. For example, Democratic voters in the early 2000s were clustering into highly homogeneous neighborhoods in big cities like Miami and spreading their remaining support across suburbs and small towns that got swallowed up inside Republican-leaning districts. They were packing and cracking themselves.
Solving the gerrymandering problem thus requires ways to measure how biased a given map is. In a 2006 ruling, the Supreme Court offered tantalizing hints about what kind of measure it might look kindly on: one that captures the notion of “partisan symmetry,” which requires that each party have an equal opportunity to convert its votes into seats.
In June 2017 the Supreme Court declared that it will consider whether gerrymandered election maps favoring one political party over another violate the Constitution. The court accepted a case from Wisconsin, where a divided panel of three federal judges ruled last year that the state's Republican leadership in 2011 pushed through a plan so partisan that it violated the Constitution's First Amendment and equal rights protections. The court's action comes at a time when the relatively obscure subject of reapportionment has taken on new significance, with many blaming the drawing of safely partisan seats for a polarized and gridlocked Congress. Plaintiffs claim "Republican legislative leaders authorized a secretive and exclusionary mapmaking process aimed at securing for their party a large advantage that would persist no matter what happened in future elections." The plaintiffs are pushing the "efficiency gap" as a measure to determine how Republican mapmakers hurt Democrats.
Fig 7d: Increasing Republican bias in the efficiency gap.
Stephanopoulos & McGhee (2015) have proposed a simple measure of partisan symmetry, called the "efficiency gap," which tries to capture just what it is that gerrymandering does. At its core, gerrymandering is about wasting your opponent's votes: packing them where they aren't needed and spreading them where they can't win. A vote is considered wasted if it is cast in a losing district, or if it exceeds the 50 percent threshold needed in a winning district; the efficiency gap is the difference between the two parties' totals of wasted votes, expressed as a percentage of total votes cast. The two gerrymandering strategies ensure that the opposition party either stays below the 50% threshold in a district (cracking) or is concentrated near the 100% level (packing). Either tactic forces the other party to waste votes on losing candidates or on winning candidates who don't need them, and the efficiency gap captures the relative numbers of these wasted votes.
Many states require that districts be "compact," a term with no fixed mathematical definition. As noted above, in 1991 Daniel Polsby and Robert Popper proposed 4πA/P² as a way to measure the compactness of a district with area A and perimeter P. Values range from 1, for a circular district, to close to zero, for misshapen districts with long perimeters. Meanwhile, Nicholas Stephanopoulos and Eric McGhee introduced the "efficiency gap" in 2014 as a measure of the political fairness of a redistricting plan. These are both useful measures for detecting gerrymandering. But in 2018, Boris Alexeev and Dustin Mixon proved that "sometimes, a small efficiency gap is only possible with bizarrely shaped districts." That is, it is mathematically impossible to always draw districts that meet certain Polsby–Popper and efficiency-gap fairness targets.
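The Polsby–Popper score is simple enough to sketch directly. The shapes below are idealised (a circle and a long thin rectangle); real scores are computed from GIS boundary data for actual districts:

```python
import math

def polsby_popper(area, perimeter):
    """Polsby-Popper compactness 4*pi*A / P**2: exactly 1 for a circle,
    approaching 0 for sprawling shapes with long, meandering perimeters."""
    return 4 * math.pi * area / perimeter ** 2

# A circle of radius 1: area pi, perimeter 2*pi -> the maximum score, 1.0
print(polsby_popper(math.pi, 2 * math.pi))       # 1.0
# A 100 x 1 rectangle: area 100, perimeter 202 -> highly non-compact
print(round(polsby_popper(100.0, 202.0), 3))     # ≈ 0.031
```

The circle is the unique maximiser (the isoperimetric inequality), which is why the measure is normalised so that 1 means "as compact as possible".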
If you wanted to engineer victories for your party, your strategy would be to minimize the wasted votes for your party and maximize the wasted votes for your opponent. To this end, opposition votes are packed into a small number of conceded districts, and the remaining block of votes is cracked and spread out thinly over the rest of the districts to minimize their impact. This practice naturally creates large efficiency gaps, so we might expect fairer distributions to have smaller ones.
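The wasted-vote arithmetic described above can be sketched in a toy two-party model (the vote counts are made up for illustration; in practice the measure is applied district by district to actual election returns):

```python
def wasted_votes(a, b):
    """Wasted votes for parties A and B in one two-party district:
    all of the loser's votes, plus the winner's surplus over the 50% mark."""
    half = (a + b) / 2
    if a > b:
        return a - half, b
    return a, b - half

def efficiency_gap(districts):
    """(wasted_A - wasted_B) / total votes cast; a large negative value
    means the map wastes far more B votes, i.e. it favours party A."""
    wa = wb = total = 0
    for a, b in districts:
        da, db = wasted_votes(a, b)
        wa, wb, total = wa + da, wb + db, total + a + b
    return (wa - wb) / total

# Party A cracks B's support across four 60-40 districts
# and packs the rest into a single 10-90 district.
plan = [(60, 40)] * 4 + [(10, 90)]
print(efficiency_gap(plan))   # -0.3: a 30% gap in favour of party A
```

Here party A wins 4 of 5 seats with 58% of the vote while wasting only 50 votes to B's 200, so the packing-and-cracking shows up directly as a large efficiency gap.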
Stephanopoulos & McGhee have calculated the efficiency gaps for nearly all the congressional and state legislative elections between 1972 and 2012 and state that the efficiency gaps of today’s most egregious plans dwarf those of their predecessors in earlier cycles. They have proposed the efficiency gap as the centerpiece of a simple standard the Supreme Court could adopt for partisan gerrymandering cases. To be considered an unconstitutional gerrymander, they suggest, a district plan must first be shown to exceed some chosen efficiency gap threshold, to be determined by the court. Second, since efficiency gaps tend to fluctuate over the decade that a district map is in force, the plaintiffs must show that the efficiency gap is likely to favor the same party over the entire decade, even if voter preferences shift about somewhat. If these two requirements are met, the burden would then fall on the state to explain why it created such a biased plan. They state that this “would neatly slice the Gordian knot the Court has tied for itself,” by explicitly laying down just how much partisan effect is too much.
The efficiency gap is a simple metric that may go on to shape the future of US democracy. But it's not without its drawbacks. It turns a blind eye to gerrymandering committed by both parties, for example, so long as they offend equally, and struggles to produce meaningful results when one party has a genuinely dominant majority. For some, the efficiency gap also fails because it assumes that voters are entitled to a specific relationship between vote share and representation.
Fig 7e: US 2016 Presidential Election (http://metrocosm.com/election-2016-map-3d/). Perfect asymmetric electoral warfare. Color = winner and margin of victory; height = total number of votes (all candidates). The map confirms extreme asymmetry between the electorate membership of Democrats and Republicans, leading to unequal wasted votes, able to be exploited by astute computational modelling by supporting organizations such as Cambridge Analytica, owned by the Mercer family.
To attempt to determine whether gerrymandering is intentional, Cho and Liu (2016) unveiled a simulation algorithm that generates a large number of maps to compare with any given districting map, to determine whether it is an outlier. In that paper they drew 250 million imperfect but reasonable congressional district maps for Maryland, whose existing plan is being challenged in court. Nearly all their maps, they found, are biased in favor of Democrats. But the official plan is even more biased, favoring Democrats more strongly than 99.79 percent of the algorithm's maps - a result extremely unlikely to occur in the absence of an intentional gerrymander.
Herschlag, Ravier & Mattingly (2017) used a similar algorithm to randomly draw 20,000 possible electoral maps for Wisconsin that satisfied all of the required criteria laid down in US law. In most of these the Republicans won a majority, making it seem like the Democrats were just at a natural disadvantage - but typically only a narrow one, and the maps replicated the Republicans' actual 2014 margin of victory in only a very small number of cases. This means that the Wisconsin electoral map is a clear outlier and therefore likely to have been gerrymandered. For the mathematicians who worked on the algorithm, this statistical analysis is vitally important yet has been largely ignored in favour of the efficiency gap, although one has noted that the justices did take notice, and they are the ones who matter.
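The ensemble outlier test can be illustrated with a toy sketch. The precinct vote counts below are invented, and random equal-size partitions stand in for the lawful contiguous maps the published algorithms actually sample - a deliberate simplification of the Cho-Liu and Herschlag-Ravier-Mattingly approach:

```python
import random

def seats(plan, votes):
    """Seats won by party A under a plan (a list of precinct-index lists)."""
    return sum(
        sum(votes[i][0] for i in d) > sum(votes[i][1] for i in d)
        for d in plan
    )

def random_plan(n_precincts, n_districts, rng):
    """Random equal-size partition of precincts (ignores contiguity - a toy)."""
    idx = list(range(n_precincts))
    rng.shuffle(idx)
    size = n_precincts // n_districts
    return [idx[k * size:(k + 1) * size] for k in range(n_districts)]

def outlier_fraction(enacted, votes, n_districts, trials=1000, seed=0):
    """Fraction of the random ensemble giving party A strictly more seats
    than the enacted plan - a large fraction flags the plan as an outlier."""
    rng = random.Random(seed)
    s = seats(enacted, votes)
    return sum(
        seats(random_plan(len(votes), n_districts, rng), votes) > s
        for _ in range(trials)
    ) / trials

# Party A is strong in precincts 0-3, party B elsewhere; A leads 640-560 overall.
votes = [(80, 20)] * 4 + [(40, 60)] * 8
packed = [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11]]  # A packed into one district
print(seats(packed, votes))   # 1 of 3 seats despite a statewide majority
```

Running `outlier_fraction(packed, votes, 3)` then shows that an appreciable share of random plans give party A two seats, so the packed plan sits at the low extreme of the ensemble - the signature of an intentional gerrymander.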
Bernard Grofman has developed a five-pronged gerrymandering test that distills the key elements of the Wisconsin case. Three prongs are similar to Stephanopoulos and McGhee: evidence of partisan bias, indications that the bias would likely endure for the whole decade, and the existence of at least one replacement plan that would remedy the existing plan’s bias. To these, Grofman adds: simulations showing that the plan is an extreme outlier, suggesting that the gerrymander was intentional, and evidence that the people who made the map knew they were drawing a much more biased plan than necessary.
Jowei Chen and David Cottrell used simulations to measure the extent of intentional gerrymandering in congressional district maps across most of the 50 states, and found that at the national level it mostly cancelled out, though not at the state level. Banning unintentional gerrymandering as well would lead to a more radical redrawing of district maps that could potentially make a significant change to the membership of the House.
"If the court rules the Wisconsin map unconstitutional under a particular test," says Joshua Douglas at the University of Kentucky, "then that will place an outer limit on the worst abuses in partisan gerrymandering." This would have the biggest effects in swing states such as Wisconsin, North Carolina and Maryland. "The ruling would ultimately produce fairer maps, which also will likely give average Americans more confidence in the election process," says Douglas.
9 Jan 2018: A panel of federal judges struck down North Carolina's congressional map on Tuesday, condemning it as unconstitutional because Republicans had drawn the map seeking a political advantage. The ruling was the first time that a federal court had blocked a congressional map because of a partisan gerrymander, and it instantly endangered Republican seats in the coming elections. Judge James A. Wynn Jr., in a biting 191-page opinion, said that Republicans in North Carolina's Legislature had been "motivated by invidious partisan intent" as they carried out their obligation in 2016 to divide the state into 13 congressional districts, 10 of which are held by Republicans. The result, Judge Wynn wrote, violated the 14th Amendment's guarantee of equal protection. The unusually blunt decision by the panel could lend momentum to two other challenges on gerrymandering that are already before the Supreme Court — and that the North Carolina case could join if Republicans make good on their vow to appeal Tuesday's ruling. In October, the court heard an appeal of another three-judge panel's ruling that Republicans had unconstitutionally gerrymandered Wisconsin's State Assembly in an attempt to relegate Democrats to a permanent minority. In the second case, the justices will hear arguments by Maryland Republicans that the Democratic-controlled Legislature redrew House districts to flip a Republican-held seat to Democratic control (North Carolina Congressional Map Ruled Unconstitutionally Gerrymandered NY Times).
Fig 7f: 20 Feb 2018 Pennsylvania's Supreme Court has redrawn the map of the state's congressional districts, overturning a Republican gerrymander that's been used in the past three congressional elections. The new map more closely reflects the partisan composition of the state, all but ensuring that Democrats will pick up several new U.S. House seats in November. It's also more compact than Republicans' original map, and it splits fewer counties and municipal areas — a key concern of the court as it sought to ensure voters' ability to participate in "free and equal" elections.
20 Feb 2018: After the 2010 Census, Republicans in control of Pennsylvania's legislature drew a map that gave them control of 13 of the state's 18 House seats, despite the fact that according to the state government, there are currently 4 million registered Democrats in Pennsylvania and only 3.2 million registered Republicans. The state Supreme Court ruled that the map violated the state's constitution, then ordered the legislature and the governor to come up with one that treated all voters fairly. When they couldn't agree on one, the court did it itself, with the help of Nathaniel Persily, a Stanford Law School professor and redistricting expert who has provided the same service to other courts. When your starting point is a map as skewed as Pennsylvania's, any attempt to reset to something fair looks like a pro-Democratic effort.
As the New York Times described the consequences, "Over all, a half-dozen competitive Republican-held congressional districts move to the left, endangering several incumbent Republicans, one of whom may now be all but doomed to defeat, and improving Democratic standing in two open races." Naturally, Republicans are portraying this as an unconscionably unfair decision. The National Republican Congressional Committee announced that "State and federal GOP officials will sue in federal court as soon as tomorrow to prevent the new partisan map from taking effect." (Waldman P 2018 A big gerrymandering case raises a profound question about our elections; Ingraham C 2018 Pennsylvania Supreme Court draws 'much more competitive' district map to overturn Republican gerrymander, Washington Post 20 Feb 2018).
And President Trump is outraged, tweeting:
Hope Republicans in the Great State of Pennsylvania challenge the new "pushed" Congressional Map, all the way to the Supreme Court, if necessary. Your Original was correct! Don't let the Dems take elections away from you so that they can raise taxes & waste money!
March 19 2018 The Supreme Court on Monday turned down a request from Republican legislative leaders in Pennsylvania to block the implementation of a redrawn congressional map that creates more parity between the political parties in the state. The 2018 elections are thus likely to be held under a map much more favorable to Democrats, who scored an apparent victory last week in a special election in a strongly Republican state district. The 2011 map that has been used this decade has resulted in Republicans consistently winning 13 of the state's 18 congressional seats. Monday's action was the second time that the court declined to get involved in the partisan battle that has roiled Pennsylvania politics. The commonwealth's highest court earlier this year ruled that a map drawn by Republican leaders in 2011 "clearly, plainly and palpably" violated the free-and-equal-elections clause of the Pennsylvania Constitution. The U.S. Supreme Court deliberated nearly two weeks before turning down the request to stop the map from being used in this fall's elections. Generally the justices stay out of the way when a state's highest court is interpreting its own state constitution. The action came shortly after a three-judge federal panel also turned down a separate attempt by Republican legislators and members of Congress to stop implementation of the map.
26 March 2018 The once-per-decade U.S. Census will ask about people's citizenship status in 2020, the Commerce Department announced late Monday. It is a decision that carries potentially major political ramifications — most notably for Republicans' ability to gerrymander Democrats into the minority for years to come. And some are crying foul. Because Republicans already have a significant edge on the congressional and state legislative maps, thanks to how our population is distributed and to the GOP having earned the power to redraw lots of the new maps after the 2010 Census. And this could significantly increase their advantages for two reasons:
1. It might dissuade noncitizens from participating in the census, thereby diluting the political power of the (mostly urban and Democratic) areas they come from.
In its 2016 ruling on Evenwel v. Abbott, the court unanimously ruled that the “one person, one vote” standard did not require states to draw legislative districts according to the voting-eligible population – that is, without including noncitizens and children. That was a win for Democrats. In the ruling, the Supreme Court did not prohibit states from drawing state legislative districts by that standard, and Justice Samuel Alito suggested that would be a future question for the court to decide.
How widely congressional districts varied by percentage of voting-eligible population.
To be precise, federal courts have long ruled that congressional districts must use total population, so this is not about them – at least not directly. At issue is only whether state legislative districts may be drawn by voting-eligible population. The GOP's domination of state legislatures is how it has gained the power to draw so many favorable congressional maps, so allowing Republicans to draw more favorable state legislative maps generally means more favorable congressional ones – and a potentially more resilient GOP majority. That could have a major impact in certain areas, because the current method has resulted in some Democratic districts with far fewer eligible voters than Republican ones.
30 June 2018 Dana Milbank in the Washington Post notes: "Now we have a Supreme Court nomination — the second in as many years — from an unpopular president who lost the popular vote by 2.8 million. The nominee will be forced through by also-unpopular Senate Republicans, who, like House Republicans, did not win a majority of the vote in 2016. Compounding the outrage, each of the prospective nominees is all but certain, after joining the court, to support the eventual overturning of Roe v. Wade, which has held the nation together in a tenuous compromise on abortion for 45 years and is supported by two-thirds of Americans. For good measure, the new justice may well join the other four conservative justices in revoking same-sex marriage, which also has the support of two-thirds of Americans. And this comes after the Republicans essentially stole a Supreme Court seat by refusing to consider President Barack Obama's nominee, Merrick Garland, to a vacant seat. You can only ignore the will of the people for so long and get away with it. Republicans have been defying gravity for some time. As New York magazine's Jonathan Chait reminds us in a smart piece, they lost the popular vote in six of the last seven presidential elections. Electoral college models show Republicans could plausibly continue to win the White House without popular majorities. Because of partisan gerrymandering and other factors, Democrats could win by eight percentage points and still not gain control of the House, one study found. And the two-senators-per-state system (which awards people in Republican Wyoming 70 times more voting power than people in Democratic California) gives a big advantage to rural, Republican states."
27 August 2018 A panel of three federal judges held Monday that North Carolina's congressional districts were unconstitutionally gerrymandered to favor Republicans over Democrats and said it may require new districts before the November elections, possibly affecting control of the House. The judges acknowledged that primary elections have already produced candidates for the 2018 elections but said they were reluctant to let voting take place in congressional districts that courts twice have found violate constitutional standards. The North Carolina case is a long-running saga, with a federal court in 2016 striking down the legislature's 2011 map as a racial gerrymander. The legislature then passed a plan that left essentially the same districts in place but said lawmakers were motivated by politics, not race. The Supreme Court told the three-judge panel to take another look at the North Carolina case in light of the high court's June 2018 decision in a Wisconsin partisan gerrymandering case, in which the justices said those who brought that case did not have legal standing.
Fig 7h: The North Carolina gerrymandered vote in 2018
In considering a Republican-drawn map from Wisconsin and a Democratic effort in Maryland, the court had raised the possibility of producing a landmark change in the way the nation's elections are conducted. Chief Justice John G. Roberts Jr. said that the plaintiffs in Wisconsin had not shown they were hurt individually by the legislature's actions, a necessary component for courts to intervene, but the justices left the door open for future challenges to partisan gerrymanders. However, Judge James A. Wynn Jr. of the U.S. Court of Appeals for the 4th Circuit, writing Monday for a special three-judge district court panel, said plaintiffs did have standing under the decision in Wisconsin's Gill v. Whitford, which he said reinforced the judges' earlier views that the congressional districts were drawn with improper partisan goals.
May 3 2019 A unanimous panel of federal judges on Friday declared Ohio's Republican-drawn congressional map unconstitutional, a ruling similar to those in a number of states where partisan gerrymandering has been outlawed. "We join the other federal courts that have held partisan gerrymandering unconstitutional and developed substantially similar standards for adjudicating such claims. We are convinced by the evidence that this partisan gerrymander was intentional and effective and that no legitimate justification accounts for its extremity." Last month, a similar panel of federal judges in Michigan found that some of that state's legislative and congressional maps were unconstitutional gerrymanders, and it appeared the lower courts were attempting to send a message to the high court. The Supreme Court in March heard arguments in similar cases from North Carolina - where judges found Republicans manipulated the maps to their advantage - and Maryland, where Democratic lawmakers drew a district that resulted in a loss for a longtime Republican congressman. At the oral arguments, the conservative justices who make up the Supreme Court's majority seemed skeptical that the court could find a manageable test for deciding when politics plays an unconstitutional role in map drawing.
May 31 2019 Thomas B. Hofeller achieved near-mythic status in the Republican Party as the Michelangelo of gerrymandering, the architect of partisan political maps that cemented the party's dominance across the country. But after he died last summer, his estranged daughter discovered hard drives in her father's home that revealed that Mr. Hofeller had played a crucial role in the Trump administration's decision to add a citizenship question to the 2020 census. Files on those drives showed that he wrote a study in 2015 concluding that adding a citizenship question to the census would allow Republicans to draft even more extreme gerrymandered maps to stymie Democrats. And months after urging President Trump's transition team to tack the question onto the census, he wrote the key portion of a draft Justice Department letter claiming the question was needed to enforce the 1965 Voting Rights Act - the rationale the administration later used to justify its decision. Those documents, cited in a federal court filing by opponents seeking to block the citizenship question, have emerged only weeks before the Supreme Court is expected to rule on the legality of the citizenship question. Critics say adding the question would deter many immigrants from being counted and shift political power to Republican areas (www.nytimes.com/2019/05/30/us/census-citizenship-question-hofeller.html).
28 June 2019 The US Supreme Court has blocked the Trump administration from adding a question on citizenship to the 2020 census for the time being. Chief Justice John Roberts joined the court's liberal justices, saying the administration did not provide adequate justification for the question. The reason provided by the White House seemed "contrived", the justices wrote in a 5-4 ruling. US President Donald Trump responded by saying he would try to delay the census.
28 June 2019 Armed with a five to four conservative majority thanks to President Donald Trump's appointment of Neil Gorsuch and Brett Kavanaugh over the past two years, the Supreme Court said it had no role to play in partisan gerrymandering - a decision that amounts to a massive political victory for Republicans, not just in the moment, but also likely for the next decade-plus. While the court didn't give Republicans everything they wanted on Thursday - rejecting the addition of a citizenship question to the census that the Trump administration had pushed for - the ruling on line-drawing with political concerns as a primary motivation is an absolute game-changer for a party that has already reaped the considerable rewards of its ongoing domination at the state legislative level. What SCOTUS said Thursday was, essentially, that if state legislators want to draw the lines of their own districts and those of their members of Congress using political calculations, it's not the court's job to stop them: state legislatures are given that power and can exert it as they see fit.
According to the National Conference of State Legislatures, Republicans currently have full control over 30 of the 49 partisan legislatures in the country. (Nebraska has a unicameral legislature where members are elected on a nonpartisan basis.) In 22 states, Republicans not only control both chambers of the state legislature but also hold the governorship -- giving them total control over state government. (Democrats have total control in 14 states while control is divided between the parties in 13 states.)
We now turn to the relationship between libertarian capitalism, with its in-principle rejection of regulation, and the events of the 2007-8 financial crisis.
After the great stock market crash of 1929, it was recognized that there was a jeopardy in firms acting both as risk-taking investment banks and as commercial banks holding customers' deposits. For example, the Goldman Sachs Trading Corp., a closed-end fund created by Goldman, failed as a result of the Stock Market Crash, hurting the firm's reputation for years afterward. Of this case and others like it John Kenneth Galbraith wrote: "The Autumn of 1929 was, perhaps, the first occasion when men succeeded on a large scale in swindling themselves."
The 1933 Banking Act, often referred to as the Glass–Steagall Act after its sponsors and enacted in response to the great stock market crash, is remembered predominantly for four provisions that separated commercial banking from investment banking, with deposit insurance also separated from both. The Act was long criticized for limiting competition and thereby encouraging an inefficient banking industry; supporters cite it as a central cause of the unprecedented period of stability in the US banking system over the four to five decades following 1933.
By the time of Bill Clinton’s presidency, many features of the Act had been watered down by a series of cases in which lenient regulatory interpretations and the use of loopholes gave banks increasing powers to act as investment firms. Most of it was repealed in 1999 by the Gramm–Leach–Bliley Act (GLBA). Clinton’s signing statement for the GLBA summarized the established argument for repeal, stating that this change would “enhance the stability of our financial services system” by permitting financial firms to “diversify their product offerings and thus their sources of revenue” and make financial firms “better equipped to compete in global financial markets.”
However, commentators pointed to the Enron, WorldCom, and other corporate scandals of the early 2000s as exposing the dangers of uniting commercial and investment banking. Later, as financial crises and other issues played out in the United States and worldwide, arguments broke out about whether Glass–Steagall, as originally intended, would have prevented these problems. After the GLBA became law, many commentators expected investment banking firms would need to convert into bank holding companies (and qualify as financial holding companies) to compete with commercial bank affiliated securities firms. It was only after the financial crisis of 2007-8 that this happened.
Fig 8: Graphical portrait of the sub-prime and financial crisis of 2007-8 showing (right) the declines in US stock markets, key investment bankers and connected trading banks in the 2007-8 financial crisis. (Left and centre from top) are shown increasing leverages on borrowed capital of the five investment bankers, the final decline in 2008 of Merrill Lynch and Lehman Bros. (the former ending at $10 when taken over by the Bank of America and the latter entering bankruptcy), the 2007 decline in mortgage-backed bonds, a pie chart breakdown of the mortgage bond market, the exploding TED spread - the difference between the interest rates on interbank loans (measured by the Libor) and short-term U.S. government debt, four maps of the effect of the crisis in Cleveland, the rise of the sub-prime market along with home ownership, trends in house prices and repossessions during the crisis. (Centre) The then heads of Merrill Lynch, Goldman Sachs and Morgan Stanley depicted by Vanity Fair during an outcry about corporate executive bonuses after the companies were bailed out by the federal government (Shnayerson).
The 2008 stock market crash and financial crisis was caused not by an electronic trading instability but by fundamentals of deceit and greed in the entire financial industry, from investment brokers to major trading banks. Subprime mortgages were mortgages given to borrowers at higher risk of being unable to pay the money back. These high-risk loans, which became increasingly popular in the US in the years before the financial crisis (fig 8), were repackaged by banks into more complex mortgage investments and sold on to other banks, causing chaos in the banking system when borrowers began to default.
The Global Financial Crisis of 2007–2008 is considered to be the worst financial crisis since the Great Depression of the 1930s. It resulted in the threat of total collapse of large financial institutions, the bailout of banks by national governments, and downturns in stock markets around the world. In many areas, the housing market also suffered, resulting in evictions, foreclosures and prolonged unemployment. The crisis played a significant role in the failure of key businesses, declines in consumer wealth estimated in trillions of US dollars, and a downturn in economic activity leading to the 2008–2012 global recession and contributing to the European sovereign-debt crisis.
It was triggered by a complex interplay of policies that encouraged home ownership, providing easier access to loans for subprime borrowers, overvaluation of bundled sub-prime mortgages based on the theory that housing prices would continue to escalate, questionable trading practices on behalf of both buyers and sellers, compensation structures that prioritize short-term deal flow over long-term value creation, and a lack of adequate capital holdings from banks and insurance companies to back the financial commitments they were making.
The immediate cause of the crisis was the bursting of the United States housing bubble, which peaked in 2005–2006. Already-rising default rates on "subprime" and adjustable-rate mortgages (ARMs) began to increase quickly thereafter.
Easy availability of credit in the US, fuelled by large inflows of foreign funds after the Russian debt crisis and Asian financial crisis of the 1997–1998 period, led to a housing construction boom and facilitated debt-financed consumer spending fueled by lax lending standards and rising real estate prices.
While the housing and credit bubbles were building, a series of factors caused the financial system to both expand and become increasingly fragile. U.S. Government policy from the 1970s onward emphasized deregulation to encourage business, which resulted in less oversight of activities and less disclosure of information about new activities. Deregulation particularly favoured the increasingly important financial institutions such as investment banks and hedge funds - the so-called shadow banking system - which had become comparable in importance to commercial (depository) banks in providing credit to the U.S. economy, but were not subject to the same regulations.
A new process of mortgage financing began to play an increasing role. In traditional mortgage financing a bank offers a first mortgage secured against the property using deposit money from other customers. With banking deregulation, mortgaged properties were bundled up into tranches and on-sold to other financial organizations as mortgage bonds. The number of financial agreements called mortgage-backed securities (MBS) and collateralized debt obligations (CDO), which derived their value from mortgage payments and housing prices, greatly increased. These contained tiers of undertakings of varying security, rated on the Standard & Poor's scale from AAA down to B-.
The aim in doing this was to spread risk by offering an instrument combining a graduated proportion of prime mortgage stock, carrying lower interest rates and higher security, with additional lower-rated stock carrying greater risk and returns. However, the lack of any real oversight led to both self-deception and greed on the part of operators, who were able to disguise increasing risk of default by draining badly performing tiers of existing CDOs into successively rebundled CDOs-squared (CDO²) and even CDOs-cubed (CDO³).
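The tranching mechanism can be sketched numerically. The following is a minimal illustration with hypothetical figures (not any firm's actual model) of how losses on the underlying mortgage pool are absorbed from the most junior tranche upward, so that the senior AAA tranche appears safe until losses grow large:

```python
def allocate_losses(tranches, pool_loss):
    """Apply a pool loss to tranches from most junior to most senior.

    tranches: list of (name, size) pairs, ordered senior-first.
    Returns a dict mapping tranche name to the loss it absorbs.
    """
    losses = {}
    remaining = pool_loss
    for name, size in reversed(tranches):  # junior tranches absorb losses first
        hit = min(size, remaining)
        losses[name] = hit
        remaining -= hit
    return losses

# A hypothetical $100m mortgage pool cut into three tranches.
structure = [("AAA senior", 80.0), ("BBB mezzanine", 15.0), ("equity", 5.0)]

print(allocate_losses(structure, 4.0))   # small loss: only the equity tranche is hit
print(allocate_losses(structure, 25.0))  # large loss: even the 'safe' AAA tranche loses 5
```

The same subordination logic, applied to CDOs whose collateral was itself mezzanine CDO paper (the squared and cubed instruments above), is what allowed deteriorating default risk to be hidden several layers deep.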
Some organizations also engaged in predatory lending - the practice of unscrupulous lenders enticing borrowers into "unsafe" or "unsound" secured loans for inappropriate purposes. A classic bait-and-switch method was used by Countrywide Financial, purchased by the Bank of America in 2008, which advertised low interest rates for home refinancing. Whereas the advertisement might state that 1.5% interest would be charged, the consumer would be put into an adjustable-rate mortgage (ARM) in which the interest accruing was greater than the interest being paid. This created negative amortization - a loan balance that grows rather than shrinks - which the credit consumer might not notice until long after the loan transaction had been consummated. In 2006 Countrywide financed 20% of all mortgages in the United States, at a value of about 3.5% of United States GDP, a proportion greater than any other single mortgage lender.
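Negative amortization is easy to see with a little arithmetic. A sketch with illustrative figures (not Countrywide's actual loan terms): if the borrower pays at a 1.5% teaser rate while interest actually accrues at 7%, the monthly shortfall is added to principal and the debt grows:

```python
def balance_after(principal, accrual_rate, payment_rate, months):
    """Track a loan balance when the monthly payment covers less
    than the interest actually accrued (negative amortization)."""
    for _ in range(months):
        interest = principal * accrual_rate / 12  # interest actually charged
        payment = principal * payment_rate / 12   # teaser-rate payment made
        principal += interest - payment           # unpaid interest rolls into principal
    return principal

# A $300,000 loan paid at 1.5% while 7% accrues: after two years the
# borrower owes noticeably more than they originally borrowed.
print(round(balance_after(300_000, 0.07, 0.015, 24)))
```

Only when the payment rate matches the accrual rate does the balance hold steady, which is why the advertised teaser rate could conceal a steadily deepening debt.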
Former employees of Ameriquest, then the United States' leading wholesale lender, described a system in which they were pushed to falsify mortgage documents and then sell the mortgages to Wall Street banks eager to make fast profits. Critics such as economist Paul Krugman and U.S. Treasury Secretary Timothy Geithner have argued that the regulatory framework did not keep pace with financial innovation, such as the increasing importance of the shadow banking system, derivatives and off-balance sheet financing.
From 2004 to 2007, the top five U.S. investment banks each significantly increased their financial leverage (fig 8), increasing their appetite for risky investments and reducing their resilience in case of losses, amplifying their vulnerability to a financial shock. Much of this leverage was achieved using complex financial instruments such as off-balance sheet securitization and derivatives, which made it difficult for creditors and regulators to monitor and try to reduce financial institution risk levels. These instruments also made it virtually impossible to reorganize financial institutions in bankruptcy, and contributed to the need for government bailouts. Changes in capital requirements, intended to keep U.S. banks competitive with their European counterparts, allowed lower risk weightings for AAA securities. The shift from first-loss tranches to AAA tranches was seen by regulators as a risk reduction that compensated for the higher leverage.
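Why such leverage was so dangerous can be shown with simple arithmetic. A sketch with round, purely illustrative numbers (not any bank's actual balance sheet): at 30:1 leverage, a firm's equity is only 1/30 of its assets, so a fall of a few percent in asset values wipes it out entirely:

```python
def equity_after_shock(assets, leverage, asset_loss):
    """Remaining equity after asset values fall by asset_loss (same units).

    leverage = assets / equity, so equity = assets / leverage;
    every dollar of asset loss comes straight out of equity.
    """
    equity = assets / leverage
    return equity - asset_loss

# A firm with $300bn of assets at roughly 30:1 leverage holds only $10bn equity.
print(equity_after_shock(300, 30, 6))   # a 2% fall in assets leaves $4bn equity
print(equity_after_shock(300, 30, 12))  # a 4% fall wipes it out: -$2bn, insolvent
```

The same shock that a conventionally capitalized bank could absorb thus rendered the highly leveraged investment banks insolvent almost overnight.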
These five institutions reported over $4.1 trillion in debt for the fiscal year 2007, about 30% of USA nominal GDP for the year. Bear Stearns was sold to JPMorgan Chase and then Merrill Lynch was sold to the Bank of America at fire-sale prices, Lehman Brothers was liquidated, and Goldman Sachs and Morgan Stanley became commercial banks, subjecting themselves to more stringent regulation in line with the GLBA. With the exception of Lehman, these companies required or received government support. Fannie Mae and Freddie Mac, two U.S. Government sponsored enterprises, owned or guaranteed nearly $5 trillion in mortgage obligations at the time they were placed into conservatorship by the U.S. government in September 2008, including 13 million substandard loans totalling over $2 trillion. These seven entities were highly leveraged and had $9 trillion in debt or guarantee obligations, yet they were not subject to the same regulation as depository banks.
Let us now look in a little more detail at the fates of these five central players.
In the years leading up to the crisis Bear Stearns issued large amounts of asset-backed securities, a market pioneered by Lewis Ranieri, "the father of mortgage securities". As investor losses mounted in those markets in 2006 and 2007, the company actually increased its exposure, especially to the assets that were central to the subprime mortgage crisis. In March 2008, the Federal Reserve Bank of New York provided an emergency loan to try to avert a sudden collapse of the company. However, the company could not be saved and was sold to JP Morgan Chase for $10 per share, far below its pre-crisis 52-week high of $133.20 per share, though not as low as the $2 per share originally agreed upon.
Before declaring bankruptcy in 2008, Lehman was the fourth-largest investment bank in the US. On June 9, 2008, Lehman Brothers announced a US$2.8 billion second-quarter loss, its first since being spun off from American Express, as market volatility rendered many of its hedges ineffective. Investor confidence continued to erode as Lehman's stock lost roughly half its value and pushed the S&P 500 down 3.4% on September 9. The U.S. government did not announce any plans to assist with any possible financial crisis that emerged at Lehman. The next day, Lehman announced a loss of $3.9 billion and its intent to sell off a majority stake in its investment-management business, which included Neuberger Berman. The stock slid 7% that day. Lehman, after earlier rejecting questions on the sale of the company, was reportedly searching for a buyer as its stock price dropped another 40% on September 11. Executives at Neuberger Berman sent e-mails suggesting that the Lehman Brothers' top people forgo multi-million dollar bonuses to "send a strong message to both employees and investors that management is not shirking accountability for recent performance." The Lehman Brothers Investment Management Director dismissed the proposal as ridiculous: "Sorry team. I am not sure what's in the water at Neuberger Berman. I'm embarrassed and I apologize." On Sept 15, the firm filed for Chapter 11 bankruptcy protection following the massive exodus of most of its clients, drastic losses in its stock, and devaluation of its assets by credit rating agencies. According to Wikipedia it is the largest bankruptcy in U.S. history, and is thought to have played a major role in the unfolding of the global financial crisis.
During hearings, former CEO Richard Fuld claimed a host of factors including a crisis of confidence and naked short-selling attacks (selling shares one has not even borrowed, with the aim of buying them back at a lower price after the market marks them down) followed by false rumours contributed to both the collapse of Bear Stearns and Lehman Brothers. House committee Chairman Henry Waxman said documents received portray a company in which there was “no accountability for failure". A March 2010 report by the court-appointed examiner indicated that Lehman executives regularly used cosmetic accounting gimmicks at the end of each quarter to make its finances appear less shaky than they really were, in a repurchase agreement that temporarily removed securities from the company's balance sheet. However, unlike typical repurchase agreements, these were described by Lehman as the outright sale of securities and created "a materially misleading picture of the firm’s financial condition in late 2007 and 2008."
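The examiner's report named this gimmick "Repo 105". Its effect on reported leverage can be sketched with round, purely illustrative numbers (not Lehman's actual figures): booking the repo as a sale lets the firm use the cash to pay down debt just before the quarter-end snapshot, then buy the securities back days later:

```python
def leverage(assets, equity):
    """Simple leverage ratio: total assets divided by equity."""
    return assets / equity

# Illustrative balance sheet, in $bn: 700 assets, 675 liabilities, 25 equity.
assets, equity = 700.0, 25.0
print(round(leverage(assets, equity), 1))       # leverage reported normally: 28.0

# Quarter-end: $50bn of securities 'sold' via repo, proceeds pay down liabilities.
window_assets = assets - 50
print(round(leverage(window_assets, equity), 1))  # leverage reported in the window: 26.0
# Days later the securities are repurchased and the true leverage snaps back.
```

Because the transaction reverses after the reporting date, the published balance sheet understates the risk the firm is actually carrying, which is exactly the "materially misleading picture" the examiner described.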
Merrill Lynch, like many other banks, became heavily involved in the mortgage-based collateralized debt obligation (CDO) market in the early 2000s. In 2005 Merrill claimed that its Global Markets and Investing Group was the "No 1 global underwriter of CDOs in 2004". To provide a ready supply of mortgages for the CDOs, Merrill purchased First Franklin Financial Corp., one of the largest subprime lenders in the country, in December 2006. BusinessWeek would later describe how between 2006 and 2007, Merrill was "lead underwriter" on 136 CDOs worth $93 billion. By the end of 2007, the value of these CDOs was collapsing, but Merrill had held onto portions of them, creating billions of dollars in losses for the company. In mid-2008, Merrill sold a group of CDOs that had once been valued at $30.6 billion to Lone Star Funds for $1.7 billion in cash and a $5.1 billion loan. The day before Lehman Bros.' bankruptcy filing on Sept 14, 2008, Bank of America announced it was in talks to purchase Merrill Lynch. The Wall Street Journal reported later that day that Merrill Lynch was sold to Bank of America for 0.8595 shares of Bank of America common stock for each Merrill Lynch common share, or about US$50 billion or $29 per share. This price represented a 70.1% premium over the September 12 closing price or a 38% premium over Merrill's book value of $21 a share, but also a discount of 61% from its September 2007 price.
Congressional testimony by Bank of America CEO Kenneth Lewis, as well as internal emails released by the House Oversight Committee, indicate that federal officials threatened to fire Bank of America's management and board, and to damage the relationship between the bank and federal regulators, if it did not go through with the acquisition of Merrill Lynch. In March 2009 it was reported that in 2008, Merrill Lynch received billions of dollars from its insurance arrangements with AIG, including $6.8 billion from funds provided by the United States taxpayers to bail out AIG.
Merrill Lynch arranged for payment of billions in bonuses in what appeared to be "special timing". These bonuses, totalling $3.6 billion, were 36.2% of the money it received from the federal Troubled Asset Relief Program (TARP) bailout. The Merrill bonuses were determined by Merrill's Compensation Committee at its meeting of December 8, 2008, shortly after BOA shareholders approved the merger but before financial results for the fourth quarter had been determined. A performance bonus, according to company policy, was supposed to reflect all four quarters of performance and be paid in January or later. Merrill employees had to have a salary of at least $300,000 and have attained the title of Vice President or higher to be eligible.
In April 2009, bond insurance company MBIA sued Merrill Lynch for fraud related to credit default swap "insurance" contracts Merrill had bought from MBIA. When the CDOs lost value, MBIA wound up owing Merrill a large amount of money. MBIA claimed that Merrill defrauded MBIA about the quality of these CDOs, and that it was using the complicated nature of these particular CDOs (squared and cubed) to hide the problems it knew about in the securities that the CDOs were based on. However, all but one of the charges were disallowed; the surviving claim was that Merrill had committed breach of contract by promising the CDOs were worthy of an AAA rating when, MBIA alleged, in reality they weren't. In 2009 Rabobank sued Merrill over a CDO named Norma. Rabobank later claimed that its case against Merrill was very similar to the SEC's fraud charges against Goldman Sachs and its Abacus CDOs (see below). Rabobank alleged that a hedge fund named Magnetar Capital had chosen assets to go into Norma, and allegedly bet against them, but that Merrill had not informed Rabobank of this fact. Instead, Rabobank alleges that Merrill told it that NIR Group was selecting the assets. When the CDO value tanked, Rabobank was left owing Merrill a large amount of money.
On September 21, 2008, a week after the demise of Lehman and the rushed sale of Merrill Lynch, Goldman Sachs and Morgan Stanley, the last two major investment banks in the United States, both confirmed that they would become traditional bank holding companies, bringing an end to the era of investment banking on Wall Street. The Federal Reserve's approval of their bid to become banks ended the ascendancy of the securities firms, 75 years after Congress separated them from deposit-taking lenders.
Morgan Stanley has long had a dominant role in technology investment banking and was lead underwriter for many of the largest global tech IPOs, including: Apple, Facebook, Netscape, Cisco, Compaq, and Google, the largest Internet IPO in U.S. history. Morgan Stanley also achieved high and top league table rankings during the eight years Phil Purcell was CEO. The company found itself in the midst of a management crisis starting in March 2005 that resulted in the loss of a number of the firm's staff. Purcell resigned as CEO in June 2005 when a highly public campaign against him by former Morgan Stanley partners (the Group of Eight) threatened to disrupt and damage the firm, challenging his refusal to aggressively increase leverage, take on more risk, enter the sub-prime mortgage business and make expensive acquisitions - the very strategies that, once adopted, forced Morgan Stanley into massive write-downs related to the subprime mortgage crisis by 2007.
To cope with the write-downs, Morgan Stanley announced on December 19, 2007 that it would receive a US$5 billion capital infusion from the China Investment Corporation. On September 17, 2008 Newsnight reported that Morgan Stanley was facing difficulties after a 42% slide in its share price. CEO John J. Mack wrote in a memo to staff "we're in the midst of a market controlled by fear and rumours and short-sellers are driving our stock down." Mitsubishi UFJ Financial Group, Japan's largest bank, invested $9 billion in Morgan Stanley on September 29, 2008. Morgan Stanley borrowed $107.3 billion from the Fed during the 2008 crisis, the most of any bank, according to data compiled by Bloomberg.
Goldman Sachs fared a lot better, through shrewd assessment of the market volatility. During the 2007 subprime mortgage crisis, Goldman profited from the collapse in subprime mortgage bonds in the summer of 2007 by short-selling subprime mortgage-backed securities. Two Goldman traders, Michael Swenson and Josh Birnbaum, made a profit of $4 billion by "betting" on a collapse in the sub-prime market and shorting mortgage-related securities. By the summer of 2007, they had persuaded colleagues to see their point of view and talked around skeptical risk management executives. The firm initially avoided large subprime writedowns, and achieved a net profit due to significant losses on non-prime securitized loans being offset by gains on short mortgage positions. Its sizable profits made during the initial subprime mortgage crisis led the New York Times to proclaim that Goldman Sachs was without peer in the world of finance. The firm's viability was later called into question as the crisis intensified in September 2008.
On September 23, 2008, Berkshire Hathaway agreed to purchase $5 billion in Goldman's preferred stock. Goldman also received a $10 billion preferred stock investment from the U.S. Treasury in October 2008, as part of the TARP Program. Andrew Cuomo, then Attorney General of New York, questioned Goldman's decision to pay 953 employees bonuses of at least $1 million each after it received TARP funds in 2008; however, CEO Lloyd Blankfein and six other senior executives opted to forgo bonuses, stating they believed it was the right thing to do, in light of "the fact that we are part of an industry that's directly associated with the ongoing economic distress". In June 2009, Goldman Sachs repaid the U.S. Treasury's TARP investment, with 23% interest. On March 18, 2011, Goldman Sachs received Federal Reserve approval to buy back Berkshire's preferred stock in Goldman.
During 2008 Goldman Sachs drew criticism for an apparent revolving-door relationship, in which its employees and consultants moved in and out of high-level U.S. Government positions, creating the potential for conflicts of interest. Former Treasury Secretary Henry Paulson, who presided over the hundreds of billions in federal bailout money, was a former CEO of Goldman Sachs who had fortuitously been obliged to sell his Goldman Sachs stock, then worth almost $500 million, at close to its market peak when he took up the post in July 2006. Additional controversy attended the selection of former Goldman Sachs lobbyist Mark Patterson as chief of staff to Treasury Secretary Timothy Geithner, despite President Barack Obama's campaign promise that he would limit the influence of lobbyists in his administration. In February 2011, the Washington Examiner reported that Goldman Sachs was "the company from which Obama raised the most money in 2008" and that its CEO had visited the White House 10 times.
During the period of the TARP bailouts the self-serving antics of the companies involved and their senior executives became a source of public and media outrage, as Michael Shnayerson of Vanity Fair highlighted in March 2009. After receiving $125 billion in taxpayer bailouts, the top officers at Citigroup, Merrill Lynch, Goldman Sachs and three other banks had agreed to forgo their 2008 bonuses. Only a year before, Wall Street's financial firms had paid out $33.2 billion in bonuses - down a mere rounding error from the $33.9 billion bestowed in 2006 - even as the credit crisis spread and $74 billion in shareholders' equity went poof. But this year the bonuses were on the table again and came in at a cool $18.4 billion, raising the spectre of a retaliatory clawback, particularly from the bosses who had brought the crisis about and received ultra-generous severance packages from their collapsing firms. For many U.S. taxpayers, the bailout was infuriating enough on its own terms: $700 billion of public money in all ($5,073 for every taxpayer).
New York Attorney General Andrew Cuomo began strategizing about how to stop what he would soon call "unwarranted and outrageous expenditures", sending ultimatums to several of the CEOs after a $440,000 junket to the St. Regis resort enjoyed by 70 high-performing employees of AIG - less than a week after the company, caught with billions in bad debts on mortgage default insurance, was promised its first $85 billion from a TARP fund separate from the one helping the banks. AIG had let its previous CEO, Martin Sullivan, go the previous June after hideous losses totalling $18 billion - but because he had not been fired, Sullivan had been free to leave with a package that included a $4 million pro-rated bonus, $15 million in severance and other benefits then valued at $28 million, for a total of $47 million. Joseph Cassano, the executive whose Financial Products division was responsible for all the credit default swaps, left in February with a $1-million-a-month consulting contract (since discontinued) and $69 million in deferred compensation. Only six months before, Cassano had said of the $441 billion portfolio of CDSs he'd bet would not default: "It is hard for us, without being flippant, to even see a scenario with any kind of realm of reason that would see us losing one dollar in any of those transactions."
Of course the CEOs of Goldman and Morgan Stanley could afford to waive their bonuses for a year. In 2007, Goldman’s Blankfein had set an investment-bank record for CEO compensation by taking home $68.5 million in cash and equity. His two co-presidents had gotten $67.5 million each. Blankfein alone had earned $210,169,732 in total compensation from 2003 to 2007, according to Equilar. Since his return to Morgan Stanley in 2005 from a short stint at Credit Suisse, John Mack had been awarded $69,565,233.
The bonuses themselves have been cited as a cause of the lack of long-term thinking that provoked the crisis. “As bonuses grew and grew, thinking beyond the next quarter became irrelevant to the management of the firm,” said Richard Ferlauto, director of corporate governance and pension investment at the American Federation of State, County and Municipal Employees union. “Pay became so large that it was decoupled from providing incentives to drive long-term strategic planning.”
According to its website, the Securities and Exchange Commission (SEC) has charged 157 firms and individuals so far, including 66 senior executives, and has secured $2.7bn in fines and penalties. On April 16, 2010, the commission announced that it was suing Goldman Sachs and its employee Fabrice Tourre. The SEC alleged that Goldman materially misstated and omitted facts in disclosure documents about a synthetic CDO called Abacus 2007-AC1. Goldman was paid a fee of approximately $15 million for its work on the deal. The allegations are that Goldman misrepresented to investors that the independent selection agent, ACA, had reviewed the mortgage package underlying the CDO, and that Goldman failed to disclose to ACA that a hedge fund seeking to short the package, Paulson & Co., had helped select the underlying mortgages against which it planned to bet. The complaint states that Paulson made a $1 billion profit from the short investments, while purchasers of the securities lost the same amount: IKB Deutsche Industriebank lost $150 million within months on the purchase, and ABN Amro lost $841 million. On July 15, 2010, Goldman agreed to pay $550 million – $300 million to the U.S. government and $250 million to investors – in a settlement with the SEC. The company did not admit or deny wrongdoing, and agreed to change some of its business practices regarding mortgage investments, including the way it designs marketing materials. The SEC said it was its largest-ever penalty against a Wall Street firm.
On 1 August 2013 a New York jury found former Goldman Sachs trader Fabrice Tourre liable for fraud in a complex mortgage deal that cost investors $1bn (£661m). He was accused by the Securities and Exchange Commission (SEC) of misleading investors about investments linked to subprime mortgages that he knew would fail, and was described by the regulator as the "face of Wall Street greed". In e-mails which came to light in 2011 he described himself as "Fabulous Fab", saying of the financial markets that the "whole building is about to collapse anytime now. Only potential survivor, the Fabulous Fab... standing in the middle of all these complex, highly leveraged, exotic trades he created without necessarily understanding all of the implications of those monstrosities!!!" In March 2014 Tourre was fined $650,000 and told to hand back a $175,000 bonus for defrauding investors. Goldman Sachs was prohibited from contributing to his fine.
On 6 August 2013 the US government filed two lawsuits against Bank of America relating to fraud on $850m of mortgage-backed securities. Attorney General Eric Holder said the government wanted "justice for those who have been victimized." In the Justice Department suit, the government alleged that Bank of America "knowingly and wilfully misled investors about the quality and safety of their investments" in a residential mortgage-backed security known as BOAMS 2008-A. The security, worth around $850m when it was issued in January 2008, eventually collapsed during the crisis as the quality of the loans it contained soured. Bank of America has recently announced a series of settlements, including an $8.5bn settlement with investors over similar mortgage-based securities and a $1.6bn deal with the bond insurer MBIA Inc.
In addition there have been huge suits and settlements between banks and affected parties. In October 2012, the New York Attorney General sued JP Morgan Chase for allegedly defrauding investors who lost more than $20bn on mortgage-backed securities sold by Bear Stearns. Ten of the biggest banks have agreed to pay $8.5bn to settle a review of home foreclosures by US regulators: banks and mortgage lenders including Bank of America, Citigroup and JP Morgan Chase will pay $3.3bn directly to eligible homeowners and $5.2bn to modify and forgive loans. In January 2013 Bank of America agreed to pay Fannie Mae $11.6bn to settle claims relating to residential home loans - $10.3bn on the loans themselves and $1.3bn in compensation to the agency. In September 2013 Citigroup agreed to pay $395m to Freddie Mac to settle claims of potential flaws in mortgages it sold to the firm, covering nearly 3.7 million loans sold to Freddie Mac between 2000 and 2012. A few days later, Wells Fargo, the largest US mortgage lender, agreed to pay Freddie Mac $869m to settle claims on loans sold before 1 January 2009. In February 2014 Morgan Stanley agreed to pay $1.25bn to Fannie Mae and Freddie Mac. Credit Suisse has agreed to pay $885m to Fannie Mae and Freddie Mac - $234m and $651m respectively - the ninth settlement the Federal Housing Finance Agency (FHFA) has reached over some $200bn in mortgage-backed securities; $10.1bn has been recovered from banks in similar actions. JPMorgan Chase entered into a huge settlement, believed to be $13bn, with the Justice Department, related to the mis-selling of mortgage-related products during the US housing boom in the run-up to the financial crisis, said to include $9bn in fines and a further $4bn in relief for struggling homeowners. It paid $920m to settle charges related to a trading scandal. It now has a contingency fund of $23bn to cover legal expenses, and has said that it could face litigation-related expenses of another $6.8bn.
Bank of America's first-quarter 2014 results included $6bn in legal expenses to settle allegations that it misled Fannie Mae and Freddie Mac before the housing crisis in 2008, as well as other mortgage-related matters. In March, the firm finally agreed to pay $9.5bn to settle four lawsuits filed in 2011 by the US regulatory agency, the Federal Housing Finance Agency (FHFA) - $6.3bn in cash plus buying back $3.2bn in mortgage securities from Fannie Mae and Freddie Mac. Also in mid-2014 Citigroup agreed to pay $7bn. In one 2007 deal a Citigroup trader told colleagues in an e-mail that he had reviewed a due diligence report on the poorest-quality loans and that they "should start praying". US Attorney General Eric Holder said: "Despite the fact that Citigroup learned of serious and widespread defects among the increasingly risky loans they were securitizing, the bank and its employees concealed these defects." In August 2014 Bank of America agreed to pay a record $16.7bn to US authorities - a $5bn civil penalty and $4.63bn in compensation payments in cash, plus consumer relief worth about $7bn, much of which will go towards homeowners struggling with their mortgages - for misleading investors about the quality of loans sold by Countrywide Financial and Merrill Lynch before Bank of America bought them in 2008. At the same time HSBC reached a comparatively cushy $550 million settlement with the Federal Housing Finance Agency over mortgage bonds sold to Fannie Mae and Freddie Mac, having cooperated extensively with the agency.
But many other major world banks and insurers were caught up in the huge raft of crippling debt obligations. As well as Bank of America and Citibank, Deutsche Bank, HSBC, Royal Bank of Scotland and others were caught with billions of dollars of bad debt through their high exposure to the sub-prime mortgage market. This led to a severe global liquidity crisis when banks worldwide, realizing that many of their sister banks were rotten to the core with bad debts, began refusing to lend money to one another. The LIBOR (London Interbank Offered Rate) - the average interest rate estimated by leading banks in London that they would be charged if borrowing from other banks - shot up in comparison with US short-term government borrowing, leading to an unprecedented TED spread highlighting the degree of mutual mistrust between financial institutions (fig 8).
Yet a further scandal emerged when it was revealed that Libor was being manipulated by Barclays, UBS and other trading banks, affecting hundreds of trillions of dollars of financial products, roughly equivalent to 4.5 times global GDP. Because Libor submissions were estimates rather than actual trades, the rate was easier to manipulate through buddy transactions. At the height of the financial crisis in late 2007, many banks stopped lending to each other over concerns about their financial health, and some banks submitted much higher rates than others. Barclays was one of those submitting much higher rates, attracting some media attention and prompting comment that Barclays was in trouble. Following much internal debate and a controversial conversation with a Bank of England official, Barclays began to submit much lower rates. The graph above compares the Libor rate with those submitted by Barclays. While those paying interest on loans would have benefited from lower Libor rates, savers and investors would have lost out.
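The mechanics are worth making concrete. Libor was compiled by discarding the highest and lowest quartiles of the panel banks' submitted estimates and averaging the remainder, so even a bank whose quote sat in the middle of the pack could move the published fix by shading its submission. A minimal sketch, using a hypothetical 16-bank panel with made-up rates:

```python
def trimmed_mean_libor(submissions):
    """Average the middle 50% of sorted submissions (top and bottom quartiles discarded)."""
    s = sorted(submissions)
    q = len(s) // 4
    middle = s[q:len(s) - q]
    return sum(middle) / len(middle)

# Hypothetical 16-bank panel, submitted rates in percent.
honest = [4.10, 4.12, 4.15, 4.18, 4.20, 4.20, 4.22, 4.25,
          4.25, 4.28, 4.30, 4.32, 4.35, 4.38, 4.40, 4.45]
print(trimmed_mean_libor(honest))

# One mid-pack bank lowballs its estimate by 30 basis points: its quote
# drops out of the averaged middle, a previously-discarded lower quote
# moves in, and the published fix falls even though 15 banks were honest.
lowballed = sorted(honest)
lowballed[7] -= 0.30
print(trimmed_mean_libor(lowballed))
```

Because the trimming is recomputed each day from whatever was submitted, coordinated or repeated shading by a few banks compounds the effect, which is why individual trader requests to submitters could pay off.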
A memo recorded by former Barclays CEO Bob Diamond, which was submitted to the Treasury Committee, said Paul Tucker, the Bank of England's deputy governor, initiated the call, saying senior government officials were wondering why Barclays was reporting higher borrowing rates than other banks. "I asked if he could relay the reality, that not all banks were providing quotes at the levels that represented real transactions. His response was `Oh, that would be worse.'" Tucker said he didn't think Diamond was alerting him to lying by other banks. "Certainly a bell didn't go off - my goodness there's dishonesty here," Tucker said. "I understood him to say, `we're basing ours on real transactions, the other guys aren't doing that.'" That could be, he said, simply because other banks weren't borrowing money from other banks, in some cases because the government had just injected billions to shore them up. Diamond said he later discussed the conversation with Jerry del Missier, who was a senior manager of Barclays Capital. "Jerry del Missier concluded that an instruction had been passed down from the Bank of England not to keep LIBORs so high. He passed down an instruction to that effect to the submitters," Diamond said. Del Missier resigned the same day as Diamond (Baar 2012). UBS was fined $1.5 billion for rigging the Libor, and Barclays $453 million. The Bank of England deputy governor called the Libor market a "cesspit".
Tom Hayes, 35, a former star trader, rigged Libor rates daily for nearly four years while working in Tokyo for UBS, then Citigroup, from 2006 until 2010. In 2015 he faced eight counts of conspiracy to defraud and in August was imprisoned for 14 years, having been accused of acting in "a thoroughly dishonest manner" in his attempts to rig the benchmark rate. In electronic and audio conversations with fellow traders, Hayes said: "Three-month Libor is too high, 'cos I've kept it artificially high" by "being mates with the cash desks - [JP Morgan] Chase and I always help each other out". Jurors were told that Hayes promised to pay a broker up to $100,000 to keep the Libor rate "as low as possible". He also told a fellow trader: "Just give the cash desk a Mars bar and they'll set wherever you want." The prosecuting QC noted: "If you ever needed any evidence of deliberate rigging of rates, this is it. This is strategic, isn't it. It's nothing to do with the bank's borrowing rates. It's all to do with Mr Hayes' trading positions."
In 2017, a secret recording implicating the Bank of England in Libor rigging was uncovered by BBC Panorama (http://www.bbc.com/news/business-39548313). The 2008 recording adds to evidence that the central bank repeatedly pressured commercial banks during the financial crisis to push their Libor rates down. In the recording, a senior Barclays manager, Mark Dearlove, instructs Libor submitter Peter Johnson to lower his Libor rates: "The bottom line is you're going to absolutely hate this... but we've had some very serious pressure from the UK government and the Bank of England about pushing our Libors lower." Mr Johnson objects, saying that this would mean breaking the rules for setting Libor, which required him to put in rates based only on the cost of borrowing cash: "So I'll push them below a realistic level of where I think I can get money?" His boss Mr Dearlove replies: "The fact of the matter is we've got the Bank of England, all sorts of people involved in the whole thing... I am as reluctant as you are... these guys have just turned around and said just do it." The phone call between Mr Dearlove and Mr Johnson took place on 29 October 2008, the same day that Mr Tucker, who was at that time an executive director of the Bank of England, phoned Barclays boss Mr Diamond. Barclays' Libor rate was discussed. Mr Diamond and Mr Tucker were called to give evidence before the Treasury select committee in 2012. Both said that they had only recently become aware of lowballing. Peter Johnson, the Barclays Libor submitter, was jailed the previous summer after pleading guilty to accepting trader requests to manipulate Libor. The Bank of England said: "Libor and other global benchmarks were not regulated in the UK or elsewhere during the period in question. Nonetheless, the Bank of England has been assisting the SFO's criminal investigations into Libor manipulation by employees at commercial banks and brokers."
In August 2015, nine banks including HSBC, Barclays, BNP Paribas, Bank of America, JP Morgan, Citibank, Goldman Sachs, RBS and UBS agreed to pay $2bn in settlements to US investors, including hedge funds and pension funds, over claims they rigged foreign exchange rates, according to lawyers. Traders assumed names such as "The Cartel," "The Bandits' Club," and "The Mafia" to communicate and place confidential orders.
As an example of just how feral the world's largest banks can become, in December 2012 HSBC was fined $1.9 billion for violating several US acts through blatant money laundering via branches in Mexico and the Cayman Islands.
Matt Taibbi (2013) notes in Rolling Stone in "Gangster Bankers: Too Big to Jail": For at least half a decade, the [HSBC] helped to wash hundreds of millions of dollars for drug mobs, including Mexico's Sinaloa drug cartel, suspected in tens of thousands of murders just in the past 10 years - people so totally evil, jokes former New York Attorney General Eliot Spitzer, that "they make the guys on Wall Street look good." The bank also moved money for organizations linked to Al Qaeda and Hezbollah, and for Russian gangsters; helped countries like Iran, the Sudan and North Korea evade sanctions; and, in between helping murderers and terrorists and rogue states, aided countless common tax cheats in hiding their cash.
"They violated every goddamn law in the book," says Jack Blum, an attorney and former Senate investigator who headed a major bribery investigation against Lockheed in the 1970s that led to the passage of the Foreign Corrupt Practices Act. "They took every imaginable form of illegal and illicit business." In one four-year period between 2006 and 2009, an astonishing $200 trillion in wire transfers (including from high-risk countries like Mexico) went through without any monitoring at all.
"Had the U.S. authorities decided to press criminal charges," said Assistant Attorney General Lanny Breuer at a press conference to announce the settlement, "HSBC would almost certainly have lost its banking license in the U.S., the future of the institution would have been under threat and the entire banking system would have been destabilized."
HSBC then tried to present a cleaner face by insisting its clients declare any potential US earnings or state they didn't have any, and was publicized in several instances as refusing to let genuine customers withdraw their own money in cash, ostensibly as a protection against money laundering. Yet in 2015 HSBC was found to have helped rich clients dodge tax using its private bank in Switzerland, after whistle-blower Herve Falciani in 2010 revealed details of secret accounts, leading to a raid by Swiss police on its Geneva offices and potential criminal charges. The man in charge of HSBC at the time, Stephen Green, was made a Conservative peer and appointed to the government. Lord Green was made a minister eight months after HMRC had been given the leaked documents from his bank. He served as a minister of trade and investment until 2013. HSBC has confirmed that its current chief executive Stuart Gulliver uses a Swiss bank account to hold his bonuses. The bank was responding to a report in the Guardian that Mr Gulliver has £5m in the account, which he controls using a Panamanian company. HSBC faces 10 separate investigations around the world over allegations that it helped wealthy clients avoid paying millions in taxes to governments in the UK, the US, Argentina, France and elsewhere. French prosecutors have expanded their investigations into the activities of HSBC's Swiss subsidiary to include its global holding company, the bank has disclosed. Authorities completed an investigation into HSBC Suisse, whose activities were revealed in the HSBC files project, for alleged tax-related offences earlier this year, turning over a file to prosecutors to make a formal decision. The bank said on Thursday that prosecutors were now investigating HSBC's global holding company and had set bail at €1bn. In the previous investigation, bail was set at €50m.
Patrick Radden Keefe (2016) notes in New Yorker in "The Bank Robber": When Falciani arrived in Geneva, he told me, he realized that H.S.B.C. was engaged in a "gigantic swindle." Clients were not only placing their fortunes in accounts that were "undeclared" to tax authorities; H.S.B.C. bankers were actively assisting clients in hiding their money, by setting up shell companies and sham trusts in the British Virgin Islands and Panama. In some instances, the bankers were handing customers hundred-thousand-dollar bricks of U.S. bills, allowing money to be smuggled back home. In a subsequent investigation by French prosecutors, an H.S.B.C. client said that the bank had instructed him to "make a company in Panama, which should open an account at H.S.B.C. in Lugano, into which I should transfer all my holdings, in order to not be hit by this tax." ... Another client questioned in the subsequent investigation recalled that when he wanted to make a deposit he would meet his account manager in a public place. "I would give him an envelope holding my money, in cash," he explained. "And a few days later he would tell me by phone that the funds had been credited to my account in Switzerland." H.S.B.C. has numerous offices in Paris, but, according to the French investigation, when the Swiss bankers visited clients there they preferred to meet in cafes; in a similar spirit of concealment, account holders used pay phones when making calls to Switzerland. One client pointed out that the furtive face-to-face meetings offered "a bit of reassurance about the money I had in Switzerland, since I had no documents or anything that attested to my having an account." ...Gulliver, the C.E.O., said that since taking over, in 2011, he had implemented "root and branch" reforms. But it was hard to see him as an agent of change. 
When committee members inquired how he chose to receive his personal compensation from the bank, Gulliver acknowledged that for many years he was paid through an anonymous shell company that he had set up in Panama - through Mossack Fonseca.
In May 2015 JPMorgan, Barclays, Citigroup, RBS and UBS agreed to pay fines totalling $5.7bn for manipulating the foreign exchange market. The first four agreed to plead guilty to US criminal charges; the fifth will plead guilty to rigging benchmark interest rates. Barclays was fined the most, $2.4bn, as it did not join the other banks in the November settlement of investigations by UK, US and Swiss regulators, and it is also sacking eight employees involved in the scheme. Separately, the Federal Reserve fined Bank of America $205m over foreign-exchange rigging. All the other banks were fined by both the Department of Justice and the Federal Reserve. The fines break a number of records: the criminal fines of more than $2.5bn are the largest set of anti-trust fines obtained by the Department of Justice. Between 2008 and 2012, several traders formed a cartel and used chat rooms to manipulate prices in their favour. A common scheme was to influence prices around the daily fixing of currency levels, held to help businesses and investors value their multi-currency assets and liabilities. This happened every day in the 30 seconds before and after 16:00 in London, and the result is known as the 4pm fix. In a scheme known as "building ammo", a single trader would amass a large position in a currency and, just before or during the fix, would exit that position; other members of the cartel would be aware of the plan and would be able to profit. US Attorney General Loretta Lynch said that "almost every day" for five years from 2007, currency traders used a private electronic chat room to manipulate exchange rates. Their actions harmed "countless consumers, investors and institutions around the world". "They engaged in a brazen 'heads I win, tails you lose' scheme to rip off their clients," said New York State superintendent of financial services Benjamin Lawsky. One Barclays trader who was invited to join the cartel was told: "Mess up and sleep with one eye open at night."
Proposed new rules created in 2014 by the Financial Stability Board (FSB), a global regulator, require that "global systemically important banks" hold a minimum amount of capital to ensure they can absorb big losses without turning to governments for "too big to fail" bailouts. The capital set aside should be worth 15-20% of the bank's assets, the FSB said. According to the BBC's business editor Kamal Ahmed, analysts estimate the new capital requirements could cost 200bn euros for Europe's banks alone, with the cost for globally significant banks in the US, Japan and China likely to be much higher.
The U.S. Senate's Levin–Coburn Report asserted that the financial crisis was the result of "high risk, complex financial products; undisclosed conflicts of interest; [and] the failure of regulators, the credit rating agencies, and the market itself to rein in the excesses of Wall Street." In July 2010, the Dodd-Frank Wall Street Reform and Consumer Protection Act was enacted to lessen the chance of a recurrence. The U.S. Financial Crisis Inquiry Commission in 2011 concluded that "the crisis was avoidable and was caused by: widespread failures in financial regulation, including the Federal Reserve's failure to stem the tide of toxic mortgages; dramatic breakdowns in corporate governance including too many financial firms acting recklessly and taking on too much risk; an explosive mix of excessive borrowing and risk by households and a Wall Street that put the financial system on a collision course with crisis; key policy makers ill prepared for the crisis, lacking a full understanding of the financial system they oversaw; and systemic breaches in accountability and ethics at all levels."
In 2014 the Federal Reserve (Fed) released records of meetings in 2008 showing it struggled to grasp the scope of the problem and how to respond adequately. The documents show that it worried perhaps more than it should have about inflation risks, and failed initially to grasp the full impact of the housing market crisis. "Frankly, I am decidedly confused and very muddled," said former chair Ben Bernanke during a September meeting. He worried about the "ad hoc" nature of deciding who to bail out, while noting "the real possibility in some cases that you might have very severe consequences for the financial system and, therefore, for the economy of not taking action." Overall, however, the September transcripts indicate that most members of the Fed thought the crisis was contained. At another point, Mr Bernanke said: "I think that our policy is looking actually pretty good," indicating he thought that a move by the central bank earlier in 2008 to trim interest rates had stemmed the tide of the crisis. That meeting concluded with the bank keeping its benchmark interest rate at 2%. Just two days later, Bernanke and Treasury Secretary Henry Paulson were forced to speak to members of Congress, advising them to agree to a bailout of the banking system. Once the extent of the problems was known, the central bank was largely in agreement about moving decisively to prevent a full-scale depression. Janet Yellen, who later became Fed chair, stated: "It is becoming abundantly clear that we are in the midst of a serious global meltdown. I am very concerned," saying the Fed had received a "witch's brew of news" and that "the downward trajectory of economic data has been hair-raising".
The crisis also had conflicting international political implications. Henry Paulson in a BBC interview (Peston 2014) described the Chinese and potential Russian involvement in the plight of Fannie Mae and Freddie Mac: When Fannie Mae and Freddie Mac started to become unglued, and you know there were $5.4tn of securities relating to Fannie and Freddie, $1.7tn outside of the US. The Chinese were the biggest external investor holding Fannie and Freddie securities, so the Chinese were very, very concerned. I was talking to them [Chinese ministers and officials] regularly because I didn't want them to dump the securities on the market and precipitate a bigger crisis. And so when I went to Congress and asked for these emergency powers [to stabilise Fannie and Freddie], and I was getting the living daylights beaten out of me by our Congress publicly, I needed to call the Chinese regularly to explain to the Central Bank, 'listen this is our political system, this is political theatre, we will get this done'. And I didn't have quite that much certainty myself but I sure did everything I could to reassure them. ... Here I'm not going to name the senior person, but I was meeting with someone. This person told me that the Chinese had received a message from the Russians which was, 'Hey let's join together and sell Fannie and Freddie securities on the market.' The Chinese weren't going to do that but again, it just, it just drove home to me how vulnerable I felt until we had put Fannie and Freddie into conservatorship [the rescue plan for them, that was eventually put in place].
Hyman Minsky had a theory, the "financial instability hypothesis", arguing that lending goes through three distinct stages - the Hedge, the Speculative and the Ponzi stages, the last named after fraudster Charles Ponzi (1882-1949). Instead of generating genuine profits, Ponzi schemes use funds from new investors to pay high returns to current investors, and are destined to collapse as soon as new investment tails off or significant numbers of investors simultaneously try to withdraw funds. In the first stage, soon after a crisis, banks and borrowers are cautious: loans are made in modest amounts and the borrower can afford to repay both the initial principal and the interest. As confidence rises, banks begin to make loans in which the borrower can only afford to pay the interest, usually against an asset which is rising in value. Finally, when the previous crisis is a distant memory, we reach the final stage - Ponzi finance - in which banks make loans to firms and households that can afford to pay neither the interest nor the principal, again underpinned by a belief that asset prices will rise. Hedge finance means a normal capital-repayment loan, speculative finance is more akin to an interest-only loan, and Ponzi finance is something beyond even this: it is like getting a mortgage, making no payments at all for a few years and then hoping the value of the house has gone up enough that its sale can cover the initial loan and all the missed payments. The model is a pretty good description of the kind of lending that led to the financial crisis. The "Minsky moment", a term coined by later economists, is the moment when the whole house of cards falls down.
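Minsky's three stages can be made concrete with a toy loan calculation (the figures here are illustrative, not drawn from any source). A borrower takes a 100-unit loan at 6% interest: a hedge loan amortises, a speculative loan rolls the principal over indefinitely, and a Ponzi loan survives only while the asset backing it appreciates faster than the debt compounds.

```python
def debt_after(years, principal=100.0, rate=0.06, annual_payment=0.0):
    """Outstanding debt after `years`, interest compounding annually,
    a fixed payment made at the end of each year."""
    debt = principal
    for _ in range(years):
        debt = debt * (1 + rate) - annual_payment
    return debt

# Hedge: payments exceed the interest due, so debt amortises to ~zero.
hedge = debt_after(10, annual_payment=13.59)   # roughly a 10-year repayment loan
# Speculative: payments exactly cover interest; the principal never shrinks.
speculative = debt_after(10, annual_payment=6.0)
# Ponzi: no payments at all; the debt compounds to 100 * 1.06**10.
ponzi = debt_after(10, annual_payment=0.0)

# The Ponzi position only works out if the asset has grown faster than the
# debt: solvent if the asset compounds at 7%, underwater at 5%.
asset_7 = 100.0 * 1.07 ** 10
asset_5 = 100.0 * 1.05 ** 10
print(hedge, speculative, ponzi, asset_7 > ponzi, asset_5 > ponzi)
```

The knife-edge in the last comparison is the Minsky moment in miniature: the whole position depends on an asset growth rate the borrower does not control.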
In “The blunders that led to the banking crisis”, Rod Jameson highlights the way in which financial modelling failed to anticipate a crisis that in retrospect was blindingly obvious. Part of the problem is that liquidity crises are rare extreme events, so models for normal trading times don’t work. Each liquidity crisis is inevitably different from its predecessors, not least because major crises provoke changes in the shape of markets, regulations and the behaviour of players. Banks are vulnerable to liquidity crises because they borrow money that may have to be repaid in the short term, and use it to back up more lucrative longer-term investments. Banks wrongly assumed that two areas of vulnerability could be treated in isolation, each with its own risk model. When the two areas began to affect each other and drive up banks' liquidity risk there was no unifying framework to predict what would happen.
Models covering a bank's day to day trading typically assume market prices can be reasonably predicted on the basis of how they have behaved in the past, using time series analysis. Unfortunately this doesn't apply to the complicated financial instruments that bundle up different kinds of assets such as high-risk mortgages, which also often have little or no reliable financial history to use as a basis. The models also assumed that the bank would be able to sell "problematic" assets, such as high-risk sub-prime mortgages, and this too turned out not to be true. Poor price risk modelling combined with being unable to sell out of the position is a toxic and lethal combination.
A second set of risk models was intended to estimate the longer term risk from borrowers failing to repay money they owe the bank, and these were regarded as the cutting edge of risk modelling, attempting to predict how different debtors might be affected by economic conditions. However these models fail to take into account the toxic effect of a major crisis in which banks have something on their hands that has dramatically lost value. This makes other institutions reluctant to lend, which in turn makes the value of their assets shrink further, sucking liquidity out of the market.
This is a difficult modelling problem because it has to take into account an unstable factor that captures the relationship between a bank's cost of borrowing and its riskiness in the eyes of other lenders: how its business strategy and portfolio will be perceived by outsiders during a potentially major crisis. When this crisis happened, markets "began picking off unstable institutions like predatory animals attacking a herd: the first to go were the 'outliers' in the banking herd that were perceived to be most vulnerable". The modelling also needs to take into account the risks to which the financial industry as a whole is exposed.
It would seem glaringly simple that a downturn in the overheated US property market could cause an overhanging sub-prime crisis, but risk modelling became myopically fixated on sophisticated ways of transferring risk by leveraged reinvesting into other instruments, on the basis that individual and local market fluctuations could be offset by other income from better performing sectors at the time. This thinking, in an atmosphere of complex opaque commodities, seemed to blind the players to the obvious risk of precipitating a global crisis against which the models offered no protection. In effect this brought about a double-or-nothing gamble which simply transferred risk to an ever greater crisis. Banks used a measure called "value at risk" (VaR) to predict how much money a bank might lose from a given market position, yet VaR measures largely ignored the degree to which the fate of a bank might be affected by other banks.
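Historical-simulation VaR, the textbook variant of the measure, can be computed in a few lines (a sketch; banks' production models were far more elaborate):

```python
def historical_var(returns, confidence=0.99):
    """Historical-simulation value at risk: the single-day loss (as a
    positive fraction of portfolio value) that was exceeded on only
    (1 - confidence) of the observed days.  The method assumes the past
    return distribution describes the future - the very assumption that
    a liquidity crisis breaks."""
    losses = sorted(-r for r in returns)    # losses as positives, ascending
    idx = min(int(confidence * len(losses)), len(losses) - 1)
    return losses[idx]

# Ten days of toy returns; the worst day was a 3% loss.
daily_returns = [0.01, -0.02, 0.003, -0.01, 0.015, -0.03,
                 0.005, -0.005, 0.02, -0.015]
var_99 = historical_var(daily_returns, confidence=0.99)   # 0.03
```

The weakness the text describes is visible in the construction itself: the estimate is only as good as the window of history it is fed, and it says nothing about correlated failures across banks.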
In fact, as simple a measure as super-exponential market growth gave an early prediction of the sub-prime mortgage crisis. Zhou & Sornette in “Is there a real-estate bubble in the US?”, submitted in June 2005, presented evidence of super-exponential growth over the previous two years, especially in the north-east and the west of the country, and predicted a bursting point of mid-2006 that turned out to be fairly accurate. Psychological approaches have also been suggested to probe people’s sense of reality about investments. This could apply equally to consumers and to financial professionals and the reality testing of their group cultures.
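Zhou and Sornette's diagnostic can be caricatured in a few lines (a deliberately crude sketch; their actual method fits log-periodic power laws): under ordinary exponential growth log-returns are constant, so a rising trend in log-returns signals super-exponential, bubble-like growth.

```python
import math

def is_super_exponential(prices, margin=1.1):
    """Crude bubble diagnostic: under plain exponential growth the
    log-returns are constant, so their average is the same in any
    sub-period.  If the mean log-return of the later half exceeds that
    of the earlier half by more than `margin`, growth is accelerating,
    i.e. super-exponential."""
    logret = [math.log(b / a) for a, b in zip(prices, prices[1:])]
    half = len(logret) // 2
    early = sum(logret[:half]) / half
    late = sum(logret[half:]) / (len(logret) - half)
    return late > early * margin

steady_bubble = [100 * 1.05 ** t for t in range(10)]        # plain exponential
accelerating = [math.exp(0.01 * t * t) for t in range(10)]  # super-exponential
```

The point is not the particular test but that faster-than-exponential growth is unsustainable by construction, which is what makes it usable as an early warning.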
A variety of researchers from fields spanning ecology to dynamical systems have suggested modelling the market in terms of fractal power laws and self-organised criticality, such as is used in earthquake and avalanche modelling (Stanley, Plerou & Gabaix). May, Levin and Sugihara in “Ecology for bankers” point out that ecological network models would give much better robustness to deal with tipping points, avoiding irreversible collapse situations.
A groundbreaking study in 2011 (Vitali et al, Battiston et al, Coghlan & MacKenzie, Coghlan & Marshall) using newly released data about the financial crisis has shown that as of 2007 a ‘rich club’ network of around 147 key corporations centrally involved in the financial crisis effectively owned 40% of the world’s productive economy. Previous studies have found that a few transnational corporations (TNCs) own large chunks of the world's economy, but they included only a limited number of companies and omitted indirect ownerships. From Orbis 2007, a database listing 37 million companies and investors worldwide, the Zurich team pulled out all 43,060 TNCs and the share ownerships linking them. Then they constructed a model of which companies controlled others through shareholding networks, coupled with each company's operating revenues, to map the structure of economic power.
The analysis revealed a tightly connected core of 1318 companies. Each of the 1318 had ties to two or more other companies, and on average they were connected to 20. Although they represented a good 20% of global operating revenues, the 1318 in the strongly connected core also appeared to collectively own through their shares the majority of the world's large blue chip and manufacturing firms - the "real" economy - representing a further 60% of global revenues. In turn these linked to a super-core of 147 even more tightly knit companies - all of their ownership was held by other members of the super-entity - so that less than 1% of the companies controlled 40% of the total wealth in the network. Most were financial institutions. The top 20 included Barclays Bank, JPMorgan Chase & Co, and The Goldman Sachs Group.
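The "strongly connected core" here is a standard graph-theoretic object: a maximal set of companies each reachable from every other along ownership links. A self-contained sketch using Kosaraju's algorithm on a toy ownership network (the labels are illustrative, not Orbis data):

```python
def strongly_connected_components(graph):
    """Kosaraju's algorithm.  `graph` maps each company to the list of
    companies it holds shares in; returns the strongly connected
    components - sets of companies mutually reachable along ownership
    links - as a list of sets."""
    visited, order = set(), []
    def dfs_forward(u):                  # first pass: record finish order
        visited.add(u)
        for v in graph.get(u, []):
            if v not in visited:
                dfs_forward(v)
        order.append(u)
    for u in graph:
        if u not in visited:
            dfs_forward(u)
    transpose = {u: [] for u in graph}   # reverse every ownership link
    for u, targets in graph.items():
        for v in targets:
            transpose.setdefault(v, []).append(u)
    visited.clear()
    components = []
    def dfs_backward(u, comp):           # second pass, on the transpose
        visited.add(u)
        comp.add(u)
        for v in transpose.get(u, []):
            if v not in visited:
                dfs_backward(v, comp)
    for u in reversed(order):            # reverse finish order
        if u not in visited:
            comp = set()
            dfs_backward(u, comp)
            components.append(comp)
    return components

# Toy ownership network: A, B and C hold shares in a cycle (a three-firm
# core); D is owned by the core but owns nothing back.
ownership = {"A": ["B"], "B": ["C"], "C": ["A", "D"], "D": []}
core = max(strongly_connected_components(ownership), key=len)
```

In the Zurich study the same construction, applied to 43,060 TNCs, yields the 1318-node core; the "super-entity" of 147 is a still denser cluster inside it.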
Fig 9: The effect of the financial crisis on the global network of corporations holding the key debts and assets in the global economy, modelled through the concept of debt rank. (Top left) total exposure of the FED, weighted fragility of the FED credit portfolio defined as debt-to-equity ratio, and concentration (Herfindahl) index of debt. The effective number of borrowers is given by the reciprocal of H, which varies between 10 and 30. (Top centre) sequential debt level of affected companies ordered by the peak time of their crisis, overlaid by total debt. (Lower centre and right) the debt-rank network of the top borrowers. Outgoing links represent the estimated potential impact of one institution on another. The closer a node is to the center, the higher its DebtRank. A node in the center is able to put under distress the entire economic value of the network. DebtRank decreases by moving outwards and leftwards along the spiral. (Top right) asset size vs debt rank for the latter phase of the crisis showing entities coloured by fragility and sized by outstanding debt - in the earlier phase these were much smaller and lighter in colour (Battiston et al). (Lower left) layout of the strongly connected core (SCC) of the world financial market - 1318 nodes and 12191 links out of a study of 43,000 transnational corporations. The most highly interconnected hub corporations are in red (Vitali et al).
Such connectivity ‘rich clubs’ are common to nature and have been cited in the same year as central to cognitive brain function (van den Heuvel & Sporns). The super-core of 147 is probably too many to conspire en masse to manipulate world political decision making, but, despite competition between them, they are likely to collude on issues of common interest.
However revolving-door conspiracies between high government appointees and the rich club of big business abound. In a blog article of Aug 2013, Greg Palast cites a critical 1997 memo from Timothy Geithner to his boss Deputy Treasury Secretary Larry Summers. US Treasury Secretary Robert Rubin was pushing hard to de-regulate banks, requiring repeal of the Glass-Steagall Act. The memo, outlining an end game for the WTO financial services negotiations, involved explicit collusion with five CEOs of the US investment and trading banks, including Goldman Sachs, Merrill Lynch, Bank of America, Citibank and Chase Manhattan (direct phone lines included): "As we enter the end-game of the WTO financial services negotiations, I believe it would be a good idea for you to touch base with the CEOs..." This initiative would result in the opening up of the banking sectors of all countries to derivatives trading, using the Financial Services Agreement (or FSA), an addendum to the international trade agreements policed by the World Trade Organisation to effectively unwind protective banking legislation internationally. Only Brazil held out, despite embargo threats from the European Union's Trade Commissioner, Peter Mandelson. Within weeks of leaving office, Rubin was named director, then Chairman of Citigroup - which collapsed into a massive federal bailout while managing to pay Rubin a total of $126 million. Days after his election as President, Obama, at Rubin's insistence, gave Summers the post of US "Economics Tsar" and made Geithner Secretary of Treasury. In 2010, Summers gave up his position to return to "consulting" for Citibank and others.
Summers has a long and very controversial history. He has served presidents from Reagan to Obama. Only this week, in September 2013, he withdrew from his candidacy for head of the FED. In a letter to President Barack Obama, quoted by the Wall Street Journal, Summers said any confirmation process for him was likely to be "acrimonious". In the early 1990s he became Chief Economist for the World Bank. By the end of the decade he had become pivotal in the banking deregulation that led to the financial crisis. On July 30, 1998, as Deputy Secretary of the Treasury, Summers testified before the U.S. Congress that "the parties to these kinds of contract are largely sophisticated financial institutions that would appear to be eminently capable of protecting themselves from fraud and counterparty insolvencies." As Secretary of the Treasury from 1999 to 2001 under President Bill Clinton, he was a leading voice within the Clinton Administration arguing against American leadership in greenhouse gas reductions and against US participation in the Kyoto Protocol. He hailed the Gramm-Leach-Bliley Act in 1999, which lifted more than six decades of restrictions against banks offering commercial banking, insurance, and investment services (by repealing key provisions in the 1933 Glass-Steagall Act): "Today Congress voted to update the rules that have governed financial services since the Great Depression and replace them with a system for the 21st century," Summers said. "This historic legislation will better enable American companies to compete in the new economy." During Summers's presidency at Harvard, the University entered into a series of interest rate swaps totaling US$3.52 billion - financial derivatives that can be used for either hedging or speculation. Summers approved the decision to enter into the swap contracts as president of the university.
By late 2008, those positions had lost approximately $1 billion in value, forcing Harvard to borrow significant sums in distressed market conditions to meet margin calls on the swaps. In the end Harvard paid $497.6 million in termination fees to investment banks and agreed to pay another $425 million over 30-40 years. The decision to enter into the swap positions has been attributed to Summers and has been termed a "massive interest-rate gamble" that ended badly. He has also been accused of many revolving-door backroom deals as a government executive. On April 3, 2009 it was disclosed that he had been paid millions of dollars the previous year by companies over which he now had influence as a public servant during the bailout. He also became notorious for a speech suggesting there were fewer women in high scientific research positions because the variance of intellect in men was higher, resulting in a gender difference at the top end. While this is good science and possibly valid, it was received with furore and contributed to his eventual forced resignation as president.
The Zurich team then turned their attention to modelling the network effects of the liquidity crisis through a measure they call DebtRank. DebtRank attributes a score of between 0 and 1 to individual companies and banks, reflecting how their financial fragility affects others. A company with a ranking of 0 would have no effect on others if it went bust. A company with a value of 1, however, would bring down the whole network if it failed.
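A simplified version of the DebtRank propagation can be sketched as follows (toy numbers throughout; the published algorithm of Battiston et al. weights impacts by inter-bank exposures relative to equity):

```python
def debt_rank(impact, value, seed, initial_distress=1.0):
    """Simplified DebtRank in the spirit of Battiston et al.
    impact[i][j]: fraction of j's equity wiped out if i is fully distressed.
    value[j]:     relative economic value of node j (weights sum to 1).
    seed:         index of the initially distressed institution.
    Each node propagates its distress exactly once, so shocks cannot
    circulate forever around cycles in the network."""
    n = len(value)
    h = [0.0] * n              # distress level of each node, in [0, 1]
    h[seed] = initial_distress
    state = ["U"] * n          # U: undistressed, D: distressed, I: inactive
    state[seed] = "D"
    while "D" in state:
        for i in [k for k in range(n) if state[k] == "D"]:
            for j in range(n):
                increment = impact[i][j] * h[i]
                if increment > 0 and state[j] == "U":
                    state[j] = "D"
                if state[j] != "I":
                    h[j] = min(1.0, h[j] + increment)
            state[i] = "I"     # i has propagated; it never propagates again
    # DebtRank: value-weighted distress induced beyond the initial shock.
    return sum(h[j] * value[j] for j in range(n)) - initial_distress * value[seed]

# Toy chain: bank 0's default wipes out half of bank 1's equity, and
# bank 1's distress in turn hits half of bank 2's.
impact = [[0, 0.5, 0],
          [0, 0, 0.5],
          [0, 0, 0]]
value = [1 / 3, 1 / 3, 1 / 3]
rank_of_bank0 = debt_rank(impact, value, seed=0)   # 0.25 of network value
```

The scale matches the text: a node scoring 0 distresses no one beyond itself, while a node scoring near 1 drags down the entire economic value of the network.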
They obtained previously confidential data covering 1000 days as the crisis developed. The data gave a detailed picture of the debt of 407 financial institutions that collectively borrowed $1.2 trillion from the US Federal Reserve Bank during 2008-2010 to keep themselves afloat. From this, 22 emerged that collectively borrowed three-quarters of this money. These form a strongly connected graph where each of the nodes becomes systemically important at the peak of the crisis. As the crisis evolved some agents folded and a subset of the remaining players incurred increasing debt ranks. At the peak of the crisis, several of the 22 tightly interconnected institutions on their own could have wiped out more than 70 per cent of the total value of that tight network, had they failed.
While DebtRank depends on FED loan data, the Multi-Agent Financial Network (MAFN) model focuses on a form of debt insurance called credit default swaps, in which one bank promises to cover another's debt if they cannot pay it, in exchange for an upfront fee. Markose & Giansante used MAFN to model the 2008 crisis. From 2007 onwards, a small group of major banks became densely interconnected through their credit default swaps. J. P. Morgan Chase was the most connected, making it a "superspreader" that would have caused havoc had it defaulted. Several other banks were also too connected to fail.
The difficulty with monitoring any future crisis is that regulatory frameworks are national and local, while the transactions are global. Laura Kodres, chief of the IMF's global financial stability analysis division, commented: "Someone needs to keep track of the big picture."
Global capitalism is the driving force behind human impact on the biosphere: the loss of natural habitats and the accelerating consumption of non-renewable resources are driving the global climate, and with it our biological survival, let alone economic prosperity, into uncharted waters.
Major world superpowers such as the United States depend on global capitalism for their short-term economic wealth, so both parties, Republican and Democrat, are heavily lobbied and financially and politically influenced by vested big business interests. For example in 2008, Occidental Petroleum contributed $301,579 to Democratic candidates and $204,587 to Republican candidates and was previously among 53 entities that contributed the maximum $250,000 to GW Bush’s second inauguration. During the 2008 US election cycle, BP employees contributed to various candidates, with Barack Obama receiving the largest amount of money, broadly in line with contributions from Shell and Chevron, but significantly less than those of Exxon Mobil. In 2009 BP spent nearly $16 million lobbying the US Congress. In 2011, BP spent a total of $8,430,000 on lobbying and hired 47 lobbyists.
About 70 per cent of the orders to buy or sell on Wall St are now placed by software programs. One of the first forays of mathematical modellers into financial dealing was Edward Thorp, who went on from a 1967 book “Beat the Market”, based on a system of playing blackjack that had forced casinos to change their rules, to found the highly successful hedge fund Princeton/Newport Partners. The end of the cold war had created a climate in which an influx of Warsaw Pact scientists brought about a new quantitative mathematical approach to financial prediction. Physicists and mathematicians began to provide new ways of dealing with market forces. Jim Simons, one of the founders of quantitative modelling, is a mathematician whose work in geometry has made fundamental contributions to string theory. In 1982 he founded a hedge fund management company, Renaissance Technologies, whose signature fund, Medallion, went on to earn an incredible 2478.6% return in its first 10 years, way above every other hedge fund on the planet, including George Soros' Quantum Fund. Medallion's returns have averaged 40% a year, making Simons one of the richest men in the world, with a net worth in excess of US$10 billion. Of his 200 employees, a third have PhDs, not in finance, but in fields such as mathematics, physics and statistics.
Financial markets are among the most complex dynamical systems in nature and society combined. They are subject to all manner of feedbacks and are not just a product of fundamentals such as company balance sheet announcements, but technical trends of share price growth, human sentiment oscillating between greed and fear, informed and uninformed opinions of current trading conditions, emerging news including political crises and natural and industrial disasters, sudden changes in other shares or other markets, and finally but not least the increasingly unstable effects of electronic traders and algorithmic trading itself. Given this complexity, it is natural that mathematical dynamical systems modeling should enter the financial markets and that exponential amounts of money could be made from running a trading algorithm which can outsmart, or outspeed, the responses of other players, even if only by pre-bidding them by milliseconds.
Fig 10: The Flash Crash: Left Wall Street Journal breakdown (Phillips), Top right order flow toxicity (Easley et al). Lower right market order depth (within 500 basis points) and net aggressive buy volume from the SEC report (http://www.sec.gov/news/studies/2010/marketevents-report.pdf)
Using neural nets, AI methods and vector processing super-computation, one can develop a learning algorithm that uses historical trends in prices of a commodity or a whole commodities market to develop a prediction of how it will trend in the immediate future. However this type of simulation is prone to over-fit error, when past data becomes an inaccurate guide because new information qualitatively changes market expectations. Hence to keep pace, it becomes necessary to include in the algorithm emerging market information, at least in keyword form, and to make probabilistic estimates of the predicted stock and options pricing based on incoming data. In effect the modelers are themselves developing a complex system, modeling as many of the characteristics of the financial market complex system as they can, and these satellite systems then become plugged into the main system when hedge fund traders issue massive buy and sell orders, leading to escalating short term instability.
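A standard defence against over-fit error is walk-forward testing, in which a rule is always fitted on past data and scored only on data it has never seen. A minimal sketch (the mean-reversion rule and all numbers are purely illustrative):

```python
def walk_forward(prices, fit_window, rule_factory):
    """Walk-forward evaluation: repeatedly fit a trading rule on a
    trailing window and score it only on the *next* price move, so the
    rule is never judged on data it was fitted to.  Returns the list of
    out-of-sample daily P&L values (long one unit when the rule says buy)."""
    pnl = []
    for t in range(fit_window, len(prices) - 1):
        rule = rule_factory(prices[t - fit_window:t])       # fit on the past only
        position = 1 if rule(prices[t]) else 0              # decide at time t
        pnl.append(position * (prices[t + 1] - prices[t]))  # settled at t + 1
    return pnl

def mean_reversion_factory(window):
    """Illustrative rule: buy when the price dips below the window mean."""
    mean = sum(window) / len(window)
    return lambda price: price < mean

oos_pnl = walk_forward([10, 12, 9, 13, 10, 14], 2, mean_reversion_factory)
```

However sound the evaluation discipline, the fitted rule is still only as stable as the market regime it was trained in, which is the over-fit trap the text describes.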
In another sexually charged article entitled "Raging Bulls: How Wall Street Got Addicted to Light-Speed Trading", Jerry Adler highlights a new breed of traders, the quants - physicists, engineers, and mathematicians-turned-financiers who now generate as much as 55% of all US stock trading. When for a time it was believed neutrinos could travel faster than light, this ignited a dream for quants - in the pursuit of market-beating returns, sending a signal at faster than light speed could provide the ultimate edge: a way to make trades in the past, the financial equivalent of betting on a horse race after it has been run.
High-frequency traders or HFTs are a subset of quants: investors who make money often a fraction of a cent at a time, multiplied by hundreds of shares, tens of thousands of times a day, and who in Adler’s words carry themselves with a distinctive mixture of diffidence and arrogance that sets them apart from the pure, unmixed arrogance of investment bankers. The faster the wheels of finance turn, the greater the risk that they will spin out of control - that a perturbation somewhere in the system will scale up to a global crisis in a matter of seconds. “For the first time in financial history, machines can execute trades far faster than humans can intervene,” said Andrew Haldane, a regulatory official with the Bank of England. “That gap is set to widen.”
Plans are in the pipeline to synchronize all trading through satellite atomic clocks to eliminate differences between corporate timekeeping which can unfairly be exploited by HFTs. In 2013, Reuters admitted releasing manufacturing data 15 milliseconds before official publication due to a clock synchronization issue. Algorithms pounced instantly on the early information, trading an estimated $28 million in shares in that minuscule interval.
Not all commodities trading takes place in New York. By historical accident, derivatives such as futures and options are mostly traded in Chicago, 720 miles away. So a company called Spread Networks began buying up rights-of-way for a route that would lop about 140 miles off the shortest fiber-optic cable route between the two cities. However, because the speed of light is 50% faster in air than in glass, the fastest communication between New York and Chicago would be line-of-sight through the air, which requires a chain of microwave relay towers. Tradeworx is building such a network, as is McKay Brothers, a California firm that hopes its system will be the fastest, with a round-trip latency of less than 9 milliseconds. The downside is that the microwaves can be interrupted by rainstorms, or certain atmospheric conditions that duct the signal away from the receiving dish.
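The claimed advantage is simple physics. A back-of-the-envelope calculation (geodesic distance only, ignoring repeater and routing delays) reproduces the quoted latencies:

```python
C_KM_PER_MS = 299_792.458 / 1000   # speed of light in vacuum, km per millisecond

def round_trip_ms(distance_km, refractive_index=1.0):
    """Round-trip signalling time over a path, ignoring equipment delays.
    Light travels at c / refractive_index in the medium."""
    return 2 * distance_km * refractive_index / C_KM_PER_MS

GEODESIC_KM = 1160   # roughly 720 miles, New York to Chicago

fibre_ms = round_trip_ms(GEODESIC_KM, refractive_index=1.5)         # glass: ~11.6 ms
microwave_ms = round_trip_ms(GEODESIC_KM, refractive_index=1.0003)  # air:   ~7.7 ms
```

The figures bracket the sub-9-millisecond round trip McKay Brothers is aiming for, and show why a straight-line microwave path beats even a shortened fibre route.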
But in the new era of rapid electronic trades a whole new class of wild instabilities has begun to emerge. Three examples show how the focus on rapid computer-driven electronic trading can lead to catastrophic instabilities. None of these are driven by fundamental factors such as the sub-prime mortgage deceits that precipitated the 2008 stock market crash, but simply by high-speed trading errors.
On May 6, 2010 the “Flash Crash” saw the Dow Jones plunge 9% only to recover those losses within minutes. It was the second largest point swing, 1,010.14 points, and the biggest one-day point decline, 998.5 points, on an intraday basis in Dow Jones Industrial Average history. So many shares were traded that day the online trading section of the New York Stock Exchange froze and between 2.30pm and 3pm the Dow Jones lost and then regained nearly US$1 trillion. Shares in management consultancy firm Accenture plunged to just above zero. Apple shares went up to US$100,000.
The U.S. Securities and Exchange Commission (SEC) and the Commodity Futures Trading Commission (CFTC) joint report "portrayed a market so fragmented and fragile that a single large trade could send stocks into a sudden spiral," and detailed how a large mutual fund firm selling an unusually large number of E-Mini S&P 500 contracts (valued at approximately $4.1 billion) as a hedge to an existing equity position first exhausted available buyers, and then how high-frequency traders started aggressively selling, accelerating the effect. The report claimed that this was an unusually large position and that the computer algorithm the trader used to trade the position was set to "target an execution rate set to 9% of the trading volume calculated over the previous minute, but without regard to price or time." As the seller's trades were executed in the futures market, buyers included high-frequency trading firms and within minutes these also started aggressively selling the long futures positions they had first accumulated mainly from the mutual fund. HFTs then began to quickly buy and then resell contracts to each other - generating a 'hot-potato' volume effect as the same positions were passed rapidly back and forth. A variety of other factors, from single large trades of the same stock by other players, to a movement in the U.S. Dollar to Japanese Yen exchange rate were also cited. Technical glitches in reporting may have also contributed.
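The execution logic the report describes - sell a fixed fraction of the previous minute's volume, regardless of price or time - can be sketched as follows (a hypothetical reconstruction, not the trader's actual code):

```python
def participation_schedule(market_volume_per_min, total_to_sell, rate=0.09):
    """Volume-participation execution: each minute, sell `rate` times the
    market volume observed over the *previous* minute, regardless of
    price or time, until the order is filled.  A production execution
    algorithm would also cap slices by price and schedule."""
    remaining = total_to_sell
    fills = []
    prev_volume = 0
    for volume in market_volume_per_min:
        slice_size = min(remaining, rate * prev_volume)
        fills.append(slice_size)
        remaining -= slice_size
        prev_volume = volume   # next minute's slice keys off this minute
    return fills

fills = participation_schedule([1000, 2000, 1000], total_to_sell=500)
```

The feedback danger is visible in the structure: the algorithm's own selling inflates the next minute's volume, which raises its next slice - exactly the accelerant the report describes once high-frequency traders began passing the same positions back and forth.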
The Chicago Mercantile Exchange rejected the findings, claiming the cited transactions were legitimate and reflected little more than 1% of the same contracts on that day and less than 9% of the volume during the time period in which the orders were executed. The director of the Center for Innovative Financial Technology noted that "The heads of the SEC and CFTC often point out that they are running an IT museum." - citing photographic evidence of their inadequate technology, calling their five month delay in reporting "unacceptable".
Others suggest the Flash Crash was partly caused by the HFT strategy of "spoofing"; making bogus offers to buy or sell shares to flush out the intentions of rivals. On the day, an astonishing 19.4 billion shares were traded, more than were traded in the entirety of the 1960s, but hundreds of millions of them were never actually sold; they were merely held for a few thousandths of a second as traders tested the waters.
In April 2015 financial trader Navinder Singh Sarao, from Hounslow, west London, now dubbed the "Hound of Hounslow" in a reference to "The Wolf of Wall Street", was charged with wire fraud, commodities fraud, and market manipulation. The US Department of Justice alleges he contributed to the crash using an automated trading programme - "spoofing" financial markets from his parents' semi-detached home using commercially available trading software to place $200m of false trades. The US Commodity Futures Trading Commission has also filed civil charges against Mr Sarao, calling him a "very significant player in the market". The US DoJ complaint states: "Around the time of the flash crash, Sarao took significant steps to protect his assets". In late April 2010, Sarao established a new entity, Nav Sarao Milking Markets Limited, which was incorporated in Nevis. The Justice Department said in a statement that "Sarao's alleged manipulation earned him significant profits and contributed to a major drop in the US stock market on May 6, 2010". "By allegedly placing multiple, simultaneous, large-volume sell orders at different price points - a technique known as "layering" - Sarao created the appearance of substantial supply in the market." He was then able to buy and sell futures contracts tied to the value of the share indexes.
The Guardian has established that Sarao invested £2m in a startup company called Iconic Worldwide Gaming, and that he remains a minority shareholder in the business. Iconic holds patents for peer-to-peer gambling licensed for use by Malta registered iconicbet.com, an online casino whose operating company is advised by a string of well-known names, including Sir Robin Jacob, a former court of appeal judge; Sir David Michels, a former chief executive of Hilton hotels and erstwhile deputy chairman of Marks & Spencer; and Damien O'Brien, an Irish telecoms entrepreneur, who describes himself as the chief executive of Iconic Corporation. Michels is a big name in the betting industry and a keen gambler on the north London poker scene, who ran casinos in the 1990s, and oversaw high-street bookmaker Ladbrokes during a seven-year tenure as chief executive of Hilton Hotels, which owned the bookmaker at the time. Also listed are City financiers Miles Mackinnon and John Dupont, founders of Mayfair-based boutique private equity firm MD Capital Partners, who say they raised the "pre-launch capital" for Iconic Worldwide Gaming and its gambling website. Dupont previously worked at Manx Trust Management Group, where he was sales and marketing director of Montpelier Tax Consultants. Mackinnon sits on the executive board of the Special Olympics. He is also a member of the Company of International Bankers and the International Tax Planning Association. Mackinnon is listed as briefly being a director of Sarao's company, Nav Sarao Futures Ltd, which held shares in Iconic Worldwide Gaming. The co-founder of Nav Sarao Futures was his childhood friend Gurmatpal Dosanjh, a financial analyst and debt specialist whose career has included spells at credit-rating giant Moody's and Bank of America Merrill Lynch.
In December 2014, Nav Sarao Futures transferred the shares to International Guarantee Corporation, an Anguilla-based company that US authorities say the trader created as part of a "tax avoidance strategy". At his bail hearing at Westminster magistrates' court, Sarao's lawyer said the trader had £4.7m in an IGC trading account that could be used as a surety.
One academic study instead blamed "order flow toxicity" - the probability that informed traders (e.g., hedge funds) adversely select uninformed traders (e.g., market makers that quote both a buy and a sell price in a financial instrument or commodity held in inventory, hoping to make a profit on the bid-offer spread). One hour before its collapse, the stock market registered the highest reading of "order flow toxicity" in recent history. If the order flow becomes too toxic, market makers lose and are forced out of the market. As they withdraw, liquidity disappears, which increases even more the concentration of toxic flow in the overall volume, which triggers a feedback mechanism that forces even more market makers out. The authors claim this cascading effect has caused hundreds of liquidity-induced crashes in the past, the flash crash being one (major) example of it.
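The toxicity measure at the heart of this account can be sketched in simplified form (the authors' actual VPIN metric uses equal-volume buckets and "bulk volume classification" to label trades; the toy version below takes the buy/sell split as given):

```python
def order_flow_imbalance(buckets):
    """Simplified VPIN-style toxicity: the volume-weighted average
    absolute imbalance between buyer- and seller-initiated volume.
    `buckets` is a list of (buy_volume, sell_volume) pairs; values near 1
    mean the flow is one-sided - informed traders picking off the market
    makers on one side of the book."""
    total = sum(buy + sell for buy, sell in buckets)
    return sum(abs(buy - sell) for buy, sell in buckets) / total

balanced = [(50, 50), (48, 52), (51, 49)]   # routine two-way flow
toxic = [(10, 90), (5, 95), (8, 92)]        # one-sided, informed selling
```

A market maker watching this number spike has every incentive to withdraw quotes, which is precisely the liquidity-draining feedback loop described above.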
Measures to correct such instabilities are varied. The SEC suggested that each algorithmic trading strategy be reported to them confidentially. Others have suggested designing a circuit-breaker algorithm. However these would need to be engineered as very high-reliability software that works under stressed conditions with multiple backups, or they might themselves cause further complications, unless they can genuinely distinguish toxic market instability from all the other players making good and bad trades.
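At its simplest, a circuit breaker reduces to a threshold test on recent price moves (a toy sketch; real mechanisms such as the post-2010 "limit up/limit down" bands use reference prices and tiered thresholds):

```python
def should_halt(prices, window, threshold):
    """Toy circuit breaker: halt trading if the latest price has moved
    more than `threshold` (as a fraction) from the price `window` ticks
    ago.  Real exchange mechanisms are tiered and use rolling reference
    prices, but the core idea is the same."""
    if len(prices) < window:
        return False               # not enough history to judge
    reference = prices[-window]
    return abs(prices[-1] - reference) / reference > threshold

# A 9% drop inside the window trips the breaker; a 3% drift does not.
tripped = should_halt([100, 99, 98, 91], window=4, threshold=0.05)   # True
calm = should_halt([100, 99, 98, 97], window=4, threshold=0.05)      # False
```

The engineering difficulty the text raises is visible even here: the breaker cannot tell a toxic cascade from a legitimate repricing on genuine news, so a naive threshold can itself amplify panic.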
Two years later NYSE’s Duncan Niederauer explained that the exchange's 44% profit decline was due largely to a 25% decline in revenues from transactions from a year earlier. The culprit: an unfriendly environment for high frequency trading firms. From his point of view, regulators and folks in the media hyped the HFT bogey man too much, creating uncertainty, causing an HFT migration into other asset classes and geographies. However Saluzzi and Arnuk state instead that HFT volumes are down because investor volumes are down, in turn because traditional retail and institutional buyers and sellers of stock have been steadily waking up to the dangers of drinking at the increasingly dangerous "stock market watering hole" despite rises in the market index as a whole. They cite stock exchanges catering to hyper traders who game the system, dark pools (anonymous off-market transactions between large players which are not reflected in market indices) which together trade more than a third of all shares, conflicts of interest as exchanges own stakes in dark pools and HFT firms own stakes in exchanges, and several other salient factors. As of mid-2014 New York State prosecutors have sued Barclays for fraud. "Barclays grew its dark pool by telling investors they were diving into safe waters. According to the lawsuit, Barclays' dark pool was full of predators - there at Barclays' invitation".
Fig 11: Above: Declining US high frequency trading. Below: Overall trading volumes are down although the market is raging.
In "High-Speed Trading No Longer Hurtling Forward", Nathaniel Popper in the New York Times' Business Day notes that profits from high-speed trading in American stocks are on track to be, at most, $1.25 billion this year, down 35 percent from last year and 74 percent lower than the peak of about $4.9 billion in 2009, according to estimates from the brokerage firm Rosenblatt Securities. By comparison, Wells Fargo and JPMorgan Chase each earned more in the last quarter than the high-speed trading industry will earn this year. The firms also represent a declining percentage of a shrinking pool of stock trading, from 61 percent three years ago to 51 percent now, according to the Tabb Group, a data firm. The diminishing presence of these traders has not hurt the overall performance of stock prices: leading indexes have been on a steady climb for the last few years. For high-speed traders, rising prices are actually part of the problem, since climbing stock markets tend to be calmer stock markets, providing fewer trading opportunities for high-speed firms. Several studies have found that the primary impact of high-speed firms has been a steady decline in the cost of placing trades for ordinary investors. Now that the high-speed firms are shrinking from the market, there are some indications that trading costs may again be rising. The slowdown has pushed HFT firms into other markets. High-speed firms accounted for about 12% of all currency trading in 2010; in 2012 that figure is expected to reach 28%, according to the consulting firm Celent. Many market experts have argued that the technical glitches that have recently hit the market are the result of a broader trend: the market splintering into dozens of automated trading venues, with a corresponding lack of human oversight.
In another fluctuation, in March 2011, BATS, a startup exchange, organized an initial public offering of its own stock. Within a few seconds after trading opened, a software bug froze trades in BATS stock on the BATS exchange and in the process took down a server that handled all the ticker symbols at the top of the alphabet, affecting among others Apple, erroneously reporting a price drop of almost 10 percent on BATS and causing a brief suspension in all trading of the world's most closely watched stock. Meanwhile, in 900 milliseconds, too fast for anyone to react, BATS's own listing on the Nasdaq plummeted from its $15.25 opening price to $0.28, reaching a fraction of a cent before trading was halted. BATS executives apologized, took responsibility, withdrew the IPO, and cancelled the trades. But maybe there was more going on than a glitch. Analyzing the transactions, a Nanex engineer named Jeffrey Donovan saw the fingerprints of an algorithm designed to feed stock into the market at successively lower prices. "You can see it waiting a few milliseconds after each trade for the bid side to lower its price, and then the cycle repeats until the stock goes to zero," he says. Whoever may have done this presumably wasn't in it to make money; Nanex CEO Hunsader thinks it was an attempt to obliterate BATS, which in just a few years has captured some 10% of US trading volume from older competitors.
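The pattern Donovan describes - sell into the bid, wait for the bid to step down, sell again - can be caricatured in a few lines. This is a toy reconstruction under the simplifying assumption that each sale drags the best bid down exactly one tick; the tick size and market model are illustrative, not Nanex's data:

```python
def walk_price_down(opening_bid, tick=0.25, floor=0.01):
    """Toy model of a price-ratcheting seller: hit the current best bid,
    let the bid step down one tick, and repeat until the quote is pennies.
    Returns the list of trade prices on the way down."""
    bid = opening_bid
    trades = []
    while bid > floor:
        trades.append(bid)           # sell at the current best bid
        bid = round(bid - tick, 2)   # assume the bid then steps down one tick
    return trades
```

Starting from the $15.25 opening price, the ratchet reaches pennies in a few dozen cycles - at a few milliseconds per cycle, well under a second, consistent with the 900-millisecond collapse described above.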
On Wednesday 1 August 2012, computers at Knight Capital Group, a leading matchmaker for buyers and sellers of stocks which handled 11% of all trading in the first half of that year, went haywire. Unintended orders spewed forth and some stocks gyrated wildly. In a bid to keep a grip on its customers, Knight had pushed to introduce a new system that would position it competitively amid market changes that took effect that day. While rivals hesitated, Knight Capital's presence on Day 1 would ensure bragging rights and extra profits. But in the rollout of the not fully tested system that morning, Knight created a blizzard of erroneous orders to buy shares of major stocks. The orders caused wild swings that affected the shares of more than 100 companies, including Ford Motor, RadioShack and American Airlines. It took the firm the better part of an hour to turn off its computers, and while the affected shares quickly recovered, the firm itself was left reeling. It effectively lost $440 million in selling all the stocks that it accidentally bought on the Wednesday - more than its entire revenue in the second quarter, when it brought in $289 million. Knight lost three-quarters of its market value in the succeeding two days, and was scrambling to find financing or a new owner after rattled customers like Citigroup, Fidelity Investments and Vanguard took their business elsewhere.
A few decades ago, the orders would have flooded into specialists at the New York Stock Exchange, or to the market makers in Nasdaq stocks who had a similar responsibility. Some of the orders might have been executed, but trading in the affected stocks would have come to a halt within minutes while people tried to figure out what was going on. There would have been red faces at the firm responsible, but much less red ink. Those market makers are largely gone now. Their sources of profit - the spreads between what they sold stocks for and what they would pay for them - have vanished with competition and rule changes that allow share prices to move by one cent or less, rather than the one-eighth of a dollar, or 12.5 cents, that used to be the minimum change. Traditional market makers have been largely replaced by high-frequency traders who use computers that can react to orders in nanoseconds, sending in orders - and cancelling them - far faster than any human could hope to do. Exchanges, knowing that they need market makers who will take the other side of customer orders, offer rebates to high-frequency traders who manage to fill a lot of orders. In normal times, the result is markets that are highly liquid and very fast.
In what could be the most expensive tweet in history, at 13:07 on 23 April 2013 the AP news agency's Washington account, hacked by the Syrian Electronic Army, announced: "Breaking: Two Explosions in the White House and Barack Obama is injured." Within milliseconds it had been flagged by trading computers on Wall St programmed to scan the net for words or phrases that might affect stock markets, unleashing a torrent of trades. In seconds the Dow Jones had plunged 140 points and more than US$200 billion of capital had been wiped out. A few minutes later the report was exposed as a hoax and the markets quickly returned to their pre-tweet levels.
High-frequency trading raises an existential question for capitalism, one that most traders try to avoid confronting: Why do we have stock markets? To promote business investment, is the textbook answer, by assuring investors that they can always sell their shares at a published price - the guarantee of liquidity. From 1792 until 2006, the New York Stock Exchange was a non-profit quasi-utility owned by its members, the brokers who traded there. Today it is an arm of NYSE Euronext, whose own profits and stock price depend on getting high-frequency traders in the door.
In “Triumph of the Geeks”, Sarfraz Manzoor asks “Isn't there something wrong with a system that promotes so much volatility to the benefit of no one except a handful of hedge funds? Can it be a meaningful investment of time and technology?” Buffett's business partner, Charlie Munger, has described HFT as "basically evil". "I think it is very stupid to allow a system to evolve where half of the trading is a bunch of short-term people trying to get information one millionth of a nanosecond ahead of somebody else."
Back in 1998, John Cassidy in The New Yorker's "The Force of an Idea" examined the Department of Justice case against Microsoft, and the part Brian Arthur and his non-linear economics might have played in the argument that tying Internet Explorer to Windows violated a 1994 decree banning the company from tying any other product to its operating system, amid broader allegations of Microsoft's anti-competitive practices.
One of the central justifications of the capitalist free market is the notion, espoused by several Nobel prizewinning economists of the "Chicago" school, that the concerted action of the players in the free market arrives at an optimal outcome. Brian Arthur's non-linear economics argues that these underlying assumptions fail for large sectors of the economy, particularly the high-tech and communications industries.
The economic test case in court was the Windows operating system itself. This was clearly not the best or most efficient operating system - a little like a clumsy B-52, overblown and full of security flaws that made it prone to viral attacks - but it had some 80% share of the personal computer market, an unassailable advantage over competitors such as Apple with a mere 10% share.
Microsoft's predominance had come essentially from a single transaction, when it was given the license to develop the operating system for the IBM PC, IBM having failed to recognize that future income would be made from software rather than from the hardware it was used to purveying on large mainframes.
Clearly the technology market has stratifications which make nonsense of the utopian free market, simply because once an operating system has a dominant market share, people and businesses must keep using it to maintain technological and informational fluency, even if it is a hotchpotch of mediocre design.
Thus the case was that Microsoft, by tying the browser to every operating system sold, was acting anti-competitively, giving Internet Explorer a precisely similar stratified dominance of the browser market - a fact which remains true despite the survival of other browsers such as Firefox and the rise of Google to similar prominence in the search-engine advertising market, in turn purveying its Chrome browser with similarly massive financial backing.
This shows us that economic dynamics, like natural ecologies, have many non-linear tipping points and feedbacks, which we need to understand and model adequately to develop an economic model of our ecological future - alongside the chaotic instabilities of market trading volatility, which are themselves a product of emotional sentiment driven by greed, fear and hormonal highs as much as by the financial fundamentals of operating corporations.
Life has existed on Earth for a full third of the universe's lifetime. The reason for this stability is twofold. Firstly, unlike larger short-lived stars, which gave us our rich array of atoms in earlier supernova explosions, the sun is a small, very long-lived star with only very slowly evolving brightness. Secondly, and more pertinently, life is genetic, and genetic inheritance and evolution are cumulative over generations. Evolution is also extremely conservative, striving to preserve genetic encoding through error-correcting enzymes, with change occurring only differentially through occasional adventitious mutation.
This means a lion cannot turn overnight into a lamb, nor can a tiger eat an antelope and metamorphose into a zebra. Companies, however, possess no such genetic stability: they can cash up their assets in the face of crisis and turn to entirely different occupations, and consume one another in takeovers to become entirely new corporate giants. This lack of genetic conservatism leaves human society without any guarantee of long-term economic or biological stability.
Moreover the cumulative dynamics of genetic organisms generates long-term ecological relationships, which provide non-linear feedbacks that tend to stabilize complex relationships and increase species and genetic diversity. We may consider predators and disease-bearing parasites as evidence for the brutality of nature, but the ecosystemic relationships tell another story. Without predators such as lions, a population of gazelles will enter boom and bust as they eat out all the available fodder, leading to wipeout in escalating episodes of famine. This was intriguingly illustrated when the return of wolves to Yellowstone spurred the recovery of the bears: the removal of the wolves had caused the elk population to explode, consuming all the berries the bears depended on for their vitamins and leading to a loss of fitness.
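The boom-and-bust argument is the standard predator-prey feedback, which can be sketched with a Lotka-Volterra model extended by a carrying capacity K. All parameters here are illustrative, not fitted to gazelles, lions or elk; the point is only the qualitative contrast between the two runs:

```python
def simulate(prey0, pred0, steps=2000, dt=0.01,
             a=1.0, b=0.1, c=1.5, d=0.075, K=100.0):
    """Euler-integrated Lotka-Volterra sketch with carrying capacity K.
    prey grow logistically at rate a and are eaten at rate b*x*y;
    predators convert kills at rate d and die off at rate c.
    Returns (final prey, final predators, peak prey)."""
    x, y = prey0, pred0
    peak = x
    for _ in range(steps):
        dx = a * x * (1 - x / K) - b * x * y   # prey growth minus predation
        dy = d * x * y - c * y                 # predator gain minus mortality
        x += dx * dt
        y += dy * dt
        peak = max(peak, x)
    return x, y, peak
```

With no predators the prey simply saturate the range at K - the analogue of gazelles eating out all available fodder; with predators present the prey population cycles but peaks well below that ceiling, settling toward a stable coexistence.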
Climax diversity is the cosmological apex of complex system generation, in which human society stands at the pinnacle. We can continue to coexist in this complex system only if we fashion our economic and developmental impacts in a way that maintains the long term stability of the living systems on which our continued existence depends.
By contrast with the cumulative stability of genotype, and incremental evolution by mutation and natural selection, the capitalist economy is based on a purely social model of competing fragmented democracies. Company law stipulates a democratic basis for a group of shareholders to incorporate and sets out a legal and financial basis for them to pursue business based on the two nested democracies of the general meeting and the board of directors, who are accountable to the shareholders, at least in principle. In larger companies, there is also a line-managed hierarchy of employees, forming a pyramid from the CEO at the apex down through the executive branch to salaried workers. Outside this framework, government regulation provides varying degrees of feedback intended to guarantee a modicum of corporate accountability and responsibility lacking in company law itself, for example in fair trading acts, clean air acts, and environmental protection acts.
Corporate competitiveness, by contrast with genetic mutation and natural selection, is a much more primitive form of selectivity, which does have a degree of survival of the fittest optimization of efficiency, but in the complete absence of any cumulative genetic mechanism to ensure long term stability. The end results are thus much more prone to the breakdown of ecological complexity into huge conglomerate enterprises, with a high degree of collateral fallout due to short term impacts lacking any long-term foresight, or even any natural feedback mechanisms to ensure at least medium term stability.
Because they are vulnerable to manipulative share trading, companies are prone to mergers and acquisitions by friendly, or often hostile, takeover. These can be by competitors seeking to expand their niche in the market by eliminating competition, or providing collective efficiencies by laying off redundant staff, or they may simply be forays by hedge funds to gain strategic control over large profitable operations, or at another extreme can be asset stripping companies taking an undervalued company to pieces for its assets in plant, property and goodwill.
An Economist editorial of 1998 shows that the ease with which companies are born and fail is clearly one reason why Taiwan's total factor productivity had improved faster than that of all other Asian countries since 1960.
In 1991, 40% of Taiwan's chemical output came from firms that did not exist in 1986. One-third of the value of Taiwan's plastics production and half its output of fabricated metal products were also attributable to firms less than five years old. The newcomers established their place in the market by forcing older firms out of business. Firms that had accounted for 58% of Taiwan's chemical production in 1981 had left the business by 1991. In other sectors - including ones which were expanding rapidly overall - the carnage was even worse. Four out of five firms that manufactured clothing, metal products, textiles and plastics in 1981 either closed or changed lines of business over the next decade.
As the successful entrants tend to be more efficient than the firms that die, they boost productivity across the economy. Between 1986 and 1991, total factor productivity - the increase in output due to more efficient use of inputs such as labour and capital - in Taiwan's electrical-machinery industry rose 23.6%. Over a third of that, the researchers estimate, came from new firms pushing out less efficient ones. In the chemicals industry, where productivity growth was slower, a whopping three-fifths of the gain was due to the entry of highly efficient firms and the exit of stodgier ones.
But at the same time, this concrete jungle form of survival of the fittest shows no signs of providing any sort of long-term stability for the people and the environment in which these companies operate, or even for the market conditions on which these industries depend long term. Companies are simply incorporated agents founded by a memorandum of association by their founding shareholders for their capital gain. They have no cumulative stability beyond the boardroom decision-making horizon and as they stand they have no covenant of responsibility to their workers, to the consumers of their products, to the general public and least of all to the natural environment in which they operate. Like a malignant cancer, the only principle on which they operate is relentless growth of capital for the investors.
Given this one-sided covenant of corporations only with their internal investors, it naturally falls to governmental regulation - to labour laws, the Clean Air Act, the Consumer Protection Act and other legislation - to safeguard society from the deleterious impacts of corporate activity. When the new right call for an unregulated economy because this will increase production and profitability, they are being disingenuous about the actual purpose of much of such regulation, which is designed to protect society, the natural environment and our long-term future from potentially irreversible misadventure intrinsic to the corporate model, not simply to waste ‘our’ money on inefficient government interference.
In a 1996 Newsweek article entitled “The Hit Men”, Seth Resnick highlights the wide variation among corporations in the attitudes of CEOs towards their employees during staff downsizing, regardless of how profitable the company is at the time, citing the paradox of CEOs receiving massive bonuses while firing a sizeable fraction of their company’s workforce. Some of these changes are a product of company mergers and the inexorable rise of automation in business, but the differences between companies are glaring enough to show that company law embodies no consistent corporate sense of employment ethics, and that many companies fail their employees miserably.
When AT&T announced that it would fire 40,000 people as part of its breakup into three companies, AT&T shares roared upward. Bob Allen, the CEO, said he felt bad about firing people but saw no point in giving up some of his pay or perks as a shared sacrifice with the workers. And he saw no reason to apologize: "I wouldn't see any value of going on TV and crying." Allen, who had been ridiculed on Wall Street for AT&T's disastrous $7.5 billion hostile takeover of computer maker NCR in 1991, made more than $5 million when the value of his stock and options soared after the layoffs were announced.
Chase Manhattan was pressured by Michael Price of the Mutual Series funds to get its stock price up, and sold out to Chemical. The attraction to Chemical was the Chase name, which it kept, and its ability to cut 12,000 jobs from the combined banks. Those jobs would probably have vanished even without a takeover, but in a slower, more controlled way. Thousands of little people were fired to save money, while the new Chase kept all 36 outside directors, who received fat fees and dandy retirement packages.
Apple at the time had installed Gilbert Amelio as CEO. The company was firing workers and eliminating its dividend to conserve cash, but at the same time was paying Amelio $2.5 million a year in salary and bonus. Amelio was already on Apple's board, so was partially responsible for its plight. Amelio's answer was simply: "It's a market-determined figure." Contrast this to Steve Jobs’ later $1 arrangement upon returning to and rescuing Apple, which became for a time the world's most valuable company. Of course Steve did have massive stock options in Apple, but the high CEO salary when Apple was in trouble makes a stark contrast to the vanishing one in its time of burgeoning success. According to Apple’s proxy statement fifteen years later, as of February 2011 Jobs owned about 5.5-million shares of Apple stock, valued at the time at just above $2-billion. Apple did, however, pay the running costs of the private plane Jobs used for business meetings - itself a $40-million gift from the board of directors - to the tune of $1.1-million over the previous three years.
By comparison, Resnick cited three ‘noble’ examples. United Technologies, a conglomerate, had cut 33,000 jobs since 1990, but unveiled a costly plan to help workers get re-educated. President George David said at the time that the United States can't stop production jobs from migrating overseas, so companies should help people upgrade their education before it's too late. UT gave employees time off to attend classes, paid for tuition and books, and offered employees who completed their studies 50 shares of UT stock, then worth about $5,200.
John Grundhofer, chairman of First Bank System of Minneapolis, whose bank collected a $200 million fee for abandoning its proposed takeover of First Interstate Bank, gave each employee a $750 bonus, about $11 million in all. Grundhofer - who had fired 2,000 employees when he joined the bank six years before - wanted to show his appreciation to employees for having created a bank strong enough to bid for a company bigger than itself.
At Scherer Brothers Lumber Co., a building-supply company in suburban Minneapolis, officers refrained from drawing bonuses until the company had made a 15 percent profit-sharing contribution to every eligible employee. Rather than firing workers to save a few bucks, the company eliminated fresh flowers for receptionists' desks, cut the top officers' pay temporarily by 25% and stopped buying professional sports tickets.
Clearly there is no consistency whatever in how CEOs of companies treat their employees regardless of trading conditions, for the very simple reason that, in its foundation in company law, the only legally defined players are the shareholders and their investment in the company.
Fifty years ago, children in Newfoundland could catch fish by dipping a basket into the ocean. By 1992 Canadian research vessels were sweeping the seas in vain, finding not a single school of cod in what was once the world's richest fishery. The destruction of the Grand Banks cod is one of the biggest fisheries disasters of all time. Newfoundland and Labrador's historic cod fisheries attracted local and international fishing fleets for almost five centuries before the Canadian government shut the industry down indefinitely in July 1992.
Although the cod fishery supported workers for hundreds of years, a variety of changes occurred during the 20th century that made the industry much less sustainable than ever before. Foremost among these were advances in fishing technologies that dramatically increased the ability of fishers to find and harvest large quantities of cod. These included changes to vessel and net design, as well as the introduction of electronic navigational aids and fish-finding instruments.
The inshore fishery was a local industry in coastal waters, while the offshore Grand Banks fishery attracted fleets from around the world. International vessels were able to fish anywhere they liked on the banks. Collapse started in the 1950s, when huge factory ships from Japan, Russia, and other countries descended on the Banks with giant nets. This trend accelerated in the 1960s, and the catch rose from its historical level of around 200,000 tonnes a year to 810,000 tonnes in 1968. Then it started to decline, falling 60% by 1975 and to 150,000 tonnes by 1977.
At this point Canada banned foreign fishermen from within 200 nautical miles (370 km) of its shores, and the cod began to recover. Canada blamed foreign disregard for quota. Although blame was cast only at non-NAFO (North Atlantic Fishing Organization) countries, history has shown that both NAFO and non-NAFO members were overfishing at rates sufficient to deplete the stocks. Scientists set catch limits calculated to allow stocks to recover, predicting catches of 400,000 tonnes by 1990. In anticipation the government helped people in Canada's Atlantic provinces to buy new boats and fish plants. The bonanza never happened.
Fig 12: Cod was said to be so abundant then that you could almost walk across the ocean on their backs. For centuries, cod represented the most highly concentrated source of protein available in quantity anywhere on the planet. Inshore small boats were followed by large long-liners and massive factory boats whose sonar quickly hoovered up the last of the adult cod in huge accurate catches. When Canada extended its limit to 200 nautical miles, it built on the first collapse the wholly false expectation that Canada could soon double its catch. The number of fishermen went from 15,000 to 35,000 and the number of processing plants from 115 to 175. The small inshore fishermen said the stocks were severely declining, but the factory ships were super-efficient and still caught the last of the fish. The federal government blocked the recommendations of the scientists to significantly reduce the quota.
Every year scientists of the Canadian Department of Fisheries and Oceans (DFO) estimated the size of the fish stocks, and set the "total allowable catch", or TAC, at 16% of the fish, which theory said should allow stocks to increase. But stocks never rose enough to allow TACs greater than 260,000 tonnes, falling well short of predictions. That wasn't necessarily a disaster, the scientists reasoned. The size of fish populations was held to be dominated by the survival rate of young fish, which varies widely and unpredictably. The slow recovery might simply mean a few bad years. But there were other worrying signs. The fish were smaller, a sign that each stood less and less chance of surviving the year. And the fleet was fishing a smaller and smaller area of ocean.
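The DFO's arithmetic can be caricatured with a toy surplus-production model: each year the stock grows logistically and the fleet removes a fixed fraction of it. With an intrinsic growth rate above 16%, a 16% removal leaves room for the stock to increase, while the roughly 60% removal rate later inferred retrospectively (see below) drives it toward zero. The growth rate, capacity and horizon here are illustrative, not DFO estimates:

```python
def project_stock(stock, harvest_rate, r=0.3, K=1.0, years=30):
    """Toy surplus-production model: the stock (as a fraction of unfished
    biomass K) grows logistically at intrinsic rate r, and harvest_rate
    of the stock is removed each year. Returns the stock after `years`."""
    for _ in range(years):
        growth = r * stock * (1 - stock / K)       # logistic annual growth
        stock = max(stock + growth - harvest_rate * stock, 0.0)
    return stock
```

Under these assumptions a 16% harvest settles at a steady stock of K(1 - 0.16/r), roughly half the unfished level, whereas any harvest rate exceeding r collapses the stock geometrically - which is why the later discovery that removals were nearer 60% than 16% mattered so much.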
By 1980 the Newfoundland fishery was dominated by three large complexes, each propped up by provincial government funds and bank loans: Fishery Products, Nickerson-National Sea Products and Lake Group. Together they controlled over 70% of fish production; the balance was largely in the hands of traditional, small family-run businesses. By 1981 all of the large fish companies were in crisis. The fish stocks, although depleted, were growing, but weak US markets and rising interest rates hit the large companies, driving National Sea Products, which had expanded on borrowed money, to near bankruptcy. The federal government provided $13 million to save Lake Group and keep its plants open. By 1982 National Sea Products was going insolvent, owing the Bank of Nova Scotia $40 million, and the company requested federal support. Fishery Products received such support to prevent insolvency.
The fear of having to allow foreign fleets into Canada's exclusive economic zone if there was any surplus fish, as stipulated under the law of the sea, ensured the rationale would be that there would be no surplus fish. This is a classic tragedy of the commons enacted by the Canadian federal government for capitalistic purposes. The Kirby Report at the time remained more concerned about the failure of processing facilities than depletion of the fish stock. Fisheries expert Scott Parsons put it bluntly: "More simply the problem was unbridled greed which led to debt-financed over-expansion.”
The province was against the federal notion of "vertical integration" of the fishing industry, in which processing companies operated their own trawlers and marketing arms. However with the passing of the Atlantic Fisheries Restructuring Act arrangements were made to form two large and vertically integrated companies consolidated into National Sea Products, which remained private, and Fishery Products International, which absorbed Lake Group and others and was largely owned by the government. Both received large amounts of public funding. Up until the early 1990s National Sea Products had been Canada's largest fish-processing company and one of the world's largest fishing enterprises.
National Sea Products claimed in 1990 that scientists only thought fish stocks were low because they surveyed large areas of ocean randomly, and didn't "go where the fish are" where they would find that "fishing has never been better". Fishing had never been better, because during the 1980s, aided by subsidies, fishermen bought more powerful boats and new, accurate fish-finding sonars. This was intended precisely to increase the catch per unit effort. Yet scientists took no account of better technology in calculating stock size.
In 1989, the DFO lacked confidence in their own data, were reluctant to abandon received wisdom, and the region's main employer insisted that fishing was fine. They compromised and decided the stock was midway between the research and commercial data. Retrospective calculation of the fishing that would have produced such a stock showed boats had been taking not 16% of the fish each year as planned, but at least 60%. The scientists advised a TAC of 125,000 tonnes, well below the 266,000 of 1988.
Then expedient politics intervened. The fisheries minister refused to anger fishermen by slashing catches that much, and politicians used the uncertainty to set catches as high as possible, at 235,000 tonnes. Newfoundland cabinet minister John Crosbie, who preferred a go-slow approach, stated on Canadian TV: “These questions are not decided by scientists. Because if they were we would have wiped out the whole offshore fishery” - but in 1992 the fishery was wiped out because the scientists were ignored (CBC).
In January 1992, the DFO recommended a TAC of 185,000 tonnes. Then it did another research cruise - and cut that to 120,000. Then in June, it recommended banning fishing altogether. Suddenly, the scientists realised there were no cod old enough to spawn left. By now the fishermen were worried too, and agreed to a fishing moratorium on the Bank and adjacent fisheries. In 1993, it was extended indefinitely and still remains closed as of 2013.
In November of 2006, Fisheries and Oceans Canada released an article suggesting that the unexpectedly slow recovery of the cod stock is due to inadequate food supplies, cooling of the North Atlantic, and a poor genetic stock due to the overfishing of larger cod. With the Northern Cod, significant amounts of capelin – an important prey species for the cod – were caught as bycatch, further undermining the survival of the remaining cod stock.
During the summer of 2011, a study suggested that East Coast cod stocks around Nova Scotia showed promise of recovery, despite earlier fears of complete collapse. It found that the initial stages of recovery began around 2005, though more time and further studies were needed to assess the long-term stability of the increase. In addition, a 2010 study by the Northwest Atlantic Fisheries Organization found that stocks on the Grand Banks near Newfoundland & Labrador had recovered by 69% since 2007, though that number equates to only 10% of the original stock.
So what do we find happened to the two large companies doing the lion's share of the final overfishing when faced with the collapse of one of their principal sources of income?
Canadian offshore fishing quotas plummeted during the early 1990s and effectively squelched any opportunities available in National Sea's traditional harvesting business. The entire national quota plunged from 316,000 metric tons in 1990 to 34,700 in 1995; National Sea's quota crashed from 123,000 tons to just 14,400 tons.
As profits from harvesting businesses declined, Henry Demone, the then CEO, sought to increase profits from other segments while shoring up the company's bleeding balance sheet. He sold off several of the company's ships and processing facilities and nearly halved National Sea's workforce, to just 4,200 by 1992. He began dismantling the globalization program he had overseen during the mid- and late 1980s, refocusing all of the company's resources on the North American consumer frozen-foods market. Only 40% of National Sea's fish products were now harvested on company vessels, compared to about 80% in the mid-1980s.
As National Sea jettisoned assets during the early 1990s, its sales fell, from $607 million in 1990 to just $266 million in 1993. The company posted net losses every year from 1988 to 1993, when write-offs related to discontinued operations contributed to a whopping $42.5 million deficit for the year. Demone continued to sell off assets related to traditional harvesting businesses and to reduce the company's workforce. In 1994, the company sold 13 ships (including two deep sea freezer trawlers), its French subsidiary, and its shrimp processing plant in Florida. National Sea posted its first positive net income in seven years in 1994. Going into 1995, the company was reduced to about 1,600 full-time employees, five processing plants, and 19 fishing vessels, only 13 of which were active. Its core business had changed from harvesting fish to marketing prepared, frozen, fresh, and packaged seafood.
Fishery Products International likewise switched its emphasis to marketing, becoming a leading provider of seafood in the foodservice market, offering cold-water and warm-water shrimp, crab, lobster, shellfish, and fish products including seafood starters, sea cuisines, nuggets, oven-ready products and others to America’s largest restaurant chains and national distributors. FP Resources (formerly FPI Limited), the parent company, sold off all of its seafood assets in 2007 in order to become an investment and holding company. To facilitate the change, the company sold its Ocean Cuisine International business to Ocean Choice International for $175 million in cash and its Fishery Products International North American marketing and manufacturing businesses to High Liner Foods for $87 million in cash plus a stake in High Liner. The company changed its name to FP Resources and began using funds received from the sales (including part of its High Liner stake) to build up its new business. Among its first investments was a stake in Caribbean Data Centers (Barbados).
This disaster was due to a combination of many factors: a general lack of understanding about surveying fish stocks, international competition to fish a lucrative resource, the development of new technologies which made fish harvesting unsustainable even in the short term, the incapacity of regulatory scientists to make decisions independent of commercial and political influence, the growth of larger processing companies exploiting the resource, corrupt political thinking and unbridled greed. The blame lies not simply with the two large companies who fished the last life out of the Grand Banks cod, but with the climate of selfish short-term thinking the whole capitalist decision-making process engenders. Democracy is caught with its pants down because it was elected officials who applied the principal pressures that precipitated the catastrophe, although National Sea was also guilty of misrepresenting the facts in ways that exacerbated the final demise.
The saga of Love Canal is one which tests the assumptions of democracy and capitalism and their complex relationship, and divides liberals and conservatives over how to apportion the blame and which agency has been the subject of injustice among the victims, Hooker Chemical and its successor Occidental Petroleum, the Niagara Falls School Board, the city and state governments, the federal government and the courts. From one point of view the companies are pollution villains harming innocent people’s futures; from the other they are victims of big-government bullying and the thoughtless expedience of local elected officials. It thus serves as a test case, over which a great deal of care has to be taken to come to a reasonable and balanced understanding.
In 1892 William T. Love proposed to connect the upper and lower Niagara Rivers with a 7-mile canal to generate DC electricity. His scheme included a park, a hydroelectric plant and housing for 1 million people. Yet he only got 4,600 feet into the canal when an economic downturn and the rise of AC power ended his plans. Love went bankrupt and the project was abandoned. The excavation filled with water and was used by locals for swimming and ice-skating. Then, in the 1920s, the city of Niagara Falls began using it as a municipal dump site. In the 1940s, the U.S. Army began using the site to dump wastes from the war effort during World War II, including some nuclear waste from the Manhattan Project, the rest of which was dumped in nearby Lewiston, New York at the Niagara Falls Storage Site.
Hooker Electrochemical Company, founded by Elon Hooker, was granted permission by the Niagara Power and Development Company in 1942 to dump wastes in the canal. Hooker purchased the site from Niagara in 1947, along with the 70-foot-wide (21 m) banks on either side of the canal, drained and lined the canal with clay and disposed of industrial and chemical wastes there and other nearby locations in 55-US-gallon (210 L) metal or fibre barrels. Approximately 21,000 tons of chemicals, including caustics, alkalines, fatty acids and chlorinated hydrocarbons (reportedly containing 200 tons of dioxin) from the manufacture of dyes, perfumes, solvents for rubber and synthetic resins were buried to a depth of 20-25 feet in and around the Canal. The City of Niagara Falls and the army continued the dumping of refuse. In 1948, after World War II had ended and the City of Niagara Falls had ended self-sufficient disposal of refuse, Hooker became the sole user and owner of the site, which was in operation until 1953.
By 1954 the City of Niagara Falls was experiencing a boom, due partly to production using cheap hydro-electricity. The population was exploding, and the Niagara Falls City School District, in need of land to build new schools, set its sights on the land owned by Hooker Chemical. Hooker told the school board that the site was a chemical waste dump and that building a school there was a bad idea. Nevertheless, the school board persisted. Hooker continued to resist the school board’s overtures until the board began to hint at the possibility of eminent domain proceedings to seize the land.
Fig 13: Clockwise from top left: Love Canal ferments, school children protesting, the clean-up.
On October 6th, 1952 Hooker agreed to transfer the land to the School District Board for a price of $1. In the quit-claim deed Hooker Electrochemical warns that the site contains chemical wastes and stipulates that all responsibility for loss of property, injury or death is transferred to the Board, and that no claim shall be made against the company by the Board or their subsequent assignees once the agreement is signed:
This raises some intriguing questions about what corporate responsibility entails. Clearly Hooker initially made a responsible attempt to construct a sound landfill in an area which at the time was not a residential development. However, Hooker did know pretty much how toxic these chemicals were. They contained both highly toxic chlorinated hydrocarbons and reactive caustics and alkalis, so there was a real risk of active processes in the event of corrosion or disruption of the site. It remains questionable whether simply signing a quit-claim deed over to another party – one misguided enough, for narrow reasons of local government need, to agree to take it on – is a fully responsible action, given the vagaries of redevelopment in what was always an urban area less than five miles from Niagara Falls State Park.
Critical here is whether Hooker informed the school board of the actual nature of the chemicals and the real potential risks they were taking on. The district court judgment of 1997 notes that: “Hooker did not provide specific data about the chemicals in the landfill. This court concluded that given the technology available at the time, such an analysis would have been unreasonably costly and difficult. Nevertheless, the court concluded that Hooker did know with some precision the composition and dangers of certain chemicals buried in the Canal, and that it could have made a more diligent effort to communicate its knowledge to the Board.”
Notwithstanding the warning, in 1954 the 99th Street School was completed at the site. In January 1954, the architect of the school wrote to the education committee, informing them that during excavation, workers had discovered two dump sites filled with 55-gallon drums containing chemical wastes. The architect also noted that it would be "poor policy" to build in that area since it was not known what wastes were present in the ground, and the concrete foundation might be subsequently damaged. The school board then moved the school site eighty-five feet further north. The kindergarten playground also had to be relocated because a chemical dump lay directly beneath. By 1955, 400 children were in attendance. During its first year, a 25-foot area crumbled, exposing toxic chemical drums, which then filled with rainwater and overflowed. This created large puddles that children enjoyed playing in. They returned from recess in sludge-covered clothing with blackened burns on their hands. They scoured the grounds for "hot rocks" that, when thrown against a hard playground, sparked and once reportedly set a child's pants on fire.
In 1957 the City of Niagara Falls constructed sewers for a mixture of low-income and single-family homes adjacent to the site, puncturing the protective clay walls in the process. At this point, the 99th Street School had been up and running for two years.
The court notes: “At some point in 1958, the Board offered a portion of the property to the City in order to develop recreational facilities. At first, the City refused to take the land ‘because of the tremendous holes being created due to the chemical reaction in the land.’ However, when the Board offered the land again in May 1959, the Council recommended that the City Manager look into the possibility of a transfer.” The court also notes that “A number of City officials, along with a representative of Hooker, inspected the property to determine whether it was suitable as a recreation area.” Malcolm Logan claims that Hooker’s attorney attended a Board of Education meeting to try to dissuade them from selling the land to developers for subdivision, for fear that it would puncture the clay cover containing the buried chemicals. In 1960, the Board conveyed title to the northern section of the property to the City and sold the southern portion at public auction. The conveyance was expressly made "subject to the terms [and] conditions" of the Hooker deed to the Board, in turn absolving the Board of its liability.
In 1962 the LaSalle Expressway was built near the site, limiting natural drainage into the Niagara River; during construction, additional holes were punctured in the protective cap, allowing rainwater to flow through and carry toxic chemicals into the adjacent neighborhood. Effectively the site had now become a pond with no natural outlet, causing the water table, and with it the toxic waste, to rise to the surface following the exceptionally wet winter and spring of 1962. Residents began to report pools of oil and colored liquids in their yards and basements.
In 1968 Hooker was acquired by Occidental Petroleum Corporation and was thereafter known as Occidental Chemical Corporation (OCC).
In 1976 a pair of reporters for the Niagara Falls Gazette, David Pollak and David Russell, tested several sump-pumps near Love Canal and found toxic chemicals in them. In 1978 crusading liberal journalist Michael Brown investigated further and discovered an alarming incidence of birth defects among residents living near the site, including many anomalies such as enlarged feet, heads, hands, and legs. He advised the local residents to create a protest group, which was led by resident Karen Schroeder, whose daughter had many (about a dozen) birth defects.
The New York State Health Department mounted its own investigation and found an abnormal incidence of miscarriages. Ten years after the incident, New York State Health Department Commissioner David Axelrod (not to be confused with presidential advisor David Axelrod) stated that Love Canal would long be remembered as a "national symbol of a failure to exercise a sense of concern for future generations."
The dumpsite was declared an unprecedented state emergency on August 2, 1978. Brown, who wrote more than a hundred articles on the dump, also tested groundwater further and later found that the dump was three times the size officials knew of, with possible ramifications beyond the original evacuation zone. He also discovered the presence of highly toxic dioxin, infamous as a contaminant of Agent Orange, the defoliant used in Vietnam. Dioxin pollution is usually measured in parts per trillion; at Love Canal, water samples showed dioxin levels of 53 parts per billion.
In August 1978 local resident Lois Gibbs, whose son suffered from epilepsy, asthma and a low white blood cell count, rallied the homeowners of Love Canal to appeal to the city to do something. She collected first-hand accounts of strange odors and substances that surfaced in backyards, including corroding drums and even a swimming pool, basement walls that seeped a thick, black ooze, and withering vegetation. The city refused to act.
Gibbs, in her attempt to get someone to do something, turned next to Hooker Chemical (now a subsidiary of Occidental Petroleum). Hooker dismissed her claims that chemicals at the dump site had anything to do with the cancers, birth defects, mental retardation and illnesses afflicting the neighborhood. When she persisted, they told her in effect that she would have to prove it in court. If need be, they could marshal a battery of attorneys to oppose her.
A survey conducted by the Love Canal Homeowners Association found that 56% of the children born from 1974–1978 had at least one birth defect. In one case, two out of four children in a single Love Canal family had birth defects; one girl was born deaf with a cleft palate, an extra row of teeth, and slight retardation, and a boy was born with an eye defect. Michelle Brown Skiba played in the "black muck" at Love Canal as a child. She developed rheumatoid arthritis, had a growth removed from her knee and never developed her second teeth. "All my friends had the same thing," she told Donaldson-James of ABC News. Her mother had six miscarriages before the family left. Now 42 and married, Skiba said she had decided never to have children. "In the back of my mind, I didn't know what the future would bring."
Still, circumstantial evidence does not prove guilt, and Hooker Chemical dug in, refusing to admit any responsibility whatsoever. For that matter, the city adamantly refused even to acknowledge the problem. The mayor of Niagara Falls told angry residents, “There’s nothing wrong here [in Love Canal]!” By then, the New York State Health Commissioner had visited the neighbourhood and stated that the dump site constituted “a public nuisance and an extremely serious threat and danger to [public] health”. He urged pregnant women and young children to leave the neighbourhood. People began to abandon their homes at a loss.
In August 1978 President Jimmy Carter directed the Environmental Protection Agency, (which had been established by Richard Nixon, when Republicans at least had a shred of social responsibility), along with the Federal Disaster Assistance Agency to move swiftly to address the crisis at Love Canal, the first time federal emergency funds were used for something other than a natural disaster. Carter ordered home sump pumps sealed off and built channels to transport standing waste water to sewers. The EPA began assiduous testing of the ground water. More than 800 families were relocated and the federal government compensated them for their homes. The school was closed and demolished, but both the school board and the chemical company refused to accept liability. The 93rd Street School was closed some two years later due to concerns about seeping toxic waste.
And then in 1980 the US Congress, working in a bipartisan manner, passed the Superfund, or Comprehensive Environmental Response, Compensation and Liability Act (CERCLA), which gave the federal government broad authority to clean up sites contaminated with hazardous waste. The act authorized the EPA to identify the parties responsible for contamination and compel them to clean it up. If the parties were slow to act, the agency was authorized to clean the site up itself, using a special trust fund, and then seek redress. It was specifically designed to hold companies that take over others accountable, taking into account the metamorphosing nature of corporate identities – in this case the takeover of Hooker by Occidental.
In 1988 United States District Judge John Curtin, in response to a motion for summary judgment, found Occidental jointly and severally liable for clean-up costs under CERCLA. In many ways this was a legislatively designed case made by the Justice Department against Hooker Chemical and Occidental Petroleum. Because the Superfund Act contained a "retroactive liability" provision, Occidental, as the owner of Hooker, was held liable for cleanup of the waste even though Hooker had followed all applicable U.S. laws when disposing of it, and Occidental had only taken Hooker over many years later and had no direct part in the dumping.
In 1994, Judge Curtin ruled that Hooker/Occidental had been negligent, but not reckless, in its handling of the waste and sale of the land to the Niagara Falls School Board. He also held that Niagara Falls City was jointly and severally liable for the public nuisance.
In 1995 Occidental Petroleum agreed to pay $129 million in restitution. The real cost of the cleanup is estimated at $250 million. The Department of Defense, which was alleged to have disposed of wastes from nearby military facilities, contributed another $8 million towards the settlement. Out of that federal lawsuit came money for a small health fund and $3.5 million for the state health study. Residents' lawsuits were also settled in the years following the Love Canal disaster (Blum). And what did the City and School Board have to pay? According to Malcolm Logan, nothing. The reason: Niagara Falls City was veering toward bankruptcy. Half the population had left, and outside the State Park and its remnant tourism the city was an industrial wasteland.
Fig 14: Map of Superfund sites in the contiguous United States, as of March 2010. Red indicates currently on final National Priority List, yellow is proposed, green is deleted (usually meaning having been cleaned up).
Love Canal is by no means alone. As of 29 November 2010, there were 1,280 sites listed on the National Priority List; an additional 347 had been delisted, and 62 new sites had been proposed. Due to a lack of support from Republicans for any form of funding for government cleanups under Reaganomics, only a fraction of the declared sites have had action taken.
Currently, houses in the residential areas on the east and west sides of the canal have been demolished. All that remains on the west side are abandoned residential streets. Some older east side residents, whose houses stand alone in the demolished neighborhood, chose to stay. It was estimated that fewer than 90 of the original 900 families opted to remain. Today a menacing chain-link fence surrounds an eerily peaceful meadow. No signage or visitor center points to its poisonous past. Only the chemical monitors that poke out of the wildflowers and two humming treatment plants suggest what lurks below. In 2004, the EPA removed Love Canal from its Superfund list, a list that was created in response to the Love Canal disaster. Lois Gibbs says: "There is a misconception that it's been cleaned up, but there's still 20,000 tons of chemicals and no one has taken a single barrel out. The waste that leaked into the soil is still there." The dump has been capped and two treatment plants were built to catch contaminants, but critics say another big storm could unearth the chemicals anew (Donaldson-James).
Logan notes the conservative reaction has tended to blame the media and the victims rather than the school board, local or federal government: “In 1998 Dr. Elizabeth Whelan, founder of the American Council on Science and Health, a right leaning advocacy group, wrote an editorial asserting that the media triggered hysteria among the residents when it called Love Canal a “public health time bomb”, and that illness among the residents was caused less by chemical waste than by stress - this in spite of an EPA study showing that 33% of the residents had undergone chromosomal damage.”
Logan wryly comments: “Whether you are the type who fears the government more than the corporations, or the type that fears corporations more than the government, the story of Love Canal should be an object lesson. No one is clean, everyone carries some of the blame.” But actually this is again an example of the breakdown arising from the complex interaction of democracy and capitalism, in which a chain of expedient decisions by several contributing parties leads to disaster. And it shows that company metamorphosis again would have left no corporate party responsible had not the legislation been crafted to retrospectively take mergers and takeovers into account.
The Deepwater Horizon oil spill of 2010 shows how lethal misadventure and devastating environmental damage can occur when corporate responsibility becomes fragmented into self-serving, cost-cutting conflicts of interest, as large transnational corporations hire other large transnational companies as contractors on highly sensitive engineering projects, in the absence of effective monitoring by the federal agencies that licensed the projects in the first place.
As the Trump administration is preparing to finalize a sweeping proposal that would allow the oil and gas industry to buy leases in every part of the Atlantic, Pacific and Arctic oceans, in addition to a leasing expansion in the Gulf, a study by University of Miami researchers shows that the toxic reach of Deepwater Horizon's oil spill was much larger – and deadlier – than previous estimates (Fears 2020).
The Deepwater Horizon oil spill in the Gulf of Mexico, on the BP-operated Macondo Prospect, is considered the largest accidental marine oil spill in the history of the petroleum industry. A 2017 report in Science puts the damage at $17.2 bn (Bishop et al. 2017). At approximately 9:45 pm CDT on 20 April 2010, high-pressure methane gas from the well expanded into the drilling riser and rose into the drilling rig, where it ignited and exploded, engulfing the platform. At the time, 126 crew members were on board: seven BP employees, 79 Transocean employees, and employees of various other companies. Eleven workers were never found despite a three-day Coast Guard (USCG) search operation and are believed to have died in the explosion. Ninety-four crew were rescued by lifeboat or helicopter, 17 of whom were treated for injuries. Following the explosion and sinking of the Deepwater Horizon oil rig, a sea-floor oil gusher flowed for 87 days, until it was capped on 15 July 2010. The total discharge is estimated at 4.9 million barrels (780,000 m3).
During the explosion, the blowout preventer should have been activated automatically, sealing the well, but it failed to fully engage. Underwater robots later were used to manually trigger the blind shear ram preventer, to no avail. Chief surveyor John David Forsyth of the American Bureau of Shipping testified that his agency last inspected the rig's blowout preventer in 2005. The permit for the Macondo Prospect by the Minerals Management Service in 2009 did not require redundant acoustic control means.
First BP unsuccessfully attempted to close the blowout preventer valves on the wellhead with remotely operated underwater vehicles. Next it placed a 125-tonne (280,000 lb) containment dome over the largest leak and piped the oil to a storage vessel. While this technique had worked in shallower water, it failed here when gas combined with cold water to form methane hydrate crystals that blocked the opening at the top of the dome. Pumping heavy drilling fluids into the blowout preventer to restrict the flow of oil before sealing it permanently with cement ("top kill") also failed. After several failed efforts to contain the flow, the well was declared sealed on 19 September 2010, although some reports indicate the well site continues to leak.
During Congressional testimony, Transocean and BP blamed each other for the disaster. It emerged that a "heated argument" broke out on the platform 11 hours before the accident, in which Transocean and BP personnel disagreed on an engineering decision related to the closing of the well. On May 14, 2010, U.S. President Barack Obama commented, “I did not appreciate what I considered to be a ridiculous spectacle… executives of BP and Transocean and Halliburton [the firm responsible for cementing the well] falling over each other to point the finger of blame at somebody else. The American people could not have been impressed with that display, and I certainly wasn't."
Fig 15: The Deepwater Horizon oil spill, technical details of the recovery effort and the effect on stock prices of the three companies centrally involved.
A massive response ensued to protect beaches, wetlands and estuaries from the spreading oil utilizing skimmer ships, floating booms, controlled burns and 1.84 million US gallons (7,000 m3) of Corexit oil dispersant. Due to the months-long spill, along with adverse effects from the response and cleanup activities, extensive damage to marine and wildlife habitats, fishing and tourism industries, and human health problems have continued through 2013. Three years after the spill, tar balls could still be found on the Mississippi coast. In July 2013, the discovery of a 40,000 pound tar mat near East Grand Terre, Louisiana prompted the closure of waters to commercial fishing.
As an indication of corruption at the helm of the cleanup, Keith Seilhan, who oversaw the cleanup of the oil spill, agreed in 2014 to pay $224,118 to settle an insider trading charge. The US Securities and Exchange Commission (SEC), which investigated Mr Seilhan, says the Texan was then a crisis manager in BP's incident command centre in Houma, Louisiana, and coordinated the initial cleanup operations. He was accused of selling his family's entire portfolio of BP shares – totalling $1m – in just two days after receiving information about the severity of the spill, before that information hit the public domain. BP shares fell almost 50% once the magnitude of the disaster became known. The 47-year-old has neither admitted nor denied the allegations.
Numerous investigations explored the causes of the explosion and record-setting spill. Notably, the U.S. government's September 2011 report pointed to defective cement on the well, faulting mostly BP, but also rig operator Transocean and contractor Halliburton. Earlier in 2011, a White House commission likewise blamed BP and its partners for a series of cost-cutting decisions and an insufficient safety system, but also concluded that the spill resulted from "systemic" root causes and "absent [of] significant reform in both industry practices and government policies, might well recur".
The spill has had a strong economic impact on the Gulf Coast's economy sectors such as offshore drilling, fishing and tourism. Local officials in Louisiana expressed concern that the offshore drilling moratorium imposed in response to the spill would further harm the economies of coastal communities, as the oil industry employs about 58,000 Louisiana residents and has created another 260,000 oil-related jobs, accounting for about 17% of all Louisiana jobs. NOAA closed 86,985 square miles (225,290 km2), or approximately 36% of Federal waters in the Gulf of Mexico, to commercial fishing, at an estimated cost of $2.5 billion to the fishing industry. The U.S. Travel Association estimated that the economic impact of the oil spill on tourism across the Gulf Coast over a three-year period could exceed $23 billion, in a region that supports over 400,000 travel industry jobs generating $34 billion in revenue annually.
The spill has also had a strong economic impact on BP. Its expenditures on the spill included the cost of the spill response, containment, relief well drilling, grants to the Gulf states, claims paid, and federal costs, including fines and penalties. As of March 2012, BP estimated the company's total spill-related expenses at up to $37.2 billion. However, by some estimates, penalties that BP may be required to pay could reach as high as $90 billion. Due to the loss of its share market value, BP had dropped from the second to the fourth largest of the four major oil companies by 2013, after selling off assets to cover Deepwater Horizon oil spill-related payouts. During the crisis, BP gas stations in the United States also reported sales down between 10% and 40% due to backlash against the company.
In 1997 BP had become the first multinational outside the reinsurance industry to publicly support the scientific consensus on climate change, which Eileen Claussen, President of the Pew Center on Global Climate Change, described as a transformative moment on the issue. In 2011–2013, in the wake of Deepwater Horizon costs, BP cut back its alternative energy business. The company announced its departure from the solar energy market in December 2011 by closing its solar power business, BP Solar. In 2012, BP shut down the BP Biofuels Highlands project, which had been developed since 2008 to make cellulosic ethanol from emerging energy crops like switchgrass and from biomass.
In November 2012, BP and the United States Department of Justice settled federal criminal charges, with BP pleading guilty to 11 counts of manslaughter, two misdemeanors, and a felony count of lying to Congress. BP also agreed to four years of government monitoring of its safety practices and ethics, and the Environmental Protection Agency announced that BP would be temporarily banned from bidding on new contracts with the US government, citing the company’s “lack of business integrity”. BP and the Department of Justice agreed to a record-setting $4.525 billion in fines and other payments in settlement of all federal criminal charges related to the explosion and spill, the largest settlement of its kind in US history. As part of the announcement, BP said it was increasing its reserve for a trust fund to pay costs and claims related to the spill. On the same day, the US government filed criminal charges against three BP employees; two site managers were charged with manslaughter and negligence, and one former vice president with obstruction. BP also paid $525 million to settle civil charges by the Securities and Exchange Commission that it misled investors about the flow rate of oil from the well, and has paid out $7.8 billion in a settlement with people and businesses affected. Further legal proceedings, not expected to conclude until 2014, are ongoing to determine payouts and fines under the Clean Water Act and the Natural Resources Damage Assessment. As of February 2013, criminal and civil settlements and payments to a trust fund had cost the company $42.2 billion. In February 2015 BP lost its bid to reduce the maximum civil fine of $13.7bn it could face for its role in the 2010 Gulf of Mexico oil spill, when a US judge rejected BP's appeal to pay a cap of $3,000 per barrel under the country's Clean Water Act.
Government prosecutors claim the firm is liable to pay $4,300 per barrel spilled to account for inflation. The court has yet to decide the amount of responsibility and final penalty the firm will pay for the disaster.
On December 15, 2010, The US Department of Justice filed a civil and criminal suit against BP and other defendants for violations under the Clean Water Act in the U.S. District Court for the Eastern District of Louisiana. The case was consolidated with about 200 others, including those brought by state governments, individuals, and companies, before U.S. District Judge Carl Barbier, who is trying the case without a jury, as is normal in United States admiralty law. The Justice Department contends that BP committed gross negligence and willful misconduct, which BP contests, and is seeking the stiffest penalties possible. A ruling of gross negligence would result in a four-fold increase in Clean Water Act penalties, which would cause the penalties to reach approximately $17.6 billion, and would increase damages in the other suits as well. Any fines from gross negligence would hit BP's bottom line very hard, because they would not be tax-deductible. The company paid no federal income tax to the U.S. government in 2010 because of deductions related to the spill. A "gross negligence" finding, under US law, would mean BP would be responsible for paying $4,400 per barrel of oil spilled instead of the standard penalty of $1,100 for a spill.
The consolidated trial's first phase began on February 25, 2013, to determine the liability of BP, Transocean, Halliburton, and other companies, and to determine whether the companies acted with gross negligence and willful misconduct. The second phase, scheduled in September 2013, will focus on the amount of oil spilled into the gulf and who was responsible for stopping it. The third phase will focus on all other liability that occurred in the process of oil spill cleanup and containment issues, including the use of dispersants. Test jury trials will follow to determine actual damage amounts.
It has to be pointed out that this disaster came on top of two previous disasters involving BP in the US. In March 2005, the Texas City Refinery, one of the largest refineries then owned by BP, exploded, causing 15 deaths, injuring 180 people and forcing thousands of nearby residents to remain sheltered in their homes. The US Occupational Safety and Health Administration (OSHA) found "organizational and safety deficiencies at all levels of the BP Corporation" and said management failures could be traced from Texas to London. The company pleaded guilty to a felony violation of the Clean Air Act, was fined $50 million, the largest fine ever assessed under the Clean Air Act, and was sentenced to three years probation. On 30 October 2009, OSHA fined BP an additional $87 million, the largest fine in OSHA history, for failing to correct safety hazards documented in the 2005 explosion. Inspectors found 270 safety violations that had been previously cited but not fixed and 439 new violations. BP appealed the fine, and in July 2012 the company agreed to pay $13 million to settle the new violations. In March 2006, corrosion of a BP Exploration Alaska (BPXA) oil transit pipeline in Prudhoe Bay transporting oil to the Trans-Alaska Pipeline led to a five-day leak and the largest oil spill on Alaska's North Slope.
In November 2010, US regulators began an investigation of BP for allegedly manipulating the gas market. BP's London offices, along with those of Royal Dutch Shell and Statoil, were raided in May 2013 by regulators from the European Commission, beginning an investigation into allegations the companies reported distorted prices to the price reporting agency Platts, in order to "manipulate the published prices" for several oil and biofuel products.
But what of Halliburton and Transocean, in this convoluted and acrimonious affair?
Halliburton is one of the world's largest oilfield services companies with operations in more than 80 countries. It owns hundreds of subsidiaries, affiliates, branches, brands, and divisions worldwide and employs over 100,000 people.
According to Tim Probert, executive vice president: “Halliburton, as a service provider to the well owner, is contractually bound to comply with the well owner's instructions”. However, on September 8, 2010, an internal report released by BP into the Deepwater Horizon explosion claimed that poor practices of Halliburton staff had contributed to the disaster. Investigations carried out by the National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling found that Halliburton was jointly at fault along with BP and Transocean for the spill. The cement that Halliburton used was an unstable mixture, which eventually allowed hydrocarbons to leak into the well, causing the explosion that started the crisis.
As of July 2013 BP appeared to have gained an edge in the battle over liability, after Halliburton abandoned one of its arguments that tried to paint BP as unconcerned about well safety. BP had employed Halliburton to oversee the process by which cement is used to seal pipes in oil and gas wells, thereby preventing leaks. Halliburton pleaded guilty to destroying evidence of internal tests it had conducted showing there was no difference between the effectiveness of putting 6 or 21 casing centralizers on the well. Centralizers are metal collars that help stabilize the well bore during cementing. According to the government, Halliburton recommended to BP that the Macondo well contain 21 centralizers, but BP chose to use 6. Prior to the settlement with the U.S. Department of Justice, Halliburton had sought in court proceedings to pin blame on BP for the blowout because of its decision to save "time and money" by using only 6 centralizers. The destroyed simulations, however, contradicted Halliburton's claim. Government investigators had ordered companies involved in drilling the well to preserve all relevant evidence.
In October 2013 a former Halliburton manager, Anthony Badalamenti, pleaded guilty to accusations that he destroyed evidence related to the spill. Badalamenti, who was given one year of probation, said it was discovered after the spill that the use of fewer centralizers on the well made little difference, and he ordered the program manager to destroy the results of the simulation.
A payment of just $200,000 to the Department of Justice ends the DOJ's case against Halliburton, which also agreed to three years of probation and to continue cooperating with the criminal probe. Halliburton also made a separate, voluntary $55 million payment to the National Fish and Wildlife Foundation, the Justice Department said. At the same time, the plea seems to hurt Halliburton as it seeks to settle its share of private claims over the disaster, currently estimated at $1.3 billion.
As of September 2014 Halliburton has reached a $1.1 billion settlement over its role in the 2010 spill. Under the settlement, Halliburton is protected from further punitive damages, if the court rules in the future that the company had been negligent for its role in the blowout. The U.S. District Court for the Eastern District of Louisiana must still approve the settlement. If a federal judge found Halliburton's gross negligence to be a major factor in the blowout, plaintiffs could choose to hold out for a larger settlement. In that case, Halliburton would most likely appeal the decision, which would further delay payments. BP has so far paid about $28 billion for its part in the blowout.
As noted, Halliburton, BP and Transocean Ltd are all defendants in a federal civil trial that began in February 2013 to apportion blame and set damages for Macondo. The trial is scheduled to resume on Sept 30 2013. BP is struggling through a separate legal battle over the payment of claims to people and businesses for spill-related losses. A federal appeals court is considering the case, and Halliburton said this process was impeding its push to settle its own liability through talks. Halliburton also disclosed that its legal fees and other expenses related to Macondo totaled $223 million, of which $190 million is covered by insurance. The prospects of a global settlement of the civil litigation by all parties seems remote, since Judge Barbier has not yet ruled on whether BP or its co-defendants were guilty of gross negligence and may not until the trial's next stage ends.
Also as of September 2014, New Orleans judge Carl Barbier has ruled BP was "grossly negligent" in the lead-up to the 2010 Deepwater Horizon oil spill. He ruled that BP will be "subject to enhanced civil penalties" due to its "gross negligence" and "wilful misconduct". The ruling could quadruple the civil penalties that BP must pay as a result of the spill to an estimated $18bn. BP said in a statement that it "strongly disagrees" with the ruling and that it would appeal to a higher court. Judge Barbier said BP should shoulder 67% of the blame for the 2010 spill, with drilling rig owner Transocean responsible for 30% and cement firm Halliburton responsible for 3%.
Transocean and its associated companies form one of the world's largest offshore drilling contractors. Transocean's day rates extend as high as US$650,000 for its deep-water drillships, which house dual activity derricks and can drill in ultra-deep ocean depths of 10,000 ft (3,000 m). Transocean was rated as a leader in its industry for many years. However, since the company's merger with GlobalSantaFe in 2007, Transocean's reputation has suffered considerably. The Deepwater Horizon explosion has further hurt its reputation. “Transocean is dominant, but the accident has definitely tarnished its reputation for worker safety and for being able to manage and deliver on extraordinarily complex deepwater projects,” said Christopher Ruppel, an energy expert and managing director of capital markets at Execution Noble.
While BP owned the Macondo well and was in charge of onsite operations, it leased the Deepwater Horizon rig and its crew from Transocean Ltd. In January 2013, the Justice Department announced that Transocean Deepwater Inc. had agreed, subject to the court’s approval, to pay $400 million in criminal fines and penalties and to continue its on-going cooperation in the government’s criminal investigation. Transocean and its associated companies have also agreed to pay an additional $1 billion to resolve federal Clean Water Act civil penalty claims. Under the civil settlement, the Transocean defendants also must implement court-enforceable measures to improve the operational safety and emergency response capabilities at all their drilling rigs working in waters of the United States. “This agreement holds Transocean criminally accountable for its conduct and provides nearly a billion dollars in criminal and civil penalties for the benefit of the Gulf states,” said Attorney General Eric Holder. Transocean has not faced any felony charges at the personnel or corporate level of the kind that BP has.
In 2011, in the face of withering criticism of the huge “safety” bonuses granted to its executives, Transocean said its top five executives would donate their windfall to the families of the workers who died in the explosion aboard the vessel. Transocean had announced earlier that week that it was distributing the bonuses to its top executives for making 2010 the “best year” in safety the company had experienced, notwithstanding the death of 11 workers aboard the Deepwater Horizon rig, nine of whom were Transocean employees. The five said they would donate more than $250,000 to the Deepwater Horizon Memorial Fund, which Transocean established, CNN reported, though it was unclear how much of each executive’s bonus the donation represented, or whether the donation was collective or an individual sum of $250,000 from each executive. The fund had distributed more than $1.6 million to the families of the 11 perished workers.
In February 2013 expert witness Alan Huffman, chief technology officer for Fusion Petroleum Technologies Inc and a former geophysicist for Conoco and Exxon, told the third day of the New Orleans trial that BP should have heeded a "kick" in the well – which results in everyone on the drilling floor being rained with mud. "It is truly egregious to drill that extra 100 feet, knowing you could lose the well in the process". Mud is used both for pressure suppression and lubrication within the drill hole column, but it is never a guaranteed prevention against a blowout, because the formation pressure can always be higher than the weight of mud can withstand. Additionally, when gas is entrained in the mud, it lowers the mud’s density. A full shutdown to avoid disaster means closing the blowout preventer valve, shutting in the drill string, but this is expensive.
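The balance Huffman describes can be sketched numerically: a kick becomes possible whenever the formation pressure exceeds the hydrostatic pressure of the mud column, and entrained gas makes this more likely by lowering the mud's effective density. The figures below are illustrative assumptions, not Macondo well data:

```python
# Hydrostatic pressure of a drilling-mud column versus formation
# ("pore") pressure: a kick occurs when the formation pressure
# exceeds the mud column's hydrostatic pressure.
# All numbers below are illustrative, not actual well data.

G = 9.81  # gravitational acceleration, m/s^2

def hydrostatic_pressure(mud_density_kg_m3, depth_m):
    """Pressure (Pa) at the bottom of a mud column of given density and depth."""
    return mud_density_kg_m3 * G * depth_m

def kick_risk(mud_density_kg_m3, depth_m, formation_pressure_pa):
    """True if the formation pressure exceeds the mud's hydrostatic pressure."""
    return formation_pressure_pa > hydrostatic_pressure(mud_density_kg_m3, depth_m)

depth = 5500.0      # total vertical depth in metres (hypothetical)
mud = 1700.0        # heavy mud around 1.7 specific gravity, kg/m^3
formation = 95e6    # hypothetical formation pressure, Pa

print(hydrostatic_pressure(mud, depth) / 1e6)   # about 91.7 MPa
print(kick_risk(mud, depth, formation))         # True: the well would kick

# Gas entrained in the mud lowers its effective density, reducing the
# hydrostatic head and making a kick more likely at the same depth.
gas_cut_mud = 1500.0
print(kick_risk(gas_cut_mud, depth, formation)) # True, by a wider margin
```

This is why weighting the mud correctly matters, and why drilling deeper into a higher-pressure zone without adjusting it, as Huffman alleged, raises the risk of losing the well.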
Former BP CEO Tony Hayward’s recorded testimony insisted that cost-cutting at BP had no effect on its drilling operations. “This wasn’t our accident,” BP’s chief executive had declared at one point, laying blame on Transocean’s systems, equipment and people. In contrast, former BP drilling chief Kevin Lacy testified that $250 to $300 million had been cut from his budget in 2008-09, while production increased by over 50%. "I was never given a directive to cut corners or deliver something not safely, but there was tremendous pressure on costs," he said. Lacy resigned from BP a few months before the spill because of his concerns about safety.
The Bureau of Ocean Energy Management, Regulation and Enforcement (BOEMRE), formerly known as the Minerals Management Service (MMS), was the agency of the United States Department of the Interior that managed the nation's natural gas, oil and other mineral resources on the outer continental shelf (OCS). In September 2008, reports by the Inspector General of the Interior Department implicated over a dozen officials of the MMS of unethical and criminal conduct in the performance of their duties. The investigation found MMS employees had used cocaine and marijuana, and had sex with energy company representatives. MMS staff had also accepted gifts and free holidays amid "a culture of ethical failure", according to the investigation. The New York Times's summary states the investigation revealed "a dysfunctional organization that has been riddled with conflicts of interest, unprofessional behavior and a free-for-all atmosphere for much of the Bush administration’s watch."
A May 2010 inspector general investigation revealed that MMS regulators in the Gulf region had allowed industry officials to fill in their own inspection reports in pencil and then turned them over to the regulators, who traced over them in pen before submitting the reports to the agency. MMS staff had routinely accepted meals, tickets to sporting events, and gifts from oil companies. Staffers also used government computers to view pornography. In 2009 the regional supervisor of the Gulf region for MMS pled guilty and was sentenced to a year's probation in federal court for lying about receiving gifts from an offshore drilling contractor. "This deeply disturbing report is further evidence of the cozy relationship between MMS and the oil and gas industry," Secretary of the Interior Ken Salazar said.
The Project On Government Oversight (POGO) alleges that MMS has suffered from a systemic revolving door problem between the Department of Interior and the oil and gas industries. For example, thirteen months after departing as MMS director, Bush appointee Randall Luthi became president of the National Oceans Industries Association (NOIA), whose mission is "to secure reliable access and a favorable regulatory and economic environment for the companies that develop the nation's valuable offshore energy resources in an environmentally responsible manner."
On May 11, 2010, in response to the Deepwater Horizon oil spill, Salazar announced that MMS would be restructured so that the safety and environmental functions are carried out by a unit with full independence from MMS in order to ensure that federal inspectors will have more tools, resources, and greater authority to enforce laws and regulations that apply to oil and gas companies operating on the Outer Continental Shelf.
MMS's regulatory decisions contributing to the 2010 oil spill included the negligent decision that an acoustically controlled shut-off valve (BOP) would not be required as a last resort against underwater spills at the site, MMS's failure to suggest other “fail-safe” mechanisms after a 2004 report raised questions about the reliability of the electrical remote-control devices, and the fact that MMS gave permission to BP and dozens of other oil companies to drill in the Gulf of Mexico without first getting required permits from the National Oceanic and Atmospheric Administration, which assesses threats to endangered species and the impact drilling is likely to have on the gulf.
In June 2010 Robert Costanza et al, in "The Perfect Spill: Solutions for Averting the Next Deepwater Horizon", note the unaccounted ecosystem costs and the contradictions in the burden of proof for public damage caused by ‘business-as-usual’, and suggest solutions.
"The spill has directly and indirectly affected at least 20 categories of valuable ecosystem services in and around the Gulf of Mexico. The $2.5 billion per year Louisiana commercial fishery has been almost completely shut down. As the oil extends to popular Gulf Coast beaches, the loss of tourism revenue will also be enormous. In addition, the spill has damaged several important natural capital assets whose value in supporting human well-being is both huge and largely outside the market system. These nonmarketed ecosystem services include climate regulation via the sequestration of carbon by coastal marshes and open water systems, hurricane protection by coastal wetlands, and cultural, recreational, and aesthetic values. Since the time of the Exxon Valdez spill, we have developed better techniques to estimate the value of the damage to these public assets. A recently released study estimated the total value of these ecosystem services for the Mississippi River Delta to be in the range of $12-47 billion per year. Based on the flow of these services into the future, the value of the Delta as a natural asset was estimated to be in the range of $330 billion to $1.3 trillion, far more than the total market value of BP ($189 billion) before the spill. Unlike BP, ecosystem service values are outside the market. They continue to produce benefits unless an action like the spill damages them."
Fig 15b: Left: More than 1,300 bottlenose dolphins have stranded in the northern Gulf of Mexico since early 2010.
New research links this unusual mortality event to the massive Deepwater Horizon oil spill (doi:10.1038/nature.2015.17609 Photo: Julie Dermansky/Corbis).
"The Deepwater Horizon incident, like the banking crisis, resulted from inadequate attention to the risks that the public was left to bear. Precautionary measures were known but not taken. Investments in safety devices (like the acoustic blowout preventer) were not made. Corners were cut. Short-term private profits motivated taking high risks with public assets. The fundamental problem is that while private interests are ultimately liable for damages to public assets, they are only held accountable long after the fact and only partially. This gives private interests strong incentives to take large risks with public assets - far larger than they should from society’s point of view."
They suggest a series of measures to correct this for the future:
1. Assessment and incorporation of the full value of public natural capital assets into both corporate and public accounting and decision-making.
2. Assessment of the risks and worst-case damages that could result from accidents, based on this more broadly assessed value.
3. Application of the best science available about the complex linkages between human systems and the rest of nature.
4. Reversal of the burden of proof and requirement of corporations and other private interests to internalize and monetize their risks to public goods.
5. Realignment of investment incentives for both public and private investment away from greater oil dependency and toward renewable domestic energy sources.
Ecological economics is not an alternative notion – it is scientific reality. We live and survive in and because of the natural environment. Nature and survival are the ultimate feedback mechanism. Neither ecologies nor economies are steady state. Nature is forever in a state of fluctuation. The impacts of earthquakes, tsunamis, storms, massive eruptions, plagues, natural disasters such as fire and flood, the impact of asteroids, or nearby supernovas all affect human survival and even more sensitively the economy. The real question is how to apply ecological economics to political and economic society in such a way as to achieve sustainability. This is a question of informed decision-making.
Ecological economics has a variety of manifestations. For example green and ethical investment applies to selectively choosing stocks and investments that are environment positive or founded on a wider ethical principle such as not allowing unethical treatment of workers or consumers. By definition, much of green investing comes down to renewable energy and sustainable industries and technologies.
Ecological economic theory also comes with an ethical foundation based in preserving the integrity of the biosphere. The natural ecosystems of the planet tend to be ignored in traditional economic thinking, even though their long-term genetic value for medicines and for future technological and scientific advances, let alone their intrinsic value as living species and habitats, could be enormous and dwarf current social and industrial economies, just as the impacts of major world disasters like the two recent great tsunamis did. To turn this thinking around, Costanza and coauthors set out in Nature to value the planetary ecology in financial terms. Dollar figures were averaged to a per hectare value for different types of ecosystem, e.g. wetlands, forests and oceans. A total was then produced which came out at 33 trillion US dollars (1997 values), more than twice the total GDP of the world at the time of the study.
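The arithmetic behind such a valuation is straightforward: an average per-hectare service value for each biome is multiplied by that biome's global area, and the products are summed. A minimal sketch, using illustrative placeholder figures rather than the actual 1997 data:

```python
# Sketch of a Costanza-style aggregation: average a per-hectare
# service value for each biome, multiply by its global area, and sum.
# The figures below are illustrative placeholders, not the 1997 data.

biomes = {
    # biome: (value in US$ per hectare per year, area in millions of hectares)
    "open ocean":      (250, 33_200),
    "coastal systems": (4_000, 3_100),
    "wetlands":        (15_000, 330),
    "forests":         (1_000, 4_850),
    "grasslands":      (230, 3_900),
}

def global_value_usd_per_year(table):
    """Total annual ecosystem-service value: sum of value/ha times hectares."""
    return sum(v_per_ha * m_ha * 1e6 for v_per_ha, m_ha in table.values())

total = global_value_usd_per_year(biomes)
print(f"${total / 1e12:.1f} trillion per year")  # prints $31.4 trillion per year
```

The striking feature of the method is less the precise total than the dominance of high-value-per-hectare systems like wetlands and coastal zones, which is exactly why the Mississippi Delta figures quoted above are so large.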
Fig 16: (Top left) Ecological economics is nested within human society and the natural environment, so it is the natural environment on which we depend which provides the ultimate feedbacks in a sustainable economy. Only when all three are operative together do we attain a sustainable culture and society. (Top right) A region-by-region costing of the Earth’s natural ecosystemic environment by area (Costanza et al). (Lower left and right) World GNP and projected GNP growth for 2013. (Centre left) Game theory matrix for the mutual prisoner’s dilemma game of sexual strategies under patriarchy and a partnership society, in which female reproductive choice is respected.
The study was criticized by pre-ecological and even some environmental economists for being inconsistent with the assumptions of financial capital valuation, and by ecological economists for being inconsistent with an ecological economics focus on biological and physical indicators. The whole idea of treating ecosystems as goods and services to be valued in monetary terms remains controversial to some. A common objection is that life is precious or priceless, but this demonstrably degrades to it being worthless under the assumptions of any branch of economics. Nevertheless, this study plays a central role because it defines a quantitative relationship between ecology and economy. The valuation may be a vast understatement, but it represents a transition in which the biosphere is quantified in a way that can become a feedback process in economic planning.
The ethical case of ecological economics has a perfectly good scientific foundation, but in the complex strategic society we live in, nothing is ever so simple, because defection and cooperation are always in strategic conflict.
Reproductive choice provides a central case example. Under any social paradigm, an equilibrium of reproductive choice strategies between the sexes involves both cooperators with the strategy of the opposite sex and defectors against it. Generally the game remains a mix of cooperators and defectors because the rarer entity becomes the most highly prized. In a society of loose women, a faithful wife can command a fortune. In a society of obedient wives, a scarlet woman can command every man’s attention.
These complex mixtures hold true in diverse social regimes, from patriarchy to sexual partnership societies, where female reproductive choice is respected. In a partnership society most women exert reproductive choice, so the male strategies are a function of what women choose and remain dynamic in a Red Queen evolutionary race. In a patriarchal society, men can exert reproductive control over women so we have an apex of potentates much like financial millionaires, who have many female partners, followed by a graduated rank and file of lesser males either aspiring to be faithful husbands to hold onto the one they have, or to cheat the others by being philanderers. Faithful husbands ensure their position by asserting social or physical penalties for unsupervised female reproductive choice.
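The principle that "the rarer entity becomes the most highly prized" is negative frequency-dependent selection, and the resulting mixed equilibrium can be illustrated with a minimal replicator-dynamics sketch. The payoff function here is an assumption, chosen only so that each strategy's payoff falls as it becomes common:

```python
# Minimal replicator-dynamics sketch of negative frequency-dependent
# selection: each strategy's payoff falls as it becomes common (the
# rarer strategy is more highly prized), so the population converges
# to a mixed equilibrium rather than fixing on one strategy.
# Payoff parameters are illustrative assumptions.

def step(p, a=1.0, dt=0.1):
    """One replicator step for the frequency p of strategy A.
    Payoff of A = a*(1-p), better when rare; payoff of B = a*p."""
    payoff_a = a * (1 - p)
    payoff_b = a * p
    mean = p * payoff_a + (1 - p) * payoff_b
    return p + dt * p * (payoff_a - mean)

p = 0.9  # start with strategy A common (e.g. faithful mates)
for _ in range(500):
    p = step(p)
print(round(p, 3))  # converges to the mixed equilibrium, 0.5
```

However common the starting mix, the dynamics settle at the interior equilibrium, which is the formal counterpart of a society that remains a stable blend of cooperators and defectors.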
Machiavellian intelligence, along with mutual sexual selection, has been a principal formative process in the emergence of super-intelligence and the complexification of culture. It is key to the way strategic bluffing becomes the hallmark of a complex society: the most intelligent primates have the most sophisticated strategic bluffing, which gives their societies their richness of niches and complexity. Strategic bluffing is central to defecting as far as possible towards one’s own interests while appearing, through hype and spin, to act as an ethical agent. These processes are natural strategies of defection and are necessary to the richness and complexity of a society and to how many people can coexist within it in viable social niches. But they exist only in a state of strategic contest with the forces of order, as represented in government, particularly of the left, and, in the natural context, in the overall stability of the entire ecosystem.
Given the formative role of Machiavellian intelligence in human social evolution, these populations each become complex systems in which there are many feedback avenues through the process of culture. The end result is that society becomes a complex system in which criminality and legitimate protest fall on the side of chaotic defection and good government and social harmony fall on the side of order.
Ecological economics makes an ethical case for the need to adopt a globally responsible view because it is necessary for our long-term future. However Machiavellian intelligence makes it difficult for any overarching rationale, because libertarian forces of defection mount against the order, testing the dynamical system for any justification to intervene in the spirit of freedom of choice. We can see this in the climate change sceptics and their strategic attempts to unhinge the scientific case for climate change – supported by a number of clandestine funding sources in industry and finance – so that business-as-usual can reign supreme on the basis that we all charge on unless someone can prove the damage is real. The spin is to apply the reckless proof principle instead of the precautionary one – burn till the climate proves it is out of control.
An indicative confessional comes from Grover Norquist, American economist and political activist, president and founder of Americans for Tax Reform, which he says was founded at the request of then-President Ronald Reagan. The following has become his personal signature: “I don't want to abolish government. I simply want to reduce it to the size where I can drag it into the bathroom and drown it in the bathtub.” Norquist is the primary promoter of the "Taxpayer Protection Pledge", a pledge by lawmakers to oppose increases in marginal income tax rates for individuals and businesses, as well as net reductions or eliminations of deductions and credits without a matching reduced tax rate. Prior to the November 2012 election, the pledge had been signed by 95% of all Republican Members of Congress and all but one of the candidates running for the 2012 Republican presidential nomination.
Ultimately ecological economy has to become a fully informed knowledge economy where the precautionary onus has to be on testing attempts at defection for their likely deleterious effects. This is what anticipating circumstances is all about and that is how an informed culture avoids lethal tipping points. Effectively written knowledge, science and history are culture’s equivalent of genetics - the cumulative principle by which we can come to understand how we got to where we are and what the real decisions are that we need to make so life continues to flourish on Earth. These are the cumulative memes of cultural evolution that fulfil the cumulative informational role that genes do in biological evolution.
Libertarianism is fundamentally a strategic defection against social order in the guise of freedom of choice. The aim is actually for the libertarians themselves to take strategic advantage of what they are advocating; the spin is that you are persuaded to agree based on a misrepresented ethic of freedom for all. Hence it is a strategic deceit or bluff. This is why Richard Dawkins, author of “The Selfish Gene” and spinner of the social meme, warned that giving out any kind of information to others is a sucker’s game which gives others strategic power over you, unless you are disseminating a deceptive perspective for your own strategic advantage. On the other hand, government can itself become totalitarian by the same strategic bluff – “we are doing the best for you” – even when acting covertly to spy on the population and act to its own advantage. Of course sometimes one can actually disseminate truthful information, so we can all come to a win-win outcome, as the prisoners’ dilemma also demonstrates. Science and transparency tend to bring about just such a win-win escape from double jeopardy.
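The win-win escape the prisoners' dilemma demonstrates can be made concrete with the standard payoff matrix. The particular values below are the conventional illustrative ones (temptation > reward > punishment > sucker's payoff): defection dominates any one-shot encounter, yet mutual cooperation pays both players more than mutual defection.

```python
# Standard prisoners' dilemma payoff matrix, with the conventional
# illustrative values T=5 > R=3 > P=1 > S=0: defection is each
# player's dominant strategy in a one-shot game, yet mutual
# cooperation (R, R) beats mutual defection (P, P).

PAYOFF = {  # (my move, their move) -> my payoff
    ("C", "C"): 3,  # reward R
    ("C", "D"): 0,  # sucker's payoff S
    ("D", "C"): 5,  # temptation T
    ("D", "D"): 1,  # punishment P
}

def best_reply(their_move):
    """My payoff-maximising move given the other player's move."""
    return max("CD", key=lambda m: PAYOFF[(m, their_move)])

# Defection dominates: it is the best reply to either move...
print(best_reply("C"), best_reply("D"))        # D D
# ...yet both players do better cooperating than mutually defecting:
print(PAYOFF[("C", "C")], PAYOFF[("D", "D")])  # 3 1
```

The double jeopardy is exactly this gap: self-interested best replies drag both players down to the (1, 1) outcome, while honest mutual cooperation would give each of them 3.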
For a society to exist in evolutionary time, it needs to apply the precautionary principle to instabilities manifesting as strategic bluffs for one party or another to gain strategic advantage. Genetics applies a very strong precautionary principle, because most of the genetic descendants of an organism or a sexual zygote are faithful copies of the original complement, albeit possibly sexually recombined to make a fully viable parental genetic combination. Only a very small proportion are mutated versions, all but a few of which are less fit and die out, with a very few displaying a new functionality advantageous to survival. The entire genetic resource is never placed at risk except in species extinction, or in a lethal epidemic, such as the pneumonic Black Plague, when natural selection again tends to diminish lethal virulence to sustainable levels, or the parasite will die along with its host.
“Ring-a-ring o' roses, A pocket full of posies, A-tishoo! A-tishoo! We all fall down.”
Society can survive long term only by applying the precautionary principle to chaotic change, which by its nature is unpredictable. This doesn’t mean a static state of oppressive rule of order, because the natural ecology is a complex dynamical system. The precautionary principle is: prove your oil well is going to be drilled safely – before you drill. Prove global warming and climate change are not effects of burning fossil fuels before you consume them all at exponentiating rates. In the unregulated economy the oil driller pollutes because of unsafe cost-cutting, and society has to bear the burden until it can prove the perpetrator was legally liable. In all three of the disasters we have examined – the Grand Banks cod, Love Canal and Deepwater Horizon – the burden of proof has taken years to litigate.
Central to the prisoners’ dilemma complex system of defection and cooperation is the role of altruistic punishment. One strategically defects against a perceived defector, even though one will not benefit and indeed might suffer, to maintain the integrity of the social order. This is the basis of the rule of law, which by its very nature is the feedback process of altruistic punishment by society of perceived forms of transgression. It is also an unholy truth of morally prescriptive religions. All regulatory legislation is fundamentally totalitarian and has to have teeth to be effective, so it is in a sense both oppressive and coercive. Nevertheless altruistic punishment is essential to social survival. So we have to ensure that processes of altruistic punishment are actually designed to preserve freedom of choice and social diversity, or suffer the threat of the police state. This is where the knowledge economy and transparency of information becomes key.
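The stabilizing role of altruistic punishment can be sketched in a toy public-goods round. This is an illustrative model only – the player numbers, pot multiplier and fine sizes below are invented assumptions, not figures from the text:

```python
# A minimal sketch of why altruistic punishment can stabilize
# cooperation in a public-goods game. All parameters are illustrative.

def payoff(contributes, others_contributing, n=10, r=3.0,
           fine=2.0, punishers=0):
    """Payoff of one focal player in a one-shot public-goods round.

    Each contributor pays 1 unit into a pot; the pot is multiplied by r
    and split equally among all n players. Each altruistic punisher
    fines a defector `fine` units (the punishers' own small cost is
    ignored here, since we only track the focal player's payoff).
    """
    pot = (others_contributing + (1 if contributes else 0)) * r
    share = pot / n
    cost = 1 if contributes else 0
    penalty = 0 if contributes else punishers * fine
    return share - cost - penalty

# Without punishment, defection strictly pays: you keep your unit and
# still collect the shared return on everyone else's contributions.
coop = payoff(True, others_contributing=9)     # 2.0
defect = payoff(False, others_contributing=9)  # 2.7
print(defect > coop)                            # True: defection dominates

# With a few altruistic punishers, defection becomes the losing move.
defect_punished = payoff(False, others_contributing=9, punishers=3)
print(coop > defect_punished)                   # True: cooperation holds
```

The point of the sketch is that the punishers' fines, not the cooperators' generosity, are what flip the incentive – which is why the text calls regulation's "teeth" essential.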
An informed society depends on freedom of information. We need to know what really goes on in the world – what has actually happened, the mistakes made and the crises avoided. This way an informed society is capable, through its knowledge, of avoiding disaster. Governments, politicians and corporations often try to do exactly the opposite. They act to varying degrees as Machiavellian agents for their own purposes, as instruments of maintaining power. Information is distorted, fabricated, controlled and suppressed in a variety of ways, histories are rewritten, and discussion is forbidden. The capacity of democracy to function ecologically depends on the freedom of information so as to apply the precautionary principle before irreversible damage is done. The informational society of the internet and the cell-phone means we are all being spied on all of the time. We need to make sure we are the arbiters of freedom of information. Like any social revolution, engendering ecological economy is a transition from chaos to order in which social struggle is both the medium and the agent.
We now turn to a small land-owning company which, at face value, might appear to violate many of the principles of association for the mutual profitability of the shareholders that company law implies, until ecological economic accounting is taken into account.
In 1970 a group of ten to sixteen families formed a small private company to own a block of 300 acres (124 hectares) of land we had found, with a view to forming a residential conservation community with the aim of perpetually conserving and regenerating the land. The land, which in former times was held under native title, is a superbly beautiful, wild, windswept peninsula with sweeping views of Mount Moehau and the islands of the Hauraki Gulf.
Fig 17: (Above) The Land looking toward Moehau and the islands of the Hauraki Gulf. (Lower left) One clause of the Memorandum of Association allowing us, among other things, to be aircraft manufacturers, giving a very broad business phenotype, barring only insurance. (Lower right) Trends in land valuations (net asset backing), which as of 2007 exceeded share values – tied to the CPI and the 1991 land valuation – by a factor of 10.
Ownership of the land was set up in a private limited liability company under the then 1955 Companies Act. The one exception to the usual articles was a pair of clauses which (a) gave the directors power to levy the shareholders to pay for the costs of rates and land care and (b) gave the directors the power to take power of attorney over a shareholding for transfer to another party to recover costs if these levies were not forthcoming within two weeks of giving notice. In every year since incorporation, the company has made a loss to pay the rates and ongoing costs funded by the levies. This is, at face value, a complete reversal of a company being formed for the mutual profit of its shareholders, but it was deemed essential to ensure the rates would get paid on what was going to become a residential conservation estate with no plans of a farming or any other immediate profitable business venture.
Under the 1955 Companies Act a company was formed via a memorandum of association as well as a set of articles setting out the rights and voting procedures of the general meeting and board. The memorandum was the compact of agreement and could, but did not have to, set out the purpose for which the company was founded. In retrospect, this provision, which was later abolished in the 1995 Companies Act, set out a kind of phenotype, which could prevent the company making a radical change of business – the tiger becoming a shark.
In our case there was a wide-ranging portfolio of potential business activities, from land ownership and farming, through lime cement plastering, aircraft manufacture and radio engineering, to hardware or clothing merchandising. Just one clause among many is shown in fig 17. However there was just one caveat: “Nothing herein shall be deemed to authorize the company to carry on the business of an insurance company” – a pertinent reminder of the double jeopardy between banks and deposit insurers that the Glass-Steagall Act sought to avoid. The phenotype idea is certainly a constitutional protection against unbridled corporate metamorphosis and could act as a brake on takeover, but by the time of our incorporation it had obviously become so wide-ranging as to be meaningless.
As we began convening meetings, we ran into a problem of quorums. The quorum for a general meeting had been set at ¾ of the shareholders, but we were spread out across the country and could not reliably get such a majority together. Section 362 of the then act set out a procedure of ¾ of the shareholders co-signing a resolution in the minute book in lieu of a general meeting. We adopted this procedure because it gave greater security of a fully authorized decision and continued to adopt signed consensus in all our subsequent general meetings, having become used to the procedure, because it gave each of us confidence we wouldn’t be subjected to tyranny of the majority in a divided vote.
Consensus is a very difficult process to negotiate and involves a lot of argument and often ill feeling when people don’t really like a decision, but feel obliged to go along with it to keep the peace and preserve the consensual integrity. Because any one party can effectively veto any decision, consensus decision-making’s chance of a successful outcome tends to decline super-exponentially, with the factorial of the number of people involved, rather like the travelling salesman problem of finding the shortest route around a set of cities. History, from the amphictyony of Greece and the twelve tribes of Israel to twelve angry men on a jury, attests to a dozen being close to the functional limit. Our general meetings have generally had eight to twelve people and have succeeded over forty years in making decisions without a formal breakdown of the consensus process. Consensus decision-making also applies the precautionary principle to abrupt change, because everyone has to be satisfied they can live with a decision before it can be resolved.
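The combinatorial burden behind that "dozen" limit can be made concrete with a quick count. As a rough sketch (treating negotiation sequences like travelling-salesman routes is an assumption of this illustration): pairwise relationships among n parties grow only quadratically, but the orderings in which their objections can be worked through grow factorially:

```python
# Why consensus gets hard fast: pairwise relationships grow as n(n-1)/2,
# but the possible orderings of negotiation grow as n! - the same
# explosion that makes the travelling salesman problem intractable.
import math

for n in (4, 8, 12):
    pairs = n * (n - 1) // 2        # distinct pairwise relationships
    orderings = math.factorial(n)   # possible negotiation sequences
    print(f"{n:2d} parties: {pairs:3d} pairs, {orderings:,} orderings")

# At the "twelve angry men" limit there are only 66 pairwise
# relationships to keep sweet, but already ~479 million orderings.
```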
Consistent with the standard provisions of both Companies Acts, the directors were appointed de jure until such a time as a notified meeting of shareholders sought to remove them. The business of the company was actually decided by annual or special general meetings and the directors functioned essentially to maintain the legal process of calling and convening meetings in which decisions were made by shareholder signed consensus.
In the early life of the company there was a relatively high turnover of shares. The articles provided for a standard process of offering available shares to existing shareholders and then on the open market, but we tried to make a general meeting decision agreeable to all parties to find the best incoming party to fit with our own collective situation. As land values continued to escalate, stoked by deregulation allowing New Zealand land to be sold to foreigners, some shareholders sought to get a maximum payout of the asset value of a shareholding based on land valuations. This gave rise to concerns that the escalating value of the shares would cause us to lose control of who the shareholders were and be taken over by financial pressures to become a residential subdivision. We also became concerned that, in the event we didn’t clear existing sellers promptly, they might form a coalition and try to sell to an asset-stripping company for a hostile takeover of our assets, as was a notable feature of the corporate activity of companies such as Brierley Investments at the time.
At the same time, we became concerned that large houses built on the land could drive both the share and house values beyond reach of the current members and the direction of the company away from the long-term conservation reserve approach we had set out to safeguard, instead towards an exclusive coastal resort. Technically we had made no provision for private ownership of dwellings, with the company owning all of them by default as owner of the land and all fixtures thereon. However one shareholder who had built a very substantial house in a remote part of the land sought legal damages when they sought to sell, on the basis that we had let them build and as a consequence would unfairly benefit from their misfortune, unless we compensated them ourselves.
We resisted this and an under-the-table deal was done for a more moderate price with an existing shareholder, but this caused us to begin to research several legal pitfalls – how to keep house prices within bounds, how to avoid share sales reaching unmanageably high prices, and whether a limited company was the best vehicle to own land for long-tenure conservation. We researched many types of alternative ownership structure, including incorporated societies, charitable trusts, and private trusts. After legal consultation with several specialist lawyers in the field of collective land ownership, we arrived at a proposal whereby a shareholder wishing to dispose of their property could offer it to existing shareholders but, if no buyer was forthcoming, could remove the materials at their own cost – effectively costing houses as just a heap of materials. This was opposed by one shareholder and we could not afford to impose it, because it could create a chain of legal claims if we made an exception for them even though others had agreed.
At this point the New Zealand government decided to re-enact company law, passing the 1995 Companies Act and requiring every company in New Zealand to re-constitute, drawing up from scratch a completely new company Constitution to which the shareholders had to agree. This provided us an opportunity to reconstitute ourselves with the hindsight of twenty years of ongoing experience of our own operating conditions.
We thus pored over the new legislation from beginning to end and designed a Constitution with the most exhaustive precautionary safeguards we could incorporate, in consultation with two independent sources of legal advice. Here are some of the leading features:
1. The company’s overriding purpose was defined to be the perpetual conservation of the land of Opuhi Reserve, giving it an effective phenotype.
2. Share price was constrained to the consumer price index (CPI). In any share transfer, the transferor had to give a written notice to the company, which unilaterally appointed the directors as agents to transfer the shareholding to a party approved by a general meeting for a price having an upper limit of the CPI adjusted 1991 unimproved land valuation. In any dispute, conservation organizations such as Greenpeace were to arbitrate, rather than the Society of Chartered Accountants.
3. Improvements resulting from any dwellings or other fixtures on the land occupied by shareholders for their own use were not to be included in estimation of share value as the materials could subsequently be removed by the shareholders concerned.
4. No smaller parcel of shares could be transferred than the smallest existing parcel to keep strict bounds on the number of shareholders.
5. There was no transmission of shares, but a child or subsequent offspring of a shareholder, or a trust for their benefit approved by the company, could receive a shareholding as a family transfer unless 75% of the shareholders disagreed. Although shares could technically be held jointly, none so far have been, partly out of a concern that endless subdivision of shareholdings – forcing transmission to all offspring rather than to those actually making their life on the land – was historically an instrument used to indirectly alienate Maori land by dividing it into ever-diminishing shareholdings.
6. One hundred percent agreement for major change. A special resolution – required to change the constitution, sell, sublease or subdivide the land, execute a mortgage or undertake debt, make a guarantee, agree to buy further land, make a major transaction under the Act, or liquidate the company – needed 100% of the shareholders voting and entitled to vote at a properly notified general meeting, with provision for postal voting. The ultimate precautionary principle.
7. No asset splitting. In the event of liquidation, any remaining assets were to be held together in a trust for the equitable benefit of the shareholders for a period of at least 20 years to inhibit any temptation to wind the company up for asset stripping.
8. Shareholders control management. The general meeting was empowered to instruct the directors and all shareholders so doing became directors in the execution of such a resolution, formalizing the general meeting being the equitable managers of the company with the directors acting only as day-to-day managers between meetings.
9. Consensus decision-making was formalized in the sense that a resolution by signed consensus at a general meeting became legally binding on the shareholders, although the formal general meeting process nominally followed standard majority voting. All meeting resolutions have continued to be by signed consensus.
10. An incoming shareholder had to agree to honour the existing constitution and abide by the consensus resolutions of the general meeting, binding incoming parties to the same terms signed by all the shareholders and formally witnessed as a deed of agreement at the time the Constitution was established.
11. Any shareholder could be appointed as a director, but the term of office remained de jure, in line with the defaults of both Companies Acts. The original company had to have between two and four directors, not necessarily shareholders.
We retained the right to strike levies to cover the running expenses, under penalty of the directors taking power of attorney over one’s shares, extended to cover a serious offence against the constitution, consensus policy or shareholders if ¾ of the general meeting so resolve. Notably, a shareholder’s right of residency is not formally specified, although it is understood that we each have an equitable ability to share in the life of the land.
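The clause 2 price cap above lends itself to a one-line formula. A minimal sketch, assuming the cap works as a simple CPI index ratio applied to the 1991 valuation – all figures below are invented for illustration and are not the company's:

```python
# Hypothetical sketch of the clause-2 transfer price cap: the ceiling is
# the 1991 unimproved land valuation, scaled by CPI movement since 1991,
# apportioned to the fraction of shares held. All numbers are invented.

def max_transfer_price(valuation_1991, cpi_now, cpi_1991, fraction_held):
    """Upper limit on the price of a parcel of shares under clause 2."""
    return valuation_1991 * (cpi_now / cpi_1991) * fraction_held

# e.g. a one-tenth shareholding against an invented $145,000 valuation,
# with the CPI index having risen from 100 to 150 since 1991:
print(round(max_transfer_price(145_000, 150.0, 100.0, 0.1), 2))  # 21750.0
```

The design point is that the cap tracks inflation only, never market land value, which is what decouples membership turnover from the speculative pressures described earlier.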
In effect the company has become a very exclusive conservation estate club, with very strict rules of membership and a transfer fee an order of magnitude less than the value of the asset. Many people have since questioned whether this is legally possible, since the central aim of company law is to enable a group of people to incorporate for profit and to claim a share of the proceeds in any liquidation; however, all sources of legal advice at the time agreed this was constitutionally an acceptable application of the new Act’s provisions, if the shareholders agreed, and had previous legal precedents, such as cooperative dairy companies. To make the process as legally irreversible as possible, each shareholder was required to sign in front of an independent witness in the manner of a deed of arrangement.
Two shareholders elected to sell out just before the signing of the Constitution and received their share of asset value less outstanding levies, clearing the share register of the two key opponents of change.
Almost as soon as the Constitution was signed and sealed the excrement of factional division hit the fan of collective harmony. A female shareholder, whose house had recently burned down and had been rebuilt using a relatively large insurance settlement, moved to sell, but refused to issue the required notice empowering the company to find a buyer by the constitutional process. She attempted to gazump her informal asking price, which was well above previous trends, by offering it informally to two competing shareholding couples and threatening to rent it in perpetuity if she did not get what she wanted.
We had appointed two female directors with the new constitution in addition to the two long-standing male directors. Almost immediately fundamental disagreement emerged about how to do business and how to finalize the constitutional arrangements.
Early in the history of the land we had built a meeting house as a common living space for the shareholders and their families to meet, hold meetings and other get-togethers, as well as a place to receive visitors. A group of local women had begun to hold regular sessions of Moon Goddess self-empowerment there on the full moons, at the invitation of a female shareholder. Public invitations to other women were posted in the local store without consultation with shareholders outside the group. Male offspring of some shareholders were told to go away when they arrived at the meeting house while the women were meeting. The full moon was a special time for everyone, and other people enjoyed gathering to socialize or holding occasional mushroom veladas on the full-moon nights. This evoked a fundamental crisis which brought to the surface ancient sexual tensions on sacred ground between women and men and set off a deep round of misunderstandings, which took years to resolve, but were a rich source of intrigue and social discovery.
A special general meeting was called to consider the house offer, and it passed formal resolutions imposing clear conditions upholding the company’s position, but two sets of written minutes were produced, one of which undermined the formal resolutions passed. One of the female directors, who had chaired the meeting, then endeavoured to retrospectively rewrite the minutes to indicate the company might agree to the house transfer despite the formal resolution. The two female directors, in support of their outgoing female co-shareholder, then accused the two male directors of being male chauvinist bullies and began a campaign for them to stand down. One of the female directors resigned in protest. An acrimonious general meeting was called in which no agreed resolutions could be passed; a formal counter-claim was made against the female directors by one of the other shareholders, and the male directors, citing unfounded allegations and a failure to give proper notification, refused to stand down so as to avoid the company going into procedural default.
This put immense strain on the consensus process, because of the difficulty of passing a motion to remove an existing director with the agreement of all shareholders. It resulted in an ongoing factional split in the shareholders, in which four to five female shareholders formed a dissident faction, set up their own ‘home account’, boycotted general meetings and endeavoured to pay their rates to the local Council directly, bypassing the company as owner, as if it no longer had legitimate legal authority.
This raised a fundamental question of allegiance. The male directors and general meeting claimed allegiance to the founding principles and to the democratic process of the shareholders and the company. The two female directors and their feminist faction also claimed allegiance to the land but also to one another as sisters with a common purpose, which had both formed a faction within the community and a wider movement outside it. They rejected the company as a patriarchal manifestation of the established order which was superfluous to our needs and an encumbrance to our emancipation.
The general meeting, which still had enough members to form a quorum of a majority present, sometimes with proxies, promptly passed two consensus resolutions: the first formally affirming resolution by signed consensus as the meeting process and the second making clear the signed resolutions were the only formal business, with informal minutes consigned to a minute book simply as a partial record of the subjects discussed.
This standoff continued for six years until we eventually engaged a formal facilitator with experience in company facilitation and specific expertise in conflict resolution in conservation communities. The compromise position we arrived at was that the ‘home account’ members would agree to pay their levies to the company and the rest of us would agree to rotate the directors around the shareholders.
At the next general meeting we agreed to two new consensus policies: (a) All shareholders are encouraged to take a turn as director. Directorship to rotate with a new director appointed at each AGM for a three year term. (b) Every shareholder is welcome to attend any directors meeting. The directors will keep a contact network to let interested shareholders know of any board meeting and its agenda. The first one provides for a positive process to rotate directors as consensus policy, without having to make a change to the constitution, or straining consensus by having to agree to remove directors. The second is a check and balance by ensuring transparency of directors’ actions and the ability of any shareholder to enter into the discussion process at all levels.
The other standing male director began the process of voluntary resignation and I continued for the first year to facilitate the democratic meeting process of the board with the new appointees. This process has continued successfully for the last ten years and represents a net improvement in democracy and maturity of decision-making although it has pitfalls of political instability and a degree of factionalism. Directors have so far been appointed by mutual agreement without any majority voting, based on who is prepared to take a turn, who is known to have good working qualities and experience, and who will together form a functional board which is reasonably representative of the diverse positions of the shareholders. This is admittedly a more highly politicized process in which one arrives at general meetings sometimes unsure who will be running the company afterwards, but politics is intrinsic to democracy.
That is not to say there haven’t been nail-biting crises, attempted boardroom coups, disputes over financial management, and attempts to sideline the formal process, with its transparency and accountability, in favour of informal in-house deals between factional members trying to run things to their own liking while in power. These have included taking control of the financial accounts without formal consultation, and trying to withhold critical company documents from the Board – including a notice of share transfer in which a dwelling was involved – to aid the prospects of an outgoing shareholder associate, against the best interests of the company as a whole.
For many years there was an effective ban on immediate reappointment (‘new director’ may or may not mean a new person – an intentional ambiguity of the original agreement), but as the political frenzy of taking public office has given way to the realization that the directors are working hard, unpaid, to keep the general meeting process legitimate, we have finally arrived at an acceptance of agreeing on the best team we can set up, even if some are outgoing directors reappointed for a new term.
So far the Constitution has stood the tests of a number of share and dwelling sales without serious challenge and has some hope of keeping the land intact as a conservation reserve even as successive generations of the founding shareholders become more remote from the founders’ commitment. Forty-three years is already a long life for an intentional community.
To close the circle let us now come back to ecological economics and the pricing of the biosphere. At face value this company looks like a sucker’s game. The shareholders have no formal rights and their houses are owned by the company. They are obliged to pay levies under ultimate pain of dismissal. They cannot redeem their investment liquidity in a share sale but only receive a pittance of asset value. The company appears to be making a perpetual loss and has no hope of becoming a viral saleable asset like Apple or Google, nor has it any real hope of a reasonable cash flow like industrial utilities. This is not to say it is doomed to a loss. Indeed one could set it up as a very chic ecological retreat.
However, when ecological costing is taken into account, a completely different picture emerges of a highly economically positive, irreplaceable investment. The raison d’être of the company is to conserve and regenerate an environmental estate. This estate has immense and escalating financial value: in its land value as an asset, in its ecological value as a relatively intact coastal wilderness, and in its continuing survival value for the shareholders and their descendants. Moreover, without the restrictions on share value, company borrowing and liquidation, the shareholders couldn’t have confidence in preserving the asset to its maturity long-term, so the strategy is necessary for this form of investment.
If we take the unimproved value of the land, it has appreciated from $14,000 at the outset in 1970 to $1.45 million in 2007, representing a 13.36% compound annual return, equivalent to an annual income of around $193,473 in 2007. If we discount the levies of $800 in the same year, we arrive at nearly the same figure, 13.31% per annum. But this fails to include the intrinsic ecological value, which by Costanza’s estimates is a cool $461,250, implying at least a 31.8% annual return on the ecological value alone and a 45.1% return on land and ecological value less levies. Over 40 years, this has exceeded the performance of some of the best hedge funds, in a context of absolutely vigilant risk reduction which no hedge fund can replicate, by utilizing every means of precautionary principle available.
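These return figures can be checked directly from the dollar amounts quoted above (the compounding period 1970–2007 is 37 years; small rounding differences against the text's $193,473 are expected):

```python
# Checking the quoted return figures: land value $14,000 (1970) to
# $1.45 million (2007), ecological value $461,250, levies $800.

v1970, v2007, eco, levies = 14_000, 1_450_000, 461_250, 800
years = 2007 - 1970                       # 37 years of compounding

cagr = (v2007 / v1970) ** (1 / years) - 1
print(round(cagr * 100, 2))               # 13.36 (% per annum)

income = v2007 * cagr                     # implied annual income in 2007
print(round(income / 1000))               # ~194 ($k, close to the quoted figure)

print(round(eco / v2007 * 100, 1))        # 31.8 (% on ecological value alone)
print(round((income + eco - levies) / v2007 * 100, 1))  # 45.1 (% combined)
```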
Of course one can raise a number of objections to this position. We are not currently utilizing the land, as most of the shareholders are living in the city. Our descendants may not be in a position to live on the land, although the current ones hold it in high regard. Our involvement in the land is fiscally negative, with levies having to be paid and a sometimes demanding additional commitment of time and labour maintaining the access and fencing and assisting ecological management, controlling the predators of the kiwi and other wildlife and the noxious animals and plants that threaten the native vegetation.
But this again has to be offset against what it provides in familial survival. We have no idea how the future of world economics will fare in the face of climate change, resource and habitat depletion and other potentially irreversible tipping points. The land provides a resource of fundamental survival productivity: in the face of a social crisis affecting food supplies, which might wipe out many members of the urban population, we have an option on a paradisiacal wilderness in which survival is possible by planting subsistence gardens and catching fish and shellfish from the ocean, as human societies have done for the last 150,000 years. In the age of silicon technology, the internet, and renewable energy from the wind and sun, a perpetually sustainable life on the land is an achievable reality, which we and our children have already experienced. It is this sense of irreplaceability, which we know from our long experience of living with the land as a reality, that has maintained the strong commitment of the shareholders to it over periods of social change, whole lifetimes and three generations.
Adler J. (2012) Raging Bulls: How Wall Street Got Addicted to Light-Speed Trading http://www.wired.com/business/2012/08/ff_wallstreet_trading/all/
Arnold C (2017) The mathematicians who want to save democracy Nature 546 200–202 doi:10.1038/546200a.
Arthur B. (1992) Positive Feedbacks in the Economy Scientific American Feb 92-99.
Baar R. (2012) Bank of England Official Denies Pressuring Barclays Into Libor Manipulation Huffington Post http://www.huffingtonpost.com/2012/07/09/bank-of-england-barclays_n_1659909.html
S. (2013) Women: You May Be Better at Managing Investments
Bangia S et al. (2017) Redistricting: Drawing the Line arXiv:1704.03360.
Battiston S, Puliga M, Kaushik R, Tasca P, Caldarelli G (2012) DebtRank: Too Central to Fail? Financial Networks, the FED and Systemic Risk Nature Scientific Reports doi:10.1038/srep00541 http://www.nature.com/srep/2012/120802/srep00541/full/srep00541.html
Behar et al. (2008) The Dawn of Human Matrilineal Diversity, American Journal of Human Genetics doi:10.1016/j.ajhg.2008.04.002.
Bishop R et al. (2017) Putting a value on injuries to natural assets: The BP oil spill doi:10.1126/science.aam8124.
Blum, Elizabeth D. (2008). Love Canal Revisited : Race, Class, and Gender in Environmental Activism. Kansas:
Braverman E. (2013) Go Get Somebody Pregnant? http://www.wallstreetoasis.com/blog/go-get-somebody-pregnant
J. (2008) Testosterone Fuels Stock Market Success
Cassidy J. (1998) The Force of an Idea New Yorker Jan 12 32-37.
Chen H, Rodden J (2013) Unintentional Gerrymandering: Political Geography and Electoral Bias in Legislatures Quarterly Journal of Political Science 8 239–269 http://www-personal.umich.edu/~jowei/florida.pdf
Cho W, Liu Y (2016) Toward a Talismanic Redistricting Tool: A Computational Method for Identifying Extreme Redistricting Plans Election Law Journal 15/4 doi:10.1089/elj.2016.0384 http://cho.pol.illinois.edu/wendy/papers/talismanic.pdf
Coghlan A & MacKenzie D (2011) Revealed – the capitalist network that runs the world New Scientist 24 Oct http://www.newscientist.com/articleimages/mg21228354.500/1-revealed--the-capitalist-network-that-runs-the-world.html
Coghlan A & Marshall M (2012) The financial meltdown forecasters New Scientist 11 Dec http://www.newscientist.com/article/mg21528773.400-the-financial-meltdown-forecasters.html
Costanza R (2008) Stewardship for a “full” world Current History 107 30-35 http://www.uvm.edu/giee/pubpdfs/Costanza_2008_Current_History.pdf
Costanza R, Batker D, Day J, Feagin R, Martinez M, Roman J (2010) The Perfect Spill: Solutions for Averting the Next Deepwater Horizon The Solutions Journal http://www.thesolutionsjournal.com/node/629
Czech B. (2013) Supply Shock: Economic Growth at the Crossroads and the Steady State Solution New Society Publishers ISBN 10: 0865717443.
Daly H. (2005) Economics in a Full World Scientific American, 293/3 Sep 2005. http://sef.umd.edu/files/ScientificAmerican_Daly_05.pdf
Darwin, Charles (1871), The Descent of Man, and Selection in Relation to Sex (1st ed.), London: John Murray, ISBN 0-8014-2085-7
Donaldson-James S. (2008) Love Canal's Lethal Legacy Persists ABC News http://abcnews.go.com/Health/story?id=5553393
Dunn A. (2012) Average America vs the One Percent Forbes http://www.forbes.com/sites/moneywisewomen/2012/03/21/average-america-vs-the-one-percent/
Easley, D., M. López de Prado, M. O'Hara (2011) The Microstructure of the ‘Flash Crash’: Flow Toxicity, Liquidity Crashes and the Probability of Informed Trading Journal of Portfolio Management 37/2, 118–128.
Emmott B. (ed) (1998) The Flexible Tiger The Economist http://www.economist.com/node/109704
Emmott B (2003) Capitalism and Democracy The Economist http://www.billemmott.com/article.php?id=24
Faludi S. (2013) Death of a Revolutionary New Yorker Apr 15 52-61.
Fears D (2020) The toxic reach of Deepwater Horizon's oil spill was much larger — and deadlier — than previous estimates, a new study says Washington Post https://www.washingtonpost.com/climate-environment/2020/02/12/toxic-reach-deepwater-horizons-oil-spill-was-much-larger-deadlier-than-previous-estimates-new-study-says/.
Hamers L (2018) Sunshine is making Deepwater Horizon oil stick around Science News
Hardin G (1968) The Tragedy of the Commons Science 162 1243-1248.
Heinberg R (2011) The End of Growth: Adapting to Our New Economic Reality New Society Publishers ISBN 10: 0865716951.
Herschlag G, Ravier R, Mattingly J (2017) Evaluating Partisan Gerrymandering in Wisconsin arXiv:1709.01596.
Jameson R (2008) The blunders that led to the banking crisis New Scientist 25 Sep http://www.newscientist.com/article/mg19926754.200-the-blunders-that-led-to-the-banking-crisis.html
Johnston D (2007) Income Gap Is Widening, Data Shows NY Times http://www.nytimes.com/2007/03/29/business/29tax.html
Karmin M et al. (2015) A recent bottleneck of Y chromosome diversity coincides with a global change in culture Genome Research 25 459-466 doi:10.1101/gr.186684.114.
Keuls E. (1993) The Reign of the Phallus: Sexual Politics in Ancient Athens University of California Press.
King C & Fielder C (2004) Sexual Paradox: Complementarity, Reproductive Conflict http://www.sexualparadox.org
Klarreich E (2017) How to Quantify (and Fight) Gerrymandering Quanta Apr 4 https://www.quantamagazine.org/the-mathematics-behind-gerrymandering-20170404/
Klofstad C, Anderson R, Nowicki S (2015) Perceptions of Competence, Strength, and Age Influence Voters to Select Leaders with Lower-Pitched Voices PLoS ONE doi:10.1371/journal.pone.0133779.
Kohler T et al. (2017) Greater post-Neolithic wealth disparities in Eurasia than in North America and Mesoamerica Nature doi:10.1038/nature24646.
Lerner G (1986) The Creation of Patriarchy Oxford University Press, New York.
Liptak A (2010) Court Under Roberts Is Most Conservative in Decades NY Times http://www.nytimes.com/2010/07/25/us/25roberts.html
Logan M. (2012) Dumped On: The Messy Truth About Love Canal, NY http://myamericanodyssey.com/dumped-on-the-messy-truth-about-love-canal-ny/
MacKenzie D (2013) The maths that saw the US shutdown coming New Scientist 10 Oct.
Manzoor S (2013) Triumph of the Geeks NZ Herald 2 Aug http://www.nzherald.co.nz/business/news/article.cfm?c_id=3&objectid=10906760
Mattingly J, Vaughn C (2014) Redistricting and the Will of the People arXiv:1410.8796.
May R, Levin S & Sugihara G (2008) Ecology for bankers Nature 451 893-5.
Meadows D, Randers J, Meadows D (2004) The Limits to Growth: The 30-Year Update Chelsea Green Publishing, White River Junction, VT.
Milbank D (2018) An explosion is coming Washington Post https://www.washingtonpost.com/opinions/an-explosion-is-coming/2018/06/29/c3301b66-7ba2-11e8-93cc-6d3beccdd7a3_story.html
Mill, John Stuart (1869). The Subjection of Women (1869 first ed.). London: Longmans, Green, Reader & Dyer. http://archive.org/download/subjectionofwome00millrich/subjectionofwome00millrich.pdf
Palast G (2013) The Confidential Memo at the Heart of the Global Financial Crisis Vice Aug http://www.vice.com/en_uk/read/larry-summers-and-the-secret-end-game-memo
EU reaction to Brazil: www.gregpalast.com//vulturespicnic/pages/filecabinet/chapter12/5_protocol.pdf
Payne Keith (2017) The Broken Ladder: How Inequality Affects the Way We Think, Live and Die Random House.
Pearce F. (1996) The Grand Banks: Where Have All the Cod Gone? New Scientist 16 Sept 24. http://www.nps.gov/olym/forteachers/upload/The-Grand-Banks-Collapse.pdf
Peston R. (2014) Russia 'planned Wall Street bear raid' http://www.bbc.com/news/business-26609548
Phillips M (2011) Flash Crash Anniversary: Relive the Thrills and Spills in Charts! Wall Street Journal http://blogs.wsj.com/marketbeat/2011/05/06/flash-crash-anniversary-relive-the-thrills-and-spills-in-charts/
Popper N (2012) High-Speed Trading No Longer Hurtling Forward NY Times Oct 14 http://www.nytimes.com/2012/10/15/business/with-profits-dropping-high-speed-trading-cools-down.html
Price R. (2003) In Depth Analysis of American Income and Taxation http://www.rationalrevolution.net/articles/american_income_taxation.htm
Radden Keefe P (2016) The Bank Robber New Yorker 30 May.
S. (2013) Yellowstone wolves spur recovery of bears' berries
Resnick S. (1996) The Hit Men Newsweek Feb 26 http://www.thedailybeast.com/newsweek.html
Rose G (2007) Cod: The Ecological History of the North Atlantic Fisheries Breakwater Books, St. John's, NL.
Saluzzi J & Arnuk S (2012) Happy 2nd Anniversary, Flash Crash of 2010! http://www.ritholtz.com/blog/2012/05/the-flash-crash-of-2010-happy-2nd-anniversary/
Shnayerson M (2009) Wall Street’s $18.4 Billion Bonus Vanity Fair http://www.vanityfair.com/politics/features/2009/03/wall-street-bonuses200903
Shostak M (1981) Nisa: The Life and Words of a !Kung Woman Penguin Books.
Silver-Greenberg J. and Protess B. (2012) Trying to Be Nimble, Knight Capital Stumbles http://dealbook.nytimes.com/2012/08/02/trying-to-be-nimble-knight-capital-stumbles/
Stanley H, Plerou V & Gabaix X (2008) A statistical physics view of financial fluctuations: Evidence for scaling and universality Physica A 387 3967-3981.
Stephanopoulos N, McGhee E (2015) Partisan Gerrymandering and the Efficiency Gap U. Chi. L. Rev. 831 http://chicagounbound.uchicago.edu/cgi/viewcontent.cgi?article=5857&context=uclrev.
Stewart I (2010) Electoral dysfunction: Why democracy is always unfair New Scientist 28 April http://www.newscientist.com/article/mg20627581.400-electoral-dysfunction-why-democracy-is-always-unfair.html
Taibbi M (2013) Gangster Bankers: Too Big to Jail: How HSBC hooked up with drug traffickers and terrorists. And got away with it Rolling Stone http://www.rollingstone.com/politics/news/gangster-bankers-too-big-to-jail-20130214
van den Heuvel M & Sporns O (2011) Rich-Club Organization of the Human Connectome Journal of Neuroscience 31/44 15775–86.
Vitali S, Glattfelder J & Battiston S (2011) The network of global corporate control http://arxiv.org/pdf/1107.5728
Wade L (2017) How taming cows and horses sparked inequality across the ancient world Science doi:10.1126/science.aar4973.
Zhou W & Sornette D (2006) Is there a real-estate bubble in the US? Physica A 361 297-308.