Irrational behaviors of individuals include taking offense or becoming angry about a situation that has not yet occurred, expressing emotions exaggeratedly (such as crying hysterically), maintaining unrealistic expectations, engaging in irresponsible conduct such as problem intoxication, behaving in a disorganized manner, and falling victim to confidence tricks. People with a mental illness such as schizophrenia may exhibit irrational paranoia.
These more contemporary normative conceptions of what constitutes a manifestation of irrationality are difficult to demonstrate empirically because it is not clear by whose standards we are to judge the behavior rational or irrational.
The study of irrational behavior is of interest in fields such as psychology, cognitive science, economics, game theory, and evolutionary psychology, as well as of practical interest to the practitioners of advertising and propaganda.
Irrationality is not always viewed as a negative. The Dada and Surrealist art movements embraced irrationality as a means to "reject reason and logic". André Breton, for example, argued for a rejection of pure logic and reason, which he saw as responsible for many contemporary social problems.
Ancient Greek philosophy established a fundamental differentiation between logical "true" assumptions of the universe and irrational "false" statements or mere opinions based on emotion or sensorial experience. The German cultural historian Silvio Vietta has shown that Greek philosophy thus founded a dual cultural system based on rationality as the domain of philosophy and science versus "irrational" emotion and sensuality as domains of literature and art. Since the irrational emotions as stirred up in literature threaten the rationality of human beings, in The Republic Plato expelled poets from the state.
In the later history of philosophy this opposition of rationality and the irrational was renewed as a methodological differentiation by Descartes, but reversed by Pascal in his statement: "Le cœur a ses raisons, que la raison ne connaît point" ("The heart has its reasons which reason does not know"). Pascal thus asserted a specific rationality of the "irrational" emotions. The philosophy of sensualism (John Locke, among others) underlined the importance of the senses as the source of human perception and cognition.
The 19th-century German philosopher Julius Bahnsen asserted that all thought processes, desires and actions ultimately led to irresolvable contradictions which stem from the inherent irrationality of being. Years earlier, Friedrich Wilhelm Joseph Schelling had theorized that despite some traces of rationality in the world, the "dark ground" of being itself rested in an irrational will that could not be explained, only described in an apophatic manner. Arthur Schopenhauer picked up on this idea and completely fleshed out the concept of an irrational will as a cause of existence, by founding his entire metaphysics and explaining the variety of physical phenomena precisely with this underlying, unconscious and dynamic notion of will.
Søren Kierkegaard gave some remit to irrationality in his Concluding Unscientific Postscript to the Philosophical Fragments, where he claimed that 'Subjectivity is Truth'. Rather than allowing reason to do our choosing for us, Kierkegaard argued that irrational leaps of faith could be more useful, as they were more authentic (although he never used the word 'authentic'), and thus gave more meaning to life. Objectivity, like reason, was opposed to subjectivity, and thus could not be said to give any meaning to anyone's life. Although he never dismissed rationality in its entirety, Kierkegaard argued that we could not allow rationality to make our decisions for us. In this, and to some degree, he offers a vindication of irrationality.
Much subject matter in literature can be seen as an expression of human longing for the irrational. The Romantics valued irrationality over what they perceived as the sterile, calculating and emotionless philosophy which they thought to have been brought about by the Age of Enlightenment and the Industrial Revolution. The Dadaists and Surrealists later used irrationality as a basis for their art. The disregard of reason and preference for dream states in Surrealism was an exaltation of the irrational and the rejection of logic.
Mythology nearly always incorporates elements of fantasy and the supernatural; however myths are largely accepted by the societies that create them, and only come to be seen as irrational through the spyglass of time and by other cultures. But though mythology serves as a way to rationalize the universe in symbolic and often anthropomorphic ways, a pre-rational and irrational way of thinking can be seen as tacitly valued in mythology's supremacy of the imagination, where rationality as a philosophical method has not been developed.
On the other side the irrational is often depicted from a rational point of view in all types of literature, provoking amusement, contempt, disgust, hatred, awe, and many other reactions.
The term irrational is often used in psychotherapy, and the concept of irrationality is especially prominent in rational emotive behavior therapy, originated and developed by American psychologist Albert Ellis. In this approach, the term irrational is used in a slightly different way than in general usage. Here irrationality is defined as the tendency humans have to act, emote and think in ways that are inflexible, unrealistic, absolutist and, most importantly, self-defeating and socially destructive.
The Challenge of Central Banking in a Democratic Society

Good evening ladies and gentlemen. I am especially pleased to accept AEI's Francis Boyer Award for 1996 and be listed with so many of my friends and former associates. In my lecture this evening I want to give some personal perspectives on central banking and, consequently, I shall be speaking only for myself.

William Jennings Bryan reportedly mesmerized the Democratic Convention of 1896 with his memorable ". . . you shall not crucify mankind upon a cross of gold." His utterances underscored the profoundly divisive role of money in his time--a divisiveness that remains apparent today. Bryan was arguing for monetizing silver at an above-market price in order to expand the money supply. The presumed consequences would have been an increase in product prices and an accompanying shift in the value of net claims on future wealth from the "monied interests" of the East to the indebted farmers of the West, who would arguably be able to pay off their obligations with cheaper money.

The debates, before and since, over the issue of our money standard have mirrored the deliberations on the manner in which we have chosen to govern ourselves, and, perhaps more fundamentally, debates on the basic values that should govern our society.

For, at root, money--serving as a store of value and medium of exchange--is the lubricant that enables a society to organize itself to achieve economic progress.
The ability to store the fruits of one's labor for future consumption is necessary for the accumulation of capital, the spread of technological advances and, as a consequence, rising standards of living.

Clearly in this context, the general price level, that is, the average exchange rate for money against all goods and services, and how it changes over time, plays a profoundly important role in any society, because it influences the nature and scope of our economic and social relationships over time.

It is, thus, no wonder that we at the Federal Reserve, the nation's central bank, and ultimate guardian of the purchasing power of our money, are subject to unending scrutiny. Indeed, it would be folly were it otherwise.

A central bank in a democratic society is a magnet for many of the tensions that such a society confronts. Any institution that can affect the purchasing power of the currency is perceived as potentially affecting the level and distribution of wealth among the participants of that society, hardly an inconsequential issue.

Not surprisingly, the evolution of central banking in this nation has been driven by such concerns. The experiences with paper money during the Revolutionary War were decidedly inauspicious. "Not worth a Continental" was scarcely the epithet one would wish on a medium of exchange. This moved Alexander Hamilton, with some controversy, to press for legislation that established the soundness of the credit of the United States by assuming, and ultimately repaying, the war debts not only of the fledgling federal government, but of the states as well. Equally controversial was the chartering of the First Bank of the United States, which, although it had few functions of a modern central bank, was nonetheless believed to be a significant threat to states' rights and the Constitution itself.

Although majority controlled by private interests, the Bank engaged in actions perceived to shift power to the federal government.
Such a shift was thought of by many as a fundamental threat to the new democracy, and an essential element of what was feared to be a Hamilton plan to re-establish a powerful aristocracy. The First Bank--and especially its successor, the Second Bank of the United States--endeavored to restrict state bank credit expansion when it appeared inordinate, by gathering bank notes and tendering them for specie. This reduced the reserve base and the ability of the fledgling American banking system to expand credit. The issue of states' rights and concern about the power of the central government reflected the freewheeling individualism of that time. The Second Bank was a major issue of the election of 1832. Earlier in that year, President Andrew Jackson had vetoed the bill to extend its charter, and the election became a referendum on his veto. The outcome was a resounding victory for Jackson and the death knell for the Bank.

It has not been easy, however, to separate often seemingly conflicting threads in the debate between advocates of state powers over money and those seeking a national role. When Andrew Jackson vetoed the charter renewal of the Second Bank of the United States, for example, he argued for the severing of the grip on the economy of easterners and especially foreigners, who owned a significant stock interest in the bank. Ironically, by helping to create what was perceived to be an unstable currency, he set the stage for the later development of a full-fledged gold standard, the institution that Bryan railed against in 1896 from much the same populist philosophical base as Jackson.

After the Civil War, redemption of the paper greenbacks issued during the war brought an era of gold-standard-induced deflation, which, while it may not have thwarted the impressive advance of industrialization, was seen by many as suppressing credit availability for the rural interests of the nation, which were still a majority.
The general price level declined for more than two decades, which meant borrowers were paying off their loans in more expensive dollars than those they borrowed.

Not surprisingly, mounting pressures developed for reform, with Bryan bearing the standard for subsidized silver coinage, that is, free silver. Though Bryan lost to McKinley in 1896 (and again in 1900), the rural-based pressures for a more elastic currency did not diminish and ultimately were reflected, in part, in the creation of the Federal Reserve.

Nonetheless, many of the proponents of banking reform in the 1890s, and in the aftermath of the Panic of 1907, were suspicious of creating a central bank. In very large measure, those concerns underlay the various threads of reform that were joined together in the design and creation of the Federal Reserve System in 1913. Its founding followed a prolonged debate on the balance of power between the interests of the New York money center banks and the rest of the nation, still largely rural. The compromise that resulted from that debate created twelve regional Reserve Banks, with a Washington presence vested in a Federal Reserve Board. Its purpose was to "furnish an elastic currency, . . . to establish a more effective supervision of banking in the United States, and for other purposes." Monetary policy as we know it today was not among the "other purposes." That evolved largely by accident in the 1920s.

Even with a central bank, the gold standard was still the dominant constraint on the issuance of paper currency and the expansion of bank deposits. Accordingly, the Federal Reserve was to play a minor role in affecting the purchasing power of the currency for many years to come.

The world changed markedly with the advent of the Great Depression of the 1930s, and the evisceration of the gold standard.
The upheaval, and still festering fear of New York "monied interests," engendered the Banking Acts of 1933 and, more importantly, of 1935, which vested more of the Federal Reserve's authority with the Board of Governors in Washington. During World War II, and through 1951, however, monetary policy was effectively subservient to the interests of the Treasury, which sought access to low-cost credit. With the so-called Federal Reserve-Treasury Accord of 1951, the Federal Reserve began to develop its current degree of independence.

Although in the 1950s and early 1960s there were short-lived bouts of inflation that caused momentary concern about sustained increases in the price level, these events did little to shake the conviction of most that America's economic and financial structure would indefinitely and effectively contain any inflationary forces. This presumption certainly seems to have been reflected in the low inflation premium then embedded in long-term bonds.

That this view was profoundly wrong soon became apparent. The 1970s saw inflation and unemployment simultaneously at relatively elevated levels for some time. The notion that this could occur was nowhere to be found in the conventional wisdom of the economic policy philosophy that developed out of the Keynesian revolution of the 1930s and its subsequent empirical applications. Moreover, these models embodied the view that aggregate demand expansion, from almost any level, would permanently create new jobs. When that expansion carried the economy beyond "full employment," there would be a cost in terms of higher inflation--but only a one-time increase in inflation, so that there existed a permanent trade-off between sustainable levels of inflation and employment.

The stagflation of the 1970s required a thorough conceptual overhaul of economic thinking and policymaking.
Monetarism, and new insights into the effects of anticipatory expectations on economic activity and price setting, competed strongly against the traditional Keynesianism. Gradually the power of state intervention to achieve particular economic outcomes came to be seen as much more limited. A consensus gradually emerged in the late 1970s that inflation destroyed jobs, or at least could not create them.

This view has become particularly evident in the communiques that have emanated from the high-level international gatherings of the past quarter century. That inflation could reduce employment was a highly controversial subject in the mid-1970s when introduced into communique language drafts. At the meetings I attended as Chairman of the Council of Economic Advisers, the notion invariably induced extended debates. Today in similar communiques such language is accepted boilerplate and rarely the focus of discussion. This shift in attitudes and understanding provided political support in 1980 and thereafter for the type of monetary policy required to rebalance the economy.

Despite waxing and waning over the decades, a deep-seated tension still exists over government's role as an economic policymaker. This tension is evident in Congressional debates, campaign rhetoric, and our ubiquitous talk shows.

It should not be a surprise that the very same ambiguities and conflicts that characterize the rest of our political life have their reflection in the nation's current view of its central bank, the Federal Reserve. With regard to monetary policy, the view--or at least the suspicion--still persists in some quarters that an activist, expansionary policy could yield dividends in terms of permanently higher output and employment.

Nonetheless, there is a grudging acceptance of the degree of independence afforded our institution, and an awareness that unless we are free of the appropriations process, our independence could be compromised.
It is generally recognized and appreciated that if the Federal Reserve's monetary policy decisions were subject to Congressional or Presidential override, short-term political forces would soon dominate. The clear political preference for lower interest rates would unleash inflationary forces, inflicting severe damage on our economy.

Notwithstanding, the central bank has not been immune from the suspicion and lack of respect that has come to afflict virtually all institutions in our society since the traumas of Vietnam, Watergate, and the destabilizing inflation in the 1970s.

The Federal Reserve's most important mission, of course, is monetary policy. I wish I could say that there is a bound volume of immutable instructions on my desk on how effectively to implement policy to achieve our goals of maximum employment, sustainable economic growth, and price stability. Instead, we have to deal with a dynamic, continuously evolving economy whose structure appears to change from business cycle to business cycle, an issue I shall return to shortly.

Because monetary policy works with a lag, we need to be forward looking, taking actions to forestall imbalances that may not be visible for many months. There is no alternative to basing actions on forecasts, at least implicitly. It means that often we need to tighten or ease before the need for action is evident to the public at large, and that policy may have to reverse course from time to time as the underlying forces acting on the economy shift. This process is not easy to get right at all times, and it is often difficult to convey to the American people, whose support is essential to our mission.

Because the Fed is perceived as being capable of significantly affecting the lives of all Americans, that we should be subject to constant scrutiny should not come as any surprise. Indeed, speaking as a citizen, and not Fed Chairman, I would be concerned were it otherwise.
Our monetary policy independence is conditional on pursuing policies that are broadly acceptable to the American people and their representatives in the Congress.

Augmenting concerns about the Federal Reserve is the perception that we are a secretive organization, operating behind closed doors, not always in the interests of the nation as a whole. This is regrettable, and we continuously strive to alter this misperception.

If we are to maintain the confidence of the American people, it is vitally important that, excepting certain areas where the premature release of information could frustrate our legislated mission, the Fed must be as transparent as any agency of government. It cannot be acceptable in a democratic society that a group of unelected individuals are vested with important responsibilities, without being open to full public scrutiny and accountability.

To be sure, if we are to carry out effectively the monetary policy mission the Congress has delegated to us, there are certain Federal Reserve deliberations that have to remain confidential for a period of time. To open up our debates on monetary policy fully to immediate disclosure would unsettle financial markets and constrai