Former Fed Vice Chair Donald Kohn on monetary policy strategies, tools, and communication

Editor's note:

Kohn delivered the following keynote address at “The Federal Reserve and Prospects for Monetary Policy Reform,” a seminar co-sponsored by the Institute for Humane Studies and the Mercatus Center at George Mason University, on January 3, 2019.

We have had a stimulating day of discussion about the prospects for monetary policy reform and I’m pleased to have the opportunity to address this dinner at the end of the day.  The topic was well chosen—a period of stress is where you learn about the strengths and weaknesses of policy structures, and the past decade or so certainly qualifies as a 10 on the stress scale.  With the economy back to full employment and sustainable price stability it seems a good time to ask whether changes can be identified that would improve the conduct of policy and bolster the public’s understanding and support for that policy and the organization that makes it.

The Federal Reserve has already announced one important step: an open process “to review the strategy, tools, and communication practices it uses to pursue its congressionally assigned mandate of maximum employment and stable prices”; this process is to include “outreach to a broad range of interested stakeholders,” including a public research conference in early June.

This open process is a major extension of the trend toward increased transparency around monetary policy that has been underway for several decades.  In the past, greater transparency has been mostly focused on communicating what the Federal Open Market Committee (FOMC) did and might do in the future and why.  The new initiative is focused on how those decisions should be made—the strategy; how they should be implemented—the tools; and how they should be explained—the communication.  Those topics already have been subject to considerable research and discussion within and outside the Fed of course, but this brings them together in an organized and transparent way under the sponsorship of the entire policymaking organization—the Board and the Reserve Banks.

The promised openness of the process is key.  It is the mark of a confident organization willing to engage with outside views and make changes when it is presented with ideas whose benefits can be shown to exceed their costs.  Such openness should enhance public understanding of monetary policy and bolster support for policy independence at a critical economic and political juncture.  Wisely, the Fed has chosen to limit the scope of the process by defining it around its current legislative mandate.  By not asking for suggestions for new legislation, it has ruled out appearing to intrude into the province of elected representatives.

I will shape my comments tonight around those three topics: strategy, tools, and communication, which overlap to a considerable extent with the topics already discussed today.  In a sense, I see this as a memo to Rich Clarida about what the Fed should be looking to cover at that early June conference.

The strategic question most in need of addressing is whether and how the inflation targeting framework centered on 2 percent might need to adapt to a world of persistent low interest rates.  FOMC participants see short-term nominal interest rates averaging around only 3 percent over the long run, held down by inflation expectations anchored at 2 percent and demographic and productivity trends reducing equilibrium real rates. In the past, however, the Fed has often felt it necessary to cut rates by 4-5 percentage points in recessions to restore full employment and keep inflation from falling persistently below target.

To be sure, policy easing in the past often started from rate levels well into restrictive territory; cuts in policy rates can be effectively supplemented by unconventional policies like those implemented in the recent financial crisis—large-scale asset purchases and heightened forward guidance on interest rates; and a stronger financial system—if the forces of deregulation are held in check—will reduce the odds of adverse shocks being amplified by sharp reductions in credit supply.  Still, reduced scope for policy easing implies greater probability of conventional monetary policy becoming constrained at the zero lower bound and forced to rely on unconventional polices of uncertain power.  Limits on the scope for monetary policy easing have to be of heightened concern when fiscal policy could well be hamstrung in the next recession by the continuing rapid run up in debt relative to income even at high levels of employment and by political paralysis in Washington.

A number of proposals have been made to alter the policy framework, either to make hitting the zero lower bound less likely—for example, a higher inflation target—or to make policy at the zero lower bound more potent by building in systematic techniques to lock in lower-for-longer interest rates in such circumstances. These latter proposals work by causing expectations of extended periods of zero interest rates to get built into interest rates and asset prices, and perhaps also by raising inflation expectations and lowering real rates when those commitments imply higher inflation later. Price-level targeting fits into that category, as do the suggestions of Williams for targeting 2 percent inflation on average over the business cycle, of Bernanke for price-level targeting only at the zero lower bound, and of Yellen for tying lower-for-longer policies to the amount of accommodation foregone because of the zero bound on rates. Nominal GDP targeting shares many of the same characteristics—a commitment to keep rates low enough for long enough to raise the level of nominal GDP back to its previous trend path.
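The make-up arithmetic behind these proposals can be illustrated with a small sketch. It is not any official rule, and all numbers are hypothetical; it simply computes the inflation a price-level targeter would need to deliver after a period of undershooting in order to return the price level to the 2 percent trend path:

```python
# Illustrative sketch (not an official Fed rule): how a price-level or
# average-inflation target implies "make-up" inflation after an undershoot.
# All numbers are hypothetical.

TARGET = 0.02  # 2 percent inflation target

def required_makeup_inflation(realized, horizon):
    """Constant inflation rate needed over `horizon` future years so that the
    price level returns to the 2 percent trend path implied by TARGET."""
    # Price level relative to start, under realized inflation vs. the target path
    actual_level = 1.0
    target_level = 1.0
    for pi in realized:
        actual_level *= 1 + pi
        target_level *= 1 + TARGET
    # Level the price index must reach `horizon` years from now to rejoin the path
    goal_level = target_level * (1 + TARGET) ** horizon
    # Constant inflation rate that closes the gap
    return (goal_level / actual_level) ** (1 / horizon) - 1

# Three years of 1 percent inflation (1 point below target each year)...
shortfall_years = [0.01, 0.01, 0.01]
# ...must be offset by roughly 3 percent inflation over the next three years:
print(f"{required_makeup_inflation(shortfall_years, 3):.4f}")  # → 0.0301
```

The point of the exercise is the one in the text: the commitment to above-target inflation later is what lowers expected real rates today, but it is also the source of the overshoot risk discussed below.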

In my view, all of these proposals have important drawbacks:  A higher inflation target probably is not consistent with the Fed’s legislative price stability mandate—and it wouldn’t meet the Volcker/Greenspan definition of price stability as inflation low enough that people don’t have to take account of it in their daily decisions.  Price level targeting and lower-for-longer make-up plans have the advantages of providing some protection against the Japanese disease of inflation expectations locked in well below the target, and they generally imply that inflation would average around 2 percent over the business cycle.  But they raise complex messaging challenges and a period of high make-up inflation risks un-anchoring expectations on the high side.  Nominal GDP targets, in addition to the messaging and inflation overshoot issues, are subject to data revisions and uncertainty about sustainable real growth.

My choice is to stick with flexible inflation targeting around a 2 percent target, with the emphasis on flexible when the policy rate is near or at the zero lower bound, supplemented by other efforts to limit the incidence or the damage of lower bound constraints on policy.   As policy rates fall toward zero, risk management considerations—the greater cost to society of being stuck at zero for a time relative to the cost of an inflation overshoot—would dictate taking unusually large chances on exceeding the target at some point in the future, but not deliberately aiming at that outcome.  That would entail easing rapidly when the zero lower bound threatens; and jumping to unconventional policies quickly and forcefully at, or even before, the limit of conventional easing, including a ramping up of forward guidance as the limit is approached.  Forward guidance about rates at or near the zero lower bound—the economic and financial conditions that influence how long rates are likely to remain at zero—should be informed, but not dictated, by make-up strategies built around price levels or shadow rates, just as policies above the zero lower bound are informed, but not dictated, by reference to rules.

Whether such an aggressive risk-management lean for policy would be enough to produce a satisfactory recovery from a recession when rates were pinned near zero is a matter for further research.  I suspect it very much depends on the particular circumstances of the recession.  In effect, it is the strategy the Fed followed after the great recession, and the recovery was often seen as disappointing.  But that recession was especially deep and involved a nearly complete breakdown in credit intermediation.  In addition, lessons have been learned from that experience about how to implement unconventional policies to maximize their effectiveness, and there may be ways to supplement unconventional policies to reduce the deleterious effect of the zero lower bound constraint.

For example, the Fed should re-examine two possible enhancements of unconventional monetary policies that it has previously rejected but have been used elsewhere.  One would be a negative interest rate on bank reserves; other central banks have been successful in easing financial conditions with negative rates, especially when applying them on a marginal basis to limit their adverse effect on bank resilience. A second avenue to explore would be incentive schemes such as those employed by the Bank of England, using the cost and availability of loans from the central bank to get banks to increase lending and pass lower policy rates through to borrowers.

Moreover, to reduce the probability of financial distress causing prolonged episodes of being stuck at the zero lower bound, the Fed also should pursue countercyclical macroprudential policy much more vigorously than it has. One pillar would be more active use of the countercyclical capital buffer. Higher capital buffers built up in good times will help banks retain access to funding when adverse shocks hit; allowing the banks to utilize those buffers after a shock will encourage them to keep the supply of credit flowing to households and businesses. And, with mortgage finance at the heart of so many financial problems in the past, the Fed should work with the various housing finance agencies to assess whether they can incorporate countercyclical elements into their policies—tightening mortgage standards in good times so they can be lowered after the housing cycle has turned. Less ambitiously, the Fed should at least try to convince these agencies to make their policies less procyclical, reducing or eliminating their tendency to lower standards when house prices get high relative to incomes.

Alternative frameworks for achieving congressional mandates in a low interest rate world are not the only monetary policy strategy issue that should be addressed in the open process.  Chairman Powell and other FOMC members have tended to cite financial stability concerns—elevated asset prices and easy credit terms—along with inflation risks as they have discussed the rationale for raising interest rates of late.  In this regard, Chairman Powell seems more open to factoring in such risks than the previous two chairs, who emphasized monetary policy as the “last line of defense” for protecting financial stability.

I’ve been on record over many years as skeptical about the cost-benefit calculus for using interest rate policy to address financial stability risks—and I still am under most circumstances.  Perhaps the void where countercyclical macroprudential policy should be is weighing on current considerations, especially as risks might be lodged outside the banking system.  In any event, the role of financial stability considerations in monetary policy strategy would seem ripe for further discussion given recent rhetoric.

Another hardy perennial in the policy strategy category that has been given new life by developments over recent years centers on the effects of uncertainty on policy choices. A number of parameters in standard thinking about policy seem honestly to belong in the “unusually uncertain” category these days.  The slope of the Phillips curve and determinants of inflation head the list, but Chairman Powell has also raised questions about the “stars”—the natural rates of unemployment, interest rates, and the sustainable rate of economic growth.

Much work has been done over the years to address how uncertainty should factor into policy decisions—beginning with William Brainard’s work on policy under parameter uncertainty, and continuing through the literatures on robust control and risk management.  The Fed has reacted to the rise in uncertainty about underlying parameters by emphasizing “data dependency.”  Surely, that has to be right on some level, but too much reaction to incoming data will result in shifting messages and policy volatility.  The experience of recent years might be a good basis for re-examining various strategies for making policy when underlying relationships appear to be evolving.

Finally, the FOMC should take this opportunity to clarify the strategy document that it re-issues every January, and to begin to utilize it to explain changes in policy.  What exactly does it mean by a “symmetrical” inflation target—equal reaction to overshoots and undershoots along a given reaction function, or averaging inflation over time as President Evans has advocated?  Does the “balanced approach” to conflicting misses of employment and inflation targets the document discusses really imply equal weight to unemployment rates below the NAIRU and to undershoots of the inflation target when they occur together?  Recent FOMC actions in these circumstances would seem to imply not, understandably so when low unemployment is desirable in the absence of inflation pressure and the NAIRU is so hard to pin down.  Notably, the balanced approach rule used by Chair Yellen in some speeches and included in the rules box in the monetary policy report does give equal weight to inflation and unemployment misses, and it now implies a much higher federal funds rate.  More broadly, understanding of monetary policy strategy would be helped if the FOMC actually referred to its basic strategy document when it explained policy decisions; instead it appears to be given a minor sprucing up every January and then put away until the next January.
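To see why the balanced approach rule currently prescribes a higher funds rate, consider a minimal sketch of the rule as it commonly appears in the monetary policy report’s rules box. The parameter values below (neutral real rate, natural unemployment rate, current readings) are illustrative assumptions, not official estimates:

```python
# Minimal sketch of the "balanced approach" rule from the Fed's monetary
# policy report rules box. All parameter values are illustrative assumptions.

def balanced_approach_rule(pi, u, r_star=1.0, pi_star=2.0, u_star=4.5):
    """Prescribed federal funds rate, in percent.

    pi: current inflation rate; u: current unemployment rate;
    r_star: assumed neutral real rate; pi_star: inflation target;
    u_star: assumed natural rate of unemployment.
    The 0.5 and 2.0 coefficients weight the inflation and unemployment gaps
    so the two sides of the dual mandate receive "balanced" treatment."""
    return r_star + pi + 0.5 * (pi - pi_star) + 2.0 * (u_star - u)

# With inflation at target but unemployment well below the assumed natural
# rate, the rule prescribes a funds rate well above a 3 percent neutral rate:
print(balanced_approach_rule(pi=2.0, u=3.7))  # → 4.6
```

The gap between such a prescription and the actual funds rate is exactly the tension the text identifies: the FOMC’s revealed preferences do not treat a below-NAIRU labor market as symmetric with an inflation overshoot.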


I will be much briefer on the tools agenda for the Fed’s open process.  I see two broad topics—tools when policy is at or near the zero lower bound, and operating procedures when rates are materially above zero.

For tools at the zero lower bound, I’ve already suggested explorations of exactly how low rates could be moved in such a circumstance, and whether Fed credit to banks could be structured to incentivize lower rates and greater availability of credit to households and businesses.  What remains is for the Fed to draw lessons from its past implementation of asset purchases and forward guidance as they evolved over the recovery.  Which techniques were most effective?  How does the FOMC expect to deploy these tools when policy rates next flirt with whatever the effective lower bound turns out to be?  An important task is to re-examine the experience with large-scale asset purchases.  QE, as it was commonly known, seemed to arouse many fears and much opposition when it was introduced and expanded.  The concerns expressed generally have not been borne out by experience, and it will be important to build public understanding that large-scale asset purchases have a legitimate role to play in boosting recovery from recession when the ordinary instrument of monetary policy is stuck at the zero lower bound.

It is important for the Fed to talk about all this now—to tell us what their plans might be when the next recession threatens. For sure, circumstances will differ from expectations and plans will need to be adapted, but the public and political spheres need to understand that the odds are pretty good that unconventional policies will need to be utilized again and how the Fed is planning to approach that contingency.

The FOMC has already had extensive discussions of its policy tools in peacetime, when the policy rate is expected to be sustained above its effective lower bound.  Here the issue concerns the size of the portfolio—whether reserves will continue to be in abundance with rates pinned to a floor set by the interest rate on excess reserves (IOER) and reverse RP rates, or whether a much smaller portfolio and a return to a corridor system is desirable, in which the policy rate is centered between an IOER floor and a discount-rate ceiling.

The minutes of the FOMC meetings indicate that the Fed is leaning toward a floor system, which may be the right decision, but should be subject to public input and debate.  Moreover, a permanently larger portfolio raises a couple of subsidiary questions.  What will be the composition of the portfolio and will that portfolio composition be used to further financial stability goals, for example, to flatten the near-term yield curve to reduce maturity transformation incentives?

A secondary issue involves the rate of interest on reserves.  In concept this rate was to be either at comparable market rates in a floor system or below market rates in a corridor system.  In practice, the floor has been a bit spongy, with the interest rate on reserves a little above the rate at which federal funds were trading in the market.  This has been the result of a number of factors, including the large presence as sellers in the funds market of the GSEs, which do not have the option of holding deposits at the Fed, coupled with the demand-damping effects of capital and FDIC charges for banks, which have added to the expense of arbitrage between the funds market and Fed deposits.  The gap between the rate on reserves and the federal funds rate has given the appearance—though not the reality—of a subsidy to large and foreign banks holding deposits at the Fed.  That seems to be correcting now as the funds rate rises toward the rate on reserves, but the Fed needs to address what it expects to happen to these relationships in the future and how it will respond to any tendency for the interest rate on reserve deposits to exceed market rates.


Examining communication in the open process is just as important in the current environment as research on the technical aspects of strategy and tools.  Since the financial crisis, we have seen a number of attempts to involve the political process more closely in the implementation of monetary policy.  The current degree of policy independence has, in my view, served our economy well, and shortening the arm’s-length relationship of policy from political pressures would, over time, have adverse consequences.  Enhancing public understanding of the institution and how it is pursuing its legislative mandates should both build support for sustaining an appropriate degree of independence within our democratic structures and increase the ability of elected representatives to hold the institution accountable.

The Fed has taken a number of constructive steps in the past few years to further these objectives.  One has been Chairman Powell’s efforts to explain Fed policy in plainer English—you can’t build enduring support for something the public doesn’t understand.  Another is Chairman Bernanke’s inauguration of press conferences, which will now be held after each FOMC meeting instead of quarterly.  That can be tricky, as we’ve seen of late, and Chairman Powell needs to be careful that plain English doesn’t become the enemy of necessary nuance.  But press conferences are an opportunity to tell a coherent story emphasizing the consensus in the FOMC, rather than the small differences that too often dominate press coverage of the speeches of individuals.  And the ability of Congress to hold the Fed accountable in testimony has been enhanced by early release of the monetary policy report and the inclusion in that report of material on monetary policy rules.

Although I have noted several opportunities for more and clearer communication regarding strategies and tools, much more can be done.

One potential improvement is around conveying uncertainty.  I’ve already remarked on the increased emphasis on uncertainty in the Chairman’s speeches and FOMC communication.  Yet that uncertainty is not well reflected in the quarterly projection material.  Putting probability distributions around an array of up to 19 different forecasts is no easy task—this is not a consensus or best collective judgment forecast of the sort published by the Bank of England with its useful fan charts.  Confidence intervals around the medians derived from historic forecast errors are published with the minutes of FOMC meetings, but they are omitted from the material published right after the meetings that inform the press conference and media coverage.  Surely this could be corrected.

At the same time, I believe the current emphasis on the medians of these disparate projections in Fed publications and explanations also works to undermine the emphasis on uncertainty.  The Fed should stop publishing the medians and stop highlighting them in its policy explanations.  Yes, I know that others will publish medians if the Fed doesn’t, but it matters who has ownership and even more how they are used.  For small numbers, like the number of FOMC participants, the medians may not even be good representations of the central tendencies, since they are sensitive to shifts in one or two projections.  What is now labeled as the central tendencies of the projections—the middle two-thirds—should be adequate for sketching the broad outlines of the consensus, and anything more is false precision—especially without prominent confidence bands around the medians.
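The sensitivity point is easy to demonstrate. In the sketch below the 19 projections are hypothetical, not actual SEP submissions; it shows how a single participant revising one dot by a quarter point can move the headline median:

```python
# Illustrative sketch of median sensitivity with a small sample: 19 hypothetical
# rate projections (not actual SEP dots), where one revision shifts the median.
from statistics import median

dots = [2.75, 2.75,
        3.00, 3.00, 3.00, 3.00, 3.00, 3.00, 3.00,
        3.25, 3.25, 3.25, 3.25, 3.25, 3.25,
        3.50, 3.50, 3.50,
        3.75]  # 19 participants

print(median(dots))  # → 3.25

# One participant revising down by a quarter point moves the headline median...
dots[9] = 3.00
print(median(dots))  # → 3.0
```

A quarter-point swing in the widely reported median, driven by one dot out of 19, is exactly the kind of false precision the text warns against; the middle two-thirds of the distribution barely changes.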

The need for indicating uncertainty is especially pressing with regard to the so-called “dots”—the expectations of individual FOMC participants for the federal funds rate over the next few years and in the longer run.  The dots are a useful device for indicating whether the FOMC expects rates to rise or fall, and maybe even whether by a lot or a little.  But market participants and commentators seem to put much more weight on small differences in the medians of these policy projections—say, one or two or three one-quarter percentage point increases next year—than is consistent with our imprecise knowledge of economic relationships.  And this focus detracts from the more appropriate and helpful discussion of the underlying forces at play and the risks around the forecast.

In sum, my recommendation for the presentation of economic projections and their use in policy explanations is to align them better with the rhetoric of uncertainty and data dependency and the true state of economic knowledge by dropping the medians and including confidence bands.

Second, there are a number of steps the Fed could consider to further its efforts to enhance understanding of its policy beyond academia and the financial sector and its associated media outlets.

For example, it might consider the efforts of the Bank of England to reach a broader audience.  The Bank has gone to “layering” its monetary policy and financial stability policy reports.  The top layer is a simple statement of the main messages for the general public; the second layer has some charts and some explication for the financial press; the third layer has the full explanation and back up material of the monetary policy or financial stability reports.  Among other things, this technique forces the policymakers to consider very carefully what main messages they wish the public to understand.

Reaching a broader audience also implies choosing a wide variety of venues and audiences for communication and then using those venues to get across the main messages.  The Reserve Banks already speak to many different groups in their districts.  In my view, however, the Reserve Bank presidents could do a better job of explaining why the FOMC has made the decisions it has, with much less emphasis in their speeches on the small differences that divide the policymakers. To be sure, knowing that FOMC decisions are informed by diverse perspectives is important and should build confidence in the resulting policies.  But public understanding of the policy process would be helped by concentration on the story behind the FOMC’s policy choices and expectations rather than on, for example, the precise number of rate increases an individual policymaker expects next year.

The speeches of members of the Board, by contrast, seem mainly to explicate the consensus, but too rarely reach out to large and diverse audiences.   They should consider doing more of this.  Chairman Bernanke’s appearance on 60 Minutes reached a broader set of people than 100 appearances on CNBC would have, and he made his explanations in much more understandable language than is ordinarily used in these other venues.

In closing, I recognize that this is a formidable agenda for that open process and early June conference—and I’m sure Chairman Powell and Vice Chairman Clarida have some things on their agenda that I’ve omitted.  The important point is the innovation of the open and inclusive process, which will help the Federal Reserve and the public “weigh the prospects for monetary policy reform” in the spirit of today’s seminar.