Transcripts of the monetary policymaking body of the Federal Reserve from 2002–2008.

  • Good afternoon, everybody. I have a welcome for Sandy Pianalto. Since she will not officially be a Reserve Bank President for a number of days, we will put your welcome in escrow, and you may withdraw it at the appropriate time.

  • As you all know, this is the organizational meeting for the year. Accordingly, I will turn the proceedings over to Governor Ferguson.

  • Thank you very much. As all of you know, maybe with the exception of Sandy, the reason I have the floor now is to organize the election of a Chairman and a Vice Chairman of this Committee to serve until our first meeting of 2004. So, let me open the floor for nominations for the positions of Chairman and Vice Chairman of the FOMC. Don Kohn.

  • Governor Ferguson, as the Speaker of the House will say tonight, I have the great privilege and high honor of nominating Alan Greenspan to be Chairman and William J. McDonough to be Vice Chairman of this Committee.

  • All right, those sound like two credible nominees. [Laughter] Are there any other nominations? Assuming not, could I have a vote? All in favor say “aye.”

    ALL. Aye.

  • Sounds as if it’s unanimous. Congratulations yet again to you both.

  • The next item on the agenda, if I can find it, should be the vote for our staff officers. Do you have the list to read?

  • Yes, Mr. Chairman.

    Secretary and Economist: Vincent Reinhart
    Deputy Secretary: Normand Bernard
    Assistant Secretaries: Michelle Smith, Gary Gillum
    General Counsel: Virgil Mattingly
    Deputy General Counsel: Thomas Baxter
    Economists: Karen Johnson, David Stockton
    Associate Economists:
      From the Board: Thomas Connors, David Howard, David Lindsey, Charles Struckmeyer, David Wilcox
      From the Reserve Banks: Christine Cumming (proposed by President McDonough), Robert Eisenbeis (proposed by President Guynn), Marvin Goodfriend (proposed by President Broaddus), William Hunter (proposed by President Moskow), John Judd (proposed by President Parry)

  • Without objection, so be it. Would somebody like to move the selection of the Federal Reserve Bank of New York to execute transactions for the System Open Market Account?

  • I’ll move the selection of the Federal Reserve Bank of New York.

  • Conflict of interest!

  • Maybe for the record somebody else should second the motion.

    SEVERAL. I’ll second.

  • I appreciate the revision. Approved without objection. I would assume and hope that there is no objection to the incumbent Manager of the System Open Market Account, Dino Kos, continuing in that position. Without objection. Thank you very much.

    Mr. Kos, would you quickly review your memorandum on the Authorization for Domestic Open Market Operations, the Authorization for Foreign Currency Operations, the Foreign Currency Directive, and the Procedural Instructions with Respect to Foreign Currency Operations?

  • Thank you, Mr. Chairman. As this was the organizational meeting I thought I would propose a couple of housekeeping items for the domestic authorization and the “guidelines” for operations in federal agency issues, which go along with the authorization. I would just say a few words on each.

    I am proposing that the Committee approve the Authorization for Domestic Open Market Operations with one amendment. The amendment has to do with the minimum lending fee for our securities lending program. When the Committee first approved this minimum lending fee, it was set at 1 percent. That was a reasonable number given that interest rates were then far higher. With interest rates where they are today, there is the risk that if the general collateral rate goes below 1 percent or if the Committee at some point should choose to lower interest rates to 1 percent or less, that would then effectively shut down the securities lending facility. If there were some kind of disruption or otherwise a need for us to lend securities, it would not make economic sense for anybody to borrow at a negative interest rate. So the proposal that I am making is for the Committee to provide that the Manager would have discretion to set the rate depending on the circumstances. Having said that, I must tell you that I have no intention of actually changing the rate anytime soon if the Committee should approve the amendment. But, again, this amendment would give the Manager discretion to act if need be. The objectives would stay the same—that is, the program would remain a secondary source for borrowing securities and a rather temporary source. I’d be happy to answer any questions on this proposal.

  • Questions? If not, would somebody like to move it?

  • Move approval, Mr. Chairman.

  • This is the domestic authorization, right?

  • It’s really a necessary technical correction.

  • Any objection? Given no objections, it is approved.

  • Thank you. The second item has to do with the Guidelines for the Conduct of System Operations in Federal Agency Issues. In August 1999, the Committee suspended four paragraphs of these guidelines. Prior to that action, the guidelines had not been amended since 1977. In that year, agency issues were a large and actually growing portion of the System Open Market Account. By 1981, agencies were 7 percent of SOMA holdings. Since then, that number has been coming down, first because we stopped buying agencies and then because since the mid-1990s we have allowed them to roll off at maturity. But as we expanded the collateral that we take at the Desk, a couple of the provisions in the guidelines got in the way—inadvertently, I think—of our accepting Ginnie Mae securities as collateral. So we had the odd case in which we were able to take mortgage-backed securities guaranteed by government-sponsored enterprises but not by Ginnie Mae.

    In 1999, the Committee took several steps to deal with potential liquidity strains in money and financing markets in the period around the century date change. One such step was to broaden the range of acceptable collateral for repos. The way the Committee did that was to suspend temporarily—until April 30, 2000—paragraphs 3 to 6 of the guidelines. Every year since then the Committee has extended the suspension of those paragraphs. Since mortgage-backed securities have become an almost routine part of System open market operations, rather than my coming to you every year and asking for another annual suspension, I am proposing that the Committee repeal those four paragraphs. Given that agencies in the outright portion of the portfolio are de minimis now and actually will be at zero by the end of the year, this should have no material effect. But it is a cleanup of the existing guidelines for the Desk.

  • This may sound a little silly, but if we eliminate paragraphs 3 through 6, what are paragraphs 1 and 2 doing for you?

  • Well, you ask a good question. In fact, the proposal brought to the Committee in 1999 was to repeal the entire set of guidelines. As I understand it, there was a view in the Committee that the first two paragraphs provided a sense of how the Committee felt about federal agency issues. Retaining those two paragraphs didn’t constrain the Desk at all. So, certainly if the entire set of guidelines were repealed, that would be fine from my perspective. But it’s up to the Committee whether it wants to retain some sense of how it views agencies in this context.

  • Well, just as a matter of housekeeping, I would suggest that when you review the authorization next time—not this time but next year—you might see if you could add a sentence or two to the authorization that conveys the sense of those two paragraphs. If that could be put in the authorization, then we wouldn’t need the guidelines any more. That’s just a suggestion.

  • Further questions on this issue? If not, would somebody like to move approval?

  • Move approval of eliminating paragraphs 3 through 6 of the guidelines.

  • Discussion? All in favor?

    SEVERAL. Aye.

  • Opposed? Approved without objection.

  • Finally, the third item is a proposal to renew without amendment the Foreign Currency Authorization, the Foreign Currency Directive, and the Procedural Instructions with Respect to Foreign Currency Operations.

  • I’m not going to make another speech about intervention, Mr. Chairman, but I guess I will respectfully do my tri-annual “no” vote on the foreign currency instruments. If anything, I think we have obtained increased credibility in this Committee, and I am even more strongly against our taking intervention actions now than in the past. So, I thought I’d mention that.

  • I’d merely like to state that we’ve had virtually no intervention—in fact I don’t recall any—in the recent past. You may remember that the issue had to do with the Secretary of the Treasury and his willingness to accept our general view as to how intervention works and its impact on markets. That view has generally been accepted. Apparently for political reasons it cannot be accepted fully as an unequivocal directive because the State Department and others would like to hold it slightly in abeyance. But as a practical matter I think the Treasury is doing rather well in this area, and I trust that the incoming Secretary of the Treasury will hold very much the same views. My suspicion is that he does. Would somebody like to move approval?

  • Is there a second?

    SEVERAL. Second.

  • All in favor say “aye.”

    SEVERAL. Aye.

  • You all received a memorandum on the Program for Security of FOMC Information. Does anybody have any questions relevant to that? If not, would somebody like to move the revisions?

  • Without objection. Thank you very much. We have a staff memorandum proposing FOMC rule changes that redefine a meeting quorum, rescind the outdated Emergency Interim Committee, and authorize the appointment of an interim Manager in an emergency. These do not seem particularly controversial. I wondered if anybody had any questions or issues they’d like to raise or if we can go immediately to a motion. Would somebody like to make a motion then?

  • Without objection. Now we are back to the usual formal structure of our meetings. I ask your approval of the minutes for the meeting of December 10, 2002.

  • Approved without objection. We now go to an interesting issue on which we’ve gotten a significant amount of briefing material. We will hear from Messrs. Sack, Tetlow, Croushore, and Rudebusch. Mr. Sack, would you start us off?

  • Yes. I am going to refer to the charts that were distributed. They don’t have a cover on them, but the words “Smoothness of the Federal Funds Rate” are at the top of the first chart. For some time, economists have noted that monetary policy rates in major industrialized countries tend to change only gradually. A case in point is the intended federal funds rate, plotted in the top panel of exhibit 1. As can be seen, the stance of policy is typically adjusted in sequences of relatively small steps in the same direction. Over the sample shown, nine out of ten of the policy actions moved in the same direction as the previous change. Moreover, nearly all of the policy moves in the sample were 50 basis points or less.

    The smoothness of the federal funds rate can be captured more formally by estimating simple monetary policy rules that measure the systematic response of the policy instrument to key macroeconomic variables. An example is shown in the bottom left panel. In this equation, as in John Taylor’s original specification, the quarterly average level of the federal funds rate is assumed to respond to the current output gap and the inflation rate over the previous year. However, in this version we also allow the previous quarter’s federal funds rate to enter the equation. Estimating this rule using real-time data from 1987 through 2000, we find that the coefficient on the lagged policy rate is strongly significant and not far below unity—a finding that is common in the research literature. This coefficient causes predicted movements in the federal funds rate to be more inertial than otherwise.

  • Do you have an R² on that? This is one case in which the lagged dependent variable has policy implications.

  • Right, so the R² is very high—it’s 0.96—reflecting that the lagged dependent variable soaks up a lot of the explanatory power.

  • That means that historically our decisions have been almost right but not quite. [Laughter]

  • One can also express the R² in terms of the model’s ability to predict interest rate changes instead of the level, and there it falls off a good degree—down to 0.42. So part of this, as you noted, is that the level of the funds rate is very predictable.

  • But we don’t like to be predictable!

  • That means our value added is 0.04. [Laughter]
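    The estimated rule and the two R² measures discussed above can be sketched numerically. The following is an illustrative reconstruction on synthetic data: the partial-adjustment form of the rule follows the briefing, but every coefficient, shock size, and data series below is invented and should not be read as the staff's actual estimates.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 400

# Assumed "true" rule, in the partial-adjustment form described in the briefing:
#   i_t = rho*i_{t-1} + (1 - rho)*(a + b*gap_t + c*pi_t) + eps_t
# All numbers below are illustrative, not the staff's estimates.
rho, a, b, c = 0.76, 1.0, 0.8, 1.5

gap = np.empty(T)          # output gap, a persistent AR(1) process (assumed)
pi = np.empty(T)           # four-quarter inflation around 1.5 percent (assumed)
gap[0], pi[0] = 0.0, 1.5
for t in range(1, T):
    gap[t] = 0.9 * gap[t - 1] + rng.normal(0.0, 0.5)
    pi[t] = 1.5 + 0.9 * (pi[t - 1] - 1.5) + rng.normal(0.0, 0.2)

i = np.empty(T)
i[0] = a + b * gap[0] + c * pi[0]
for t in range(1, T):
    target = a + b * gap[t] + c * pi[t]
    i[t] = rho * i[t - 1] + (1 - rho) * target + rng.normal(0.0, 0.125)

# OLS of i_t on a constant, i_{t-1}, gap_t, and pi_t
X = np.column_stack([np.ones(T - 1), i[:-1], gap[1:], pi[1:]])
y = i[1:]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
rho_hat = beta[1]          # coefficient on the lagged funds rate

# R-squared in levels versus first differences of the funds rate
resid = y - X @ beta
r2_levels = 1 - np.sum(resid**2) / np.sum((y - y.mean()) ** 2)
dy = np.diff(i)
r2_changes = 1 - np.sum(resid**2) / np.sum((dy - dy.mean()) ** 2)

print(round(rho_hat, 2), round(r2_levels, 2), round(r2_changes, 2))
```

    As in the briefing, the fit in levels is very high largely because the lagged rate soaks up the explanatory power, while the fit in changes is noticeably weaker.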

  • Of course, simple rules such as this one provide only a rough description of actual policy decisions, and it would not be surprising if the parameters of these rules would change over time. One possible instance of this is hinted at by the easing of 2001, highlighted in the bottom right panel, which was more rapid than would have been expected under the policy rule estimated through 2000. This episode brings into sharper focus the question of the appropriate speed of adjustment for monetary policy, which is the topic of this briefing. More specifically, we investigate whether gradual movements in the federal funds rate are desirable in terms of achieving the macroeconomic objectives of the Federal Reserve—maximum sustainable employment and price stability.

    As summarized in the top panel of exhibit 2, we consider the monetary policy rule that would be optimal assuming that the FOMC desires to limit squared deviations of inflation from a target level and of the unemployment rate from its equilibrium level over all future periods. Inflation and unemployment deviations are penalized equally, and the target for core PCE inflation is taken to be 1½ percent. Note that policymakers do not have a direct preference for smoothness in the federal funds rate. The exercise also assumes that FRB/US is the correct characterization of the economy. We can compute the policy that maximizes the assumed objectives of the FOMC given that model, assuming that the policy is required to have the same form as the estimated rule from the prior exhibit.
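    The optimization just described can be sketched with a toy computation. The two-equation backward-looking model below is invented for illustration (it is not FRB/US), and every coefficient, shock size, and grid value is an assumption; the point is only the mechanics of choosing rule coefficients to minimize the squared-deviation loss.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
T = 2000
e = rng.normal(0.0, 0.5, T)    # demand shocks (assumed)
u = rng.normal(0.0, 0.2, T)    # inflation shocks (assumed)

R_STAR, PI_STAR = 2.0, 1.5     # assumed equilibrium real rate and inflation target

def loss(rho, b, c):
    """Mean squared deviation of inflation from target and the gap from zero,
    simulated under a candidate rule with a common sequence of shocks."""
    gap_prev, pi_prev = 0.0, PI_STAR
    i_prev = R_STAR + PI_STAR
    total = 0.0
    for t in range(T):
        # Toy backward-looking economy (coefficients assumed):
        gap = 0.9 * gap_prev - 0.2 * (i_prev - pi_prev - R_STAR) + e[t]
        pi = pi_prev + 0.2 * gap_prev + u[t]
        target = R_STAR + pi + b * gap + c * (pi - PI_STAR)
        i = rho * i_prev + (1 - rho) * target
        total += gap**2 + (pi - PI_STAR) ** 2
        if abs(gap) > 50:      # guard against explosive candidates
            return float("inf")
        gap_prev, pi_prev, i_prev = gap, pi, i
    return total / T

grid = itertools.product([0.0, 0.25, 0.5, 0.76],   # lagged-rate coefficient
                         [0.5, 1.0, 2.0, 3.0],     # gap response
                         [0.5, 1.0, 2.0, 3.0])     # inflation response
best = min(grid, key=lambda p: loss(*p))
print(best, round(loss(*best), 3))
```

    By construction the grid includes an "estimated-like" inertial rule (0.76, 0.5, 0.5), so the search can only do at least as well as that rule under the assumed loss.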

    As shown in the middle left panel, the coefficients of the calculated policy rule, which we will refer to as the “optimal” rule, differ considerably from those of the estimated rule. In particular, the optimal rule is much more responsive to current economic conditions, as indicated by the larger coefficients on inflation and the output gap, and is much less inertial, as indicated by the smaller coefficient on the lagged federal funds rate. We will focus on both of these characteristics, since each of them would increase the volatility of the federal funds rate relative to the estimated policy rule.

    The more aggressive nature of the optimal policy rule can be seen by looking at its prescription for the path of the federal funds rate going forward, assuming that the economy will be subject to the sequence of shocks implied by the Greenbook projection. As shown by the dotted line in the middle right panel, the optimal rule calls for an immediate easing of policy to zero percent in order to boost output toward potential more quickly and then a substantial tightening over the second half of this year to unwind the accommodative stance of policy. By contrast, the path of the federal funds rate under the estimated policy rule, the solid line, is much more stable. The panel also shows the policymaker perfect foresight path that appears in the Bluebook (the dashed line labeled PPF). This path is computed from an exercise that is similar to the one just described, but with two important differences. First, policymakers are endowed with more information, including knowledge of the sequence of shocks that will be hitting the economy in the future. Second, policymakers are assumed to have a preference for a smooth path of the federal funds rate. This latter assumption importantly influences the shape of the PPF path: If the smoothing preference were removed, the PPF policy would push the federal funds rate immediately to zero. Thus, the PPF exercise also suggests that monetary policy should be much more aggressive in the absence of a preference for smoothing.

    At first glance, it might seem surprising that the optimal policy calls for movements in the funds rate that are so much more aggressive and volatile than those observed. As described in the lower panel, these characteristics of the optimal rule hinge on three key assumptions underlying this exercise. First, we assumed that the private sector continues to form its expectations as if the FOMC were following its historical policy rule, even when the true policy differs considerably. This assumption implies that large policy moves have sizable effects on the yield curve and on real activity, because private agents do not expect those actions to be quickly reversed. Second, we assumed that the FOMC knows the structure of the economy with certainty and in particular faces no uncertainty about the impact of its policy actions, no matter how drastic they are. Third, we assumed that the FOMC is able to observe the current state of the economy perfectly. That is, there is no measurement error in the macroeconomic data or in policymakers’ estimates of potential output and other unobserved variables. Under these circumstances, there is nothing to prevent the central bank from pushing very hard on the economy to achieve its objectives; hence the optimal monetary policy responds aggressively to macroeconomic variables and is quick to reverse course. But clearly these three assumptions do not hold in practice. In the next three exhibits, we investigate the implications of relaxing each assumption in turn.

  • As just noted, the finding that the federal funds rate should be very aggressive under the optimal policy depends crucially on the manner in which private agents form their expectations, the subject of exhibit 3. The previous exercise assumed that those expectations are formed under the belief that the FOMC is following its historical policy rule, even if that rule is changed going forward. In practice, private agents are likely to take into account changes in the way policy is implemented.

    Consider what happens when the private sector understands the policy rule that the central bank is actually using. As summarized in the top panel, if policymakers are following an inertial policy rule, private agents will expect the initial response of the federal funds rate to a macroeconomic shock to be followed by additional policy changes in the same direction. Moreover, those actions will be expected to unwind only gradually as the shock dissipates. Expectations of this persistent response of policy will be incorporated into current asset prices and economic decisions, thus bringing forward the effects of those future policy actions. As a result, an inertial policy response can have an immediate and sizable impact on economic variables, even with relatively small movements in the federal funds rate in each period. By contrast, the large but transitory movements in the federal funds rate that were found to be optimal in the previous exhibit will be less effective, because private agents will recognize the change in the policy rule and look through the near-term swings in the interest rate.

    To illustrate the importance of this consideration, we conduct an experiment in which the public forms its expectations as a weighted average of forward- and backward-looking terms. As the middle panel notes, the degree of forward-looking behavior is governed by a single parameter, φ. When φ equals zero, the model corresponds to the version of FRB/US used in exhibit 2, in which expectations are formed using a “backward-looking” vector autoregression (VAR) model. When φ equals unity, expectations are rational—meaning that households and firms fully understand the structure of the model and the policy rule in forming their expectations.

    With this set-up, we can calculate the optimal policy rule for different degrees of forward-looking behavior. As shown in the bottom left panel, the optimal coefficient on the lagged federal funds rate moves higher as the parameter φ increases. That is, the optimal monetary policy rule becomes more inertial when the public is forward-looking, since expectations of future policy actions lead to larger and more-persistent movements in bond rates and other asset prices that effectively counterbalance the persistent effects of macroeconomic disturbances. Note, however, that the historical estimate of the coefficient on the lagged federal funds rate (0.76) is reached only when expectations are formed in an almost completely forward-looking manner. The bottom right panel shows the complete optimal policy rule for three choices of the degree of forward-looking behavior. As just noted, in the case in which φ equals unity, the coefficient on the lagged policy rate is quite high—even higher than that from the estimated rule. Nevertheless, the coefficients on the output gap and inflation are about three times larger than their estimated values. Thus, the optimal rule still calls for much more volatile movements in the federal funds rate than are observed.
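    The mechanism in this exhibit—that an inertial rule gains leverage over long rates when agents anticipate the follow-through—can be illustrated with a small deterministic calculation. The numbers are assumed for illustration: a shock that decays at 0.9 per quarter, a ten-quarter "long rate" equal to the average expected short rate, and arbitrary smoothing coefficients.

```python
import numpy as np

def response_paths(rho, horizon=10, persistence=0.9):
    """Expected funds-rate path after a unit shock under the partial-adjustment
    rule i_t = rho*i_{t-1} + (1 - rho)*s_t, where the shock s_t = persistence**t.
    Returns the initial move and a long rate equal to the average expected short rate."""
    i, path = 0.0, []
    for t in range(horizon):
        i = rho * i + (1 - rho) * persistence**t
        path.append(i)
    return path[0], float(np.mean(path))

i0_flat, long_flat = response_paths(rho=0.0)    # no inertia
i0_inert, long_inert = response_paths(rho=0.8)  # strongly inertial

# Long-rate movement per unit of initial funds-rate movement
lev_flat = long_flat / i0_flat
lev_inert = long_inert / i0_inert
print(round(lev_flat, 2), round(lev_inert, 2))
```

    Under these assumptions the inertial rule moves the funds rate far less on impact yet generates a much larger long-rate response per unit of that initial move.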

    Another possible source of gradualism in policy setting is parameter uncertainty, the topic of exhibit 4. In the analysis so far, policymakers have been assumed to know the exact structure of the economy; uncertainty has entered only through additive error terms. To see the effects of this assumption, consider the situation in the top left panel, which shows the relationship between the output gap (plotted on the horizontal axis) and the real interest rate, r, relative to its equilibrium level, r* (plotted on the vertical axis). The solid line represents policymakers’ best estimate of the level of the output gap that would be realized at any given setting of the real interest rate. But because of an additive error term, the realized outcome could lie above or below the central estimate, as indicated by the shaded region. As noted in the top right panel, the amount of uncertainty is not affected by the policy decision. For this reason, it turns out that additive uncertainty has no effect on the optimal policy setting. That is, policymakers should ignore the uncertainty and set policy based on their best estimate of the likely outcome for macroeconomic variables.

    However, this framework neglects the obvious fact that policymakers face considerable uncertainty about the values of key parameters in their models of the economy. To see the implications of this type of uncertainty, suppose policymakers are unsure of the value of the policy multiplier, as shown in the middle left panel. The policymakers’ best estimate of the relationship again is represented by the solid line, but the actual slope of the relationship could be higher or lower. In those circumstances, policymakers face more uncertainty about the outcome for the output gap the further the real interest rate deviates from its equilibrium level, as indicated by the shaded region. As we note in the middle right panel, this example highlights the key implication of parameter uncertainty—that uncertainty about future economic conditions will be importantly affected by current monetary policy decisions—a factor that policymakers should take into account in formulating those decisions. Under the objectives assumed, policymakers will tend to shade their policy actions toward choices that reduce uncertainty about future levels of unemployment and inflation. Of course, the middle left panel presents just one specific example of parameter uncertainty. Many other parameters are also unknown, in which case the variance-minimizing policy will not be to hold the real interest rate at its equilibrium level. In general, the effects of parameter uncertainty depend crucially on which parameters are unknown and on the variances and covariances of those parameters.
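    The shading described here has a classic closed-form illustration. The Brainard-style example below is a textbook construction, not the briefing's model, and its numbers are assumed: with a gap of y = a − b·r and an uncertain multiplier b, minimizing E[(a − b·r)²] gives r = a·b̄/(b̄² + σ_b²), which is smaller than the certainty-equivalent setting a/b̄.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed numbers: gap absent policy action a, uncertain policy multiplier b
a = 2.0
b_mean, b_sd = 1.0, 0.5
b_draws = rng.normal(b_mean, b_sd, 200_000)

def expected_loss(r):
    """Monte Carlo estimate of E[(a - b*r)^2] for a given rate setting r."""
    return float(np.mean((a - b_draws * r) ** 2))

r_certainty = a / b_mean                          # ignores parameter uncertainty
r_shaded = a * b_mean / (b_mean**2 + b_sd**2)     # attenuated optimum

print(round(r_certainty, 2), round(r_shaded, 2))
print(round(expected_loss(r_certainty), 2), round(expected_loss(r_shaded), 2))
```

    The aggressive certainty-equivalent setting injects extra variance through the uncertain multiplier, so the attenuated setting achieves a lower expected loss.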

    Unfortunately, the complexity of FRB/US makes it difficult to incorporate parameter uncertainty directly into simulations of that model. But we can quantify the effects of parameter uncertainty using the simpler VAR model utilized to characterize expectations in FRB/US. As summarized in the bottom left panel, the VAR captures the dynamics of key macroeconomic variables, including inflation and the output gap. In addition, the VAR provides a convenient measure of parameter uncertainty—namely, the variance-covariance matrix of the estimated coefficients. We can use this measure to assess the effect of parameter uncertainty on the optimal monetary policy rule.

    The results are shown in the bottom right panel in two steps. We first compute the optimal policy rule that ignores parameter uncertainty by assuming that the estimated coefficients from the VAR are known with certainty. As shown in the first line of the table, the optimal policy under those assumptions is much more aggressive than the estimated policy rule, as reflected in the larger coefficients on the output gap and inflation and the smaller coefficient on the lagged policy rate. These findings are qualitatively similar to those found earlier using the FRB/US model. However, when we allow policymakers to recognize that the coefficients of the VAR are uncertain, they choose a policy with smaller coefficients on inflation and the output gap and a larger coefficient on the lagged interest rate. Parameter uncertainty therefore moves the optimal rule in the direction of the estimated rule, but it seems to fall well short of explaining the observed degree of smoothness in the federal funds rate.

  • Exhibit 5 addresses the issue that the macroeconomic variables used to formulate real-time monetary policy decisions are sometimes poorly measured. There are two potentially important sources of measurement error. The first is that initial releases of key macroeconomic data are imprecise and subject to revision. To quantify the magnitudes of such revisions, we use a data set maintained by the Federal Reserve Bank of Philadelphia that records the values of major macroeconomic time series as they were available to policymakers at specific dates in the past.

    The top left panel illustrates this type of uncertainty by showing the distribution of revisions to the quarterly growth rate of real output over the quarter following the initial data release for the period from 1965 to 2002. Clearly, revisions can be substantial. As shown in the first line in the top right panel, the average absolute size of this revision is about ⅔ percentage point of output. Moreover, as indicated in the remaining lines, subsequent revisions can also be large, in part reflecting rebasing and other methodological changes. The revisions to other NIPA variables show broadly similar patterns.

    As noted in the middle left panel, the second source of measurement error arises because many of the variables that have prominent roles in our economic models are not directly observed and must instead be estimated. These variables include potential output, expected inflation, and the equilibrium real interest rate. Inevitably, estimates of these variables are subject to significant error that can be highly persistent, given the substantial lags involved in detecting important structural changes in the economy.

    These two sources of measurement error generate considerable uncertainty about many of the variables that are integral to monetary policy decisions. Some of the strongest policy implications likely come from errors in the measurement of the output gap. The dotted line in the middle right panel shows the staff’s real-time assessment of the output gap since 1980—that is, the estimate of the contemporaneous output gap made at the time shown. Those estimates differ considerably from the staff’s current assessment of the output gap for that period, the solid line. As indicated in the inset, the real-time errors implied by the difference between these two estimates have a standard deviation of 1¾ percentage points of output and a high degree of serial correlation.

    As summarized in the bottom left panel, mismeasurement of the output gap could, in principle, have no effect on policy. This would be the case if the real-time estimate of the output gap were uncorrelated with subsequent revisions to that estimate, which might occur if the revisions were based on information not available at the time of the original estimate. In practice, however, that condition has not been met, and large initial estimates have often been revised to be smaller. Under such circumstances, the optimal policy should attenuate its response to the real-time output gap measure, in order to reduce movements in the interest rate in response to the noise in that measure.
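    The attenuation logic can be sketched as a signal-extraction calculation. Assumptions here: the true gap and the measurement noise are independent and normally distributed, the noise standard deviation borrows the 1¾-point figure cited above, and the true-gap standard deviation of 2 is invented. The serial correlation in the errors that the briefing emphasizes is ignored for simplicity.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

sigma_true, sigma_noise = 2.0, 1.75    # true-gap sd assumed; noise sd from the briefing
g = rng.normal(0.0, sigma_true, n)             # "true" output gap
m = g + rng.normal(0.0, sigma_noise, n)        # real-time measured gap

# The best linear estimate of the true gap given the measurement is kappa * m,
# so a rule responding to the measured gap should scale its response by kappa.
kappa_theory = sigma_true**2 / (sigma_true**2 + sigma_noise**2)
kappa_sample = float(np.cov(g, m)[0, 1] / np.var(m))
print(round(kappa_theory, 3), round(kappa_sample, 3))
```

    With noise of this size the optimal response to the measured gap is scaled down to a little over half of the full-information response, which is the attenuation described in the panel.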

    To quantify the effect of measurement error on monetary policy, we return to the FRB/US exercise from exhibit 2, only now assuming that the real-time estimate of the output gap available to the FOMC contains a random error that has the same properties as those observed from 1980 to 1998. As the bottom right panel indicates, measurement error leads to some attenuation of the policy response to the measured output gap (the middle column). However, that coefficient remains well above its estimated value. Moreover, the other coefficients of the policy rule, including that on the lagged policy rate, are largely unaffected. Of course, mismeasurement of variables other than the output gap might also have important policy implications that are not captured by this exercise. Indeed, some recent research suggests that incorporating uncertainty about the equilibrium real interest rate might result in further attenuation and some additional inertia for the optimal policy rule.

    The top panel of exhibit 6 summarizes our findings. A simple analysis of optimal policy—one that uses the backward-looking version of FRB/US and assumes that uncertainty enters only through additive shocks—indicates that monetary policy should move more forcefully in response to changes in macroeconomic variables and be less inertial than observed on average since the mid-1980s. We have investigated the sensitivity of that conclusion to three factors—forward-looking behavior, parameter uncertainty, and measurement error. These factors move the optimal policy in the direction of the estimated policy rule, but none of the factors alone seems to fully explain the observed smoothness of the federal funds rate. An important caveat to this finding is that we consider each of the factors separately, owing to the analytical difficulty involved in combining them. These factors likely interact in ways that could affect the desirable degree of interest rate smoothing.

    Even with the extensions considered, our analysis surely fails to capture important aspects of the policymaking environment. For example, as noted in the bottom left panel, policymakers face considerable uncertainty about the structure of the model itself. All models are approximations and therefore ignore specific variables that could at times become relevant for policy decisions. Also, the models we have used for this briefing are essentially linear, whereas the economy may under some conditions demonstrate large, discrete responses to monetary policy or other events. Policymakers’ concerns about uncertainty may be exacerbated by the fact that some of the optimal policy rules considered called for large swings in the federal funds rate that are well outside of historical norms. The problems involved with significant nonlinearities, for example, might be reflected in a concern by the FOMC about financial fragility. Such a concern could generate a smoother path for the federal funds rate if large policy changes were viewed as having adverse effects on the functioning of financial markets. Transcripts of FOMC meetings show that members of the Committee have at times argued for smaller interest rate changes based on concerns about financial fragility.

    The smoothness of the federal funds rate could also result from various institutional aspects of the policymaking process, as summarized in the bottom right panel. For example, the fact that policy decisions are made by a committee, and thus require building a consensus, could generate some inertia in realized policy actions. Alternatively, the FOMC might seek to avoid reversals in the direction of the policy instrument. Such an approach would presumably involve sacrificing some macroeconomic performance, but it might be perceived to have other benefits, such as allowing Committee members to more easily explain their policy choices to the public. Judging from simulations of the FRB/US model, reversals would occur much more often under the optimal policy rule than under the estimated policy rule.

    Overall, while no model can encompass all the issues that pertain to interest rate smoothing, the analysis above at least provides some benchmarks against which to gauge the appropriate pace of monetary policy adjustment. Because the observed policy is much smoother than would be prescribed by these benchmarks, it is of interest to determine what aspects of our models or of our assumed preferences may be misspecified.

  • Were you planning to move to questions and comments now?

  • You can ask a question whenever you want to, Mr. Chairman. But our intention was to take questions at the end of the presentations.

  • I don’t know if Glenn’s presentation is a formal part of the preceding ones or whether there’s a discontinuity.

  • They are two separate presentations, but they are on the same topic.

  • Well, the reason I ask is that some of the issues that may come up may be addressed in your remarks.

  • Yes, that is certainly the case. That is why we thought we’d run them all together, Mr. Chairman.

  • My presentation is entitled “Monetary Policy Inertia,” and I will be referring to a handout that was distributed. Based on monetary policy rules estimated from quarterly data, many economists hold the view that the Fed adjusts monetary policy at a very sluggish pace, specifically, that it distributes desired changes in the funds rate over several quarters—a behavior that is often termed “monetary policy inertia.” I will argue, instead, that there is essentially no policy inertia at a quarterly frequency and that, in fact, the funds rate typically is adjusted fairly promptly to economic developments—within a single quarter anyway. In large part, my argument is based on evidence of a very limited ability of financial markets (for example, Eurodollar futures or fed funds futures) to forecast the next few quarterly changes in the funds rate. Such evidence is informative about policy inertia because any partial policy adjustment obviously means that there is some remaining portion of the policy action that is postponed to the future and is thus predictable. Therefore, the absence of funds rate predictability implies the absence of significant partial adjustment by the Fed.

    Before I flesh out this argument, however, let me start on page 1 of the handout and delineate two types of monetary policy inertia. These two types of inertia are often confused in the literature, with discussion of one type often mistakenly applied to the other; however, the two types of policy inertia operate at very different horizons. First, there is very short-term policy inertia or interest rate smoothing that I think does exist, but I won’t be discussing it in detail today. This short-term or week-to-week partial adjustment of the funds rate involves, for example, cutting the funds rate by two 25 basis point moves in fairly quick succession, rather than reducing the funds rate just once by 50 basis points. This type of gradualism or interest rate smoothing was more prevalent in the past, when intermeeting moves were more frequent, though it still goes on to some extent—perhaps induced by concerns of financial market fragility. In any case, this short-term policy inertia is essentially unrelated to the quarterly policy inertia that is relevant for most macroeconomic discussions—including my own.

    Quarterly policy inertia is defined as the slow partial adjustment of the federal funds rate on a quarter-to-quarter basis. For example, if the Fed knew it wanted to increase the funds rate by 1 percentage point, it actually would raise the rate only about 20 or 25 basis points per quarter for the next few quarters. It is this type of inertia that I find suspect. The apparent evidence supporting quarterly policy inertia is summarized on the second page of my handout. This evidence is based on estimates of monetary policy rules or reaction functions—that is, estimated equations that attempt to model Fed behavior. These estimated equations usually take a standard partial-adjustment form, where the current funds rate is set as a weighted average of last quarter’s actual funds rate and the current quarter’s desired rate. This partial-adjustment form is displayed as the first equation on page 2. The parameter ρ is the weight on last quarter’s funds rate, and 1 − ρ is the weight on the current desired level. A high ρ means that the funds rate will be adjusted very slowly to its desired level. Based on quarterly data, estimates of ρ are typically around .75, which puts a ¾ weight on the lagged rate and a ¼ weight on the desired rate. Thus, these empirical rules imply a very slow speed of adjustment of the policy rate—specifically, the Fed would move the funds rate only 25 percent of the way toward its desired level each quarter. This sluggish adjustment of the funds rate over several quarters is widely interpreted as evidence of “interest rate smoothing” or “monetary policy inertia.”
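    As a purely hypothetical numerical sketch (not part of the original presentation), the partial-adjustment arithmetic can be traced directly, assuming ρ = 0.75 and a desired rate held 1 percentage point above the starting rate:

```python
# Sketch of quarterly partial adjustment: i_t = rho*i_{t-1} + (1 - rho)*i_star.
# All numbers are illustrative; rho = 0.75 matches the estimate cited above.
rho = 0.75
i_star = 5.0   # desired funds rate (percent), held fixed for the illustration
i = 4.0        # starting funds rate: 1 percentage point below desired

path = []
for quarter in range(4):
    # Each quarter the rate closes only (1 - rho) = 25% of the remaining gap.
    i = rho * i + (1 - rho) * i_star
    path.append(round(i, 3))

print(path)  # the gap to the desired rate shrinks by a factor of 0.75 per quarter
```

    After four quarters the rate has covered only about two-thirds of the desired 1-point move, which is the sluggishness the estimated rules imply.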

    For example, before each FOMC meeting, the financial indicators packet that is distributed contains two estimated monetary policy rules: one with and one without quarterly policy inertia. Both rules set the desired funds rate according to the Taylor rule, which is displayed as the second equation. In the Taylor rule, the desired level of the funds rate is based on current readings for the output gap and inflation rate. The α and β parameters calibrate the policy response to these determinants. The funds rate recommendations from these two rules are shown in the chart at the bottom of page 2. The solid line shows the actual path of the funds rate as a quarterly average. The dashed line shows the estimated rule with no inertia—so ρ = 0. The dotted line shows the rule with inertia—where ρ is estimated by the Board staff to be .75. The Taylor rule without inertia (the dashed line) fits the actual funds rate fairly well, but there are some large persistent deviations. Notable deviations include 1992 and 1993, when the actual rate was held below the rule; 1996, when the rate was pushed above the rule; and 1999 and 2002, when again the funds rate fell below the rule. The nature of these deviations is a key element in understanding the evidence for policy inertia. The Taylor rule with inertia (the dotted line) largely eliminates these deviations and matches the historical path of the funds rate much more closely. That is, the lagged funds rate in this estimated equation is statistically significant. Although this type of econometric evidence appears convincing, it is valid only if the equation is specified correctly. If the desired funds rate also depends on persistent factors other than the current output and inflation terms in the Taylor rule, then such a misspecification could result in a spurious finding of partial adjustment. Accordingly, based only on these types of policy rule estimates, it is very difficult to distinguish between two interpretations: that the Fed adjusts sluggishly, or that the Fed generally follows the Taylor rule with no policy inertia but sometimes deviates from the rule for several quarters at a time in response to other factors.
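    The misspecification point can be made concrete with a small simulation (all parameter values are hypothetical, chosen only for illustration): generate a funds rate that follows a Taylor-type rule with no inertia at all but that also responds to a persistent omitted factor, then estimate the standard partial-adjustment regression. The lagged funds rate picks up a substantial coefficient even though the true ρ is zero:

```python
import numpy as np

# Illustrative simulation of spurious partial adjustment arising from an
# omitted persistent variable. All parameter values are hypothetical.
rng = np.random.default_rng(0)
T = 2000

z = np.zeros(T)      # persistent omitted factor (e.g., a credit-crunch term)
gap = np.zeros(T)    # output gap
infl = np.zeros(T)   # inflation
for t in range(1, T):
    z[t] = 0.9 * z[t - 1] + rng.normal(scale=0.25)
    gap[t] = 0.8 * gap[t - 1] + rng.normal(scale=0.5)
    infl[t] = 0.8 * infl[t - 1] + rng.normal(scale=0.3)

# True rule: prompt response every quarter, NO lagged funds rate term.
i = 2.0 + 0.5 * gap + 1.5 * infl + z

# Misspecified regression: i_t on a constant, i_{t-1}, gap_t, infl_t (z omitted).
X = np.column_stack([np.ones(T - 1), i[:-1], gap[1:], infl[1:]])
beta = np.linalg.lstsq(X, i[1:], rcond=None)[0]
print(f"estimated rho on lagged funds rate: {beta[1]:.2f}")  # well above zero
```

    Because the omitted factor z is serially correlated and embedded in the lagged funds rate, ordinary least squares attributes its persistence to partial adjustment.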

    I will discuss the nature of these persistent deviations in a minute, but first—on page 3—I want to provide some evidence against policy inertia from a different source. This evidence is based on a key implication of policy inertia—namely, that the presence of inertia should imply predictive information in financial markets about the future path of the funds rate. Intuitively, if the funds rate typically is adjusted 25 percent toward its desired target in a given quarter, there’s a remaining 75 percent of the adjustment that should be expected to occur in future quarters. In a wide variety of settings, such delayed policy adjustment ensures forecastable future variation in the funds rate. Assuming that financial markets understand the inertial nature of policy, they should anticipate the future partial adjustment of the funds rate. In this case, a regression of actual changes in the funds rate on predicted changes should yield a good explanatory fit and a fairly high R². In fact, researchers have found the opposite. They have estimated interest rate forecasting regressions and, using financial market expectations, have found little predictive information beyond a few months. For example, Eurodollar futures have essentially no ability to predict the quarterly change in the funds rate three quarters ahead. The R² of such a regression is zero.
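    The predictability implication can also be checked in a stylized simulation (hypothetical parameters, not the regressions cited above): let the desired rate follow a random walk and let the actual rate partially adjust with ρ = 0.75. A market that understands the rule forecasts that (1 − ρ) of the remaining gap closes next quarter, and regressing actual changes on those forecasts yields a sizable R²; with ρ = 0 the forecastable component vanishes entirely.

```python
import numpy as np

# Illustrative check: inertia implies forecastable funds rate changes.
rng = np.random.default_rng(1)
T = 5000
rho = 0.75   # hypothetical partial-adjustment weight

# Desired funds rate follows a random walk (its shocks are unforecastable).
i_star = np.cumsum(rng.normal(scale=0.25, size=T))

# Actual rate closes (1 - rho) of the gap to the desired rate each quarter.
i = np.zeros(T)
for t in range(1, T):
    i[t] = rho * i[t - 1] + (1 - rho) * i_star[t]

# Rational one-quarter-ahead forecast of the change:
# E_t[i_{t+1}] - i_t = (1 - rho) * (i_star_t - i_t), since E_t[i*_{t+1}] = i*_t.
predicted = (1 - rho) * (i_star[:-1] - i[:-1])
actual = np.diff(i)

# R^2 from regressing actual changes on predicted changes.
X = np.column_stack([np.ones(T - 1), predicted])
beta, ssr = np.linalg.lstsq(X, actual, rcond=None)[:2]
r2 = 1 - ssr[0] / np.sum((actual - actual.mean()) ** 2)
print(f"R^2 = {r2:.2f}")  # sizable under inertia; with rho = 0 it would be zero
```

    The contrast with the near-zero R² found in the actual futures-market data is the heart of the argument against quarterly inertia.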

    This lack of predictive ability is well illustrated by the most recent episode of easing. The chart at the bottom of the page gives the actual funds rate target and various expected funds rate paths as of the middle of each quarter based on fed funds futures. Under quarterly policy inertia, the long sequence of target changes in the same direction in 2001 would be viewed as a set of gradual partial adjustments to a low desired rate. However, although the funds rate gradually fell in 2001, market participants actually anticipated few of these declines at a six-to-nine-month horizon, as they would have if policy inertia were in place. Instead, markets assumed at each point in time that the Fed had adjusted the funds rate down to just about where it wanted the funds rate to remain based on current information available. Under this interpretation, the long sequence of declines is the result of a series of fairly prompt responses to new information that turned progressively more pessimistic. That is, the presence of quarterly partial adjustment or policy inertia is contradicted by the lack of the forecastability of changes in the funds rate.

    Turning to page 4, I will reconcile the evidence for and against quarterly policy inertia. As I said, the persistent deviations of the actual rate from the Taylor rule without inertia are key to understanding what is going on. Under policy inertia, these persistent deviations are explained as sluggish responses to output and inflation, but that interpretation is inconsistent with the lack of funds rate predictability. An alternative explanation is that the Taylor rule is an incomplete description of Fed policymaking and that the Fed responds to other persistent variables besides current output and inflation. Under this interpretation, the Fed does not exhibit quarterly policy inertia, but it responds promptly to a variety of developments that unfold over time.

    What would cause such persistent deviations from the Taylor rule? Well, in John Taylor’s original analysis, he noted that occasional deviations from his rule were appropriate responses to special circumstances. Two such special circumstances are noted at the bottom of page 4. The deviations in 1992 and 1993 can be interpreted as the Fed’s response to a disruption in the flow of credit, in which the funds rate was kept lower than might be expected given the macroeconomic context because of severe financial headwinds. The 1992-93 episode is better described as a persistent “credit crunch” deviation from the Taylor rule than as a sluggish partial adjustment to a known desired rate. In terms of the Taylor rule, the disruption of credit supply can be treated as a temporary fall in the equilibrium real rate, to which the Fed responds by lowering the funds rate (relative to readings on output and inflation). Similarly, in 1998 and 1999, a worldwide financial crisis following the Russian default and devaluation appeared to play a large role in lowering rates—again relative to what the Taylor rule would have recommended. Expectations also can play an important role in tempering the policy response to current readings on output and inflation. Indeed, some have suggested that expectations of future inflation—and, in particular, inflation scares in the bond market—are an important consideration for policy.

    Finally, on page 5, I highlight two questions that are in some sense two sides of the same coin. The first question is, How should we think about analyzing or modeling recent Fed behavior? The second question is, To what extent should actual Fed behavior conform to our models of optimal monetary policy? Let me start with the first question. In modeling the Fed’s decisionmaking process, I argue that the Taylor rule is an incomplete description of Fed behavior. However, adding partial adjustment to the policy rule is not a solution; instead, in my opinion, partial adjustment adds another layer of misspecification that substitutes for a clearer understanding of the policy process. Of course, more research is required to characterize the full set of influences and determinants of policy beyond those contained in the Taylor rule.

    A closely related question focuses on modeling the underlying motives of policy—more specifically, what kind of loss function should represent Fed behavior? Currently, the policymaker “perfect foresight” path in the Bluebook uses a loss function that assumes the Fed would be equally displeased with any of the following: (1) an unemployment rate 1 percentage point above the natural rate; (2) an inflation rate 1 percentage point above target; or (3) a 100 basis point decrease in the quarterly average funds rate. These equal weights place an implausibly high penalty on funds rate variability. However, without a substantial funds rate volatility penalty, the constructed optimal policy path does not match the recent historical path of the funds rate, and this is true across a variety of models. In my opinion, the high funds rate volatility penalty may be another misspecification that is compensating for some unknown flaw in our calculations of optimal policy. In essence, if policy over the past two decades has been close to optimal, then an important element is missing from the current specifications used by economists to construct optimal monetary policy.
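    The weight on funds rate changes mechanically favors gradualism, which a simple illustrative calculation (not from the Bluebook) makes clear: splitting a given cumulative move into smaller steps shrinks the sum of squared changes.

```python
# Illustrative arithmetic on the funds-rate-change term of an equal-weight
# loss function; discounting and the macro terms are ignored for simplicity.
def rate_penalty(changes):
    """Sum of squared quarterly funds rate changes (in percentage points)."""
    return sum(d ** 2 for d in changes)

one_shot = [1.00]                     # 100 basis points at once
gradual = [0.25, 0.25, 0.25, 0.25]    # same cumulative move in four steps

print(rate_penalty(one_shot))  # 1.0
print(rate_penalty(gradual))   # 0.25
```

    Under this term alone, the gradual path incurs only a quarter of the penalty, so a heavy weight on rate changes pushes optimal paths toward smoothness.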

    An alternative possibility is that our economic models—without interest rate smoothing in the loss function—are basically correct in finding that under an optimal policy the Fed should be more aggressive in reacting to economic news than it has been. This suggests a second question: Should the Fed deviate from its historical behavior and become more aggressive in changing the funds rate? The analysis above suggests that the Fed has not been sluggish in reacting to economic developments: It has likely set the funds rate equal to its desired rate in each quarter. However, questions remain about whether the desired rate should react more forcefully to economic news. Indeed, researchers typically find that the parameters of an optimal Taylor rule—that is, the α and β shown on page 2—are much larger than the estimated parameters of a historical Taylor rule. Thus our models, even after trying to take into account various types of uncertainty, recommend much more vigorous policy responses. This raises the question, Has monetary policy been too timid in its responses to economic developments? I personally remain less than convinced that our models capture all the important factors influencing policy.

  • It’s a very interesting set of papers, gentlemen. As a practitioner of sorts, since what is supposedly being focused on is how well the equations fit, let me see if I can come at it from the other direction. There is a stipulation that we should be looking at the data sets as they existed at the time decisions were made. We should also be looking at the models that existed at the time of the decisions. But let’s take the structure as it is today. If we fit a bunch of independent variables into that very complex, very sophisticatedly estimated model and we do not use appropriate add factors, we will get nonsensical results.

    What that implies is that the internal structure of the model—its simulating capability—is subject to the same problems that are associated with the fact that we need to use add factors to get the model to a point where it can actually function. So what we do—consciously or otherwise—is to raise continuously the question of variance of the coefficients in the model and uncertainties with respect to the nature of the model. As a consequence, we tack to the wind, if I may put it that way, toward where we think the model’s overall structure and our add factors to the model—which are essentially a set of sub-models—are taking us. If we do that, the result—as you point out, Glenn—is a tendency not to engage in sharp changes in the funds rate. Instead of doing, say, one move of 100 basis points, we’ll do 50 and then another 50. The reason is that we do not know, and cannot know, the degree of financial fragility in the system. And the reason we can’t know is that the presumption that financial fragility is a stable function, capable of being evaluated through history, is clearly not sustainable.

    The underlying structure of the economy with which we are dealing and to which we are endeavoring to fit our models is in a continuous state of change. The technologies, the demographics, and the computations that are involved generate a system in which ideally we would have not fixed coefficients in our model but instead estimated coefficients, which would then become variables in a set of other dynamic changes. But we don’t know what that is. All we do know is that, to abstract from reality, we need a structure that can essentially simulate the forces that change that structure, which is exceptionally difficult to construct. I’ve been building models for fifty years; they tell you something, and you learn something from them. But every time you think you’ve got it, the structure changes. Obviously it must change unless we have a stable system that would enable us to fit all of our coefficients on the basis of the presumption that the structure is unchanged through the estimation period. Unless we can do that, I’m not sure what we have.

    So as a practical matter, every time we sit down and look at the forecast there is always the question of whether the model has changed significantly since the last Greenbook, and that is something we will not know for two years. Remember our problems with projecting M2? All of a sudden, after really quite good fits, the equations no longer worked. In fact, I spent a good deal of time modeling P*, which seemed to be a very nice fix on the structure of M2 and its projections just before it collapsed and ceased to be of any use whatever. Those are considerations that induce the actual decisionmaker to hedge all the time.

    You raised the loss function issue. The loss function is not a simple function that involves balancing 1 percent of inflation and 1 percent of unemployment. In fact, it’s a whole series of dynamic functions. In a sense we are always asking the question in terms of two alternative policies: What are the consequences if we choose A rather than B and A is wrong? In many instances when A and B are very close, there is no loss of any great moment if we make a mistake. But with the presumption of a very sharp change in policy, the probability that we would be making a horrendous error is much higher because some unknown change in the financial structure may have created a degree of fragility that we have not yet been able to infer. Therefore, the decisionmaker is quite reluctant to make a sharp change in policy without testing the real world. So what this Committee would likely do if we thought a 100 basis point move were the necessary change is that we would move the funds rate 25, 50, or maybe 75 basis points and watch what happens. If the system shook a little and then stabilized, we’d tack on the other 50 or 25 basis points. We’d take that next step on the grounds that all our simulations indicate that the time frame needed to judge a fragility shock is much shorter than the time frame in which the monetary policy takes effect. All of that basically says that we endeavor to construct our models, presumably, in a manner that optimizes our knowledge, recognizing that our models continuously get out of date and have to be revised. We do that largely not by re-estimating the models but basically by add-factoring as we go along.

    There is, however, a very significant bias problem in policymaking in this context, which is that, in all of these models, policy should really be based on risk-neutral evaluations. But in fact, we’re all human beings, and we are risk averse. Psychologically that’s how we behave, and to try to reverse that behavior is exceptionally difficult. I suspect that, if we could distinguish between risk neutrality and risk aversion, we probably would find that we are much closer to risk aversion. I can give you examples in my own personal experience. I used to trade copper, gold, silver, and other commodities on the commodity exchange ring. I would sit there with full confidence that the price of a particular commodity was going to go up and that, therefore, as the price went down I should be doubling up on my position. What did I do? I sold at the end of the day, largely because I wanted to sleep that night. [Laughter] That may seem like an idiosyncratic event. It isn’t. That’s the way we behave. It strikes me that if we find ways to move away from risk aversion and try to be strictly risk neutral, we will get a far more optimal policy outcome. I’m not sure we can do that, but we ought to try.

    Let me then ask a question. What happens to all of this analysis if you use the pre-1987 period? What do you find, and what do you conclude on the basis of what you find?

  • In terms of policy rule estimates, the view in the literature very often is that there was a structural break in the 1980s. So policy rules in the 1970s are considered—

  • But there is an economic structural break in the 1980s in that regard.

  • The statistics generally indicate a break some time in the 1980s in terms of the policy rule estimates. There appears to be a different behavior in the 1970s than in the 1990s if you regress the funds rate on the output gap and inflation. For example, the coefficients on inflation—that is, the response to inflation—don’t seem as large.

  • That raises an interesting question. Let’s say we go back to the 1960s and 1970s and fit our fixed-coefficient models into that period. The implication is that we’re saying that the economic forces moving today are the same ones in principle that were moving back then. Why should policy be any different? I don’t know why it would be unless there’s a learning curve in there or something.

  • I think the assumption is that there is a learning curve. Although very often the other equations in the model appear stable, at least to a first approximation, the interest rate equation—the equation that summarizes monetary policy or Fed behavior—appears less stable over time. I believe the conventional view is that there was a learning curve and that policy behavior did change.

  • There’s a very large learning curve at the Fed on the construction of various models if we go back historically. It’s interesting because the Fed may be a unique institution in that it has a sufficient history and constancy to enable one to see how the models actually have changed. In contrast, private-sector models do not have a sufficient history because the firms sponsoring them are not that stable. My impression is that there has been an extraordinary, very major learning curve in model construction going back into the 1950s and moving forward. Obviously, in the 1970s and 1980s the secular stagflation that the economy exhibited was not replicable by any of the macro models that we set up in the early years.

  • Mr. Chairman, you may recall the paper that Christina and David Romer presented at the most recent Jackson Hole conference, in which they characterized the process as one of learning, forgetting, and then relearning. The thing that is most unstable about the Taylor rules, as Glenn mentioned, is the inflation target. It’s very hard to explain the setting of interest rates in the 1990s and today if we use estimates from the 1970s or the first part of the 1980s because the background inflation rate was just so much higher in those earlier years. A distinction that’s also important to make is whether one is estimating the rule on the data as we know it now or the real-time data. That can make a difference. It’s a factor in how stark the differences are among decade-long estimates of the Taylor rule, for instance.

  • In terms of the economic modeling, two things about the 1970s come to mind. One is the emergence of the natural rate hypothesis—that there is not a long-run tradeoff between output and inflation. Also, the sacrifice ratio was considered at the time to be very much higher than we imagine it is now.

  • I’ve monopolized the floor for long enough. Let me see the list of others who wish to comment or ask questions. Governor Gramlich.

  • Well, this is a puzzling issue because both sides had very good papers and, of course, you’re arguing positions that are the exact opposites. I’d like to focus on the episode of year 2001 because I think the question is really joined there. The Board staff says there is a lot of inertia, and Glenn says there is not. I believe that the Board staff would argue that the sequence of small steps proves inertia and Glenn would argue that the unpredictability, as measured by market expectations, proves there is not inertia. I think the Chairman in what he just said was confessing to some inertia. I, myself, feel that in the first years I was here, up through 2000, we probably did have a lot of inertia; it was hard to get rates changed by very much. But in 2001 we changed them quite dramatically in a pretty short period of time even though we did it in relatively small steps. So I find the overall question of whether or not there is inertia quite puzzling. But, Glenn, I might ask you a question. To prove that there wasn’t inertia, you’ve chosen a time when there might not have been. If you had used times when the funds rate was more stable, then I think your test would have indicated much more predictability and, therefore, possible inertia. Isn’t that so? Could you be accused of choosing an episode to maximize your case?

  • In terms of this picture, I actually didn’t look around that much at other periods. This one seemed to work, so I stuck with it. But in terms of the underlying analysis, even though this episode demonstrates the point, it does not provide the strongest test. A lot of people have looked for this type of interest rate predictability by running regressions of actual changes in short-term interest rates on forecasted changes over longer periods. This finding of limited predictability is true, really, over much of the postwar period. The particular regressions that I used were from 1988, the start of the Eurodollar market in readily accessible form, to 2000. So that’s the period on which my analysis focused.

    At any rate, other people have run these kinds of regressions a lot, and it’s remarkable how financial markets often know what the Fed is going to do a month or two in advance. In some episodes—1994, for example—there was even a bit more predictability. Nevertheless, I think this recent episode is actually fairly typical in that, in the middle of this easing cycle, the markets thought they might get the next 25 basis point move. But in fact, moving out beyond six months, they actually have little predictive ability. That’s because, if the funds rate is being set in terms of future movements in output and inflation and other factors, those factors are hard to forecast, and the markets find them hard to forecast as well. But there’s no pent-up policy inertia on a quarter-to-quarter basis that gives them a lot of predictive ability.

  • May I also try to clarify that? The estimated policy rule from our analysis would actually be consistent with your impression of the 2001 episode. That episode did look as if the policy moves were much more rapid than would have been predicted by the estimated policy rule. It’s hard to assess because there are only a few data points in a quarterly policy rule. But if you let the rule choose a different coefficient on the lagged funds rate for that episode, it will put it pretty close to zero. So that episode, in the context of this rule, actually was a very rapid policy easing relative to the average pace of policy adjustment seen from 1987 to 2000.

  • You can see that in my handout in the chart at the bottom of page 2. There is a Taylor rule without inertia—that is, there is no lagged funds rate in it—and a Taylor rule with inertia. During 2001, those two rules both did fairly well.

  • It doesn’t matter that much. The difference in the rules comes in 2002, when for the rule without inertia a persistent deviation emerges. I would argue that there’s something other than strict Taylor rule determinants acting during this episode, perhaps some reaction to a collapse of the tech bubble in stock prices or to geopolitical risks—something that’s holding the actual funds rate below what a simple Taylor rule would say.

  • Governor Gramlich, before you leave the topic with the sense that the distinction between the people sitting on the two sides of the table is that stark, Brian has done some work estimating policy rules in which he allows a serially correlated error in addition to the lagged dependent variable. I would ask him to comment on what happens then.

  • In a recent research paper that I wrote with two colleagues, we show that it is possible to estimate policy rules directly allowing for both factors. So we don’t necessarily have to turn to the term structure evidence to separate them. What our paper and several other papers have found is that clearly Glenn’s point is right: There are variables omitted from the rule that have serial correlation, which gives some impression of gradualism. But that’s not the whole story. In fact, when we apply these methods to our policy rule and allow for serially correlated errors, that decreases the coefficient on the lagged federal funds rate from 0.76 to 0.56. So it reduces the degree of inertia in these rules, but there’s still a significant amount of gradualism.

  • You can reduce the degree of inertia by attributing some of that explanatory power to some other error out there that happens to have serial correlation, right?

  • So part of the problem is that what appears to be inertia actually is not. We’ve just responded consistently to something that’s the same for a period of time.

  • Well, perhaps. I’d point to the part of exhibits 2 and 3 where we use the FRB/US model. Now, that is a large model with lots of persistence in it; a large number of lagged states and lagged errors are at work there. Let’s say you ask what the optimal parameterization of this simple policy rule would be. You ask it to pick up the very thing that I believe Glenn is describing. What it tells you is that, unless there is a high degree of forward-looking expectations, the optimal policy doesn’t want that lag. It doesn’t want that lag to proxy for the same kind of things that Glenn argues empirically it is proxying for. Does that mean Glenn is wrong? Not necessarily. It could mean that the model is wrong. But it isn’t clear that what Glenn is referring to is not being adequately captured in what we’re describing in exhibits 2 and 3.

    The same thing is true, I might add, about the term structure evidence. For this term structure evidence to explain what Glenn wants it to explain, he has to assume rational expectations. That’s a presumption in his analysis. But as you saw in exhibit 3, if there are rational expectations, the optimal strategy calls for a lot of persistence in the fed funds rate. So one can interpret Glenn’s evidence as suggesting that rational expectations are not there, in which case they shouldn’t have produced inertia in the first place. There again, it doesn’t mean that Glenn is wrong, but it does mean that there is friction between these two views, and it can’t be settled at this table.

  • Rational expectations in terms of financial markets, not necessarily the rest of the economy?

  • I think we would agree on the general point of our briefing. We’re bickering about what the coefficient is on the lagged federal funds rate in the estimated policy rule, but these issues don’t affect our calculations of the optimized rules and don’t affect the main conclusion of our briefing. What we find is that these optimized rules result in very aggressive, very volatile movements in the federal funds rate—more volatile than we see in the data regardless of what the exact rule is. Recall that we said there are two aspects in which policy is too smooth. The coefficients on output and inflation appear to be smaller than under the optimal rules, and the coefficient on the lagged funds rate tends to be bigger. Even leaving aside that second part, as Glenn alluded to in his last exhibit, these types of exercises that we perform would still recommend larger response coefficients and more-aggressive movements in the federal funds rate.
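The flavor of this optimization exercise can be reproduced in a tiny backward-looking model: choose the coefficients of a simple non-inertial rule to minimize the unconditional variances of output and inflation, with no penalty on funds-rate volatility. Everything below (the model equations, shock variances, and loss) is a hypothetical sketch, not the FRB/US calculation.

```python
import numpy as np

# Toy backward-looking model (all coefficients invented):
#   y_t  = 0.7*y_{t-1} - 0.2*(i_{t-1} - pi_{t-1}) + e_y
#   pi_t = pi_{t-1} + 0.3*y_{t-1} + e_pi
# Simple rule with no smoothing term: i_t = a*pi_t + b*y_t
# Loss = Var(y) + Var(pi), with no penalty on funds-rate volatility.

def loss(a, b):
    # Closed-loop transition matrix for the state (y, pi) after substituting
    # the rule into the model.
    A = np.array([[0.7 - 0.2 * b, -0.2 * (a - 1.0)],
                  [0.3,            1.0            ]])
    if np.max(np.abs(np.linalg.eigvals(A))) >= 1.0:
        return float("inf")                      # explosive closed loop
    # Unconditional covariance: Sigma = A Sigma A' + I, solved by the vec trick.
    sigma = np.linalg.solve(np.eye(4) - np.kron(A, A),
                            np.eye(2).flatten()).reshape(2, 2)
    return sigma[0, 0] + sigma[1, 1]             # Var(y) + Var(pi)

grid = np.arange(0.0, 6.01, 0.25)
best = min((loss(a, b), a, b) for a in grid for b in grid)
print(f"optimized coefficients: a = {best[1]:.2f}, b = {best[2]:.2f}")
```

Because the loss puts no weight on rate volatility, the grid search lands on response coefficients above the familiar Taylor values of 1.5 and 0.5, echoing the point that optimized rules of this kind call for aggressive movements in the funds rate.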

  • This is also true, if I might add just one last point, in the “policymaker perfect foresight” simulation. Without the usual penalty on interest rate volatility—remember a policymaker in that case is taking everything into account, including the kinds of things that Glenn was referring to such as the tech bubble and current and future geopolitical risks—it, too, predicts very volatile, very aggressive monetary policy as the optimal thing to do. So in short, this outcome has nothing to do with the parsimony of a simple rule and has everything to do with the rest of the structure.

  • You referred to term structure evidence. This is different from the fed funds evidence that Glenn was referring to or the same?

  • When I said term structure, I meant the evidence that Glenn presented.

  • I ask because I’m very interested in this basic point that the Fed should be more predictable in order to use the short-term rate to influence long-term rates and whether that is an important issue. In particular, your evidence is very interesting, Glenn. I was wondering if there had been evidence on whether or not the responsiveness of long-term interest rates to movements in the fed funds rate was consistent with the predictability of the type that you propose or the type that uses more partial adjustment. That would seem to be a separate test. I was wondering if you had done anything more directly on this hypothesis.

  • Well, the expectations that are most informative are just a few quarters ahead. In ten-year expectations, say, policy inertia doesn’t play as big a role. Those expectations aren’t able to clarify whether there is inertia or not. So it is term structure evidence, but I’m just looking at the very short end—one year or less of the term structure maturities.

    But it is true that there is a link. In what is often called optimal monetary policy inertia, it has been demonstrated in other presentations that, as you have more forward-looking expectations, the monetary authority is able to stabilize the economy, not necessarily by taking actions today but by promising to take actions in the future. Financial markets build that in, and those expected future actions are able to affect the economy today. Those actions reflect inertia. Again, those inertial movements are assumed in optimal monetary policy. I don’t consider that something we should be taking too seriously. I don’t think we have the empirical evidence of monetary policy inertia, and I don’t think optimal monetary policy inertia is very convincing.

  • The other point, Governor Bernanke, is that in the cottage industry of estimating event-study regressions—which might be another place we could look to see how far a given monetary policy surprise gets transmitted through the term structure of interest rates—we get the same sort of results we get in the time series test of term structure expectations. That is, the predictable effects of policy on interest rates die off pretty quickly. In other words, it’s the front end of the forward curve that is usually influenced by a policy surprise; and at the longer end, the forward rates go down.

  • One might argue that Glenn’s interpretation of the Taylor rule is correct—that there is no inertia in the policy rule and that there should be more in order to get more effect on long-term rates. I think that’s an open question.

  • It might be useful to point out as well that, while policy actions three, four, or five quarters ahead are hard to predict, it’s not that they don’t get built into the term structure. The Board staff produces an expected path of the federal funds rate based on our readings of the futures markets, and it commonly has policy actions built in at those horizons. In fact, the current forecast is a good example. Policy is expected to be on hold until the fourth quarter of 2003 and then, according to our path, about 150 basis points of tightening is built in for subsequent years. So it’s not that future actions don’t get priced in at all but that they’re not very predictable.

  • That’s different. That’s the interest rate responding endogenously to the expected evolution of the economy.

  • Well, that’s true; that could be the case as well.

  • Let me carry on from this conversation and turn it just a little. If we compare the optimal rule and the estimated rule, the optimal rule is much more aggressive. It probably involves a lot more reversals, and the funds rate would look a lot less smooth if we drew a chart of it. Let’s say we use the output from the models that have all the private-sector expectations rolled into them—expectations reflecting knowledge of the rule that the central bank is following. If one were to plot a one-year, two-year, or five-year rate, I suspect that there wouldn’t be much daylight between the rates under the estimated rule and the optimal rule. That’s my guess because even the amount of persistence we’re talking about with the estimated rule indicates readily enough that there’s not much difference in terms of a longer-term rate. So if we think about policy being transmitted through interest rates, it’s not going to make much difference—in the way the economy works in terms of growth, employment, and other real variables—which rule is used. Let me just leave that expressed as an assertion for the moment.

    Now, if that’s the case, it seems to me that a critical dimension of designing the rule precisely in the environment of uncertainty that we face is to promote a good rational expectations equilibrium where the private sector truly understands what we’re doing. That’s a dimension that you didn’t really explore or discuss, but that’s a critical part of this whole subject because we’re trying to produce this rational expectations equilibrium where the market comes to expect the right things from the central bank. One of the problems with a very aggressive rule, which tends to result in a lot of reversals, is that many people are going to find our policy rather confusing; they will have difficulty trying to figure out what the central bank is doing. From the point of view of promoting an understanding in the private sector regarding what the policy really is, I think there’s a big benefit to having a certain amount of smoothness in the funds rate. That’s because—this is my guess now—it will be much easier for the private sector to understand what our policy strategy is. That was not addressed in this analysis, but it seems to me to be a critical part of the problem of trying to design a policy that we should follow.

  • I think of reversals as being perhaps a bit shorter run. Again, an important point regarding optimal monetary policy the way economists usually construct it is that it’s at a quarterly frequency; we use quarterly average rates. So, it’s not as if at one meeting the rate is moved up 50 basis points and at the next meeting it goes down 50 basis points. The reversals are not reflected in an up-and-down pattern meeting by meeting. We’re talking about quarterly average rates. Therefore, from the point of view of economic analysis, as long as the Fed follows a systematic rule quarter by quarter—however it hits that quarterly average—at the broad level of the way macroeconomic models are constructed, that is sufficient. In terms of reversals, this perhaps goes back to the Chairman’s point about implementing policy changes in small steps, given that the Fed has to worry about financial fragility and testing the system.

  • Let me answer your question in a slightly different way than Glenn did. I interpret your question as saying suppose private-sector agents don’t understand the rule or, even more generally, the model. In that case you might ask if there is a role that the Fed can play in the conduct of monetary policy that will assist private-sector agents in learning that rule over a shorter period of time or getting to the right equilibrium. As you know, Jim Bullard at the Federal Reserve Bank of St. Louis works on this. There is a small and growing literature on the subject of learnability, and that literature does say that some inertia in policy is beneficial in helping private-sector agents learn the rule and come to the rational expectations equilibrium. I’m sympathetic to that view myself, but I don’t know how much weight to put on it because I don’t think it has been tested in a broad enough set of models to know whether that view is universally correct. At this point, all the experiments that have been conducted have been done on extremely simple models that have no inherent persistence other than in the policy rule; they are models that jump instantaneously in response to shocks at all points in time. I have a suspicion that in a model that has intrinsic lags—so that shocks, regardless of the policy, take some time to play out—it might not be as important to have that persistence in the rule. But I can’t really say that I know the answer.

  • I think there’s inevitably a great deal of learning and evolution involved here in terms of furthering the completeness or sophistication, whatever you want to call it, of the markets’ understanding of what we’re doing. The environment is very different in that respect from what it was thirty-five years ago, let’s say. So part of the problem in trying to model this is that the world does change and evolve over time, and time and learning go only in one direction—at least we hope they go only in one direction. I don’t know where to go with that! [Laughter]

  • I want to ask about this financial fragility point a bit more, too. You mentioned that in transcripts of FOMC meetings some Committee members have cited concerns about financial fragility as a reason for smaller interest rate changes than they might otherwise have wanted. I just wondered whether there is any evidence on this. Has any research been done? Or do you have any suspicions about what the relationship is of these up-and-down rate movements to financial fragility?

  • Actually, I think there’s surprisingly little research relating to that explanation compared with the huge amount of literature on parameter uncertainty and model uncertainty. As far as I know, financial fragility has more or less slipped by.

  • All right, Brian, all you have to do is to take a poll of the FOMC members. That’s your database.

  • Right, and I’m getting the impression that financial fragility is important! [Laughter]

  • Not important, it’s determinant! If we think the market is fragile and it’s not, that doesn’t matter.

  • It gets back to the risk-aversion point.

  • We do know a few things, though. First, markets actually have become better at anticipating our policy actions since the early 1990s and especially since 1994. Second, presumably markets have become better at trading and allocating risk as well. These developments in the economy bear on this issue. But in terms of research indicating what financial fragility does to the optimal rule, there hasn’t been very much.

  • May I make a very brief comment on this point, although I’m jumping in out of turn? The way in which dealers manage their positions is going to depend importantly on what kind of policy changes they think are conceivable. If they live in a world where the rate changes by 25 basis points most of the time, with moves of 50 basis points on occasion, that’s a very different world from one where the change could be much larger, perhaps 200 or 300 basis points at a shot. The transition from one world to the other is going to be an awful problem and may involve some catastrophes along the way. But the dealers will learn to live with a different environment. It’s just that the transition could be extremely difficult.

  • Another point is that, while the more aggressive rules would create volatility in the short-term interest rate, it’s not really clear what they would do to the volatility of prices of longer-lived assets such as long-term bonds or stocks. To the extent that these rules are better at stabilizing the macroeconomy, they could actually reduce the volatility of those asset prices. In other words, such rules would clearly make the term structure of volatility more negatively sloped. So in terms of the fragility aspects, there are a lot of subtle points like that to think about.

  • Remember, if we’re moving the interest rate quickly, the loss function becomes very crucial, whereas if we’re moving incrementally, it falls out so we don’t really care about it. That can be a very crucial determinant of how we move. In other words, we start with a degree of uncertainty that is very high; it is much higher than it is for those who take the data and put them into a model and do projections. Most modelers are dealing with a controlled environment in which the number of variables is well short of a thousand. In the real world there are a million, and we don’t know which ones are important. So it really matters. Lots of technical things that we do would seem to be wrong in a sort of optimum sense. Yet we do those things because we don’t trust the models to be capturing what is going on in the real world. Therefore the base of information on which we act falls away, and risk aversion becomes a very predominant factor in the Committee’s judgment of which way to move. I don’t know whether we need a psychiatric examination of the Committee members, but I bet it would produce an interesting database! [Laughter]

  • See if there are any volunteers!

  • I just want to stress again the difference between short-term policy inertia and quarterly policy inertia. A lot of the concerns about financial market fragility would refer just to short-term policy inertia and wouldn’t necessarily show up in our quarterly average loss function. Very often in the past if you were going to change the direction of policy, you would start off with a 25 basis point move. Even then—in 1994, for example—we saw large repercussions with the Orange County situation. A lot of people had built up positions, apparently on the assumption that monetary policy would never change, and those positions were hard to unwind. So just a small move in the funds rate seemed to produce large repercussions. A 50 basis point move for that initial move would have been even more disastrous, I think.

  • In fact, if you read the transcript of that meeting you will find that there was a very substantial debate within the Committee on exactly that theme.

  • Right. But again, if you wanted to get to a total change of 100 basis points from the quarterly average of the first quarter to the quarterly average of the second quarter, that could be done over a sequence of four meetings—with limited concern about financial fragility. Financial fragility is something that operates on a week-to-week basis. Markets react appropriately given enough time. So I think that distinction has to be made.
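The arithmetic of this distinction is easy to sketch. In the stylized path below (all dates, levels, and meeting spacings hypothetical), four 25-basis-point moves within a single quarter shift the quarterly-average funds rate by a full percentage point from the quarter before to the quarter after, while the largest week-to-week move is only 25 basis points.

```python
# Stylized weekly funds-rate path over three 13-week quarters. Four 25bp
# hikes land inside the middle quarter (weeks 13-25); everything here is a
# hypothetical illustration of quarterly averaging, not actual FOMC dates.
hikes = {13: 0.25, 16: 0.25, 19: 0.25, 22: 0.25}   # week -> hike size
rate, path = 4.00, []
for week in range(39):
    rate += hikes.get(week, 0.0)
    path.append(rate)

# Average funds rate in each of the three quarters.
q_avgs = [sum(path[13 * q:13 * (q + 1)]) / 13 for q in range(3)]
print([round(a, 2) for a in q_avgs])   # -> [4.0, 4.65, 5.0]
```

The quarter-to-quarter change from the first to the third quarterly average is the full 100 basis points, yet no single meeting moved the rate by more than 25 basis points.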

  • Mr. Chairman, I’d like to ask two questions about the first paper. You looked at these three factors in isolation, and then you indicated in a caveat that the factors might interact. It seems to me that, indeed, they might and that could actually change one’s conclusion. I can understand, if you’re working with the FRB/US model, that with a model that size it would be very difficult to incorporate the three effects simultaneously. Isn’t it possible to work with a smaller model that would enable you to look at those effects and how they interact and determine whether or not that leads to somewhat different conclusions?

  • There has been some research that did that, and I think it shows that when you add several of these factors together you get a lot closer to the estimated rule than the optimal rule. That’s pretty convincing.

  • Okay. There’s another point I’d like to ask about. First of all, we had an interesting meeting in June in which papers on inflation modeling were presented. The discussion revealed some differences around the table in views about which model is relevant. In one case we had a paper based on a Phillips curve, and another was based on a random walk. If the policy participants have different models of the economy, would that perhaps lead one to the conclusion that we should be less aggressive in terms of our approach to policy?

  • There is a literature on robust policy design in which there are a number of different ways to model this. The one that you seem to be pointing to is a rival model methodology where we put different models up and see what we get if we take the optimal rule from one model and put it into some other model. Most of the literature suggests that policy should protect against the worst possible case in the set of models that we’re willing to consider. The usual prescription that comes out is that policy is more active than it would be under the policy for your base case model. But it’s going to depend on what models are in your set. It’s certainly possible to have cases in which it’s optimal to stay in a narrower range. It’s difficult to come up with a definitive answer, but that’s where the bulk of the literature seems to be pointing at the moment.

  • It seems a little counterintuitive.

  • Well, let me give you one particular example. Suppose you had two models that differed in the degree of persistence in inflation—your Bank’s random walk model is one of those examples. If you get a shock to the output gap in that kind of model, it produces a cycle of inflation that lasts for an extended period of time. Suppose in the other model, if you shock it, inflation dies out quickly. If you design optimal policy for the model where inflation dies out quickly, that model is going to say don’t worry about inflation—just respond to output, and everything will be fine. The other model is going to say, no, inflation is something that you have to attack on a quarter-to-quarter basis. Every time there’s a shock to that Phillips curve model, inflation is going to take off on you unless you react. So if you’re trying to protect against the worst case in those two models, you’re going to respond aggressively.
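This worst-case logic can be checked in a stripped-down version of the two-model example (all coefficients hypothetical): a backward-looking model in which inflation persistence is either high or low, a rule that responds to inflation with coefficient a, and a min-max choice of a across the two models.

```python
import numpy as np

# Two rival models differing only in inflation persistence (all numbers invented):
#   y_t  = 0.7*y_{t-1} - 0.2*(i_{t-1} - pi_{t-1}) + e_y
#   pi_t = rho*pi_{t-1} + 0.3*y_{t-1} + e_pi,  rho = 0.95 or rho = 0.2
# Rule: i_t = a*pi_t + 0.5*y_t (output response fixed for simplicity).

def loss(rho, a, b=0.5):
    A = np.array([[0.7 - 0.2 * b, -0.2 * (a - 1.0)],
                  [0.3,            rho            ]])
    if np.max(np.abs(np.linalg.eigvals(A))) >= 1.0:
        return float("inf")                # explosive closed loop
    # Unconditional covariance: Sigma = A Sigma A' + I, via the vec trick.
    sigma = np.linalg.solve(np.eye(4) - np.kron(A, A),
                            np.eye(2).flatten()).reshape(2, 2)
    return sigma[0, 0] + sigma[1, 1]       # Var(y) + Var(pi)

grid = np.arange(0.0, 6.01, 0.1)
a_fast = min(grid, key=lambda a: loss(0.2, a))                        # best if inflation dies out fast
a_robust = min(grid, key=lambda a: max(loss(0.2, a), loss(0.95, a)))  # guard against the worst case
print(f"transient-model optimum a = {a_fast:.1f}; robust (min-max) a = {a_robust:.1f}")
```

Because the persistent-inflation model produces by far the larger losses when the inflation response is weak, the min-max criterion selects a noticeably stronger response than the rule tuned to the transient model alone.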

  • That’s interesting. By the way, that’s President Stern’s random walk!

  • It depends, I think, on the structure of uncertainty. This is the argument that Milton Friedman made. His point was that not knowing the model—not being sure of the results—led him to argue against fine-tuning or, in this case, the aggressive policy responses.

  • Thank you, Mr. Chairman. This discussion has gone on awhile, so let me make just a few points. One is that I’d remind Brian Sack that people sat around this table in October 1979 and made the same argument he just made—that a policy that induced volatility in short-term rates would not get passed through to long-term rates because it would stabilize the economy. And what we found was that the short-term volatility did feed through to long-term rates over the next three years in part because expectations weren’t very well anchored and seeing the rate move around in an unpredictable way actually did feed through to long-term rates. There was a paper on this in the staff studies of the new operating procedures.

    I guess I come out between the two positions in your presentations to a certain extent. As I’ve observed the Committee over time, I think there are elements of gradualism in its policymaking but not as much as seems to show up in the data. There are a number of occasions—like 1994, as you’ve pointed out—when the Committee thought it wanted to go some distance but didn’t want to get there really fast because it was worried about financial fragility or other things. Actually, despite the chart, I think there was a bit of that in early 2001 when, once the Committee started to ease, a number of members thought the Committee probably needed to move the funds rate down a couple of percentage points but it took three or four months to get there. On the other hand, my observation is that most of the time when we leave this room we think the rate is pretty close to where we believe it needs to be—not very far away—and we’re not looking ahead to long runs of further easing or tightening. So in that case I agree with Glenn. I think an important issue is the serial correlation of forecasting errors that Vincent talked about. We learn slowly over time about the shocks and the way the economy is responding to them. Those answers reveal themselves to us after the economy is shocked. We learn over time how big the shock is, and then we react as we get that information.

    With respect to the strength of our responses to output gaps and inflation gaps, I think the Committee hasn’t been as gradual or as damped in its responses as the equations say it has. In my view there are a couple of points indicative of biases there. One is that the Committee has been forward-looking, so we’re really looking at forecasts and not at existing output gaps. We can often bring information to bear that says that a particular shock will likely go away and we don’t need to react so strongly to it. So I think the wrong stuff is on the right-hand side of these Taylor rules; the Committee is doing much more than looking at the current levels of those two gaps. The second point is that these estimates are made on the assumption of a constant inflation target, in this case from 1987 through the present. I don’t want to get into a discussion of whether it should or should not have been constant. But I do believe that, from 1987 at least into the second part of the 1990s, the Committee surely did not have a constant inflation target. A number of the former members of this Committee talked about an opportunistic approach to reducing inflation. Inflation was higher than it needed to be over the long run, but there wasn’t any extraordinary effort to reduce it. The models wanted us to be stronger in reducing inflation because they had a lower inflation target than the Committee and the Committee didn’t react to the model’s target but to its own. I think that biases the results to finding that the Committee didn’t act as aggressively as the models thought it should, when in fact it acted fairly aggressively—and aggressively enough to get some pretty darn good outcomes for the economy over the past twenty years.

    Having said that, I think there is a valuable lesson embedded here, and it goes to the discussion you were having about policy mistakes. It’s better generally for policy to act too strongly than too weakly to developing situations. Serious policy errors have been made when policy doesn’t react aggressively enough to a developing situation. Examples are the Federal Reserve in the 1970s or the Bank of Japan in the 1990s. That is the sort of policy error that allows expectations to get out in front. It allows a spiral to develop that becomes very, very hard to reverse. If we react too aggressively, that also can be a policy mistake. But tightening too much because we’re afraid of inflation or easing too much because we’re concerned about deflation or recession is much more easily reversed without cumulating expectational problems getting built in. So to me the lesson for the Committee from these optimal rules is that we are probably better off being a little too aggressive than being not aggressive enough in terms of the possible consequences for the economy over time.

  • You know, that’s the issue of risk aversion right there. What prevents us from actually doing what you’re suggesting is a fear that is asymmetric. It’s very tough to get around that, but we’re trying.

  • We can do the psychiatric examinations, but I hope they’re not subject to FOIA! [Laughter]

  • Not only have we tried, we’ve succeeded very well.

  • With that thoughtful thought, let’s break for ten minutes and come back. We have a number of people who want to speak on these subjects.

  • [Coffee break]

  • Shall we continue? Governor Ferguson.

  • Thank you very much, Mr. Chairman. I’d like to pick up about where Bill Poole was and focus on the bottom of exhibit 3 in the paper that the Board staff put out. What that suggests to me is that, if we think the markets have become much more complete and therefore somewhat more forward-looking, then we’d do very well by working with them or holding onto a more inertial approach here. I would argue as a hypothesis that we are perhaps on the curve. Indeed, while it’s hard to say that markets are somewhere between 0.9 and 1 in a forward-looking rational expectations sense, I think it is certainly clear that they have gotten a lot better in that regard over the past twenty years. In that world, the general approach of transparency and gradualism or inertia may well be the best policy in some sense because that way we’re working with markets and not, if you will, surprising markets by making large jumps in interest rates. That is exactly what Bill Poole was saying. But I would like to ask the staff whether that is a reasonable interpretation of exhibit 3. Can one say that, because markets have changed sufficiently, we may actually be closer to optimal than some of the earlier specifications might have suggested?

  • I think we’re creeping up that curve on the bottom left.

  • Well, give us another red point. [Laughter]

  • It’s hard to say.

  • All right. So at least it is a credible theory or a theory that one could put forward that this approach is indeed the right one because the markets have become a lot better, we’ve become more transparent, and they understand what we’re doing. So, in some sense we should validate their expectations.

  • Governor Ferguson, the coefficients inferred from our behavior are much lower than the ones that were found to be optimal in the earlier exercise.

  • Well, on inflation and output.

  • Right, and that’s where I was going to go with Don’s point—or maybe it was the Chairman’s point. Well, let me say a couple of things. One is that I think we may have been taking advantage of changing perspectives about what level of inflation is acceptable. Second, I think perhaps we have been taking some risks because of financial fragility or external circumstances. We just released the 1997 transcripts, which clearly indicated that the Committee—both before Ned and I joined and afterwards—was taking into consideration some of the risks with respect to the Asian crises and so forth, even though the economy appeared to be growing faster than one would have liked. So I think it’s possible—not to pat ourselves on the back exactly but to square this circle—that we take into consideration both the earlier concerns about financial fragility and the fact that most of the markets are functioning better.

    That leads to a second point, a comment I have for Glenn. I think you’re obviously right to talk about the markets’ ability to anticipate but not on a six-to-nine-month horizon. This may be just a different way of looking at it, but I would have assumed that another way to view not necessarily inertia but predictability is whether or not on the day of—or the week before—the FOMC meeting the markets pretty much have it right. My recollection is that by and large they do. We’ve rarely left this room with the knowledge that we’ve surprised the markets dramatically. Once in a while we’ve had to do that. But I suspect that, if in your chart you used the expected funds rate path a week or two before the meeting instead of the expected funds rate path in the middle of each quarter, you’d probably find that the predictability would be higher. Would that be the case?

  • Yes, although if you looked at those numbers, I think they would actually strengthen my case. That’s because, with quarterly policy inertia, after you made a move, to the extent that it was not completely anticipated and there was some policy surprise, it would have implications for changes in future rates at a six-to-nine-month or six-to-twelve-month horizon. The yield curve would not be moving parallel to the existing curve, but its slope would be changing to some extent. In fact, that doesn’t appear to be the case.

  • That doesn’t appear to occur. Okay.

  • Again, those inertial movements don’t appear to be there because they’re not showing up after the FOMC changes the target.

  • Okay, good enough. Thank you. Those are my two points.

  • Thank you, Mr. Chairman. I had just a couple of comments. Starting with a point that I think Brian made, for me there’s a difference here between inertia and timidity. As far as inertia is concerned, I find Glenn’s story pretty convincing. So that’s not what interests me particularly. On the other hand, both papers seem to at least allow for the possibility that, relative to some sort of optimal rule, the Committee has been too timid. To the best of my knowledge, by the way, most of the reaction function literature going back to the 1970s and maybe even before seems to come to the conclusion that the Fed has been overly timid. My intuition is to be rather suspicious of that conclusion for reasons that other people have already mentioned. Perhaps the most important one to my mind is that, while the Taylor rule is a useful approximation, I think it’s a misspecification in that it omits a lot of important variables. After all, our actions are conditioned not only on things such as how we think the financial markets might react but also on forecasts and all sorts of other variables that are not going to be picked up with a Taylor rule. Having said that, I would admit that I would feel better if we had some empirical work that brought those other variables into play and demonstrated that we got reasonable results. As far as I know we don’t have that. So, that’s where I am on these issues.

  • First let me congratulate Vincent and David for bringing to the table what has been an interesting issue and a worthwhile discussion. I take two things away from this. The first is that the English language is a funny thing. These titles are value laden, and we accept them at times too easily, I think. I’m taken by the phrase “optimal rule.” That is, it’s optimal assuming that everyone is backward-looking and not optimal as one might think about in a perfect modeling sense. “Optimized rule” I like better. It is a slight spin on the other phrase because it actually says we optimized the rule based upon a model that isn’t necessarily optimal in the formation of expectations. The same is true for “policymaker perfect foresight,” which we usually shorten to the “perfect foresight” equation. It isn’t perfect foresight. It’s a policymaker’s perfect foresight when the other economic agents are not using perfect foresight, and therefore it is by construction not perfect. It’s not perfectly rational and not even mostly rational.

    Now the reason I say that is not to play games with either FRB/US or the English language itself but to recognize that we tend to get ourselves into a game here. We’re trying to defend ourselves against something called an optimal rule. It seems that we should be close to it and therefore we should shed the inertia until we’re “there.” In fact, “there” may be something that is an artifact of the title we gave it, and therefore we have to be a little cautious about getting “there.” Are we subject to inertia and gradualism or to timidity? I like that because it really comes to the point of another value phrase in that we wouldn’t want to be timid. We might want to be gradual, and we might even want to subject ourselves to some inertia, but we wouldn’t want to be timid. [Laughter] The reason I raise it that way is that the reality is that we’re dealing in a world in which we seem to observe that we act slowly. But the rationale for acting slowly was actually put on the table. We’re not sure about where we are because of data adjustments. We’re not sure exactly of the potency of our actions because of the coefficient uncertainty. Indeed, we’re a little concerned about the markets’ reaction because of some of the fragility issues. In such a world, it’s not surprising that we would act slowly. If in fact the world is as we think we see it, then we may hold steady or make a further move; if it is not, we will reverse an action. As we move closer, if the world backs off and the rationale for an action is an artifact of the data, we will not have to go as far backwards.

    So I think in fact our policy behavior was more symptomatic of an environment of uncertainty than we give ourselves credit for. In my view, our actual behavior looks more like a rational response to the uncertain world in the dimensions I just laid out. So rather than try to chase the optimal rule, I suppose my reaction is that we’re probably doing a better job than the optimal rule suggests. The data from the markets seem to support that notion, as Glenn reported. Those would be my comments.

  • Thank you, Mr. Chairman. I thought this was a very interesting topic, and I thank all the authors for their papers. I enjoyed reading them. I really don’t see the two papers as in opposition so much as different approaches to essentially the same issue, and I think the results are both interesting and important. But to me it’s important to try to draw insights from the results that the FOMC can actually use in conducting policy and maybe even try to relate them to current policy. Let me take just a quick shot at that, and I want to focus on a particular type of interest rate policy inertia. First, I should say that I agree very much with Tony and others who pointed out that there’s a semantic issue here. Timid is bad; cautious is good. It all depends on how you look at it.

    I want to talk about one particular kind of inertia that I think we see to a fairly considerable extent in actual policymaking. I would describe it with a phrase that a couple of my Richmond colleagues, including Marvin Goodfriend here, came up with several years ago. They characterized actual funds rate target changes as “highly persistent and seldom quickly reversed.” I think that phrase is used by a lot of people, and it seems consistent with the charts we’ve looked at today.

    As we see it, this kind of inertial behavior has both advantages and disadvantages. There has been some allusion to that already. We can see the advantage by recognizing that changes in the funds rate target basically affect the economy through their ability to affect longer-term interest rate movements—to carry those along—as has already been observed here. I think everyone knows that, but just to review it: Longer-term interest rates are linked to expected future funds rates according to the expectations theory of the term structure. With this in mind, having a reputation for not readily reversing changes in the funds rate target implies that a given target change is more apt to carry expected future funds rates with it and therefore carry longer-term interest rates with it as well. For example, if we were easing policy to stimulate the economy, this reputation for inertia would tend to make a reduction in the funds rate more stimulative than it otherwise would be. It would be more likely to move long-term interest rates down, too, and to have a significant impact, which I think is important to consider—especially in the current situation, when we’re near the zero bound. We may need that extra thrust in a number of different situations. In any case, that’s the advantage of having a reputation for this kind of inertia.
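The term-structure mechanism invoked here can be illustrated with a minimal sketch. Under the pure expectations hypothesis (term premia ignored), an n-period yield is approximately the average of expected one-period rates; the specific rate paths below are the editor's hypothetical numbers, not data from the episode discussed:

```python
def long_rate(expected_short_rates):
    """Pure expectations hypothesis: an n-period yield is (approximately)
    the average of the expected one-period rates over those n periods,
    ignoring term premia."""
    return sum(expected_short_rates) / len(expected_short_rates)

# Hypothetical: short rate cut from 2.00 to 1.25 percent, over an
# eight-period horizon.
persistent = long_rate([1.25] * 8)                 # cut expected to last
quickly_reversed = long_rate([1.25, 1.25] + [2.00] * 6)  # cut expected to be undone
```

If the cut is expected to persist, the long rate falls by the full 75 basis points (to 1.25); if markets expect a quick reversal, the long rate barely moves (about 1.81). That gap is the "extra thrust" a reputation for inertia buys.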

    But there’s also a cost—or at least it’s not a freebie—because this enhanced effect of changes in the funds rate target works only as long as we maintain our reputation for not reversing course quickly, even if developments in the economy on occasion make it highly desirable to do so. There isn’t a lot of attention given to that in the discussion, but I can certainly think of some situations where that is the case. So in short, garnering the benefits of inertial behavior can conflict with the need at times to act preemptively. Now, on one level that is not exactly late-breaking news, but in my view it’s appropriate to think about it along the lines I’ve described.

    Let me quickly try to illustrate this with one particular example, and that’s our reaction to the Russian default in the fall of 1998. One way to interpret that event and subsequent events is as follows. We reduced the funds rate, as we all recall, by 75 basis points in three steps in 1998. Because the markets believed that our target changes were highly persistent and seldom quickly reversed—certainly that was the attitude then—the 75 basis point change carried longer-term rates with it. We then reversed those three easings in 1999, but we did so fairly gradually. The first change was in July; there was another one in August and another in November. Now, if you go back and look at the history, the economy was really rather strong in early 1999, and a case could have been made to reverse course much sooner. We didn’t do that, and that left our reputation for inertial policy intact. I think one could argue that having that reputation was helpful when we cut rates sharply in 2001 to fight the recession because then, as earlier, the cuts were seen as lasting rather than temporary. But in retrospect one can ask whether or not 1999 might have been one of those times when we should have reversed course more promptly to be more preemptive against the boom, even if it undermined our reputation for inertial policy. I’m not taking a position. I don’t know the answer as to which choice would have been best. I’m using that case as an illustration of the kinds of tradeoffs we can confront relative to this issue of inertia.

    I find this sort of question very relevant to the situation today, so let me close with a brief comment on that. We currently face the possibility of a potentially sizable negative, maybe geopolitically induced, shock at a time when the nominal funds rate is even nearer to zero than it was in 2001. Of course, the real funds rate is already negative. I can envision a number of situations not too far down the road that would require us to think about these tradeoffs that I just tried to illustrate with that 1998-99 example. Suppose we get a negative shock soon, and we reduce the nominal funds rate even closer to zero, but the economy then improves unexpectedly and quickly. We then would have to weigh the tradeoff between the need to preempt that upside risk, on the one hand, and the harm it might do to our reputation for inertial policy, on the other—a reputation that could well be helpful in other situations when the economy may weaken unexpectedly.

    In any event, I’m simply throwing that out as an example of the types of issues that this research and the presentation suggest to me are important for current policy discussions. I would just ask Glenn, Bob, or Brian—or anyone else on the staff—if they might want to comment on that.

  • My opinion would be that in the current situation, or even in 1998-99, what is of interest is that inertia is not operating at these quarterly frequencies. These are situations—and one of them would be the repercussions from the Russian default—in which the Committee sees some serially correlated influences that may persist for several quarters and the Committee is responding to that promptly. Similarly, in the current situation there are influences not well summarized by the Taylor rule determinants that the Committee is responding to on a quarter-to-quarter basis. There is no feeling that we have to get somewhere but we’re going to take our time to get there. Again, that’s a synopsis of my general impression of the Committee’s behavior.

  • I’ll take the other side. If policy is inertial, what we highlight in exhibit 3 are the benefits of inertia when markets and agents are forward-looking. You provided some very nice examples of exactly what happens in these models. In these models, policymakers by assumption are committed to the optimal rule, which has inertia. That gives them the benefit of bringing forward effects of anticipated, drawn-out policy responses. As you said, for example, in the fall of 1998, expectations were that those easings would not be quickly reversed. Now, that does imply that there will be other situations in which policymakers might want to do something different, but they’ve committed themselves to this gradual rule. Mike Woodford has done a lot of the research on this theory, and he talks quite a bit about the fact that policymakers often find themselves confirming expectations of policy actions that were built in in advance. It’s important to confirm those expectations in order to get this channel operating and to get markets bringing forward the effects of policy actions. I think you gave some good examples of what is going on in exhibit 3.

  • If we break this pattern, though, and move boldly in some situation, what is the cost associated with the loss of credibility for inertia, if I can use that phrase? It seems to me that is the kind of thing we have to be informed about if we ever have to make these choices going forward.

  • That’s a difficult question because it depends on how people learn, a mechanism that is not formally in the model itself. If people have completed their learning and know your rule and you do something quite different from your rule, does that cause them to reinterpret what you’re doing and conclude that you must not be using the rule they thought they knew? That’s when confusion results, and they can’t predict the funds rate ninety days ahead, for example. There is not very much literature on this, mostly because there’s no agreed-upon theory about how people learn. What I think is true, and this is just a feeling on my part, is that if the events the Fed is responding to are extraordinary—outside the normal random shocks that occur from period to period—the private sector is quite happy to give you the benefit of the doubt.

  • People will cut us a little slack?

  • Yes. But if you’re responding idiosyncratically to normal shocks so that it doesn’t seem as if you’re acting in a systematic way, that will undermine people’s beliefs that you’re following a committed rule, or rule-like behavior, that anchors their expectations. That is the key here. One thing we didn’t show you in the material we presented is that, if you get to the point Governor Ferguson referred to—where you’ve moved up the curve in exhibit 3 and are near the top of it so that people are more rational—the performance of the economy is much better than if they’re not rational. That’s the first point. The second is that people are also very forgiving of mistakes. If you do behave differently than the policy rule would call for—and people don’t bail on you and say they don’t know what you’re doing now—the policy loss from doing something other than the optimal is very small. If people are rational, they are willing to give you that benefit.

  • Let me make just one point, President Broaddus, which goes back to how this conversation started a while ago. The Chairman asked Brian what the R² was in this exhibit on changes, and the answer was 0.4. There is plenty of unexplained variation. It’s going to take a while before market participants decide that the rule has changed just because there are a couple of runs in which a policy action is unexpected. In part it also depends on how policymakers explain what they do in the manner that Governor Ferguson and President Poole talked about. If it can be described as a change in r* credibly linked to events, then it’s not obvious that a change in the direction of policy is a policy reversal in the underlying sense.

  • I feel more comfortable.

  • Thank you, Mr. Chairman. Most of the people that I know in financial markets understand as much about these models as I do, which isn’t very much. Having the enormous benefit, unlike all previous speakers, of not having a PhD in economics—or perhaps in English literature, in the case of President Santomero—I think the rule that we need to apply is the rule of what I would call prudent central banking. If we look back on the conduct of the Committee, going back at least to 1987 or to 1993, when I got here, I believe that we’ve passed that test quite well. If we reexamine what we’ve done together—with the possible exception of the case President Broaddus mentioned, although I would not agree with his conclusion on the timing of our tightening in 2001—even with the brilliance of hindsight, I believe that we have been prudent central bankers. What do I mean by that? What does that tell us about how the Committee behaves?

    I’ve bored you many times by saying that in my view the most important thing in public life is to distinguish between what we do know and what we do not know. When we reached a point as we did in February 1994 or as we did on January 3, 2001, when it was time for a significant shift in monetary policy—tightening in the case of 1994 and easing in the latter case—what did we know on those occasions? Actually, I should note that I think the seeds of the January 2001 action go back to the December 2000 meeting of this Committee at a time when the markets were essentially closed for the year. There was an underlying feeling going through the room at the December meeting that we were going to keep monetary policy on hold but that there was a strong likelihood that we’d be chatting soon in a telephone conference call. We did that the very first day the market was open in 2001. I remember that we knew on both of those occasions that a significant change in monetary policy was required—tightening in the first case, easing in the second case. But what didn’t we know? We didn’t know what the total size of the needed monetary policy corrective move would turn out to be. I certainly didn’t know that, and I think even our distinguished Chairman didn’t know that at the time. It was very clear with the 25 basis point move in February 1994 or the 50 basis point move on January 3, 2001, that our work wasn’t finished. There was more to do.

    In such situations, we get into the combination of not knowing how far we’re going to go or necessarily how quickly. But we start dealing with the financial markets through which our policy operates, and we observe the reaction in those financial markets. Then we try to judge the likely effect on the real economy of that reaction. If the financial markets are working with us, then the pace of our monetary policy move can be accelerated. We can be bolder, but we are being bolder while being wise and prudent at the same time, which in my view is highly necessary. So I think a prudent central banker is not a timid central banker or a risk-averse central banker but rather one who says the following: This is where I think I want to go; how fast and how far I’m not quite sure, so I’m going to deal with the real world in which we live and do it as effectively as possible.

    At the end of most of those lengthy cycles, which usually last about a year, we realize—as we’ve said frequently in this Committee—that either the last two moves or the last move is going to turn out to have been unnecessary. That was the reasoning behind our explaining the 50 basis point move that we made most recently as an action taken against a soft spot in the economy. We made it very clear that it was an insurance policy against downside risk. Why did we do that? Because when we reverse that move—when we cancel the insurance policy—it will be very easy for financial market participants and the public more generally to understand.

    I think the model building is extremely useful, and this kind of discussion is very important and very enlightening for us. But in my view the likelihood—as you suggested, Mr. Chairman—of our ever getting the model so exactly right that we can base anything on it other than a sense of policy direction is very low. If we decide, well, let’s be bold, then somehow the markets will figure it out over time; that is a risk that I don’t think a prudent central banker will take. Eventually the markets may figure it out, but in the meantime, if we scare the hell out of everybody and tank the economy—losing the confidence of the American people, never mind market participants—that would be imprudent. A central bank should not engage in imprudent behavior. It’s wonderful to feel bold, but I think it’s even more important to be prudent. Thank you.

  • How does it feel to be bold and wrong? [Laughter]

  • That’s why you’re prudent—also known as risk averse—but prudent sounds so much better.

  • I have one brief comment on that, which is how I would come out on this issue. If the choice were whether the Fed should change its behavior to match the models or whether we should change the models to have optimal monetary policy match historical Fed behavior, I’d probably go for the latter. But it’s not clear to a lot of researchers exactly how to incorporate this model uncertainty and what needs to—

  • Well, as the Chairman said, there’s a good deal of psychology involved. Therefore, we’re dealing with all the social sciences, however mathematical we try to make them, at the same time. What is the reaction of financial markets? That’s largely psychological. What is the reaction of the real people? It’s almost purely psychological. I don’t know of any model that perfectly links psychology and economics. Maybe some day we’ll have such models. I don’t know that they exist yet.

  • For example, the policymaker perfect foresight path in the Bluebook is perhaps interesting, but the psychology has to be taken as missing from that.

  • After pondering this, I’m going to talk more about policymaking than the papers or models, and I’m going to say what Bill just said in a somewhat different way. We, of course, want to be prudent, responsible, stable, and all those things. It has been assumed that we get prudence, stability, credibility, and so forth on the basis of the changes we make to the target funds rate. But the more I think about it, the more I think that probably doesn’t capture it. To go back to something Vincent said, what is important is to clarify to markets what we care about. We care about stable prices and maximum employment. There will be times when we have to move fairly quickly, such as in 2001, to establish our commitment to those objectives. I don’t think we should be captive to a feeling that, if we move in a series of small steps but a large distance over a fairly short period, it will somehow make us less credible, understandable, or transparent. As long as we are clear on what we care about, then I think that will give us a little freedom to make big changes if we have to make them or not to make big changes if we don’t have to. So I tend to think that the stability and credibility depend on a consistency of objective, not on our behavior in terms of how much we change.

  • There’s a lot more communication with the markets than just the intended fed funds rate itself. One of the things that I find striking is that this Committee and, of course, particularly the Chairman have really been remarkably clear and coherent about a whole variety of things on which it’s possible to be clear and coherent. He enjoys talking about obfuscation, but that’s not in fact what the markets most of the time take away from what he says. I believe that’s an important point. It’s one thing that separates the success here, I think, from the problems that some other central banks have had because the predictability that we’re talking about is not there for some of those other central banks. There is some literature on that.

    Let me mention something that I find interesting. Roger talked about the fact that the fed funds futures market the day before a meeting most of the time has it right. One of the things that Bob Rasche and I have done is to go back and try to figure out how that happens. If one looks at any particular fed funds contract at the beginning of its trading—let’s say when it starts to become active three or four months in advance—that forecast that far in advance is often not very accurate. What happens if you follow this day by day is that you see that the information coming into the market moves that rate in a very sensible way most of the time. So let’s say the fed funds prediction starts out too high and then it converges to where we actually go; it’s a consequence of new data such as a weak report on industrial production or employment and a whole raft of other things. It turns out that about 80 percent of the large fed funds futures changes are a consequence of information of that type and about 20 percent are a consequence of statements or testimony—primarily from the Chairman or occasionally from other members of the Committee. So a coherent picture of how the policy process works is obtained not just from the response function that economists build into their models and work with in their equations. It’s also a consequence of the way the Committee—and again I emphasize particularly the Chairman—has articulated what it is that we are doing.
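The fed funds futures convergence described above rests on the contract's settlement convention: CME 30-Day Fed Funds futures settle at 100 minus the month's average daily effective federal funds rate, so the implied rate is read off directly from the price. A minimal sketch, with hypothetical prices chosen by the editor:

```python
def implied_funds_rate(futures_price):
    """Fed funds futures settle at 100 minus the average daily effective
    funds rate for the contract month, so the implied rate is 100 - price."""
    return 100.0 - futures_price

# Hypothetical price path for one contract as its delivery month nears:
# incoming data (weak production or employment reports, statements,
# testimony) move the price, and the implied rate drifts toward the
# eventually realized target.
prices = [98.25, 98.5, 98.625, 98.75]
implied = [implied_funds_rate(p) for p in prices]  # 1.75 down to 1.25 percent
```

The speaker's 80/20 decomposition is then a decomposition of the large day-to-day changes in this implied rate into those driven by macro data releases versus those driven by Committee communication.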

  • Mr. Chairman, it’s late and some of my intended comments would repeat what others have said, so I’ll be fairly brief. One point I would note is along the lines of Vice Chairman McDonough’s comment on modeling human behavior. We have two models here, and most of us probably can agree with some dimension of each model depending on the circumstances. As I look at the models and at the circumstances, I find that my thoughts go to the Chairman’s earlier point and Don Kohn’s. When we are confident enough about what we should do, we act more boldly. When we are less confident, we act either more cautiously because we don’t know or we take that one step too many because we don’t know. What these models are trying to do is to pick up that behavior. Since that involves the human behavior of a small group, we’re going to get some rather contradictory and conflicting outcomes. That’s exactly what we’ve got here, I think, and it has been interesting. I’m not interested in signing up for the psychological test, though! [Laughter]

  • Let me make a couple of different comments. Also because of the time, I am not going to repeat some of the comments others have made. One of the things I found interesting in reading these papers was Glenn’s chart on page 2, which looked at the different rules and the actual fed funds rate. What strikes me is that the largest variances among the different fed funds rates really occur mostly when either there is a turn or there is noise in the data. I was in the private sector when many of these events were occurring, and I will tell you how I read the situation—and this picks up from what Bill Poole just said. When the FOMC has a great track record of being persistent—when we know that we’re trying to get out of a recessionary period or we’re really fighting inflation, or whatever our direction is—the market expects us to persevere. So we keep along those paths through both communication of the rate changes and other public comments that we make. But just as our views sometimes differ around the fringes in getting to a consensus on when turns happen, the views on the Street differ as well. To the extent that that’s noise, I think that’s a different issue than the persistence of a policy course through the cycle.

    The other comment that I have relates to reacting and overreacting. The market tends to overreact. I was chairman of an early asset liability management committee (ALCO), and in that group we called the fed funds rate the administered rate because we saw it as always lagging behind whatever else was going on in the markets. With the advent of derivatives and other hedging instruments, the one thing that most bank ALCO managers did not do was to hedge anything to the fed funds rate. That’s because its correlation was so low compared with that of free-moving rates like Libor or Treasury yields or other market indexes that moved more quickly. In the periods we’re talking about—for example, in 1998—some of us can remember that traditional hedges actually reversed on us; and instead of being negatively correlated, they were positively correlated. The fact that the Fed was there signaling that it knew something unusual was happening that caused the movements helped in my view to damp the overreaction of the marketplace. So another element of our not wanting to overreact and to reverse ourselves is that I fear it would build on the market’s own tendency to overreact and reverse on itself. So I think it’s important for us to be persistent, to signal when unusual events are happening, and to communicate that well. To me those are the lessons that we can take away from these papers.

  • Thank you very much. Any further comments to close this discussion? If not, we have a report from Dino.

  • Thank you, Mr. Chairman. I’ll be referring to the package of charts that is being circulated. The first graph shows three-month cash deposit rates in the United States and the euro area and three-month deposit rates three, six, and nine months forward for both the dollar and the euro. The cash rates were stable in the intermeeting period, but forward rates declined, particularly for the euro. The three-month-forward dollar rates traded through cash rates, partly in response to data that became available as the intermeeting period progressed. Disappointing outlooks from several major corporations and risk aversion driven by geopolitical risks were cited by market participants as reasons that short-term rates were likely to stay low longer than had previously been expected. Some were even building in further reductions to come.

    In the euro area, reduced expectations for growth, changing sentiment about the prospects for structural reform in some major economies, and a stronger euro that might curb exports were cited as reasons that interest rates might decline there. For nearly two years now euro-area short-term rates have been higher than dollar rates of comparable maturity. That divergence was not really remarked upon for much of that period. Long-term capital flows, such as foreign direct investment and equity portfolio flows, more than covered the U.S. current account deficit. But with those sources drying up more recently, the expected returns on short-term investments have become much more important to the financing of the deficit. Hence, short-term interest rate differentials have become more of a topic of conversation in markets.

    At the longer end, as shown in the bottom panel of page 1, ten-year yields on German and U.S. government bonds recently have shown a rate advantage for the German bonds. Although that spread has nearly converged as of today, German yields have had an advantage over U.S. yields since May 2002. In general, an underlying theme in currency markets recently has been the reduction of expected returns on dollar assets.

    Turning to page 2, in the past two months the dollar has shown a steady decline against most major currencies, with the decline of the dollar against the euro most pronounced. The euro has gained about 10 cents versus the dollar in the last two months and is now trading at its strongest level since October 1999. But the dollar has also been falling against the Swiss franc, the Australian dollar, the New Zealand dollar, the pound sterling, the Norwegian krone, the Canadian dollar, the yen, and almost every other currency except the Mexican peso. Geopolitical concerns in one form or another have been cited as the reason. But if investors have concerns about Iraq, it’s not clear why European currencies would be immediately attractive. And if the concern is North Korea, it’s not clear why the yen would have a spark. Falling stock prices in the United States are sometimes mentioned, but European stocks have fallen even more.

    One variation on the geopolitical factor that may have a bit more attraction in terms of concerns is the extent of generalized uncertainty. That explanation suggests that uncertainty among businessmen and investors has led both groups to stay on the sidelines until the air clears with respect to these geopolitical risks. Given the losses that investors in general have had in riskier assets in the last few years, that yearning to “stay home” is understandable even without the uncertainty generated by event risks. But if Japanese, European, and other investors stay home and the U.S. current account deficit still needs financing, the result is that pressure builds on the exchange rate—and perhaps even on asset prices. A second aspect of developments in the last few weeks involves uncertainty about today’s testimony by the incoming Treasury Secretary. Some market participants had speculated that he might take a different approach toward the dollar policy. Finally, another underlying theme in the markets was that some influential central banks were reallocating reserves from dollars to euros.

    The yen also appreciated versus the dollar despite the continuing stream of bad news out of Tokyo, although the rise in the yen was less pronounced than that of other major currencies. In part, the cause of the yen’s appreciation may also be related to asset reallocation, this time by Japanese institutions from yen and dollar assets into euro assets. But another factor may have been the intervention by the Bank of Japan. Since mid-January, the Japanese monetary authority has intervened very, very quietly in markets. In total, they have acquired $5.6 billion over that period. Markets have not caught on to this activity yet, though they will by early next week when the Bank of Japan’s month-end financial statements are released.

    As for the dollar’s value more broadly, as shown in the bottom panel, on an effective basis the dollar is roughly back to the middle of its trading range since 1995 and is also at the average of its value since 1985. While some commentators have taken that as a sign that the currency is perhaps getting close to some sort of equilibrium, others have pointed out that the dollar moves in long multiyear cycles and is prone to overshoot on the upside and also the downside.

    Turning to page 3 and moving to domestic markets, there have been some signs of a revival in risk appetites and of easier financing conditions. But these signs are tentative and perhaps not all that convincing in the end. The top left panel shows that the investment-grade corporate bond spread has narrowed sharply in the past three months. Issuance has also shown some signs of picking up. The top right panel shows a similar pattern of narrower spreads and a pickup of issuance for high-yield bonds. Although the trends are favorable, the absolute levels of both spreads and issuance still point to markets being not quite back to normal. The middle panel graphs for the period since January 2000 the monthly corporate bond spread to Treasuries—for investment-grade bonds (in green on the right scale) and for high-yield bonds (in red on the left scale). Spreads are down from their highs for both, but essentially they have retraced the spike after last summer’s corporate scandals and in absolute terms are still on the high side. Similarly, over a longer perspective, issuance also has stopped growing. The bottom panel shows aggregate corporate bond issuance—both investment grade and high yield—by quarter for 2001 in red and for 2002 in blue. In each quarter of 2002 issuance was lower than in the comparable period in 2001. So far this month, through January 24, issuance has totaled $47 billion compared with nearly $85 billion for the full month last year and more than $100 billion in 2001.

    Market participants also point to absolute levels of Treasury yields as another indication of risk aversion. As shown in the top left panel of page 4, the two-year yield remains near its record low at about 1.65 percent. Longer-term yields have declined a bit less, but they, too, remain at low absolute levels. The result is that the curve has steepened again. The middle panel graphs three views of the curve’s steepness since January 1991. The red line depicts the thirty-year yield minus the three-month rate, the blue line shows the ten-year yield minus the three-month rate, and the green line graphs the ten-year yield minus the two-year rate. Each is at a very steep level, comparable to the peaks recorded in 1992. The positive spin on this steep yield curve is that it is forecasting an economic recovery, as it did in the early 1990s. The negative interpretation that some are taking is that longer-term rates, although low in absolute terms, have remained unusually high despite the fall of short-term rates. In the latter view, that’s either (1) because the return of budget deficits is forecasting a substantially higher supply, which is being priced into the curve, or (2) because foreign investors in particular are demanding a higher rate of return to keep accumulating U.S. assets. Given the short-end policy rates, that higher rate needed to attract foreign investors is showing up at the long end. The picture in equity markets has been mixed, with signs of risk appetites recovering early this year. But with the declines of stock prices in the past two weeks and the spike in volatility, as shown in the bottom two panels, risk aversion seems to have the upper hand there as well.

    Turning to the next page, I just wanted to say a few words about some intra-European developments. The top panel graphs from last November the S&P 500, the DAX, and the Dow Jones Euro Stoxx index. In general, despite the rally in the euro, the situation in euro-area markets has not been very rosy—especially in Germany. The DAX has declined more than the S&P and, for that matter, more than other European indexes. Germany has been criticized in the markets for moving too slowly on structural reform and for letting its finances breach the Stability Pact threshold. Somewhat less noted among analysts has been the compression of spreads relative to Germany among European sovereign bonds. French government bonds have traded close to German bonds for some time. But Italian and Spanish bonds, which were traditionally grouped in the so-called periphery—and typically yielded 2 to 3 percentage points more than German paper before the launch of the euro—have seen their spreads compressed. As shown in the bottom panel, despite Italy’s budget problems, its sovereign debt is now trading less than 20 basis points over Germany’s. Spanish government bonds are trading on par with France’s, at only a handful of basis points over Germany’s. A number of market participants have noted that Spain has actually been among the fastest movers in the EU on the structural reform side.

    Turning to page 6, I have just a few quick words on reserves. The graph there shows, for the period since January 26, 2002, the composition of the System Open Market Account in terms of its three main components—the permanent SOMA, long-term RPs, and short-term RPs. You’ll recall that after currency unexpectedly drained reserves in the fall, the Desk compressed the System’s balance sheet, mainly by reducing the size of the long-term repo book to $6 billion. As we moved toward year-end, a more normal seasonal pattern of currency growth emerged, and the Desk accommodated that primarily by expanding the long-term repo book back up to $26 billion. As the seasonal pattern reversed, the long-term repo book has been taken down to $14 billion, a level we feel comfortable maintaining for the time being.

    Mr. Chairman, there were no foreign operations in this period. I will need a vote to ratify domestic operations, and I would be happy to take any questions.

  • Just one, Dino, on your dollar chart on page 2. You said that some people think the dollar is in equilibrium now because it is back at its historical level. Has anybody looked at the current account deficit to inform those judgments?

  • Well, the people citing that view are perhaps not taking the analysis as far as you’re suggesting. I think it’s probably more mechanical—looking at the dollar’s highs and lows and figuring that, since it’s somewhere in between, it might be approaching something of an equilibrium. But, again, many people tend to look at these exchange rate movements as having very long cycles; they tend to think that we may have more to go on the dollar.

  • Further questions? Cathy.

  • I was interested in your comments on risk aversion and the various ways that you see it in the market. In the last couple of statements after our meetings, we’ve referred to geopolitical uncertainty. But with the weakness apparent in the incoming economic data, there also has to be a degree of uncertainty that simply reflects people wondering what is going on in the economy. Do you have any sense of the breakdown between those two types of uncertainty with regard to how market people are seeing them?

  • That’s a very good point, President Minehan. I think that is reflected in the concerns about some of the corporate reports that have been coming out and about the weakness in the production and employment data. It’s very hard to put percentages on that, but that type of uncertainty about the underlying strength of the economy is certainly part of it. Much of the market commentary has been that there are a lot of uncertainties involving big events that could go one way or the other in the next few weeks and months, and market participants are having a hard time pricing those risks because there are quite a number of them.

  • Yes. It seems to me that if all the uncertainties center on a discrete geopolitical event—a go/no-go decision such as we go to war or we don’t go to war—that has one implication for how to look at the second half of the year. As in the Greenbook, one could look at various scenarios that make some big assumptions about the shortness of a war or whatever. If instead the uncertainties center on underlying fundamentals—growing out of the view that this is a very different kind of recession from those around which our models and people’s memories are built, because this “recovery” compared with previous ones is so slow—then that says something different about the role of uncertainty. That is a different story in terms of when the uncertainty might get resolved and the implications for the second half of the year.

  • I don’t want to comment on the forecast or what the market might or might not be expecting with precision. I think many have an assumption that the second half will be strong. So that’s one of the factors for which there might be—

  • They are thinking a little more of a discrete event or that kind of thing—something that is resolved faster rather than slower.

  • Well, the way that I would put it is that the forecasts for the third and fourth quarters are relying upon a lot of positive things happening in the next few weeks on the geopolitical front and also on the assumption that the data will get better. Now, if things do not turn out that way, then quite a bit of risk might lie ahead in several markets. That’s the way I would think of it.

  • President Minehan, one thing that makes the kind of decomposition you’re suggesting especially difficult to do is that, if geopolitical risks actually have affected business decisions on hiring and production, then some of what appears to be a response to weak employment and production data is indirectly a response to the fallout from the geopolitical risks.

  • May I slide a comment in here? In talking with people in the New York, London, and Paris financial centers, it’s very, very hard to determine to what extent they are saying that the uncertainty is geopolitical when it’s really a cover story for uncertainty about economic issues. It’s absolutely impossible to distinguish between the two. My own view is that people think it’s rather fashionable to say you’re worried about geopolitical risks—it’s such an eloquent word—when the real fact is that they’re worried very much about economic developments. The point is that, if all the concerns are geopolitical and we do something in Iraq and it’s over very successfully, one might think, wow, the economy could take off. If on the other hand it’s a cover story for concern about the economy, the reaction to a successful war effort might not be anywhere near as positive. I don’t know how to distinguish this mix of uncertainty and risk aversion.

  • That’s the real dilemma we’re looking at for the next month and a half.

  • I want to ask a question about your page 3. When I see a chart like this that shows spreads of corporate yields versus Treasuries for a short period, the question that comes to my mind is, What should one consider the norm? If one had a much longer historical chart, I don’t know what it would show. Is a spread somewhere in the range of 150 to 200 basis points about right—I’m looking at the U.S. domestic rates—or is that way off? Can you give us a little sense of how this fits in the broad historical scheme of things? That would pick up a little on the earlier discussion about risk aversion. That spread has come down some, but has it come down a great deal by historical standards or is it still elevated by historical standards?

  • Well, if we go back to the 1997-98 period, spreads were extremely tight. If anything, they were probably too tight and were not factoring in as much risk as people were undertaking. So in that sense I’ve intentionally left that period out of the chart. But if we exclude that period, at least based on the sense I get from people in the market, “normal” would be something south of these numbers but not as low as the 50 basis points seen in that earlier period on some bonds that were bordering on junk. So the area we would be thinking of is something higher than that but lower than where spreads are today. Certainly for high-yield bonds the spreads are about 800 basis points now, and I believe they bottomed out at about the mid-200s, which was extremely low. So a spread in the 400s or the 500s is intuitively what people in the markets feel might be a bit more “normal,” if you will.

  • As you know, Governor Ferguson, and as Dino noted, these spreads move around a good deal. We’re struck by the fact that they’ve come down of late, but they’ve come down from an unusually high peak. The high-yield spreads bottomed out, as Dino said, in the neighborhood of 240 basis points at a time when we think risk aversion was unusually low and investors were too willing to take on risk—at the end of 1997. Subsequently they rose about 800 basis points and in recent months have come off a couple hundred basis points, as you see here.

  • Another way of looking at this is that these spreads now are roughly where they were in January 2001, when the Committee first started easing and when there was a lot of risk aversion, especially in the high-yield market. They are also close to the levels of October 2001 right after the terrorist attacks. So they’re back to levels that most people in the market would not view as necessarily reflecting a healthy state of affairs.

  • In fact, if you want an even more evocative comparison than that, the last time the spreads were there before then was in 1990.

  • I have a question on page 3 also, on the bottom chart showing a decline in U.S. corporate debt issuance. Do those figures also include obligations involving funds raised by corporations through securitization, or is that just straight corporate debt?

  • I believe it is straight corporate debt. I would have to check on that, but I don’t believe it would include Ford ABS, for example.

  • Well, that’s what I was wondering. Back to your point on risk aversion—to the extent that corporate debt has gotten very expensive to issue, some companies are substituting commercial mortgage-backed securities, collateralized debt obligations, and those types of things in lieu of straight corporate debt.

  • Yes, some clearly have. That was especially true in the fall, when a lot of corporations could not issue straight debt, and they went that route because it was still available to them.

  • We ought to amplify the charts to show that.

  • Yes, I think that’s right. If it included securitization, those data, which are included in the sources and uses of funds, would have to show a huge sale of securities. They don’t show that, which suggests that it is a different set of data.

  • The other form of substitution that firms used at that time, Governor Bies, was to run down liquid assets. In the flow of funds accounts we see a big decline in liquid assets.

  • Right. That was another question I had. To the extent that corporate cash flows have gotten a little better, I wondered whether that has had an effect too.

  • Certainly there was a run-down in commercial paper as it was “termed out” into longer-dated debt. There was also some paydown of commercial paper financed by running down cash balances. Some of that was going on as well.

  • Any further questions? If not, let’s adjourn until—

  • Mr. Chairman, I move approval of the domestic operations. Sorry, I almost forgot.

  • Oh yes. Without objection they are approved. Let’s adjourn then until 9:00 a.m. tomorrow.

  • [Meeting recessed]