EM Forum Presentation — August 24, 2011

IAEM-USA's Proposed Framework
for Measuring Return on Preparedness Investments

Randall C. Duncan, CEM®
Chair, IAEM-USA Government Affairs Committee
Director, Sedgwick County (KS) Department of Emergency Management

Amy Sebring
EIIP Moderator

This transcript contains references to slides which can be downloaded from http://www.emforum.org/vforum/IAEM/IAEM-ROI.pdf
A video recording of the live session is available at http://www.emforum.org/pub/eiip/lm110824.wmv
An audio podcast is available at http://www.emforum.org/pub/eiip/lm110824.mp3


[Welcome / Introduction]

Amy Sebring: Good morning/afternoon everyone and welcome once again to EMForum.org. I am Amy Sebring and will serve as your Moderator today. We are very glad you could join us.

As we approach National Preparedness Month, there are currently three preparedness-related documents out for review, and they are all linked from today's background page. The one we are focusing on today is the International Association of Emergency Managers' (IAEM) draft document, Preparedness: A Principled Approach to Return on Investment. IAEM requests your feedback by September 2nd.

IAEM, NEMA and others are also conducting a survey to measure attitudes towards preparedness. Again, there is a link to a version of the poll for Emergency Management and Public Health professionals, and responses to that are requested by tomorrow.

Finally, FEMA has issued a new draft for the National Preparedness Goal and comments on that document are requested by September 2nd.

Now, inspired by all this, we are going to conduct our own mini-poll in just a moment. First, for the benefit of our newcomers: once we have looked at our poll, we will proceed with an overview presentation, and then your questions and comments. Let me point out that if you would like to capture a PDF of the slides, you may do so by clicking on the printer icon at the bottom right once they are displayed.

Before we move to our poll, it is my pleasure to introduce today's guest: Randy Duncan is a past President of IAEM and currently serves as Chair of IAEM's Government Affairs Committee. He is Director of the Sedgwick County (Wichita), Kansas, Department of Emergency Management and is in his second term as a member of the Governor's Commission on Emergency Planning and Response.

He has participated in numerous professional activities and I refer you to today’s Background Page for further details. He also wrote the Executive Summary for the document we are discussing today. Welcome back Randy. Thank you very much for taking the time to be with us today.

Randall Duncan: Amy, thank you for the invitation and I look forward to doing the program, hearing the questions and the comments.

Amy Sebring: Now let’s move to our poll, and we encourage all of you to participate whether you are an emergency manager or not. Just click on the button for the answer that most closely reflects your opinion.

First we would like to characterize the overall composition of today’s participants:

Poll Question 1. To which sector do you belong?

  • Academic = 2 (7%)
  • Business/Private Sector = 5 (17%)
  • Government = 14 (50%)
  • Volunteer/NGO = 3 (10%)
  • Other = 4 (14%)

As you are well aware, this September marks the 10th anniversary of 9-11, so our next two questions ask you about progress in national preparedness since then.

Poll Question 2. Over the last 10 years, what kind of progress have we made in institutional preparedness?

And by institutional we mean governments, businesses, schools, etc.

  • Negative progress = 0 (0%)
  • No progress = 0 (0%)
  • Slight progress = 11 (37%)
  • Significant progress = 17 (58%)
  • Major progress = 1 (3%)

Poll Question 3. Over the last 10 years, what kind of progress have we made in individual and family preparedness?

  • Negative progress = 0 (0%)
  • No progress = 1 (3%)
  • Slight progress = 24 (85%)
  • Significant progress = 3 (10%)
  • Major progress = 0 (0%)

Now, it’s apparent that there is an economic and political climate in which government services at all levels are being trimmed back. There has been generous grant funding for preparedness for the past several years, but can that continue? So our question is...

Poll Question 4. In the future, to what degree do you expect public funding for preparedness efforts to be reduced at all levels of government?

  • No reduction = 0 (0%)
  • Slight reduction = 7 (25%)
  • Significant reduction = 10 (37%)
  • Major reduction = 9 (33%)
  • Increased funding = 1 (3%)

Poll Question 5. If public funding were reduced in a significant or major way, what impact would you anticipate on preparedness efforts and outcomes?

  • No impact = 0 (0%)
  • Minor impact = 2 (6%)
  • Significant impact = 16 (53%)
  • Major impact = 12 (40%)

Thank you everyone. Now let’s turn to today’s discussion.

[Presentation]

Randy, I turn the floor over to you to start us off please.

[Slide 1]

Randall Duncan: Thank you for the introduction, Amy. For those of you who are participating and listening, thank you all for the input through the poll questions. I found that very interesting.

I would like to talk a little bit about a new document that has been released by the International Association of Emergency Managers. It is called "Preparedness: A Principled Approach to Return on Investment" as you can see on the title slide. One thing I want to make clear up front—some of you know that I am the Chair of the U.S. Government Affairs Committee for the International Association of Emergency Managers.

This particular document is not a product of the Government Affairs Committee alone; it is a work of the entire association. It did not originate in Government Affairs, although we have certainly had close interaction with it, and we want to make it clear that it originates with the entire organization.

[Slide 2]

What you see on this slide is a capture of the front cover of the document. What I particularly want to point out about this slide is that the document is labeled Version 1.0. There is a very specific reason for that. We understand there will be a dialog. In fact, it is our hope that the introduction of this document will start a dialog on this topic, and we understand that the document itself may change.

That is why we are referring to this as Version 1.0—understanding that there will be a process of iteration and so forth. One of the best ways we might be able to give you a flavor for this is to look at some of the elements of the document. As we get into the latter part of the presentation, we will open it up for questions and comments.

[Slide 3]

What you see now is a list of the folks who were involved with the development. I want to make it perfectly clear at the beginning that the basic breakthrough of this idea and its novel application to the measurement of return on investment came from Dr. Jessica Jensen at North Dakota State University, who heads up their Center for Disaster Studies and Emergency Management.

We owe a great debt of gratitude to her for the intellectual work that went into all of this. The rest of us had an opportunity to review it, see whether it made sense with regard to our local programs, and put it through a peer review process, if you will, but one based on practitioners as opposed to academics, for those of you who are familiar with the academic peer review process.

This gives you an idea of the folks who were involved in that. You can see there are local emergency managers, there is a tribal emergency manager, and it has been through a number of different folks. We know it is not done yet. Even though it has been through those folks, it hasn’t been through you all yet. As we’ll talk later in the presentation, there is an opportunity for you to provide feedback and input on this document as well.

[Slide 4]

The genesis of this project essentially comes from the desire on the part of the American taxpayer, the United States Congress, both the House and the Senate, and the policymakers at DHS and FEMA headquarters: they want to know what it is they are getting for their money.

One of the basic places we started was with Management Philosophy 101 and management guru Peter Drucker, who said, "What gets measured gets managed." I'm sure many of you are familiar with that. It is one of the foundational concepts upon which modern management principles are built. I took the opportunity to extend that statement with regard to emergency management: "That's okay, as long as what you're measuring is relevant."

That is the emergency management twist on Peter Drucker’s original idea of management.

[Slide 5]

By way of background, the framework or foundational stones for this document came out of another document which was created in September of 2007 called "The Principles of Emergency Management". You see a capture of the cover of that document on this slide.

[Slide 6]

Here you see a copy of the foreword from "The Principles of Emergency Management" and you get an idea of the folks that were involved in the development of that. That development effort was led by Dr. Wayne Blanchard who is now retired from the Federal Emergency Management Agency’s Higher Education program at the Emergency Management Institute.

There were representatives of the International Association of Emergency Managers. There were representatives of NEMA, EMAP and others involved in the development of these principles of emergency management, which is a seminal, foundational document with regard to what emergency management is and how we conduct the business of emergency management—the principles that are the basis for that.

[Slide 7]

Here you see a list of the eight principles. One of the exercises that this group went through was developing a definition, a vision, and a mission as it applies to the emergency management enterprise. The key work done by this group involved these eight principles you see here—that emergency management is comprehensive, progressive, risk-driven, integrated, collaborative, coordinated, flexible, and professional.

You’ll notice after each of those descriptive words, there are brief lines that talk more about the characteristics of that word—that principle as it applies to our business of emergency management.

[Slide 8]

Here you see the essence of Jessica’s intellectual breakthrough. We take the eight principles of emergency management and we put them together as a hierarchy of outcome. If you’ll look at the blue triangle on the right of the screen, you’ll notice we start with the element of the program (professional) and we work in the others (collaborative, integrated, flexible, progressive, risk-driven, comprehensive, coordinated) which leads to preparedness, which leads to the ultimate goal of effective and efficient mitigation, response and recovery.

The only reason we don't mention preparedness among those principles is because we believe it is the next-to-last, or penultimate, outcome just before effective and efficient mitigation, response and recovery. What this does is provide a context within which to put the numbers that we have all been familiar with collecting in terms of measuring our return on investment for the Emergency Management Performance Grant.

We have now provided a picture—an overall context so that those numbers now have a meaning in terms of leading toward that overall outcome of effective and efficient mitigation, response and recovery. I want to leave that slide up for a moment longer because this is a big idea to transform those eight principles that govern the way we conduct business into a series of outcomes organized hierarchically.

That is what provides us the picture. The analogy I like to use is so many times in the past we have simply prepared inventories of things. We have X number of people that work on emergency management. We have Y number of people that prepared plans. We’ve made Z number of plans—and we can continue onward.

Now we have a hierarchy of outcome that tells us the reason we have those people is we need to look at the issue of professionalism, which leads to the subsequent issues of collaboration, integration, flexibility, which then transitions into coordination, preparedness and leads to that effective and efficient mitigation, response and recovery that we are all looking for as the ultimate outcome. We have now a picture in which to plug those numbers.

[Slide 9]

We also thought it would be a good idea to take you through an example of how these things could be measured and how they relate to the overall picture. We're going to start out with a top-level outcome. You see the outcome on this slide: the jurisdiction engages in preparedness actions guided by professional staff, that is, emergency managers, and by professional programs.

There is our hierarchical outcome that relates to professionalism, which you see at the base of the outcome hierarchy from the previous slide.

[Slide 10]

We see the objective related to that—local emergency managers are practicing professionals. We’ve taken one part of that outcome and now turned it into an objective.

[Slide 11]

How do we measure the professionalism of local emergency managers? We have suggested some sample measures. Our preference in developing these sample measures is to use things we already measure on a day-to-day basis in our programs, so that we are not creating new measurement mechanisms. We take the existing data and provide an overall context for it, which we think has been missing from previous approaches to this topic.

Here are some sample measures we suggested that would contribute to measuring the professionalism of the folks in an emergency management program. One of the first ones we suggest is: out of the emergency management program, how many people are certified, whether through the IAEM Certified Emergency Manager credential, the Associate Emergency Manager credential, a certified emergency manager credential from a state association, or graduation from FEMA's Emergency Management Foundational Academy?

You see there are several alternatives. These are things we would track in our HR files or our training files on an ongoing basis. We might then measure whether the jurisdiction's emergency managers hold an emergency management degree.

Another measure might be if the jurisdiction’s emergency managers belong to one or more emergency management professional associations, on the thought that belonging to those professional associations is an outward sign or symbol of that person seeking greater professionalism within the field.

[Slide 12]

Some additional sample measures might include the number of general administration or management training courses completed each year by the jurisdiction's emergency manager. That keeps track of the training component. Essentially, our thought process here is that a trained emergency manager is an emergency manager traveling toward professionalism.

Also, the number of emergency management specific training courses completed each year—it’s the same thought process. An emergency manager not only has to run the business of the program, which includes the general administration and management training, the manager also has to run the emergency management program—hence the need for the emergency management specific training.

[Slide 13]

Now we take the other half of our outcome. Remember, in the first objective we talked about the first half of our outcome, which is a professional emergency management staff. The second half was a professional emergency management program. Our objective is that the jurisdiction's emergency management program is administered in keeping with the standards of the profession.

[Slide 14]

Let's see what the sample measures might include. The first is that the jurisdiction's emergency management programmatic and financial audits are satisfactory based on the jurisdiction's audit standards. This is something that occurs every year, typically a programmatic audit: at the local level it is done by the state, at the state level it is done by the feds, and so on.

The second is the jurisdiction program has received accreditation through the Emergency Management Accreditation Program or the jurisdiction is progressing toward achievement of the EMAP standard as measured through the EMAP self-assessment tool.

The third point on this page is essentially the same point with reference to NFPA 1600. At this point in time, we understand that a self-assessment tool has not yet been developed for that standard, but we think this would encourage the development of such a tool.

[Slide 15]

To find out more about this, there is a URL on this slide. The slide deck is available on the EM Forum website, and the background slides list various websites as well. If you go to the address highlighted in blue, you'll find a press release from IAEM regarding the release of the document.

You’ll also find a link to the document itself and a secondary link to a Survey Monkey tool which is your opportunity to provide comments and input on the continuing development and evolution of this particular document. The deadline for submission for comments on this document is September 2 of this year. The deadline is looming closer.

We encourage you to download the document, read it, and let us know what you think about it, bearing in mind that it was developed originally by an academic but then vetted through a pool of local practicing emergency managers from various-sized programs and various geographical areas throughout the United States.

Overall, we are very anxious to hear what you think about this. As for where we want to go once we get the input and continue this process, it is our hope that this document will contribute to the dialog about the measurement of return on investment for the Emergency Management Performance Grant.

Ultimately, we would love to see the measurement system used by FEMA to report progress on EMPG back to Congress take the shape of this tool. However that finally ends up, our examination of other efforts attempting to do this shows that they are worthy efforts and that they are making a contribution.

We think the basic difference between this effort and the others is that we have provided an overall context—a hierarchy of outcome within which to array the numbers we are essentially already collecting. Hopefully, we have provided you a sample of one particular element of the eight principles of emergency management and a way that it could be incorporated into this measurement system.

We refer to them as sample measurements. We don’t refer to them as final measurements because we don’t view the process as completely finished at this point in time.

Amy, at this point I have gone through the presentation I intended to share with members of the EM Forum today. I think it would be appropriate to open this up for questions and comments.

Amy Sebring: In the comments, are you encouraging people to comment on the sample measures even though they are identified as samples?

Randall Duncan: Exactly. Think of them as a place to start the dialog. Look at them, and if you see a sample measure that is particularly relevant and captures that idea well, please tell us. Similarly, if you look at them and find a measurement that doesn't get at what the outcome or objective is, let us know, and go further and suggest a way to get at it.

Amy Sebring: I have one for the professionalism: is the emergency manager a full-time employee?

Randall Duncan: Well I understand that there is a range of different ways that programs are handled throughout the United States. I understand that some local jurisdictions have emergency managers that work at this job less than full-time. In my own work and background experience, I originally came from a small, rural county in Kansas of 37,000 folks in the entire county.

I basically had a one or two person shop, with me being full-time and most everyone else being part-time. Now I am in a county of about half a million where I have the luxury of having staff. I know there are folks who don’t have full-time allocation. Unfortunately, the responsibilities that keep getting placed on the emergency management program in terms of public expectation and guidelines from our federal and state partners keep increasing.

At some point in time, local jurisdictions are going to have to seriously examine the issue of whether they can get by without a full-time emergency manager in some aspect.

Amy Sebring: I did want to mention we did invite Dr. Jensen to join us today, but she had to send her regrets because she is currently in Washington, D.C. and meeting on this outreach effort with Congress.

Randall Duncan: We have a group of folks, including Dr. Jensen, our IAEM President, Eddie Hicks, our incoming President, Hui-Shan Walker, and Martha Braddock, our policy advisor, who are conducting a series of meetings with FEMA officials and Congressional committee staff, talking about this and conducting outreach regarding the document to make the committee staff aware of it.

I had the opportunity to visit with them last night to get initial feedback. The comment is that we have been receiving a favorable response from the congressional folks who have been taking a look at the document. But as with all things that are new, they are not necessarily jumping up and down and ready to endorse it today.

We have to think about it, read, ponder, and let the consequences of the idea sink in, understanding that we're not really going to find a way to avoid measuring return on investment. The only question is whether we will have input into the design of the system, or whether the system will be created by somebody else and we will have to react to it.

This document is IAEM’s effort to provide input to those out there who are looking at creating that system and have that system vetted through actual practitioners in the field so that we have an active part in creating our own system of measurement rather than responding to a system of measurement in which we did not have an opportunity to participate in the creation.

Amy Sebring: Thank you very much Randall. Now, to proceed to our Q&A.

[Audience Questions & Answers]

Question:
Lucien Canton: There are always concerns that measures become mandatory rather than being used as indicators. For example, ideally the EM should have an EM degree but that is not the current reality. How does the document distinguish between the ideal and the acceptable?

Randall Duncan: First off, let me respond by saying that is one of the typically great questions that Lucien always asks. One of the ways we attempt to deal with this issue is to offer a variety of different measures. The concept would be similar to something familiar to those of you who can remember the days before September 11, 2001: as locals and states dealing with FEMA at that time, we used to have a document called the Comprehensive Cooperative Agreement.

Essentially it was an agreement under which we would negotiate the work that would be done. I think there are some parallels with the Target Capabilities List, in that you can select particular capabilities off the TCL to work on in any given year, and that was similar to the approach used in the CCA.

I think that is another view we have of this document. There would be a negotiation between the local jurisdictions and the state about what areas would be concentrated on in a particular year, and similarly between the state and the federal government. From that perspective, you would not only have the opportunity to choose the outcomes that are particularly relevant to your program, you would also have a variety of different measurement mechanisms to try to achieve them.

That is also why the document is in draft or discussion stage—we do want to take those things into consideration. If someone has suggestions as to how we can do it better than what we already have, we are very much open to them.

Question:
Amy Sebring: One of the things I was interested in was the slide about the possibility of an NFPA 1600 self-assessment tool. Has there been any discussion about that?

Randall Duncan: At this point, we have not done outreach to NFPA 1600 on that specific issue. We have members of the association that serve as liaisons to these various standard setting bodies—Emergency Management Accreditation Program, the National Fire Protection Association and so on—and at this point it has just been within the document itself that discussion has occurred.

It is possible that scenario might change in terms of input, but we do like the idea of self-assessment. There was a very specific reason why self-assessments were included: there are a number of smaller programs in less populated jurisdictions—I told you I came from one of those originally, in the more rural area of Kansas—that don't have the financial wherewithal, especially in the current climate of austerity, to pay the fees associated with that process.

There is still potentially a way to get the benefit of going through the accreditation process with the self-assessment guide. As a program manager, you can run your own program through the self-assessment at essentially low or no cost. Although you don't get the official accreditation document from the accrediting agency, you know that you've gone through the process and you are able to demonstrate that through the self-assessment.

That was the reason why we included that. It seemed logical that NFPA 1600 might have some interest in developing that, and we’ll certainly be doing follow up outreach and contact with them to see if it is the case.

Amy Sebring: That is a great idea and I believe there is a model for that in terms of DHS’s Business Preparedness program if I’m not mistaken—where they can do some self-assessment. [This refers to the PS-Prep program. No further information is found on the self-certification process that was under consideration during the latter part of 2010. See http://www.fema.gov/privatesector/preparedness/]

Question:
Avagene Moore: Do you personally have a perspective on the intangible factors that make a good professional in the EM field that go beyond education, training, experience or things we can measure?

Randall Duncan: Once again Avagene comes through with a particularly good question. I think that is a question we have all struggled with at various times. We have seen and had experience with folks who have been involved in this field who have been particularly able to accomplish things. They may not have had a degree or particular training, but it was something about the overall person and personality. That is extremely difficult to capture.

It gets back to the issue that it is very difficult to capture subjective data. If it is our opinion that a certain type of personality seems to do well, sometimes that is difficult to capture unless you are willing to go out and do a Myers-Briggs analysis of everybody who is in the field and say if you are an 'ENFJ', you qualify. I am just picking a type at random to talk about.

That is difficult and I’m not sure I have a good handle on how to answer that. I think what we’ve done through the use of these principles of emergency management is tried to identify and make tangible some of those intangibles. If you look at the eight principles they include things like integration and collaboration and we have suggested some specific samples of ways that those things might be measured.

From that perspective, what I would urge all the participants to do is get a copy of the document, read it, and then take the time to respond to the Survey Monkey instrument and give us your feedback. Have we found a way to capture that? If not, give us a suggestion on how to do a better job of capturing it.

Question:
Amy Sebring: I mentioned that FEMA has just come out with the updated draft of the National Preparedness Goal under PPD-8. Have you had a chance to look at that?

Randall Duncan: I have it in my inbox and I haven’t had a chance to look at it directly.

Amy Sebring: The same principles are there with the target capabilities you mentioned in your presentation. I wondered if you had thought about cross-referencing that to this effort or to the principles of emergency management.

Randall Duncan: It should be possible to do. I think one of the benefits of this framework, and here I'm referring to both the Return on Investment document and the Principles of Emergency Management document, is that, like our field of emergency management, those concepts are broad.

They are purposefully broad because that is the approach that emergency management takes. It is a broad professional approach. It is a broad integration, both horizontally within the jurisdiction and vertically with different levels of government. Toward that end, I think there is a great affinity between the two.

I am thinking about the concept of cross-matching it with the target capabilities, and it seems like a possible thing to do. In fact, I am getting a piece of paper and writing a note down on that. I intend to have a discussion with our group on that topic after we finish here. Perhaps someone would be kind enough to mention that in the survey instrument to help support me in bringing that discussion point up with our group.

Question:
Steve Davis: Have you thought about building on the previous Pilot Capabilities Assessment of FEMA's? It seemed to be an effective way to measure capabilities (TCL) and year to year use would measure improvements (enhancements) made to these TCs.

Randall Duncan: Again that is certainly another possibility. Those are the kinds of suggestions and input we are looking for. I hope Steve will take the time and effort to go to the survey and make sure that input is there for the discussion point. We understand this is a living document.

At some point we have a vision that a version 2.0 will come out. The way that version is going to happen is with the dialog, input, and suggestions that we hope are generated as a result of our outreach on Capitol Hill, our program with EM Forum today, and, generally speaking, the thoughtful approach of thoughtful people in the emergency management field toward this whole thing.

Question:
Avagene Moore: Since we are in the "people" business, which is more important in a local or state program - people skills and capabilities, or the equipment, systems, etc., that bean-counters look at when accounting for grant funds?

Randall Duncan: Another typically excellent Avagene question. I have a bias on this question and I want to make that clear up front. I have a simple thought experiment I would ask you to participate in. Let’s picture a typical workstation—a desk and a chair. Now let’s put a computer on that workstation. Now let’s put an interoperable radio on that workstation.

Now picture whatever else your favorite toy is—perhaps it is the key to an emergency management vehicle or perhaps the disk that operates the computer program that allows the sophisticated CBRNE network sensor to operate. Now look back at the chair. If the chair is empty, what is the outcome of all the information coming into that workspace? That should be the clue to my bias.

I think people are very important and the people skills have to come first. Then we have to develop the equipment and capability for the people.

Question:
Isabel McCurdy: Is the intent of this document to be the inclusion of lives, property, environment, education as the measurement of investment in dollars and sense?

Randall Duncan: Yes, the outcome is that blue pyramid we talked about on a previous slide. The ultimate outcome is effective and efficient mitigation, response and recovery to any disaster regardless of the cause. To that extent, we take a look at the process in terms of—what are the inputs that go into that?

How do you achieve professionalism? Obviously, you have to make an investment in the operation of the program so it can be professionally operated and meet the standards. You also have to make an investment in the people so they are professional in the way they conduct the program and have good relations and such.

To that extent, the inputs are important to that process. The inputs include not only the grant funds, whether they are emergency management performance grants or whatever; they also include the local general fund tax revenue that goes into the operation of that local program, or the state general fund tax revenue that goes into the state program.

I think the issue is if you get that effective and efficient mitigation, response and recovery, along with the preparedness that is the next to the last step for all those things, you are going to have a positive impact on saving lives. You are going to have a positive impact on preventing damage and securing the environment and stabilizing the situation and all of those other concerns we have with emergency responses and disasters.

Question:
Amy Sebring: When we did the poll questions we separated it out into institutional preparedness, which this seems to go directly to. Where does the family preparedness fit in all of this? Is it 50/50, 75/25 in terms of outcome?

Randall Duncan: I'm not sure I could give you an exact percentage breakdown. If we go back and look at this through the lens of our hierarchical outcomes, for a program to be flexible—which is one of the principles of emergency management—it has to incorporate organization at the private enterprise level, the non-governmental organization level, the governmental level, and the individual level. It has to accommodate all of those things.

Similarly, a progressive program takes into account all of those issues, and a comprehensive program does as well—I think there is an element of that running through many of the principles. If you get the opportunity to download the document and read it carefully, I think you will see there are sample measures that relate to outreach to the public, outreach to NGOs, private enterprise, etc., recognizing that all of those have important roles to play in the ultimate hierarchical outcome of effective and efficient mitigation, response and recovery and the preparedness that precedes it.

Comment:
Tom Fahy: Randy, your comments have focused on the Return on Investment for human resources, training and development of emergency managers. This is an excellent goal.

Randall Duncan: Thank you very much Tom. I appreciate that comment. I hope you find the other principles are equally as well fleshed out in the document. I just went through the one as a sample. I encourage everybody to get the document, read it, and share your comments and input with us in the survey.

Comment:
Amy Sebring: Folks, once you get the document, if you want to dive right into the samples that are appended, that really does help clarify, at least it did for me, what this is all about.

Randall Duncan: Amy that is a good point. If you don’t mind I want to make a quick point about how the document itself is organized. The document is organized in two parts. The intellectual underpinnings of the concept are outlined in the first part of the document. What you may want to do is go to the appendix which is where the outcomes, objectives, and sample measures are outlined for each of the principles. That hopefully will give you a very concrete idea of what we want the dialog to center on with respect to measurement of return on investment.

[Closing]

Amy Sebring: I guess we will wrap it up for today. Thank you very much Randy. When you are talking with Jessica, please convey to her that we appreciate her valuable contribution here.

Randall Duncan: Absolutely. Without trying to put words in Jessica's mouth, I am sure that she, and certainly I, would say thank you to you, to Avagene, and to each one of the participants out there for taking time out of your busy day to allow us to brief you on what we hope will be an important document and a game changer with respect to measuring return on investment.

Amy Sebring: We appreciate your taking the time to be with us today to share this information and we wish you success with this effort. Good luck!

Again, the video and audio recordings should be available later this afternoon. If you are not on our mailing list and would like to get notices of future sessions and availability of transcripts, just go to our home page to Subscribe.

Before you go, PLEASE take a moment to do the rating/review! Note: We are asking you to rate the relevance of the information and to add any comments you may have, as this will assist us in our future programming.

Our next program will take place Wednesday, September 14th. Please watch for our announcement and plan to be with us then. We will be deep in National Preparedness Month.

Until next time, thanks to everyone for participating today and please take time to provide your input on this document and the others. Have a great afternoon. We are adjourned.