Superseded

This policy memo has been superseded by the 2012 SAMM Rewrite.

 

DEFENSE SECURITY COOPERATION AGENCY
2800 DEFENSE PENTAGON
WASHINGTON, D.C. 20301-2800

12/20/2000

MEMORANDUM FOR:

Deputy Under Secretary of the Army (International Affairs)
Department of the Army
Director, Navy International Programs Office
Department of the Navy
Deputy Under Secretary of the Air Force (International Affairs)
Department of the Air Force
Director, Defense Logistics Agency
Director, National Imagery and Mapping Agency
Director, National Security Agency
Director, Defense Contract Management Agency
Director, Defense Finance and Accounting Service (Denver Center)

SUBJECT:

FMS Review Policy Guidance (DSCA 00-19)

Over the past few years, DSCA has received substantial comments from the USG FMS community and FMS customer countries regarding the FMS review process. (Approximately 400 reviews are held at least annually.) To provide excellent support to our FMS customers, we use reviews to convey accurate, timely, and thorough status on FMS programs. These reviews represent a significant investment of FMS resources, in terms of both time and funding. While some aspects of the current process received favorable endorsement, the majority of the feedback focused on an FMS review process in need of improvement. Specifically, respondents felt that policy was needed to establish whether a given review adds value, define the proper scope of the different FMS review types, apply consistency in determining which USG components should attend reviews, identify how FMS reviews should be funded, and assign standard preparation and follow-on requirements.

To respond to these issues, an Interagency Process Team (IPT) was formed in February 2000. The IPT's primary objective was to improve the FMS review process. Representatives from DSCA (Comptroller, MEAN, ERASA, DSADC), USASAC (Alexandria and New Cumberland), Navy (IPO and NAVICP), USAF (AFSAC and SAF/IA), and DFAS met on several occasions to explore this issue in considerable detail. In addition, DSCA briefed the Foreign Procurement Group, the International Customers Users Group, and numerous FMS customer countries during the past several months to solicit their input and to ensure that their desires were given utmost consideration. While there is a valid need to apply FMS review policy as consistently as possible, this guidance gives due weight to accommodating the uniqueness and flexibility necessary for the optimal execution of individual FMS country programs, as discussed in those feedback sessions.

This memo and Attachments 1 through 7 provide the comprehensive policy guidance derived from the IPT. A brief synopsis of this policy will be incorporated into the forthcoming SAMM (DoD 5105.38-M) rewrite. Corresponding updates to MILDEP-level policy publications may be necessary. Additionally, this memo will be posted on the DSCA Web Site (www.dsca.osd.mil). Your assistance is requested in ensuring widest possible dissemination of this policy.

The FMS review policy guidance is found at Attachment 1. That guidance provides the general parameters within which FMS reviews are to be conducted. Main policy tenets follow:

  • Determine that each review has a defined objective and a desirable outcome before the review is scheduled.
  • Reduce the number of reviews to the extent possible.
  • Limit the number of USG attendees at FMS reviews to the extent possible, while ensuring the reviews themselves are conducted in an effective and efficient manner.
  • Ensure that each USG attendee at FMS reviews has a distinct and active role, is fully prepared, is knowledgeable and is empowered to make decisions.
  • Subscribe to the FMS review funding guidelines.
  • Standardize preparation and follow-on requirements.

As a means for monitoring this policy, DSCA seeks the establishment of FMS review advisors for DSCA, the MILDEPs/Implementing Agencies, and DFAS Denver. These advisors should have either already served on the FMS review IPT or be otherwise familiar with the review process and policies. I ask that you notify my primary contacts, Mr. David Rude and Ms. Vanessa Glascoe, by 15 January 2001 as to who will help promote this policy guidance.

In closing, I want to thank the following individuals outside DSCA for their outstanding contributions to this important endeavor:

  • USASAC -- Joan Buchanan, Rick Westhafer
  • Navy -- David Molyneaux, J.P. Hoefling, Susan Lyon
  • USAF -- Jeff Dierker, Bev Spires
  • DFAS -- Jan Rakickas, Steve Willauer

Please convey my personal appreciation for their dedication and professionalism, without which the IPT's objectives would not have been accomplished.

 

This group, which is an essential component of the Business Process IPT, will resume on an ad-hoc basis to ensure DSAMS requirements accurately capture FMS review policy; standardize FMS reporting formats and/or identify minimum data requirements to the extent possible; address policies regarding facilities hosting FMS reviews; clarify proper usage of representational funds, conference fees, gifts, and socials; refine FMS review delivery reporting transactions; and (if needed) fine-tune this policy as a result of implementation feedback. The Business Process IPT's charter will reflect these efforts.

Should your staff have any questions, the DSCA point of contact is Mr. David Rude, Financial Policy Team Chief/IPT Chair, (703) 604-6569, e-mail: david.rude@osd.pentagon.mil.

Tome H. Walters, Jr.
Lieutenant General, USAF
Director

ATTACHMENTS:

  1. FMS Review Policy Guidance
  2. Stratification & Characteristics of FMS Review Categories
  3. FMS Review IPT Charter
  4. Dr. Hamre Memo, Foreign Military Sales (FMS) Financial Management, 13 December 1999
  5. FMS Review Funding Matrix
  6. Financial Management Review (FMR) Case Financial Status Reporting Format
  7. FMS Review Survey

Attachment 1
FMS Review Policy Guidance

While this policy guidance addresses the universe of FMS reviews, certain types of FMS meetings/visits are excluded. Training PMRs, IMET reviews, technical reviews, site surveys, releasability meetings, and INL-funded meetings are not covered by this policy. In addition, DSCA recognizes that the nature, scheduling, and conduct of Policy-level reviews chaired at the Assistant Secretary level or higher are not subject to this policy. However, Policy-level reviews represent one review category and, as such, are referred to in this document.

Review Types

Five broad types of reviews apply to FMS: Policy-level; Country-level; Service-level; Program-level; and Internal. The first four types (Policy- through Program-level) constitute External reviews, i.e., those involving the FMS customer. Within the Internal review category are three subdivisions: External Review Planning Meetings; Internal Reconciliation Reviews; and Internal Process Reviews. Attachment 2 describes the characteristics and scope applicable to each review type. Please note that the "Associated Reviews" section within Attachment 2 attempts to correlate the review types with the various names/acronyms currently in use to represent each category. Every effort should be made to begin transitioning from those names/acronyms to simply identifying the review type. While some degree of flexibility should be retained to accommodate longstanding country/program-unique review acronyms, it is expected that all reviews that commence for the first time after 1 January 2001 will adhere to the labeling format provided below. In doing so, and with increased familiarity over time with the corresponding characteristics and scope, any misunderstanding as to the purpose/intent/objective of a given review should be significantly reduced.

Example 1: All Program-level reviews should be labeled (Country Name) (Weapon System/Program) (Program Review) -- to illustrate: Bandaria F-16 Program Review.

Example 2: All Service-level reviews should be labeled (Country Name) (Service) (Review) -- to illustrate: Bandaria Army Review. Note: "Service" can denote either IA or In-Country Service (ICS), depending on the scope of that particular review. The foregoing illustration applies to ICS-driven reviews. If IA-driven reviews apply, the review name format would be: U.S. Navy Review for Bandaria.

The following sections of this policy correspond to the sequence of IPT Charter Elements found at Attachment 3.

Review Value

It is important that, when considering whether to conduct any given FMS review, a determination is made that the individual review adds value. In doing so, the value assessment should be made not only in consideration of USG resources and other constraints, but also the desires of the FMS customer. At times, the political visibility/sensitivity that an FMS review will receive is reason enough to conduct it; this is particularly true for the Policy-level reviews. In addition, drastic changes evident in a region, country or program may necessitate the conduct of previously unscheduled reviews and deviate from usual reporting formats (one such example is reviews stemming from the 1997-1998 Asia Financial Crisis). For all other circumstances, however, additional determinants must be taken into account in the context of value added. Those criteria include:

Identifying Objectives and Deliverables. When considering whether to have an FMS review, it is imperative that the objectives (why are we conducting this FMS review?) and deliverables (what outcomes do we want to achieve?) are clearly identified. If either objectives or deliverables are absent in that analysis, the review should not be held at that time. Moreover, the objectives and deliverables should be articulated to all FMS review components (USG and customer) during the planning phase; this will help minimize confusion and reinforce the proper scope of issues to be discussed.

Customer Requirements. A customer's internal policy or even legislation may require periodic information on the status of country accounts, issues, cases, and programs. Care must be taken to ensure that customer expectations or precedent complement the review value process; on the other hand, having held a review every quarter for the past three years is not, in and of itself, sufficient justification. (An exception would be Program-level reviews that are following an established milestone plan.) In addition, while technologies such as VTC should be explored whenever feasible, recognize that personal, face-to-face dialogue is vital in some cultures to actually getting the work accomplished.

USG Requirements. We may have many of the same needs shown in the "Customer Requirements" section above. In addition, FMS reviews are an excellent opportunity for apprising the customer of updated policies, laws, and current events/issues. Reviews can also promote our proactiveness and advocacy, as well as timely resolution of issues and closure of actions. They show our commitment and desire to be effective/efficient stewards of the customer's FMS resources. Actions such as those announced in DEPSECDEF's 13 Dec 99 memo (Attachment 4) can be satisfied through FMS reviews.

Activity/Dollar Value/Size. This refers to the degree to which the country, service, or program being reviewed is active, the dollar amounts associated with it, and/or the number of cases being reviewed. It is important to note that none of these factors is a sufficient standalone indicator for determining the value of a given review. For example, while Country XXX may have only 15 cases, those cases may total several billion dollars in value and could be a lynchpin in our bilateral relations. Under that scenario, using the number of open cases alone would be misleading. Instead, each of these factors must be viewed in conjunction with the others.

Long-Term Investment. The FMS review forum may be viewed as a valuable opportunity to promote USG interests and strengthen our sovereign relations with other countries. This is an intangible yet potentially important value determinant.

Customer Sophistication/Reliance on USG. This can be an important factor, especially when an FMS review involves a customer unfamiliar with the FMS "language", policies and procedures. Usually, these customers require closer USG involvement and more intensive management. These reviews would also be prime venues for educating customers on the FMS process. Conversely, highly sophisticated customers can benefit from reviews as they help maintain open communications, but they may also be comfortable using technologies as a substitute for reviews per se.

Customer Preference. The preferences and desires of the customer regarding the conduct of reviews should be accommodated to the extent possible. However, when those preferences are not practical and/or logical, the USG review component lead is responsible for offering sound and reasonable alternatives. The key is to find mutually agreeable solutions that make sense.

Uniqueness. A number of reviews have evolved over time to accommodate unique requirements on the part of the customer, applicable weapon system, etc. These unique arrangements already in existence should continue to be honored provided they continue to add value. However, review components are invited to introduce common data element usage, standardized definitions, and reporting formats to the extent agreeable to the FMS customer.

Number of Reviews

As noted earlier, approximately 400 FMS reviews are held at least once per year. DSCA received considerable feedback reflecting that the review components' organizational structures generally require the same cadre of country/case/program managers to attend numerous reviews within a given year. Understandably, this strains resources and adversely affects the time allotted for managers to resolve FMS review actions and perform their day-to-day functions. In addition, many FMS customers with an active FMS review roster have expressed a desire to reduce the quantity of reviews for these same reasons. Also, it became quite clear during the IPT's research that areas of duplication and overlap exist between different reviews for the same country/service/program. Therefore, efforts are to begin immediately to identify reasonable ways to consolidate (or, in some instances, eliminate altogether) reviews. Examples of consolidations already instituted follow:

Example 1: Merge the Financial Management Review (FMR) and Case Reconciliation Review (CRR) for the same country into an FMR.

Example 2: Consolidate separate Program-level reviews that are mature in nature into a single joint Program-level review.

These consolidation efforts, however, cannot be undertaken unilaterally: the review consolidation/reduction proposals must be offered to and accepted by the FMS customer. USG flexibility in entertaining customer counter-proposals is expected. While the precedent of having held a given review should be given merit, remember that precedent does not mandate permanence. For recommendations on consolidation approaches, or if problems with the proposals arise, please consult the respective FMS review advisor (see section below). The keys to success in reducing/consolidating reviews are that the value of such a reduction exceeds that of the status quo, and that the customer perceives that fewer reviews improve the process. The latter point may require education on our part.

In addition, a primary objective of merging reviews should be to minimize (if not eliminate altogether) areas of redundancy and duplication. Resource constraint issues arise in the context of having to present the exact same type of information (albeit in slightly different formats) during several different FMS reviews. Similarly, identical issues can be raised at more than one review and/or review type. In those instances, the party raising that issue should be apprised as to the most suitable review for discussing that topic. One corrective measure is to ensure correlation between the level of the issue being proposed for discussion and the review type itself (refer to Attachment 2). We must also remain reasonably flexible to address all customer concerns at a review. If issues are known in advance which are clearly outside the scope/purview of that review, the customer should be notified as to alternative venues for those discussions.

Optimal Frequency of and Timing for Conducting Reviews

The usual frequency of and timing for reviews depend in large part on the review type being considered. For all external reviews deemed necessary by both the USG and the customer, the frequency and timing must be mutually agreed upon with the FMS customer. The following reflects normal guidelines:

Policy-level
  • Frequency: Ad hoc (although some reviews are held on a regular basis, usually annually).
  • Timing: Ad hoc, usually based on determination by policy-level officials.

Country-level
  • Frequency: Annual.
  • Timing: May be driven by customer funding and budgeting timelines. Care should be taken to schedule these reviews to optimize their value to customer's internal budgeting and planning cycles.

Service-level
  • Frequency: Annual.
  • Timing: Same as Country-level.

Program-level
  • Frequency: Based on the milestone plan established during case development as referenced in the LOA (and refined over time). Refer to the following note that must be contained in all LOA documents offered after 31 March 2001 for which program reviews apply.
  • Timing: Should be event-driven based on established milestones, not necessarily calendar-driven.

Internal
  • Frequency: Ad hoc, although some internal reconciliation reviews may be held annually to comply with Attachment 4 and SAMM requirements.
  • Timing: Ad hoc.

LOA note for program-level review frequency follows:

"Program Review Schedule. The initial review schedule has been projected as follows: (specify known review events here). Future changes and/or additions to this projected schedule will be based on further program definition and will be provided through official correspondence to the FMS customer for concurrence."

In scheduling reviews, consideration should be given to customer and USG holidays, customer weekends (which are oftentimes different from ours), and changes in SAO personnel and customer leadership.

Appropriate Levels of Representation

For protocol purposes, whenever possible the rank of the lead USG review official should be equivalent to that of the customer co-chair (counterpart). All USG representatives attending FMS reviews must be knowledgeable and empowered to make on-the-spot decisions, while recognizing that some issues may require the final approval of senior management who may not be present at the review itself (which may require an action item). Those who attend the FMS reviews must be able to adequately represent their components and, consequently, speak effectively and decisively.

Representation must also reflect the type and scope of the review being held. While more senior officials may co-chair reviews of a highly visible and macro-level nature, detailed reviews such as PMRs may require the attendance of managers who are responsible for the day-to-day operation of that program/weapon system.

FMS Review Attendees

This factor addresses two aspects: (1) which components should attend each type of review, and (2) responsibilities of the attendees.

Component Attendance. Although exceptions are allowed if agenda topics dictate (and if those issues are not under the purview of the usual attendees), components are normally required for the review types as shown below:

Policy-level
  • OSD/ISA/SOLIC (USG chair)
  • State Department
  • Joint Staff
  • DSCA (potential attendee; may chair a subcommittee or working group)
  • MILDEPs/Implementing Agencies (IAs) (if requested)
  • AT&L, OUSD(C) (if requested)
  • Others as needed

Country-level
  • DSCA (USG chair)
  • MILDEPs/IAs (if required)
  • SAOs
  • DFAS (if required)
  • Other interagency (e.g., State, Commerce) (if required)

Service-level
  • MILDEPs/IAs (USG chair)
  • SAOs (if required/requested)
  • DSCA (if required)
  • DFAS (if required)
  • Contractors (if required)

Program-level
  • IAs and Program Mgmt/Executive Offices (USG chair)
  • DFAS (if required)
  • DSCA (if required)
  • SAOs (if required)
  • Contractors (if required)
  • Others as needed

Internal
  • Ad hoc, depending on nature of internal review

Attendee Responsibilities. All USG DOD officials attending FMS reviews must meet the following criteria:

  • Each attendee must have a distinct and active role in the FMS review. The applicable USG chair is responsible for ensuring that each attendee performs a separate role.
  • Every effort should be made to minimize the number of attendees while ensuring full coverage of all agenda topics. The review's location may impact the number of attendees that can be present.
  • Attendees must be fully prepared to address all agenda topics submitted in advance, and those logically anticipated to arise during the course of discussions. However, "contingency" representatives are not authorized. The USG chair is responsible for ensuring that all invited activities have agenda topics to address.
  • Attendees represent their organization, not just the specific office or activity to which that attendee reports. Understandably, actions may arise for issues not known in advance and which are outside the attendee's activity per se. In those instances, the attendee must take responsibility for ensuring follow-up with the appropriate organizational component. That said, the attendee must be knowledgeable about all issues known beforehand that pertain to the overall organizational component.
  • Attendees must be able to effectively represent their organization and speak to the issues at hand. This refers not only to the levels of representation (discussed in the preceding section), but also the ability to clearly articulate discussion topics.
  • Attendees should be selected to reflect the FMS review type that applies and the corresponding level of detail involved.

FMS Review Funding

During the IPT's research, it was found that each FMS review type is funded inconsistently. It was also discovered that the funding source depends not only on what type of review is being held, but also on which components attend and even what levels of component managers attend. Attachment 5 provides the FMS review funding matrix. If the USG requests reviews exceeding the normal timeframe shown in the preceding table, the source of funding normally would not change. However, if the FMS customer requests reviews exceeding the norm, those additional reviews could be FMS case-funded; in that situation, the USG and FMS customer should assign a mutually agreeable FMS case against which the review costs should be charged. DSCA will coordinate with OUSD(Comptroller) to ensure any rewrite of Table 718-1 of the DoD FMR, Volume 15, Chapter 7 reflects Attachment 5. We realize that extraordinary exceptions may be required to accommodate a given individual's circumstances for a specific FMS review; in those instances, the applicable FMS review advisor must be consulted for a policy exception determination.

FMS Review Reporting Format Standardization

The establishment of "boilerplate" reporting formats for each FMS review type is an important tool for eliminating inconsistencies and/or redundancies. In addition, using standard formats helps familiarize the FMS customer with our usage of data element terms, and avoids confusion that oftentimes results from presenting different formats in the same review. While standardized formats are preferred, flexibility should be retained to allow for supplemental changes and other deviations from the normal reporting structure. The standard format for use in DSCA Country-Level FMRs is provided at Attachment 6 to illustrate this point.

As essential as the format itself is consistency in defining each reporting data element. It is a source of confusion and frustration to those receiving reports in an FMS review when various reporting components use the same term (e.g., "obligations") in different ways. The development of a lexicon would assist all components responsible for preparing similar reports, and as such DSCA highly encourages that lexicons be distributed at all reviews.

General Preparation and Follow-On Requirements

The FMS review is both the culmination of extensive preparation and planning and the starting point for important follow-on requirements. The following guidelines apply to all reviews, regardless of level or hosting organization:

Preparation. The first step in planning for a review is to identify the objectives and deliverables -- refer to the foregoing discussion under "Review Value". Subsequent preparation requirements include the following:

  • Ascertain the review purpose (which review type applies?)
  • Conduct an internal FMS review planning meeting
  • Establish planning milestones to include data "cut-off" date
  • Formally announce the review (see "Communication" section below)
  • Establish an agenda
  • Determine attendees and the customer audience
  • Determine the review date and logistics (e.g., location, transportation arrangements, etc.)
  • Formulate (with FMS customer input) the agenda topics and distribute to all attendees in advance
  • Develop and publish briefing/info paper formats
  • Develop and publish reporting formats
  • Develop and publish quality control checklists applicable to briefings/info papers and reports
  • Develop Minutes preparation guidelines/format
  • Confirm how the review effort will be funded
  • Administrative: security/country clearances, threatcon briefings, disclosure, hotel/flight reservations, bios, protocol issues, social events, audio/visual requirements, cultural primers, etc.
  • Role of SAOs: for reviews hosted by the FMS customer, SAOs are expected to coordinate all administrative arrangements, secure lodging and transportation, and accommodate the visiting CONUS team to the extent practical.

Follow-on. It is expected that action items will be tasked, and other information will be required, as a result of an FMS review. The following applies:

  • Minutes preparation: the USG chair is responsible for ensuring the timely preparation of all Minutes associated with that review. This entails oversight (and, as necessary, direct involvement) of the Minutes preparation, coordination and distribution.
  • Minutes distribution: a copy must be sent to all USG components attending the review, other organizations to whom actions were assigned, the applicable DSCA Country Program Director and Country Finance Director, the SAO, and any other organizations deemed appropriate by the lead component activity. Electronic transmission of Minutes is encouraged. Minutes should be distributed within 30 days after signature.
  • Action item assignments should be distributed with the Minutes and contain the following information: who has the action (OPR); what is the action; when is the action due; and what is the reference number
  • Action item follow-on reports should be sent on a regular basis to update all OPRs on status of actions tasked during the review
  • Actions are to be completed in a timely manner; for any delays, the OPR must provide a reason and a revised estimated completion date
  • Trip reports and other internal summary reports may be required
  • Provide tentative dates/location for the next review, if appropriate, and forward that information to the FMS review advisor

Communication Channels

The degree to which the planning for, conduct of, and follow-up to reviews succeed is highly dependent on open and efficient lines of communication. For external reviews, the SAOs in particular are key players, as they are the official liaison between the FMS customer and the USG review components. The lead USG review component (i.e., the review co-chair) is responsible for ensuring these clear communication channels exist. With ever-expanding technology, communication takes both "formal" and "informal" forms. For the purpose of communicating on FMS reviews, formal encompasses frontchannel cables, letters/memoranda, and meetings with the customer. Informal includes e-mail.

Formal communication must be made on the following aspects of FMS reviews:

  • Customer's (or USG's) request to conduct a review
  • Review announcement
  • Review subject and scope
  • Restrictions/limitations (e.g., all discussions are to be held in an unclassified forum)
  • Agendas
  • Milestones
  • Administrative arrangements
  • Country/theater clearance requests and approvals
  • Funding
  • Action assignments and completions of actions

Informal communication can address the following:

  • Reporting, briefing, info paper formats
  • Checklists (including quality control)
  • Protocol issues
  • Administrative set-up
  • Taskings
  • Briefings
  • Suspense dates
  • Action item status reports

Surveys

The survey instrument is an excellent means of assessing customer satisfaction with the review just held, as well as a forum for "lessons learned" to improve future reviews. Surveys are required for all Country- through Program-level FMS reviews that commence after 31 March 2001. They are to be distributed prior to the review's closing session. Preferably, they will be returned before surveyed attendees depart; if that is not possible, a target date should be assigned by which respondents furnish the completed survey. The boilerplate survey to be used is found at Attachment 7. Modifications to the boilerplate survey may at times be warranted, to include adding survey elements addressing satisfaction with the FMS customer in a given review. DSCA encourages a central repository for survey results, possibly maintained by the applicable FMS review advisor.

FMS Review Advisors

To address policy guidance implementation queries and help ensure consistent interpretation thereof, FMS review advisors should be established for the MILDEPs/Implementing Agencies and DFAS Denver. Mr. David Rude and Ms. Vanessa Glascoe will serve as the DSCA FMS review contacts. The selected advisors should be familiar with, and serve as overall focal points for, the following:

  • Ensuring wide dissemination of this policy guidance.
  • Serving as advisor and, as necessary, assisting in the review of country/theater clearance requests prior to transmission for their cognizant organization.
  • Serving as ex-officio members of the FMS Review IPT.
  • Publishing FMS review schedules for their respective organization. DSCA will be responsible for maintaining a worldwide FMS review roster.
  • Meeting with DSCA on an ad-hoc basis on FMS review policy guidance issues.

Attachment 2
Stratification & Characteristics of FMS Review Categories

Policy Level

Associated Reviews: BWG, CG, HLDG, JMC, HLCC, MCC, DEE, SCC, SCM

Characteristics:
  • chaired above the DSCA level
  • SA/SC subcommittee to address DSCA issues
  • little/any DSCA "control"
  • national security issue/foreign policy-driven
  • format/structure driven by senior policy mgmt

Country-Level

Associated Reviews: PMR, MCRIM, FMR, TMR, SAMR

Characteristics:
  • DSCA-chaired
  • programmatic/financial and/or logistical orientation
  • higher level representation (to component country mgr)
  • customer: flag-officer or civ equiv co-chair
  • summary case-level visibility
    - case closure
    - standardized format
    - delivery status
    - excess funds
    - discrepancy resolution
  • forum to address FMS policies/procedures and SA/SC issues

Service-Level

Associated Reviews: SAR, LMR, SAMR, SACR, CMR, TSR, CRR, PMR

Characteristics:
  • MILDEP lead component chairs
  • can be oriented by customer ICS or IA
  • general status briefings: major weapon systems
  • driven by magnitude of customer and/or MILDEP issues
  • forum to address FMS policies/procedures
  • customer and MILDEP representation driven by agenda topics
  • often involves contractor personnel
  • line/contract-level detailed review

Note: Refer to the Associated Reviews Glossary at the end of this attachment for the meaning of the review acronyms listed here.

Program-Level

Associated Reviews: PMR, PCG, IPR, PARM, CMR, MCRIM, CRR, GTC, FWG, TSG, WSR, NJEP, System, AUTEC, TCG/IEMP/CIP

Characteristics:
  • MILDEP/PMO-chaired
  • covers all aspects of a specific weapon system/program/case/"family" of cases
  • line/contract-level detailed review addressing:
    - obligations/contract awards
    - expenditures
    - deliveries
    - unused funds
    - programming of current and future reqmnts
    - discrepancy resolution
  • customer represented by head of its PMO
  • driven by key milestones in program life cycle
  • often involves contractor personnel

Internal

Associated Reviews: Pre-FMS Review Mtgs, Reconciliation/Scrub

Characteristics:
  • USG-only
  • plan and prepare for external reviews
    - review draft briefings
    - identify agenda topics
    - establish milestones
    - discuss reporting formats and requirements
  • USG reconciliation
    - possibly driven by external review actions
    - to prepare for external reviews
    - to correct known discrepancies/errors
    - to expedite case closure
    - normal case management function
  • coordinate life cycle milestones, contracting actions, delivery schedules, etc. at outset of a given LOA
  • definitize USG and contractor roles/responsibilities

Associated Reviews Glossary

Acronym -- Meaning

PMR -- Program Management Review
SAR (tri-service) -- Security Assistance Review
SAR (service-level) -- Security Assistance Review
CMR (country) -- Country Management Review
CMR (case) -- Case Management Review
IPR -- Internal Program/In Process Review
CRR -- Case Reconciliation Review
FMR -- Financial Management Review
BWG -- Bilateral Working Group
HLDG -- High-Level Defense Group
HLCC -- High-Level Consultative Committee
GTC *** -- Germany Training Conference
FWG -- Functional Working Group
TCG -- Technical Coordination Group
IEMP -- Intl Engine Management Program
CIP -- Component Improvement Program
TSC *** -- Technical Steering Committee
PARM *** -- Participating Management Review
DSCA Reviews (e.g., SAR)
SACR *** -- Saudi Arabia Country Review
TSR -- Technical Service Review
PCG *** -- Program Coordination Group
SAMR -- Security Assistance Management Review
LMR -- Logistics Management Review
WSR -- Weapon System Review
MCRIM *** -- Major Cases Requiring Intensive Mgmt
NJEP -- Netherlands Jet Engine Program
Ad-hoc reviews
Internal (USG-only) reviews
F-16 reviews
TMR -- Tri-Service Management Review

*** Denotes reviews unique to a specific country

Attachment 3
FMS Review IPT Charter

  1. Establish baselines to ascertain scope of effort (number and types of reviews, levels of resources involved).
  2. Determine the criteria for determining value of/need for a given review.
  3. Reduce the number of reviews to the extent possible.
  4. Determine the optimal frequency of and timing for conducting each type of review.
  5. Determine the appropriate levels of representation required for effective outcomes at FMS reviews.
  6. Determine which DoD components are needed for each type of review.
  7. Define normal/routine levels of effort.
  8. Determine appropriate funding sources for each review type and quantity.
  9. Standardize FMS review reporting formats.
  10. Standardize delivery reporting of expenditures relating to FMS reviews.
  11. Identify common areas of duplication among review types and develop corrective proposals.
  12. Determine general preparation and follow-on requirements.
  13. Determine communication channels.
  14. Develop new and refine existing metrics that can be used for monitoring process improvement and, where feasible, metrics for which FMS reviews constitute a valid sample for the entire population.
  15. Establish policy guidance that reflects decisions made via this IPT.

Attachment 4
Dr. Hamre Memo, Foreign Military Sales (FMS) Financial Management, 13 December 1999

MEMORANDUM FOR:

Secretaries of the Military Departments
Chairman of the Joint Chiefs of Staff
Under Secretaries of Defense
Director, Defense Research and Engineering
Assistant Secretaries of Defense
General Counsel of the Department of Defense
Inspector General of the Department of Defense
Director, Operational Test and Evaluation
Assistants to the Secretary of Defense
Director, Administration and Management
Directors of the Defense Agencies
Directors of the DoD Field Activities

SUBJECT:

Foreign Military Sales (FMS) Financial Management

Recent audit reports have identified a number of FMS management problems that manifest themselves in inaccurate or delayed financial management transactions. At my direction, a review of FMS processes impacting financial management was conducted. This effort was led by the Office of the Under Secretary of Defense (Comptroller) [OUSD(C)] and the Defense Security Cooperation Agency (DSCA), with participation by the Office of the Under Secretary of Defense (Acquisition, Technology and Logistics) [OUSD(AT&L)], the Military Departments, the Defense Logistics Agency, and the Defense Finance and Accounting Service (DFAS). The review produced a number of recommendations with the potential to improve financial management in the near-term. I have approved those recommendations and am directing their implementation through the actions contained in the attachment.

The attached actions are intended to reduce work load, eliminate erroneous payments, lower operating costs, permit FMS cases to be closed sooner, accelerate reimbursements to the Department and the U.S. Treasury, and ensure better customer satisfaction. Within 90 days from the date of this memorandum, the USD(AT&L), Heads of the DoD Components, and Directors of DFAS and DSCA are directed to report their progress on the attached actions to the USD(C). Your cooperation in implementing these rules is appreciated.

John J. Hamre


ATTACHMENT:

Attachment
Foreign Military Sales (FMS) DEPSECDEF Directed Actions

The Under Secretary of Defense (Acquisition, Technology and Logistics) (USD(AT&L)) is directed to:

  • Require that FMS contract line items be closed out as soon as the closeout requirements for those line items are satisfied. The closeout of FMS contract line items should not be delayed while waiting for requirements to closeout other non-FMS contract line items to be satisfied.
  • Require one Contract Line Item Number (CLIN) per one Accounting Classification Reference Number (ACRN) for each FMS requirement on a contract.
  • Emphasize the requirement that Defense Federal Acquisition Regulation Supplement (DFARS) clause 252.232-7002 is to be included in all contracts involving FMS. (This clause requires contractors to bill separately for each FMS customer.)

The Under Secretary of Defense (Comptroller) (USD(C)) is directed to:

  • Revise the "DoD Financial Management Regulation" ("DoDFMR") to allow the use of an "estimated" price code in reporting the deliveries of major end items if an actual price code is not available within 30 days of date of shipment and require the use of an "estimated" price code in reporting the deliveries of major end items if an actual price is not available within 90 days of date of shipment.
  • Revise the "DoDFMR" to require payment schedules to be updated annually on the anniversary of each major case and/or when the value of a case increases by 10 percent or more.
  • Revise the "DoDFMR" to require that cases be reconciled financially and logistically on at least an annual basis, preferably on the anniversary of each major case.

The Director, Defense Finance and Accounting Service (DFAS), is directed to:

  • Establish, with participation from the DoD Components, a tiger team to troubleshoot problems at locations that have major FMS delivery reporting and/or related reconciliation problems. The tiger team shall review reasons for significant reporting delays at such locations, identify and implement solutions, and augment training of personnel at such locations, as appropriate.
  • Promote the maximum use of the authority to eliminate minor unresolved transactions, up to the approved threshold of $200 per transaction using the FMS Administrative Account as the funding source.
  • Resolve on a one-time basis, in cooperation with the DoD Components, problem disbursements aged over 180 days valued up to $1,000 per transaction, using up to $2.2 million provided by DSCA from the FMS Administrative Account. The current number of those transactions is approximately 8,800. The $2.2 million funding and $1,000 threshold are available only for the remainder of FY 2000.
  • Provide a quarterly report to the Military Departments of FMS case payment schedule variances.

The Director, Defense Security Cooperation Agency (DSCA), is directed to:

  • Expand the ongoing FMS reinvention effort to include representatives of the USD(AT&L), USD(C), DFAS, and representatives of the security assistance, financial management, acquisition and logistics communities within the Military Departments and DLA, and report the results of applicable meetings within 30 days from the date of this memorandum.
  • Ensure the ongoing FMS reengineering effort addresses: (1) clarification of organizational responsibilities; (2) roles, responsibilities and authorities of case managers; (3) funds control, to include fiscal accountability responsibilities among DSCA, DFAS and various DoD Components; (4) recommendation of a permanent dollar threshold for minor unresolved transactions that can be charged to the FMS Administrative Account, whether additional types of transactions should be eligible to be charged to the FMS Administrative Account, and estimates of the annual financial impact of any proposed revisions to the current policy; (5) the feasibility of eliminating the Letter of Offer and Acceptance (LOA) for funded consumable item requisitions; and (6) the feasibility of FMS customers using commercial debit/purchase cards for consumable items.
  • Provide additional funding authority up to $2.2 million on a one-time basis from the FMS Administrative Account to the DFAS to resolve problem disbursements aged over 180 days with a value of up to $1,000 per transaction. The current number of those transactions is approximately 8,800. The $2.2 million funding and $1,000 threshold are available only for the remainder of FY 2000.
  • Revise the Security Assistance Management Manual (SAMM) to explicitly encourage consolidation of small dollar requirements under one LOA per country.
  • Widely disseminate metrics developed for FMS performance measurement in the areas of LOA Processing, Delivery Reporting, Disbursements and Case Closure.
  • Articulate priorities for case execution activities funded by the FMS administrative budget so that the DoD Components are able to prioritize their activities.
  • Direct the Defense Institute of Security Assistance Management (DISAM) to update and expand security assistance training curricula to reflect changes to policies and procedures directed in this memorandum.
  • Designate DISAM to be the repository for, and direct DISAM to facilitate the sharing of, security assistance best practices within the Department.

The heads of the DoD Components are directed to:

  • Report physical deliveries of items or performance of services to the DFAS Denver Center within the 30-day timeframe specified in the "DoDFMR."
  • Ensure compliance with the revision to the "DoDFMR" that allows the use of an "estimated" price code in reporting the deliveries of major end items if an actual price is not available within 30 days of date of shipment and requires the use of an "estimated" price code in reporting the deliveries of major end items if an actual price is not available within 90 days of date of shipment.
  • Require that FMS contract line items be closed out as soon as the closeout requirements for those line items are satisfied. DoD Components should not delay the closeout of FMS contract line items while waiting for requirements to closeout other non-FMS contract line items to be satisfied.
  • Establish, with the DFAS, a tiger team to troubleshoot problems at locations that have major delivery reporting problems. The tiger team is to review reasons for significant delivery reporting delays at those locations, identify and implement solutions, and augment training of personnel at those locations with respect to delivery reporting and reconciliation.
  • Promote the maximum use of the authority to eliminate minor unresolved transactions, up to the approved threshold of $200 per transaction using the FMS Administrative Account as the funding source.
  • Resolve on a one-time basis, in cooperation with DFAS, problem disbursements aged over 180 days valued up to $1,000 per transaction, using up to $2.2 million provided by DSCA from the FMS Administrative Account. The current number of those transactions is approximately 8,800. The $2.2 million funding and $1,000 threshold are available only for the remainder of FY 2000.
  • Distribute performance metrics provided by DSCA in the areas of LOA Processing, Delivery Reporting, Disbursements, and Case Closure to each level from senior security assistance officials to case managers, and across the functional disciplines of security assistance, financial management, acquisition, and logistics throughout each DoD Component, and require the use of those metrics.
  • Participate in the DSCA reengineering effort to include providing representatives from the functional areas of security assistance, financial management, acquisition and logistics.
  • Revise security assistance training curricula to reflect changes in policies and procedures directed in this memorandum and expand training opportunities for all personnel involved in FMS processes.
  • Develop strategic training plans and submit those plans to the DSCA/DISAM Curriculum Committee for planning purposes.
  • Ensure compliance with all portions of the Security Assistance Management Manual and "Department of Defense Financial Management Regulation" applicable to security assistance.

Attachment 5
FMS Review Funding Matrix

Review Type/Category | Admin Funded | Case Funded | How Often? | Remarks/Comments
Policy-level | X | | Ad Hoc |
Tri(All) Service/Country-level | X | | Annual |
- If DSCA attends | X | | Annual | [1]
- If MILDEP sr mgmt attends | X | | Annual | [1]
- If MILDEP country mgr attends | X | | Annual | [1]
- If MILDEP case mgr attends | X | X | Annual | [2]
- If DFAS attends | X | | Annual | [1]
- If SAO attends | X | | Annual | [1]
Service-level | X | | Annual | [1]
- If DSCA attends | X | | Annual | [1]
- If MILDEP sr mgmt attends | X | | Annual | [1]
- If MILDEP country mgr attends | X | | Annual | [1]
- If MILDEP case mgr attends | X | | Annual | [2]
- If DFAS attends | X | | Annual | [1]
- If SAO attends | X | | Annual | [1]
Program-level | X | | Ad Hoc |
- If DSCA attends | X | | Ad Hoc |
- If MILDEP sr mgmt attends | X | | Ad Hoc |
- If MILDEP country mgr attends | | X | Ad Hoc |
- If MILDEP case mgr attends | | X | Ad Hoc |
- If DFAS attends | X | | Ad Hoc |
- If SAO attends | X | X | Ad Hoc | [4]
Internal reconciliation | X | | Ad Hoc |
- If DSCA attends | X | | Ad Hoc |
- If MILDEP sr mgmt attends | X | | Ad Hoc |
- If MILDEP country mgr attends | X | | Ad Hoc |
- If MILDEP case mgr attends | X | | Ad Hoc |
- If DFAS attends | X | | Ad Hoc |
- If SAO attends | X | | Ad Hoc | [3]
Internal periodic review | X | | Ad Hoc |
- If DSCA attends | X | | Ad Hoc |
- If MILDEP sr mgmt attends | X | | Ad Hoc |
- If MILDEP country mgr attends | X | | Ad Hoc |
- If MILDEP case mgr attends | X | | Ad Hoc |
- If DFAS attends | X | | Ad Hoc |
- If SAO attends | X | X | Ad Hoc | [3]
Internal FMS review planning | | | Annual |
- If DSCA attends | X | | Annual | [1]
- If MILDEP sr mgmt attends | X | | Annual | [1]
- If MILDEP country mgr attends | X | | Annual | [1]
- If MILDEP case mgr attends | | X | |
- If DFAS attends | X | | Annual | [1]
- If SAO attends | X | | Annual | [3]
Other (specify below) | X | | |
- Payment schedules | X | | |
- Financial/logistical recon | X | | |
- Delivery reporting | X | | |
Internal contractors: funded based on how their salaries are paid, except for external program reviews.
External contractors: incorporated into the governing contracts and appropriate LOA lines.

Notes:
[1] If customer requests more than one review per year, those additional reviews could be case funded.
[2] Case funded if the case manager is attending to represent a specific case, weapon system or group of cases.
[3] Usually not applicable for SAOs to attend these. DSCA policy memo 00-15 applies.
[4] SAO travel and per diem cost funding source should consider DSCA policy memo 00-15 dtd 12 Oct 2000.

Attachment 6
Financial Management Review (FMR) Case Financial Status Reporting Format

U.S. -- (Country) 2000 Financial Management Review Case Financial Status Reporting Format
Data as of: 31 August 2000 (unless specified otherwise)

(Format: Item -- Source(s)/Definition. Data/Values are entered for each item at the review; items shown as "0.00" are formula driven.)

Case Summary
  • Case Designator -- IA System/LOA documents
  • Case Description -- IA System/LOA documents
  • Year LOA Signed -- IA System/LOA documents
  • Total Number of Lines -- IA System/LOA documents

Supply Summary
  • Total Delivered Value -- IA System/LOA documents; do NOT use DIFS as its delivered value reflects shipments already billed. Definition: Articles/Services deliveries plus delivered admin and delivered accessorials.
  • Total Number of Open Requisitions -- IA System. Note: For those systems that do not/cannot track open requisitions, furnish a definition for the data being provided in this field (e.g., PDLIs outstanding).
  • Total Open Requisition Value -- IA System
  • Estimated/Actual Case Supply/Services Completion Date -- IA Case Manager. Enter in MM/YY format.

SDR Summary
  • Total Number of Open SDRs -- IA System
  • Total Open SDR Value -- IA System

Closure Summary
  • Estimated Case Closure Date -- IA Case Manager, in coordination with primary closure POC. Enter in MM/YY format.

Case Financial Summary
  • (1) Total LOA Value -- IA System/LOA documents
  • (2) Total Net LOA Value -- IA System/LOA documents
  • (3) Highest Financial Requirement -- IA System/LOA documents. Definition: All financial commitments billed to date PLUS all financial commitments not yet billed (e.g., contracts awarded but not delivered). Restated, total value of all programmed requirements. Must include below-the-line surcharges.
  • (4) Total Collected through 15 September 2000 -- DSCA DLO report via e-mail
  • (5) Estimated Excess LOA Value [(1) - (3)] -- 0.00; formula driven, do not override with manual data entry.
  • (6) Estimated Excess Collections [(4) - (3)] -- 0.00; formula driven, do not override with manual data entry.
  • (7) Forecasted Activity (Note: Applies predominantly to cases not yet supply complete.)
  • (7A) Disbursements through 31 July 2000 -- DSCA DLO report via e-mail
  • (7B) Projected Expenditures -- IA Case Manager
  • (7B1) August-September 2000
  • (7B2) October-December 2000
  • Subtotal, August-December 2000 -- 0.00; formula driven, do not override with manual data entry.
  • (7B3) January-March 2001
  • (7B4) April-June 2001
  • (7B5) July-September 2001
  • (7B6) October-December 2001
  • Subtotal, January-December 2001 -- 0.00; formula driven, do not override with manual data entry.
  • (7B7) January-March 2002
  • (7B8) April-June 2002
  • (7B9) July-September 2002
  • (7B10) October-December 2002
  • Subtotal, January-December 2002 -- 0.00; formula driven, do not override with manual data entry.
  • (7C) Total Projected Expenditures through 31 December 2002 -- 0.00; formula driven, do not override with manual data entry.
  • Remarks/Comments -- IA Case Manager/CPM/CCM; Cell (A76) is a wrap text field.
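
To illustrate how the two formula-driven excess fields relate to items (1), (3), and (4) above, consider a worked example with purely hypothetical figures (not drawn from any actual case): if (1) Total LOA Value is $10.0M, (3) Highest Financial Requirement is $9.2M, and (4) Total Collected is $9.5M, then (5) Estimated Excess LOA Value = (1) - (3) = $10.0M - $9.2M = $0.8M, and (6) Estimated Excess Collections = (4) - (3) = $9.5M - $9.2M = $0.3M.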

Attachment 7
FMS Review Survey

FMS Review Title:

Please take a few moments to fill out this important survey so that we can assess and, where needed, improve the FMS review process. Circle the rating that you feel best applies. For numeric ratings, a "1" is a low or very poor assessment, while a "5" is a high or very favorable assessment. In addition, you are encouraged to provide any written remarks or elaborate on your opinions at the bottom of this form. Providing your name and the component you represent is strictly voluntary. All responses will be treated on a non-attribution basis. Thank you!

Each survey item below is rated on the scale: 1 -- 2 -- 3 -- 4 -- 5 -- N/A.

  • Preparation of the teams
  • Coverage of the agenda topics submitted in advance
  • Completeness of answers provided to questions raised
  • Accuracy of the data presented
  • Thoroughness of the data presented
  • Helpfulness of the teams to find solutions
  • Extent to which actions from the previous meeting were completed
  • Satisfaction with the timeliness in which actions from the previous meeting were completed
  • Knowledge level of the teams
  • Professionalism of the teams
  • Levels of representation of the teams
  • Number of attendees
  • Briefings, informational and/or educational materials, and other information presented
  • Review of the Minutes
  • Accommodations and other administrative arrangements made
  • Meeting location and conference room facilities
  • Administrative support provided in response to requests
  • Importance of the same review to be held in the future
  • Overall level of satisfaction with this review

Other Comments: