J.C.N. Construction Co. v. United States

60 Fed. Cl. 400 | Fed. Cl. | 2004

OPINION AND ORDER1

LETTOW, Judge.

This post-award bid protest is before the Court on plaintiff’s motions for a preliminary injunction and for judgment upon the administrative record and defendant’s cross-motion for judgment upon the administrative record. Plaintiff, J.C.N. Construction Company, Inc. (“JCN”), challenges the government’s awards to five other offerors of contracts under a multiple award construction contract (“MACC”). This MACC encompassed a range of projects to be constructed at Navy installations in the Northeastern United States over a five-year period, with such projects individually expected to range in cost from $25,000 to $1.7 million. The request for proposals (“RFP” or “solicitation”) called for submissions particularly related to a “seed project” which the contracting agency, the United States Naval Facilities Engineering Command (“Navy”), said it would use as the basis for evaluating all proposals. The Navy designated five awardees, with the seed project specifically awarded to Patel Construction Company, Inc. (“Patel”), the intervening defendant in this case. Subsequent task orders under the MACC were to be subject to competition among the five contractual awardees. JCN filed a protest of the awards before the General Accounting Office (“GAO”), but GAO denied relief. After JCN’s complaint was filed in this Court, the Navy agreed to withhold the issuance of task orders under the MACC other than the seed project until April 16, 2004.

Pursuant to Rule 65(a)(2) of the Rules of the Court of Federal Claims (“RCFC”), a hearing on plaintiff’s application for a preliminary injunction was consolidated with a hearing on the merits of the claim for an injunction and the cross-motions for judgment upon the administrative record, and the consolidated hearing was held on March 19, 2004. For the reasons set out below, the Court denies JCN’s application for a preliminary injunction, request for injunction, and motion for judgment upon the administrative record and grants the government’s cross-motion for judgment upon the administrative record.

BACKGROUND

A. The Contract Solicitation

The Navy’s solicitation was issued on April 8, 2003 and was labeled “Maine Multiple Award Construction Contract.” Compl. at 1. It sought proposals for indefinite-delivery, indefinite-quantity construction projects, including design-build projects, at Naval installations located in Maine, New Hampshire, and Vermont. AR 29-30.2 The solicitation contemplated awards to up to five offerors of contracts for a base period of one year with four subsequent one-year options during which the Navy could place separate task orders for projects. Id. at 29. Individual projects were estimated to cost between $25,000 and $1.7 million, with a maximum total of $30 million for all task orders placed during the full performance period. Id.

The “solicitation also incorporate[d] a seed project upon which all proposals [would] be evaluated for award.” Id. That initial project was for construction of a working dog kennel at the Naval Air Station, Brunswick, ME. AR 30. The dog kennel would house up to eight *402 dogs and would “be an enclosed building including spaces for kennel compartments, food preparation, work/treatment area, storage and mechanical room.” Id. The estimated price range for the dog kennel was $250,000 to $500,000. Id. The offeror found to be the best overall value to the government based on price and certain technical evaluation subfactors would be awarded the contract for the seed project and be one of the up-to-five contract awardees. AR 44. Each awardee not given the seed project would receive a guaranteed minimum of $5,000. Id. Best-value source selection procedures would also be applied in selecting the other awardees. Id.

The solicitation procedures called for offerors to make separate, severable proposals regarding price and technical aspects of the proposed construction work. AR 36. The only price to be submitted by offerors related to the seed project because future task orders were to be the subject of subsequent competition on technical and price grounds. Offerors’ technical proposals would be evaluated under four equally weighted subfactors — Relevant Past Performance, Management Approach, Safety Record, and Commitment to Small Business Concerns — the combination of which would be considered approximately equal to price. AR 36-43; see also AR 17 n. 2. The subfactors would be assigned one of the following adjectival ratings: exceptional, very good,3 acceptable,4 deficient but correctable, or unacceptable. AR 43.5 Respecting evaluation of offerors’ technical proposals, the solicitation provided that “technical evaluation board members will ... assess each factor of each proposal as indicated in the RFP as [it] relate[s] to the seed project.” Id. The offerors’ price proposals for the seed project were to be “evaluated for reasonableness and the offerors[’] ability to be competitive on future projects.” AR 44.

A Source Selection Plan (“Plan”) governed the Navy’s evaluation of proposals received in response to the solicitation. The source selection organization was headed by the Source Selection Authority (“SSA”), who appointed two advisory teams — a Technical Evaluation Team (“TET”) and a Source Selection Team (“SST”) — with the latter team including a Price Analyst. AR 777-78, 801. The TET was responsible for evaluating and rating each technical proposal against the four technical subfactors, preparing a written report, and presenting an oral briefing to the SST. AR 804-05. Correspondingly, the Price Analyst was obliged to analyze each price proposal, prepare a written evaluation, and brief the SST and SSA. AR 803. The SST was assigned the duties of developing the Plan, receiving briefings from the TET and Price Analyst, and recommending to the SSA final technical and price ratings, the competitive range, and award. AR 802-03. The SSA was then responsible for approving the Plan, considering the teams’ recommendations, and exercising her independent judgment in approving award of the MACC. AR 801-02.

B. The Navy’s Consideration of the Proposals

Seventeen offerors, including JCN, submitted timely proposals, the technical aspects of which were evaluated by the TET between May 13 and July 2, 2003. AR 960. The *403 TET’s initial evaluation ranked offerors into bands based on overall technical ratings. AR 963. Proposals rated Exceptional or Very Good were placed in the first band; those rated Acceptable were placed in the second band; and proposals containing at least one subfactor rated Unacceptable were relegated to the third band. AR 963-64. JCN received an overall rating of Acceptable and was accordingly placed in the second band. AR 963. Patel, the ultimate awardee of the seed project, was also placed in the second band, receiving an overall rating of Deficient but Correctable with a preliminary minimum rating of Acceptable. The Source Selection Board Report explained that the preliminary ratings were subject to adjustment:

Some ratings are Deficient but Correctable (DC) with a preliminary/anticipated minimum rating in parentheses. The deficiencies for the firms in the top two bands are considered minor and easily correctable. The final ratings may actually be higher or lower than the preliminary ratings.... The TET will finalize the ranking once the firms have been given the opportunity to correct their deficiencies, and the TET evaluated that information.

AR 963.

JCN’s estimated price for the seed project was [* * *], the third lowest of the fifteen submitted.6 AR 964. Patel’s price, $345,775, was the lowest of all offerors. Id. The government’s contemporary estimate for the seed project was $381,475, but the SST indicated that this estimate was inaccurate and would most likely be increased by approximately $100,000. AR 964-65. The initial technical and price evaluations eliminated six offerors, with JCN surviving among the nine remaining competitors. AR 965.

The remaining offerors submitted revised technical proposals in response to deficiencies and weaknesses identified by TET in its initial report. AR 1031. The TET assigned revised overall ratings to these offerors and classified them by bands. AR 1031-32. The first band listed offerors whose proposals were rated Exceptional, the second band consisted of those offerors whose proposals were considered Very Good or potentially Very Good, and those in the third band were rated Acceptable or potentially Acceptable. AR 1032. JCN’s technical rating did not change, and it was ranked seventh of nine and placed first in the third band. Id. Patel’s rating likewise remained the same, and it was ranked eighth overall and second in the third band. Id. Patel was given a final chance to revise its technical proposal to correct its deficiencies in addition to submitting its final offer. AR 1035.

Revisions to price were also submitted, and JCN lowered its price by [* * *] to [* * *]. AR 1034. Patel did not revise its price, which the SST noted “was significantly lower than the revised Government estimate” of $429,084. Id. Patel’s price remained the lowest of the nine offerors. Id.

The TET reconvened a final time on September 11, 2003, to evaluate the offerors’ final revisions to their proposals. AR 1065. Among other things, it “reviewed [a] supplemental letter received from J.C.N. Construction Co. and determined there was no significant new information in the letter as compared with the firm’s previous proposal.” Id. Accordingly, the TET did not revise JCN’s ratings or overall technical ranking, and JCN remained seventh of nine offerors and in the third band. AR 1066. Respecting Patel, the TET revised its overall technical ranking to Acceptable, finding its deficiencies corrected based on Patel’s responses to the TET’s questions. AR 1045-46. Patel ranked ninth on the technical factor, the lowest offeror in the third band. AR 1066.

Patel did not revise its price, which at $345,775 remained the lowest offer. AR 1067. JCN further lowered its price by [* * *] to $545,700. Id. The Navy revised its estimate for the seed project to $459,000, based on a projected increase in labor costs. AR 1066.

*404 The final overall technical ratings and prices of each of the nine offerors were as follows (ordered by price from lowest to highest):

OFFEROR              FINAL PRICE   FINAL OVERALL TECHNICAL RATING
Patel Construction   $345,775      Acceptable
MCC                  $512,285      Exceptional
Zachau               $521,300      Very Good
Diversified          $543,217      Very Good
JCN                  $545,700      Acceptable
Haskell              $558,200      Exceptional
[offeror 7]          $590,925      Acceptable
[offeror 8]          $594,450      Exceptional
[offeror 9]          $614,400      Very Good

AR 1066-67. Based on these results, the SSA approved the SST’s recommendation that Patel represented the best value to the government and therefore should be awarded the seed project. AR 1068. The most competitive other offeror in the SST’s view was MCC Construction Corporation (“MCC”), which had a technical rating of Exceptional and the first-place technical ranking. Id. However, the SST reasoned that this technical ranking did not justify paying MCC’s premium of $166,510 for the seed project. Id. The SSA also adopted the SST’s recommendation that a MACC be awarded to MCC, Zachau Construction, Inc., Diversified Technical Consultants, Inc., and The Haskell Company (“Haskell”), as well as to Patel. Id. The SST reasoned that Haskell’s overall rating as Exceptional and second place technical ranking, as compared to JCN’s overall rating as Acceptable and seventh place technical ranking, was worth Haskell’s $12,500 premium. AR 1068-69. The award letters were issued on September 24, 2003. AR 1072-97.

C. Post-Award Proceedings

JCN filed a post-award protest with GAO on October 17, 2003. AR 1203-07.7 GAO issued a partial dismissal of JCN’s protest on November 12, 2003, AR 1197-98, and denied the remainder of the protest on January 9, 2004. AR 1-5. Thereafter, JCN filed this post-award review action in this Court on January 29, 2004. JCN seeks a declaratory judgment that the Navy’s award of the five contracts was arbitrary and capricious and contrary to law and should therefore be vacated, along with orders preliminarily and permanently enjoining the Navy from proceeding with further awards under the MACC and directing the Navy to make another source selection decision consistent with the solicitation. Compl. at 9-10 (Prayer for Relief).

The Court granted JCN limited discovery to address two claims not addressed in the administrative record, and the administrative record was supplemented in accord with the results of that discovery.8 Patel was granted *405 leave to intervene as a defendant, and, as previously noted, a hearing on JCN’s application for a preliminary injunction was consolidated with a hearing on the merits of the claim for an injunction and the cross-motions for judgment upon the administrative record.

ANALYSIS

JCN alleges that the Navy improperly evaluated JCN’s technical proposal and conducted a flawed best-value analysis in violation of applicable statutes and regulations.

A. Standard of Review

The Tucker Act requires this Court to apply the standards prescribed by the Administrative Procedure Act, 5 U.S.C. § 706, in reviewing an agency’s procurement decision. 28 U.S.C. § 1491(b)(4).9 Under those standards, the reviewing court determines, among other things, whether the agency action was “arbitrary, capricious, an abuse of discretion, or otherwise not in accordance with law.” 5 U.S.C. § 706(2)(A). An agency’s bid award should be reversed only where “(1) the procurement official’s decision lacked a rational basis; or (2) the procurement procedure involved a violation of regulation or procedure.” Impresa Construzioni Geom. Domenico Garufi v. United States, 238 F.3d 1324, 1332 (Fed.Cir.2001). The Federal Circuit has explained:

When a challenge is brought on the first ground, the courts have recognized that contracting officers are entitled to exercise discretion upon a broad range of issues confronting them in the procurement process. Accordingly, the test for reviewing courts is to determine whether the contracting agency provided a coherent and reasonable explanation of its exercise of discretion, and the disappointed bidder bears a heavy burden of showing that the award decision had no rational basis. When a challenge is brought on the second ground, the disappointed bidder must show a clear and prejudicial violation of applicable statutes or regulations.

Id. at 1332-33 (internal citations and quotation marks omitted).

To support equitable relief, the plaintiff must show that it was prejudiced by significant error in the procurement process. Advanced Data Concepts, Inc. v. United States, 216 F.3d 1054, 1057 (Fed.Cir.2000). In that respect, the disappointed offeror must show that there is a substantial chance that it would have received the award but for the faulty evaluation. Id.

In its analysis, the Court must focus on the procurement record, see Camp v. Pitts, 411 U.S. 138, 142, 93 S.Ct. 1241, 36 L.Ed.2d 106 (1973), and must view critically any post hoc rationalization. See Citizens to Preserve Overton Park, Inc. v. Volpe, 401 U.S. 402, 420, 91 S.Ct. 814, 28 L.Ed.2d 136 (1971); Vermont Yankee Nuclear Power Corp. v. Natural Res. Def. Council, Inc., 435 U.S. 519, 98 S.Ct. 1197, 55 L.Ed.2d 460 (1978); Co-Steel Raritan, Inc. v. International Trade Comm’n, 357 F.3d 1294, 1316 (Fed. Cir.2004). The Court is mindful that it “is not empowered to substitute its judgment for that of the agency,” Overton Park, 401 U.S. at 416, 91 S.Ct. 814, but rather its inquiry should focus on whether the Navy considered the relevant factors and made a rational determination. Advanced Data Concepts, 216 F.3d at 1057-58.

*406 B. The Navy’s Evaluation of JCN’s Technical Proposal

JCN’s primary claims for relief center on the Navy’s evaluation of its technical proposal under two of the technical subfactors: (1) Relevant Past Performance and (2) Management Approach. With respect to both subfactors, JCN asserts that it should have been rated “Very Good” rather than merely “Acceptable.”

1. The relevant past performance subfactor.

The solicitation emphasized that the Navy would evaluate each offeror’s relevant past performance, i.e., work on prior projects would be weighed in part according to how closely it correlated to the Navy’s selection criteria. The Navy specified several criteria pursuant to which it would evaluate performance on past projects that were comparable to those to be covered by the MACC, as well as the ability to manage more than one such project at the same time. The Navy said it would evaluate an offeror’s

past performance on the basis of its past projects in two separate areas. First, submit examples of completed past projects which are similar to the magnitude and complexity of the seed project that include all major trades of construction. Secondly, the government is also interested in the offeror’s ability to simultaneously and successfully manage multiple new construction, repair, demolition, design/build and renovation projects that are completed in the $25,000.00 to $1.7 million-dollar range.

AR 37. The Navy indicated that it was interested in work on recent projects and specifically noted that “[i]nformation required on Forms 1A [Offeror’s List of References], 1B [Offeror’s Past Performance/Reference,] and 1C [Letter of Introduction and Questionnaire] shall be used to describe and provide reference information for all projects performed within the time period from 1 January 1998 through the date initial proposals are due (projects must be complete) that demonstrate [an offeror’s] ability to successfully perform” construction projects similar to the nature of the seed project and the anticipated task orders. Id. The technical review was to be wide ranging: “[t]he currency and relevance of the Past Performance information, source of the information, context of the data, and general trends in [a] contractor’s performance shall be considered.... The Source Selection Authority shall determine the relevance of such information.” AR 38. In that regard, potential offerors were informed that “[t]he evaluation of past performance on completed projects will be a subjective assessment based on a consideration of all relevant facts and circumstances.” AR 37 (emphasis added).

In addition to the selection criteria set out explicitly in the solicitation, the Navy developed a set of detailed criteria, in question format, to guide its evaluation. As the first TET report stated, “[t]he technical proposals were reviewed against all of the required technical evaluation criteria, taken from the RFP [and] also found in the source selection plan. The following 34 evaluation criteria were used in the review of each Offeror’s proposal.” AR 839. The TET’s list of thirty-four internal criteria, expressed primarily in the form of questions, included twenty questions pertaining to the past-performance subfactor. AR 839-42.

The solicitation limited each offeror to twenty references. AR 37. JCN’s technical proposal contained eighteen references, each of whom submitted a Form 1C providing his or her own evaluation of JCN’s performance with respect to a particular past project. See AR 1428-1650. The TET evaluated these references against the twenty past-performance evaluation questions. The TET concluded JCN’s references ranged “from fair to exceptional[, and] [m]ost are from Acceptable (4), to VG (7), to Exceptional (4).” AR 1000. The TET ranked JCN as Acceptable for past performance. AR 983.

Notably, in developing its performance ratings for some offerors, the TET also relied on past-performance data obtained from the Construction Contractor Appraisal Support System (“CCASS”), a computer database maintained by the Army Corps of Engineers, but it did not do so for JCN. See AR 840; see also supra, at 404 n. 7. One of the twenty questions used internally by the TET, Question 15, asked, “[w]as the Offeror’s past performance available from any CCASS *407 information? If so what additional information did this provide?” AR 840. In answering this question for JCN in its summary table, the TET explained only, “no CCASS forms available.” AR 1000.

Based on these facts, the primary thrust of JCN’s challenge to the Navy’s evaluation of its proposal under this subfactor is that “there is a disconnect” between JCN’s overall Very Good rating by its references and its overall Acceptable rating by the Navy. Pl.’s Summ. J. Mem. at 13. Second, JCN argues that the Navy’s failure to find and use CCASS data relating to its past performance evidences disparate treatment. Id. at 24-25; Pl.’s Reply at 8-9 & n. 9; Pl.’s Supp. Br. at 4-5. Finally, JCN argues that the Navy failed adequately to document its rating of JCN under several of the twenty evaluation questions. Pl.’s Summ. J. Mem. at 13, 16-18, 20-21, 23-27; Pl.’s Reply at 3-4 & n. 3, 7-10.

a. The Navy’s evaluation of JCN’s prior work.

First, JCN is manifestly correct that there was a difference between the ratings given by its references and the Navy’s evaluations of those references. However, JCN’s references evaluated JCN based on their own expectations regarding its performance on a particular project. By contrast, the Navy reviewed those projects and evaluations against its specific selection criteria. The difference in rating criteria was significant for the Navy.

JCN takes particular issue with the TET’s rating of Acceptable both under Question 1, which addressed the similarity of its past projects in magnitude and complexity to the seed project, and under Question 10, which concerned whether its past projects demonstrated its ability successfully to perform anticipated task orders ranging in price from $25,000 to $1.7 million. Pl.’s Summ. J. Mem. at 14-15, 21-22; Pl.’s Reply at 5-6. In response to both questions, the TET commented that most of JCN’s projects ranged from $500,000 to well over $1 million with few under $500,000. JCN contests the TET’s comment by noting that four of its eighteen references were for projects under $500,000, and for those it had received from its references three Exceptional ratings and one Very Good rating. In these circumstances, JCN asserts that “the unreasonableness of the TET’s evaluation results approach[es] lud[i]cr[osit]y.” Pl.’s Summ. J. Mem. at 22.

JCN further points out that it was limited to twenty reference projects and that it had been increasingly successful in building its business through projects of larger size. It argues that it ought not be penalized for its success. Hr’g Tr. at 15 (Mar. 19, 2004). However, this very focus on larger projects legitimately could be taken into account by the TET as a negative element in making its evaluation. The solicitation specified that the first criterion on which proposals would be evaluated would be similarity to the magnitude and complexity of the seed project, which the solicitation estimated to cost within the $250,000 to $500,000 range. AR 30, 37. JCN protests that “[t]here was no indication in the solicitation that the Navy desired more or less [than four] project references under the $500,000 range.” Pl.’s Summ. J. Mem. at 21. And, indeed, there was no such explicit indication, but by putting forward only four reference projects of the size of the seed project, JCN put itself at risk that the TET’s own evaluation of those projects might be both less favorable to JCN and given more weight than the TET’s evaluation of the more numerous larger reference projects JCN had cited. The divergence between the ratings assigned by JCN’s references and that assigned by the TET does not constitute an error of any significance to this procurement.

b. The Navy’s disparate reliance on CCASS data.

Second, JCN is also correct that the Navy treated certain offerors differently from others insofar as CCASS data were concerned. Because the administrative record was silent regarding why the Navy did not find and use CCASS data for JCN and some other offerors, the Court directed the parties to supplement the record with information about the CCASS system and the CCASS data that were available. See supra, at 404 n. 7. The additional materials and the parties’ supplemental briefs clarify how the CCASS system is supposed to work, but they do not explain *408 why it did not function as intended in this procurement.

The CCASS regulations are part of the FAR. Under the regulations, a contracting agency is required to prepare a performance evaluation report for each construction contract of at least $500,000. FAR § 36.201(a)(1). Department of Defense agencies are required to use DD Form 2626 in preparing a performance evaluation report and to “[s]end each contractor performance evaluation report to the central data base immediately upon its completion.” Department of Defense Federal Acquisition Regulation Supplement (“DFARS”) [48 C.F.R.] § 236.201(a), (c). Contracting agencies are required to complete the report “at the time of final acceptance of the work, at the time of contract termination, or at other times, as appropriate, in accordance with agency procedures.” FAR § 36.201(a)(2). The Naval Facilities Engineering Command’s procedures require the preparation and submission of a performance evaluation report to the CCASS “within 60 days after [the] Useable Completion Date (UCD) [of a construction project].” AR 2016 (NAVFAC P-68, 42.1503-100(b)(2)(iii)).10 During this sixty-day time period, a copy of the report is to be distributed to the contractor. AR 2006-07 (NAVFACINST 4335.4 ¶ 4.c.(2)(c)). Such distribution to the contractor is a step that must be accomplished as a prerequisite to submission of the form to the CCASS. AR 2009 (NAVFACINST ¶¶ 5.d.(1), (2)). Contracting agencies are required to keep evaluation reports on file in the CCASS for six years, FAR § 36.201(c)(1); DFARS § 236.201(c)(1)(A)(2), but “only performance evaluations entered in the past three years will be used in evaluating contractors for potential new Navy contract awards under the source selection process and responsibility determinations.” AR 2006 (NAVFACINST 4335.4 ¶ 3.d).

In the instant case, the TET found and used CCASS data for four of the nine offerors competing in the TET’s second round of evaluations, viz., Diversified, MCC, Triton, and Patel. It is evident from the record that the existence of CCASS evaluations for those four offerors was used by the TET to enhance their rankings. Specifically, the TET rated Diversified, MCC, and [offeror 8] Very Good under Question 15, which, as previously noted, asked: “[W]as the Offeror’s past performance available from any CCASS information? If so, what additional information did this provide?” See supra, at 406 (quoting AR 840). Even more to the point, the TET in each instance identified a positive response to this evaluation question as a “strength.” AR 993, 1002, 1012. The existence of CCASS data for Patel was especially valuable for that offeror. For Patel, the TET found three available CCASS evaluations, one of which was rated satisfactory overall and marginal in one respect, with the other two evaluations rated above average. AR 1005. The TET noted the marginal aspect of the one evaluation as a “weakness,” AR 1007, but rated Patel Acceptable under Question 15 based on the generally favorable nature of the three evaluations found in the CCASS. AR 1005.

The other five offerors received a neutral rating under this evaluation question based on the lack of CCASS data applicable to them. AR 990 ([offeror 7]), 997 (Haskell), 1000 (JCN), 1010 ([offeror 9]), 1016 (Zachau). JCN was particularly disadvantaged because it appears from the record that the TET downgraded JCN on this basis, noting JCN “has a wide range of fair acceptable to exceptional ratings from their references, but no CCASS evaluations were available.” AR 983. In the corresponding sections of its report, however, the TET made no mention of the lack of CCASS data with respect to the other four offerors. AR 975-76 (Haskell); 978-79 ([offeror 9]); 981-82 (Zachau); 987-88 ([offeror 7]).

Furthermore, it is evident that evaluations of JCN’s past performance should exist in the CCASS database. Referring to the list *409 of past projects JCN submitted as part of its proposal, the government observes that of the sixteen Department of Defense projects listed by JCN as having been completed within the past six years, seven fall below the $500,000 reporting level requirement. Def.’s Supp. Br. at 2 (citing AR 1129). However, the government offers no explanation as to why CCASS evaluations do not exist for the other nine projects that do meet the reporting requirements. With its supplemental brief, the government submitted two affidavits. The first affiant, the employee who conducted the CCASS search in May 2003 pursuant to the TET’s request, stated that no records were found under the entry, “J.C.N.” Def.’s Supp. Br., Attach. 1 (Decl. of Deborah A. Orr (Mar. 25, 2004)). The second affiant stated that he conducted a search on March 24, 2004, under the entries “JCN,” “JCN Construction Company,” and the DUNS number “048620967,” and found one evaluation under each. Id., Attach. 2 (Decl. of Anthony B. Teti (Mar. 25, 2004)). This evaluation had been entered into the CCASS on March 15, 2004, subsequent to the contract awards. Id. Obviously, these searches differ in the form of JCN’s initials and names that were entered for search. Id. Nonetheless, taken together these affidavits support the TET’s statement that no records were available for JCN in the CCASS database at the relevant time, but they do not explain why that was so.

Material submitted by JCN contradicts this showing by the government. With its supplemental brief, JCN submitted an affidavit from its vice president who stated that JCN has received multiple DD Forms 2626, including one dated February 22, 2001, a copy of which was attached to the affidavit. Pl.’s Supp. Br., Attach. (Sworn Statement of Steve Bennett (Mar. 29, 2004)). This evaluation pertains to the eighteenth project listed as a reference in JCN’s proposal, which involved replacing a heating system at New Boston Air Force Station in 1999 for $512,707. AR 1129.11 Given that this contract was completed within the past six years and was for an amount greater than $500,000, it is inexplicable that this performance evaluation was not available in the CCASS database.12

In short, the record shows that the Navy gave favorable technical ratings to those offerors that had evaluations on file in the CCASS. Those who had no such evaluations in the CCASS suffered by comparison, and JCN particularly was downgraded in this regard because the TET considered that CCASS data ought to have been available for JCN. The record contains no indication that at the time JCN was even aware such data were absent. As a private party, JCN did not and does not have access to the CCASS. AR 1988. Also, nothing about the CCASS appears to have been mentioned in the solicitation or in the TET’s follow-up inquiries.13 Moreover, the 34 detailed questions used by the TET to guide its evaluations, including the specific question about the CCASS data, were used only by the TET internally. JCN thus bears no direct responsibility for the absence of CCASS data regarding its prior government contract work.

The government argues that the absence of CCASS data for JCN is not a ground for assignment of error. As the government puts it: “The TET recorded in the record that no CCASS data were available, and these records are entitled to a presumption of regularity.” Def.’s Supp. Br. at 2 (citing Emery Worldwide Airlines, Inc. v. United States, 264 F.3d 1071, 1085 (Fed.Cir.2001)). However, on this record, the direct cause of the lack of CCASS data for JCN appears to be a failure on the part of governmental officials to complete evaluations regarding *410 JCN’s work under prior government contracts, as they were required to do by applicable regulations. The presumption of regularity thus has been rebutted.

Nonetheless, JCN must shoulder some indirect responsibility for the omission because it had the ability to inquire whether evaluations had been submitted to and included in the CCASS database. As the government points out, one of the “frequently asked questions” on the website of the Army Corps of Engineers, the overall administrator of the CCASS system, is: “My firm has not received a performance evaluation for a construction contract that was completed last year. How can I make sure an evaluation was prepared and get a copy of it?” AR 1988. The government does not quote the Corps’ answer, but that answer is instructive because it emphasizes that the contractor does not have direct access to the CCASS but may request information about evaluations of its projects:

A. Contact the agency’s project manager and request a copy of the performance evaluation from CCASS. You will be asked to fax your request on company letterhead. If you have the construction contract number, include it in your request. Remember to specify that the copy comes from CCASS. This ensures that your performance evaluation was entered correctly into CCASS.

Id. In short, a sophisticated contractor preparing to bid on a construction contract subject to the FAR would act prudently to insure that appropriate evaluations of its prior projects were included in the CCASS. There is no evidence that JCN did so, but that should not excuse the fact that under the CCASS regulations, evaluations for JCN should have been in the system. In sum, the Navy’s use of CCASS data in connection with this contractual solicitation amounted to an error that operated to JCN’s detriment, but the significance of this error remains to be determined.

c. The Navy’s documentation of its evaluations.

Third, JCN’s argument that the TET’s reports were inadequately documented turns on the fact that for most of the twenty evaluation questions appearing in the Evaluation Summary Tables section of its reports, the TET provided little or no explanation of its ratings. However, those reports also contain a Comparative Analysis section, which provides narrative explanations of the ratings contained in the summary tables. See AR 847-77 (TET Report I); 973-88 (TET Report II); 1043-60 (TET Report III).

In general, “[t]he relative strengths, deficiencies, significant weaknesses, and risks supporting proposal evaluation shall be documented in the contract file.” FAR 15.305(a). Specifically, “[w]hen tradeoffs are performed ..., the source selection records shall include — (i) [a]n assessment of each offeror’s ability to accomplish the technical requirements; and (ii)[a] summary, matrix, or quantitative ranking, along with appropriate supporting narrative, of each technical proposal using the evaluation factors.” FAR § 15.305(a)(3). See also E.W. Bliss Co. v. United States, 33 Fed.Cl. 123, 143 (1995) (interpreting an earlier version of the FAR and observing, “the applicable procurement regulations do not require the summary documents to include an evaluation of the respective proposals item by item”), aff'd, 77 F.3d 445 (Fed.Cir.1996). There is no requirement that the supporting narrative be contained within the summary tables themselves. The narratives found in the Comparative Analysis section of the reports suffice to explain the TET’s contemporaneous rationale for its technical ratings.

2. The management approach subfactor.

The solicitation required each offeror to submit a narrative respecting its approach for managing the seed project as well as multiple projects issued under the MACC, and to include information on the offeror’s project tracking software, understanding of the design-build process, and organizational charts illustrating management authority. AR 38. The TET employed Questions 21-30 of its thirty-four internal evaluation questions to focus its comparative analysis of the offerors’ management approaches. AR 841.

JCN raises two arguments against the TET’s evaluation of JCN under the Management Approach subfactor. The first is that with respect to nine of the ten questions, Questions 22-30, the TET answered only “yes” and assigned an Acceptable rating without providing any explanation. However, this asserted error is not borne out by the full record. The TET’s narrative discussion pertaining to this subfactor appears not with the questions but rather within the Comparative Analysis section of the reports. See AR 863, 983-84, 1044.

Second, JCN takes issue with the TET’s rating of Acceptable under the remaining question, Question 21, when the TET’s comments in the summary table respecting that question are all positive. That question asks: “Did the Offeror provide information in a narrative format covering plans for the overall management of the MACC contract (the approach for managing multiple task orders issued under this contract, i.e., requirements for simultaneous performance in various locations)?” AR 841. JCN emphasizes the favorable nature of the TET’s comments contained in its summary table: “Yes, management approach narrative gives attention to different types of construction that may come under this MACC. Discusses Design Build used on Hangar 5 at Brunswick—understands the Navy’s DB process, is very aware of the need to coordinate with Gov’t and monitor progress. Familiar with using the 3-phase method of Quality Control.” Pl.’s Summ. J. Mem. at 28 (quoting AR 1000). However, this summary was not the TET’s whole commentary on the subject. A more detailed analysis was set out in the Comparative Analysis section of the reports:

They [JCN] show an understanding of the Design Build (D/B) process but did not demonstrate a significant base of experience in this area of construction. The only example of a Design Build project provided was for the NAS Brunswick Hangar 5 modifications. This Design Build experience is considered weaker than that provided by Offeror’s [sic] who are ranked within the first band. Hangar 5 was not initially awarded as a design build contract. The sprinkler fire protection portion of that contract was amended to allow the contractor to provide design build services for that portion of the project. The TET has requested clarification on this Design Build project.

AR 863. Moreover, the TET did not rest with this evaluation. It provided an opportunity for, and received, a clarification from JCN. And, based on that clarification, the TET awarded JCN a preference, noting that JCN successfully performed the design-build portion of the contract pursuant to a modification. AR 984. Nevertheless, JCN protests that the TET should have revised its rating to Very Good based on the preference and that “[t]here was no indication in the solicitation that offerors submitting more than one project reference would be considered more favorably.” Pl.’s Summ. J. Mem. at 19. However, the TET’s conclusion that the design-build portion of this project was sufficient for an adequate rating but not more is not unreasonable.

C. The Navy’s Best-Value Analysis

JCN’s final contention is that the Navy was so keen on saving money that it changed its evaluation framework from a best-value to a best-price analysis without amending the solicitation as required by FAR § 15.206(a). Pl.’s Summ. J. Mem. at 34-37; Pl.’s Reply at 11-14; Hr’g Tr. at 38-40, 86 (Mar. 19, 2004). These arguments largely focus on Patel’s posture in the procurement and the fact that Patel was awarded the contract for the seed project based upon its lowest-price offer.

First, JCN observes that the TET’s final report, which upgraded Patel’s overall technical rating from Deficient But Correctable to Acceptable, stated under the Management Approach subfactor that “there is significant risk to the government associated with [Patel’s] proposal.” AR 1045 (emphasis added). Superficially, this statement appears to be at odds with that part of the definition of “Acceptable” requiring that the proposal be “generally one ... with an average level of risk associated with the proposed approach.” AR 43 (emphasis added). JCN argues that the TET’s statement is more consistent with an Unacceptable rating, which is assigned to a proposal that “is generally one ... with a significantly above average level of risk associated with the proposed approach.” Id. (emphasis added). However, a proposal could involve significant risk and still fall within the average level of risk determined by the given set of proposals. Indeed, the TET in the instant case found that Patel’s “Management Approach Factor is considered Acceptable, but the proposal is ranked at the bottom of the acceptable range due to this associated risk.” AR 1045 (emphasis added). JCN fails to make an adequate showing that the TET’s technical rating of Patel is irrational. The TET provided a reasonable explanation of why the risk associated with Patel’s Technical Proposal places it within the definition of Acceptable, the parameters of which in this instance are determined by the risk associated with the proposals of the other offerors in the third band. Moreover, as Patel observed, the TET’s comment that Patel’s proposal involves a significant risk relates only to Patel’s understanding of the design-build process. Hr’g Tr. at 72-73 (Mar. 19, 2004) (citing AR 1045).
An offeror’s understanding of that process is only one of several Management Approach criteria specified in the solicitation, which criteria also include the offeror’s approach relating to the seed project (which was not design-build, AR 30) as well as capabilities of the offeror’s project tracking software and the offeror’s organization of its management authority. AR 38.

Second, JCN’s argument that the Navy selected Patel solely on the basis of its lowest price, without regard to the selection criteria set out in the solicitation, also involves a risk analysis. JCN asserts that “[l]ogically, the risk presented by Patel’s unreasonably low price would flow down to Patel’s ability to be competitive in bidding for future work under the MACC. If Patel was unable to complete the ‘seed’ project, and defaulted, it could not be competitive for future work under the MACC.” Pl.’s Reply at 13. The Navy, however, was fully aware of this risk and accordingly took precautions in determining that Patel’s price was reasonable and that Patel would be competitive on future projects. The SSB’s final report explained the steps the Navy took:

The lowest offer from Patel was 31% below the government estimate. However, the PET explained the lowest offer from Patel appeared low as compared with the government estimate, and it appeared likely that they made a business decision to absorb some of the cost in order to guarantee a spot on this MACC. Patel was previously informed in writing that its offer was low and had the opportunity to check its estimate. It appeared that Patel decided to aggressively price out this relatively small dollar task order in order to win a spot on this MACC.
During the week of 15 Sep 03, the [Price Analyst] and SSA obtained a price breakdown from Patel to ensure Patel understood the scope of work and could accomplish the work for the proposed price. The [Price Analyst] and the SSA reviewed the price breakdown submitted by Patel with in detail via phone conference call. The SSA in addition contacted the firm’s bonding company to advise that the Navy was concerned that Patel’s price appeared to be low in comparison with the revised government estimate. The surety confirmed that they would support Patel not only for bid bond submitted, but also for payment and performance bonds for the seed project. Considering the above, Patel’s price was considered fair and reasonable.

AR 1067 (emphasis added). It is thus evident that the Navy did not accept Patel’s lowest-price offer until after taking a sequence of actions designed to evaluate that price under the solicitation criteria. The Navy assured itself that Patel was not likely to default on “this relatively small dollar task order.” Id. In short, JCN’s arguments on price and best-value grounds are not supported by the administrative record.

D. Synopsis

As stated supra, at 405, to succeed on the merits of its challenge, JCN must show that the Navy’s source selection decision was based on significant error in the procurement process and that the error was prejudicial to JCN. As discussed previously, JCN has succeeded in showing that the Navy’s action on this procurement was flawed in one respect. The failure on the part of governmental officials to complete evaluations for JCN’s prior work under government contracts and to submit those evaluations to the CCASS database was not in accord with the FAR and DFARS. See supra, at 408-09. The government counters that in this procurement the Navy properly looked to what was available on the CCASS and also that any error did not prejudice JCN because the SST in its final report recommended that [offeror 8] be placed ahead of JCN in the queue for one of the five contracts. Def.’s Cross-Mot. at 25 (quoting AR 1069); Def.’s Reply at 17; Hr’g Tr. at 62-63 (Mar. 19, 2004). These observations, however, overlook the fact that the Navy assigned [offeror 8] a “strength” on the basis of CCASS data, whereas the Navy discounted JCN’s rating on the basis of CCASS evaluations that the government did not have, but should have had, available.

Nonetheless, although the absence of CCASS data for JCN operated to the detriment of JCN in this procurement, that lack was attributable to a dereliction of duty by officials concerned with other construction contracts. Furthermore, JCN had means available to it to confirm whether evaluations that should have been submitted in connection with earlier construction projects were in fact contained in the CCASS database. See supra, at 409. In these circumstances, the Court cannot determine that the Navy’s procurement decision in this case was based on significant error.

CONCLUSION

Because there is no significant error in the procurement at issue that would provide a ground for relief, JCN’s application for a preliminary injunction, request for an injunction, and motion for judgment upon the administrative record are denied. The government’s cross-motion for judgment upon the administrative record is granted. The Clerk is directed to enter judgment for defendant. No costs.

Prior to the release of this opinion and order to the public, the parties shall review this unredacted opinion for competition-sensitive, proprietary, or other protected information. The parties shall file proposed redacted versions of this opinion on or before April 22, 2004. The parties must confine such proposed redactions to information that constitutes “confidential or proprietary information” within the meaning of RCFC Appendix C, ¶ 4, which is the “protected information” covered by the Protective Order entered on February 11, 2004 in this case.

IT IS SO ORDERED.

. An unredacted version of this opinion was issued under seal on April 15, 2004. The parties thereafter jointly proposed redactions of competition-sensitive, proprietary, and confidential material, and the Court accepted those proposed redactions in their totality. The resulting redactions by the Court are represented by brackets enclosing asterisks “[* * *]”. The names of three offerors have also been redacted and replaced with a generic identification, e.g., “[offeror 7].” The only other change from the opinion and order issued under seal has been the addition of this footnote.

. "AR” refers to the Administrative Record filed with the Court. This record includes the documentation of the Navy’s solicitation of proposals and action on those proposals to make awards of contracts, plus the record of the post-award proceedings before GAO.

. The solicitation defined a Very Good rating as follows: "Assign a very good rating for proposals/factors that exceed the requirements of the solicitation in a way considered beneficial to the Government. A 'Very Good’ proposal/factor is generally one with a good probability of success, with less than an average level of risk associated with the proposed approach and the nature and number of any noted weaknesses have an insignificant impact on the proposal. A very good proposal/factor has no deficiencies.” AR 43.

. The solicitation defined an Acceptable rating as follows: "Assign an acceptable rating for proposals/factors that meet all requirements of the solicitation or exceed requirements in a way not considered beneficial to the Government. An ‘Acceptable’ proposal/factor is generally one with an average probability of success, with an average level of risk associated with the proposed approach and the nature and number of any noted weaknesses may have a minor impact on the proposal. An acceptable proposal/factor has no deficiencies.” AR 43.

. In addition, the Small Business subfactor was keyed to numeric Small Business Subcontracting Goals and to Small Business Subcontracting provisions of the Federal Acquisition Regulation ("FAR”), including FAR [48 C.F.R.] § 52.219-8. AR 41.

. Two of the seventeen offerors failed to submit price proposals and thus were excluded from consideration. AR 965.

. The Navy debriefed JCN by telephone on October 14, 2003. AR 19. JCN’s protest thus was filed with GAO within “5 days after the debriefing date offered to the unsuccessful offeror,” 31 U.S.C. § 3553(d)(4)(B), and consequently triggered the automatic-stay provisions of the Competition in Contracting Act, suspending performance on the contract during the pendency of proceedings before GAO. See 31 U.S.C. §§ 3553(d)(3)(A)(ii), 3553(d)(3)(B).

. With its complaint, JCN filed a Motion for Leave to Conduct Expedited Discovery. JCN sought discovery regarding two categories of information, first, deposition testimony regarding asserted bias and de facto debarment allegedly stemming from prior work on a contract at the Brunswick Naval Air Station, and, second, documentary evidence about its post-performance evaluations that it asserted had not been included in GAO’s redacted administrative record. With the motion, JCN proffered documentary evidence tending to support its bias and de facto debarment claim. Pl.’s Prelim. Inj. Mem., Ex. 1 (Contractor Evaluation Questionnaire for Contract 43-00 (June 6, 2000)). At the close of proceedings held on February 4, 2004, and as reflected in the Court’s confirmatory Order dated February 11, 2004, the Court granted JCN leave to conduct limited depositions, and those depositions were subsequently taken.

The Court denied JCN’s request for discovery related to past-performance evaluations because the government agreed to include the requested materials in the administrative record it filed with the Court on February 17, 2004. The record shows that a governmentally generated type of past-performance data was used in evaluating the proposals of certain offerors, but not others, and that the offerors for whom such additional data were used strongly tended to be in the successful group of awardees. No such data were used with respect to JCN. Pursuant to this Court’s direction at the hearing held on March 19, 2004, Hr'g Tr. at 31, 67-68, 90, on March 31, 2004, the parties jointly moved to supplement the administrative record with information regarding the absent governmentally generated past-performance data, and that motion was granted. In addition, JCN and the government each filed a supplemental five-page brief regarding the data, its generation, and its use both generally and particularly with respect to JCN.

The Court may properly rely on this supplemental evidence because the Navy’s use of such data for other offerors, but not JCN, “[wa]s not adequately explained in the record before the court.” Esch v. Yeutter, 876 F.2d 976, 991 (D.C.Cir.1989) (quoting one of eight exceptions to the general rule in judicial review of administrative actions against reliance on extra-record evidence, drawn from Steven Stark and Sarah Wald, Setting No Records: The Failed Attempts to Limit the Record in Review of Administrative Action, 36 Admin. L.Rev. 333, 345 (1984)).

. The Court possesses jurisdiction over this case pursuant to 28 U.S.C. § 1491(b)(1). The Tucker Act provides that ”[i]n any action under this subsection [§ 1491(b)], the courts shall review the agency’s decision pursuant to the standards set forth in section 706 of title 5.” 28 U.S.C. § 1491(b)(4).

. NAVFAC Instruction 4335.4, which "provides procedures for the preparation and distribution of NAVFAC construction contractor performance evaluations,” AR 2016 (NAVFAC P-68, 42.1503-100(b)(2)(ii)), mandates that DD Forms 2626 be completed and submitted to the CCASS "within 60 days after the facility Beneficial Occupancy Date (BOD).” AR 2006 (NAVFACINST 4335.4 ¶ 4.c.(1)(a) (Mar. 20, 2000)).

. The performance evaluation lists $514,684.07 as the net amount paid to JCN. Pl.’s Supp. Br., Attach.

. Also not explained is why this particular evaluation was submitted so long after the project had been completed. The evaluation dated February 2001 recites that August 31, 1999 was the revised contract completion date. Pl.’s Supp. Br., Attach.

. Regarding information from other sources, the solicitation provided: “In evaluating the offeror’s past performance on completed projects, the Government may consider information in the offeror's proposal and information from other sources, including references, past and current customers, Government agencies and any other sources deemed necessary.” AR 37.