For better or worse, artificial intelligence (AI) is transforming the legal industry. The Government Accountability Office (GAO) recently decided Bramstedt Surgical Inc.,[1] dedicating three pages to warnings about penalties it could have imposed on the protester and protester’s counsel for incorrect citations. For those tracking GAO’s recent handling of protesters and lawyers who file documents riddled with apparently AI-hallucinated citations, or legal arguments citing cases that do not stand for the propositions asserted, the decision comes as no surprise. Nevertheless, the Bramstedt decision serves as a stark reminder that while AI can be a useful tool, it can also create serious professional and legal exposure when used without proper oversight. In this blog, we explain why GAO’s decision should be a wake-up call for government contractors relying on AI for their protests, identify key risks protesters need to understand before using AI, and provide actionable advice for protesters to ensure their lawyers are using AI responsibly.
The Bramstedt Decision
Bramstedt Surgical Inc., a longtime provider of surgical instrument maintenance services for the Department of Veterans Affairs (VA), filed a post-award protest arguing that the VA should have directly notified it of the opportunity to compete, despite the solicitation’s proper posting on SAM.gov. While GAO dismissed the bid protest for failing to state a legally sufficient basis, what really caught GAO’s attention was the protester’s citations to legal authorities that simply did not exist.
The last three pages of GAO’s decision are dedicated solely to addressing what it described as citations that “bear the hallmarks” of AI‑generated legal authority.[2] GAO noted that the protest appeared to rely on “a large‑language model or other artificial intelligence… without adequate verification that the generated results were accurate.”[3] GAO’s deduction appears to be that the protester or its lawyer submitted arguments supported by AI‑fabricated authority.
While GAO dismissed the protest on an altogether different ground, its suspicion regarding AI overshadows the underlying dispute and culminates in a wake-up call that every protester and their lawyer should heed: future filings containing fabricated legal authorities may result in sanctions and other penalties.
Key Risks of Using AI for Bid Protests
The legal profession is increasingly integrating AI into research, drafting, and analysis. Used responsibly, these tools can improve efficiency and expand access to information. But AI is just one step on the journey to the right answer, not the be-all and end-all. It is a tool, not a substitute for sound legal judgment or, frankly, a lawyer’s duty of candor to the tribunal. AI frequently generates false information in its effort to produce helpful responses, so protesters and attorneys cannot rely on AI output without verifying its accuracy.
While the Bramstedt decision does not dispositively address whether the protester or its counsel used AI, GAO’s analysis does illustrate three key risks:
- AI Can Generate “Hallucinated” Citations: Large‑language models can produce convincing but entirely fictional citations to legal authorities. Unless a human checks every citation to confirm that the authority actually exists and supports the proposition for which it is cited, these errors can slip into filings.
- Courts and Administrative Tribunals Are Watching: Judges and administrative bodies are becoming more vigilant about AI‑generated submissions. We may be in the stern-warnings period, but as the list of GAO decisions chiding potential AI misuse grows, it is not hard to imagine that GAO’s patience will wane. We may soon find GAO more willing to impose penalties beyond scolding in these situations.
- Clients May Bear the Consequences: Potential penalties for lawyers who cite AI-hallucinated authorities are wide-ranging and plentiful, but the protester also suffers harm no matter where the error originates. In addition to the risk of derailing a protest that might otherwise succeed on the merits, the reputational harm of having your business’s name associated with a protest built on non-existent legal authorities will linger on the Internet in perpetuity.
Having the AI Talk with Your Attorney
Protesters do not need to fear AI, but they should understand how their legal team uses it. A reputable law firm will be transparent about its processes and able to explain how it prevents the kinds of errors that may have occurred in the Bramstedt protest. Here is a brief list of questions you should consider asking your protest counsel before engaging them:
- Do you use AI tools in legal research or drafting?
- What safeguards do you have to ensure accuracy?
- How do you verify citations, facts, and legal authorities generated by AI?
- Do you have internal policies governing AI use?
GAO’s Bramstedt decision is not an indictment of AI, but a reminder that protesters and their lawyers need to use technology responsibly. Your lawyer should be able to answer these questions and leave you confident that humans will verify all AI-generated content and that the firm will maintain accountability and quality control at every step of the representation. Your business’s legal rights and reputation are too important to risk on a shortcut.
If your company is navigating a complex procurement or considering a potential protest, it is essential to ensure your legal team is using AI to your advantage and not your detriment. If you have questions about this topic or bid protests in general, please contact Katie Burrows, Josie Farinelli, or another member of PilieroMazza’s Bid Protests Group.
____________________
If you’re seeking practical insights to gain a competitive edge by understanding the government’s compliance requirements, tune into PilieroMazza’s podcasts: GovCon Live!, Clocking in with PilieroMazza, and Ex Rel. Radio.
[1] Bramstedt Surgical Inc., B‑424064, Jan. 28, 2026.
[2] Bramstedt at 10.
[3] Id. at 11-12.
