Why Artificial Intelligence should NOT write your next proposal
Feb 27, 2023
OpenAI’s bot, ChatGPT, has been making headlines around the world since its launch in November 2022. News from Business Insider detailed jobs ChatGPT may eventually take over. Multiple news reports have covered college students attempting to write papers using this new Artificial Intelligence (AI) software (and follow-up articles about how to counteract this type of work). If you haven’t heard about ChatGPT, it could be that it hasn’t become pervasive within your community or industry ... yet.
How it Works
ChatGPT is making a name for itself by making things “faster” for writers in many industries. OpenAI’s ChatGPT is a free platform that uses AI software to write engaging content for readers. The chatbot pulls and analyzes data and techniques from different sources, including methods programmed by “human AI trainers.” In other words, the system is trained to understand dialogue and provide responses based on programming and scanning. By tweaking how you phrase your request, you can get the program to turn a long monologue into short, compelling text. If you want to take that a step further, you can even ask it to generate social media posts, complete with hashtags, based on the text it wrote. In essence, it can take out some of the thought involved in writing descriptive (and by extension, engaging) text.
3 reasons to avoid using AI-generated content on your next bid
AI software sounds amazing, and in many ways it is; however, it is not for everyone. With human training, the ability to rapidly cull through millions of data points, and the capacity to generate on-topic content, ChatGPT sounds like an easy win – we certainly understand the temptation to use it on proposals! But proposal writers responding to government solicitations should be leery of using AI software like ChatGPT when writing their next proposal response. Below, we look beneath the surface of this “winning” software and highlight three reasons why you should not use ChatGPT to write government proposals.
Reason 1: Artificial Intelligence can be, well, artificial …
Responding to a government solicitation is not as simple as stringing together answers to a series of individual questions. For example, there are nuanced interdependencies across volumes (e.g., the way you organize your team will influence your technical delivery, and vice versa) that you must consider when describing your solution. Proposals are also not supposed to be too theoretical – meaning you should be describing your actual capabilities, approach, and resources, and that foundation of data is not something ChatGPT has access to. In other words, ChatGPT may be able to describe how a requirement ought to be addressed, but it’s not going to describe how you (or your team) would address it in the real world using your real resources. Bottom line: we recommend not leaving it to AI to sort out the fact that you know about Zero Trust, Cloud Computing across multiple frameworks, or military processing systems. ChatGPT has access to a cache of information, but it does not know you, your team, or your experience.
We tell proposal teams this all the time: most credible bidders will be able to assemble a technically compliant proposal. What sets the winning bid apart is a team’s ability to compel and convince the evaluators – often through specific examples, quantifiable metrics, and successful past performance – that they are the best team to perform the work.
You may ask the software about what security requirements you need to work in certain spaces. ChatGPT can answer that question, but it cannot give you the kind of details that show you’ve personally spoken with a Security Officer and understand the nuances of badging, workspaces, or technology requirements unique to the client. It’s up to you and your team to provide that personalization.
Remember, your RFP response exists to prove what you know and how you know it. Predictive text may supply robust verbs and polished prose, but it doesn’t replace hard-earned knowledge from boots on the ground.
Example of ChatGPT’s response to a request to build an introductory paragraph for a proposal response
In case you’re still not convinced, consider this: ChatGPT’s “knowledge cutoff” date is September 2021, meaning it cannot pull any information published after that date. For instance, if you ask what the Pantone Color of the Year was for 2022, it will inform you that it cannot answer because the date falls after its knowledge cutoff. In the realm of government solicitations seeking emerging technologies to address changing geopolitical situations, even a six-month latency can make a huge difference in the data.
Reason 2: Plagiarism or Copyright Infringement is very real.
One concern that comes up a lot regarding ChatGPT is plagiarism or copyright infringement. We’ll break these down into two definitions from the Merriam-Webster Dictionary:
Plagiarize: “to steal and pass off (the ideas or words of another) as one’s own; use (another’s production) without crediting the source”
Copyright: “the exclusive legal right to reproduce, publish, sell, or distribute the matter and form of something (such as a literary, musical, or artistic work)”.
ChatGPT’s vast knowledge comes from real people (e.g., its trainers), but it also comes from other sources, and none of the content it returns is cited by the program. It’s up to the user to research the information presented to ensure it’s (1) not copyrighted and (2) accurate. ChatGPT’s customer support recommends that anyone using the software run its output through plagiarism-detection software. Beyond plagiarism and copyright concerns, the software can also produce inaccurate information and outright misinformation. Users will always need to check the output against their own research to ensure it’s accurate and not misleading.
In addition to plagiarism and copyright infringement, the other concern is creative property. In 2022, the U.S. Copyright Office issued a second rejection for the use of autonomous artificial intelligence in creative art. According to the decision, the Office “will not register works ‘produced by a machine or mere mechanical process’ that operates ‘without any creative input or intervention from a human author’ because, under the statute, ‘a work must be created by a human being’”.
When generating compelling content that you may want to later copyright or trademark, the use of AI creates a quandary of who can “own” the work.
Reason 3: ChatGPT can’t replace real people.
This may seem like an obvious reason, but you need people to truly respond to a government solicitation. Even if you can overlook the formatting challenges and lack of specificity, we’ve noticed a recent increase in proposals that include an orals component. In one recent case, the government requested that the RFP response be an orals deck – if you’re not familiar, that’s essentially a PowerPoint presentation that the team who will perform on the contract must present and explain. The solicitation did not ask for a separate written response, so our client had to build their entire response to comply with the RFP’s orals requirement. Oral presentations to the government not only involve explaining how you’ll meet the requirements but may also include a question-and-answer session. While there may be time for a caucus, you must rely on bringing your “A” team to the presentation to think on their feet and respond to each question or scenario. There isn’t time to consult ChatGPT or another AI tool for a response. At the end of the day, AI doesn’t replace your team’s real knowledge.
Conclusion: Autonomous AI software is amazing and will continue to evolve as the programs grow and learn. However, it does not replace the knowledge you have from personal experience. A proposal isn’t a free-form essay about a topic; it’s a hyper-focused response that addresses the government’s needs with the resources and capabilities you actually have. Unless everything about your company and the way you do business exists in a form the software can pull from, you need real people to write real approaches with real results. Relying on AI software to state how you’ll address re-badging or access to specific offices with strict security requirements can lead to missing the point of the government’s request.
There are inherent risks in using this type of software in your government proposal, as mentioned above, that you should consider before creating an account for your next proposal.
In researching information for this blog, I asked ChatGPT to show me an example of a government proposal. While it laid out the things it “thought” should be included, it was not comprehensive, nor would it have been compliant. Government proposals have a set format and style the graders are looking for during their review. It’s why our FastProp Process includes How To guides and templates to help ensure you start off on the right foot. Check out some of our free templates, our book (you can buy it on Amazon!), and even our self-paced FastProp course. And consider doing some digging before you engage ChatGPT or other AI software on your next proposal.
Written by Morgan Over and Rebecca Wayland
Morgan is our marketing manager and a proposal support specialist. When she isn’t handling marketing for Trident, she is tech editing documents and building orals decks for our clients. As a military spouse based in South Korea, she supports clients around the world as part of our globally dispersed team.
Rebecca is our HR and Development Manager. While Rebecca primarily wears the HR hat, she offers comprehensive proposal management, capture support, market research and training. She is also our GWAC lead so if you’re exploring Polaris, OASIS+, or Alliant 3, she is definitely your SME. As a U.S. Navy veteran and military spouse based in Honolulu, she supports clients around the world as part of our globally dispersed team.