
Insights

Practical notes on defense communication.

Short posts on messaging, procurement writing, and digital presence in the defense and aerospace market. No general marketing advice.

Messaging · Jan 2026 · 4 min read

Why government buyers skim — and what to do about it.

Procurement officers spend an average of 90 seconds on the first review of any unsolicited capability brief. Here is what that means for how you structure your materials.

Government buyers are not hostile to new technology. They are overwhelmed with volume. A procurement officer at a defense ministry may review 40 capability statements in a week. The average time spent on initial review: 90 seconds.

This is not a problem you can solve with better design. It is a structural problem. The information hierarchy of your document either survives a 90-second scan or it does not.

What works: a one-sentence capability statement at the top, measurable outcomes before technical specifications, a clear section for relevant prior deployments, and a single explicit ask.

What does not work: an executive summary that reads like a press release, a product description that leads with engineering architecture, and a timeline that shows years of R&D before any operational context.

The test is simple: give your document to someone who does not know your technology and ask them to tell you what you do and who it is for. If the answer takes more than 30 seconds, the document needs restructuring.

Deliverables · Dec 2025 · 6 min read

The 7 sections every defense capability statement needs.

A capability statement is not a company brochure. It is a structured document with a specific function in the procurement process. Here is what it must contain.

Most companies write capability statements that are actually company overviews. These do not function well in procurement contexts because they answer the wrong questions.

A capability statement answers one question for the evaluator: does this vendor have the specific capability I need, at the right maturity level, with evidence of prior delivery? Every section must serve that question.

Section 1 — Core capability (one sentence). What you do, for whom, and in what operational context.

Section 2 — Technology maturity. TRL, current deployment status, and regulatory or certification status where applicable.

Section 3 — Technical specifications. Key performance parameters only — not a full databook. If you have a SWaP-C table, it goes here.

Section 4 — Prior deployments. Anonymized if required. Context, scale, and outcome for each reference.

Section 5 — Differentiators. What you do that direct alternatives do not. Specific and verifiable.

Section 6 — Compliance and certifications. ITAR, EAR, STANAG, MIL-SPEC, ISO, national equivalents as applicable.

Section 7 — Contact and next step. One person, one email, one ask.

Process · Nov 2025 · 5 min read

Bilingual doesn't mean translated: writing for defense procurement.

An EN→HE translation of a procurement document is rarely sufficient. The two languages serve different audiences with different decision criteria. Here is how to approach bilingual defense materials.

The most common mistake in bilingual procurement materials is treating translation as a technical exercise. You take an English document and produce a Hebrew version that says the same things in the same order.

This fails for a specific reason: the Hebrew-speaking procurement officer in the IL defense context is operating under different institutional constraints, has different reference points, and uses different evaluation criteria than an English-speaking NATO procurement official.

A bilingual document should be a bicultural document. The EN version optimizes for the expectations of an international institutional buyer. The HE version optimizes for the IL procurement process, which has specific documentation conventions and institutional language.

Practically: the structure can be similar, but the emphasis shifts. References to IL-specific programs, IMOD standards, or prior government interactions belong in the HE version in positions of prominence they would not have in the EN version.

The operational implication: if you are commissioning bilingual procurement materials, brief the writer on both audiences separately. Do not ask for a translation. Ask for a version.

Messaging · Oct 2025 · 3 min read

The pitch deck is not the problem.

When a defense startup does not close, the default diagnosis is that the deck needs redesigning. Usually the deck is not the problem.

We see this pattern repeatedly: a company is not converting meetings into next steps, someone recommends redesigning the pitch deck, the deck gets a visual refresh, and the conversion rate stays the same.

The deck is rarely the problem. The problem is almost always upstream: the positioning is ambiguous, the buyer type is wrong, the ask is unclear, or the meeting is happening at the wrong point in the procurement cycle.

A new deck with the same underlying message architecture will produce the same result. The design change provides the temporary feeling of having addressed the problem without addressing the problem.

The test: write the deck in plain text only, no design. If the text version does not make a coherent argument, the designed version will not either.

Sector · Sep 2025 · 5 min read

What EO/IR companies consistently get wrong in their materials.

Electro-optical and infrared technology companies operate in a niche where the technology is highly technical and the buyers are equally technical. This creates a specific messaging trap.

EO/IR companies assume that because their buyers are technical, their marketing materials should be maximally technical. This is a partial truth that leads to a complete error.

The evaluation process for an EO/IR payload typically involves at least three stakeholder layers: the technical evaluator who verifies specifications, the program manager who assesses operational fit, and the budget authority who assesses value and risk.

Only the first of these three is well-served by a document leading with MTF curves and NEDT values. The second needs mission scenario framing. The third needs outcome evidence and risk mitigation language.

The materials that perform best in EO/IR procurement are structured in layers: specifications available and prominent, but framed by operational scenarios that make the specifications meaningful for non-specialist readers.

One practical change that consistently improves performance: add a two-sentence mission scenario before every technical section. Not a marketing statement — a specific operational context in which the parameter you are about to describe becomes relevant.

Deliverables · Aug 2025 · 4 min read

How to write a redacted case study that still convinces.

NDA constraints mean you cannot name clients or describe programs in detail. This does not mean case studies are useless — it means they have to be structured differently.

The function of a case study in a procurement context is to demonstrate that you have done this before, at the relevant scale, with a measurable outcome. NDA redaction does not have to eliminate this function.

What you can almost always keep: the sector, the problem description (generalized), the type of deliverable, the quantified outcome (if the number itself is not sensitive), and the timeline.

What needs redaction: client name, program name, specific geography, dollar value of contract, and any technical detail that reveals classified capability.

The resulting structure: 'A [sector] company facing [problem type] required [deliverable type]. After [change description], [outcome metric] within [timeframe].' This is lean, but it works.
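The skeleton above can be treated as a literal fill-in template. A minimal sketch of that idea in Python — every field value below is an invented placeholder for illustration, not a real engagement:

```python
# Redacted case-study skeleton as a fill-in template.
# All field values are hypothetical placeholders, not real client data.

TEMPLATE = (
    "A {sector} company facing {problem_type} required {deliverable_type}. "
    "After {change_description}, {outcome_metric} within {timeframe}."
)

example = TEMPLATE.format(
    sector="maritime sensing",
    problem_type="low response to unsolicited capability briefs",
    deliverable_type="a restructured capability statement",
    change_description="moving measurable outcomes ahead of specifications",
    outcome_metric="meeting-to-next-step conversion doubled",
    timeframe="one quarter",
)

print(example)
```

The point of writing it this way is the discipline it enforces: every field must be filled with something concrete, which guards against the over-redaction failure described below.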

The mistake to avoid: over-redacting to the point where the case study communicates nothing. A case study that says only 'we worked with a defense company and the outcome was positive' provides no evidence at all.

Want a topic covered?

If you have a specific question about defense communication, messaging, or procurement materials — send it through the contact form. We write about reader requests.

Send a question