# The AEC-Specific Software Decision Nobody Talks About

**By Dan Cumberland** · Published May 13, 2026 · Categories: AI Strategy

## The Strategic Gap Hiding in Plain Sight

Most AEC firms are answering the wrong question about AI software. The dominant conversation is "which tool should we buy?" — but the decision that compounds over the next decade is structural, not transactional, and most firms are making it by default.

Fifty-three percent of A&E firms now report using AI tools, up from 38% the prior year[1](/blog/blog-architecture-decision#ref-1). Only 27% use AI for automation, problem-solving, or decision-making[2](/blog/blog-architecture-decision#ref-2). The gap between those two numbers is the story. One measures firms touching AI. The other measures firms operating with it.

Walk into any AEC industry publication this quarter and the headlines read like inventory: top tools for design teams, ten plugins that save time, the best AI for specifications. All useful. None of it touches the real software decision facing AEC firms — whether AI capability will live inside the firm as compounding intellectual property or be rented from general-purpose vendors. The decision underneath the noise has a name.

## The Architecture Decision Underneath the Tool Decision

An architecture decision is a structural commitment about where capability lives, what compounds inside the firm, and what is rented from outside. Tool decisions are downstream and easy to reverse. Architecture decisions are upstream, and they compound for years whether the firm names them or not.

Consider what makes them different\.

| Tool Decision | Architecture Decision |
| --- | --- |
| Which Revit add-in to license | Whether design methodology stays inside the firm or enters a vendor's training pool |
| Which ChatGPT plan for the team | Where institutional knowledge lives in five years |
| Which estimating plugin to pilot | Which workflows are portable and which lock the firm into one ecosystem |
| Reversible in a quarter | Compounds for a decade |

Tool decisions are reversible. Architecture decisions compound. And every AEC firm in the $20M–$100M range is making architecture decisions about AI right now — most of them by default, through shadow IT, vendor defaults, and the path of least resistance.

A 75-person practice where seven people individually subscribed to ChatGPT Plus has made an architecture decision. A firm that built three years of project workflows inside one vendor's AI features has made one too. A firm that fed proprietary specifications into a generic chatbot to "see what it could do" has made the most consequential one of the three. None of these firms named the decision. They made it anyway.

The shadow-IT pattern is documented across software industries[3](/blog/blog-architecture-decision#ref-3). Naming the decision matters because of what gets lost when it stays unnamed.

## Why the Tool-First Frame Costs You

When AEC firms treat AI as a tool category to evaluate rather than an architecture to design, three costs compound: proprietary methodology dilutes into general-purpose models, vendor lock-in deepens through workflows built inside proprietary platforms, and the firm rents what could be owned.

### Cost 1 — Proprietary methodology dilutes

The first cost is the most AEC-specific. Yegatech frames it directly[4](/blog/blog-architecture-decision#ref-4):

> "AEC companies possess unique data sets from their projects, and AI built on specific data is more aligned with the company's needs than those based on generic data sets\."

Design methodology, project archives, and client-specific workflows are the firm's competitive moat. Generic AI tools at default tiers can use inputs for training under their standard terms. Data leakage is named as the most significant AI security concern facing AEC organizations today[5](/blog/blog-architecture-decision#ref-5). Proprietary methodology is a competitive asset when it compounds inside the firm. It is a liability the moment it enters a vendor's training pool.

### Cost 2 — Workflow embedding deepens lock-in

The second cost is workflow embedding. A workflow built inside a single vendor's AI features — Revit AI add-ins, ACC custom dashboards, a specific platform's generative tooling — belongs to that vendor's ecosystem. When the firm changes vendors, the institutional knowledge embedded in the workflow does not travel. The same logic applies to a documented [AI governance strategy](/blog/ai-governance-strategy): without explicit data classification and contractual export rights, lock-in deepens silently. Bluebeam's 2026 outlook found that 42% of AEC firms cite data-sharing security as a top integration challenge[6](/blog/blog-architecture-decision#ref-6). The firms naming this aren't doing it because they're cautious. They're doing it because they've seen what happens.

### Cost 3 — The firm rents what it could own

The third cost is rental in place of ownership — and it adds up. Aaron Vorwerk's 3Ps framework (Practical, Purposeful, Private)[7](/blog/blog-architecture-decision#ref-7) is a useful guardrail at the tool level. But guardrails do not replace an architecture decision. Vorwerk himself frames the deeper problem:

> "AI without a solid data infrastructure is like building on sand\."

One honest caveat. Some firms genuinely don't have proprietary methodology worth protecting — the small commercial shop doing standard tenant fit-outs on standard templates can rationally commoditize and use generic tools. This article is for the firms positioning around expertise. If your firm's edge is the methodology, the methodology is the asset that needs an architecture around it. If you're not sure which side of that line you sit on, that's its own [hidden cost worth measuring](/blog/hidden-costs-ai-projects) before the next vendor pitch.

If the cost is real, the question is how to make this decision on purpose. Software engineering already has a tool for that.

## Borrowing the ADR Framework from Software Engineering

An Architecture Decision Record (ADR) is a short document — typically one or two pages — that captures a single architectural decision, its context, the options considered, and its consequences[8](/blog/blog-architecture-decision#ref-8). The practice is standard in mature software organizations and transfers directly to AEC firms making AI and software architecture choices.

ADRs work in software for three reasons. They force options to be named before commitment. They surface consequences early. They create institutional memory that survives staff turnover. AWS documents the ADR process as standard prescriptive guidance for its architecture teams[9](/blog/blog-architecture-decision#ref-9). Thoughtworks and most mature engineering organizations follow the same pattern.

Writing ADRs is not an existing AEC practice, so let me name that openly. Software engineering teams have been writing them for over a decade; AEC firms making AI decisions of equal consequence have, for the most part, written nothing down. This article is making the transfer. It's a useful import, not an established AEC convention.

What an AEC ADR looks like in practice:

> **Decision:** [The specific architecture choice — e.g., "Build an in-house RAG (retrieval-augmented generation) system on five years of project archives for spec generation."]
>
> **Context:** [What made this decision necessary — competitive pressure, a specific workflow bottleneck, a client requirement.]
>
> **Options considered:** [The 3–4 paths — generic ChatGPT subscription, vendor-embedded AI feature, in-house RAG system, hybrid approach.]
>
> **Decision made:** [The chosen path and why.]
>
> **Consequences:** [What the firm now owns, what it has committed to, what becomes harder to reverse.]

That's the entire artifact. One page. No consultant required: an ADR is not a deliverable, it is a discipline a firm imposes on itself before it spends another dollar on AI tooling. This is where Dan's broader [AI Decision Framework for Founders](/blog/ai-decision-framework-founders) overlaps with software-engineering practice — the discipline of writing the decision down is half the work. The other half is asking the right questions. You don't need prompts. You need to think.
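In a repository or shared drive, the same artifact might look like the file below. The filename convention, dates, and contents are illustrative assumptions, not a prescribed format:

```markdown
# ADR-003: Spec generation via in-house RAG, not a generic chatbot

**Status:** Accepted · 2026-05-01 · Owner: COO

**Context:** Spec drafting is a bottleneck; five years of project
archives sit unused; two teams have started pasting specs into
personal ChatGPT accounts.

**Options considered:** generic ChatGPT subscription; vendor-embedded
AI feature; in-house RAG system on the firm's archive; hybrid.

**Decision:** Build the in-house RAG system; allow generic tools for
non-proprietary drafting only.

**Consequences:** We own the retrieval corpus and prompts; we commit
to maintaining an internal system; switching vendors later is cheap
because the corpus stays ours.
```

The numbered filename ("ADR-003") is the common software convention: decisions accumulate in sequence and are never deleted, only superseded.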

## Four Architecture Decision Questions $20M–$100M AEC Firms Aren't Asking

Four questions separate an architecture-first AI decision from a tool-first one. Most firms in the $20M–$100M range haven't asked any of them — not because they aren't sophisticated, but because the vendor-led conversation never raises them.

The four questions are not technical. They are ownership questions, and ownership is what compounds.

### Question 1 — What lives inside the firm vs outside?

Some capabilities should compound as firm IP — proprietary prompts, custom assistants trained on firm methodology, decision-support systems trained on project archives. Others are reasonable to rent: generic drafting, summarization, transcription. The split depends on the firm's strategy. A firm whose differentiation is healthcare-facility expertise should own the AI capability that touches healthcare-facility methodology. A firm whose differentiation is speed should rent more aggressively. This is the [AI consultant vs in-house](/blog/ai-consultant-vs-inhouse) decision applied to capability, not headcount.

### Question 2 — Who owns the workflows you build inside vendor platforms?

A workflow built inside Revit's AI tools, ACC's custom dashboards, or a vendor-specific plugin is not portable. When the firm changes vendors, the workflow doesn't come along; the institutional knowledge embedded in it stays with the platform. Frame the question early — before the workflow exists. A workflow you can document, export, and rebuild on a different platform is an asset. A workflow that only runs inside one vendor's ecosystem is a hostage.

### Question 3 — What is your data portability strategy?

The IFC standard — Industry Foundation Classes, recognized as ISO 16739-1:2024 and maintained by buildingSMART International[10](/blog/blog-architecture-decision#ref-10) — is the open, vendor-neutral data standard for AEC interoperability. It is necessary but not sufficient: open standards alone don't beat vendor lock-in, they make it negotiable. Pair IFC with contractual export rights and a documented data-classification policy. IFC has existed for years and lock-in still dominates the industry. The firms that have actually escaped lock-in did so by making data portability a contractual condition, not just a technical preference.

### Question 4 — What compounds in five years vs what dilutes?

A firm that fed three years of proprietary design methodology into a generic AI tool's training pool has not built an asset. A firm that trained a private model on the same data has. Same activity, opposite outcome — the difference is the architecture decision underneath. Vendors will be part of the stack, and methodology can still be owned; both can be true at once. The firm that asks "what do we own at the end of five years?" makes a different decision than the firm asking "what's the best tool today?"

## What This Looks Like for a $20M–$100M AEC Firm

A $20M–$100M AEC firm is large enough to make strategic IT decisions and small enough to be making them ad hoc. The architecture decision for this tier is different from a 5-person practice (which can rationally rent everything) and from a 1,000-person firm (which has formal governance). It is the tier where the decision is both possible and most often unmade.

A 75-person firm has the data, the case archive, and the methodology to make AI compound. It rarely has the dedicated function to make sure it does.

What "doing it on purpose" looks like:

- A quarterly architecture-decision review using the ADR format. One hour, one document per decision, one named owner.
- A single accountable owner for the architecture — typically a managing principal or COO partnered with IT. Not IT alone: IT runs systems; architecture decisions are strategic.
- A documented data-classification policy that names what proprietary methodology is and what tier of AI tool it can touch.
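A data-classification policy of that kind can be made executable rather than left as a PDF. The sketch below is hypothetical — the class names, tier names, and rules are illustrative assumptions, not an industry standard — but it shows the shape: each data class gets a ceiling on how exposed a tool may be, and anything unclassified defaults to the most restrictive tier.

```python
# Tool tiers ordered from least to most exposed. "in_house" means a
# private, firm-controlled system; "generic" means a default-tier
# public tool whose terms may permit training on inputs.
TIERS = ["in_house", "enterprise_no_training", "generic"]

# Hypothetical policy: the most permissive tier each data class may touch.
POLICY = {
    "public_marketing": "generic",
    "project_admin": "enterprise_no_training",
    "proprietary_methodology": "in_house",
}


def allowed(data_class: str, tool_tier: str) -> bool:
    """True if this tool tier is within the policy ceiling for this data class."""
    ceiling = POLICY.get(data_class, "in_house")  # unclassified data: most restrictive
    return TIERS.index(tool_tier) <= TIERS.index(ceiling)
```

The default-deny on unclassified data is the design choice that matters: the policy fails closed, so new data types must be named and classified before they can legally touch a generic tool.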

Barge Design Solutions offers a concrete example of architecture-first thinking in practice[11](/blog/blog-architecture-decision#ref-11). The firm reduced Health & Safety Plan creation from 8–10 hours to 10–15 minutes by training an AI assistant on its own historical project data and regulatory information. The point isn't the time saved. The point is what made it possible: the AI was trained on firm-specific data, not bolted on as a generic chatbot. That choice is the architecture decision in miniature.

The Bluebeam outlook quantifies what an architected approach earns: 68% of early adopters saved at least $50,000, and 46% reclaimed 500–1,000 hours on tasks like scheduling, planning, and document analysis[12](/blog/blog-architecture-decision#ref-12). The numbers are concentrated where AI is architected into workflows, not where it has accumulated as a stack of disconnected subscriptions. The firms that win the next decade in the $20M–$100M tier are the ones that name the architecture decision now, while the cost of getting it wrong is still recoverable.

## FAQ — Architecture Decisions in AEC

The questions AEC leaders ask most often about AI software architecture — answered directly.

### What is the most overlooked software decision AEC firms face?

Whether AI capability will live inside the firm as compounding intellectual property or be rented from general-purpose vendors. Tool selection is downstream of this decision. Most firms are making it by default through shadow IT and vendor defaults rather than by deliberate choice.

### What is an Architecture Decision Record?

An Architecture Decision Record (ADR) is a short document — typically one or two pages — that captures a single architectural decision, its context, the options considered, and its consequences[8](/blog/blog-architecture-decision#ref-8). The practice was popularized in software engineering and is now standard in mature organizations, including those documented by Martin Fowler and AWS Prescriptive Guidance[9](/blog/blog-architecture-decision#ref-9).

### What percentage of AEC firms actually use AI strategically?

53% of A&E firms reported using AI tools in 2025, up from 38% the year before[1](/blog/blog-architecture-decision#ref-1). Only 27% use AI for automation, problem-solving, or decision-making[2](/blog/blog-architecture-decision#ref-2). The gap reflects the difference between firms touching AI and firms operating with it.

### How can AEC firms avoid vendor lock-in?

By making explicit architecture decisions about data portability before tool selection. The IFC standard (ISO 16739-1:2024), maintained by buildingSMART International, is the open, vendor-neutral data standard for AEC interoperability[10](/blog/blog-architecture-decision#ref-10). Pair it with contractual data export rights to make lock-in negotiable rather than structural.

### Should AEC firms use ChatGPT for client work?

Only on tiers where inputs are not used for training (typically Team or Enterprise plans), and only with a documented data-classification policy in place. Generic-tier AI tools create intellectual-property and confidentiality risk for firms with proprietary design methodology[5](/blog/blog-architecture-decision#ref-5)[7](/blog/blog-architecture-decision#ref-7).

### What ROI are early AEC AI adopters seeing?

68% of early adopters have saved at least $50,000, and 46% have reclaimed 500–1,000 hours on tasks like scheduling, planning, and document analysis[12](/blog/blog-architecture-decision#ref-12). The ROI concentrates where AI is architected into workflows rather than purchased as standalone subscriptions.

## The Decision Is Already Being Made

Every AEC firm in the $20M–$100M range is making architecture decisions about AI right now — through shadow-IT subscriptions, vendor defaults, and the path of least resistance. The decision is happening. The question is whether the firm is making it on purpose.

AI mastery in AEC is not a tactical question. It is a strategy question wearing a tooling costume. The firms that win the next decade in this tier are the ones that name the architecture decision while the cost of getting it wrong is still recoverable. The framing matters because the alternative isn't neutral: doing nothing is a choice that compounds the same way doing something does — it just compounds in someone else's favor.

If the architecture question is one your firm hasn't named yet, the first ADR is a good place to start. An implementation partner can help you write it. Dan Cumberland Labs works with AEC firms making exactly these decisions — start with our [AI strategy services](/services/ai-strategy/) overview if a working session would help.

Move closer to the fire. Engage the harder architecture conversation rather than hiding in the tool one. AI is intellectual augmentation: it amplifies what the firm already does well, and it dilutes what the firm hasn't named. The architecture decision is what determines which one happens.


## References

1. Deltek, "What the 46th Annual Deltek Clarity AE Study Reveals About the Architecture and Engineering Industry" (2025) — [https://www.deltek.com/en/about/media-center/press-releases/2025/what-the-46th-annual-deltek-clarity-ae-study-reveals-about-the-industry](https://www.deltek.com/en/about/media-center/press-releases/2025/what-the-46th-annual-deltek-clarity-ae-study-reveals-about-the-industry)
2. Bluebeam (Nemetschek Group), "New Bluebeam Report Shows Early AI Adopters in AEC Seeing Significant ROI Despite Uneven Adoption" (2025) — [https://press.bluebeam.com/2025/10/new-bluebeam-report-shows-early-ai-adopters-in-aec-seeing-significant-roi-despite-uneven-adoption/](https://press.bluebeam.com/2025/10/new-bluebeam-report-shows-early-ai-adopters-in-aec-seeing-significant-roi-despite-uneven-adoption/)
3. aec+tech, "A Practical, Purposeful, and Private Approach to AI Adoption in AEC" (2025) — [https://www.aecplustech.com/blog/practical-purposeful-private-approach-ai-adoption-aec](https://www.aecplustech.com/blog/practical-purposeful-private-approach-ai-adoption-aec)
4. Yegatech, "Why should AEC companies experiment with building their own AI solutions?" (2025) — [https://yegatech.com/unlocking-competitive-advantage-why-should-aec-companies-experiment-with-building-their-own-ai-solutions/](https://yegatech.com/unlocking-competitive-advantage-why-should-aec-companies-experiment-with-building-their-own-ai-solutions/)
5. Interscale, "AI Security Risks AEC Firms Face When Deploying AI Tools" (2025) — [https://interscale.com.au/blog/ai-security-risks-aec/](https://interscale.com.au/blog/ai-security-risks-aec/)
6. Bluebeam (Nemetschek Group), "2026 AEC Technology Outlook" (2025) — [https://press.bluebeam.com/2025/10/new-bluebeam-report-shows-early-ai-adopters-in-aec-seeing-significant-roi-despite-uneven-adoption/](https://press.bluebeam.com/2025/10/new-bluebeam-report-shows-early-ai-adopters-in-aec-seeing-significant-roi-despite-uneven-adoption/)
7. aec+tech (Aaron Vorwerk), "A Practical, Purposeful, and Private Approach to AI Adoption in AEC" (2025) — [https://www.aecplustech.com/blog/practical-purposeful-private-approach-ai-adoption-aec](https://www.aecplustech.com/blog/practical-purposeful-private-approach-ai-adoption-aec)
8. Martin Fowler / Thoughtworks, "Architecture Decision Record (bliki)" (2024) — [https://martinfowler.com/bliki/ArchitectureDecisionRecord.html](https://martinfowler.com/bliki/ArchitectureDecisionRecord.html)
9. Amazon Web Services, "ADR process — AWS Prescriptive Guidance" (2024) — [https://docs.aws.amazon.com/prescriptive-guidance/latest/architectural-decision-records/adr-process.html](https://docs.aws.amazon.com/prescriptive-guidance/latest/architectural-decision-records/adr-process.html)
10. buildingSMART International, "Industry Foundation Classes (IFC)" (2024) — [https://www.buildingsmart.org/standards/bsi-standards/industry-foundation-classes/](https://www.buildingsmart.org/standards/bsi-standards/industry-foundation-classes/)
11. Building Design + Construction, "AI in AEC: Where firms should start and how to scale adoption" (2025) — [https://www.bdcnetwork.com/aec-tech/article/55359703/ai-in-aec-where-firms-should-start-and-how-to-scale-adoption](https://www.bdcnetwork.com/aec-tech/article/55359703/ai-in-aec-where-firms-should-start-and-how-to-scale-adoption)
12. Bluebeam (Nemetschek Group), "2026 AEC Technology Outlook — Early Adopter ROI Data" (2025) — [https://press.bluebeam.com/2025/10/new-bluebeam-report-shows-early-ai-adopters-in-aec-seeing-significant-roi-despite-uneven-adoption/](https://press.bluebeam.com/2025/10/new-bluebeam-report-shows-early-ai-adopters-in-aec-seeing-significant-roi-despite-uneven-adoption/)

