The launch of ChatGPT in late 2022 reignited a lifelong obsession with systems thinking — inspired by James Burke’s Connections approach to uncovering hidden links between disciplines. That spark led to the creation of Brug AI, a foresight engine built to surface breakthrough ideas by connecting emerging technologies.
Real-world testing quickly exposed a more fundamental vulnerability.
As Brug AI generated valuable, potentially patentable insights, it became clear that the innovation process itself was legally exposed. As of March 2025, major LLM providers took wildly different stances on handling sensitive R&D content: some explicitly warned users against entering critical information, while others downplayed the risks entirely, comparing their platforms to email or cloud storage services despite the very different legal realities of those channels.
There was no standard. No clear framework. And no way to prove anything after the fact.
The realization was simple but urgent:
The innovation process itself needs legal and technical protection, not just its outputs.
This critical gap led to the invention of NDAMode™: the first cryptographic framework for securing AI sessions at the point of origin, enabling enforceable confidentiality, authorship verification, and full session compliance, all without relying on blind trust.
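To make the underlying idea concrete, the sketch below shows one way a session could be sealed at the point of origin: hash the transcript, timestamp it, and sign the record so integrity and authorship can be checked later. This is an illustrative assumption built from off-the-shelf primitives (SHA-256 and Ed25519 via Python's third-party `cryptography` package), not a description of NDAMode's actual protocol; the function names and record format here are hypothetical.

```python
# Illustrative sketch only: NOT NDAMode's actual protocol. It shows the
# general pattern of sealing an AI session at the point of origin: hash the
# transcript, timestamp it, and sign the record so integrity and authorship
# can be verified later. Requires the third-party "cryptography" package.
import hashlib
import json
import time

from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)


def seal_session(transcript: str, key: Ed25519PrivateKey) -> dict:
    """Produce a tamper-evident, signed record of a session transcript."""
    record = {
        "digest": hashlib.sha256(transcript.encode("utf-8")).hexdigest(),
        # A real system would use a trusted timestamp authority here.
        "sealed_at": int(time.time()),
    }
    # Sign the canonical JSON form of the record with the inventor's key.
    payload = json.dumps(record, sort_keys=True).encode("utf-8")
    record["signature"] = key.sign(payload).hex()
    return record


def verify_seal(record: dict, public_key: Ed25519PublicKey) -> None:
    """Raise InvalidSignature if the record was altered after sealing."""
    payload = json.dumps(
        {"digest": record["digest"], "sealed_at": record["sealed_at"]},
        sort_keys=True,
    ).encode("utf-8")
    public_key.verify(bytes.fromhex(record["signature"]), payload)


key = Ed25519PrivateKey.generate()
record = seal_session("user: novel idea...\nassistant: analysis...", key)
verify_seal(record, key.public_key())  # passes silently; tampering would raise
```

Under a scheme like this, the private key never leaves the inventor's machine; publishing or escrowing the matching public key would let a third party, such as a patent examiner, confirm when the session was sealed and that it has not been altered, without ever seeing the transcript itself.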