On the security of plugins

March 14, 2022

I was recently curious about how various applications implemented their plugin architectures. Obsidian plugins in particular seemed powerful, and I was impressed that they had seemingly found a way to implement extensions securely without needing a sandbox.

I installed a calendar plugin and was immediately intrigued: how was the application rendering external code directly in its own interface, as if it were part of the application? Had their team solved secure extensions?

I then saw this thread. Ah. So security is not the goal after all. Extensions are rendered directly in the application as if they were part of it because they really are part of it: external, network-downloaded code runs as first-party. You could seemingly download a plugin that runs rm -rf ~/, and it would proceed to delete your home directory. The advice I’ve seen on how to avoid this is essentially to practice caution: “don’t download plugins that run rm -rf on your hard drive.”

I felt somewhat let down. Many local-only apps tout themselves as “privacy-focused” simply by virtue of the fact that they operate on local data. Yet what does privacy really mean, if apps that grant plugins full system access use that term?

(Logseq was one I saw pop up, which describes itself as “privacy-first”; I have not yet peered into its plugin architecture. Obsidian offers end-to-end encrypted sync; I wonder, however, whether plugin developers could extract the encryption keys, given that plugins run with the same permissions as the application.)

I take occupational offense at misuse of the term private, because we’ve spent the last half decade building Standard Notes to be private without any ambiguity about what that term means.

Our extensions environment uses a sandboxed two-way message bridge. Extensions cannot access anything you wouldn’t want them to, and most certainly not your filesystem. Extensions for us are primarily editors and themes, not general application extensibility. They can’t add custom toolbar buttons or implement custom behavior outside the context of an isolated editor frame. That’s less power, certainly, but it’s a trade-off between power and access. For us, that’s a no-brainer. I’m not sure otherwise privacy-minded people understand this trade-off.
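To make that concrete, here’s a rough sketch of the general pattern, written for this post rather than taken from our production code: the extension loads inside a sandboxed iframe, and the only channel between it and the host application is postMessage. The message types, URL, and storage here are made up for the example.

```typescript
// Illustrative sketch of a sandboxed two-way message bridge (not production code).
// "allow-scripts" without "allow-same-origin" gives the frame a unique opaque
// origin: no cookies, no access to the host's DOM or storage, no filesystem.
const frame = document.createElement("iframe");
frame.setAttribute("sandbox", "allow-scripts");
frame.src = "https://extensions.example.com/editor/index.html"; // hypothetical
document.body.appendChild(frame);

// Hypothetical message shapes the host is willing to handle.
type ExtensionMessage =
  | { type: "save-note"; text: string }
  | { type: "request-note" };

// In-memory store standing in for the host's real persistence layer.
let noteText = "";

// Host side of the bridge. Because the sandboxed frame has an opaque origin,
// we identify it by its window rather than by event.origin, and we ignore
// messages from anything else.
window.addEventListener("message", (event: MessageEvent<ExtensionMessage>) => {
  if (event.source !== frame.contentWindow) return;

  if (event.data.type === "save-note") {
    noteText = event.data.text; // the host decides how and where data persists
  } else if (event.data.type === "request-note") {
    // Reply over the same channel; the extension never reads storage directly.
    frame.contentWindow?.postMessage({ type: "note-contents", text: noteText }, "*");
  }
});
```

The extension side mirrors this: it posts its requests to its parent window and listens for the host’s replies, never touching anything outside its own frame.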

Using the word “private” to mean “anything that isn’t in the cloud” is a low bar, in my opinion. We know this is not the definition of private we want. When we think private, and when software products typically use the word, they mean that privacy is a primary focus of the application, enacted and permeated through mission, culture, code, and operation. Using private to refer to something that is merely local-first is like a “gluten-free” label on a water bottle. Sure, it’s true in a sense, but also, come on.

When it comes to privacy and security, it’s vitally important to be as unambiguous as possible. You could hardly over-communicate. Ideally you’d let your code speak for itself, but closed, proprietary source seems to be the trend in this generation of new tools. When you close-source your encryption software, you have a lot of compensating to do. I can’t yet say I’ve seen that compensation from the many tools offering privacy and encryption services.

But the decade is young.

Thanks for reading
