The current generation of cloud meeting tools all work the same way. A bot joins your call, the audio streams to a third-party server, transcription happens there, the summary is generated there, and a copy of the document lives there afterwards. From a feature point of view they look impressive. From a privacy point of view they hand the contents of your meeting to a vendor before anyone in the meeting has had a chance to read it back.
This piece is for lawyers, advisors, compliance leads, HR teams and public sector staff who are evaluating these tools, and for anyone in those roles who has been told to “just use whatever marketing is using”. The argument here is not that AI meeting tools are bad. It is that the cloud delivery model is not designed for sensitive work, and alternatives exist.
What “cloud” means in a meeting tool
The word gets used loosely. Three things happen on a vendor’s servers when you use a cloud meeting tool:
- The audio is uploaded. Either through a bot that joins the call, or via a recorder that streams audio in real time. Either way the recording leaves the device.
- The transcription runs there. Speech-to-text models are large. The vendor runs them on their hardware, against your audio.
- The document is generated there. A language model reads the transcript and produces the meeting notes. That happens on the vendor’s infrastructure too.
A copy of all three artefacts (audio, transcript, document) ends up in the vendor’s storage. Some vendors give you a setting to delete it after some period. Some keep it indefinitely. Almost all reserve the right to use the data for “service improvement” unless you actively opt out.
For a casual all-hands meeting where nothing sensitive is said, this is fine. For a meeting where a client says something the firm has a duty to keep confidential, it is not.
The professional and contractual problem
In several professional contexts the duty of confidence sits above the question of whether the cloud vendor has good security.
A solicitor taking instructions from a client owes a duty of confidentiality that is older than any cloud provider’s security certification. The fact that the conversation has been processed by a third party at all is the issue. The solicitor cannot consent on the client’s behalf to that processing happening at a vendor they did not choose, in a jurisdiction they did not name, against terms they have not read.
In-house counsel discussing a regulatory matter has the same constraint. A compliance officer talking to a whistleblower has it more sharply. A clinician taking a history from a patient has it from a different angle but with the same effect. A trustee meeting where the charity’s governance is being discussed has it again.
In each case the party that should be deciding what happens to the recording is not the firm or the meeting organiser. It is the person who said the sensitive thing, and they did not get to have the conversation about cloud AI before they spoke.
The data-protection problem
UK GDPR makes this concrete. The minute personal data is processed by a third party on your behalf, that third party becomes a processor. You, as the controller, need a written contract with them setting out the processing, a record of that processing, an assessment of the risks, and a way to honour data-subject rights in respect of the data the processor holds. If the processing happens outside the UK or the EU, you also need a transfer mechanism.
A cloud meeting tool is a processor by any reading. The vendor’s standard data-processing addendum tries to cover this, and for low-risk meeting content it works fine. For sensitive meetings it invites scrutiny that could have been avoided by not creating the processor relationship in the first place.
A tool that runs entirely on your own machine is not a processor. There is nothing to assess, no DPIA to write, no transfer mechanism to argue about. The legal posture is simpler because the technical posture is simpler.
The “but it’s encrypted in transit” answer
Vendors push back on this with a technical argument. The audio is encrypted in transit. The storage is encrypted at rest. The vendor is SOC 2 certified. The data is in EU regions. Each of those is true and each of them is beside the point.
Encrypted in transit means the vendor is the one decrypting it at the other end. They can read it. They have to be able to read it for the transcription model to run on it. The encryption is a barrier between you and the wider internet. It is not a barrier between you and the vendor.
The same is true of encryption at rest. The vendor holds the keys. The encryption protects the data from a separate intruder breaking into the vendor’s storage. It does not protect the data from the vendor itself, the vendor’s employees acting in error, or the vendor receiving a lawful order from a government to hand it over.
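The key-holding point can be made concrete with a deliberately toy sketch in Python. This is not a real cipher and not how any vendor actually encrypts storage; it is a stand-in for any symmetric scheme, and the names (`vendor_key`, `xor_cipher`) are illustrative. The property it demonstrates is general: encryption protects the data from parties without the key, and the vendor is not one of those parties.

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher: the same operation encrypts and decrypts.
    # A stand-in for whatever real scheme a vendor uses at rest.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# The vendor generates and holds the key used for "encryption at rest".
vendor_key = os.urandom(32)

audio = b"client admits the filing deadline was missed"

# What actually sits in the vendor's storage: noise to an outside intruder.
stored = xor_cipher(audio, vendor_key)
assert stored != audio

# The vendor, holding the key, can recover the plaintext whenever its
# systems need to -- for transcription, for support, or under a lawful order.
assert xor_cipher(stored, vendor_key) == audio
```

Swapping the toy cipher for AES and a key-management service changes nothing about the last two lines: whoever controls the key controls the plaintext.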
For most meetings, none of this matters. For the meetings where the cloud question comes up at all, all of it does.
What changes when nothing leaves the device
Whistle Enterprise records the meeting on your computer, transcribes it on your computer and writes the document on your computer. The audio, the transcript and the document live in a workspace on your machine that you can encrypt with a password. Nothing about the meeting reaches a vendor’s server because there is no vendor server. The product runs locally; there is no service to call.
The professional duties still apply. The data-protection duties still apply. But the technical surface that those duties have to cover shrinks to “what’s on this machine”. For most of the audiences listed at the top of this page, that is the right size for it to be.
If you want the buying detail, the pricing page has the tier table and the FAQ. If you want to read the security position in product terms, the security notes cover what the product does and does not do at a system level. If you want to see what the document Whistle Enterprise produces actually looks like, download the trial and run it on a recording you already have.