Using AES-GCM with large files (>800 MB)

After the release of the long-awaited vetKeys feature, I started integrating it into my encrypted file storage application. But I ran into a problem that I would like to solve.

Since the files are encrypted, the entire file has to be downloaded to the browser, decrypted there, and the decrypted data handed to the user as a file. But decrypting large files (~800 MB) fails with an error, which I understand to be a browser memory limitation:

> **OperationError**
> Raised when the operation failed for an operation-specific reason (e.g. algorithm parameters of invalid sizes, or there was an error decrypting the ciphertext).

I use a symmetric AES-GCM key, so I have to transfer the entire file and cannot process it in chunks. Should I switch to CTR or CBC to get around this limitation? I've read that CTR and CBC are less safe, which is why GCM is recommended.
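For context, this is roughly the single-shot call that fails on large inputs (a simplified sketch; key derivation and IV storage are omitted). Web Crypto's AES-GCM decrypt verifies the authentication tag over the whole message, so the entire ciphertext has to be in memory at once:

```ts
// Single-shot AES-GCM decrypt via Web Crypto. The authentication tag
// (appended to the ciphertext by encrypt) is verified over the whole
// message, so the entire file must be held in memory at once.
async function decryptWholeFile(
  key: CryptoKey,          // AES-GCM key with "decrypt" usage
  iv: Uint8Array,          // the 12-byte IV stored alongside the file
  ciphertext: ArrayBuffer, // the entire downloaded file
): Promise<ArrayBuffer> {
  return crypto.subtle.decrypt({ name: "AES-GCM", iv }, key, ciphertext);
}
```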

Have you considered encrypting the file in chunks? This would also be necessary for streaming encrypted files, for example video or audio.
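Roughly like this, as a sketch (chunk size, IV handling, and the upload step are up to you; in a real scheme you'd also want to bind the chunk index, e.g. via `additionalData`, so chunks can't be reordered or dropped):

```ts
const CHUNK_SIZE = 1 * 1024 * 1024; // 1 MiB of plaintext per chunk (tune as needed)

// Encrypt a file as independent AES-GCM chunks, each with its own random
// 12-byte IV. Each output chunk is laid out as [iv | ciphertext+tag].
async function encryptInChunks(key: CryptoKey, file: File): Promise<Uint8Array[]> {
  const chunks: Uint8Array[] = [];
  for (let offset = 0; offset < file.size; offset += CHUNK_SIZE) {
    const plain = await file.slice(offset, offset + CHUNK_SIZE).arrayBuffer();
    const iv = crypto.getRandomValues(new Uint8Array(12));
    const cipher = await crypto.subtle.encrypt({ name: "AES-GCM", iv }, key, plain);
    const chunk = new Uint8Array(iv.byteLength + cipher.byteLength);
    chunk.set(iv, 0);
    chunk.set(new Uint8Array(cipher), iv.byteLength);
    chunks.push(chunk);
  }
  return chunks;
}
```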


I thought about that. In this case I don't have to return a stream of chunks from the storage canister; I can simply return an array of chunk identifiers, query them from the frontend, decrypt them, and merge the results. And you're right, it would also be useful for streaming.
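On the frontend it would look something like this; `getChunkIds` and `getChunk` are hypothetical stand-ins for the canister query methods, and the chunks use the `[iv | ciphertext+tag]` layout from the sketch above:

```ts
// Hypothetical canister query methods; the real calls go through the agent.
declare function getChunkIds(fileId: string): Promise<string[]>;
declare function getChunk(chunkId: string): Promise<Uint8Array>;

// Fetch every chunk, decrypt it, and collect the plaintext parts.
async function downloadAndDecrypt(key: CryptoKey, fileId: string): Promise<Blob> {
  const parts: ArrayBuffer[] = [];
  for (const id of await getChunkIds(fileId)) {
    const chunk = await getChunk(id);
    const iv = chunk.subarray(0, 12);
    const body = chunk.subarray(12);
    parts.push(await crypto.subtle.decrypt({ name: "AES-GCM", iv }, key, body));
  }
  // A Blob built from many parts avoids one giant contiguous ArrayBuffer.
  return new Blob(parts);
}
```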

Encrypting the chunks before uploading them to the canister helped me bypass this limit. Then another limit awaited me: 2 GB per ArrayBuffer. In any case, I can't upload more than that, because I don't use stable memory for storage in Motoko.
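On the download side, one way to sidestep the 2 GB ArrayBuffer limit (a sketch, assuming the `downloadAndDecrypt` function above) is to never merge the decrypted chunks into a single buffer and instead hand the multi-part Blob straight to a download link; Blobs may be disk-backed in some browsers:

```ts
// Hand a Blob to the user as a download without materializing the whole
// file in one ArrayBuffer; the Blob itself may be disk-backed.
function saveBlob(blob: Blob, filename: string): void {
  const url = URL.createObjectURL(blob);
  const a = document.createElement("a");
  a.href = url;
  a.download = filename;
  a.click();
  URL.revokeObjectURL(url);
}
```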