What is the best strategy for encrypting files at rest in Ruby?

  • I am building a Rails 3.0.7 app with Ruby 1.9.2. We use the Carrierwave gem to handle file uploads. I would like to insert pre-processing of the files to encrypt them at rest, and decrypt them prior to download by users. I can access the file data prior to the upload in Carrierwave, and I have some ideas about where to insert decryption, but I'm not sure how best to apply encryption to the file data itself. I know bcrypt is popular for password storage, but is it the right solution for file data? If so, does anyone have experience running file data through the Ruby bcrypt library?

    Update: Based on the excellent response to a similar question, please note the following:

    - Q: Who needs to decrypt the files?
      A: Authenticated users with appropriate authorization should be able to download files, but all decryption can be self-contained in the application (i.e., the users are not involved in generating or managing keys).

    - Q: Will this file be changed in place?
      A: No. Files are not modified in place.

    - Q: Who is the attacker?
      A: Since files will be stored either on a remote file server or in the cloud, encryption of files at rest is intended to thwart: (1) attackers who are able to breach the perimeter security of the file store, and (2) "rogue insiders" involved in management of the file store. It is understood that if an attacker manages to breach the application security and gain sufficient authorization, they would potentially have access to decrypted file data.

    - Q: What level of access should we assume?
      A: Application users will be authenticated via a conservative Devise login strategy, and resource-level access is authorized by roles via Declarative Authorization. Infrastructure users (e.g., hosting and file store personnel) theoretically have access inside the perimeter of the file store.

    - Q: Do we need to provide integrity and non-repudiation services?
      A: I am interested in comments on this as well. If files are stored in the cloud (e.g., AWS S3), I'm roughly familiar with techniques for validating a file's integrity via metadata, but am far from expert on this point. Would love to hear what others are doing in this respect.
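    For illustration only, a pre-processing hook like the one described could sit in a CarrierWave uploader roughly as sketched here; FileCrypto.encrypt is a hypothetical placeholder for whatever cipher layer is chosen, not an existing helper from the question:

      # app/uploaders/secure_file_uploader.rb (hypothetical)
      class SecureFileUploader < CarrierWave::Uploader::Base
        storage :fog                 # e.g., S3 via fog; :file behaves the same way

        process :encrypt_contents    # runs on the cached file, before it is stored

        def encrypt_contents
          # Reading the whole file is fine for small uploads; a streaming
          # rewrite (as in the answer below) is preferable for large ones.
          plaintext = File.open(current_path, 'rb') { |f| f.read }
          File.open(current_path, 'wb') { |f| f.write(FileCrypto.encrypt(plaintext)) }
        end
      end

    Decryption before download would live in the controller or a download action rather than in the uploader; this sketch only covers the encrypt-at-store half.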

  • Answer:

    Some points:

    Performance: You are going to want to encrypt the files in a streaming fashion, probably with AES or Blowfish (the block cipher bcrypt is built on). You will want your app to stream the file back without loading the entire thing into memory, and you should benchmark the performance of a variety of ciphers and modes, looking at both CPU and memory overhead. Also note that some ciphers enjoy hardware support for dramatically better performance (AES at least; not sure about Blowfish).

    Resumability: Supporting the HTTP Range header (to allow pausing/resuming downloads) may be a concern. You will have to be careful when selecting a block cipher mode: the standard CBC mode is out, since you can't seek with it. See http://en.wikipedia.org/wiki/Block_cipher_modes_of_operation for an obtuse overview of the modes. (A streaming, seekable sketch follows this answer.)

    Data Store Attacks: Make sure you're not using the same key for every file, in case an attacker gains access to your S3 bucket or other backing store. A randomly generated key per file is ideal (not derived from the file name or any other metadata stored with the file). Perhaps the file's key could be encrypted with an asymmetric cipher, a pretty common practice. If the asymmetric keys are stolen you are just as vulnerable; note, however, that encrypting a random key alongside the file gives you useful properties: if an asymmetric key is compromised (but not the full data store), you can re-encrypt the random file keys under a new asymmetric key without re-encrypting every affected file, and you can use key rotation to mitigate the risk of a single asymmetric key being stolen, assuming you have a way to store those keys separately. (See the key-wrapping sketch below.)

    Insider Attacks: Avoid letting personnel have direct access to the encryption keys for the files (or to the asymmetric keys, if you go with the approach above) whenever possible. A good way to manage this is a separate process/daemon that owns the keys and mediates access to them; if you use the asymmetric-key approach, it can perform the encryption/decryption of the file keys so that the asymmetric keys are never exposed outside of that daemon. A daemon like this is similar in spirit to OS X's keychain daemon or a TPM, and it could even be hosted as a separate service on especially locked-down hosts. (See the key-service sketch below.)

    Logic Attacks: Try to avoid ever loading encryption keys into memory, and when you have to, clear them from memory as soon as possible. You don't want an attacker to use a buffer overflow or other fun logic bug to expose keys that were recently used (another reason to use an external process).
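    A minimal sketch of the streaming/resumability points above, using Ruby's OpenSSL bindings with AES in CTR mode (a counter mode you can seek in). This is an illustration added for clarity, not code from the answer, and FileCrypto is an assumed module name:

      require 'openssl'

      # Hypothetical streaming helper: pushes a file through AES-256-CTR in
      # fixed-size chunks, so neither side holds the whole file in memory.
      # Check OpenSSL::Cipher.ciphers to confirm your build exposes aes-256-ctr.
      module FileCrypto
        CHUNK = 64 * 1024

        def self.encrypt_stream(key, iv, input_io, output_io)
          cipher = OpenSSL::Cipher.new('aes-256-ctr')
          cipher.encrypt
          cipher.key = key   # 32 random bytes, unique per file
          cipher.iv  = iv    # 16 random bytes, stored alongside the ciphertext
          while (chunk = input_io.read(CHUNK))
            output_io.write(cipher.update(chunk))
          end
          output_io.write(cipher.final)   # empty for CTR; kept for completeness
        end

        # CTR decryption is the same keystream XOR, so the same routine works.
        # To honor an HTTP Range request, advance the counter to the block
        # containing the requested offset instead of decrypting from byte 0.
        def self.decrypt_stream(key, iv, input_io, output_io)
          encrypt_stream(key, iv, input_io, output_io)
        end
      end

    Note that CTR provides no integrity protection by itself; storing something like an HMAC over the ciphertext alongside the file is one way to address the integrity question raised in the question.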
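    And a sketch of the per-file random key idea (often called envelope encryption): only the RSA-wrapped form of each file key is stored next to the ciphertext, so rotating the asymmetric key means re-wrapping small key blobs rather than re-encrypting the files. Module and method names here are assumptions for illustration:

      require 'openssl'
      require 'securerandom'
      require 'base64'

      # Hypothetical key-wrapping helpers. The RSA private key should live with
      # a separate key-management process, never on the web/app hosts.
      module FileKeyWrap
        def self.new_file_key
          { key: SecureRandom.random_bytes(32), iv: SecureRandom.random_bytes(16) }
        end

        # Store only this wrapped blob alongside the file (e.g., as S3 metadata).
        def self.wrap(file_key, public_key_pem)
          rsa = OpenSSL::PKey::RSA.new(public_key_pem)
          Base64.strict_encode64(rsa.public_encrypt(file_key[:key] + file_key[:iv]))
        end

        # Key rotation: unwrap with the old private key, re-wrap with the new
        # public key. The bulk ciphertext in the data store is untouched.
        def self.rewrap(wrapped, old_private_pem, new_public_pem)
          blob = OpenSSL::PKey::RSA.new(old_private_pem)
                                   .private_decrypt(Base64.strict_decode64(wrapped))
          Base64.strict_encode64(OpenSSL::PKey::RSA.new(new_public_pem).public_encrypt(blob))
        end
      end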
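    Finally, to make the "separate key daemon" idea concrete, a rough sketch using Ruby's standard DRb library. DRb is used here only for brevity; a real deployment would need an authenticated, encrypted channel to the key host, and all names and URIs below are hypothetical:

      require 'drb/drb'
      require 'openssl'
      require 'base64'

      # Runs on a locked-down key host. It holds the RSA private key and only
      # ever hands back unwrapped per-file keys; the private key never leaves
      # this process, so app hosts (and their operators) never see it.
      class KeyService
        def initialize(private_key_pem)
          @rsa = OpenSSL::PKey::RSA.new(private_key_pem)
        end

        def unwrap(wrapped_base64)
          @rsa.private_decrypt(Base64.strict_decode64(wrapped_base64))
        end
      end

      # On the key host:
      #   DRb.start_service('druby://keyhost.internal:8787', KeyService.new(pem))
      #   DRb.thread.join
      # In the Rails app:
      #   service  = DRbObject.new_with_uri('druby://keyhost.internal:8787')
      #   file_key = service.unwrap(stored_wrapped_key)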

Ian MacLeod at Quora
