Backup from GDrive to your own S3 bucket


  • GDrive, part of the Google Workspace suite of products, contains important documents that need to be backed up frequently.
  • Using the LegacyFlo GDrive-S3-ZIP request type, you can now back up files from any user's Google Drive to an S3 bucket in your AWS account.

Step 1: Preparing your S3 bucket

A. Create an S3 bucket on AWS

Note: We recommend that you use a single S3 bucket in your AWS account to receive LegacyFlo request output.

Create your S3 bucket in a supported region and grant LegacyFlo list and write access to it. The steps for the same are given in the links below:

  1. Get an AWS account.
  2. Create an S3 bucket in a region supported by Mithi. 
  3. Set a temporary bucket policy to allow LegacyFlo to get the name and region of the buckets as mentioned here.
  4. When the S3 bucket is used as a source, grant LegacyFlo list and read access to the objects in the bucket. When the S3 bucket is used as a destination, grant LegacyFlo list and write access. The steps to do so are mentioned here.
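The access grants described above take the form of an S3 bucket policy. The sketch below builds such a policy document for a destination bucket; the bucket name and the LegacyFlo principal ARN are placeholders, so use the actual values given in the Mithi documentation.

```python
import json

# Placeholders -- substitute the real values from the Mithi documentation.
BUCKET = "my-backup-bucket"                              # hypothetical bucket name
LEGACYFLO_PRINCIPAL = "arn:aws:iam::111111111111:root"   # placeholder principal

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {   # allow LegacyFlo to list the bucket
            "Effect": "Allow",
            "Principal": {"AWS": LEGACYFLO_PRINCIPAL},
            "Action": "s3:ListBucket",
            "Resource": f"arn:aws:s3:::{BUCKET}",
        },
        {   # allow LegacyFlo to write objects
            # (for a source bucket, grant s3:GetObject instead)
            "Effect": "Allow",
            "Principal": {"AWS": LEGACYFLO_PRINCIPAL},
            "Action": "s3:PutObject",
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
        },
    ],
}

print(json.dumps(policy, indent=2))
```

The generated JSON can be pasted into the bucket's Permissions > Bucket policy editor in the AWS console.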

B. Register your S3 bucket as a source/destination S3 bucket on LegacyFlo

For LegacyFlo to upload data to your S3 bucket, it needs to be added as a destination S3 bucket in LegacyFlo. The steps for doing the same are given below:

Note: This step has to be done only once for each destination bucket.

  1. Log in to LegacyFlo.
  2. Click the Initials icon at the top of the left-most pane.
  3. Select Register S3 bucket.
  4. Click the Add S3 bucket button. A dialog box will pop up.
  5. On the Prepare Bucket page, you will be shown the list of steps to be completed to create and grant access to the S3 bucket. Confirm that you have done as instructed and click I confirm that the above configuration is complete.
  6. Click Continue.
  7. On the Configure, Verify & Save page, give the Bucket Name and select the Bucket Region.
  8. Click Verify. LegacyFlo will check access to the bucket.
  9. If LegacyFlo cannot access the bucket, you will be shown an error message similar to "All access to this object has been disabled". Check the S3 access rights and bucket name, then retry.
  10. If the connection is successful, click Save. LegacyFlo will remember this S3 bucket for future requests.

Step 2: Getting a Google Workspace access key and registering it with LegacyFlo

Your Google Workspace account needs to be configured to give LegacyFlo access to the user data on GDrive that is to be backed up. The steps to do so are given here.

Step 3: Submitting a LegacyFlo request

You can submit requests to backup GDrive data using the LegacyFlo interface.

The request type to be used is GDrive-S3-ZIP.

The parameters for the request are as follows:

Source Information

User ID

The user id on the Google Workspace domain whose data needs to be backed up.

(Note: The id should not contain the domain name.)

Domain

The source domain on Google Workspace.
GDrive Folder name

The folder from which data needs to be backed up. All the contents of this folder and its subfolders will be backed up.

Note that the folder name should not start with a /.

If all the GDrive data is to be backed up, leave this value empty.


Filter Information

Since (DD-MMM-YYYY)

If this field is left blank, all the files in the source will be added to the backup.

If this field has a valid input, then all files which were created or modified since the date specified will be added to the backup.
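The Since value uses the DD-MMM-YYYY format (e.g. 15-Mar-2023). A minimal sketch of how such a value can be checked before submitting a request, assuming an English month abbreviation:

```python
from datetime import datetime

def parse_since(value: str):
    """Parse a Since filter in DD-MMM-YYYY format; blank means no filter."""
    if not value:
        return None  # blank field: all files in the source are backed up
    return datetime.strptime(value, "%d-%b-%Y")

# A blank value applies no filter; a valid date restricts the backup.
print(parse_since("15-Mar-2023"))
```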

Other Information

Send report to

A valid email id to which status reports are sent.

Destination Information

S3 bucket name

The S3 bucket for the destination file. Select the S3 bucket name from the drop-down list.

The list will show:

  • S3 buckets in your AWS account registered with LegacyFlo
  • the bucket hosting your Vaultastic Open Store (mithi-ind1-vaultastic-legacystore)
S3 folder path

If the file has to be created on the Vaultastic Open Store, then this parameter will have the name of your Vaultastic domain.

If the file is to be created on your own bucket, give the folder which will contain the file. Leave this value empty if the destination file has to be created on the root of the S3 bucket.

NOTE: S3 folder path should not start or end with a "/"

File name

The destination filename, i.e. the name of the file that will contain the output of the LegacyFlo request.

NOTE: Filenames should not start or end with a "/"
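The path rules stated in the notes above can be checked before submitting a request. This is a hypothetical helper, not part of LegacyFlo; it simply encodes the constraints documented for each parameter:

```python
def check_request_params(gdrive_folder: str, s3_folder: str, file_name: str):
    """Validate the path constraints stated in the parameter tables above."""
    errors = []
    # GDrive folder name should not start with a "/" (empty means back up everything)
    if gdrive_folder.startswith("/"):
        errors.append("GDrive folder name should not start with a '/'")
    # S3 folder path should not start or end with a "/" (empty means bucket root)
    if s3_folder and (s3_folder.startswith("/") or s3_folder.endswith("/")):
        errors.append("S3 folder path should not start or end with a '/'")
    # Filenames should not start or end with a "/"
    if file_name.startswith("/") or file_name.endswith("/"):
        errors.append("Filename should not start or end with a '/'")
    return errors

# A well-formed request produces no errors.
print(check_request_params("Reports", "backups/gdrive", "user1.zip"))
```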

Step 4 (optional): Submitting multiple LegacyFlo requests to backup data for multiple users

To submit multiple requests using a CSV, create a CSV file listing all the users and the other details required for each request.
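Such a CSV can be generated programmatically. The column names below are hypothetical, they mirror the request parameters described above; the actual CSV template is defined by LegacyFlo:

```python
import csv
import io

# Hypothetical column names based on the request parameters above --
# the real CSV template is defined by LegacyFlo.
FIELDS = ["user_id", "domain", "gdrive_folder", "s3_bucket", "s3_folder", "file_name"]

users = [
    {"user_id": "alice", "domain": "example.com", "gdrive_folder": "",
     "s3_bucket": "my-backup-bucket", "s3_folder": "gdrive", "file_name": "alice.zip"},
    {"user_id": "bob", "domain": "example.com", "gdrive_folder": "Reports",
     "s3_bucket": "my-backup-bucket", "s3_folder": "gdrive", "file_name": "bob.zip"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(users)
print(buf.getvalue())
```

Note that, per the parameter tables, the user id carries no domain name and none of the path fields start or end with a "/".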