MithiDocs

Backup from GDrive to your own S3 bucket

Overview

  • GDrive, part of the Google Workspace suite of products, contains important documents that need to be backed up frequently.
  • Using the LegacyFlo GDrive-S3-ZIP request type, you can now back up files from any user's Google Drive to an S3 bucket in your AWS account.

Step 1: Preparing your S3 bucket


A. Create an S3 bucket on AWS

Note: We recommend that you use a single S3 bucket in your AWS account to receive LegacyFlo request output.


Create your S3 bucket in a supported region and grant LegacyFlo list and write access to it. The steps for the same are given in the links below:

  1. Get an AWS account.
  2. Create an S3 bucket in a region supported by Mithi.
  3. Grant list and write access to LegacyFlo.

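The access grant in step 3 is typically done with an S3 bucket policy. The sketch below builds such a policy document; note that the principal ARN shown is a placeholder, and the exact principal and actions LegacyFlo requires are given in Mithi's "Allow list and write access to LegacyFlo" documentation.

```python
import json

# Placeholder principal -- substitute the actual LegacyFlo AWS account or
# role ARN from Mithi's access documentation.
LEGACYFLO_PRINCIPAL = "arn:aws:iam::111111111111:root"  # hypothetical

def build_legacyflo_bucket_policy(bucket_name: str) -> dict:
    """Build a bucket policy granting LegacyFlo list and write access."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "LegacyFloList",
                "Effect": "Allow",
                "Principal": {"AWS": LEGACYFLO_PRINCIPAL},
                "Action": "s3:ListBucket",
                "Resource": f"arn:aws:s3:::{bucket_name}",
            },
            {
                "Sid": "LegacyFloWrite",
                "Effect": "Allow",
                "Principal": {"AWS": LEGACYFLO_PRINCIPAL},
                "Action": "s3:PutObject",
                "Resource": f"arn:aws:s3:::{bucket_name}/*",
            },
        ],
    }

# The resulting JSON can be pasted into the bucket's Permissions tab.
policy_json = json.dumps(build_legacyflo_bucket_policy("backupbucket"), indent=2)
```

Note that `s3:ListBucket` applies to the bucket ARN itself, while `s3:PutObject` applies to the objects (`/*`) inside it.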
B. Register your S3 bucket as a destination S3 bucket on LegacyFlo

For LegacyFlo to upload data to your S3 bucket, the bucket needs to be registered as a destination S3 bucket in LegacyFlo. The steps for doing so are given below:

Note: This step has to be done only once for each destination bucket.


  1. Log in to LegacyFlo
  2. Select Add Request
  3. On the left pane,
    1. Give a dummy title 
    2. Select MITHI-S3-ZIP as the Request Type
  4. In the section on Destination Information
    1. Click the [+] icon next to S3 Bucket Name
    2. On the Prepare Bucket page, you will be shown the list of steps to be completed to create and grant access to the S3 bucket. Confirm that you have done as instructed and click on I confirm that the above configuration is complete.
    3. Click Continue
    4. On the Configure, Verify & Save page,  give the Bucket Name and select the Bucket Region.
    5. Click Verify. LegacyFlo will check the access to the bucket.
    6. If LegacyFlo cannot access the bucket, you will be shown an error message similar to "All access to this object has been disabled". Check the S3 access rights and bucket name, and retry.
    7. If the connection is successful, click Save. LegacyFlo will remember this S3 bucket for future requests.

Step 2: Getting a Google Workspace access key and registering it with LegacyFlo

Your Google Workspace needs to be configured to give LegacyFlo access to the user data on GDrive that is to be backed up. The steps to do so are given here.

Step 3: Submitting a LegacyFlo request

You can submit requests to back up GDrive data using the LegacyFlo interface.

The request type to be used is GDrive-S3-ZIP.

The parameters for the request are as follows:

Source Information

  • User ID: The user id on the Google Workspace domain whose data needs to be backed up. (Note: The id should not contain the domain name.) Sample value: "john"
  • Domain: The source domain on Google Workspace. Sample value: "baya.com"
  • GDrive Folder name: The folder from which data needs to be backed up. All the contents of this folder and its subfolders will be backed up. Note that the folder name should not start with a /. If all the GDrive data is to be backed up, leave this value empty. Sample value: "23-Jun-2020"

Filter Information

  • Since (YYYY-MM-DD): If this field is left blank, all the files in the source will be added to the backup. If it has a valid date, only the files created or modified since the specified date will be added to the backup. Sample value: "2022-01-31"

Other Information

  • Send report to: A valid email id to which status reports are sent. Sample value: john@baya.com

Destination Information

  • S3 bucket name: The name of the S3 bucket in your account which will hold the output of the LegacyFlo request. (Note: Refer to the steps here to prepare your own S3 bucket to receive LegacyFlo output.) Sample value: backupbucket
  • S3 folder path: The folder created on your S3 bucket for the output. Sample values: GDrivebackup, FreshdeskSolution
  • File name: The name of the destination file, which will hold the result of the LegacyFlo job. Sample values: john-drive.zip, freshdesk-nov-backup.zip
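Taken together, the destination fields determine where the output ZIP lands in your bucket. The sketch below assumes the common S3 convention of joining the folder path and file name with a /; the exact key layout LegacyFlo writes is defined by LegacyFlo itself.

```python
def destination_key(folder_path: str, file_name: str) -> str:
    """Compose the S3 object key for the LegacyFlo output ZIP
    (assumed layout: '<folder path>/<file name>')."""
    folder = folder_path.strip("/")
    return f"{folder}/{file_name}" if folder else file_name
```

With the sample values above, the output would land at s3://backupbucket/GDrivebackup/john-drive.zip.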

Step 3 (optional): Submitting multiple LegacyFlo requests to back up data for multiple users

To submit multiple jobs using a CSV, create a CSV file listing all the users and the other details required for each request.
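A minimal sketch of generating such a CSV follows. The column headers here are hypothetical placeholders mirroring the request parameters described above; take the actual required headers from LegacyFlo's CSV upload documentation.

```python
import csv
import io

# Hypothetical column headers mirroring the GDrive-S3-ZIP parameters --
# replace with the headers required by LegacyFlo's CSV upload.
FIELDS = ["user_id", "domain", "gdrive_folder", "since",
          "report_email", "s3_bucket", "s3_folder", "file_name"]

def build_bulk_csv(rows: list) -> str:
    """Render one CSV row per user/request from a list of dicts."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    for row in rows:
        writer.writerow(row)
    return buf.getvalue()

csv_text = build_bulk_csv([
    {"user_id": "john", "domain": "baya.com", "gdrive_folder": "",
     "since": "", "report_email": "john@baya.com",
     "s3_bucket": "backupbucket", "s3_folder": "GDrivebackup",
     "file_name": "john-drive.zip"},
])
```

Each row becomes one LegacyFlo request, so per-user values such as the output file name should be unique to avoid overwriting earlier results.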