Overview
- Google Drive (GDrive), part of the Google Workspace suite of products, contains important documents that need to be backed up frequently.
- Using the LegacyFlo GDrive-S3-ZIP request type, you can back up files from any user's Google Drive to an S3 bucket in your AWS account.
Step 1: Preparing your S3 bucket
A. Create an S3 bucket on AWS
Create your S3 bucket in a supported region and grant list and write access to LegacyFlo. The steps are given in the links below:
- Get an AWS account.
- Create an S3 bucket in a region supported by Mithi.
- Set a temporary bucket policy to allow LegacyFlo to get the name and region of the bucket, as mentioned here.
- When the S3 bucket is used as a source, grant LegacyFlo list and read access to the objects in the bucket. When it is used as a destination (as in this workflow), grant LegacyFlo list and write access. The steps to do so are mentioned here.
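The destination-bucket permissions described above can be sketched as a standard S3 bucket policy. This is a minimal illustration only: the principal ARN below is a placeholder, and the actual LegacyFlo principal and any extra actions it requires must be taken from the Mithi documentation linked above.

```python
import json

# Hypothetical placeholder -- substitute the LegacyFlo principal
# published in the Mithi documentation.
LEGACYFLO_PRINCIPAL = "arn:aws:iam::111111111111:root"
BUCKET = "Backupbucket"

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {   # list access on the bucket itself
            "Effect": "Allow",
            "Principal": {"AWS": LEGACYFLO_PRINCIPAL},
            "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
            "Resource": f"arn:aws:s3:::{BUCKET}",
        },
        {   # write access to objects under the bucket
            "Effect": "Allow",
            "Principal": {"AWS": LEGACYFLO_PRINCIPAL},
            "Action": ["s3:PutObject"],
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
        },
    ],
}

print(json.dumps(policy, indent=2))
```

The first statement covers bucket-level listing; the second covers writing the backup objects themselves, which is why its resource ends in `/*`.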
B. Register your S3 bucket as a source/destination S3 bucket on LegacyFlo
For LegacyFlo to upload data to your S3 bucket, the bucket needs to be added as a destination S3 bucket in LegacyFlo. The steps for doing so are given below:
Step 2: Getting a Google Workspace access key and registering with LegacyFlo
Your Google Workspace account needs to be configured to grant LegacyFlo access to the GDrive data of the users to be backed up. The steps to do so are given here.
Step 3: Submitting a LegacyFlo request
You can submit requests to back up GDrive data using the LegacyFlo interface.
The request type to be used is GDrive-S3-ZIP.
The parameters for the request are as follows:
Source Information
FIELD NAME | DESCRIPTION | SAMPLE VALUE |
---|---|---|
User ID | User id on the Google Workspace domain whose data needs to be backed up (Note: The id should not contain the domain name.) | "john" |
Domain | Source domain on Google Workspace | "baya.com" |
GDrive Folder name | The folder from which data needs to be backed up. All the contents of this folder and its subfolders will be backed up. Note that the folder name should not start with a /. If all the GDrive data is to be backed up, leave this value empty. | "23-Jun-2020" |
Filter Information
FIELD NAME | DESCRIPTION | SAMPLE VALUE |
---|---|---|
Since (DD-MMM-YYYY) | If this field is left blank, all the files in the source will be added to the backup. If this field has a valid input, then all files which were created or modified since the date specified will be added to the backup. | "31-JAN-2022" |
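The Since filter expects the DD-MMM-YYYY format shown in the sample column. A small sketch of how such a value parses (the filtering itself is done by LegacyFlo; this only illustrates the expected format and the blank-means-everything rule):

```python
from datetime import datetime

def parse_since(value):
    """Parse the Since filter (DD-MMM-YYYY); a blank value means no filter."""
    if not value:
        return None  # blank field: all files in the source are backed up
    return datetime.strptime(value, "%d-%b-%Y")  # e.g. "31-JAN-2022"

cutoff = parse_since("31-JAN-2022")
print(cutoff.date())  # 2022-01-31
```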
Other Information
FIELD NAME | DESCRIPTION | SAMPLE VALUE |
---|---|---|
Send report to | A valid email id to which status reports are sent | john@baya.com |
Destination Information
FIELD NAME | DESCRIPTION | SAMPLE VALUE |
---|---|---|
S3 bucket name | This is the name of the S3 bucket in your account which will hold the output of the LegacyFlo request | Backupbucket NOTE: Refer to the steps here to prepare your own S3 bucket to receive LegacyFlo output |
S3 folder path | The folder created on your S3 bucket for the output | GDrivebackup, FreshdeskSolution |
Doc Type | | |
File name | The name of the zip file created at the destination | john-drive.zip, freshdesk-nov-backup.zip |
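Putting the tables together, one GDrive-S3-ZIP request can be pictured as the following set of field values. Requests are actually submitted through the LegacyFlo web interface; this dict only mirrors what you would enter, using the sample values from the tables above.

```python
# Illustrative assembly of one GDrive-S3-ZIP request from the
# field names and sample values documented above.
request = {
    "Request type": "GDrive-S3-ZIP",
    "User ID": "john",                    # must not contain the domain name
    "Domain": "baya.com",
    "GDrive Folder name": "23-Jun-2020",  # empty string = entire GDrive
    "Since": "31-JAN-2022",               # optional DD-MMM-YYYY filter
    "Send report to": "john@baya.com",
    "S3 bucket name": "Backupbucket",
    "S3 folder path": "GDrivebackup",
    "File name": "john-drive.zip",
}

# Sanity checks mirroring the notes in the tables:
assert "@" not in request["User ID"]                  # id without domain
assert not request["GDrive Folder name"].startswith("/")
```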
Step 3 (optional): Submitting multiple LegacyFlo requests to back up data for multiple users
To submit multiple requests using a CSV, create a CSV file with a list of all the users and the other details required for each request.
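Such a CSV might look like the sketch below, with one row per user and the same fields as a single request. The column headers here are an assumption for illustration; check the LegacyFlo interface for the authoritative CSV template.

```python
import csv
import io

# Hypothetical column set -- the actual headers expected by the
# LegacyFlo CSV upload must be taken from the product interface.
fields = ["User ID", "Domain", "GDrive Folder name", "Since",
          "Send report to", "S3 bucket name", "S3 folder path", "File name"]
rows = [
    ["john", "baya.com", "", "", "admin@baya.com",
     "Backupbucket", "GDrivebackup", "john-drive.zip"],
    ["jane", "baya.com", "", "31-JAN-2022", "admin@baya.com",
     "Backupbucket", "GDrivebackup", "jane-drive.zip"],
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(fields)   # header row
writer.writerows(rows)    # one row per user
print(buf.getvalue())
```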