Overview
- A M365 SharePoint site contains important documents that need to be backed up frequently.
- Using the LegacyFlo SHAREPOINT-S3-ZIP request type, you can now back up files from any M365 SharePoint site to an S3 bucket in your own AWS account.
Step 1: Preparing your S3 bucket
A. Create an S3 bucket on AWS
Create your S3 bucket in a supported region and give LegacyFlo list and write access to it. The steps for the same are given in the links below:
- Get an AWS account.
- Create an S3 bucket in a region supported by Mithi.
- Set a temporary bucket policy to allow LegacyFlo to get the name and region of the bucket, as mentioned here.
- When the S3 bucket is used as a source, give LegacyFlo list and read access to the objects in the bucket; when it is used as a destination, give LegacyFlo list and write access. The steps to do so are mentioned here.
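The list-and-write access described above can be granted with a bucket policy. The sketch below only builds such a policy; the LegacyFlo principal ARN and the bucket name are placeholders — use the actual values published in the Mithi documentation linked above.

```python
import json

# Placeholder principal -- substitute the LegacyFlo account/role ARN
# given in the Mithi documentation.
LEGACYFLO_PRINCIPAL = "arn:aws:iam::111111111111:root"
BUCKET = "backup-bucket"  # hypothetical bucket name

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {   # list access on the bucket itself
            "Effect": "Allow",
            "Principal": {"AWS": LEGACYFLO_PRINCIPAL},
            "Action": ["s3:ListBucket"],
            "Resource": f"arn:aws:s3:::{BUCKET}",
        },
        {   # write access to objects (when the bucket is a destination)
            "Effect": "Allow",
            "Principal": {"AWS": LEGACYFLO_PRINCIPAL},
            "Action": ["s3:PutObject"],
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
        },
    ],
}

policy_json = json.dumps(policy, indent=2)
# To apply it (requires AWS credentials for your account):
# import boto3
# boto3.client("s3").put_bucket_policy(Bucket=BUCKET, Policy=policy_json)
```

For a source bucket, the second statement would grant `s3:GetObject` instead of `s3:PutObject`.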
B. Register your S3 bucket as a source/destination S3 bucket on LegacyFlo
For LegacyFlo to upload data to your S3 bucket, the bucket needs to be added as a destination S3 bucket in LegacyFlo. The steps for doing the same are given below:
Step 2: Getting access to the Sharepoint documents
This is a one-time process in which you generate the Graph API access key and register it with LegacyFlo.
Follow the steps given here.
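Once the Graph API credentials are registered, site documents are read via Microsoft Graph. As background, the sketch below only constructs the Graph endpoints involved in resolving a site and listing its default document library; the tenant hostname and site name are hypothetical values.

```python
GRAPH = "https://graph.microsoft.com/v1.0"
hostname = "contoso.sharepoint.com"   # your tenant's SharePoint hostname (example)
site_name = "Site1"                   # the "SharePoint Site Name" request parameter

# Resolve the site by its server-relative path...
site_lookup_url = f"{GRAPH}/sites/{hostname}:/sites/{site_name}"

# ...then list the documents in its default drive. The site id comes from
# the lookup response above; the value here is a placeholder.
site_id = "<site-id-from-lookup>"
drive_items_url = f"{GRAPH}/sites/{site_id}/drive/root/children"
```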
Step 3: Submitting a LegacyFlo request
You can submit requests to back up Sharepoint Site Documents using the LegacyFlo interface.
The request type to be used is SHAREPOINT-S3-ZIP.
The parameters for the request are as follows:
Source Information
FIELD NAME | DESCRIPTION | SAMPLE VALUE |
---|---|---|
SharePoint Site Name | The site on SharePoint from which documents have to be backed up | "Site1" |
Filter Information
FIELD NAME | DESCRIPTION | SAMPLE VALUE |
---|---|---|
Since (DD-MMM-YYYY) | If this field is left blank, all the files in the source will be added to the backup. If this field has a valid input, then all files which were created or modified since the date specified will be added to the backup. | "31-JAN-2022" |
Before (DD-MMM-YYYY) | If this field is left blank, all files from the date specified in the Since field up to the day before request execution starts will be retrieved. If the field has a valid input, all files created or modified up to a day before the date specified will be retrieved. | "31-MAR-2022" |
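The Since/Before semantics above can be sketched as a date-window check, assuming both fields use the DD-MMM-YYYY format shown in the table:

```python
from datetime import date, datetime, timedelta

def parse(d):
    """Parse a DD-MMM-YYYY filter value, or return None for a blank field."""
    return datetime.strptime(d, "%d-%b-%Y").date() if d else None

def in_window(modified, since=None, before=None, today=None):
    """True if a file modified on `modified` falls inside the backup window."""
    today = today or date.today()
    lo = parse(since) or date.min                      # blank Since: no lower bound
    hi = (parse(before) or today) - timedelta(days=1)  # a day before Before / the run date
    return lo <= modified <= hi
```

For example, with Since `31-JAN-2022` and Before `31-MAR-2022`, a file modified on 15-FEB-2022 is included, while one modified on 31-MAR-2022 is not.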
Other Information
FIELD NAME | DESCRIPTION | SAMPLE VALUE |
---|---|---|
Send report to | A valid email address to which status reports are sent | john@baya.com |
Destination Information
FIELD NAME | DESCRIPTION | SAMPLE VALUE |
---|---|---|
S3 bucket name | The name of the S3 bucket in your account which will hold the output of the LegacyFlo request. Refer to the steps here to prepare your own S3 bucket to receive LegacyFlo output. | Backupbucket |
S3 folder path | The folder created on your S3 bucket for the output | GDrivebackup, FreshdeskSolution |
Doc Type | | |
File name | The name of the zip file created in the destination folder | john-drive.zip, freshdesk-nov-backup.zip |
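Taken together, the destination parameters determine where the output lands, assuming LegacyFlo joins the S3 folder path and file name into the final object key:

```python
# Sample values from the table above.
bucket = "Backupbucket"
folder_path = "GDrivebackup"
file_name = "john-drive.zip"

object_key = f"{folder_path}/{file_name}"
s3_uri = f"s3://{bucket}/{object_key}"
print(s3_uri)  # s3://Backupbucket/GDrivebackup/john-drive.zip
```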
Step 4 (optional): Submitting multiple LegacyFlo requests to back up data for multiple sites
To submit multiple jobs using a CSV, create a CSV file with a list of all the sites and the other details required for each request.
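Such a CSV can be generated as sketched below. The column names here are illustrative only — match them to the headers expected by the LegacyFlo CSV upload screen.

```python
import csv

# One row per backup request; the column names are hypothetical.
rows = [
    {"SiteName": "Site1", "S3Bucket": "Backupbucket",
     "S3FolderPath": "Site1Backup", "FileName": "site1.zip",
     "ReportEmail": "john@baya.com"},
    {"SiteName": "Site2", "S3Bucket": "Backupbucket",
     "S3FolderPath": "Site2Backup", "FileName": "site2.zip",
     "ReportEmail": "john@baya.com"},
]

with open("sharepoint-backup-requests.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
```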