Upload email data files to an S3 bucket
Choosing the bucket
Option 1: Uploading to your own S3 bucket
We highly recommend that the raw data files be uploaded to an S3 bucket in your own AWS account.
The steps to do so are as follows:
- Create an AWS account
- Create an S3 bucket in a region supported by LegacyFlo
- Grant List and Read access to LegacyFlo
- Create a folder called "Vaultastic". Copy the data files to be migrated into this folder.
For details on these steps, refer to the documentation here.
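If you prefer to script these steps rather than use the AWS console, a minimal sketch using Python and boto3 is shown below. The region, bucket name, and LegacyFlo principal ARN are placeholders (assumptions for illustration); use the exact values given in the detailed documentation referred to above.

```python
import json
import boto3

REGION = "ap-south-1"                     # assumption: a region supported by LegacyFlo
BUCKET = "example-legacyflo-src"          # assumption: your own bucket name
LEGACYFLO_PRINCIPAL = "arn:aws:iam::111111111111:root"  # placeholder: the LegacyFlo account/role ARN from the documentation

s3 = boto3.client("s3", region_name=REGION)

# 1. Create the bucket in the chosen region
s3.create_bucket(
    Bucket=BUCKET,
    CreateBucketConfiguration={"LocationConstraint": REGION},
)

# 2. Grant List and Read access to LegacyFlo via a bucket policy
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "LegacyFloListRead",
            "Effect": "Allow",
            "Principal": {"AWS": LEGACYFLO_PRINCIPAL},
            "Action": ["s3:ListBucket", "s3:GetObject"],
            "Resource": [
                f"arn:aws:s3:::{BUCKET}",
                f"arn:aws:s3:::{BUCKET}/*",
            ],
        }
    ],
}
s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))

# 3. Create the "Vaultastic" folder (S3 emulates folders with a zero-byte key ending in "/")
s3.put_object(Bucket=BUCKET, Key="Vaultastic/")
```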
Option 2: Uploading to the S3 bucket in Vaultastic V4 Open or Deep Store
The other option is to upload the data files to the S3 bucket of your Vaultastic V4 Open Store or Deep Store.
To know the S3 credentials of your Open or Deep Stores, contact Mithi Customer Care.
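Once Mithi Customer Care shares the S3 credentials of your Open or Deep Store, you can also use them programmatically. A minimal sketch with Python and boto3 is shown below; the access keys, bucket name, and domain folder are placeholders for the values shared with you.

```python
import boto3

# Placeholders: use the credentials and bucket details shared by Mithi Customer Care
session = boto3.Session(
    aws_access_key_id="AKIA...",    # provided by Mithi Customer Care
    aws_secret_access_key="...",    # provided by Mithi Customer Care
    region_name="ap-south-1",       # region of your Vaultastic store
)
s3 = session.client("s3")

# Upload a data file into the folder named after your Vaultastic domain
s3.upload_file(
    Filename="mails-batch-01.zip",                     # local data file
    Bucket="example-vaultastic-open-store",            # Open/Deep Store bucket name (shared by Mithi)
    Key="baya-com.vaultastic.com/mails-batch-01.zip",  # <domain folder>/<file name>
)
```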
Choose the upload method
With AWS, there is more than one way to upload data to an S3 bucket. Two recommended options are the S3 Browser tool for small data volumes and AWS Snowball for large data volumes.
Note: Any tools required to upload data to the S3 bucket, including the ones recommended here, have to be purchased by you, the customer.
Option 1: Using a tool such as S3 Browser to upload the files
Refer to the third-party tool's documentation for uploading data.
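If you prefer scripting over a GUI tool, the same upload can be done with the AWS SDK. Below is a minimal sketch, assuming Python with boto3 and a local directory of compressed EML files; the bucket name and directory path are placeholders.

```python
from pathlib import Path
import boto3

BUCKET = "example-legacyflo-src"   # assumption: your own bucket registered with LegacyFlo
LOCAL_DIR = Path("./eml-exports")  # assumption: local folder holding the .zip data files

s3 = boto3.client("s3")

# Upload every .zip in the local directory into the "Vaultastic" folder of the bucket
for path in sorted(LOCAL_DIR.glob("*.zip")):
    key = f"Vaultastic/{path.name}"
    print(f"Uploading {path} -> s3://{BUCKET}/{key}")
    s3.upload_file(str(path), BUCKET, key)
```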
Option 2: Using the AWS Snowball
When migrating large volumes of data, it is advisable to use an AWS Snowball device to transport the data from your data center to the AWS data center in the selected region. Refer to the AWS documentation on ordering and using Snowball.
Convert EML to PST
- On successful upload of the EML data files to S3, submit a request to convert each EML data file to a Mithi PST file using the LegacyFlo application.
- For each EML file, submit a separate request.
- To submit multiple requests, manually create and upload a CSV. Use the help given in the following sections to define your requests.
- When submitting a request to convert an EML data file, select EML-PST as the Request type.
- Provide EML file details under the Source information section.
- Provide PST file details under the Destination information section.
- Refer to the following table to understand the description and sample values for the fields.
Source Information (EML Files)
FIELD NAME | DESCRIPTION |
---|---|
S3 Bucket Name | The S3 bucket that has the source file to be processed. Select the S3 bucket name from the drop-down list. The list will show the buckets to which LegacyFlo has been given access. |
S3 Folder path | If the file is on the Vaultastic Open Store, then this parameter will have the name of your Vaultastic domain. Example: baya-com.vaultastic.com. If the file is on your own bucket, give the folder which contains the file. Leave this value empty if the source file is on the root of the S3 bucket. NOTE: The S3 folder path should not start or end with a "/" |
File name | The source filename, i.e. the compressed file (such as a .zip) which contains the source EML data. NOTE: Filenames should not start or end with a "/" |
Destination Information (PST Files)
FIELD NAME | DESCRIPTION |
---|---|
S3 bucket name | The S3 bucket for the destination file. Select the S3 bucket name from the drop-down list. The list will show the buckets to which LegacyFlo has been given access. |
S3 folder path | If the file has to be created on the Vaultastic Open Store, then this parameter will have the name of your Vaultastic domain. Example: baya-com.vaultastic.com. If the file is to be created on your own bucket, give the folder which will contain the file. Leave this value empty if the destination file has to be created on the root of the S3 bucket. NOTE: The S3 folder path should not start or end with a "/" |
File name | The destination filename. The file that will contain the results of the LegacyFlo command. NOTE: Filenames should not start or end with a "/" |
Other Information
FIELD NAME | DESCRIPTION | SAMPLE VALUE (TO UPLOAD TO VAULTASTIC) |
---|---|---|
Send report to mail id | Email address which will get the notifications | "postmaster@baya.com" |
Want to submit multiple jobs using CSV?
To submit multiple jobs using a CSV, create a CSV file with a row for each request (for example, one per user's EML file), listing the source, destination, and other details described above.
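The exact column headers expected by LegacyFlo are defined in its CSV template, so the sketch below is illustrative only: the column names, bucket names, and file names are placeholders, not the actual LegacyFlo field names. It simply shows how such a CSV could be generated in Python rather than by hand.

```python
import csv

# Illustrative sketch only: the actual column names must match the CSV template
# provided by LegacyFlo; the headers below are placeholders.
FIELDS = [
    "RequestType", "SourceBucket", "SourceFolder", "SourceFile",
    "DestinationBucket", "DestinationFolder", "DestinationFile", "ReportEmail",
]

# One row per EML data file, since each file needs a separate EML-PST request
rows = [
    {
        "RequestType": "EML-PST",
        "SourceBucket": "example-legacyflo-src",
        "SourceFolder": "Vaultastic",
        "SourceFile": "user01-mails.zip",
        "DestinationBucket": "example-legacyflo-src",
        "DestinationFolder": "Vaultastic",
        "DestinationFile": "user01-mails.pst",
        "ReportEmail": "postmaster@baya.com",
    },
    # ... add one dictionary per source file
]

with open("legacyflo-requests.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
```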