It is possible to use the data lab as a classic S3 server and upload files to it. It supports standard S3 tools such as the AWS CLI and Rclone. The files are stored in a defined folder in the data lab.
How to configure the data lab S3 server?
To configure the data lab S3 server, you need to create credentials to upload files to the data lab using the S3 interface:
- Open your Data Lab.
- Go to the Settings page.
- Navigate to the Monitoring section.
- Click on the Credentials tab.
- Click on ➕ Create to add new credentials.
- Select the credential type as 'S3 Lab Server'.
- Generate an access key and a secret key.
- Enter a name for the bucket. Note that a bucket in the lab is associated with a folder.
- Provide the folder path where the files will be stored. ⚠️ Pay attention to the destination folder as incorrect settings may impact your lab functionality.
- Click on 💾 Save to store the credentials.
You are now ready to upload files to the Lab S3 server.
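The credentials created above can be stored in your S3 client's configuration. As an illustration, a minimal Rclone remote definition might look like this (the remote name `datalab` and the bracketed placeholders are assumptions to be replaced with your own values):

```ini
# ~/.config/rclone/rclone.conf
# Hypothetical remote named "datalab"; replace the placeholders with the
# access key, secret key and lab host generated in the steps above.
[datalab]
type = s3
provider = Other
access_key_id = [ACCESS_KEY]
secret_access_key = [SECRET_KEY]
endpoint = https://glab.[DESTINATION_LAB_HOST].constellab.app/s3-server/v1
```

With this remote in place, the bucket can be addressed as `datalab:[BUCKET_NAME]` instead of repeating the keys on every command.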
How to send files to the Lab S3 server?
Once your credentials are configured, you can start uploading files to the data lab using your preferred S3 client.
The lab S3 endpoint is located at: https://glab.[DESTINATION_LAB_HOST].constellab.app/s3-server/v1
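As a sketch, the AWS CLI can target this endpoint through its `--endpoint-url` option; the file name, bucket name and bracketed placeholders below are assumptions, not values from your lab:

```shell
# Upload a single file to the lab bucket (replace all placeholders).
# The access and secret keys created earlier are read from the environment.
export AWS_ACCESS_KEY_ID=[ACCESS_KEY]
export AWS_SECRET_ACCESS_KEY=[SECRET_KEY]

aws s3 cp ./my-file.csv s3://[BUCKET_NAME]/my-file.csv \
  --endpoint-url https://glab.[DESTINATION_LAB_HOST].constellab.app/s3-server/v1
```
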
Using an agent
The following agent can be used to synchronize a folder with a Lab S3 server: https://constellab.community/agents/d0035cc8-9d33-496e-9fcc-65430212c915/sync-folder-to-another-lab/version/1
Using Rclone
Rclone allows you to synchronize files from another data lab or a local source to your destination lab:
Use the sync command with caution. ⚠️ Be aware that the sync operation may overwrite and delete files or folders in the destination lab as it mirrors the source folder.
rclone sync [SOURCE_FOLDER] :s3:[BUCKET_NAME]/ \
--s3-provider=Other \
--s3-access-key-id=[ACCESS_KEY] \
--s3-secret-access-key=[SECRET_KEY] \
--s3-endpoint=https://glab.[DESTINATION_LAB_HOST].constellab.app/s3-server/v1 \
--modify-window=1ms \
--progress
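If mirroring (and therefore deleting) is not what you want, `rclone copy` only adds and updates files at the destination without removing anything. A non-destructive variant of the command above, with the same flags:

```shell
# Non-destructive variant: copy never deletes files in the destination bucket.
rclone copy [SOURCE_FOLDER] :s3:[BUCKET_NAME]/ \
  --s3-provider=Other \
  --s3-access-key-id=[ACCESS_KEY] \
  --s3-secret-access-key=[SECRET_KEY] \
  --s3-endpoint=https://glab.[DESTINATION_LAB_HOST].constellab.app/s3-server/v1 \
  --modify-window=1ms \
  --progress
```

Prefer `copy` for one-way uploads, and reserve `sync` for cases where the destination folder really must mirror the source exactly.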