Restrictive S3 bucket permissions

If you have an existing bucket but don't want to grant DocEvent.io access to the entire bucket, you can limit the paths that DocEvent has access to.

First, configure the access policy for the account DocEvent uses, like so:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "s3:GetObject*",
                "s3:PutObject*",
                "s3:DeleteObject"
            ],
            "Resource": [
                "arn:aws:s3:::temp-test-ftp-bucket-1/docevent1/*",
                "arn:aws:s3:::temp-test-ftp-bucket-1/docevent2/*"
            ],
            "Effect": "Allow"
        },
        {
            "Action": [
                "s3:GetBucketLocation",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::temp-test-ftp-bucket-1"
            ],
            "Effect": "Allow"
        }
    ]
}

The above grants this account access to the temp-test-ftp-bucket-1 bucket only for the docevent1/* and docevent2/* paths.

Note that DocEvent still requires the ListBucket and GetBucketLocation permissions on the bucket itself.
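
If you prefer to script this step, the sketch below shows one way to attach the policy above (saved locally as policy.json) as an inline policy on the IAM user whose credentials DocEvent uses, via boto3. The user name docevent-user and the policy name are placeholders, not values DocEvent requires.

import boto3

# Attach the restricted policy above (saved locally as policy.json) as an
# inline policy on the IAM user whose credentials DocEvent uses.
# "docevent-user" and the policy name are placeholders - substitute your own.
with open("policy.json") as f:
    policy_document = f.read()

iam = boto3.client("iam")
iam.put_user_policy(
    UserName="docevent-user",
    PolicyName="docevent-restricted-s3",
    PolicyDocument=policy_document,
)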

Next, when creating the Simple FTP Service, set the "Test Path" to a directory that DocEvent has permission to access, for example docevent1/ or docevent2/.


Next, use Verify Access to test that the permissions for this directory are valid.
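
Broadly speaking, a check like this amounts to writing, reading, listing and deleting a small test object under the chosen path. The boto3 sketch below illustrates that kind of check against the policy above; it is not DocEvent's actual implementation, and the test object name is made up.

import boto3

bucket = "temp-test-ftp-bucket-1"
prefix = "docevent1/"
key = prefix + "docevent-access-test.txt"            # made-up test object name

s3 = boto3.client("s3")
s3.get_bucket_location(Bucket=bucket)                # GetBucketLocation
s3.put_object(Bucket=bucket, Key=key, Body=b"test")  # PutObject
s3.get_object(Bucket=bucket, Key=key)                # GetObject
s3.list_objects_v2(Bucket=bucket, Prefix=prefix)     # ListBucket
s3.delete_object(Bucket=bucket, Key=key)             # DeleteObject
print("Permissions for", prefix, "look valid")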

Once that completes, you can create the user account as you normally would.

Important: when creating the user for this service, be sure to set the home directory to a directory that DocEvent has permission to access, i.e. /docevent1 for the above example.
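
As a final, optional sanity check, you could upload a file through the new account and confirm it appears under docevent1/ in the bucket. The sketch below assumes a plain FTP endpoint; the host name, user name, password and file name are placeholders for the details shown in your DocEvent service.

from ftplib import FTP

import boto3

# Placeholders: replace the host, user and password with the connection
# details shown for your DocEvent Simple FTP Service.
with FTP("ftp.example.com") as ftp:
    ftp.login(user="ftp-user", passwd="ftp-password")
    with open("report.csv", "rb") as f:
        ftp.storbinary("STOR report.csv", f)

# The uploaded file should now be visible under docevent1/ in the bucket.
s3 = boto3.client("s3")
resp = s3.list_objects_v2(Bucket="temp-test-ftp-bucket-1", Prefix="docevent1/")
print([obj["Key"] for obj in resp.get("Contents", [])])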
