
I can log into my server with Cyberduck or FileZilla, but I cannot read my home directory. The S3 bucket "mybucket" exists. In Cyberduck I see

"Cannot readdir on root. Please contact your web hosting service provider for assistance." and in FileZilla "Error: Reading directory .: permission denied"

even though I can connect to the server.

Am I missing a user permission in the policies below?

These are my permissions:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation"
            ],
            "Resource": "arn:aws:s3:::MYBUCKET"
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject"
            ],
            "Resource": "arn:aws:s3:::MYBUCKET/*"
        },
        {
            "Sid": "VisualEditor2",
            "Effect": "Allow",
            "Action": "transfer:*",
            "Resource": "*"
        }
    ]
}

These are my trust relationships:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "s3.amazonaws.com"
            },
            "Action": "sts:AssumeRole"
        },
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "transfer.amazonaws.com"
            },
            "Action": "sts:AssumeRole"
        }
    ]
}
user11020868

2 Answers

The user's role policy should be:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowListingOfUserFolder",
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::BUCKET_NAME"
            ]
        },
        {
            "Sid": "HomeDirObjectAccess",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObjectVersion",
                "s3:DeleteObject",
                "s3:GetObjectVersion"
            ],
            "Resource": "arn:aws:s3:::BUCKET_NAME/*"
        }
    ]
}

Trust relationship of the user's role:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": {
        "Service": "transfer.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}

Home directory for your user should be /BUCKET_NAME
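If you create the Transfer user from the CLI rather than the console, the same settings map onto `aws transfer create-user`. A sketch, assuming placeholder values for the server ID, account ID, role name, user name, and key (substitute your own):

```
# All IDs, ARNs, and the key below are placeholder examples.
aws transfer create-user \
  --server-id s-1234567890abcdef0 \
  --user-name myuser \
  --role arn:aws:iam::123456789012:role/my-transfer-role \
  --home-directory /BUCKET_NAME \
  --ssh-public-key-body "ssh-rsa AAAA... user@host"
```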

Algeriassic
    *This should be the accepted answer!* – Jude Niroshan Mar 26 '19 at 17:03
  • Thanks, this resolves my issue. – user11020868 Mar 28 '19 at 01:15
  • 3
    This answer just saved me a lot of heartache. I was setting up SFTP and my default role/policy had a trust relationship with s3.amazonaws.com. Connecting would give me an error stating "Unable to AssumeRole". The real problem was that I needed a trust relationship with transfer.amazonaws.com instead of s3.amazonaws.com . – Warren Krewenki Apr 17 '19 at 15:27
  • 1
    Please mark it as the accepted answer. – Algeriassic Apr 18 '19 at 15:20
  • 1
    I want to allow user only to Put objects i.e remove "s3:GetObject", "s3:DeleteObjectVersion", "s3:DeleteObject", "s3:GetObjectVersion" But with that I cannot list objects in the Home directory, Any solution to resolve this greatly appreciated? – user1393608 Nov 06 '19 at 06:12
  • I am able to work with a user specific home folder in the same bucket with this approach. But, when working with logical directories approach, as mentioned here (https://github.com/aws-samples/transfer-for-sftp-logical-directories) I am getting Access denied error in file zilla and user is unable to login. Please help. – CSR May 07 '20 at 05:04
  • oh my goodness thank you so much; this answer combined with @WarrenKrewenki finally fixed this for me after hours of pain and suffering! I was using this via pysftp and just getting `PermissionError: [Errno 13] Unable to assume role` which is obviously not a lot of info!! – Tommy Apr 28 '21 at 15:00
  • Happy that it helped you fix the issue! – Algeriassic Apr 29 '21 at 17:54
  • If your S3 bucket is encrypted with a KMS you also need to allow the role to use that KMS in the policy, see: https://stackoverflow.com/a/54241647/4091202 – sylr Dec 02 '21 at 13:15
  • Thanks a lot for solution!! – NIrav Modi Aug 25 '22 at 10:16
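Regarding sylr's comment above: if the bucket is encrypted with a customer-managed KMS key, the role also needs permission to use that key. A sketch of the extra policy statement, with a placeholder key ARN (fill in your region, account, and key ID):

```
{
    "Sid": "AllowUseOfKmsKeyForBucket",
    "Effect": "Allow",
    "Action": [
        "kms:Decrypt",
        "kms:GenerateDataKey"
    ],
    "Resource": "arn:aws:kms:REGION:ACCOUNT_ID:key/KEY_ID"
}
```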

I had issues with this until I added, specifically, the s3:GetObject permission to the aws_transfer_user policy. I expected s3:ListBucket to be enough, but it was not: `sftp> ls` would fail until I had granted GetObject.

Here's the Terraform for it:

resource "aws_transfer_user" "example-ftp-user" {
  count     = length(var.uploader_users)
  user_name = var.uploader_users[count.index].username

  server_id           = aws_transfer_server.example-transfer.id
  role                = aws_iam_role.sftp_content_incoming.arn
  home_directory_type = "LOGICAL"

  home_directory_mappings {
    entry  = "/"
    target = "/my-bucket/$${Transfer:UserName}"
  }

  policy = <<POLICY
{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Sid": "AllowSftpUserAccessToS3",
        "Effect": "Allow",
        "Action": [
          "s3:ListBucket",
          "s3:PutObject",
          "s3:GetObject",
          "s3:DeleteObjectVersion",
          "s3:DeleteObject",
          "s3:GetObjectVersion",
          "s3:GetBucketLocation"
        ],
        "Resource": [
          "${aws_s3_bucket.bucket.arn}/${var.uploader_users[count.index].username}",
          "${aws_s3_bucket.bucket.arn}/${var.uploader_users[count.index].username}/*"
        ]
      }
    ]
}
POLICY
}

And I define users in a .tfvars file; e.g.:

uploader_users = [
  {
    username = "firstuser"
    public_key = "ssh-rsa ...."
  },
  {
    username = "seconduser"
    public_key = "ssh-rsa ..."
  },
  {
    username = "thirduser"
    public_key = "ssh-rsa ..."
  }
]
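The .tfvars above assumes a matching variable declaration, and the aws_transfer_user resource shown earlier never attaches public_key anywhere. One plausible completion (a sketch under those assumptions, not necessarily the author's exact setup) is a typed variable plus an aws_transfer_ssh_key resource:

```
variable "uploader_users" {
  description = "SFTP users to create: username plus SSH public key"
  type = list(object({
    username   = string
    public_key = string
  }))
}

# Attach each user's public key; resource names here mirror the example above.
resource "aws_transfer_ssh_key" "example-ftp-key" {
  count     = length(var.uploader_users)
  server_id = aws_transfer_server.example-transfer.id
  user_name = aws_transfer_user.example-ftp-user[count.index].user_name
  body      = var.uploader_users[count.index].public_key
}
```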

I hope this helps someone. It took me a lot of tinkering before I finally got this working, and I'm not 100% sure what interactions with other policies might ultimately be in play. But after applying this, I could connect and list bucket contents without getting "Permission denied".

rotarydial