[StorageBrowser] integrating existing S3 resources issue #6258
Regarding this issue of supporting existing S3 resources, I can confirm StorageBrowser supports listing from the bucket root. So, to support S3 buckets that are not created by Amplify but are still configured to be accessed by the Cognito identity,
Hi, any news? I am also unable to use Amplify Storage with any S3 bucket (https://docs.amplify.aws/react/build-a-backend/storage/use-with-custom-s3/#use-storage-resources-with-an-amplify-backend). Thanks!
Hi @nookale! Can you provide some additional information on how you've configured the
Hello @calebpollman, can you please share how you configured this part: ".....I was able to make it work, if I edit amplify_outputs.json directly and add path configuration access to '*/*'....."? I was checking the amplify_outputs.json file, but I don't see the paths configuration section there. Thanks in advance! Carlos
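For anyone else searching for it: the "path configuration" being discussed lives inside each entry of storage.buckets in amplify_outputs.json. The snippet below is only a hedged sketch of that shape, using placeholder bucket and region names and the same permission keywords that appear elsewhere in this thread; it is not taken from anyone's actual file.

```json
{
  "storage": {
    "aws_region": "us-east-1",
    "bucket_name": "my-existing-bucket",
    "buckets": [
      {
        "name": "my-existing-bucket",
        "bucket_name": "my-existing-bucket",
        "aws_region": "us-east-1",
        "paths": {
          "*/*": {
            "authenticated": ["get", "list", "write", "delete"]
          }
        }
      }
    ]
  }
}
```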
From this page: https://docs.amplify.aws/react/build-a-backend/storage/use-with-custom-s3/ I tried the "Use storage resources with an Amplify backend" section. Regarding the bucket, besides the one specified in the article I mentioned above, I also used:
Which is the configuration that is automatically generated when a bucket is created as an Amplify resource, so I copied it to the "non-Amplify" bucket I wanted to use with Storage. However, I tried many different combinations and modifications of both the bucket policy and CORS, and none of them worked. Also, regarding the "Principal", it was unclear which one to use, so I tried all the roles associated with the Amplify project: CustomS3AutoDeleteObjects, AmplifyBranchLinkerCustom, amplifyAuthunauthenticate, amplifyAuthauthenticatedU. For CORS, I have also used this configuration (https://ui.docs.amplify.aws/react/connected-components/storage/storage-browser):
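The CORS JSON pasted at that point did not survive the copy. Purely as a hedged sketch, a permissive configuration along the lines of what the linked StorageBrowser page describes might look roughly like the following; the exact ExposeHeaders list is an assumption, so defer to the linked docs.

```json
[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["GET", "HEAD", "PUT", "POST", "DELETE"],
    "AllowedOrigins": ["*"],
    "ExposeHeaders": [
      "last-modified",
      "content-length",
      "content-type",
      "etag",
      "x-amz-version-id"
    ],
    "MaxAgeSeconds": 3000
  }
]
```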
Regarding the Next.js app, I followed the instructions that can be found here: https://docs.amplify.aws/react/build-a-backend/storage/use-with-custom-s3/
By the way, just for the record and in case it is of interest: when the bucket is created as an Amplify resource, the storage section of the autogenerated amplify_outputs.json file looks like this:
However, when adding a custom S3 bucket as a resource, it looks like this:
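Neither JSON snippet survived the copy. Going only by the rest of this thread, the important difference appears to be that the Amplify-managed entry carries a paths block (as in the sketch earlier in this thread), while the custom-bucket entry typically does not. A hedged sketch of the custom-bucket case, with placeholder names:

```json
{
  "storage": {
    "aws_region": "us-east-1",
    "bucket_name": "my-existing-bucket",
    "buckets": [
      {
        "name": "my-existing-bucket",
        "bucket_name": "my-existing-bucket",
        "aws_region": "us-east-1"
      }
    ]
  }
}
```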
Hi :)
Have you tried adding the following?

```ts
// Placeholder bucket name, friendly name, and region throughout.
backend.addOutput({
  storage: {
    bucket_name: 'my-default-bucket',
    aws_region: 'my-default-bucket-region',
    buckets: [{
      name: 'default-bucket-friendly-name',
      aws_region: 'my-default-bucket-region',
      // paths maps prefix patterns to the actions each role may perform
      // @ts-expect-error
      paths: {
        'my-root-folder/*': {
          guest: ['get', 'list'] as const,
          authenticated: ['list', 'get', 'delete', 'write'] as const,
        },
      },
    }],
  },
});
```

You need to use
Hello @sephilink, @nookale, @itconsultor, et al., we've updated our documentation on integrating existing S3 buckets with Amplify, including several examples of how to configure your bucket with the Amplify backend. Please let us know if you're still encountering issues setting up the bucket, thanks!
Hi, I'm having a bit of trouble making this work with existing Cognito and S3 resources.
I've referenced the resources in the backend configuration (code below); the bucket also has CORS configured according to the documentation: https://ui.docs.amplify.aws/react/connected-components/storage/storage-browser#bucket-cors
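The backend code referenced here was not preserved in this copy. Purely as a hedged sketch of one way to wire existing Cognito and S3 resources into the client config via backend.addOutput (placeholder IDs, names, and regions; the storage block mirrors the addOutput snippet shown earlier in this thread):

```ts
// backend.ts -- hedged sketch only, not the original poster's code.
import { defineBackend } from '@aws-amplify/backend';

const backend = defineBackend({});

backend.addOutput({
  auth: {
    // placeholder IDs for an existing Cognito user pool / identity pool
    aws_region: 'us-east-1',
    user_pool_id: 'us-east-1_EXAMPLE',
    user_pool_client_id: 'EXAMPLECLIENTID',
    identity_pool_id: 'us-east-1:00000000-0000-0000-0000-000000000000',
  },
  storage: {
    // placeholder names for an existing, non-Amplify-managed bucket
    aws_region: 'us-east-1',
    bucket_name: 'my-existing-bucket',
    buckets: [
      {
        name: 'my-existing-bucket',
        bucket_name: 'my-existing-bucket',
        aws_region: 'us-east-1',
        // as in the snippet earlier in the thread; may be unnecessary on newer versions
        // @ts-expect-error
        paths: {
          '*/*': {
            authenticated: ['get', 'list', 'write', 'delete'],
          },
        },
      },
    ],
  },
});
```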
I also set bucket policies to allow the Cognito authenticated role to access and list the bucket.
However, the StorageBrowser UI just displays "No folders or files".
I was able to make it work if I edit amplify_outputs.json directly and add path configuration access for "*/*".
Here is my bucket policy:
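The policy JSON itself was not preserved in this copy. As a hedged sketch only, a minimal policy matching the description above (granting the Cognito authenticated role list and object access) might look like the following, with a placeholder account ID, role name, and bucket name:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowAuthenticatedRoleListBucket",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::111122223333:role/my-amplify-authenticated-role"
      },
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::my-existing-bucket"
    },
    {
      "Sid": "AllowAuthenticatedRoleObjectAccess",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::111122223333:role/my-amplify-authenticated-role"
      },
      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::my-existing-bucket/*"
    }
  ]
}
```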
Originally posted by @alanst32 in #5731 (comment)