
I am trying to sync a local directory to an S3 bucket, and every set of commands I try leads me in a circle of errors.

(I've scrubbed the personal directory and bucket names)

Command for the simple sync function I am using:

aws s3 sync . s3://<BUCKET NAME> 

Result:

An error occurred (MissingContentLength) when calling the PutObject operation: You must provide the Content-Length HTTP header.

So I tried adding the Content-Length header myself with a wrapper script:

DIRECTORY=.
BUCKET_NAME="BUCKET NAME"

# Function to upload a file with Content-Length header
upload_file() {
  local file=$1
  local content_length=$(stat -c%s "$file")
  local relative_path="${file#$DIRECTORY/}"

  aws s3 sync "$file" "s3://$BUCKET_NAME/$relative_path" \
    --metadata-directive REPLACE \
    --content-length "$content_length" \
    --content-type application/octet-stream \
    --content-disposition attachment \
    --content-encoding identity
}

export -f upload_file

# Find and upload files in the local directory
find "$DIRECTORY" -type f -exec bash -c 'upload_file "$0"' {} \;

Result:

Unknown options: --content-length,1093865263

I tried a simple cp command:

aws s3 cp . s3://BUCKETNAME

upload failed: ./ to s3://BUCKETNAME Need to rewind the stream <botocore.httpchecksum.AwsChunkedWrapper object at 0x72351153a720>, but stream is not seekable.

Copying a single file:

aws s3 cp FILENAME s3://BUCKETNAME

Result:

An error occurred (MissingContentLength) when calling the UploadPart operation: You must provide the Content-Length HTTP header.

I am at a loss as to what exactly the AWS CLI is looking for from me at this point. Does anyone have any direction to point me to? Thanks!

asked Feb 1 at 15:51 by Ben Murphy · edited Feb 1 at 23:25 by John Rotenstein
  • You shouldn't need to provide those parameters to aws s3 sync. Is it just one file that is causing the problem, or is it all files? You might want to update your AWS CLI. Or, use aws s3 cp --recursive. – John Rotenstein Commented Feb 1 at 23:27
  • Make sure you are using an up-to-date awscli. – jarmod Commented Feb 1 at 23:48
  • All files give the same error when trying to upload individually. I've uninstalled and reinstalled the AWS CLI twice now. – Ben Murphy Commented Feb 2 at 1:55
  • There seem to be some issues (here and here) with the later botocore versions (the AWS CLI uses botocore). The suggested workaround is to downgrade botocore to a version before 1.36; a pip sketch follows these comments. – Man made of meat Commented Feb 2 at 4:48
  • Currently on botocore version 1.34.46 with the errors – Ben Murphy Commented Feb 2 at 18:36
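
For anyone following the botocore-downgrade suggestion above: it only applies to pip-managed installations (boto3 scripts or AWS CLI v1); the bundled AWS CLI v2 installer ships its own botocore and ignores pip packages. A minimal sketch, assuming a pip-managed setup:

python -m pip install 'boto3<1.36' 'botocore<1.36'

# Confirm what is now installed (prints Name and Version first)
python -m pip show botocore | head -n 2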

1 Answer

This is caused by a recent breaking change in the AWS CLI (new default integrity checksums): https://github.com/boto/boto3/issues/4392. Many third-party S3-compatible services do not support it yet.

Try downgrading the AWS CLI version:

curl "https://awscli.amazonaws/awscli-exe-linux-x86_64-2.17.13.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install --update
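
To confirm the downgrade took effect (a stale PATH entry can keep pointing at the old binary):

aws --version
# Expect something like aws-cli/2.17.13 Python/3.11.x Linux/... (exact output varies)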

Or just add these two lines to your profile in ~/.aws/config:

[profile my_account]
request_checksum_calculation=WHEN_REQUIRED
response_checksum_validation=WHEN_REQUIRED
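
If editing ~/.aws/config is inconvenient (e.g. in CI), the SDKs' standardized configuration exposes the same settings as environment variables; assuming your CLI build honors them, something like this should behave equivalently:

export AWS_REQUEST_CHECKSUM_CALCULATION=WHEN_REQUIRED
export AWS_RESPONSE_CHECKSUM_VALIDATION=WHEN_REQUIRED
aws s3 sync . s3://<BUCKET NAME>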
