I'm using Bitbucket Pipelines. I want it to push the entire contents of my repo (which is very small) to S3. I don't want to zip it up, push it to S3, and then unzip it there. I just want it to take the existing file/folder structure in my Bitbucket repository and push that to S3.

What should the yaml file and the .py file look like to accomplish this?

Here is my current yaml file:

image: python:3.5.1

pipelines:
  branches:
    master:
      - step:
          script:
            # - apt-get update # required to install zip
            # - apt-get install -y zip # required if you want to zip repository objects
            - pip install boto3==1.3.0 # required for s3_upload.py
            # the first argument is the name of the existing S3 bucket to upload the artefact to
            # the second argument is the artefact to be uploaded
            # the third argument is the bucket key
            # html files
            - python s3_upload.py my-bucket-name html/index_template.html html/index_template.html # run the deployment script
            # Example command line parameters. Replace with your values
            #- python s3_upload.py bb-s3-upload SampleApp_Linux.zip SampleApp_Linux # run the deployment script

Here is my current Python:

from __future__ import print_function
import os
import sys
import argparse
import boto3
from botocore.exceptions import ClientError

def upload_to_s3(bucket, artefact, bucket_key):
    """
    Uploads an artefact to Amazon S3
    """
    try:
        client = boto3.client('s3')
    except ClientError as err:
        print("Failed to create boto3 client.\n" + str(err))
        return False
    try:
        client.put_object(
            Body=open(artefact, 'rb'),
            Bucket=bucket,
            Key=bucket_key
        )
    except ClientError as err:
        print("Failed to upload artefact to S3.\n" + str(err))
        return False
    except IOError as err:
        print("Failed to access artefact in this directory.\n" + str(err))
        return False
    return True


def main():

    parser = argparse.ArgumentParser()
    parser.add_argument("bucket", help="Name of the existing S3 bucket")
    parser.add_argument("artefact", help="Name of the artefact to be uploaded to S3")
    parser.add_argument("bucket_key", help="Name of the S3 Bucket key")
    args = parser.parse_args()

    if not upload_to_s3(args.bucket, args.artefact, args.bucket_key):
        sys.exit(1)

if __name__ == "__main__":
    main()

This requires me to list every file in the repo as a separate command in the yaml file. I just want it to grab everything and upload it to S3.

4 Answers

You can switch to using this Docker image: https://hub.docker.com/r/abesiyo/s3/

It runs quite well.

bitbucket-pipelines.yml

image: abesiyo/s3

pipelines:
  default:
    - step:
        script:
          - s3 --region "us-east-1" rm s3://<bucket name>
          - s3 --region "us-east-1" sync . s3://<bucket name>

Also set the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in Bitbucket Pipelines.
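
If you would rather not depend on a third-party image, the same sync should be possible with the official AWS CLI image; this is only a sketch (the amazon/aws-cli image and the --delete flag are my additions, not part of the original answer):

image: amazon/aws-cli

pipelines:
  default:
    - step:
        script:
          # the CLI picks up AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from the environment
          - aws s3 sync . s3://<bucket name> --region us-east-1 --delete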

Answered 2016-10-25T10:00:04.080

Atlassian now offers "Pipes" to simplify the configuration of some common tasks. There is also one for uploading to S3.

No need to specify a different image:

image: node:8

pipelines:
  branches:
    master:
      - step:
          script:
            - pipe: atlassian/aws-s3-deploy:0.2.1
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: "us-east-1"
                S3_BUCKET: "your.bucket.name"
                LOCAL_PATH: "dist"
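
Since the question is about pushing the whole repository rather than a build folder, LOCAL_PATH can presumably be pointed at the repo root instead of dist (my variation, not part of the original answer):

                LOCAL_PATH: "."
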
Answered 2019-02-01T14:51:18.977

I figured it out myself. Here is the Python file, 's3_upload.py':

from __future__ import print_function
import os
import sys
import argparse
import boto3
#import zipfile
from botocore.exceptions import ClientError

def upload_to_s3(bucket, artefact, is_folder, bucket_key):
    """Upload a single file, or every file under a folder, to S3."""
    try:
        client = boto3.client('s3')
    except ClientError as err:
        print("Failed to create boto3 client.\n" + str(err))
        return False
    # argparse passes strings, so the folder flag arrives as the literal text 'true'
    if is_folder == 'true':
        for root, dirs, files in os.walk(artefact, topdown=False):
            print('Walking it')
            for file in files:
                #add a check like this if you just want certain file types uploaded
                #if file.endswith('.js'):
                try:
                    print(file)
                    client.upload_file(os.path.join(root, file), bucket, os.path.join(root, file))
                except ClientError as err:
                    print("Failed to upload artefact to S3.\n" + str(err))
                    return False
                except IOError as err:
                    print("Failed to access artefact in this directory.\n" + str(err))
                    return False
                #else:
                #    print('Skipping file:' + file)
    else:
        print('Uploading file ' + artefact)
        client.upload_file(artefact, bucket, bucket_key)
    return True


def main():

    parser = argparse.ArgumentParser()
    parser.add_argument("bucket", help="Name of the existing S3 bucket")
    parser.add_argument("artefact", help="Name of the artefact to be uploaded to S3")
    parser.add_argument("is_folder", help="True if its the name of a folder")
    parser.add_argument("bucket_key", help="Name of file in bucket")
    args = parser.parse_args()

    if not upload_to_s3(args.bucket, args.artefact, args.is_folder, args.bucket_key):
        sys.exit(1)

if __name__ == "__main__":
    main()
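
Note that in the folder branch the local path (root/file) is reused as the S3 key, so the bucket_key argument is effectively ignored there; it only lines up in the yaml below because the folder name and the key prefix happen to match. A minimal sketch of how that loop could apply bucket_key explicitly (my variation, assuming POSIX-style paths):

    for root, dirs, files in os.walk(artefact):
        for file in files:
            local_path = os.path.join(root, file)
            # swap the leading folder name for the requested bucket_key prefix
            key = os.path.join(bucket_key, os.path.relpath(local_path, artefact))
            client.upload_file(local_path, bucket, key)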

And here is the bitbucket-pipelines.yml file:

---
image: python:3.5.1

pipelines:
  branches:
    dev:
      - step:
          script:
            - pip install boto3==1.4.1 # required for s3_upload.py
            - pip install requests
            # the first argument is the name of the existing S3 bucket to upload the artefact to
            # the second argument is the artefact to be uploaded
            # the third argument is if the artefact is a folder
            # the fourth argument is the bucket_key to use
            - python s3_emptyBucket.py dev-slz-processor-repo
            - python s3_upload.py dev-slz-processor-repo lambda true lambda
            - python s3_upload.py dev-slz-processor-repo node_modules true node_modules
            - python s3_upload.py dev-slz-processor-repo config.dev.json false config.json
    stage:
      - step:
          script:
            - pip install boto3==1.3.0 # required for s3_upload.py
            - python s3_emptyBucket.py staging-slz-processor-repo
            - python s3_upload.py staging-slz-processor-repo lambda true lambda
            - python s3_upload.py staging-slz-processor-repo node_modules true node_modules
            - python s3_upload.py staging-slz-processor-repo config.staging.json false config.json
    master:
      - step:
          script:
            - pip install boto3==1.3.0 # required for s3_upload.py
            - python s3_emptyBucket.py prod-slz-processor-repo
            - python s3_upload.py prod-slz-processor-repo lambda true lambda
            - python s3_upload.py prod-slz-processor-repo node_modules true node_modules
            - python s3_upload.py prod-slz-processor-repo config.prod.json false config.json

As an example for the dev branch, it grabs everything in the "lambda" folder, walks the entire structure of that folder, and for each item it finds it uploads it to the dev-slz-processor-repo bucket.
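
If you wanted to push the whole repository root in a single call instead of listing each folder, the same walk could presumably start from '.' as long as hidden directories such as .git are pruned; a rough sketch, reusing the client and bucket from upload_to_s3 (my assumption, not part of the original answer):

    for root, dirs, files in os.walk('.'):
        # prune hidden directories such as .git so they are not uploaded
        dirs[:] = [d for d in dirs if not d.startswith('.')]
        for file in files:
            local_path = os.path.join(root, file)
            # strip the leading './' so the S3 keys mirror the repo layout
            client.upload_file(local_path, bucket, os.path.relpath(local_path, '.'))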

Finally, here is a helpful script, 's3_emptyBucket', for removing all objects from the bucket before uploading the new ones:

from __future__ import print_function
import os
import sys
import argparse
import boto3
#import zipfile
from botocore.exceptions import ClientError

def empty_bucket(bucket):
    try:
        resource = boto3.resource('s3')
    except ClientError as err:
        print("Failed to create boto3 resource.\n" + str(err))
        return False
    print("Removing all objects from bucket: " + bucket)
    resource.Bucket(bucket).objects.delete()
    return True


def main():

    parser = argparse.ArgumentParser()
    parser.add_argument("bucket", help="Name of the existing S3 bucket to empty")
    args = parser.parse_args()

    if not empty_bucket(args.bucket):
        sys.exit(1)

if __name__ == "__main__":
    main()
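
One caveat (my addition, not part of the original answer): if the bucket has versioning enabled, objects.delete() only deletes the current objects (on a versioned bucket it effectively just adds delete markers); to really empty it, the old versions and delete markers would presumably need to be removed as well, for example:

    # for versioned buckets, also remove every object version and delete marker
    resource.Bucket(bucket).object_versions.delete()
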
Answered 2016-11-29T04:23:25.670

To deploy a static website to Amazon S3 I have this bitbucket-pipelines.yml configuration file:

image: attensee/s3_website

pipelines:
  default:
    - step:
        script:
          - s3_website push

I'm using the attensee/s3_website Docker image because it has the excellent s3_website tool installed. The s3_website configuration file, s3_website.yml (create this file in the root of the repository in Bitbucket), looks like this:

s3_id: <%= ENV['S3_ID'] %>
s3_secret: <%= ENV['S3_SECRET'] %>
s3_bucket: bitbucket-pipelines
site : .

We have to define the environment variables S3_ID and S3_SECRET in the Bitbucket settings.
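
If you are starting from scratch, the s3_website gem can also generate this configuration file for you; as far as I remember, the command is (not part of the original answer):

    s3_website cfg create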

Thanks to https://www.savjee.be/2016/06/Deploying-website-to-ftp-or-amazon-s3-with-BitBucket-Pipelines/ for the solution.

Answered 2017-06-05T15:19:56.300