Commit

[env][s]: change OBJECT_STORAGE to AWS in env names and STORAGE_BUCKET_NAME to STORAGE_BUCKET.
rufuspollock committed Jul 13, 2017
1 parent 025516f commit 3b175e0
Showing 3 changed files with 15 additions and 15 deletions.
8 changes: 4 additions & 4 deletions .env.template
@@ -19,11 +19,11 @@ DOMAIN_API=api-${STAGE}.${DOMAIN_BASE}
 # ======================
 
 # AWS Credentials - common across buckets
-OBJECT_STORAGE_ACCESS_KEY=
-OBJECT_STORAGE_SECRET_KEY=
+AWS_ACCESS_KEY=
+AWS_SECRET_KEY=
 
 # Bucket locations (used by various services)
-PACKAGESTORE_BUCKET_NAME=pkgstore-${STAGE}.${DOMAIN_BASE}
+PKGSTORE_BUCKET=pkgstore-${STAGE}.${DOMAIN_BASE}
 
 # ============
 # auth service
@@ -44,5 +44,5 @@ GITHUB_SECRET=
 # ============
 
 # NOTE: storage credentials are above in Object Storage
-RAWSTORE_BUCKET_NAME=rawstore-${STAGE}.${DOMAIN_BASE}
+RAWSTORE_BUCKET=rawstore-${STAGE}.${DOMAIN_BASE}
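
For context, a filled-in .env derived from this template might look like the following. Every value here is a hypothetical placeholder invented for illustration (STAGE, DOMAIN_BASE, and both keys), not data from the repository, and it assumes whatever loads the file expands the ${...} references:

# hypothetical example values only
STAGE=testing
DOMAIN_BASE=example.org
AWS_ACCESS_KEY=AKIA-EXAMPLE-ONLY
AWS_SECRET_KEY=example-secret-only
# with the two variables above, these resolve to
# pkgstore-testing.example.org and rawstore-testing.example.org
PKGSTORE_BUCKET=pkgstore-${STAGE}.${DOMAIN_BASE}
RAWSTORE_BUCKET=rawstore-${STAGE}.${DOMAIN_BASE}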

18 changes: 9 additions & 9 deletions docker-cloud.yml
@@ -62,20 +62,20 @@ rawstore:
   environment:
     VIRTUAL_HOST: ${DOMAIN_API}/rawstore/*
     AUTH_SERVER: http://auth:8000
-    STORAGE_ACCESS_KEY_ID: ${OBJECT_STORAGE_ACCESS_KEY}
-    STORAGE_SECRET_ACCESS_KEY: ${OBJECT_STORAGE_SECRET_KEY}
-    STORAGE_BUCKET_NAME: ${RAWSTORE_BUCKET_NAME}
+    STORAGE_ACCESS_KEY_ID: ${AWS_ACCESS_KEY}
+    STORAGE_SECRET_ACCESS_KEY: ${AWS_SECRET_KEY}
+    STORAGE_BUCKET_NAME: ${RAWSTORE_BUCKET}
     STORAGE_PATH_PATTERN: '{md5}'
 assembler:
   autoredeploy: true
   restart: always
   environment:
-    VIRTUAL_HOST=${DOMAIN_API}/pipelines/*
-    SOURCESPEC_REGISTRY_DB_ENGINE=postgresql://datahub@postgres/datahub
-    DPP_BASE_PATH=/pipelines/
-    PKGSTORE_BUCKET=${PACKAGESTORE_BUCKET_NAME}
-    AWS_ACCESS_KEY_ID=${OBJECT_STORAGE_ACCESS_KEY}
-    AWS_SECRET_ACCESS_KEY=${OBJECT_STORAGE_SECRET_KEY}
+    VIRTUAL_HOST: ${DOMAIN_API}/pipelines/*
+    SOURCESPEC_REGISTRY_DB_ENGINE: postgresql://datahub@postgres/datahub
+    DPP_BASE_PATH: /pipelines/
+    PKGSTORE_BUCKET: ${PKGSTORE_BUCKET}
+    AWS_ACCESS_KEY_ID: ${AWS_ACCESS_KEY}
+    AWS_SECRET_ACCESS_KEY: ${AWS_SECRET_KEY}
   image: 'datopian/assembler:latest'
 specstore:
   autoredeploy: true
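Beyond the renames, note that this hunk also switches the assembler block from KEY=VALUE lines to KEY: value mappings. Compose-style YAML generally accepts environment variables in one of two forms, a mapping or a list of KEY=VALUE strings; bare KEY=VALUE lines directly under a mapping key, as in the old block, are neither. A sketch of the two valid forms, reusing DPP_BASE_PATH from the diff above and assuming docker-cloud.yml follows Compose conventions here:

# mapping form
environment:
  DPP_BASE_PATH: /pipelines/

# equivalent list form (note the leading dash)
environment:
  - DPP_BASE_PATH=/pipelines/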
4 changes: 2 additions & 2 deletions main.py
@@ -89,8 +89,8 @@ def s3(self):
         """Creates regular and logging S3 Buckets if not exist"""
         s3_client = boto3.client(
             's3',
-            aws_access_key_id=self.config['OBJECT_STORAGE_ACCESS_KEY'],
-            aws_secret_access_key=self.config['OBJECT_STORAGE_SECRET_KEY']
+            aws_access_key_id=self.config['AWS_ACCESS_KEY'],
+            aws_secret_access_key=self.config['AWS_SECRET_KEY']
         )
         bucket_list = [self.config[env] for env in self.config if 'BUCKET' in env]
         for bucket in bucket_list:
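The hunk is truncated at the for loop. Going by the docstring ("Creates regular and logging S3 Buckets if not exist"), the continuation presumably checks whether each bucket exists and creates it if not. A minimal sketch of that pattern with boto3, reusing the s3_client and bucket_list names from the diff above; this illustrates the idea and is not the repository's actual code:

import botocore.exceptions

for bucket in bucket_list:
    try:
        # head_bucket succeeds only if the bucket exists and is accessible
        s3_client.head_bucket(Bucket=bucket)
    except botocore.exceptions.ClientError:
        # existence check failed: create the bucket
        s3_client.create_bucket(Bucket=bucket)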
