Arcus

Experimental GRIB Cloud Cache and Raster Data query platform

The aim of this project is to evaluate different technologies for analyzing, distributing and sharing GRIB data.

It will use free data provided by Deutscher Wetterdienst (DWD) at ftp://ftp-cdc.dwd.de/pub/REA/COSMO_REA6/

This project uses ecCodes from ECMWF and GDAL for GRIB encoding/decoding.
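
To get a first look at the data, GRIB messages can be inspected with the ecCodes Python bindings. The following is a minimal sketch, assuming the eccodes Python package is installed and a COSMO REA6 GRIB file has already been downloaded; the file name is a placeholder, not something Arcus ships:

    import eccodes

    # Iterate over all GRIB messages in a downloaded COSMO REA6 file
    # ("example.grb" is a placeholder for a real file name).
    with open("example.grb", "rb") as f:
        while True:
            gid = eccodes.codes_grib_new_from_file(f)
            if gid is None:
                break
            short_name = eccodes.codes_get(gid, "shortName")
            values = eccodes.codes_get_values(gid)  # decoded field as a numpy array
            print(short_name, values.min(), values.max())
            eccodes.codes_release(gid)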

Quickstart

Infrastructure

All infrastructure will be deployed on AWS. To install the AWS command line tools, please refer to https://docs.aws.amazon.com/cli/latest/userguide/awscli-install-linux.html. Terraform is used to create and modify the infrastructure; download the Terraform executable and take a look at its getting started guide.
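
Before running Terraform it can be useful to confirm that the AWS credentials on your machine are picked up as expected. A minimal check, assuming Python and boto3 are installed (this script is not part of Arcus):

    import boto3

    # Print the identity behind the locally configured AWS credentials.
    identity = boto3.client("sts").get_caller_identity()
    print("Account:", identity["Account"])
    print("ARN:", identity["Arn"])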

  • Create an AWS user and grant this user the following policies (a scripted alternative using boto3 is sketched at the end of this section):

    • AWS managed policies
      • SystemAdministrator
      • AmazonElasticFileSystemFullAccess
      • AmazonElasticMapReduceRole
    • Inline policy
    {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1500631120000",
            "Effect": "Allow",
            "Action": [
                "iam:CreatePolicy",
                "iam:PutRolePolicy",
                "iam:DeleteRolePolicy",
                "iam:CreateRole",
                "iam:AttachRolePolicy",
                "iam:CreateInstanceProfile",
                "iam:AddRoleToInstanceProfile",
                "iam:PassRole",
                "iam:DetachRolePolicy",
                "iam:RemoveRoleFromInstanceProfile",
                "iam:DeleteInstanceProfile",
                "iam:DeleteRole",
                "iam:DeleteUserPolicy",
                "iam:DeletePolicy",
                "elasticmapreduce:RunJobFlow",
                "elasticmapreduce:DescribeCluster",
                "elasticmapreduce:TerminateJobFlows",
                "lambda:AddPermission",
                "lambda:RemovePermission",
                "lambda:PublishLayerVersion",
                "apigateway:*",
            ],
            "Resource": [
                "*"
            ]
        }
    ]
    }
  • Create an S3 bucket where Arcus stores its internal components and name it e.g. my_internal_bucket_name. Note that the bucket name must be globally unique. Remember that name, as it is needed when you deploy the infrastructure (a scripted alternative using boto3 is sketched at the end of this section).

  • Download the following packages from Oracle OTN

    • oracle-instantclient18.3-basic-18.3.0.0.0-1.x86_64.rpm
    • oracle-instantclient18.3-sqlplus-18.3.0.0.0-1.x86_64.rpm
    • oracle-instantclient18.3-devel-18.3.0.0.0-1.x86_64.rpm

    and put the files under software/oracle/

  • Creation of infrastructure

terraform init

terraform import aws_s3_bucket.internal_bucket my_internal_bucket_name

terraform apply -var "arcus_internal_bucket_name=my_internal_bucket_name"

For further configuration see the variables.tf file in the terraform folder.

  • Destruction of infrastructure (the internal bucket is first removed from the Terraform state so that terraform destroy does not delete it)

terraform state rm aws_s3_bucket.internal_bucket

terraform destroy
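
For reference, the manual IAM and S3 preparation steps above can also be scripted. The following boto3 sketch assumes the inline policy shown above has been saved locally as iam_inline_policy.json; the user name, bucket name and region are placeholders, not names used by Arcus itself:

    import boto3

    iam = boto3.client("iam")
    s3 = boto3.client("s3")

    user_name = "arcus-deployer"             # placeholder user name
    bucket_name = "my-internal-bucket-name"  # placeholder; must be globally unique
                                             # and may not contain underscores

    # Create the deployment user and attach the AWS managed policies listed above.
    iam.create_user(UserName=user_name)
    for policy_arn in (
        "arn:aws:iam::aws:policy/job-function/SystemAdministrator",
        "arn:aws:iam::aws:policy/AmazonElasticFileSystemFullAccess",
        "arn:aws:iam::aws:policy/service-role/AmazonElasticMapReduceRole",
    ):
        iam.attach_user_policy(UserName=user_name, PolicyArn=policy_arn)

    # Attach the inline policy shown above (saved locally as iam_inline_policy.json).
    with open("iam_inline_policy.json") as f:
        iam.put_user_policy(
            UserName=user_name,
            PolicyName="ArcusDeployInline",
            PolicyDocument=f.read(),
        )

    # Create the internal bucket that Terraform imports later on.
    s3.create_bucket(
        Bucket=bucket_name,
        CreateBucketConfiguration={"LocationConstraint": "eu-central-1"},  # assumed region
    )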