A Terraform module for Elastic Functionbeat to ship Cloudwatch logs

PacoVK/terraform-aws-functionbeat


[DEPRECATION NOTICE]

Functionbeat has been deprecated in favor of the new Elastic Serverless Forwarder. Fortunately, Elastic Serverless Forwarder ships with a Terraform deployment capability.
This module will stay maintained, but no additional features will be added.

INFO: AWS has deprecated the dedicated Go Lambda runtime. This module therefore runs the Go binary via the provided.al2 runtime, which requires Functionbeat version 8.12.1 or later. If you need to run an earlier version, you must use a module version < 3.x. The provided.al2 runtime support was heavily driven by lutz108!

What is this module for?

Terraform wrapper module to ship CloudWatch Logs to Kibana via Functionbeat. See the official docs.
The official Functionbeat is based on CloudFormation and ships with its own deployment CLI, so if you prefer to stick to Terraform you cannot use it alongside your infrastructure code base. This module wraps the base function to package the Functionbeat Lambda and actually deploys it via Terraform.

Requirements

Since this module executes a script, ensure your machine has the following software available:

  • jq
  • curl
  • tar
  • zip
  • unzip

Running under Alpine

ℹ️ The Functionbeat installer is not compatible with Alpine due to missing libc. To use this module on Alpine, e.g. in a CI pipeline, you need to provide the missing dependencies. You can install libc6-compat using apk add --no-cache libc6-compat.

Simple example

For a detailed example please refer to this blog post using the Elasticsearch output. Please note that output to Logstash is also possible, but in this example we use Elasticsearch.

resource "aws_security_group" "functionbeat_securitygroup" {
  name   = "Functionbeat"
  vpc_id = data.aws_vpc.vpc.id

  egress {
    from_port   = 443
    protocol    = "tcp"
    to_port     = 443
    description = "HTTPS"
    cidr_blocks = ["0.0.0.0/0"]
  }
}

module "functionbeat" {
  source = "git::ssh:https://[email protected]:PacoVK/functionbeat.git"

  application_name     = "crazy-test-application"
  functionbeat_version = "7.17.1"
  lambda_config = {
    name = "my-kibana-exporter"

    vpc_config = {
      vpc_id             = data.aws_vpc.vpc.id
      subnet_ids         = data.aws_subnets.private.ids
      security_group_ids = [aws_security_group.functionbeat_securitygroup.id]
    }

    output_elasticsearch = {
      hosts : ["https://your-endpoint:443"]
      protocol : "https"
      username : "elastic"
      password : "mysupersecret"
    }
  }
}

Advanced example

Head over to example/elasticsearch/elasticsearch.tf or example/logstash/logstash.tf to get a more advanced example.

Usage

| Parameter | Required | Description |
|-----------|----------|-------------|
| application_name | X | Name of the application to ship the logs from |
| functionbeat_version | X | Version of Functionbeat to download and deploy |
| lambda_config | X | Functionbeat and Lambda config (see below) |
| tags | - | Tags to add to all created AWS resources (see below) |
| lambda_reserved_concurrent_execution | - | Reserved concurrency (default: 5) |
| lambda_memory_size | - | Memory size (default: 128MB) |
| lambda_timeout | - | Timeout (default: 3s) |
| lambda_description | - | Description added to the Lambda (default: "Lambda function to ship cloudwatch logs to Kibana") |
| lambda_write_arn_to_ssm | - | Switch to control whether the actual Lambda ARN should be written to SSM (default: true) |
| fb_log_level | - | Functionbeat log level, set as an ENV variable on the Lambda for easy adjustment (default: info) |
| fb_extra_configuration | - | HCL map with additional Functionbeat config (default: {}) |
| fb_extra_tags | - | Tags of the shipper, included in their own field with each transaction published (default: []) |
| loggroup_name | - | Name of the CloudWatch log group to be added as a trigger for the function (default: null) |
| loggroup_filter_pattern | - | Regex pattern to filter logs which trigger the Lambda (default: "") |
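
As an illustration only (this block is not taken from the module docs), a module call that sets several of the optional parameters above could look like the following sketch; all values are placeholders:

module "functionbeat" {
  source = "git::ssh://git@github.com/PacoVK/functionbeat.git"

  application_name     = "crazy-test-application"
  functionbeat_version = "8.12.1"

  # Optional tuning (placeholder values)
  lambda_memory_size      = 256      # MB, default is 128
  lambda_timeout          = 30       # seconds, default is 3
  fb_log_level            = "debug"  # default is info
  loggroup_name           = "/aws/lambda/crazy-test-application"
  loggroup_filter_pattern = ""       # forward all events

  lambda_config = {
    # see "lambda_config (required)" below
  }
}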

lambda_config (required)

You configure your Lambda here.

  lambda_config = {
    name = "<NAME-OF-YOUR-LAMBDA>"
    vpc_config = {
      vpc_id = <TARGET-VPC>
      subnet_ids = <TARGET-SUBNET-IDS>
      security_group_ids = [<A-SECURITYGROUP-ID>]
    }
    # You can put any HCL map with valid Functionbeat config for the Elasticsearch output
    output_elasticsearch = {
      hosts = ["https://your-endpoint:443"]
      protocol = "https"
      username = "elastic"
      password = "mysupersecret"
    }
  }

Converting YAML into HCL

You can easily extend the Functionbeat configuration by setting fb_extra_configuration. Just head over to the official documentation. To ease your life, make use of the online YAML to HCL converter to translate YAML into valid HCL.

Example:

processors:
    - add_fields:
        target: project
        fields:
          name: myproject
          id: '574734885120952459'

becomes

processors = [
  {
    add_fields = {
      fields = {
        id = "574734885120952459"
        name = "myproject"
      }
      target = "project"
    }
  }
]

which results in the following module configuration

fb_extra_configuration = {
  processors = [
    {
      add_fields = {
        fields = {
          id = "574734885120952459"
          name = "myproject"
        }
        target = "project"
      }
    }
  ]
}

Outputs

This module exposes:

  • the Functionbeat Lambda ARN
  • the name of the created SSM parameter, if lambda_write_arn_to_ssm is set to true
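
The exact output names are defined in the module's output definitions (outputs.tf); the snippet below is only a hypothetical sketch of how they could be passed on from the calling configuration, with assumed output names:

# NOTE: the output names "lambda_arn" and "ssm_parameter_name" are assumptions
# for illustration; check the module's outputs.tf for the actual names.
output "functionbeat_lambda_arn" {
  value = module.functionbeat.lambda_arn
}

output "functionbeat_ssm_parameter_name" {
  # only populated when lambda_write_arn_to_ssm = true (the default)
  value = module.functionbeat.ssm_parameter_name
}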

Quick test

Requirements

  • Set up AWS credentials locally
  • Set up the Terraform CLI

In examples/ there is an advanced example. Simply check out the module source and run:

cd examples/elasticsearch
terraform init
terraform apply -auto-approve

Clean up after you're done

terraform destroy -auto-approve

Integrate with the Serverless Framework

You can easily attach the CloudWatch log groups of your Serverless application just by using the serverless-plugin-log-subscription plugin.

  1. Use this module to install the Lambda and ensure lambda_write_arn_to_ssm is set to true (the default).
module "functionbeat" {
  lambda_config = {
    name = "my-kibana-log-shipper"
  ...
}
  2. To attach the logs of all Lambdas in your Serverless application, add the following plugin config to your serverless.yml:
custom:
  logSubscription:
    enabled: true
    destinationArn: '${ssm:my-kibana-log-shipper_arn}'
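
If the log subscription is created from Terraform instead of the Serverless plugin, the same SSM parameter can be looked up with the aws_ssm_parameter data source. This is only a sketch; the parameter name assumes the "<lambda name>_arn" convention seen in the serverless.yml example above:

# Assumes the parameter is named "<lambda name>_arn" as in the serverless.yml example.
data "aws_ssm_parameter" "functionbeat_arn" {
  name = "my-kibana-log-shipper_arn"
}

# Use data.aws_ssm_parameter.functionbeat_arn.value as the destination ARN
# of a CloudWatch Logs subscription filter.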