Devo provides model Python scripts that you deploy as a function on AWS Lambda to listen for changes in an AWS S3 bucket. New bucket objects are detected, collected, tagged, and forwarded securely to the Devo Cloud.

...

Open the config.json.example file in an editor and edit the values for the following parameters.

Parameter

Description

address

This is the host address of the Devo Cloud for the region you are using. It should be one of:

  • USA: collector-us.devo.io

  • Europe: eu.elb.relay.logtrust.net

port

The inbound port number of the Devo Platform host. This should always be 443.

chain

The name of the Devo domain Chain CA file.

This is usually chain.crt.

cert

The name of the Devo domain certificate file. 

Ex: devo_domain.crt

key

The name of the Devo domain private key file.

Ex: devo_domain.key

tag

This is the Devo tag that corresponds to the technology that generated the events you are sending to Devo. There are hundreds of supported technologies.

For log files in Common Event Format (CEF), it is not necessary to set this parameter; you can leave it blank. Just be sure that the technology is one that Devo supports in CEF.

If there is no Devo tag that corresponds to the event's technology, you can assign a tag that starts with my.app. In this case, the event's fields will not be parsed.

Save the file as config.json in the folder where the domain certificates and Python script are saved. Delete the original config.json.example file.  
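Putting the parameters above together, a completed config.json for a US-region domain might look like the following sketch. The certificate and key file names match the examples in the table; the tag value is an illustrative placeholder that you should replace with the tag for your technology:

```json
{
  "address": "collector-us.devo.io",
  "port": 443,
  "chain": "chain.crt",
  "cert": "devo_domain.crt",
  "key": "devo_domain.key",
  "tag": "my.app.s3.events"
}
```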

...

  1. Create a new AWS Lambda function in the same zone in which the S3 bucket resides.

  2. Click Blueprints, then click the s3-get-object-python blueprint tile.


  3. Click the Configure button. The next page contains three sections: Basic information, S3 trigger, and Lambda function code.

  4. In the Basic information section, enter a Name for the new function. 

    1. If using an existing role, make sure that it has Lambda execution and S3 read permissions. 

    2. If not using an existing role, create a new one. Under Role, select Create new role from AWS Policy Templates. Enter a role name and select Amazon S3 object read-only permissions as the Policy Template.

  5. In the S3 trigger section, select the Bucket that contains the events, set the Event type to All object create events, then select Enable trigger.

  6. Click Create function. The next page contains several sections in which you configure the details of your new function.

  7. Modify the Function code section as indicated below and for Function package, click Upload to select the .zip file you created earlier. Then, click Save to upload the file.

  8. In the Execution role section, select the role you specified or created for the function. In the Basic settings section, set the Memory, and set the Timeout to an interval that is close to, but less than, the event creation frequency. For example, if the log file creation frequency is 5 minutes, set the Timeout to 4 minutes and 30 seconds. In the Network section, select No VPC for the VPC value.

  9. Click Save.

  10. Now, select the new function to view its details. In the Execution role area, click View the <function-name> role to edit the role permissions.

  11. On the Permissions tab, click Attach policy. Select AmazonS3ReadOnlyAccess, then click Attach policy.
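The trigger configured above delivers an S3 event notification to the function on every object creation. As a rough sketch of the first thing the s3-get-object-python blueprint does with that event (the helper name here is hypothetical; the Devo-supplied script adds collection, tagging, and secure forwarding on top of this):

```python
# Hypothetical sketch: extract the bucket name and object key from the
# S3 event notification that AWS Lambda passes to the handler.
import urllib.parse


def get_bucket_and_key(event):
    """Return (bucket, key) for the first S3 record in a trigger event."""
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    # Object keys arrive URL-encoded (e.g. spaces encoded as '+'),
    # so decode before fetching the object.
    key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
    return bucket, key
```

The handler would then fetch the decoded object from S3, tag its events, and forward them to the Devo Cloud address configured in config.json.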

...