
Devo provides model Python scripts that you deploy as an AWS Lambda function to listen for changes in an AWS S3 bucket. New bucket objects are detected, collected, tagged, and forwarded securely to the Devo Cloud.

We provide two model scripts, one for collecting events in text format and another for events in JSON format. Both need to be reviewed and customized for your environment.

If you are not comfortable working with Python code, we strongly recommend that you contact Devo customer support to request assistance with this procedure.

Due to the nature of services logging to S3, there will be a time gap between the generation of an event in the original source and its arrival in Devo. Log events are ingested only after they are written to the S3 bucket. Keep this in mind when searching for log events by time range and when setting the write frequency.

This article takes you step-by-step through the configuration process:

Download the Devo domain certificate files

In the Devo web application, go to Administration → Credentials → X.509 Certificates and download the X.509 Certificate, Private Key, and Chain CA to a new folder.

Download the source files

We provide you with the necessary script and configuration files to collect either plain text or JSON-formatted events from a file in an S3 bucket. 

If you need to collect files in another format and are proficient with Python code and Lambda functions, you can download either of these zip files and edit the Python script (lambda_function.py) as needed.

Download the zip files you need:

Decompress the zip file and copy the following folder and two files to the folder where you saved the Devo domain certificates:

  • /devo

  • config.json.example

  • lambda_function.py

Have a look at the README for a description of these files. 

Edit and rename the config.json.example file

Open the config.json.example file in an editor and edit the values for the following parameters.

  • address: The host address of the Devo Cloud for the region you are using. It should be one of:

      • USA: us.elb.relay.logtrust.net

      • Europe: eu.elb.relay.logtrust.net

  • port: The inbound port number of the Devo Cloud host. This should always be 443.

  • chain: The name of the Devo domain chain CA file. This is usually chain.crt.

  • cert: The name of the Devo domain certificate file. For example, devo_domain.crt.

  • key: The name of the Devo domain private key file. For example, devo_domain.key.

  • tag: The Devo tag that corresponds to the technology that generated the events you are sending to Devo. There are hundreds of supported technologies.

    For log files in Common Event Format (CEF), you can leave this parameter blank, as long as the technology is one that Devo supports in CEF.

    If there is no Devo tag that corresponds to the event's technology, you can assign a tag that starts with my.app. In this case, the event's fields will not be parsed.

Save the file as config.json in the folder where the domain certificates and Python script are saved. Delete the original config.json.example file.  
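For illustration only, here is roughly what a completed config.json could look like. This sketch is inferred from how lambda_function.py reads the configuration (config.get("sender") and config.get("tag")); if the layout of the config.json.example file in the ZIP differs, follow that file instead. The certificate file names and the tag below are placeholders for your own values.

Example config.json
{
    "sender": {
        "address": "us.elb.relay.logtrust.net",
        "port": 443,
        "chain": "chain.crt",
        "cert": "devo_domain.crt",
        "key": "devo_domain.key"
    },
    "tag": "my.app.s3.events"
}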

Customize the Python script for your environment

The Python scripts we provide are only models for collecting JSON or plain text events. There are variables in the scripts that you need to review and modify to suit your environment.

Customizing the script that collects JSON events

Below is an excerpt from the model Python script set up to collect JSON events from the S3 bucket. This is the section of code that you need to modify to suit your environment. In particular:

  • Change the value "Records" to the name of the JSON object that contains the event array you want to send to Devo.

  • Change the value of the zip parameter to True if you want to send the data compressed.

If you have questions about editing this Python script, contact Devo customer support.

Script for JSON events
###### START: From this point until END, you need to 
###### carefully review the code to make sure all 
###### variables match your environment.

        # If the name has a .gz extension, then decompress the data
        if key[-3:] == '.gz':
            data = zlib.decompress(data, 16+zlib.MAX_WBITS)

        config = Configuration("config.json")
        con = Sender(config=config.get("sender"))

        # Send JSON-formatted events to Devo
        print("Starting to send lines to Devo")
        counter = 0
        for line in data.splitlines():
            events_json = json.loads(line)
            for single_event in events_json["Records"]:
                counter += con.send(tag=config.get("tag"),
                                    msg=json.dumps(single_event),
                                    zip=False)
        con.close()
        print("Finished sending lines to Devo (%d)" % counter)
		
###### END of code containing key variables.
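For reference, the loop above expects each line of the collected file to be a standalone JSON document whose "Records" key holds the array of events (the layout used by services such as CloudTrail). The field names in this sample line are placeholders; if your source nests the event array under a different key, that key is the value to change in the loop.

Example input line (JSON events)
{"Records": [{"eventName": "PutObject", "eventTime": "2019-01-01T12:00:00Z"}, {"eventName": "GetObject", "eventTime": "2019-01-01T12:00:05Z"}]}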

Customizing the script that collects plain text events

Below is an excerpt from the model Python script set up to collect plain text events from the S3 bucket. This is the section of code that you need to modify to suit your environment. In particular:

  • Change the value of the zip parameter to True if you want to send the data compressed.

If you have questions about editing this Python script, contact Devo customer support.

Script for plain text events
###### START: From this point until END, you need to 
###### carefully review the code to make sure all 
###### variables match your environment.

        # If the name has a .gz extension, then decompress the data
        if key[-3:] == '.gz':
            data = zlib.decompress(data, 16+zlib.MAX_WBITS)

        config = Configuration("config.json")
        con = Sender(config=config.get("sender"))

        # Send plain text events to Devo
        print("Starting to send lines to Devo")
        counter = 0
        for line in data.splitlines():
            counter += con.send(tag=config.get("tag"), msg=line, zip=False)
        con.close()
        print("Finished sending (%d) lines to Devo" % counter)
		
###### END of code containing key variables.

Prepare a ZIP file for upload

You should now have a folder containing the devo folder (and its contents) plus five files: your updated and renamed configuration file (config.json), the Lambda Python script (lambda_function.py), and the three certificate files you downloaded from your Devo domain. Note that two of the certificate files carry the name of your Devo domain (devo_domain in this example).

Create a ZIP file containing the folder plus the five files, and name it whatever you like. You will upload this ZIP to AWS to create the Lambda function in step 7 of the next procedure. 
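If you prefer to script this step, below is a minimal sketch using Python's standard zipfile module. It assumes you run it from the folder that holds the files, that your certificate and key carry the devo_domain names used in this example, and that the archive name (devo-s3-lambda.zip) is arbitrary. Whatever tool you use, lambda_function.py and config.json must sit at the root of the archive so that Lambda can find the handler.

Example: building the ZIP with Python
import zipfile
from pathlib import Path

# The five files, plus the devo folder, all placed at the root of the archive.
# Adjust devo_domain.crt and devo_domain.key to your own domain file names.
FILES = [
    "config.json",
    "lambda_function.py",
    "chain.crt",
    "devo_domain.crt",
    "devo_domain.key",
]

with zipfile.ZipFile("devo-s3-lambda.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for name in FILES:
        zf.write(name)
    for path in Path("devo").rglob("*"):
        zf.write(path)

print("Created devo-s3-lambda.zip")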

Create a new Lambda function

This procedure guides you through creating the new Lambda function that will monitor the S3 bucket for changes. 

  1. Create a new AWS Lambda function in the same zone in which the S3 bucket resides.

  2. Click Blueprints, then click the s3-get-object-python blueprint tile. 

  3. Click the Configure button. The next page contains three sections: Basic information, S3 trigger, and Lambda function code.


  4. In the Basic information section, enter a Name for the new function. 

    1. If using an existing role, make sure that it has Lambda execution and S3 read permissions. 

    2. If not using an existing role, create a new one. Under Role, select Create new role from AWS Policy Templates. Enter a role name and select Amazon S3 object read-only permissions as the Policy Template.

  5.  In the S3 trigger section, select the Bucket that contains the events, set the Event type to All object create events, then select Enable trigger.

  6. Click Create function. The next page contains several sections in which you configure the details of your new function.

  7. In the Function code section, for Function package, click Upload and select the ZIP file you created earlier. Then click Save to upload the file.

  8. In the Execution role section, select the role you specified or created for the function. In the Basic settings section, set the Memory as needed and set the Timeout to an interval that is close to, but less than, the event creation frequency. For example, if the log file creation frequency is 5 minutes, set the Timeout to 4 minutes and 30 seconds. In the Network section, select No VPC for the VPC value.

  9. Click Save.

  10. Now, select the new function to view its details. In the Execution role area, click View the <function-name> role to edit the role permissions.

  11. On the Permissions tab, click Attach policy. Select AmazonS3ReadOnlyAccess, then click Attach policy.

Now you can confirm that the Lambda function has been correctly associated with the bucket. Go to S3 and open the bucket. On the bucket's Properties tab, make sure that there is an active notification listed under Events.

If there is no active notification, click the Events tile, then click Add notification. Set up a new event notification for All object create events that invokes your Lambda function, then click Save.

Now, every time a new object is written to the S3 bucket, its contents will be sent to your Devo domain with the tag specified in the config.json file.

