
Google Cloud Platform collector

Configuration requirements

To run this collector, there are some configurations detailed below that you need to take into account.

Configuration

Details

GCP console access

  • You should have credentials to access the console.

Permissions

  • Administrator permissions to access the GCP console.

Logging services

The following features have been configured:

  • GCP project

  • Service account

  • GCP Pub/Sub

  • Sink (optional)

Enable SCC

  • SCC Audit logs: the logging service is used to pull this data source.

  • SCC Findings: the scc_findings service is used to pull this data source.

Credentials

  • The JSON credentials properties have been filled in or deleted, depending on the authentication method used.

More information

Refer to the Vendor setup section to know more about these configurations.

Overview

Google Cloud Platform (GCP) lets you build, deploy, and scale applications, websites, and services on the same infrastructure as Google. This collector integrates GCP with the Devo platform, making it easy to query and analyze GCP event data. You can view the data in the pre-configured Activeboards or customize them.

Devo’s GCP collector also retrieves data stored in GCP via Google Cloud APIs, such as audit logs, Security Command Center findings, networking, load balancing, and more, made available via Pub/Sub, so that Enterprise IT and Cybersecurity teams can query, correlate, analyze, and visualize it in Devo to make the most impactful decisions at petabyte scale.

Devo collector features

Feature

Details

Allow parallel downloading (multipod)

  • Allowed

Running environments

  • Collector server

  • On-premise

Populated Devo events

  • Table

Flattening preprocessing

  • No

For more information on how the events are parsed, visit our page.

Data sources

Data source

Description

API endpoint

Collector service name

Devo table

Available from release

Logging (formerly Stackdriver)

Cloud Logging allows you to store, search, analyze, monitor, and alert on logging data and events from Google Cloud and Amazon Web Services.

pub/sub queue

logging

cloud.gcp.<logname_part1>.<logname_part2>

This service allows you to select between two different autodispatcher systems. This is the structure for the one based on logName.

cloud.gcp.<resource_type_part1>.<resource_type_part2>

This service allows you to select between two different autodispatcher systems. This is the structure for the one based on resource type.

v1.0.20

Security Command Center Findings

Security Command Center is Google Cloud’s centralized vulnerability and threat reporting service.

pub/sub queue

scc_findings

cloud.gcp.scc.findings

v1.1.4

Vendor setup

To enable the collection in the vendor setup, there are some minimal requirements to follow:

  1. GCP console access: You should have credentials to access the GCP console.

  2. Owner or Administrator permissions within the GCP console.

Enable the Logging service

Here you will find how to enable the Logging service (formerly Stackdriver).

Logging Service Overview

GCP centralizes all the monitoring information from all the services in its cloud catalog in a service named Logging.

Some information is enabled by default and free of charge. Other information must be enabled manually and, if its generation is activated, will incur some costs. In both cases, the generated information (messages) will arrive at the Logging service.

The Logging service has different ways of exporting the information stored and structured in messages. In this case, another GCP service called Pub/Sub is used: this service contains a topic object that receives a filtered set of messages from the Logging service, and the GCP collector then retrieves all those messages from the topic object using a subscription (in pull mode).

To facilitate retrieval, it is recommended to split the source messages across different topic objects; you can split them by resource type, region, project ID, and so on.
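
The collector performs this pull for you, but if you want to check that messages are actually reaching a subscription, a minimal pull can be reproduced with the google-cloud-pubsub client library (the same library the collector uses, according to the change log). This is an illustrative sketch only: the project ID and subscription ID are placeholders, and it assumes the GOOGLE_APPLICATION_CREDENTIALS environment variable points to the service account JSON key described in the next section.

  from google.cloud import pubsub_v1

  # Placeholders: replace with your own project and subscription IDs.
  PROJECT_ID = "projectabc-1234"
  SUBSCRIPTION_ID = "my-devo-subscription"

  subscriber = pubsub_v1.SubscriberClient()
  subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

  # Pull a small batch of messages (pull mode), print them, and acknowledge
  # them so they are not redelivered.
  response = subscriber.pull(
      request={"subscription": subscription_path, "max_messages": 5},
      timeout=30,
  )
  for received in response.received_messages:
      print(received.message.data.decode("utf-8"))

  if response.received_messages:
      subscriber.acknowledge(
          request={
              "subscription": subscription_path,
              "ack_ids": [m.ack_id for m in response.received_messages],
          }
      )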

Configuration of the Logging service

Here you will find which features you need to configure to receive events from both services:

  1. GCP Project: You need to have a GCP Project in the console to be able to receive data.

  2. Service account: A service account is a special type of Google account that the collector uses to authenticate and access GCP resources, such as the Pub/Sub subscription.

  3. GCP Pub/Sub: This is the queue from which the events will be downloaded. It is necessary to create a Topic and a Subscription.

  4. Sink (optional): The sink is a filter so that only the types of events that you want are forwarded to the topic, as illustrated below.
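
As an illustration only (the exact filter depends on which events you want to receive), a sink that forwards only Cloud Audit activity log entries to the topic could use an inclusion filter such as the following, where <your_project_id> is a placeholder for your own project ID:

  logName="projects/<your_project_id>/logs/cloudaudit.googleapis.com%2Factivity"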

Here you will find the steps to configure each feature:

  1. Go to the left-bar menu, select IAM & Admin, and then click on Create a Project.

  2. Fill in the project details and click on Create.

  3. Select the project and click Open.

  1. Click on the name of the project in the top menu.

  2. Copy the Project ID.

  1. Go to your Google GCP console project, open the left-bar menu, click on IAM & Admin, and click on Service Accounts to create a GCP credential.

  2. Click on + Create Service Account to create the credentials.

    1. Fill in the section Service account details fields and click on CREATE AND CONTINUE.

    2. In the section Grant this service account access to project, set the Role field to Pub/Sub Subscriber. Click on CONTINUE.
      If you want to enable the undelivered messages logging feature, you will also need to add the Monitoring Viewer role to the service account.

    3. The section Grant users access to this service account is optional and does not need to be filled in.

    4. Finally, click on DONE.

  3. Now, you have to add a key to the service account that was previously created and download it in JSON format. After clicking on DONE, you will be redirected to the Service Accounts page of your project. Search for the service account that you created and click on it.

  4. On Service Account Details click on the KEYS tab.

  5. Click on the button ADD KEY and Create new key.

  6. Select the JSON format and click on CREATE.

  7. Download the credentials file and move it to <any_directory>/devo-collectors/gcp/credentials/ directory.

  8. Copy the content of the JSON file. You can use any free tool to convert the content of the JSON file to base64.

  1. Paste it into a base64 encoder and copy the result.
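
If you prefer not to paste the credentials into an online encoder, the same conversion can be done locally with a few lines of Python (a minimal sketch; the file name in the path is a placeholder):

  import base64
  from pathlib import Path

  # Placeholder path: the JSON key downloaded in the previous steps.
  key_path = Path("<any_directory>/devo-collectors/gcp/credentials/my-key.json")

  # Read the raw bytes and encode them as a single-line base64 string,
  # ready to use as the file_content_base64 parameter value.
  encoded = base64.b64encode(key_path.read_bytes()).decode("ascii")
  print(encoded)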

  1. Use the search tool to find the Pub/Sub service.

  2. Click on Create a topic.

  3. Fill in the Topic ID.

  4. Check the Add a default subscription box so that the subscription is created automatically.

  5. Click Create. The Subscription is created by default and is located in the Subscription tab.

  1. Click on the Subscription ID created in the previous step.

  2. Copy and save the subscription name.

  3. Search for IAM in the search tool and click on the IAM service.

  4. Click on the edit button of the service account that was created.

  5. Click on Add condition and fill it in as follows:

    1. Condition type: Select Resource → Name

    2. Operator: is

    3. Value: Use the name of the subscription you already copied.

  6. Click on the Save button.

Enable the Security Command Center Service (SCC)

Events can be retrieved differently depending on the source:

  • SCC Audit logs: Events obtained through the Logging service.

  • SCC Findings: Events obtained directly from the Security Command Center service, without going through the Logging service.

Enable the Security Command Center (SCC) Audit logs

The events will be obtained through the centralized Logging service. Refer to the Configuration of the Logging service section to know how to configure it.

Here you will find the steps to filter this type of event:

 

Action

Steps

1

Activate Security Command Center service

In order to receive this type of event, it is necessary to have the Security Command Center service activated.

Refer to the Security Command Center Quickstart video from the Google guide.

2

Setting up a new topic

Refer to the Configuration of the Logging service section to know how to do it.

3

Setting up a Pub/Sub

Refer to the Configuration of the Logging service section to know how to do it.

4

Setting up a sink

Refer to the Configuration of the Logging service section to know how to do it.

Enable the Security Command Center (SCC) Findings

These events are obtained from the Security Command Center service and are injected directly into the Pub/Sub without going through the Logging service.

 

Action

Steps

1

Configure Identity and Access Management (IAM) roles.

Refer to the official Google guide in which additional configurations are described.

2

Activate the Security Command Center API.

3

Setting up a Pub/Sub topic.

4

Creating a Notification configuration.

Minimum configuration required for basic pulling

Although this collector supports advanced configuration, the fields required to retrieve data with basic configuration are defined below.

Setting

Details

source_id_value

This parameter allows you to assign a custom name to identify the environment of the infrastructure.

project_id_value

The name of the GCP project. Refer to the Configuration of the Logging service section to know how to get this value.

file_content_base64

The service account credentials in base64. Refer to the Configuration of the Logging service section to know how to get this value.

subscription_id

The ID of the Pub/Sub subscription. Refer to the Configuration of the Logging service section to know how to get this value.
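
Before running the collector, you can sanity-check that the value you plan to use for file_content_base64 decodes back to a valid service account key. This is an illustrative check only, not the collector's own validation logic:

  import base64
  import json

  def looks_like_service_account_key(value: str) -> bool:
      # Return True if the base64 string decodes to a service-account JSON key.
      try:
          decoded = base64.b64decode(value, validate=True)
          creds = json.loads(decoded)
      except ValueError:
          # Covers invalid base64 (binascii.Error) and invalid JSON alike.
          return False
      # Service account key files always declare this type.
      return creds.get("type") == "service_account"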

Accepted authentication methods

Depending on how you obtained the credentials, you will have to either fill in or delete the following properties in the JSON credentials configuration block.

 

Authentication method

Project ID

Base64 credentials

File credentials

Available on

1

Service account with Base64.

Required

Required

 

  • Collector Server

  • On-Premise

2

Service account with the file credentials.

Required

 

Required

  • On-Premise

Run the collector

Once the data source is configured, you can either send us the required information if you want us to host and manage the collector for you (Cloud collector), or deploy and host the collector on your own machine using a Docker image (On-premise collector).

Collector services detail

This section is intended to explain how to proceed with specific actions for services.

Custom Service

This is the only type of service that the GCP collector has. Multiple custom services can be created to ingest data from different Pub/Sub sinks; however, the only data sources supported by this collector are:

  • Logging events: The previous sections explain how to configure a Logging service, but more custom Logging services can be created with different Pub/Sub filters.

  • SCC findings events: This service is also a custom service configured with data coming from a source external to the Logging service.

Devo categorization and destination

The following table shows the Devo tables and the tags to which the events are ingested based on each data source:

Data Source

Devo tables

Devo tag

Details

Logging service

cloud.gcp.<logname_part1>.<logname_part2>

cloud.gcp.<logname_part1>.<logname_part2>

This is an autocalculated default tag structure to which the events that come from the Logging service are sent. These events are of type LogEntry.

This tag structure is based on the following message fields:

  • logname_part1 -> It is the first part of the logName field. For example, in logName: "projects/projectabc-1234/logs/cloudaudit.googleapis.com%2Factivity", the logname_part1 is cloudaudit.

  • logname_part2 -> It is the second part of the logName field. For example, in logName: "projects/projectabc-1234/logs/cloudaudit.googleapis.com%2Factivity", the logname_part2 is activity (see the sketch after this table).

cloud.gcp.<resource_type_part1>.<resource_type_part2>

cloud.gcp.<resource_type_part1>.<resource_type_part2>

This is an autocalculated default tag structure to which the events that come from the Logging service are sent. These events are of type MonitoredResource.

This tag structure is based on the following message fields:

  • resource_type_part1 -> It is the first part of the type field. For example, in "type": "gce_instance", the resource_type_part1 is gce.

  • resource_type_part2 -> It is the second part of the type field. For example, in "type": "gce_instance", the resource_type_part2 is instance.

custom_tag

custom_tag

If the user adds a custom tag, all events will be sent to that custom tag.

cloud.gcp.unknown.none

cloud.gcp.unknown.none

All events that are not in JSON format are sent to that tag (unless a custom tag has been defined).

SCC findings

cloud.gcp.scc.findings

cloud.gcp.scc.findings

This is the recommended value for the custom_tag parameter when ingesting SCC Findings. This tag must be defined as the custom tag.

custom_tag

custom_tag

You can also define a different tag for these events, but bear in mind that only cloud.gcp.scc.findings will be natively parsed by Devo as SCC Findings.
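
As a reference for the two autodispatching structures described in this table, the following sketch reproduces how the tag parts can be derived from a LogEntry's logName or from the resource type. It is an illustration of the documented rules, not the collector's internal code:

  from urllib.parse import unquote

  def tag_from_logname(log_name: str) -> str:
      # cloud.gcp.<logname_part1>.<logname_part2> from the logName field,
      # e.g. "projects/projectabc-1234/logs/cloudaudit.googleapis.com%2Factivity"
      log_id = unquote(log_name.split("/logs/")[-1])  # cloudaudit.googleapis.com/activity
      part1 = log_id.split(".")[0]                    # cloudaudit
      part2 = log_id.split("/")[-1]                   # activity
      return f"cloud.gcp.{part1}.{part2}"

  def tag_from_resource_type(resource_type: str) -> str:
      # cloud.gcp.<resource_type_part1>.<resource_type_part2> from the resource
      # "type" field, e.g. "gce_instance"
      part1, _, part2 = resource_type.partition("_")
      return f"cloud.gcp.{part1}.{part2}"

  print(tag_from_logname("projects/projectabc-1234/logs/cloudaudit.googleapis.com%2Factivity"))
  # -> cloud.gcp.cloudaudit.activity
  print(tag_from_resource_type("gce_instance"))
  # -> cloud.gcp.gce.instance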

Events service

Collector operations

This section is intended to explain how to proceed with specific operations of this collector.

Change log

Release

Released on

Release type

Details

Recommendations

v1.8.0

Dec 9, 2024

IMPROVEMENTS / BUG FIXES

Bug Fixes

  • Fixed a bug with big integers when converting to a JSON dictionary

Improvements

  • Updated DCSDK to 1.13.1

  • Updated Dockerfile to 1.3.1

Recommended version

v1.7.0

Jun 21, 2024

IMPROVEMENTS

Improvements

  • Added small changes to make the configuration compatible with versions older than 1.2.1

  • wheel upgraded from 0.42.0 to 0.43.0

  • google-cloud-logging upgraded from 3.6.0 to 3.10.0

  • google-cloud-pubsub upgraded from 2.18.4 to 2.21.4

  • google-cloud-monitoring upgraded from 2.15.1 to 2.21.0

  • pandas upgraded from 1.3.5 to 1.5.3

Upgrade

v1.6.0

Apr 29, 2024

IMPROVEMENTS

Improvements

  • Upgraded DCSDK from 1.9.2 to 1.11.1

  • Upgraded the Docker base image to 1.2.0

Upgrade

v1.5.0

Sep 25, 2023

IMPROVEMENTS / NEW FEATURES

Improvements

  • Upgraded DCSDK from 1.9.0 to 1.9.2

    • Store lookup instances into DevoSender to avoid creation of new instances for the same lookup

    • Ensure service_config is a dict into templates

    • Upgrade internal dependencies

New features

Upgrade

v1.4.0

Aug 3, 2023

IMPROVEMENTS

Improvements

  • Updated DCSDK from 1.7.2 to 1.9.0

    • Changed log level to some messages from info to debug

    • Changed some wrong log messages

    • Upgraded some internal dependencies

    • Changed queue passed to setup instance constructor

    • Ability to validate collector setup and exit without pulling any data

    • Ability to store in the persistence the messages that couldn't be sent after the collector stopped

    • Ability to send messages from the persistence when the collector starts and before the puller begins working

    • Ensure special characters are properly sent to the platform

Upgrade

v1.3.0

Apr 13, 2023

IMPROVEMENTS / BUG FIXING

Improvements

  • Improved base64 generation.

  • Updated DCSDK from 1.6.3 to 1.7.2.

    • Added a lock to enhance sender object

    • Added new class attrs to the __setstate__ and __getstate__ queue methods

    • Fix sending attribute value to the __setstate__ and __getstate__ queue methods

    • Added log traces when queues are full and have to wait

    • Added log traces of queues time waiting every minute in debug mode

    • Added method to calculate queue size in bytes

    • Block incoming events in queues when there is no space left

    • Send telemetry events to Devo platform

    • Upgraded internal Python dependency Redis to v4.5.4

    • Upgraded internal Python dependency DevoSDK to v5.1.3

    • Fixed obfuscation not working when messages are sent from templates

    • Obfuscation service can now be configured from user config and module definition

    • Obfuscation service can now obfuscate items inside arrays.

Bug fixing

  • Fixed a known issue on the DevoSender with the DCSDK update.

Upgrade

v1.2.2

Feb 27, 2023

-

-

-

v1.2.1

Nov 29, 2022

IMPROVEMENTS / BUG FIXING

Improvements

  • Devo Collector SDK upgraded from version 1.4.2 to version 1.4.4b.

    • Added some extra checks for supporting MacOS as development environment

    • The "template" supports the controlled stop functionality

    • Some log traces now are shown less frequently

    • The default value for the logging frequency for "main" processes has been changed (to 120 seconds)

    • Added log traces for knowing the execution environment status (debug mode)

    • Fixes in the current puller template version

    • The Docker container exits with the proper error code

Bug fixing

  • Configurable logging traces for undelivered messages in GCP were moved to a thread model to avoid a special case in which they were never triggered.
    pubsub_undelivered_messages_request_interval was renamed to pubsub_undelivered_messages_request_interval_in_seconds. The new default value is every 600 seconds.

Recommended version

v1.1.4

Jun 1, 2022

IMPROVEMENT

Improvements

  • New tag cloud.gcp.unknown.none for all services.

  • When the collector processes a message that is not in JSON format, it sends it to the cloud.gcp.unknown.none table (only if the custom tag is not used).

  • The behaviour of custom tags has been changed: If a custom tag is used the message will always go to the custom tag even if it is not in JSON format.

Upgrade

v1.1.3

May 23, 2022

IMPROVEMENT

Improvements

  • Validated base64 variables from config.yaml. A new function was created to check if the base64 token in the configuration file has a valid format.

  • Increased the queue consuming throughput

Upgrade

v1.1.2

Apr 12, 2022

IMPROVEMENT

VULNS

 

Improvements

  • The underlying Devo Collector SDK has been upgraded to v1.1.4 to improve efficiency, increase resilience, and mitigate vulnerabilities.

  • The hard-reset procedure when losing connection with Devo has been improved.

Vulnerabilities mitigated

  • CVE-2022-1664

  • CVE-2021-33574

  • CVE-2022-23218

  • CVE-2022-23219

  • CVE-2019-8457

  • CVE-2022-1586

  • CVE-2022-1292

  • CVE-2022-2068

  • CVE-2022-1304

  • CVE-2022-1271

  • CVE-2021-3999

  • CVE-2021-33560

  • CVE-2022-29460

  • CVE-2022-29458

  • CVE-2022-0778

  • CVE-2022-2097

  • CVE-2020-16156

  • CVE-2018-25032

Upgrade

v1.1.1

Mar 8, 2022

IMPROVEMENT

Improvements

  • The underlying Devo Collector SDK has been upgraded to v1.1.3 to improve efficiency and performance.

  • When the collector loses the connection with Devo, it executes a hard-restart protocol to force the reconnection with a fresh configuration.

Upgrade

v1.1.0

Mar 1, 2022

IMPROVEMENT

Improvements

  • The following properties have been renamed to be more user-readable:

    • credentials_file to filename

    • credentials_file_content_base64 to file_content_base64

  • Added new optional categorization mode which categorizes the events based on their fields to create the Devo Tag.

  • The underlying Devo Collector SDK has been upgraded to v1.1.0 to improve efficiency.

Upgrade