Google Cloud Platform collector

Configuration requirements

To run this collector, you need to take into account the configurations detailed below.

Configuration

Details

GCP console access

  • You should have credentials to access the console.

Permissions

  • Administrator permissions to access the GCP console.

Logging services

The following features must be configured:

  • GCP project

  • Service account

  • GCP Pub/Sub

  • Sink (optional)

Enable SCC

  • SCC Audit logs: use the logging service to pull this data source.

  • SCC Findings: use the scc_findings service to pull this data source.

Credentials

  • The JSON credentials properties must be filled in or deleted, depending on the authentication method.

Refer to the Vendor setup section to know more about these configurations.

Overview

Google Cloud Platform (GCP) lets you build, deploy, and scale applications, websites, and services on the same infrastructure as Google. This collector allows you to integrate GCP with the Devo Platform, making it easy to query and analyze GCP event data, view it in the pre-configured Activeboards, or customize them to suit your needs.

This is a generic Pub/Sub collector that gathers data from any source that publishes to Pub/Sub through the Logging service. It also collects data from Netskope, which uses Pub/Sub Lite.

Devo’s GCP Collector enables customers to retrieve event data stored in GCP via Google Cloud APIs (such as audit logs, Security Command Center findings, networking, load balancing, and more) and ingest it into Devo via Pub/Sub. This allows IT and cybersecurity teams to query, correlate, analyze, and visualize data at petabyte scale, helping them make the most impactful decisions.
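
As an illustration of this retrieval pattern, the following Python sketch pulls a batch of messages from a Pub/Sub subscription using the official google-cloud-pubsub client. The project and subscription names are placeholders, not values used by the collector itself.

  from google.cloud import pubsub_v1

  # Placeholders: replace with your own project and subscription IDs.
  subscriber = pubsub_v1.SubscriberClient()
  subscription_path = subscriber.subscription_path("my-gcp-project", "test-topic-sub1")

  # Synchronously pull a small batch of messages.
  response = subscriber.pull(
      request={"subscription": subscription_path, "max_messages": 10}
  )

  for received in response.received_messages:
      # Each message body is a JSON-encoded GCP event.
      print(received.message.data.decode("utf-8"))

  # Acknowledge the batch so the messages are not redelivered.
  if response.received_messages:
      subscriber.acknowledge(
          request={
              "subscription": subscription_path,
              "ack_ids": [m.ack_id for m in response.received_messages],
          }
      )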

1.x to 2.x migration guide

If you are migrating from v1.x to v2.x, you can find a complete guide in this article.

Devo collector features

Feature

Details

Allow parallel downloading (multipod)

  • Allowed

Running environments

  • Collector server

  • On-premise

Populated Devo events

  • Table

Flattening preprocessing

  • No

Data sources

ID

Data Source

Description

API Endpoint

Collector Service Name

Devo Table

Available from release

1

Logging (formerly Stackdriver)

Cloud Logging is a fully managed service that allows you to store, search, analyze, monitor, and alert on logging data and events from Google Cloud and Amazon Web Services.

pub/sub queue

logging

cloud.gcp.<logname_part1>.<logname_part2>

This service allows you to select between two different autodispatcher systems. This is the structure for the one based on log name.

cloud.gcp.<resource_type_part1>.<resource_type_part2>

This service allows you to select between two different autodispatcher systems. This is the structure for the one based on resource type.

v1.0.20

3

Security Command Center Findings

Security Command Center is Google Cloud's centralized vulnerability and threat reporting service.

pub/sub queue

scc_findings

cloud.gcp.scc.findings

v1.1.4

4

Netskope Web Transactions

Netskope Web Transactions provide detailed visibility into web activity, including user behaviors, threats, and policy violations. This enables organizations to enforce security policies and protect against data exfiltration.

pub/sub lite

netskope-web-transaction

casb.netskope.transaction_events

v2.0.0

For more information on how the events are parsed, visit our page.
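
As a purely illustrative example of the logname-based autodispatching shown in the table above, the hypothetical helper below reduces a GCP logName such as projects/my-project/logs/cloudaudit.googleapis.com%2Factivity to a cloud.gcp.<logname_part1>.<logname_part2> table. The actual mapping logic is internal to the collector and may differ.

  from urllib.parse import unquote

  def devo_table_from_logname(log_name: str) -> str:
      """Hypothetical sketch of the logname-based table mapping."""
      # "projects/my-project/logs/cloudaudit.googleapis.com%2Factivity"
      # -> log id "cloudaudit.googleapis.com/activity"
      log_id = unquote(log_name.rsplit("/logs/", 1)[-1])
      part1, _, part2 = log_id.partition("/")
      part1 = part1.split(".")[0]  # keep the leading token, e.g. "cloudaudit"
      return f"cloud.gcp.{part1}.{part2 or 'unknown'}"

  print(devo_table_from_logname(
      "projects/my-project/logs/cloudaudit.googleapis.com%2Factivity"
  ))  # -> cloud.gcp.cloudaudit.activity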

Flattening preprocessing

This collector does not implement flattening preprocessing.

Vendor setup

To enable the collection in the vendor setup, there are some minimal requirements to follow:

  1. GCP console access: You should have credentials to access the GCP console.

  2. Owner or Administrator permissions within the GCP console.

Enable the Logging service

Here you will find how to enable the Logging service (formerly Stackdriver).

Logging Service Overview

GCP centralizes the monitoring information from all the services in its cloud catalog in a single service named Logging.

You have to use the logging service to pull this data source.

Some information is enabled by default and free of charge. Other information must be enabled manually and, once its generation is activated, will incur some costs. In both cases, the generated information (messages) arrives at the Logging service.

The diagram is only an example of some GCP services. There are many more GCP services.

The Logging service has different ways of exporting the information stored and structured in messages. In this case, another GCP service called Pub/Sub is used: this service contains a topic object that receives a filtered set of messages from the Logging service, and the GCP collector retrieves all those messages from the topic object using a subscription (in pull mode).

To facilitate retrieval, it is recommended to split the source messages using different topic objects. You can split them by resource type, region, project ID, and so on.
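
The pull-mode consumption described above can also be sketched with the streaming pull API of the google-cloud-pubsub client. The project and subscription names are placeholders.

  from concurrent.futures import TimeoutError

  from google.cloud import pubsub_v1

  subscriber = pubsub_v1.SubscriberClient()
  # Placeholder IDs: the subscription attached to the Logging topic.
  subscription_path = subscriber.subscription_path("my-gcp-project", "logging-topic-sub")

  def callback(message: pubsub_v1.subscriber.message.Message) -> None:
      # Each message carries one Logging entry routed to the topic.
      print(message.data.decode("utf-8"))
      message.ack()

  streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)

  with subscriber:
      try:
          streaming_pull_future.result(timeout=30)  # listen for 30 seconds
      except TimeoutError:
          streaming_pull_future.cancel()
          streaming_pull_future.result()  # block until the shutdown completes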

Configuration of the Logging service

Here you will find which features you need to configure to receive events from both services:

  1. GCP Project: You need to have a GCP Project in the console to be able to receive data.

  2. Service account: The service account is the Google identity that the collector uses to authenticate and access the Pub/Sub subscription.

  3. GCP Pub/Sub: This is the queue from which the events will be downloaded. It is necessary to create a Topic and a Subscription.

  4. Sink (optional): The sink is a filter to receive only the type of events that you want.

Here you will find the steps to configure each feature:

  1. Go to the left-bar menu, select IAM & Admin, and then click on Create a Project.

  2. Fill in the project details and click on Create.

  3. Select the project and click Open.

  1. Click on the name of the project in the top menu.

  2. Copy the Project ID.

Save the Project ID

It is important to save this value to later configure the collector.

  1. Go to your Google GCP console project, open the left-bar menu, click on IAM & Admin, and then click on Service Accounts to create a GCP credential.

  2. Click on + Create Service Account to create the credentials.

    1. Fill in the fields in the Service account details section and click on CREATE AND CONTINUE.

    2. In the section Grant this service account access to project, select the Pub/Sub Subscriber role and click on CONTINUE.
      If you want to enable the undelivered messages logging feature, you will also need to add the Monitoring Viewer role to the service account.

    3. The section Grant users access to this service account is optional and does not need to be filled in.

    4. Finally, click on DONE.

  3. Now you have to add a key to the service account that was previously created and download it in JSON format. After clicking on DONE, you’ll be redirected to the Service Accounts page of your project. Search for the service account that you created and click on it.

  4. On the Service account details page, click on the KEYS tab.

  5. Click on the ADD KEY button and select Create new key.

  6. Select the JSON format and click on CREATE.

  7. Download the credentials file and move it to the <any_directory>/devo-collectors/gcp/credentials/ directory.

  8. Copy the content of the JSON file. You can use any free software to convert it to Base64.

It is important to save the credentials file to later run the collector in the collector server.

  1. Paste it into a Base64 encoder and copy the result, or encode it with the sketch below.

It is important to save this value to later run the collector on-premise and in the collector server.
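
If you prefer not to use a third-party encoder, a few lines of Python produce the same Base64 value. The file path below is a placeholder.

  import base64

  # Placeholder path: the JSON key downloaded in the previous steps.
  with open("devo-collectors/gcp/credentials/my-service-account.json", "rb") as f:
      encoded = base64.b64encode(f.read()).decode("ascii")

  print(encoded)  # paste this value into the collector configuration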

  1. Use the search tool to find the Pub/Sub service.

  2. Click on Create a topic.

  3. Fill in the Topic ID.

  4. Check the Add a default subscription box so that the subscription is created automatically.

  5. Click Create. The Subscription is created by default and is located in the Subscription tab.

It is important to save the Subscription ID to use it later in the collector configuration. In this example, it is called test-topic-sub1.
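
The same topic and subscription can also be created programmatically. The sketch below reproduces the console steps with the google-cloud-pubsub client (the API has no Add a default subscription flag, so the subscription is created explicitly); the IDs are placeholders matching the example above.

  from google.cloud import pubsub_v1

  project_id = "my-gcp-project"  # placeholder

  publisher = pubsub_v1.PublisherClient()
  subscriber = pubsub_v1.SubscriberClient()

  topic_path = publisher.topic_path(project_id, "test-topic")
  subscription_path = subscriber.subscription_path(project_id, "test-topic-sub1")

  # Create the topic, then attach the subscription the collector will pull from.
  publisher.create_topic(request={"name": topic_path})
  subscriber.create_subscription(
      request={"name": subscription_path, "topic": topic_path}
  )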

This is an optional step

This step is optional; the service account already has access to all subscriptions. Refer to the Access control IAM documentation for more information on access control for subscriptions.

  1. Click on the Subscription ID created in the previous step.

  2. Copy and save the subscription name.

  3. Search for IAM in the search tool and click on the IAM service.

  4. Click on the edit button of the service account that was created.

  5. Click on Add condition and fill it in as follows:

    1. Condition type: Select Resource → Name

    2. Operator: is

    3. Value: Use the name of the subscription you already copied.

  6. Click on the Save button.

More information

For more information on access control, refer to the following article.

This is an optional step

This step is optional. The sink is a filter to receive only the type of events that you want; without it, events reach the default sink.

  1. Use the search tool and look for the Logging service.

  2. Click on Logs Router and click on Create Sink.

  3. Follow the steps and, when you finish, click on Create sink.

More information

Refer to the official Google documentation about how to Configure and manage sinks.
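
For reference, a sink routing filtered log entries to the Pub/Sub topic can also be created with the google-cloud-logging client, as in this hedged sketch. The sink name, filter, and destination are placeholders.

  from google.cloud import logging

  client = logging.Client(project="my-gcp-project")  # placeholder project

  sink = client.sink(
      "devo-gcp-sink",  # placeholder sink name
      filter_='logName:"cloudaudit.googleapis.com"',  # example filter
      destination="pubsub.googleapis.com/projects/my-gcp-project/topics/test-topic",
  )
  sink.create()  # requires the Logs Configuration Writer role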

Enable the Security Command Center Service (SCC)

It is mandatory to have the Logging service configured before enabling the SCC.

Events can be retrieved differently depending on the source:

  • SCC Audit logs: Events obtained through the Logging service.

  • SCC Findings: Events obtained directly from the Security Command Center service, without going through the Logging service.

Enable the Security Command Center (SCC) Audit logs

The events will be obtained through the centralized Logging service. Refer to the Configuration of the Logging service section to know how to configure it.

You have to use the logging service to pull this data source.

Here you will find the steps to filter this type of event:

Action

Steps

1

Activate Security Command Center service

In order to receive this type of event, it is necessary to have the Security Command Center service activated.

When SCC is activated, the events go directly through the Logging service to the default sink. The following steps are optional but recommended to filter SCC events into a separate Pub/Sub topic.

Refer to the Security Command Center Quickstart video from the Google guide.

2

Setting up a new topic

Refer to the Configuration of the Logging service section to know how to do it.

3

Setting up a Pub/Sub

Refer to the Configuration of the Logging service section to know how to do it.

4

Setting up a sink

Refer to the Configuration of the Logging service section to know how to do it.

Enable the Security Command Center (SCC) Findings

These events are obtained from the Security Command Center service and are injected directly into the Pub/Sub without going through the Logging service.

You have to use the scc_findings service to pull this data source.

Action

Steps

1

Configure Identity and Access Management (IAM) roles.

Refer to the official Google guide in which additional configurations are described.

2

Activate the Security Command Center API.

3

Setting up a Pub/Sub topic.

4

Creating a Notification configuration.
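
As a hedged sketch of step 4, the google-cloud-securitycenter client can create the notification configuration that publishes findings to the Pub/Sub topic. The organization ID, config ID, topic, and filter below are placeholders.

  from google.cloud import securitycenter

  client = securitycenter.SecurityCenterClient()

  created = client.create_notification_config(
      request={
          "parent": "organizations/123456789",  # placeholder organization
          "config_id": "devo-scc-findings",  # placeholder config ID
          "notification_config": {
              "description": "SCC findings for the Devo collector",
              "pubsub_topic": "projects/my-gcp-project/topics/scc-findings-topic",
              "streaming_config": {"filter": 'state = "ACTIVE"'},
          },
      }
  )
  print(created.name)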

Enable Netskope Web Transactions

Action

Steps

1

Configure IAM Roles

Ensure the necessary IAM permissions for Pub/Sub access. Assign the required roles to the Service Account.

2

Activate Netskope API

Enable the Netskope API in the Netskope admin console.

3

Create a Pub/Sub Topic

In the GCP console, go to Pub/Sub, create a new Topic, and ensure Add a default subscription is checked.

4

Set Up Subscription

Once the Topic is created, navigate to Subscriptions, edit the subscription properties as needed, and save.

5

Configure Collector

Add the correct Subscription ID and Service Account credentials (Base64 encoded) to the collector configuration.

For additional details, refer to Netskope's official documentation.
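
Because this service uses Pub/Sub Lite rather than standard Pub/Sub, consumption goes through the google-cloud-pubsublite client. The sketch below follows Google's streaming subscriber pattern; the project number, region/zone, and subscription ID are placeholders.

  from concurrent.futures import TimeoutError

  from google.cloud.pubsublite.cloudpubsub import SubscriberClient
  from google.cloud.pubsublite.types import (
      CloudRegion,
      CloudZone,
      FlowControlSettings,
      SubscriptionPath,
  )

  # Placeholders: project number, zone, and Lite subscription ID.
  location = CloudZone(CloudRegion("us-central1"), "a")
  subscription_path = SubscriptionPath(123456789, location, "netskope-web-transaction-sub")

  flow_control = FlowControlSettings(
      messages_outstanding=1000,  # max unacknowledged messages per partition
      bytes_outstanding=10 * 1024 * 1024,  # max unacknowledged bytes per partition
  )

  def callback(message):
      print(message.data.decode("utf-8"))  # one Netskope web transaction event
      message.ack()

  with SubscriberClient() as subscriber:
      future = subscriber.subscribe(
          subscription_path,
          callback=callback,
          per_partition_flow_control_settings=flow_control,
      )
      try:
          future.result(timeout=60)  # listen for 60 seconds
      except TimeoutError:
          future.cancel()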

Minimum configuration required for basic pulling

Although this collector supports advanced configuration, the fields required to retrieve data with basic configuration are defined below.

This minimum configuration refers exclusively to the parameters specific to this integration. There are more required parameters related to the generic behavior of the collector. Check the settings sections for details.

Setting

Details

source_id_value

This parameter allows you to assign a custom name to identify the environment of the infrastructure.

project_id_value

The name of the GCP project. Refer to the Configuration of the Logging service section to know how to get this value.

file_content_base64

The service account credentials in Base64. Refer to the Configuration of the Logging service section to know how to get this value.

subscription_name

The ID of the Pub/Sub subscription. Refer to the Configuration of the Logging service section to know how to get this value.

See the Accepted authentication methods section to verify what settings are required based on the desired authentication method.

Accepted authentication methods

Depending on how you obtained the credentials, you will have to either fill in or delete the following properties in the JSON credentials configuration block.

Authentication method

Project ID

Base64 credentials

File credentials

Available on

1

Service account with Base64.

Required

Required

 

  • Collector Server

  • On-Premise

2

Service account with the file credentials.

Required

 

Required

  • On-Premise
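
As a hedged sketch, this is how each method typically maps to a credentials object with the google-auth library; the Base64 source and file path are placeholders.

  import base64
  import json
  import os

  from google.oauth2 import service_account

  # Method 1: service account with Base64 (Collector Server or On-Premise).
  # The encoded JSON is read from an environment variable here as a placeholder.
  info = json.loads(base64.b64decode(os.environ["FILE_CONTENT_BASE64"]))
  credentials = service_account.Credentials.from_service_account_info(info)

  # Method 2: service account with the file credentials (On-Premise only).
  credentials = service_account.Credentials.from_service_account_file(
      "devo-collectors/gcp/credentials/my-service-account.json"  # placeholder path
  )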

Run the collector

Once the data source is configured, you can either send us the required information if you want us to host and manage the collector for you (Cloud collector), or deploy and host the collector in your own machine using a Docker image (On-premise collector).