Cloud Logging for Storage Transfer Service

This page describes how to configure and view Cloud Logging for Storage Transfer Service logs.

Cloud Logging for Storage Transfer Service is supported for all transfers. FIND operations are not logged for agent-based transfers.

For file system transfers, you can additionally configure file system transfer logs.

Before you begin

Before you begin, verify that you have access to Cloud Logging. We recommend the Logs Viewer (roles/logging.viewer) Identity and Access Management role. For more information on Logging access, see Access control with IAM.


Loggable actions

The following actions can be logged:

  • FIND: Finding work to do, such as listing files in a directory or listing objects in a bucket. Not supported for agent-based transfers.
  • COPY: Copying files or objects to Cloud Storage.
  • DELETE: Deleting files or objects at the source or the destination. For transfers between two file systems, also logs the deletion of files from the intermediary Cloud Storage bucket.

For each action, you can choose to log the succeeded state, the failed state, or both.

Enable logging

To enable logging, specify the actions and the states to log.

gcloud CLI

When creating a transfer job with gcloud transfer jobs create, use the following flags to enable logging:

gcloud transfer jobs create SOURCE DESTINATION \
  --log-actions=copy,delete,find \
  --log-action-states=succeeded,failed

You must specify at least one value for each flag.

REST

To create a logging configuration, use transferJobs.create with a LoggingConfig:

{
  "name": "transferJobs/myFirstTransfer",
  "status": "ENABLED",
  "projectId": "test-id-001",
  "loggingConfig": {
     "logActions": ["FIND", "DELETE", "COPY"],
     "logActionStates": ["SUCCEEDED", "FAILED"]
  },
  "transferSpec": {
      "awsS3DataSource": {
          "bucketName": "AWS_SOURCE_NAME",
          "awsAccessKey": {
              "accessKeyId": "AWS_ACCESS_KEY_ID",
              "secretAccessKey": "AWS_SECRET_ACCESS_KEY"
          }
      },
      "gcsDataSink": {
           "bucketName": "destination_bucket",
           "path": "foo/bar/"
      }
   }
}

Adjust loggingConfig to include the specific logActions and logActionStates to log. For example, to log when copy and find actions fail, provide the following loggingConfig:

"loggingConfig": {
  "logActions": ["COPY", "FIND"],
  "logActionStates": ["FAILED"],
}
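When building a LoggingConfig programmatically, a client-side check against the documented enum values can catch typos before the API call. The following is a minimal Python sketch; the validate_logging_config helper is illustrative, not part of any Google client library:

```python
# Enum values documented for LoggingConfig above.
VALID_ACTIONS = {"FIND", "COPY", "DELETE"}
VALID_STATES = {"SUCCEEDED", "FAILED"}

def validate_logging_config(config: dict) -> dict:
    """Raise ValueError if a loggingConfig dict uses unknown enum
    values; otherwise return the config unchanged."""
    actions = config.get("logActions", [])
    states = config.get("logActionStates", [])
    unknown = (set(actions) - VALID_ACTIONS) | (set(states) - VALID_STATES)
    if unknown:
        raise ValueError(f"Unknown loggingConfig values: {sorted(unknown)}")
    return config

# Log only failed copy and find actions.
config = validate_logging_config(
    {"logActions": ["COPY", "FIND"], "logActionStates": ["FAILED"]}
)
```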

Update a logging configuration

gcloud CLI

To update an existing job's logging configuration, use the appropriate flags with the gcloud transfer jobs update command:

gcloud transfer jobs update NAME \
  --log-actions=copy,delete,find \
  --log-action-states=succeeded,failed

To disable logging for this job, specify --clear-log-config:

gcloud transfer jobs update NAME --clear-log-config

REST

To update an existing transfer job's logging configuration, use transferJobs.patch with LoggingConfig:

{
  "projectId": "test-id-001",
  "transferJob": {
    "loggingConfig": {
       "logActions": ["FIND", "DELETE", "COPY"],
       "logActionStates": ["SUCCEEDED", "FAILED"]
    }
  },
  "updateTransferJobFieldMask": "loggingConfig"
}

The updateTransferJobFieldMask specifies the field that is being updated in this request and is required.

To disable logging for this job, send a loggingConfig with empty lists for logActions and logActionStates:

{
  "projectId": "test-id-001",
  "transferJob": {
    "loggingConfig": {
       "logActions": [],
       "logActionStates": []
    }
  },
  "updateTransferJobFieldMask": "loggingConfig"
}
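A patch body like this can also be assembled in code. This sketch simply builds the same JSON shown above using the Python standard library; the disable_logging_patch helper name is illustrative:

```python
import json

def disable_logging_patch(project_id: str) -> str:
    """Build a transferJobs.patch request body that disables logging
    by clearing both loggingConfig lists."""
    body = {
        "projectId": project_id,
        "transferJob": {
            "loggingConfig": {"logActions": [], "logActionStates": []},
        },
        # Required: tells the API which field this patch updates.
        "updateTransferJobFieldMask": "loggingConfig",
    }
    return json.dumps(body, indent=2)

print(disable_logging_patch("test-id-001"))
```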

View logs

To view transfer logs, do the following:

Google Cloud console

  1. In the Google Cloud console, go to the navigation menu and select Logging > Logs Explorer.

  2. Select a Google Cloud project.

  3. From the Upgrade menu, switch from Legacy Logs Viewer to Logs Explorer.

  4. To filter your logs to show only Storage Transfer Service entries, type storage_transfer_job into the query field and click Run query.

  5. In the Query results pane, click Edit time to change the time period for which to return results.

For more information on using the Logs Explorer, see Using the Logs Explorer.

gcloud CLI

To use the gcloud CLI to search for Storage Transfer Service logs, use the gcloud logging read command.

Specify a filter to limit your results to Storage Transfer Service logs.

gcloud logging read "resource.type=storage_transfer_job"

Cloud Logging API

Use the entries.list Cloud Logging API method.

To filter your results to include only Storage Transfer Service-related entries, use the filter field. The following is a sample JSON request body:

{
  "resourceNames": [
    "projects/my-project-name"
  ],
  "orderBy": "timestamp desc",
  "filter": "resource.type=\"storage_transfer_job\""
}
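Filter clauses can be combined with AND to scope results further, for example to a single transfer job via its resource label. A small sketch that builds such a request body; the build_entries_request helper is illustrative:

```python
def build_entries_request(project_id, job_id=None):
    """Build an entries.list request body for Storage Transfer Service
    logs, optionally narrowed to one transfer job."""
    clauses = ['resource.type="storage_transfer_job"']
    if job_id:
        # Narrow to a single job via its resource label.
        clauses.append(f'resource.labels.job_id="{job_id}"')
    return {
        "resourceNames": [f"projects/{project_id}"],
        "orderBy": "timestamp desc",
        "filter": " AND ".join(clauses),
    }
```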

Transfer log format

The following section describes the fields for Storage Transfer Service logs.

All Storage Transfer Service-specific fields are contained within a jsonPayload object.

FIND actions

{
  "jsonPayload": {
    "@type": "type.googleapis.com/google.storagetransfer.logging.TransferActivityLog",
    "action": "FIND",
    "completeTime": "2021-12-16T18:58:49.344509695Z",
    "destinationContainer": {
      "gcsBucket": {
        "bucket": "my-bucket-2",
      },
      "type": "GCS",
    },
    "operation": "transferOperations/transferJobs-7876027868280507149--3019866490856027148",
    "sourceContainer": {
      "gcsBucket": {
        "bucket": "my-bucket-1"
      },
      "type": "GCS"
    },
    "status": {
      "statusCode": "OK"
    }
  }
}
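The container fields in an entry like this can be read back programmatically. A minimal sketch, where the field paths follow the payload above and the container_location helper is illustrative (it covers only two of the possible source types):

```python
def container_location(container):
    """Return a short location string for a source or destination
    container from a TransferActivityLog FIND entry."""
    kind = container.get("type")
    if kind == "GCS":
        return "gs://" + container["gcsBucket"]["bucket"]
    if kind == "AWS_S3":
        return "s3://" + container["awsS3Bucket"]["bucket"]
    # Other types (AZURE_BLOB, HTTP) are left to the reader.
    return kind or "UNKNOWN"

source = {"gcsBucket": {"bucket": "my-bucket-1"}, "type": "GCS"}
print(container_location(source))  # gs://my-bucket-1
```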

COPY and DELETE actions

{
  "jsonPayload": {
    "@type": "type.googleapis.com/google.storagetransfer.logging.TransferActivityLog",
    "action": "COPY",
    "completeTime": "2021-12-16T18:59:00.510509049Z",
    "destinationObject": {
      "gcsObject": {
        "bucket": "my-bucket-2",
        "objectKey": "README.md"
      },
      "type": "GCS",
    },
    "operation": "transferOperations/transferJobs-7876027868280507149--3019866490856027148",
    "sourceObject": {
      "gcsObject": {
        "bucket": "my-bucket-1",
        "lastModifiedTime": "2021-12-07T16:41:09.456Z",
        "md5": "WgnCOIdfCXNTUDpQJSKb2w==",
        "objectKey": "README.md",
      },
      "type": "GCS",
    },
    "status": {
      "statusCode": "OK"
    }
  }
}
The log fields are described below.

@type

The value is always type.googleapis.com/google.storagetransfer.logging.TransferActivityLog.
action

Describes the action of this particular task. One of the following:

  • FIND: Finding work to do, such as listing files in a directory or listing objects in a bucket. Not reported for agent-based transfers.
  • COPY: Copying files or objects to Cloud Storage.
  • DELETE: Deleting files or objects at the source, destination, or intermediary bucket.
completeTime

The ISO 8601-compliant timestamp at which the operation completed.
destinationContainer

Only present for FIND operations. FIND operations are not logged for agent-based transfers.

The destination container for this transfer. Contains two sub-fields:

  • gcsBucket.bucket: The destination Cloud Storage bucket name.
  • type: Always GCS.
destinationObject

Only present for COPY and DELETE operations.

Information about the object at the destination. Contains two sub-fields:

  • Either gcsObject or posixFile, depending on the destination. Both options contain multiple sub-fields that specify location, date/time info, and the object or file's hash.
  • type is one of GCS or POSIX_FS.

For example:

"destinationObject": {
  "type": "POSIX_FS",
  "posixFile": {
    "crc32c": "0",
    "path": "/tmp/data/filename.txt",
    "lastModifiedTime": "2022-09-22T04:33:45Z"
  }
}
operation

The fully-qualified transferOperations name.
sourceContainer

Only present for FIND operations. FIND operations are not logged for agent-based transfers.

The source container for this transfer. Contains two sub-fields:

  • An entry specifying the source location. The field is named according to the source type. Possible fields are as follows.
    • awsS3Bucket.bucket: The AWS S3 bucket name.
    • azureBlobContainer: Contains sub-fields account and container, which together define the Microsoft Azure Blob storage URI.
    • gcsBucket.bucket: The Cloud Storage bucket name.
    • httpManifest.url: The URL of a URL list that specifies publicly-available files to download from an HTTP(S) server.
  • type is one of AWS_S3, AZURE_BLOB, GCS, or HTTP.

For example:

"sourceContainer": {
  "gcsBucket": {
    "bucket": "my-bucket-1"
  }
  type: "GCS"
}
sourceObject

Only present for COPY and DELETE operations.

Information about the source object. Contains two sub-fields:

  • An entry specific to the source object's host. The field is named according to the source type and contains subfields for metadata. Possible fields are as follows.
    • awsS3Object: An AWS S3 object.
    • azureBlob: A file in Azure Blob Storage.
    • gcsObject: A Cloud Storage object.
    • httpFile: A file specified by a URL list.
    • posixFile: A file on a POSIX file system.
  • type is one of AWS_S3, AZURE_BLOB, GCS, HTTP, or POSIX_FS.

For example:

"sourceObject": {
  "gcsObject": {
    "bucket": "my-bucket-1"
    "lastModifiedTime": "2021-12-07T16:41:09.456Z"
    "md5": "WgnCOIdfCXNTUDpQJSKb2w=="
    "objectKey": "README.md"
  }
  type: "GCS"
}
status

The status of the action. If status.statusCode is OK, the action succeeded. Otherwise, the action failed. The status.errorType and status.errorMessage fields are only populated if the status is not OK.

In addition, the top-level resource field contains the following fields.

"resource": {
  "labels": {
    "job_id": "transferJobs/7876027868280507149"
    "project_id": "my-project-id"
  }
  "type": "storage_transfer_job"
}
resource.labels.job_id

The Storage Transfer Service job name to which this log belongs.

resource.labels.project_id

The Google Cloud project ID for this transfer.
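Putting the field descriptions above together, failed entries can be picked out of a batch of log payloads client-side: status.statusCode distinguishes success from failure, and errorType and errorMessage are present only on failures. A sketch with illustrative helper names:

```python
def action_failed(json_payload):
    """True if a TransferActivityLog payload records a failed action."""
    return json_payload.get("status", {}).get("statusCode") != "OK"

def failure_detail(json_payload):
    """Return 'errorType: errorMessage' for a failed entry, else None."""
    if not action_failed(json_payload):
        return None
    status = json_payload["status"]
    return f"{status.get('errorType')}: {status.get('errorMessage')}"

ok_entry = {"action": "COPY", "status": {"statusCode": "OK"}}
assert failure_detail(ok_entry) is None
```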