Create and manage data transfers with the Console

This page shows you how to use the Google Cloud console to set up and manage transfer jobs. To work with Storage Transfer Service programmatically, see Creating a Storage Transfer Service Client and Creating and Managing Transfers Programmatically.

Before you start

Do the following before creating a transfer job:

  1. Verify that you have Storage Transfer Service access by checking that you are assigned one of the following roles:

    • roles/owner
    • roles/editor
    • roles/storagetransfer.admin
    • roles/storagetransfer.user
    • A custom role that includes, at a minimum, the permissions of roles/storagetransfer.user.

      For more information about adding and viewing project-level permissions, see Using IAM permissions with projects.

    For more information, see Troubleshooting access.

    For more information about IAM roles and permissions in Storage Transfer Service, see Access control using IAM roles and permissions.

  2. Configure access to data sources and sinks.

Transferring on-premises data

We offer the following solutions for transferring on-premises data:

  • For small data sets, use gsutil rsync to transfer data between Cloud Storage and other cloud storage providers, or between Cloud Storage and your on-premises data.

  • For large data sets, use Transfer service for on-premises data to transfer data between Cloud Storage and your on-premises storage.

    For more information, see the following:

    • Before you begin
    • Download data from Cloud Storage (Preview)

Set up a transfer job

  1. In the Cloud console, go to the Transfer service for cloud data page.

    Go to Transfer Service for cloud data

  2. Click Create transfer job.

  3. Choose a source:

    Cloud Storage

    Your user account must have storage.buckets.get permission to select source and destination buckets. Alternatively, you can type the name of the bucket directly. For more information, see Troubleshooting access.

    1. Under Source type, select Google Cloud Storage bucket.

    2. Select a bucket by doing one of the following:

      • Enter an existing Cloud Storage bucket name in the Bucket name field, without the gs:// prefix. To specify a Cloud Storage bucket from another project, type the bucket's name exactly in the Bucket name field.

      • Select a bucket from a list of existing buckets in your projects by clicking Browse, then selecting a bucket.

        In the Browse pane, you can select buckets in other projects by clicking the project ID and then selecting a different project and bucket.

      • To create a new bucket, click Create new bucket.

    3. Optional: To include files in a particular path, enter the path in the Folder path field.

    4. Optional: To include a subset of files in your source, click Add prefix. You can include files based on filename prefix. For more information, see Selecting source objects to transfer.
    5. Optional: To apply more filters to your transfer, click Advanced filters. The following items are displayed:
      • Exclude files that start with: Excludes files from the transfer based on a name prefix that you specify. To specify a prefix, click Add prefix.
      • Include only files last modified: Include files in the transfer depending on when they were last modified before the transfer.

        You can specify an Absolute time range and a Relative time range. A relative time range is relative to the start time of the transfer.

    6. Click Next step.
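    If you later script this step, the console choices above correspond to fields of a transfer job's transferSpec in the Storage Transfer Service REST API (v1). The sketch below builds that request fragment as a plain Python dictionary; the bucket names, path, and prefixes are placeholders, and the field names come from the public API rather than from this page.

    ```python
    # Sketch: the Cloud Storage source and filter settings above, expressed
    # as the transferSpec portion of a Storage Transfer Service (v1 REST API)
    # transfer job. All names below are illustrative placeholders.
    transfer_spec = {
        "gcsDataSource": {
            "bucketName": "example-source-bucket",   # Bucket name field (no gs:// prefix)
            "path": "logs/",                         # Folder path field (optional)
        },
        "gcsDataSink": {
            "bucketName": "example-destination-bucket",
        },
        "objectConditions": {
            "includePrefixes": ["y=2015/"],          # Add prefix
            "excludePrefixes": ["y=2015/tmp"],       # Exclude files that start with
        },
    }
    ```

    The same dictionary can be passed as the transferSpec field when you create a job programmatically.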

    Amazon S3

    1. Under Select source, select Amazon S3 bucket.

    2. In the Bucket name field, enter the source bucket name.

      The bucket name is the name as it appears in the AWS Management Console.

    3. Select your Amazon Web Services (AWS) authentication method. You can provide an AWS access key or an Amazon Resource Name (ARN) for identity federation:

      • Access key: Enter your access key in the Access key ID field and the secret associated with your access key in the Secret access key field.

      • ARN: Enter your ARN in the AWS IAM role ARN field, with the following syntax:

        arn:aws:iam::ACCOUNT:role/ROLE-NAME-WITH-PATH                                  

        Where:

        • ACCOUNT : The AWS account ID with no hyphens.
        • ROLE-NAME-WITH-PATH : The AWS role name including path.

        For more information on ARNs, see IAM ARNs.

      For more information on Amazon access keys, see Creating an access key.

    4. Optional: To include a subset of files in your source, click Add prefix. You can include files based on filename prefix. For more information, see Selecting source objects to transfer.
    5. Optional: To apply more filters to your transfer, click Advanced filters. The following items are displayed:
      • Exclude files that start with: Excludes files from the transfer based on a name prefix that you specify. To specify a prefix, click Add prefix.
      • Include only files last modified: Include files in the transfer depending on when they were last modified before the transfer.

        You can specify an Absolute time range and a Relative time range. A relative time range is relative to the start time of the transfer.

    6. Click Next step.
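    Scripted against the v1 REST API, the Amazon S3 source above becomes an awsS3DataSource in the job's transferSpec. This sketch shows both authentication options side by side; in a real job exactly one is used, and every identifier below is a placeholder.

    ```python
    # Sketch: an Amazon S3 source for a Storage Transfer Service job
    # (v1 REST API field names). Use either the access-key form or the
    # role-ARN form, not both. All values are placeholders.
    s3_source_with_key = {
        "awsS3DataSource": {
            "bucketName": "my-aws-bucket",
            "awsAccessKey": {
                "accessKeyId": "AKIA-EXAMPLE",       # Access key ID field
                "secretAccessKey": "REPLACE_ME",     # Secret access key field
            },
        },
    }

    s3_source_with_arn = {
        "awsS3DataSource": {
            "bucketName": "my-aws-bucket",
            # AWS IAM role ARN field: arn:aws:iam::ACCOUNT:role/ROLE-NAME-WITH-PATH
            "roleArn": "arn:aws:iam::123456789012:role/example-transfer-role",
        },
    }
    ```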

    Microsoft Azure Blob Storage

    1. Under Select source, select Azure Storage container.

    2. Specify the following:

      1. Storage account name — the source Microsoft Azure Storage account name.

        The storage account name is displayed in the Microsoft Azure Storage portal under All services > Storage > Storage accounts.

      2. Container name — the Microsoft Azure Storage container name.

        The container name is displayed in the Microsoft Azure Storage portal under Storage explorer > Blob containers.

      3. Shared access signature (SAS) — the Microsoft Azure Storage SAS token created from a stored access policy. For more information, see Grant limited access to Azure Storage resources using shared access signatures (SAS).

        The default expiration time for SAS tokens is 8 hours. When you create your SAS token, be sure to set a reasonable expiration time that enables you to successfully complete your transfer.
    3. Optional: To include a subset of files in your source, click Add prefix. You can include files based on filename prefix. For more information, see Selecting source objects to transfer.
    4. Optional: To apply more filters to your transfer, click Advanced filters. The following items are displayed:
      • Exclude files that start with: Excludes files from the transfer based on a name prefix that you specify. To specify a prefix, click Add prefix.
      • Include only files last modified: Include files in the transfer depending on when they were last modified before the transfer.

        You can specify an Absolute time range and a Relative time range. A relative time range is relative to the start time of the transfer.

    5. Click Next step.
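    The three Azure values collected above map onto an azureBlobStorageDataSource in the v1 REST API's transferSpec. A sketch, with placeholder names and a truncated SAS token:

    ```python
    # Sketch: a Microsoft Azure Blob Storage source for a Storage Transfer
    # Service job (v1 REST API field names). All values are placeholders.
    azure_source = {
        "azureBlobStorageDataSource": {
            "storageAccount": "examplestorageaccount",   # Storage account name
            "container": "example-container",            # Container name
            "azureCredentials": {
                "sasToken": "sv=2020-08-04&ss=b&sig=EXAMPLE",  # SAS token
            },
        },
    }
    ```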

    URL list

    1. Under Select source, select URL list.

    2. Under URL of TSV file, provide the URL to a tab-separated values (TSV) file. See Creating a URL List for details about how to create the TSV file.

    3. Optional: To include a subset of files in your source, click Add prefix. You can include files based on filename prefix. For more information, see Selecting source objects to transfer.
    4. Optional: To apply more filters to your transfer, click Advanced filters. The following items are displayed:
      • Exclude files that start with: Excludes files from the transfer based on a name prefix that you specify. To specify a prefix, click Add prefix.
      • Include only files last modified: Include files in the transfer depending on when they were last modified before the transfer.

        You can specify an Absolute time range and a Relative time range. A relative time range is relative to the start time of the transfer.

    5. Click Next step.
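    In the v1 REST API, a URL-list source is an httpDataSource whose listUrl points at the TSV file. A sketch with a placeholder URL:

    ```python
    # Sketch: a URL-list source for a Storage Transfer Service job
    # (v1 REST API field names). The manifest URL is a placeholder and
    # must be publicly reachable over HTTP(S).
    url_list_source = {
        "httpDataSource": {
            "listUrl": "https://example.com/transfer-manifest.tsv",
        },
    }
    ```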

  4. Choose a destination by completing the following:

    1. In the Bucket name field, enter the destination bucket name or click Browse to select a bucket from a list of existing buckets in your current project. To create a new bucket, click Create new bucket.

    2. Optional: To transfer files to a particular path, enter the path in the Folder path field.

  5. Choose settings for the transfer job by completing the following:

    1. In the Describe your transfer job field, enter a description of the transfer. As a best practice, enter a description that is meaningful and unique so that you can tell jobs apart.

    2. Under When to overwrite, select one of the following:

      • If different: Overwrites destination files if the source file with the same name has a different ETag or checksum value.

      • Always: Always overwrites destination files when the source file has the same name, even if the files are identical.

      • Never: Never overwrites destination files.

    3. Under When to delete, select one of the following:

      • Never: Never delete files from either the source or destination.

      • Delete files from source after they're transferred: Delete files from the source after they're transferred to the destination.

      • Delete files from destination if they're not also at source: If files in the destination Cloud Storage bucket aren't also in the source, then delete the files from the Cloud Storage bucket.

        This option ensures that the destination Cloud Storage bucket exactly matches your source.

    4. Click Next step.
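    The overwrite and delete choices above correspond to boolean transferOptions fields in the v1 REST API. The mapping below is a sketch; "If different" is the API's default behavior when the overwrite flag is false, and the argument strings are hypothetical labels introduced here for illustration.

    ```python
    # Sketch: mapping the console's overwrite/delete choices onto the
    # transferOptions fields of the v1 REST API.
    def transfer_options(overwrite: str, delete: str) -> dict:
        options = {
            # "always" -> Always; anything else -> If different (API default).
            "overwriteObjectsAlreadyExistingInSink": overwrite == "always",
            # "from_source" -> Delete files from source after they're transferred.
            "deleteObjectsFromSourceAfterTransfer": delete == "from_source",
            # "unique_in_sink" -> Delete files from destination if not at source.
            "deleteObjectsUniqueInSink": delete == "unique_in_sink",
        }
        # The two delete flags are mutually exclusive.
        assert not (options["deleteObjectsFromSourceAfterTransfer"]
                    and options["deleteObjectsUniqueInSink"])
        return options
    ```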

  6. Choose your scheduling options:

    1. From the Run once drop-down list, select one of the following:

      • Run once: Runs a single transfer, starting at a time that you select.

      • Run every day: Runs a transfer daily, starting at a time that you select.

        You can enter an optional End date, or leave End date blank to run the transfer continually.

      • Run every week: Runs a transfer weekly, starting at a time that you select.

      • Run with custom frequency: Runs a transfer at a frequency that you select. You can choose to repeat the transfer at a regular interval of Hours, Days, or Weeks.

        You can enter an optional End date, or leave End date blank to run the transfer continually.

    2. From the Starting now drop-down list, select one of the following:

      • Starting now: Starts the transfer after you click Create.

      • Starting on: Starts the transfer on the date and time that you select. Click Calendar to display a calendar to select the start date.

    3. To create your transfer job, click Create.
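    The scheduling choices above correspond to the schedule block of a v1 REST API transfer job. A sketch of a recurring job starting 2024-01-15 at 02:00 UTC with a custom 24-hour frequency and no end date; the dates are placeholders:

    ```python
    # Sketch: the schedule portion of a Storage Transfer Service job
    # (v1 REST API field names). Dates and times are placeholders.
    schedule = {
        "scheduleStartDate": {"year": 2024, "month": 1, "day": 15},
        "startTimeOfDay": {"hours": 2, "minutes": 0, "seconds": 0},
        "repeatInterval": "86400s",  # Run with custom frequency: every 24 hours
        # Omit scheduleEndDate to run the transfer continually.
    }
    ```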

View transfer job details

You can view the following job details from the Cloud console:

  • The job description and name
  • The source type and location
  • The destination location
  • Job frequency
  • Job statistics

To view transfer job details, do the following:

  1. In the Cloud console, go to the Transfer service for cloud data page.

    Go to Transfer Service for cloud data

  2. Click the transfer job's description.

    The Job details page is displayed.

Run a transfer job from an existing configuration

You can run a transfer job from an existing transfer configuration, which lets you re-run a transfer job with settings you've used previously.

To run a transfer from an existing configuration, do the following:

  1. In the Cloud console, go to the Transfer service for cloud data page.

    Go to Transfer Service for cloud data

  2. Click the transfer job's description.

    The Job details page is displayed.

  3. To start a transfer job, click Start a run.

Alternatively, you can edit an existing transfer configuration to use new settings. For more information, see Editing an existing transfer configuration.
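The Start a run button corresponds to the transferJobs.run method in the v1 REST API: a POST to the job's resource name with the project ID in the request body. A sketch with placeholder names:

```python
# Sketch: the REST request behind "Start a run" (v1 API).
# The job name and project ID below are placeholders.
job_name = "transferJobs/1234567890"          # the job's automatically assigned name
run_url = f"https://storagetransfer.googleapis.com/v1/{job_name}:run"
run_body = {"projectId": "example-project"}
```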

Edit an existing transfer configuration

You can edit an existing transfer configuration to adjust a transfer to suit your needs. You can edit the following items of an existing transfer configuration:

  • The transfer job's description.
  • Microsoft Azure Blob Storage or Amazon S3 source credentials.
  • Any filters applied to the transfer job.
  • Options to overwrite or delete files.
  • The transfer job's schedule.

To edit a transfer job, do the following:

  1. In the Cloud console, go to the Transfer service for cloud data page.

    Go to Transfer Service for cloud data

  2. Click the transfer job's description.

    The Job details page is displayed.

  3. Click Configuration.

    The transfer job configuration is displayed.

  4. To change an item, click Edit next to the item.

  5. To start a job with the edits, click Start a run.
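Programmatically, editing a configuration corresponds to the transferJobs.patch method in the v1 REST API: only the fields named in updateTransferJobFieldMask are changed. A sketch that updates just the description, with placeholder values:

```python
# Sketch: a transferJobs.patch request body (v1 REST API).
# Only fields listed in updateTransferJobFieldMask are modified.
patch_body = {
    "projectId": "example-project",
    "transferJob": {
        "description": "Nightly logs sync (updated)",
    },
    "updateTransferJobFieldMask": "description",
}
```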

Delete a transfer job

You can delete transfer jobs that you no longer need. Deleting a job does the following:

  • Stops any existing transfers that are part of the job.
  • Stops any recurring transfers that are part of the job.
  • Erases the job's configuration details.

Deleting a job is permanent. Once you delete the transfer job, it is removed from the list of transfer jobs. Transfer job information is fully deleted from Storage Transfer Service after 30 days.

Alternatively, you can deactivate the transfer job, which keeps the job listed in the Jobs page and lets you reactivate or modify the job.

To delete a transfer job, do the following:

  1. In the Cloud console, go to the Transfer service for cloud data page.

    Go to Transfer Service for cloud data

  2. Select the checkbox next to the transfer job that you want to delete.

  3. Click Delete job.

    The Delete transfer job? dialog is displayed.

  4. Read the dialog, then confirm the job deletion by typing the job's automatically assigned name in the field and clicking Delete.

    The job is removed from the Jobs page.

Deactivate a transfer job

Deactivating a transfer job stops the transfer job from starting any further transfer operations, including future scheduled operations or operations manually started from the Start a run button.

To deactivate a transfer job, do the following:

  1. In the Cloud console, go to the Transfer service for cloud data page.

    Go to Transfer Service for cloud data

  2. Click the transfer job's description.

    The Job details page is displayed.

  3. Click Disable job. The Disable transfer job? dialog is displayed.

  4. Read the dialog, then confirm the job's deactivation by clicking Confirm.

    A notice at the top of the Job details page is displayed, reminding you that the job is deactivated.
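In the v1 REST API, disabling a job is a transferJobs.patch that sets the job's status to DISABLED (patching it back to ENABLED reactivates it). A sketch with placeholder names:

```python
# Sketch: disabling a job via transferJobs.patch (v1 REST API).
disable_body = {
    "projectId": "example-project",
    "transferJob": {"status": "DISABLED"},
    "updateTransferJobFieldMask": "status",
}
```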

Pause a transfer job

You can pause a transfer currently in progress. When you pause the transfer, the job state is maintained, and you can unpause it later. While a transfer is paused, the schedule will not trigger the job to run again.

To pause a transfer job, do the following:

  1. In the Cloud console, go to the Transfer service for cloud data page.

    Go to Transfer Service for cloud data

  2. Click the transfer job's description.

    The Job details page is displayed.

  3. Click Pause run.

    The status for the current operation is displayed as Paused.

Restart a paused transfer job

You can unpause a previously paused transfer job, which resumes the job from the point at which it was paused.

To unpause a transfer job, do the following:

  1. In the Cloud console, go to the Transfer service for cloud data page.

    Go to Transfer Service for cloud data

  2. Click the transfer job's description.

    The Job details page is displayed.

  3. Click Resume run.

    The status for the current operation is displayed as In progress.

View historical job status information

You can view the historical status information for transfer job runs. The following job information is available:

  • Transfer status
  • Start and stop times
  • Duration
  • Progress
  • Data transferred
  • Number of errors
  • Data skipped
  • Average speed estimate

To view historical status information for a transfer job, do the following:

  1. In the Cloud console, go to the Transfer service for cloud data page.

    Go to Transfer Service for cloud data

  2. Click the transfer job's description.

    The Job details page is displayed.

  3. To display the details of a particular run, click the Start time for the job run.

    The Run details page is displayed.
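Each historical run shown on this page is a transfer operation. In the v1 REST API, the transferOperations.list method returns them, and its filter parameter is itself a JSON-encoded string. A sketch with placeholder names:

```python
import json

# Sketch: building the filter string for transferOperations.list
# (v1 REST API). The project ID and job name are placeholders.
list_filter = json.dumps({
    "projectId": "example-project",
    "jobNames": ["transferJobs/1234567890"],
})
```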

View job error details

If you encounter errors during a transfer run, you can view error details to help troubleshoot the error.

To view error details for a transfer job run, do the following:

  1. In the Cloud console, go to the Transfer service for cloud data page.

    Go to Transfer Service for cloud data

  2. Click the transfer job's description.

    The Job details page is displayed.

  3. Do one of the following to display job details:

    • Most recent job run:

      1. Click either the See error details button or the View error details link.

        The Error details page is displayed.

    • Historical job run: Do the following:

      1. Click the job run's Start time.

        The Run details page is displayed.

      2. Click the View error details link.

        The Error details page is displayed.

  4. To display additional details about each displayed error, click Expand more.

  5. To filter errors, enter properties to filter in the Enter property name or value field.

    When you place your cursor in the field, a drop-down menu with relevant options is displayed to help you build your filter.

Selecting source objects to transfer

Storage Transfer Service provides prefixes that you can use to select which files to include in or exclude from the data source. In general, prefixes narrow down the objects that are transferred. You can use only include prefixes, only exclude prefixes, or both. The following guidance applies to Amazon S3, Microsoft Azure Blob Storage, and Cloud Storage data sources.

  • Do not include the leading slash in a prefix. For example, to include the requests.gz object in a transfer from the bucket path s3://my-aws-bucket/logs/y=2015/requests.gz, specify the include prefix logs/y=2015/requests.gz.

  • If you use include prefixes and exclude prefixes together, then exclude prefixes must start with the value of one of the include prefixes. For example, if you specify a as an include prefix, valid exclude prefixes are a/b, aaa, and abc.

  • If you use only exclude prefixes, there are no restrictions on the prefixes you can use.

  • If you do not specify any prefixes, then all objects in the bucket are transferred.

  • Do not provide a path name for the data source or sink bucket names. For example, s3://my-aws-bucket and gs://example-bucket are valid, but s3://my-aws-bucket/subfolder or gs://example-bucket/files are not. To include paths, use include and exclude prefixes.

  • Storage Transfer Service does not support remapping; that is, you cannot copy the path files/2015 in the data source to files/2016 in the data sink.

For more specifics about working with include and exclude prefixes, see the includePrefixes and excludePrefixes field descriptions in the API.

For more general information about prefixes, see Listing Keys Hierarchically Using a Prefix and Delimiter in the Amazon S3 documentation or the Objects list method for Cloud Storage.
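The prefix rules above can be condensed into a small checker. This is an illustrative sketch only; the service performs its own validation when you create a job.

```python
# Sketch: validating include/exclude prefixes against the rules described
# above. Illustrative only; not the service's actual validation logic.
def prefixes_valid(includes: list[str], excludes: list[str]) -> bool:
    # No leading slash in any prefix.
    if any(p.startswith("/") for p in includes + excludes):
        return False
    # When both kinds are present, each exclude prefix must start with
    # the value of one of the include prefixes.
    if includes and excludes:
        return all(any(e.startswith(i) for i in includes) for e in excludes)
    # Only excludes (or neither): no further restrictions.
    return True
```

For example, with the include prefix a, the exclude prefixes a/b, aaa, and abc are all valid, but b is not.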

Creating an access key

These steps give an overview of the process of creating Amazon S3 access key credentials that can be used in data transfers from an Amazon S3 bucket to a Cloud Storage bucket. For detailed information, see Creating an IAM User in Your AWS Account and Bucket Policy Examples.

For information on our data retention policy for user credentials, see User credentials.

  1. Create a new user in the AWS Identity and Access Management console.

  2. Note or download the access credentials.

    The downloaded credentials contain the user name, access key ID, and secret access key. When you configure the transfer job in Cloud Storage, you only need the access key ID and secret access key.

  3. Attach a managed policy to the IAM user that contains the permissions needed to complete a transfer.

    Attach the AmazonS3FullAccess policy if your transfer job is configured to delete source objects; otherwise, attach the AmazonS3ReadOnlyAccess policy. For example, the AmazonS3FullAccess managed policy attached to a user through the IAM console is:

      {
        "Version": "2012-10-17",
        "Statement": [
          {
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": "*"
          }
        ]
      }
  4. Optional: Create a policy that is more restrictive than the managed policies.

    For example, you can create a policy that limits access to just the Amazon S3 bucket. For more information, see Bucket Policy Examples.
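    As a sketch of such a restrictive policy, the dictionary below scopes read access to a single bucket instead of granting the broad managed policies; the bucket name is a placeholder, and you should confirm the exact action set your transfer needs against the AWS documentation.

    ```python
    # Sketch: an AWS policy document limited to one S3 bucket, stricter than
    # the AmazonS3ReadOnlyAccess managed policy. Bucket name is a placeholder.
    restrictive_policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:ListBucket", "s3:GetBucketLocation"],
                "Resource": [
                    "arn:aws:s3:::my-aws-bucket",     # bucket-level actions
                    "arn:aws:s3:::my-aws-bucket/*",   # object-level actions
                ],
            }
        ],
    }
    ```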

What's next

Learn how to work with Cloud Storage.