gsutil cp – Copy and Move Files on Google Cloud Platform. The gsutil cp and gsutil mv commands are mainly used to act on files or objects in Google Cloud Storage from your local machine or from a Compute Engine virtual machine. You can also use gsutil wildcards to copy multiple objects to GCS at once.
This guide helps you fully understand how to use the gsutil cp
command to copy or move files and objects between your local computer and Google Cloud Storage, and between a Compute Engine instance and your Google Cloud Storage buckets.
To use the gsutil command you need basic familiarity with the terminal (command line/SSH).
Install gsutil on your local computer
To start, you need to install the Google Cloud SDK on your local computer.
Once you have installed the SDK, open your terminal and initialize the gcloud environment.
gcloud init
This command lets you log in to your Google Cloud account and select the project you want to work with.
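If you want to confirm that initialization succeeded, the gcloud CLI can report the active account and project. A quick sketch (the output will of course depend on your own configuration):

```shell
# List the accounts gcloud is authorized with, and mark the active one
gcloud auth list

# Print the project currently configured for gcloud/gsutil commands
gcloud config get-value project
```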
gsutil Commands Usage
gsutil help
The built-in help command.
gsutil help cp
Lists the available options for the copy command.
gsutil help options
Shows the top-level command-line options.
gsutil version -l
Outputs the installation details of your gsutil setup.
gsutil cp Upload File
You can easily upload a file from your local computer to your Google Cloud Storage bucket using the gsutil cp command.
#syntax
gsutil cp <source-file> <destination-bucket>
#Example
gsutil cp filename.txt gs://bucketname
The above command will copy filename.txt
from your current directory and upload it to the specified Google Cloud Storage bucket.
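As the introduction mentioned, gsutil also understands wildcards, so you can upload several matching files with a single command. A sketch, where gs://bucketname is a placeholder for your own bucket:

```shell
# Upload every .txt file in the current directory to the bucket
# (gs://bucketname is a placeholder, not a real bucket)
gsutil cp *.txt gs://bucketname
```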
gsutil cp Upload Directory and Files
If you want to upload a directory and all of the files inside it recursively, you can use the -r
option.
gsutil cp -r folder-name gs://bucketname
If your folder is large, you can also use the -m
option to perform a parallel (multi-threaded/multi-processing) transfer, which is much faster than a sequential copy. Please check the example shown below.
gsutil -m cp -r folder-name gs://bucketname
gsutil cp Download File/Object
You can also download a file from your Google Cloud Storage bucket to your local computer. This time you just need to swap the source and destination locations, as shown below.
gsutil cp gs://bucketname/filename.txt local-location
As mentioned before, you can use the -r
option to download a folder and its contents from Google Cloud Storage.
gsutil cp -r gs://bucketname/folder-name local-location
You can also use the -m
option to download a large number of files with a parallel (multi-threaded/multi-processing) copy.
gsutil -m cp -r gs://bucketname/folder-name local-location
gsutil cp from Google Compute Engine Instance
One of the handiest uses for Google Cloud developers is copying a file from a Google Compute Engine instance to Google Cloud Storage and vice versa. By default, all Google Compute Engine instances have gsutil pre-installed, so you just need to run the usual gsutil cp command from your Compute Engine instance's terminal.
To run this command you need to enable Allow full access to all Cloud APIs for the instance; you can find this in your instance settings. Then SSH into your instance and run the following command.
sudo gsutil cp path/filename gs://bucket_name
To copy a file from Google Cloud Storage to your Compute Engine Instance you can use the following command.
sudo gsutil cp gs://bucket_name/file-name path/folder-name
Copy from AWS S3 to Google Cloud Storage
To transfer files between AWS S3 and Google Cloud Storage you need to set up S3 access credentials in a .boto
file in your home directory.
Create a new file named .boto
and populate it with the following.
[Credentials]
aws_access_key_id = ACCESS_KEY_ID
aws_secret_access_key = SECRET_ACCESS_KEY
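The two steps above can also be scripted. This sketch writes the same .boto file with a heredoc; ACCESS_KEY_ID and SECRET_ACCESS_KEY are placeholders you would replace with your real AWS credentials:

```shell
# Write a minimal .boto file in the home directory.
# The two values below are placeholders, not working credentials.
cat > "$HOME/.boto" <<'EOF'
[Credentials]
aws_access_key_id = ACCESS_KEY_ID
aws_secret_access_key = SECRET_ACCESS_KEY
EOF
```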
Now you can execute the following command to transfer the files from Amazon S3 to GCS.
gsutil cp s3://bucket-name/filename gs://bucket-name
To copy a file from GCS to S3 you can use this command.
gsutil cp gs://bucket-name/filename s3://bucket-name
You can use the -r
option to copy directories, and the -m
option to copy a large number of files in parallel.
Copy between two Google Cloud Storage buckets
To transfer a file between two Google Cloud Storage buckets you can use the following command.
gsutil cp gs://source-bucket-name/filename gs://destination-bucket-name
Copy file from URL to Google Cloud Storage
You can copy a file from any URL and upload it directly to your Google Cloud Storage bucket by piping the download into gsutil cp; the - tells gsutil to read the data from stdin.
curl -L file-url | gsutil cp - gs://bucket-name/filename
Create a new folder using the gsutil cp command
Google Cloud Storage has a flat namespace, so folders are really just prefixes in object names. You can create an apparent folder by copying an object into a new prefix with the cp
command.
gsutil cp filename gs://bucket-name/new-folder/filename
Set up Permissions for objects with gsutil
You can also set up permissions such as read, write, or owner rights on the objects that are copied from other buckets.
To provide read access to all users, you can use a wildcard to match multiple files and folders, as shown below.
gsutil -m acl ch -u AllUsers:R gs://bucket_name/folder/*.jpg
The above command will make all jpg
files in that particular folder public.
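If you later need to undo that, the acl ch command's -d flag removes a grant. A sketch, with bucket_name and the folder path as placeholders:

```shell
# Remove the AllUsers read grant from the same objects,
# making them private again (placeholder bucket and path)
gsutil -m acl ch -d AllUsers gs://bucket_name/folder/*.jpg
```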
You can also provide write access to the bucket.
gsutil -m acl ch -u AllUsers:W gs://bucket_name
For more details about permissions, have a look at the official documentation.
Conclusion
Now you have learned several ways to copy files using the gsutil cp command: from AWS S3, from a URL, between buckets, and from a VM instance to Google Cloud Storage.
If you like this post and have some other ideas please feel free to post a comment below.