gsutil cp – Copy and Move Files on Google Cloud Platform. The gsutil cp and gsutil mv commands are mainly used to copy and move files or objects on Google Cloud Storage from your local machine or from your Compute Engine virtual machine. You can also use gsutil wildcards to copy multiple objects to GCS at once.
This guide helps you fully understand how to use the gsutil cp command to copy or move files and objects between your local computer and Google Cloud Storage, and between your Compute Engine instance and Google Cloud Storage buckets.
To use the gsutil command you need a little knowledge of the terminal (command line/SSH).
Install gsutil on your local computer
To start, you need to install the Google Cloud SDK on your local computer.
Once you have installed the SDK, open your terminal and initialize the gcloud environment.
gcloud init
This command lets you log in to your Google Cloud account and select the project on which to perform actions.
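If you later want to switch projects without re-running the full initialization flow, you can set the active project directly (PROJECT_ID below is a placeholder for your own project ID):
gcloud config set project PROJECT_ID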
Basic gsutil commands
gsutil help
The built-in help command.
gsutil help cp
Shows help for the copy command, including its available options.
gsutil help options
Gets info about the top-level command-line options.
gsutil version -l
Outputs the installation details of your gsutil.
gsutil cp Upload File/Object
Use the following command to upload a file from your local computer to your Google Cloud Storage bucket.
gsutil cp local-location/filename gs://bucketname/
You can use the -r option to upload a folder.
gsutil cp -r folder-name gs://bucketname/
You can also use the -m option to upload a large number of files; it performs a parallel (multi-threaded/multi-processing) copy.
gsutil -m cp -r folder-name gs://bucketname
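For example, assuming a local folder named photos and a bucket named my-bucket (both placeholder names), the following uploads the whole folder in parallel:
gsutil -m cp -r photos gs://my-bucket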
gsutil cp Download File/Object
Use the following command to download a file from your Google Cloud Storage bucket to your local computer.
gsutil cp gs://bucketname/filename local-location
You can use the -r option to download a folder from GCS.
gsutil cp -r gs://bucketname/folder-name local-location
You can also use the -m option to download a large number of files; it performs a parallel (multi-threaded/multi-processing) copy.
gsutil -m cp -r gs://bucketname/folder-name local-location
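As a concrete example (the bucket and folder names are placeholders), this downloads an entire folder into the current directory:
gsutil -m cp -r gs://my-bucket/photos .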
gsutil copy from Google Compute Engine Instance
To copy a file from your Google Compute Engine instance to Google Cloud Storage, you first need to enable Allow full access to all Cloud APIs for the instance. Then SSH into your instance and run the following command.
sudo gsutil cp path/filename gs://bucket_name
To copy a file from Google Cloud Storage to your Compute Engine instance, you can use the following command.
sudo gsutil cp gs://bucket_name/file-name path/folder-name
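If you create instances from the command line, you can grant this API access at creation time with the --scopes flag. A minimal sketch, where my-vm is a placeholder instance name:
gcloud compute instances create my-vm --scopes=cloud-platform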
Copy from AWS S3 to Google Cloud Storage
To transfer files between AWS S3 and Google Cloud Storage you need to set up S3 access credentials in a .boto configuration file (by default, gsutil reads ~/.boto in your home directory).
Create a new file named .boto and populate it with the following:
[Credentials]
aws_access_key_id = ACCESS_KEY_ID
aws_secret_access_key = SECRET_ACCESS_KEY
Now you can execute the following command to transfer the files from Amazon S3 to GCS.
gsutil cp s3://bucket-name/filename gs://bucket-name
To copy a file from GCS to S3, you can use this command.
gsutil cp gs://bucket-name/filename s3://bucket-name
You can use the -r option to copy directories, and the -m option to copy a large number of files, as in the example below.
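A sketch combining both options (the bucket names are placeholders):
gsutil -m cp -r s3://my-s3-bucket/data gs://my-gcs-bucket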
Copy between two Google Cloud Storage buckets
To transfer a file between two Google Cloud Storage buckets you can use the following command.
gsutil cp gs://bucket-name/filename gs://bucket-name
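The same syntax works with gsutil mv, which copies the object and then deletes the source. For example, to rename an object within a bucket (the names are placeholders):
gsutil mv gs://my-bucket/old-name.txt gs://my-bucket/new-name.txt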
Copy file from URL to Google Cloud Storage
You can copy a file from any URL and stream it directly into your Google Cloud Storage bucket, without saving it locally first. The dash (-) tells gsutil cp to read the data from standard input.
curl -L file-url | gsutil cp - gs://bucket-name/filename
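For example (the URL and object name are placeholders):
curl -L https://example.com/archive.zip | gsutil cp - gs://my-bucket/archive.zip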
Create a new folder using the gsutil cp command
You can also create a new folder in your bucket by copying a local folder with the cp command.
gsutil cp -r folder-name gs://bucket-name
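Note that Cloud Storage has no true folders; a "folder" is just a shared name prefix. If you need an empty folder, one common workaround (the file and bucket names below are placeholders) is to upload a placeholder object under the desired prefix:
touch placeholder.txt
gsutil cp placeholder.txt gs://my-bucket/new-folder/placeholder.txt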
Set up permissions for objects with gsutil
You can also set up permissions such as read, write, or owner rights on the objects that are copied from other buckets.
To provide read access to all users for objects, you can use a wildcard to match all files in a folder as shown below.
gsutil -m acl ch -u AllUsers:R gs://bucket_name/folder/*.jpg
The above command will make all jpg files in that particular folder public.
You can also provide write access to the bucket.
gsutil -m acl ch -u AllUsers:W gs://bucket_name
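You can also grant access to a specific user, or remove a grant with the -d flag (the email address and object name are placeholders):
gsutil acl ch -u jane@example.com:R gs://bucket_name/file.txt
gsutil acl ch -d AllUsers gs://bucket_name/file.txt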
For more details about permissions, have a look at the official documentation.
Conclusion
Now you have learned several ways to copy files using the gsutil cp command: from AWS S3, from a URL, between buckets, and from a VM instance to Google Cloud Storage.
If you like this post and have some other ideas, please feel free to post a comment below.
I can’t find any documentation on how to specify the local-location variable. I’m trying to download to a folder on the C: drive of my Windows machine (let’s say c:\users\username\desktop), but it feels like every iteration I’ve used, whether it’s Linux format for the directory path or Windows format, fails with an error, usually “no space left on device.”
Any thoughts on how to resolve? Thanks!
Can you please share the full gsutil command you used to copy?
How can I upload files and folders to GCP Cloud Storage using Jenkins?
You can install the Google Cloud SDK on your server and use the gsutil commands to transfer files.
Hi,
It is a good article. I am looking for a way to set a custom time metadata value, say 2018. I am uploading archived data from 2018; when I post the files to GCS, the created time and updated time are set to the current time, but I want them, or a custom time [not the local-custom-time or local-creation-time metadata], to say 2018. Please let me know if there’s an option.
thanks
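One possible approach: you cannot change an object’s creation time, but Cloud Storage objects support a customTime metadata field, which gsutil setmeta should be able to set after upload. A hedged sketch (the timestamp and object path below are placeholders):
gsutil setmeta -h "Custom-Time:2018-01-01T00:00:00Z" gs://bucket_name/file-name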