Fast uploads with Google Cloud Storage

Learn how to upload big files concurrently to Google Cloud Storage

Uploading big files to a bucket with restricted access

Recently, I had to upload some big files to Google Cloud Storage. I was given an account.json file with the credentials needed to upload them.

So, in order not to spend the whole night uploading these files, I looked for the best way to do it concurrently.

First Step: Install the gcloud CLI

Go to the gcloud CLI download page and look for the right binary for your environment.

In my case, it was macOS x86_64.
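
For reference, downloading and unpacking the archive looks roughly like this (a sketch; the exact URL and filename are assumptions, so grab the right archive for your platform from the download page):

  # download and extract the macOS x86_64 archive (URL is illustrative)
  curl -O https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-cli-darwin-x86_64.tar.gz
  tar -xzf google-cloud-cli-darwin-x86_64.tar.gz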

I made sure that I had Python 3.13 installed with Homebrew and pyenv:

  eval "$(/usr/local/bin/pyenv init -)"   # load pyenv's shims into the current shell
  python --version                        # should report Python 3.13.x
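
If that Python isn't installed yet, something like the following should get you there (a sketch; the exact version string is illustrative):

  brew install pyenv    # install pyenv via Homebrew
  pyenv install 3.13    # or a full version string such as 3.13.2
  pyenv global 3.13     # make it the default interpreter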

After decompressing the file, I ran the following:

  ./google-cloud-sdk/install.sh

You will be presented with a few questions; the default values are fine. Finally, make the gcloud binary available in your terminal:

  export PATH="${HOME}/google-cloud-sdk/bin:$PATH"
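
To make that PATH change stick across new shells, you can append the same line to your shell profile (a sketch, assuming zsh; use ~/.bashrc or similar for other shells):

  echo 'export PATH="${HOME}/google-cloud-sdk/bin:$PATH"' >> ~/.zshrc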

Second Step: Set up your account

You first need to authenticate so that your account is configured as the default:

  gcloud auth activate-service-account --key-file=my-account.json 

After that, you can validate it by running:

  gcloud auth list

You can also set your current project by running:

  gcloud config set project abc-123
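
To double-check both the account and the project in one go, gcloud can print the active configuration:

  gcloud config list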

If you've been given a specific bucket name with restricted access, you can use gsutil:

  gsutil -o "GSUtil:parallel_composite_upload_threshold=150M" cp big_file.zip gs://mybucket/

This will upload the file really fast in parallel chunks, and Cloud Storage will assemble them into a single object on the other side. You can check that the file arrived as expected:

  gsutil ls -l gs://mybucket/
  gsutil du -sh gs://mybucket/
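
If you upload big files regularly, the same threshold can live in gsutil's boto configuration file instead of being passed with -o each time (a sketch; ~/.boto is gsutil's default config path):

  # ~/.boto
  [GSUtil]
  parallel_composite_upload_threshold = 150M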

If you have good network bandwidth, expect the files to fly over to Google.
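
One last aside: newer gcloud releases also ship the gcloud storage command, which parallelizes large transfers automatically with no extra flags. A sketch of the equivalent upload:

  gcloud storage cp big_file.zip gs://mybucket/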