This is part of the writeup on Using Google Cloud Storage with Apps Script. By now you should have completed all the steps in Setting up or creating a console project, Enabling APIs and OAuth2, and Using the service account to enable access to cloud storage, and should have a project and credentials you can use to create your storage buckets.
If you prefer to look at some examples first, then see GcsStore examples
You should already have the cGoa library, from when you set up your credentials in Using the service account to enable access to cloud storage.
You'll also need the cGcsStore library, which is on GitHub or available at this key.
It's not strictly necessary, but I usually include the cUseful library too, as it has lots of shortcuts and I may use some of them in the examples throughout this post. You'll find that on GitHub too, or via its library key.
So my final resource manifest looks like this:
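In outline, the libraries dialog would show the three libraries below (the actual library keys are omitted here — pick them up from the linked pages):

```
cGoa       - OAuth2 token handling
cGcsStore  - the cloud storage store itself
cUseful    - assorted utilities (optional)
```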
You can use the Cloud Storage browser, which you can get to via the console.
Here the browser is showing my buckets.
Deciding on what buckets to use
It's worth thinking about how you are going to use cloud storage with respect to Apps Script. In my case, I'm planning to set up 3 buckets.
Of course you can just lump them all into one, but if you are planning a cache-type usage, you can set up GcsStore (or use the browser) to apply lifecycle management, which clears out objects after a period of time. Lifecycle management applies at the bucket level, which is why you'd need a separate bucket.
You will have set up OAuth2 in Using the service account to enable access to cloud storage, or perhaps you obtained an access token some other way without cGoa. In any case, here's the pattern for setting up any app that plans to use GcsStore.
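A minimal sketch of that pattern — the Goa package name ('gcs') and the exact GcsStore method names are assumptions, so check the library source for the current API:

```javascript
// get a token from the service account package set up earlier
var goa = cGoa.GoaApp.createGoa('gcs', PropertiesService.getScriptProperties()).execute();
if (!goa.hasToken()) throw 'no token available for cloud storage';

// create a store handle that will use that token
var gcs = new cGcsStore.GcsStore()
  .setAccessToken(goa.getToken());
```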
If you want to use GcsStore to create a bucket, you can use this pattern. Note that to create a bucket you need the project id. Goa knows it (as in this pattern), but if you are not using Goa, you'll need to provide the project id yourself.
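Something like the sketch below, where createBucket and the project-id accessor are assumed names based on the pattern above:

```javascript
// the project id comes from the service account credentials held by goa
var gcs = new cGcsStore.GcsStore()
  .setAccessToken(goa.getToken())
  .createBucket('my-apps-script-store', goa.getProjectId());
```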
Finally, we just need to assign the bucket to the handle.
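For example (the bucket name is a placeholder, and setBucket is an assumed method name):

```javascript
// work with this bucket from now on
gcs.setBucket('my-apps-script-store');
```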
If you are using the store as a cache, you can set expiry times for entries, just like the CacheService. However, the Cloud Storage platform does not have this capability natively. It does have lifecycle management, where you can ask it to clean up items after a certain number of days. GcsStore respects expiry times by ignoring items that have expired, and the lifecycle mechanism (along with the cleaner method, which we'll look at later) can be used to keep the storage clean.
By default, the bucket will not have lifecycle management enabled, which means that items written are permanent. You can change this either via the browser or with GcsStore.
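Using gcs.setLifetime, which this post discusses later — the exact signature here is an assumption:

```javascript
// delete everything in the bucket more than 30 days old
gcs.setLifetime(30);
```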
One of the problems with the Properties and Cache services is that you can only see their values from the same script. If you want to share values across scripts you have to do something else. Although cloud platform storage is actually flat, you can simulate folders. Using this technique you can make particular data visible to any sharing community just by giving it a folder key.
For example, you could simulate UserProperties visibility with this:
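A sketch, where setFolderKey is an assumed method name (the Session call is standard Apps Script):

```javascript
// scope the store to the current user, like UserProperties
gcs.setFolderKey(Session.getActiveUser().getEmail());
```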
In the browser, each user's objects then show up under a folder named with their folder key, and are visible to any script using this project and the same folder key.
The default folder key is "globals".
Objects can be expired by setting a default expiration time (in seconds) for the store, and/or by passing one in a put operation.
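A sketch of both — setDefaultExpiry and the third argument to put are assumed names:

```javascript
// everything put to this store expires after an hour unless overridden
gcs.setDefaultExpiry(60 * 60);

// this particular item expires after 2 minutes
gcs.put('volatile-thing', {state: 'temporary'}, 120);
```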
GcsStore uses object metadata to decide whether an item has expired, and if it has, gcs.get will act as if it doesn't exist.
An item expiring does not remove it, as there is no mechanism in Google Cloud Platform to make it happen, but you can use gcs.setLifetime as discussed earlier to automatically delete all items in a bucket after some number of days.
In addition, GcsStore has a cleaner function, invoked by gcs.cleaner(), which will remove any expired items.
If necessary, you can trigger something like the code below to run every now and again.
You'll notice that when it's finished, it writes a report of what it did to the store.
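A sketch of such a function, suitable for a time-driven trigger — the package name, bucket name, and the shape of the cleaner report are assumptions:

```javascript
function cleanStore() {
  // get a token as usual
  var goa = cGoa.GoaApp.createGoa('gcs', PropertiesService.getScriptProperties()).execute();
  if (!goa.hasToken()) throw 'no token available for cloud storage';

  var gcs = new cGcsStore.GcsStore()
    .setAccessToken(goa.getToken())
    .setBucket('my-cache-bucket');

  // remove anything that has expired, and keep a record of what happened
  var report = gcs.cleaner();
  gcs.put('latest-clean-report', report);
}
```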
The value in gcs.put(key, value) can be a string, an object or a blob. If it's an object, it will be stringified, and automatically parsed when read back in again with gcs.get(key). The content type of a blob is preserved through to cloud storage, and if it was written as a blob, the same blob will be returned by gcs.get(), as in this example.
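For instance (the key names are just examples):

```javascript
// objects are stringified and parsed transparently
gcs.put('an-object', {name: 'xyz', value: 42});
var ob = gcs.get('an-object');          // back as an object

// blobs keep their content type
gcs.put('a-text-file', Utilities.newBlob('hello', 'text/plain', 'hello.txt'));
var blob = gcs.get('a-text-file');      // a blob of type text/plain
```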
GcsStore can automatically compress items. If they were compressed by gcs.put(), they will be uncompressed by gcs.get().
Objects can be compressed by setting a default for the store, and/or by passing a flag in a put operation.
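A sketch of both — setDefaultCompress and the put argument order are assumptions:

```javascript
// compress everything written to this store
gcs.setDefaultCompress(true);

// or compress just this item (default expiry, compress on)
gcs.put('big-thing', bigObject, undefined, true);
```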
Compressed items are stored as zip files, so they can be downloaded as normal through the browser if required.
There are some examples of all this in GcsStore examples