Have you ever worked on applications that interact with multiple cloud providers to store or retrieve data?
Yes?
Then you know the pain of writing custom code for each cloud provider’s API. With so many providers on the market, it’s crucial to support as many as possible to attract and retain customers. This flexibility allows customers to choose the best cloud for their needs — otherwise, it’s easy to lose them.
However, each cloud provider has its own API for interacting with cloud storage, making multi-cloud support in a single application challenging. Every new cloud adds more code to maintain.
What if there were a common interface that allowed developers to write generic code to interact with any cloud storage?
This is where the Go Cloud Development Kit (Go CDK) comes in — a set of Go libraries providing a common interface to interact consistently with multiple cloud providers. The goal is to help developers write cloud-agnostic code that runs in any environment with minimal changes.
In this article, I’ll show how to use Go CDK to write cloud-agnostic code for interacting with cloud storage and the local file system. Yes, it even supports the local file system, making it a very flexible solution for all types of storage.
Quick Overview: The `blob` Package
The `blob` package is a core component of Go CDK, used to read, write, list, and delete blobs (objects/files) in the underlying storage.
It uses drivers to support multiple storage backends (a minimal example follows the list):
- `fileblob`: Local filesystem storage
- `gcsblob`: Google Cloud Storage
- `s3blob`: Amazon S3
- `azureblob`: Azure Blob Storage
- `memblob`: In-memory storage (useful for testing)
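Each driver registers its URL scheme through a blank import, so the same `blob.OpenBucket` call works across backends. Here's a minimal, self-contained sketch using the `memblob` driver, which needs no credentials:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"gocloud.dev/blob"
	_ "gocloud.dev/blob/memblob" // blank import registers the mem:// scheme
)

func main() {
	ctx := context.Background()

	// Open an in-memory bucket -- handy for tests, no cloud setup needed.
	bucket, err := blob.OpenBucket(ctx, "mem://")
	if err != nil {
		log.Fatalf("Failed to open bucket: %v", err)
	}
	defer bucket.Close()

	// Write a small blob, then read it back.
	if err := bucket.WriteAll(ctx, "greeting.txt", []byte("hello from Go CDK"), nil); err != nil {
		log.Fatalf("Failed to write: %v", err)
	}
	data, err := bucket.ReadAll(ctx, "greeting.txt")
	if err != nil {
		log.Fatalf("Failed to read: %v", err)
	}
	fmt.Println(string(data))
}
```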
Now that you understand the basics, let’s dive into some code examples demonstrating how to use Go CDK in Go applications.
Installation and Setup
Install Go CDK using:
```bash
go get gocloud.dev
```
This article uses Google Cloud Storage (GCS) for demonstration.
For GCS:
- Download the service account key JSON from your GCS service account settings and point an environment variable at it:
```bash
export GOOGLE_APPLICATION_CREDENTIALS=<PATH_TO_JSON_FILE>
```
- Create a bucket in GCS where you will write objects/files.
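The snippets below are shown as standalone functions; they assume a shared `main.go` with flag definitions and driver imports. Here's a minimal sketch of that scaffolding (the flag names match the usage string in the first example; together with the functions below it forms one runnable file):

```go
package main

import (
	"context"
	"flag"
	"fmt"
	"io"
	"log"
	"os" // fmt, io, and os are used by the example functions below

	"gocloud.dev/blob"
	_ "gocloud.dev/blob/fileblob" // enables file:// bucket URLs
	_ "gocloud.dev/blob/gcsblob"  // enables gs:// bucket URLs
)

var (
	bucketURL  = flag.String("bucket", "", "bucket URL, e.g. gs://my-bucket")
	filePath   = flag.String("file", "", "local file to upload")
	objectName = flag.String("object", "", "object key in the bucket")
)

func main() {
	flag.Parse()
	testStoringFiles() // swap in testReadingFiles(), testListingFiles(), ...
}
```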
Writing an Object to a GCS Bucket
The following example reads data from a local input file and writes it as an object to a cloud bucket:
```go
func testStoringFiles() {
	ctx := context.Background()

	if *bucketURL == "" || *filePath == "" || *objectName == "" {
		log.Fatal("Usage: go run main.go -bucket BUCKET_URL -file FILE_PATH -object OBJECT_NAME")
	}

	// Open the bucket via its portable URL (gs://, s3://, file://, ...).
	bucket, err := blob.OpenBucket(ctx, *bucketURL)
	if err != nil {
		log.Fatalf("Failed to open bucket %q: %v", *bucketURL, err)
	}
	defer bucket.Close()

	f, err := os.Open(*filePath)
	if err != nil {
		log.Fatalf("Failed to open file %q: %v", *filePath, err)
	}
	defer f.Close()

	// NewWriter returns an io.WriteCloser; the object is committed on Close.
	w, err := bucket.NewWriter(ctx, *objectName, nil)
	if err != nil {
		log.Fatalf("Failed to create writer for object %q: %v", *objectName, err)
	}
	if _, err := io.Copy(w, f); err != nil {
		w.Close() // ignore error from Close if copy fails
		log.Fatalf("Failed to write data to object %q: %v", *objectName, err)
	}
	if err := w.Close(); err != nil {
		log.Fatalf("Failed to close writer for object %q: %v", *objectName, err)
	}
	fmt.Printf("Successfully uploaded %q to bucket %q as object %q\n", *filePath, *bucketURL, *objectName)
}
```
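The `NewWriter`/`io.Copy`/`Close` sequence is the flexible route, but for this common case the `blob` package also has a one-call helper, `Upload`, which copies from an `io.Reader` and closes the writer for you:

```go
// Equivalent shortcut to the NewWriter + io.Copy + Close sequence above.
if err := bucket.Upload(ctx, *objectName, f, nil); err != nil {
	log.Fatalf("Failed to upload %q: %v", *filePath, err)
}
```

With the scaffolding above in place, a run might look like `go run main.go -bucket gs://my-bucket -file ./notes.txt -object notes.txt` (the bucket name is a placeholder).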
- `bucketURL`: the bucket where the object will be stored
- `filePath`: the local file to read
- `objectName`: the name of the object to write in the bucket
Reading an Object from a GCS Bucket
This example opens an object in a bucket and prints its contents:
```go
func testReadingFiles() {
	ctx := context.Background()

	bucket, err := blob.OpenBucket(ctx, *bucketURL)
	if err != nil {
		log.Fatalf("Failed to open bucket: %v", err)
	}
	defer bucket.Close()

	// NewReader returns an io.ReadCloser that streams the object's bytes.
	reader, err := bucket.NewReader(ctx, *objectName, nil)
	if err != nil {
		log.Fatalf("Failed to open blob reader: %v", err)
	}
	defer reader.Close()

	data, err := io.ReadAll(reader)
	if err != nil {
		log.Fatalf("Failed to read blob data: %v", err)
	}
	fmt.Printf("Contents of %s:\n%s\n", *objectName, string(data))
}
```
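For objects small enough to hold in memory, `ReadAll` collapses the reader handling into a single call:

```go
// One-call alternative for small objects -- no reader to manage.
data, err := bucket.ReadAll(ctx, *objectName)
if err != nil {
	log.Fatalf("Failed to read object %q: %v", *objectName, err)
}
fmt.Printf("Contents of %s:\n%s\n", *objectName, string(data))
```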
Listing All Objects in a Bucket
This function lists all objects in the specified bucket along with their sizes:
```go
func testListingFiles() {
	ctx := context.Background()

	bucket, err := blob.OpenBucket(ctx, *bucketURL)
	if err != nil {
		log.Fatalf("Failed to open bucket: %v", err)
	}
	defer bucket.Close()

	// List returns an iterator; Next reports io.EOF when done.
	iter := bucket.List(&blob.ListOptions{})
	for {
		obj, err := iter.Next(ctx)
		if err == io.EOF {
			break
		}
		if err != nil {
			log.Fatalf("Failed to list objects: %v", err)
		}
		fmt.Println("File:", obj.Key, obj.Size)
	}
}
```
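`ListOptions` can also narrow the listing. For example, `Prefix` restricts results to keys starting with a given string (the `logs/` prefix here is just an illustration):

```go
// List only objects whose keys start with "logs/".
iter := bucket.List(&blob.ListOptions{Prefix: "logs/"})
```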
Deleting an Object from a Bucket
The following code deletes a specified object from the bucket:
```go
func testDeletingFile() {
	ctx := context.Background()

	bucket, err := blob.OpenBucket(ctx, *bucketURL)
	if err != nil {
		log.Fatalf("Failed to open bucket: %v", err)
	}
	defer bucket.Close()

	// Delete returns an error if the object does not exist.
	if err := bucket.Delete(ctx, *objectName); err != nil {
		log.Fatalf("Failed to delete object %q: %v", *objectName, err)
	}
	fmt.Printf("Successfully deleted object %q from bucket %q\n", *objectName, *bucketURL)
}
```
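Since `Delete` fails on a missing object, it's worth knowing that Go CDK exposes portable error codes in `gocloud.dev/gcerrors`, so you can handle that case without provider-specific error types; a sketch:

```go
// import "gocloud.dev/gcerrors"

// Tolerate a missing object instead of failing hard.
if err := bucket.Delete(ctx, *objectName); err != nil {
	if gcerrors.Code(err) == gcerrors.NotFound {
		fmt.Printf("Object %q not found; nothing to delete\n", *objectName)
		return
	}
	log.Fatalf("Failed to delete object %q: %v", *objectName, err)
}
```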
Supported Storage Providers and Bucket URLs
The above functions work with any of the following storage providers; just pass the appropriate bucket URL scheme (and blank-import the matching driver package):

| Provider | Bucket URL Scheme |
| --- | --- |
| Google Cloud Storage | `gs://{bucket-name}` |
| Amazon S3 | `s3://{bucket-name}` |
| Azure Blob Storage | `azblob://{container-name}` |
| Local File System | `file:///{directory-path}` |
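Switching backends means changing only the URL and the blank driver import. For example, pointing the same code at a local directory (which the `fileblob` driver expects to already exist):

```go
// import _ "gocloud.dev/blob/fileblob"

// Same API, local backend: the directory /tmp/my-bucket acts as the bucket.
bucket, err := blob.OpenBucket(ctx, "file:///tmp/my-bucket")
if err != nil {
	log.Fatalf("Failed to open local bucket: %v", err)
}
defer bucket.Close()
```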
Conclusion
I hope this article has given you a clear understanding of how to use Go CDK in Go applications. Try out these examples to get hands-on experience. In the future, consider using Go CDK to build cloud-agnostic applications — it simplifies multi-cloud deployment by abstracting provider-specific differences behind a single interface.
Important note: Go CDK supports accessing objects in cloud storage (read, write, list, delete), but it does not support creating buckets. To create buckets, you'll need cloud-specific libraries. For example, to create a bucket in GCS, use the Google Cloud Storage client library: cloud.google.com/go/storage.
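For completeness, here's a sketch of bucket creation with that client library; the project ID and bucket name are placeholders, and it reuses the GOOGLE_APPLICATION_CREDENTIALS setup from earlier:

```go
package main

import (
	"context"
	"log"

	"cloud.google.com/go/storage"
)

func main() {
	ctx := context.Background()

	// Authenticates via GOOGLE_APPLICATION_CREDENTIALS, like the Go CDK examples.
	client, err := storage.NewClient(ctx)
	if err != nil {
		log.Fatalf("storage.NewClient: %v", err)
	}
	defer client.Close()

	// "my-new-bucket" and "my-project-id" are placeholders.
	if err := client.Bucket("my-new-bucket").Create(ctx, "my-project-id", nil); err != nil {
		log.Fatalf("Bucket.Create: %v", err)
	}
	log.Println("Bucket created")
}
```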
Bonus Tool Recommendation
If you’re tired of spending time searching for and understanding internal APIs, check out the AI tool LiveAPI. It helps you discover, understand, and use APIs within large codebases, making your API work much easier.