Hi there! I'm Shrijith Venkatrama, founder of Hexmos. Right now, I’m building LiveAPI, a first-of-its-kind tool that automatically indexes API endpoints across all your repositories. LiveAPI helps you discover, understand, and use APIs in large tech infrastructures with ease.
Google Cloud Platform (GCP) has a ton of services, but most developers stick to the usual suspects like Compute Engine, BigQuery, or Kubernetes. There’s a whole world of lesser-known tools that can make your life easier, save time, or solve problems you didn’t even know you had. Let’s dive into some of these under-the-radar GCP services with practical examples and details to show you what they can do. No fluff, just stuff you can actually use.
1. Cloud Run: Serverless That Doesn’t Hate Developers
Cloud Run is GCP’s serverless platform for running containers without managing servers. It’s not just “serverless functions” like Cloud Functions—it lets you run full-blown containerized apps with zero server management. You get auto-scaling, HTTPS endpoints, and custom domains out of the box.
- Why it’s powerful: You can deploy any containerized app (Node.js, Python, Go, etc.) and Cloud Run handles scaling to zero or thousands of requests. It’s cheaper than running VMs and simpler than Kubernetes.
- Use case: A REST API that scales automatically based on traffic, like a backend for a mobile app.
Example: Deploying a Node.js API
Here’s a simple Node.js app deployed on Cloud Run. This assumes you have a Dockerized app and the gcloud CLI set up.
# Dockerfile
FROM node:16
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 8080
CMD ["node", "server.js"]
# server.js
const express = require('express');
const app = express();
app.get('/', (req, res) => {
res.json({ message: 'Hello from Cloud Run!' });
});
// Cloud Run injects the listening port via the PORT env var (defaults to 8080)
const port = process.env.PORT || 8080;
app.listen(port, () => {
  console.log(`Server running on port ${port}`);
});
# Build and push the container
docker build -t gcr.io/[PROJECT-ID]/my-app .
docker push gcr.io/[PROJECT-ID]/my-app
# Deploy to Cloud Run
gcloud run deploy my-service \
--image gcr.io/[PROJECT-ID]/my-app \
--platform managed \
--region us-central1 \
--allow-unauthenticated
# Output: Service URL like https://my-service-abc123-uc.a.run.app
Run the gcloud command and you’ll get a URL for your API. It scales to zero when idle, so you only pay for actual usage. Check out Cloud Run’s docs for more.
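To sanity-check the deployment, hit the service URL with curl (the URL below is just the placeholder from the example output; use the one your deploy command printed):

# Quick smoke test
curl https://my-service-abc123-uc.a.run.app
# Expected response: {"message":"Hello from Cloud Run!"}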
2. Firestore: NoSQL That’s More Than Just a Database
Firestore is GCP’s NoSQL database, but it’s not just for storing JSON-like data. It has real-time listeners, offline support, and scales globally with low latency. It’s great for apps needing live updates, like chat or dashboards.
- Why it’s powerful: Firestore’s real-time sync and querying capabilities make it ideal for dynamic apps. It’s also got a generous free tier.
- Use case: A collaborative to-do app where changes sync instantly across devices.
Example: Real-Time To-Do List
Here’s a JavaScript snippet using Firestore to sync to-do items in real time.
// Initialize Firebase (assumes Firebase SDK is included)
import { initializeApp } from 'firebase/app';
import { getFirestore, collection, onSnapshot, addDoc } from 'firebase/firestore';
const app = initializeApp({
apiKey: 'your-api-key',
authDomain: 'your-project-id.firebaseapp.com',
projectId: 'your-project-id'
});
const db = getFirestore(app);
// Listen for real-time updates
const todosRef = collection(db, 'todos');
onSnapshot(todosRef, (snapshot) => {
snapshot.docChanges().forEach((change) => {
console.log(change.type, change.doc.data());
// Output: e.g., "added" { text: "Buy groceries", done: false }
});
});
// Add a new to-do
async function addTodo(text) {
await addDoc(todosRef, { text, done: false });
}
addTodo('Learn Firestore');
// Output: New to-do appears in real-time for all clients
Firestore’s real-time listeners fire as soon as documents are added, modified, or removed. See Firestore’s getting started guide for setup details.
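Querying is just as straightforward. Here’s a small sketch using the modular SDK’s query and where helpers against the same todosRef collection from the snippet above:

import { query, where, getDocs } from 'firebase/firestore';

// Fetch only the to-dos that aren't done yet
async function getPendingTodos() {
  const pending = query(todosRef, where('done', '==', false));
  const snapshot = await getDocs(pending);
  return snapshot.docs.map((doc) => doc.data());
}

Note that compound queries across multiple fields may require a composite index, which Firestore will prompt you to create in the error message.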
3. Cloud Scheduler: Your App’s Personal Cron Job
Cloud Scheduler is a fully managed cron job service. It lets you schedule HTTP requests, Pub/Sub messages, or App Engine tasks with a simple interface. Think of it as a reliable cron for the cloud.
- Why it’s powerful: No need to manage servers for scheduled tasks. It’s integrated with GCP, so you can trigger almost anything.
- Use case: Run a daily data cleanup job by calling an HTTP endpoint.
Example: Scheduling a Cleanup Job
Here’s how to set up a Cloud Scheduler job to hit an HTTP endpoint daily.
# Create a scheduler job
gcloud scheduler jobs create http cleanup-job \
--schedule "0 0 * * *" \
--uri "https://my-app-abc123-uc.a.run.app/cleanup" \
--http-method POST \
--time-zone "America/New_York"
// Example endpoint (Node.js)
const express = require('express');
const app = express();
app.post('/cleanup', (req, res) => {
console.log('Running cleanup job');
// Add cleanup logic here
res.status(200).send('Cleanup done');
});
app.listen(8080);
// Output: Logs "Running cleanup job" at midnight Eastern time daily
You can monitor jobs in the GCP Console. Cloud Scheduler’s docs have more examples.
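You don’t have to wait until midnight to verify the job works; you can trigger a run manually from the CLI:

# Trigger the job immediately for testing
gcloud scheduler jobs run cleanup-job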
4. Secret Manager: Keep Your Secrets Safe
Secret Manager is a secure way to store and access sensitive data like API keys, passwords, or certificates. It’s not just a vault—it integrates with other GCP services for seamless access.
- Why it’s powerful: Centralized secret management with versioning and access control. No more hardcoding keys in your code.
- Use case: Store an API key for a third-party service and access it in your app.
Example: Accessing a Secret in Python
Here’s how to retrieve a secret in a Python app.
from google.cloud import secretmanager
client = secretmanager.SecretManagerServiceClient()
secret_name = "projects/[PROJECT-ID]/secrets/my-api-key/versions/latest"
response = client.access_secret_version(request={"name": secret_name})
api_key = response.payload.data.decode("UTF-8")
print(f"API Key: {api_key}")
# Output: API Key: your-secret-key-here
You create secrets in the GCP Console or with the gcloud CLI, and apps fetch them securely at runtime. Check Secret Manager’s overview for setup steps.
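For reference, creating that secret from the CLI is two commands (the name my-api-key matches the Python example above):

# Create the secret, then add its first version from stdin
gcloud secrets create my-api-key --replication-policy="automatic"
echo -n "your-secret-key-here" | gcloud secrets versions add my-api-key --data-file=-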
5. Cloud Tasks: Queue Jobs Like a Pro
Cloud Tasks lets you manage asynchronous tasks by queuing HTTP requests. Unlike Cloud Scheduler, it’s for dynamic, on-demand task queues, like processing user uploads or sending emails.
- Why it’s powerful: Fine-grained control over task execution, retries, and rate limits. It’s serverless and scales automatically.
- Use case: Queue image processing tasks for a photo-sharing app.
Example: Queuing an Image Processing Task
Here’s a Python example to queue a task.
from google.cloud import tasks_v2
client = tasks_v2.CloudTasksClient()
parent = client.queue_path("[PROJECT-ID]", "us-central1", "my-queue")
task = {
"http_request": {
"http_method": tasks_v2.HttpMethod.POST,
"url": "https://my-app-abc123-uc.a.run.app/process-image",
"body": b"image_id=12345"
}
}
response = client.create_task(request={"parent": parent, "task": task})
print(f"Created task: {response.name}")
# Output: Created task: projects/[PROJECT-ID]/locations/us-central1/queues/my-queue/tasks/abc123
You need to create the queue in GCP first, as shown below. See Cloud Tasks’ quickstart.
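Creating the queue is a one-liner; this assumes the my-queue name and us-central1 location used in the Python snippet:

# Create the queue the tasks get pushed into
gcloud tasks queues create my-queue --location=us-central1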
6. Dataflow: Stream and Batch Processing Without Tears
Dataflow is a managed service for processing streaming and batch data. It’s built on Apache Beam, so you write your pipeline once and run it anywhere. It’s great for ETL jobs or real-time analytics.
- Why it’s powerful: Handles massive datasets with auto-scaling and fault tolerance. No infrastructure to manage.
- Use case: Process logs in real time to detect anomalies.
Example: Streaming Log Processor
Here’s a Python Beam pipeline to count log events.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.io.gcp.pubsub import ReadFromPubSub
from apache_beam.io.gcp.bigquery import WriteToBigQuery
from apache_beam.transforms import window

options = PipelineOptions(streaming=True)
with beam.Pipeline(options=options) as pipeline:
    logs = (pipeline
        | "Read from PubSub" >> ReadFromPubSub(subscription="projects/[PROJECT-ID]/subscriptions/my-sub")
        # Streaming aggregations need windowing; 60-second fixed windows here
        | "Window" >> beam.WindowInto(window.FixedWindows(60))
        # Key each log line by its event text so counts can be combined per key
        | "Key by event" >> beam.Map(lambda x: (x.decode("utf-8"), 1))
        | "Sum counts" >> beam.CombinePerKey(sum)
        | "To BigQuery rows" >> beam.Map(lambda kv: {"event": kv[0], "total": kv[1]})
        | "Write to BigQuery" >> WriteToBigQuery(
            table="[PROJECT-ID]:logs.event_counts",
            schema="event:STRING,total:INTEGER"
        ))
# Output: Writes aggregated event counts to BigQuery
Run it with python script.py --runner DataflowRunner --project [PROJECT-ID] --region us-central1, plus a --temp_location pointing at a GCS bucket for staging. Dataflow’s docs explain setup.
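The pipeline also assumes the Pub/Sub subscription already exists. If it doesn’t, a minimal setup looks like this (the topic name is just a placeholder):

# Create a topic and the subscription the pipeline reads from
gcloud pubsub topics create my-logs-topic
gcloud pubsub subscriptions create my-sub --topic=my-logs-topic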
7. Memorystore: In-Memory Speed for Redis and Memcached
Memorystore is GCP’s managed Redis and Memcached service. It’s perfect for caching, session management, or real-time leaderboards.
- Why it’s powerful: Low-latency access to data with automatic scaling and backups. No server maintenance.
- Use case: Cache user sessions for a web app.
Example: Caching with Redis in Node.js
Here’s how to use Memorystore with Redis.
const redis = require('redis');

// await only works inside an async function in CommonJS modules
async function main() {
  const client = redis.createClient({
    url: 'redis://[MEMORYSTORE-IP]:6379'
  });
  client.on('error', (err) => console.log('Redis error:', err));
  await client.connect();
  await client.set('user:123', JSON.stringify({ name: 'Alice', session: 'abc' }));
  const value = await client.get('user:123');
  console.log(value);
  // Output: {"name":"Alice","session":"abc"}
}

main();
You need to create a Memorystore instance first, and your app must run inside the same VPC network to reach its private IP. See Memorystore’s guide.
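Creating a basic instance from the CLI looks like this (my-redis is a placeholder name):

# Create a 1 GB Redis instance in us-central1
gcloud redis instances create my-redis --size=1 --region=us-central1
# Look up its IP to plug into the code above
gcloud redis instances describe my-redis --region=us-central1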
8. Cloud Build: CI/CD That’s Actually Simple
Cloud Build is a managed CI/CD service for building, testing, and deploying code. It’s not just for containers—it supports any workflow with custom steps.
- Why it’s powerful: Fast builds, tight GCP integration, and a clear YAML-based config. No Jenkins nightmares.
- Use case: Automate a build pipeline for a Python app.
Example: CI/CD Pipeline
Here’s a cloudbuild.yaml to build and deploy a Python app.
steps:
  # Install dependencies and run tests
  - name: 'python:3.9'
    entrypoint: 'bash'
    args: ['-c', 'pip install -r requirements.txt && pytest']
  # Build and push the container image
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/[PROJECT-ID]/my-app', '.']
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'gcr.io/[PROJECT-ID]/my-app']
  # Deploy the image to Cloud Run (the cloud-sdk image needs an explicit gcloud entrypoint)
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: 'gcloud'
    args: ['run', 'deploy', 'my-service', '--image', 'gcr.io/[PROJECT-ID]/my-app', '--platform', 'managed', '--region', 'us-central1']
images: ['gcr.io/[PROJECT-ID]/my-app']
# Output: Builds, tests, and deploys to Cloud Run
Run it with gcloud builds submit --config cloudbuild.yaml. Check Cloud Build’s docs.
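To run this on every push instead of manually, wire up a trigger. Here’s a rough sketch assuming a GitHub repo already connected to Cloud Build (repo and owner names are placeholders):

# Build on every push to main
gcloud builds triggers create github \
  --repo-name=my-repo \
  --repo-owner=my-org \
  --branch-pattern='^main$' \
  --build-config=cloudbuild.yaml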
What You Can Do Next
These services are just the start. Cloud Run is great for quick deployments, Firestore for real-time apps, and Dataflow for heavy data processing. Try one that fits your project—most have free tiers or low costs for small apps. Start with the examples above, tweak them for your use case, and check the linked docs for deeper dives. If you’re stuck, the GCP community on Dev.to or Stack Overflow is super helpful. Pick a service, build something small, and see how it fits into your workflow.