GCP Cloud Functions

Cloud Functions is GCP's serverless, event-driven compute service. It lets you run small pieces of code in response to events — such as an HTTP request, a file upload to Cloud Storage, or a message arriving on a Pub/Sub topic — without provisioning or managing any servers.

Think of Cloud Functions as a light switch that triggers an action. When someone flips the switch (event), the room lights up (function runs). When nobody flips it, no electricity is consumed (no cost). The wiring (infrastructure) is Google's responsibility.

When to Use Cloud Functions

Scenario                  | Example
--------------------------|------------------------------------------------------------
Process file uploads      | Resize an image when it is uploaded to Cloud Storage
Respond to HTTP requests  | Build a lightweight REST API endpoint
React to database changes | Send a welcome email when a new user is added to Firestore
Process messages          | Read and process messages from a Pub/Sub topic
Scheduled tasks           | Run a cleanup script every night at midnight

Cloud Functions Generations

GCP offers two generations of Cloud Functions:

Feature             | 1st Generation                   | 2nd Generation (Recommended)
--------------------|----------------------------------|-----------------------------------
Underlying platform | Original Cloud Functions runtime | Built on Cloud Run
Timeout             | Up to 9 minutes                  | Up to 60 minutes (HTTP functions)
Concurrency         | 1 request per instance           | Up to 1,000 requests per instance
Instance size       | Up to 8 GB RAM                   | Up to 32 GB RAM, 8 vCPU

Writing an HTTP Cloud Function

Python Example

# main.py
import functions_framework

@functions_framework.http
def hello_world(request):
    """HTTP Cloud Function."""
    name = request.args.get('name', 'World')
    return f'Hello, {name}! This is a Cloud Function.'

# requirements.txt
functions-framework==3.5.0

# Deploy the function
gcloud functions deploy hello-world \
  --gen2 \
  --runtime python311 \
  --region us-central1 \
  --source . \
  --entry-point hello_world \
  --trigger-http \
  --allow-unauthenticated

After deployment, the function gets a URL:

https://us-central1-my-project.cloudfunctions.net/hello-world

Testing:

curl "https://us-central1-my-project.cloudfunctions.net/hello-world?name=GCP"
# Output: Hello, GCP! This is a Cloud Function.
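Under the hood, `request.args.get('name', 'World')` is ordinary query-string parsing with a default. The standard-library sketch below mirrors that step outside the Functions Framework, using the illustrative URL from the text:

```python
from urllib.parse import urlparse, parse_qs

# Stdlib equivalent of request.args.get('name', 'World') in hello_world;
# the URL is the illustrative one from the deployment above.
url = "https://us-central1-my-project.cloudfunctions.net/hello-world?name=GCP"
params = parse_qs(urlparse(url).query)
name = params.get("name", ["World"])[0]  # fall back to 'World' if absent
greeting = f"Hello, {name}! This is a Cloud Function."
print(greeting)  # Hello, GCP! This is a Cloud Function.
```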

Cloud Storage Trigger – Image Processing Example

This function runs automatically when a file is uploaded to a Cloud Storage bucket:

# main.py — triggered by Cloud Storage
import functions_framework
from google.cloud import storage

@functions_framework.cloud_event
def process_upload(cloud_event):
    """Triggered when a file is uploaded to Cloud Storage."""
    data = cloud_event.data
    bucket_name = data["bucket"]
    file_name = data["name"]

    print(f"New file uploaded: {file_name} in bucket: {bucket_name}")

    # Example: log file size
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(file_name)
    blob.reload()
    print(f"File size: {blob.size} bytes")

# Deploy with Cloud Storage trigger
gcloud functions deploy process-upload \
  --gen2 \
  --runtime python311 \
  --region us-central1 \
  --source . \
  --entry-point process_upload \
  --trigger-event-filters="type=google.cloud.storage.object.v1.finalized" \
  --trigger-event-filters="bucket=my-upload-bucket"

Pub/Sub Trigger Example

This function processes messages from a Pub/Sub topic:

# main.py — triggered by Pub/Sub
import functions_framework
import base64

@functions_framework.cloud_event
def handle_message(cloud_event):
    """Process a Pub/Sub message."""
    message_data = cloud_event.data["message"]["data"]
    decoded = base64.b64decode(message_data).decode("utf-8")
    print(f"Received message: {decoded}")

# Deploy with Pub/Sub trigger
gcloud functions deploy handle-message \
  --gen2 \
  --runtime python311 \
  --region us-central1 \
  --source . \
  --entry-point handle_message \
  --trigger-topic my-topic
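The base64 step in `handle_message` exists because Pub/Sub delivers message bodies base64-encoded inside the event. This stdlib sketch builds a payload the way Pub/Sub would (the message text is made up) and decodes it the same way the function does:

```python
import base64

# Pub/Sub base64-encodes message bodies on delivery; the payload is illustrative.
published_text = "order-42 shipped"
event_data = {
    "message": {
        "data": base64.b64encode(published_text.encode("utf-8")).decode("ascii"),
    }
}

# Same decoding step as in handle_message above:
decoded = base64.b64decode(event_data["message"]["data"]).decode("utf-8")
print(f"Received message: {decoded}")  # Received message: order-42 shipped
```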

Scheduled Cloud Functions with Cloud Scheduler

To run a Cloud Function on a schedule, pair it with Cloud Scheduler. Cloud Scheduler sends an HTTP request to the function at the defined time.

Function Flow:
Cloud Scheduler (every night at 11 PM)
        │
        │ HTTP POST request
        ▼
Cloud Function: cleanup-old-data
        │
        │ Deletes records older than 30 days
        ▼
Cloud Firestore Database

# Create a Cloud Scheduler job that calls a Cloud Function
gcloud scheduler jobs create http nightly-cleanup \
  --schedule="0 23 * * *" \
  --uri="https://us-central1-my-project.cloudfunctions.net/cleanup-old-data" \
  --message-body='{"days": 30}' \
  --time-zone="Asia/Kolkata"

Environment Variables and Secrets

# Set environment variables during deployment
gcloud functions deploy my-function \
  --gen2 \
  --set-env-vars API_KEY=mykey123,APP_ENV=production

# Use Secret Manager for sensitive values
gcloud functions deploy my-function \
  --gen2 \
  --set-secrets 'DB_PASSWORD=my-db-secret:latest'
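Inside the running function, both plain environment variables and Secret Manager values surface as ordinary environment variables, so the code reads them the same way (the names below match the deploy flags above):

```python
import os

# Read deployed configuration at runtime; names match the --set-env-vars
# and --set-secrets flags above. Defaults guard against running locally.
app_env = os.environ.get("APP_ENV", "development")
db_password = os.environ.get("DB_PASSWORD")  # None unless the secret is mounted

print(f"Running in {app_env}; secret present: {db_password is not None}")
```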

Cloud Functions Pricing

  • Billing is based on the number of invocations, compute time, and memory used.
  • The first 2 million invocations per month are free.
  • Functions that are idle cost nothing — no requests, no charge.
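The free tier works as a monthly deduction: only invocations beyond the first 2 million are billed. A quick back-of-the-envelope, with a made-up workload number:

```python
# Illustrative free-tier arithmetic only; 5M invocations/month is an
# assumed workload, not a quoted figure. Check current pricing for rates.
FREE_TIER_INVOCATIONS = 2_000_000
monthly_invocations = 5_000_000
billable = max(0, monthly_invocations - FREE_TIER_INVOCATIONS)
print(billable)  # 3000000
```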

Key Takeaways

  • Cloud Functions runs small code snippets in response to events — no server management needed.
  • 2nd Generation is built on Cloud Run and supports longer timeouts and higher concurrency.
  • Triggers include HTTP requests, Cloud Storage events, Pub/Sub messages, and Firestore changes.
  • Functions scale to zero when idle — billing stops completely.
  • Cloud Scheduler pairs with Cloud Functions to run code on a timed schedule.
  • Use Secret Manager for sensitive values like API keys and passwords.
