
Custom Logs Ingestion Guide

Introduction

This document provides an example of how to ingest custom logs into the Multicloud Observability Platform. The scenario involves ingesting Cloudflare logs into the platform via Huawei Cloud Object Storage Service (OBS).

Steps

  1. Set Up Object Storage Service (OBS) on Huawei Cloud
  2. Configure Programmatic Access for OBS on Huawei Cloud
  3. Configure Cloudflare to Send Logs to OBS
  4. Create a FunctionGraph Function to Trigger on File Upload to OBS and Send Data to Multicloud Observability Platform DataKit via API
  5. Verify Data in Multicloud Observability Platform
  6. Create a Pipeline to Format Custom Logs and Attach It to the Log Source

1. Set Up Object Storage Service (OBS) on Huawei Cloud

  • Navigate to “Object Storage Service” in the Huawei Cloud console.
  • Click “Create Bucket” to provision a storage bucket.
  • Optionally, create a “logs” folder within the bucket; a scripted alternative to these console steps is sketched after this list.
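
If you prefer to script this step, the sketch below uses boto3 against the bucket’s S3-compatible OBS endpoint. The bucket name, region endpoint, and credentials are placeholders (the access key and secret come from step 2), and it assumes the endpoint accepts standard S3 CreateBucket calls.

import boto3

# Placeholders: substitute your own credentials (step 2), region endpoint, and bucket name.
s3 = boto3.client(
    "s3",
    aws_access_key_id="<OBS_ACCESS_KEY>",
    aws_secret_access_key="<OBS_SECRET_KEY>",
    endpoint_url="https://obs.ap-southeast-3.myhuaweicloud.com",
)

# Create the bucket that Cloudflare will push logs into.
s3.create_bucket(Bucket="<BUCKET_NAME>")

# Optionally create an empty "logs/" prefix to act as the folder.
s3.put_object(Bucket="<BUCKET_NAME>", Key="logs/")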

2. Configure Programmatic Access for OBS on Huawei Cloud

  • Navigate to “IAM” > “Users” in the Huawei Cloud console.
  • Click “Create User” and populate the required details.
  • Assign relevant permissions for OBS access.
  • Save the access key ID and secret access key, which are downloaded automatically as credentials.csv; a quick way to verify the key pair is sketched after this list.
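
To confirm the new key pair works before wiring it into Cloudflare, a quick check with boto3 against the S3-compatible endpoint might look like the following; the credentials.csv column names are assumptions and may differ slightly depending on the console version.

import csv
import boto3

# Assumed column names; adjust if your credentials.csv uses different headers.
with open("credentials.csv", newline="") as f:
    row = next(csv.DictReader(f))
    access_key, secret_key = row["Access Key Id"], row["Secret Access Key"]

s3 = boto3.client(
    "s3",
    aws_access_key_id=access_key,
    aws_secret_access_key=secret_key,
    endpoint_url="https://obs.ap-southeast-3.myhuaweicloud.com",
)

# A successful listing confirms the key pair has OBS access.
print([b["Name"] for b in s3.list_buckets()["Buckets"]])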

3. Configure Cloudflare to Send Logs to OBS

  • Log into the Cloudflare console and navigate to your website configuration.
  • Go to “Analytics & Logs” > “Logs” and click “Add Logpush job”; the job can also be created through the Cloudflare API, as sketched after this list.
  • Select the data sets (e.g., HTTP requests) to push to OBS.
  • Choose “S3-Compatible” as the storage provider and enter the OBS bucket details along with the access key and secret from step 2.
  • Validate access and enable log pushing.
  • Verify that logs are being sent to the OBS bucket.
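
As an alternative to the dashboard, a minimal API-based sketch for creating the Logpush job is shown below. The zone ID, API token, bucket details, and ownership_challenge token are placeholders, and the exact destination_conf query parameters should be checked against the current Cloudflare Logpush documentation.

import requests

ZONE_ID = "<CLOUDFLARE_ZONE_ID>"      # placeholder
API_TOKEN = "<CLOUDFLARE_API_TOKEN>"  # token with Logpush edit permission

payload = {
    "name": "cf-http-requests-to-obs",
    "dataset": "http_requests",
    "enabled": True,
    # S3-compatible destination pointing at the OBS bucket and "logs" folder.
    "destination_conf": (
        "s3://<BUCKET_NAME>/logs"
        "?region=ap-southeast-3"
        "&endpoint=obs.ap-southeast-3.myhuaweicloud.com"
        "&access-key-id=<OBS_ACCESS_KEY>"
        "&secret-access-key=<OBS_SECRET_KEY>"
    ),
    # Cloudflare validates a new destination by writing a challenge file to the
    # bucket; copy the token from that file into this field.
    "ownership_challenge": "<TOKEN_FROM_CHALLENGE_FILE>",
}

resp = requests.post(
    f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/logpush/jobs",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print(resp.json())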

4. Create a FunctionGraph Function to Trigger on File Upload to OBS and Send Data to Multicloud Observability Platform DataKit via API

  • In the Huawei Cloud console, navigate to “FunctionGraph” > “Functions” and create a new function.
  • Use Python as the runtime and add environment variables for the OBS access key and secret (API_ACCESS_KEY and API_SECRET, read by the script below).
  • Implement the function script to handle log file uploads and send data to Multicloud Observability Platform DataKit.

Example Python script:

import hashlib
import hmac
import binascii
from datetime import datetime
import json
import requests

# Regional OBS endpoint; adjust to match the region of your bucket.
OBS_ENDPOINT = "obs.ap-southeast-3.myhuaweicloud.com"

def generate_signature(method, secret_key, date, bucket, key):
    # Legacy header-based OBS signature: HMAC-SHA1 over the canonical string,
    # base64-encoded, used to authorize the GET request for the log file.
    canonical_string = f"{method}\n\n\n{date}\n/{bucket}/{key}"
    hashed = hmac.new(secret_key.encode('UTF-8'), canonical_string.encode('UTF-8'), hashlib.sha1)
    return binascii.b2a_base64(hashed.digest()).strip().decode('UTF-8')

def handler(event, context):
    # The OBS trigger event identifies the bucket and the newly uploaded object.
    bucket_name = event['Records'][0]['s3']['bucket']['name']
    object_key = event['Records'][0]['s3']['object']['key']

    # Access key and secret are supplied as function environment variables.
    secret_access_key = context.getUserData('API_SECRET')
    access_key_id = context.getUserData('API_ACCESS_KEY')

    # Download the uploaded log file from OBS with a signed GET request.
    date = datetime.utcnow().strftime('%a, %d %b %Y %H:%M:%S GMT')
    signature = generate_signature("GET", secret_access_key, date, bucket_name, object_key)
    url = f"https://{bucket_name}.{OBS_ENDPOINT}/{object_key}"
    headers = {'Authorization': f'OBS {access_key_id}:{signature}', 'Date': date}

    response = requests.get(url, headers=headers, timeout=30)
    response.raise_for_status()

    # Cloudflare Logpush writes newline-delimited JSON; rewrite it as a JSON
    # array so each record can be handled individually.
    response_text_modified = response.text.replace("}\n{", "},\n{")
    json_array = json.loads(f'[{response_text_modified}]')

    # Forward each record to the DataKit log streaming endpoint.
    datakit_url = "http://<YOUR_DATAKIT_IP>:9529/v1/write/logstreaming?type=firelens&source=cf_hw_obs"
    for item in json_array:
        requests.post(datakit_url, json=item, timeout=10)

    return {"statusCode": 200, "body": json.dumps(event)}

  • Create a trigger for the function to run on file uploads to the OBS bucket.
  • Monitor the function to ensure it is running correctly; a minimal test event for a manual run is shown below.
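
For a console test run, a minimal event containing only the fields the handler reads might look like the following; a real OBS trigger event carries additional metadata, and the bucket name and object key here are placeholders.

# Hypothetical test event; only the fields read by handler() are included.
test_event = {
    "Records": [
        {
            "s3": {
                "bucket": {"name": "<BUCKET_NAME>"},
                "object": {"key": "logs/example.log"},
            }
        }
    ]
}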

5. Verify Data in Multicloud Observability Platform

  • Go to the Multicloud Observability Platform and navigate to “Logs” to verify that the data is being ingested correctly; a quick connectivity check against DataKit is sketched below.
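
If nothing shows up, you can rule out connectivity problems by posting a hand-written record to the same DataKit endpoint the function uses (replace <YOUR_DATAKIT_IP> as before); the record should then appear under the cf_hw_obs source.

import requests

# Same log streaming endpoint the FunctionGraph script posts to.
datakit_url = "http://<YOUR_DATAKIT_IP>:9529/v1/write/logstreaming?type=firelens&source=cf_hw_obs"

resp = requests.post(datakit_url, json={"message": "datakit ingestion test"}, timeout=10)
print(resp.status_code)  # expect a 2xx response if DataKit is reachable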

6. Create a Pipeline to Format Custom Logs and Attach It to the Log Source

  • Navigate to “Logs” > “Pipelines” on Multicloud Observability Platform.
  • Create a new pipeline to parse the custom logs.

Example pipeline script:

jsonObj = get_key(message)
if valid_json(jsonObj):
    jsonObj = load_json(jsonObj)
    add_key("CacheCacheStatus", jsonObj["CacheCacheStatus"])
    add_key("ClientCountry", jsonObj["ClientCountry"])
    add_key("ClientIP", jsonObj["ClientIP"])
    add_key("EdgeResponseBytes", jsonObj["EdgeResponseBytes"])
    add_key("EdgeResponseStatus", jsonObj["EdgeResponseStatus"])
    add_key("EdgeStartTimestamp", jsonObj["EdgeStartTimestamp"])
    add_key("RayID", jsonObj["RayID"])

  • Save the pipeline and verify the formatted logs in the Logs Explorer. A sample input record is shown below for reference.
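
For reference, each record arriving in the message field is a single JSON object from the Cloudflare HTTP requests data set; the illustrative example below (values are made up) shows the keys the pipeline lifts into top-level fields.

{"CacheCacheStatus": "hit", "ClientCountry": "us", "ClientIP": "203.0.113.10", "EdgeResponseBytes": 1542, "EdgeResponseStatus": 200, "EdgeStartTimestamp": "2024-01-01T00:00:00Z", "RayID": "7f2d3e4a5b6c7d8e"}
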
© 2025 All rights reserved. Produced by First Wave Technology.