Loggie is a Golang-based, cloud-native log collection agent built for lightweight, high-performance operation. In a Function Compute custom runtime, Loggie runs as a background process inside your function instance. Your function writes logs to local files, Loggie reads those files, and then forwards the logs to Simple Log Service (SLS) for storage and analysis.
Prerequisites
Before you begin, make sure you have:
- An SLS project and a Logstore. The SLS project must be in the same region as the function you create in Step 1.
Billing
Setting the FC_EXTENSION_SLS_LOGGIE=true environment variable freezes the instance for 10 seconds after each invocation so that Loggie can finish uploading logs. This freeze period is billed under the same rules as Prefreeze hooks. For details, see Billing rules.
Step 1: Create a function in a custom runtime
Log on to the Function Compute console. In the left-side navigation pane, click Services & Functions.
In the top navigation bar, select a region. On the Services page, click the target service.
On the Functions page, click Create Function.
On the Create Function page, set the following parameters, use the default values for the other parameters, and then click Create. For more information, see Create a function.

| Parameter | Value |
|---|---|
| Creation method | Use Custom Runtime |
| Function Name (under Basic Settings) | Enter a function name |
| Handler Type (under Basic Settings) | Event Handler |
| Runtime | Python 3.9 |
| Code Upload Method | Use Folder. The folder name is `code`, and the file inside is `app.py`. See the sample code below. |
| Startup Command | `/code/bootstrap` (created in Step 2) |
| Listening Port | 9000 |

Note: Replace `filename='/tmp/log/fc-flask.log'` with your actual log file path. This path must match the `sources.paths` value in the pipeline configuration created in Step 2.

Sample `app.py`:

```python
from flask import Flask
from flask import request
import logging

REQUEST_ID_HEADER = 'x-fc-request-id'

app = Flask(__name__)

# Write logs to a local file that Loggie collects.
format_str = '[%(asctime)s] %(levelname)s in %(module)s: %(message)s'
logging.basicConfig(filename='/tmp/log/fc-flask.log',
                    filemode='w',
                    format=format_str,
                    encoding='utf-8',
                    level=logging.DEBUG)

@app.route("/invoke", methods=["POST"])
def hello_world():
    rid = request.headers.get(REQUEST_ID_HEADER)
    logger = logging.getLogger()
    print("FC Invoke Start RequestId: " + rid)
    logger.info("FC Invoke Start RequestId: " + rid)
    data = request.stream.read()
    print(str(data))
    logger.info("receive event: {}".format(str(data)))
    print("FC Invoke End RequestId: " + rid)
    logger.info("FC Invoke End RequestId: " + rid)
    return "Hello, World!"

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=9000)
```
Step 2: Create the bootstrap file
After creating the function, use WebIDE on the Code tab to create a file named bootstrap in the code directory.
Add the following content to the bootstrap file:
```bash
#!/bin/bash

# 1. Create the pipelines.yml file.
mkdir -p /tmp/log /code/etc
cat << EOF > /code/etc/pipelines.yml
pipelines:
  - name: demo
    sources:
      - type: file
        name: fc-demo
        addonMeta: true
        fields:
          topic: "loggie"
        fieldsUnderRoot: true
        paths:
          - "/tmp/log/*.log"
    sink:
      type: sls
      endpoint: ${LOGGIE_SINK_SLS_ENDPOINT}
      accessKeyId: ${LOGGIE_SINK_SLS_ACCESS_ID}
      accessKeySecret: ${LOGGIE_SINK_SLS_ACCESS_SECRET}
      project: ${LOGGIE_SINK_SLS_PROJECT}
      logstore: ${LOGGIE_SINK_SLS_LOGSTORE}
      topic: ${LOGGIE_SINK_SLS_TOPIC}
EOF

# 2. Create the loggie.yml file.
cat << EOF > /code/etc/loggie.yml
EOF

# 3. Start Loggie and run it as a background process.
/opt/bin/loggie -config.system=/code/etc/loggie.yml -config.pipeline=/code/etc/pipelines.yml > /tmp/loggie.log 2>&1 &

# 4. Start the application.
exec python app.py
```

The script does the following:
1. Creates `pipelines.yml`, the pipeline configuration file.

   | Section | Description |
   |---|---|
   | `sources` | Defines the log type and path. This example collects all `.log` files from the `/tmp/log/` directory. |
   | `sink` | Specifies the SLS destination. The variables are set in Step 4. |

2. Creates `loggie.yml`, the Loggie system configuration file. An empty file uses the default configuration, but the file must exist even if left empty. For non-default configurations, see the Loggie reference documentation.

3. Starts Loggie as a background process. Loggie runtime logs are written to `/tmp/loggie.log`.

4. Starts the application. This example uses Python. Replace the command with the language your function uses.
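A common misconfiguration is a mismatch between the file your application writes and the `paths` glob in `pipelines.yml`. You can sanity-check the pattern locally with Python's standard `fnmatch` module, which is a close (though not exact) stand-in for Loggie's own glob matching:

```python
from fnmatch import fnmatch

# The glob from pipelines.yml and the filename from app.py in this tutorial.
pattern = "/tmp/log/*.log"

print(fnmatch("/tmp/log/fc-flask.log", pattern))   # True: collected
print(fnmatch("/tmp/fc-flask.log", pattern))       # False: wrong directory
print(fnmatch("/tmp/log/fc-flask.txt", pattern))   # False: wrong extension
```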
After adding the file content, grant execution permission to the bootstrap file. In WebIDE, choose Terminal > New Terminal, then run:
```bash
chmod 777 bootstrap
```

Step 3: Add the Loggie official common layer
Click the Configurations tab. In the Layers section, click Modify.
In the panel, choose Add Layer > Add Official Common Layer and configure the Loggie layer.
| Layer name | Compatible runtime | Layer version | ARN |
|---|---|---|---|
| Loggie Agent | Custom runtime | 1 (used in this example) | acs:fc:{region}:official:layers/Loggie13x/versions/1 |

Click OK.
Step 4: Set environment variables
On the Configurations tab, go to the Environment Variables section and click Modify.
Add the following environment variables. For more information, see Environment variables.
- `FC_EXTENSION_SLS_LOGGIE=true`: enables the 10-second post-invocation freeze so that Loggie can finish uploading logs. See Billing for cost details.
- The six SLS sink variables referenced in `pipelines.yml`:

| Variable | Description |
|---|---|
| `LOGGIE_SINK_SLS_ENDPOINT` | SLS endpoint for your region |
| `LOGGIE_SINK_SLS_ACCESS_ID` | AccessKey ID |
| `LOGGIE_SINK_SLS_ACCESS_SECRET` | AccessKey secret |
| `LOGGIE_SINK_SLS_PROJECT` | SLS project name |
| `LOGGIE_SINK_SLS_LOGSTORE` | SLS Logstore name |
| `LOGGIE_SINK_SLS_TOPIC` | Log topic |

Click OK. After the configuration is saved, Loggie begins forwarding function logs to SLS.
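These variables reach `pipelines.yml` because the bootstrap script writes the file through an unquoted heredoc, so the shell substitutes each `${LOGGIE_SINK_SLS_*}` placeholder at container startup. The same substitution can be illustrated with Python's `os.path.expandvars` (the value `my-project` is a hypothetical placeholder, not one from this tutorial):

```python
import os

# Set a variable as the Function Compute console would at startup.
os.environ["LOGGIE_SINK_SLS_PROJECT"] = "my-project"

# A line from the pipelines.yml heredoc before substitution.
template = "project: ${LOGGIE_SINK_SLS_PROJECT}"

print(os.path.expandvars(template))  # project: my-project
```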
Step 5: Verify results
On the Code tab, click Test Function. Logs from the first invocation may be delayed while Loggie initializes. Invoke the function several times to give Loggie time to initialize and flush logs.
Log on to the Log Service console. Query logs by the region, project, and Logstore you configured in `pipelines.yml`. In the query results, look for these fields:

| Field | Description |
|---|---|
| `body` | The log content written by your function. |
| `state.*` | Metadata about the log collection state. The `hostname` field under `state` contains the ID of the instance where the function ran. |
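To know what to expect in the `body` field, you can reproduce the log format from `app.py` locally. This sketch uses only the standard library and a temporary file in place of `/tmp/log/fc-flask.log`:

```python
import logging
import os
import tempfile

# The same format string configured in app.py.
format_str = '[%(asctime)s] %(levelname)s in %(module)s: %(message)s'
log_path = os.path.join(tempfile.mkdtemp(), "fc-flask.log")

# Attach a file handler instead of calling basicConfig, so this
# sketch does not interfere with other logging configuration.
handler = logging.FileHandler(log_path, mode='w', encoding='utf-8')
handler.setFormatter(logging.Formatter(format_str))
logger = logging.getLogger("fc-demo")
logger.setLevel(logging.DEBUG)
logger.addHandler(handler)

logger.info("FC Invoke Start RequestId: test-request-id")
handler.flush()

with open(log_path, encoding='utf-8') as f:
    line = f.read()
print(line)  # e.g. [2024-01-01 00:00:00,000] INFO in <module>: FC Invoke Start RequestId: test-request-id
```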
Troubleshooting
Loggie runs independently within the function instance. Function Compute does not monitor Loggie health, and Loggie failures do not affect function execution. Log queries in SLS may have a latency of several seconds.
If logs do not appear in SLS, use the following steps to diagnose the issue.
Logs are missing after function runs
When a function runs successfully, the instance stays alive for several minutes after invocation. Log on to the instance to inspect Loggie. For instructions, see Run commands to manage function instances.
Check the following:
| Check | Action |
|---|---|
| Loggie is running | If /tmp/loggie.log does not exist, Loggie was not started. Run the Loggie startup command from the bootstrap file manually and check its output for errors. |
| Pipeline configuration is correct | Open /code/etc/pipelines.yml and verify the source paths and sink credentials. |
| SLS sink started | Look for a log line similar to pipeline sink(sink/sls)-0 invoke loop start in /tmp/loggie.log. |
| Log files are detected | Look for a log line similar to start collect file: /tmp/log/fc-flask.log. If missing, confirm that your application writes log files to a path that matches the paths pattern in pipelines.yml. |
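The log-based checks above can also be scripted. A minimal sketch in Python; the marker strings come from the sample log lines in the table and may vary across Loggie versions, so adjust them to what your `/tmp/loggie.log` actually shows:

```python
from typing import List, Optional

def diagnose(loggie_log: Optional[str]) -> List[str]:
    """Return findings from the contents of /tmp/loggie.log.

    Pass None when the file does not exist at all.
    """
    if loggie_log is None:
        return ["Loggie is not running: run the startup command manually"]
    findings = []
    if "invoke loop start" not in loggie_log:
        findings.append("SLS sink not started: verify sink credentials in pipelines.yml")
    if "start collect file" not in loggie_log:
        findings.append("no log files detected: check the paths pattern in pipelines.yml")
    return findings

# Healthy example: both marker lines present, so no findings.
healthy = ("pipeline sink(sink/sls)-0 invoke loop start\n"
           "start collect file: /tmp/log/fc-flask.log\n")
print(diagnose(healthy))  # []
print(diagnose(None))     # ['Loggie is not running: run the startup command manually']
```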
Function fails to run
To isolate the issue, remove the Loggie startup logic from the bootstrap file and test whether the function runs on its own. Loggie is an external extension and should not affect function execution. If you experience unexpected process exits or execution timeouts, increase the memory or CPU specifications for the function.
References
To transform logs before upload -- for example, to parse JSON logs or filter out DEBUG entries -- add interceptor configurations to
pipelines.yml. For details, see the Loggie interceptor reference.
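As an illustration only, an interceptor block is added alongside `sources` and `sink` in the pipeline definition. The `normalize` type and `jsonDecode` processor below are assumptions based on commonly documented Loggie interceptors; verify the exact type and processor names against the interceptor reference for your Loggie version before using them:

```yaml
pipelines:
  - name: demo
    # sources and sink as defined in Step 2
    interceptors:
      - type: normalize        # assumed interceptor type; check your Loggie version
        processors:
          - jsonDecode:        # assumed processor name for parsing JSON log bodies
              target: body
```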