
Error Using AWS Built-in Fluent Bit to Send Logs to Elastic Cloud from EKS Fargate


I am running EKS on Fargate and trying to use the AWS built-in Fluent Bit log router to send logs to Elastic Cloud, but I am encountering the following error:

{ "log": "[2024/05/16 14:35:17] [error] [output:es:es.0] HTTP status=400 URI=/_bulk, response:" }
{
    "log": "{\"error\":{\"root_cause\":[{\"type\":\"x_content_parse_exception\",\"reason\":\"[1:24] Unexpected character ('l' (code 108)): was expecting comma to separate Object entries\\n at [Source: (byte[])\\\"{\\\"create\\\":{\\\"_index\\\":\\\"\\\"leoh-logging\\\"\\\"}}\\\"; line: 1, column: 24]\"}],\"type\":\"x_content_parse_exception\",\"reason\":\"[1:24] Unexpected character ('l' (code 108)): was expecting comma to separate Object entries\\n at [Source: (byte[])\\\"{\\\"create\\\":{\\\"_index\\\":\\\"\\\"leoh-logging\\\"\\\"}}\\\"; line: 1, column: 24]\",\"caused_by\":{\"type\":\"json_parse_exception\",\"reason\":\"Unexpected character ('l' (code 108)): was expecting comma to separate Object entries\\n at [Source: (byte[])\\\"{\\\"create\\\":{\\\"_index\\\":\\\"\\\"leoh-logging\\\"\\\"}}\\\"; line: 1, column: 24]\"}},\"status\":400}"
}

Here is my output configuration:

  [OUTPUT]
  Name  es
  Match *
  Index "leoh-logging"
  Host  ***
  Port  9243
  HTTP_User elastic-beats
  HTTP_Passwd ***
  tls   On
  tls.verify Off
  Suppress_Type_Name On

I have checked my configuration and confirmed that the credentials are correct, but I still receive this error. Has anyone encountered a similar issue or have experience resolving it? Any help would be greatly appreciated. I would also like to ask about the compatibility of the current version of Fluent Bit with Elastic Cloud: are there any known compatibility issues, or specific versions recommended for optimal performance?

Thank you very much!


2 Answers
Accepted Answer

Hello,

To resolve the JSON parsing error when using Fluent Bit to send logs from EKS Fargate to Elastic Cloud, follow these steps:

1. Verify Log Data Format: Ensure your logs are correctly formatted JSON.

2. Escape Special Characters: Ensure special characters in your logs are properly escaped.

3. Correct Fluent Bit Configuration: Update your Fluent Bit configuration to match the correct JSON structure for Elasticsearch.
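The error in the question is exactly a step-1 failure: the metadata line of the `_bulk` payload is not valid JSON. As a rough sketch (Python, purely for illustration, not part of Fluent Bit itself), each line of an NDJSON bulk payload can be checked independently:

```python
import json

def validate_bulk_payload(payload: str) -> list[str]:
    """Check that every line of an NDJSON _bulk payload parses as JSON.

    Returns a list of error descriptions; an empty list means the
    payload is syntactically clean.
    """
    errors = []
    for lineno, line in enumerate(payload.splitlines(), start=1):
        if not line.strip():
            continue  # _bulk allows blank trailing lines
        try:
            json.loads(line)
        except json.JSONDecodeError as exc:
            errors.append(f"line {lineno}, column {exc.colno}: {exc.msg}")
    return errors

# The metadata line from the error message: the quotes around the index
# name became part of the JSON, producing "" followed by a bare token.
bad = '{"create":{"_index":""leoh-logging""}}\n'
good = '{"create":{"_index":"leoh-logging"}}\n'

print(validate_bulk_payload(bad))   # reports a parse error on line 1
print(validate_bulk_payload(good))  # []
```

Running a captured payload through a check like this pinpoints the exact line and column, matching the `[1:24]` position Elasticsearch reports.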

Here is a corrected example. Note that the Index value has no surrounding quotes: Fluent Bit treats the quote characters as part of the index name, which is what produced the doubled quotes in `""leoh-logging""` in your error message:

[OUTPUT]
    Name            es
    Match           *
    Index           leoh-logging
    Host            your-elasticsearch-host
    Port            9243
    HTTP_User       elastic-beats
    HTTP_Passwd     your-password
    tls             On
    tls.verify      Off
    Suppress_Type_Name On
    Replace_Dots    On
    Logstash_Format On
    Logstash_Prefix leoh-logging
    Logstash_DateFormat %Y.%m.%d

Example Correct JSON Structure. The _bulk API expects newline-delimited JSON: an action line followed by the document on its own line. With Suppress_Type_Name On (required for Elasticsearch 8.x, where mapping types were removed), the action line carries no _type field:

{ "index": { "_index": "leoh-logging" } }
{ "message": "your log message", "timestamp": "2024-05-24T10:00:00Z", "level": "info" }
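Building the payload with a JSON serializer instead of string concatenation avoids this class of quoting bug entirely. A minimal sketch (Python, illustrative only; Fluent Bit's es output does the equivalent internally):

```python
import json

def bulk_payload(index: str, docs: list[dict]) -> str:
    """Build an NDJSON _bulk body: one action line, then one document line."""
    lines = []
    for doc in docs:
        # json.dumps handles all quoting/escaping, so an index name can
        # never leak stray quote characters into the metadata line.
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"  # the _bulk API requires a trailing newline

payload = bulk_payload("leoh-logging", [
    {"message": "your log message",
     "timestamp": "2024-05-24T10:00:00Z",
     "level": "info"},
])
print(payload)
```

Had the quoted index name `"leoh-logging"` (quotes included) been passed through a serializer like this, the quotes would have been escaped rather than breaking the JSON.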

Compatibility. Ensure you are using compatible versions of Fluent Bit and Elasticsearch. For Elasticsearch 8.x, keep Suppress_Type_Name On, since mapping types were removed in that release. Check the Fluent Bit documentation for version compatibility, and update Fluent Bit to the latest available version if necessary.

EXPERT
answered 6 months ago
  • Thank you very much for your prompt and informative response! Your explanation regarding the JSON parsing error in Fluent Bit and the steps to resolve it were clear and valuable. I especially appreciate the example configuration and the breakdown of the correct JSON structure.

    Upon reviewing my configuration, I identified the mistake – the extra double quote mark (") within the index section. This explains the parsing error. I've corrected the configuration, and now my logs are flowing smoothly to Elastic Cloud.


The error you are encountering indicates that Fluent Bit is sending Elasticsearch a _bulk request that fails JSON parsing. Specifically, the error message points to an unexpected character 'l' at a certain position in the JSON string, which suggests a formatting issue in the data being sent.

Verify Log Data Format. Ensure that the log data being sent to Elasticsearch is properly formatted JSON. Fluent Bit might be sending a malformed JSON string.

Escape Special Characters. If your logs contain special characters or are not properly escaped, it could cause parsing issues. Ensure that your log data is correctly escaped.

Check Fluent Bit Filters. If you use any filters to modify the log data before sending it to Elasticsearch, verify that these filters are correctly processing and formatting the data.
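A practical way to perform these checks is to inspect what Fluent Bit actually emits before it reaches Elasticsearch. One option (a sketch; the stdout plugin is a standard Fluent Bit output, though whether you can add extra outputs depends on your Fargate log-router configuration) is to add a temporary stdout output alongside the es one:

```
[OUTPUT]
    Name    stdout
    Match   *
    Format  json_lines
```

Each record is then printed to the container logs as a single JSON line, which can be eyeballed or run through a JSON validator.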

Example Correct JSON Structure. Note that the _bulk API does not accept the action metadata and the document merged into a single object; it expects newline-delimited JSON, with the action on one line and the document on the next (and no _type field on recent Elasticsearch versions):

{ "create": { "_index": "leoh-logging", "_id": "1" } }
{ "message": "your log message", "timestamp": "2024-05-24T10:00:00Z", "level": "info" }
EXPERT
answered 6 months ago
EXPERT
reviewed 6 months ago
