CloudWatch metric filter with a default value not making sense


I've created a metric filter just to get an idea of how many times a specific log pattern shows up, nothing crazy. The Metric value is set to 1 and the Default value is set to 0. Since it's not a high-resolution metric, CloudWatch aggregates it over a one-minute period. All good with that. What I do not understand is the difference between the Sum and Sample Count statistics. Why would Sum and Sample Count have different values?

  • If there was no record matching the filter pattern in the 1-minute interval, Sum would be 0 and Sample Count would be 0.
  • If there was at least one matching record in the 1-minute interval, Sum would be X and Sample Count would be X, where X is greater than 0.
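For concreteness, this is roughly how the filter described above would be created programmatically. A minimal boto3 sketch; the log group, filter, metric, and namespace names are all placeholders I made up for illustration:

    import boto3

    logs = boto3.client("logs")

    # Count occurrences of a pattern: each matching log event contributes 1,
    # and a default of 0 is emitted for periods with no matches.
    logs.put_metric_filter(
        logGroupName="my-log-group",        # placeholder log group name
        filterName="error-count-filter",    # placeholder filter name
        filterPattern='"ERROR:"',
        metricTransformations=[
            {
                "metricName": "LogErrorCount",  # placeholder metric name
                "metricNamespace": "MyApp",     # placeholder namespace
                "metricValue": "1",             # Metric value = 1
                "defaultValue": 0.0,            # Default value = 0
            }
        ],
    )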

An example: Let's say I created a metric filter with the pattern "ERROR:", with Metric value set to 1 and Default value set to 0. We have the following logs in three different log streams under the same log group during a specific minute:

Log stream 1:

  • ERROR: XXXXXXX
  • INFO: XXXXXX

Log stream 2:

  • INFO: XXXXXX
  • INFO: XXXXXX

Log stream 3:

  • ERROR: XXXXXXX
  • ERROR: XXXXXXX
  • ERROR: XXXXXXX

What would the values for Sum and Sample Count be, in your opinion? 4, right!?
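To check what CloudWatch actually reports, you can pull both statistics for that minute and compare them side by side. A minimal boto3 sketch, assuming the metric was published as LogErrorCount in a custom MyApp namespace (both names are placeholders matching the sketch above):

    import datetime

    import boto3

    cloudwatch = boto3.client("cloudwatch")

    # Request Sum and SampleCount over the same 1-minute period so the
    # two statistics can be compared for identical datapoints.
    response = cloudwatch.get_metric_statistics(
        Namespace="MyApp",           # placeholder namespace
        MetricName="LogErrorCount",  # placeholder metric name
        StartTime=datetime.datetime(2024, 1, 1, 12, 0),
        EndTime=datetime.datetime(2024, 1, 1, 12, 1),
        Period=60,                   # standard-resolution, 1-minute aggregation
        Statistics=["Sum", "SampleCount"],
    )

    for point in response["Datapoints"]:
        print("Sum:", point["Sum"], "SampleCount:", point["SampleCount"])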
