Oct. 9, 2022, 5:10 p.m. | David Schiff

Towards AI - Medium (pub.towardsai.net)

In the previous article, I covered the basics of the attention mechanism and the transformer block in general. In this part of the series, I would like to cover how LogBERT is trained and how we can use it to detect anomalies in log sequences.

Let’s get into the nitty-gritty details of LogBERT.

In the paper (link: https://arxiv.org/pdf/2103.04475.pdf), a log sequence is defined as:

S = {k_1, k_2, ..., k_T}

where S is a sequence of log keys (words) and k_t denotes the key at position t in the sequence. …
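To make the definition concrete, here is a minimal Python sketch, my own illustration rather than code from the paper, of a log sequence represented as a list of parsed log keys, together with the BERT-style random masking that LogBERT's masked-key prediction training builds on. The key names, the MASK_RATIO value, and the mask_sequence helper are all hypothetical.

import random

# Hypothetical log keys, assumed to have been extracted from raw log lines
# by a log parser; together they form one sequence S = {k_1, ..., k_T}.
log_sequence = ["k_open", "k_read", "k_write", "k_close"]

MASK_TOKEN = "[MASK]"
MASK_RATIO = 0.15  # fraction of keys to mask (illustrative value)

def mask_sequence(seq, ratio=MASK_RATIO):
    """Randomly replace a fraction of log keys with [MASK];
    the model is trained to predict the original keys back."""
    masked, labels = [], []
    for key in seq:
        if random.random() < ratio:
            masked.append(MASK_TOKEN)
            labels.append(key)    # ground-truth key the model must recover
        else:
            masked.append(key)
            labels.append(None)   # position excluded from the prediction loss
    return masked, labels

masked_seq, labels = mask_sequence(log_sequence)
print(masked_seq, labels)

At inference time, the intuition is the reverse of training: if the model struggles to predict the masked keys of a sequence, that sequence is unlikely under normal behavior and can be flagged as anomalous.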

