March 3, 2024, 10:53 a.m. | /u/mramsa

Deep Learning www.reddit.com

Hello, I have recently run into anomalous behavior with nn.TransformerEncoder. To be concrete, here is my model:

```
import torch.nn as nn

class SOTC(nn.Module):
    def __init__(self, n_seq, n_features, d_model=256, n_head=2,
                 n_enc_layers=1, kernel_size=3, dropout=0.05):
        """
        Arguments:
            n_seq: int, number of distinct time series (e.g. individual metrics)
            n_features: int, number of data points per time series
            d_model: int, transformer encoder dimension
            n_head: int, number of attention heads
            n_enc_layers: int, number of transformer encoder layers
            kernel_size: int, size of the 1D convolutional kernel
            dropout: float, dropout …
```
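
The post is cut off inside the docstring, before any layers or the anomaly itself are described. For context, here is a minimal sketch of how a Conv1d front end is commonly combined with nn.TransformerEncoder; the layer wiring, forward pass, and output head below are assumptions for illustration, not the author's actual code:

```
# Hypothetical reconstruction: everything past the signature is assumed,
# since the original post is truncated before the layers are built.
import torch.nn as nn

class SOTCSketch(nn.Module):
    def __init__(self, n_seq, n_features, d_model=256, n_head=2,
                 n_enc_layers=1, kernel_size=3, dropout=0.05):
        super().__init__()
        # Embed the n_seq input channels into d_model channels per time step.
        self.conv = nn.Conv1d(n_seq, d_model, kernel_size,
                              padding=kernel_size // 2)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_head, dropout=dropout, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer,
                                             num_layers=n_enc_layers)
        self.head = nn.Linear(d_model, 1)  # e.g. one anomaly score per step

    def forward(self, x):
        # x: (batch, n_seq, n_features)
        x = self.conv(x)        # (batch, d_model, n_features)
        x = x.permute(0, 2, 1)  # (batch, n_features, d_model)
        x = self.encoder(x)     # (batch, n_features, d_model)
        return self.head(x)     # (batch, n_features, 1)
```

With batch_first=True the encoder expects input of shape (batch, seq_len, d_model); mixing up this convention (the PyTorch default is sequence-first) is a common source of surprising TransformerEncoder behavior.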
