March 3, 2024, 10:53 a.m. | /u/mramsa

Deep Learning | www.reddit.com

Hello, I have recently run into an anomaly related to `TransformerEncoder`. To be more concrete, here is my model:

```
import torch.nn as nn

class SOTC(nn.Module):
    def __init__(self, n_seq, n_features, d_model=256, n_head=2, n_enc_layers=1, kernel_size=3, dropout=0.05):
        """
        Arguments:
            n_seq: int, number of distinct time series (e.g. individual metrics)
            n_features: int, number of data points per time series
            d_model: int, transformer encoder dimension
            n_head: int, number of attention heads
            n_enc_layers: int, number of transformer encoder layers
            kernel_size: int, size of the 1D convolutional kernel
            dropout: float, dropout …
```
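
The post is cut off before the constructor body, so the following is only a minimal sketch of how these parameters might wire together, assuming a `Conv1d` front end that feeds an `nn.TransformerEncoder`. The layer layout, the padding choice, and the forward pass are illustrative assumptions, not the original model:

```
import torch
import torch.nn as nn

# Hypothetical reconstruction for illustration only: the original
# constructor body is truncated, so this layout is an assumption.
class SOTCSketch(nn.Module):
    def __init__(self, n_seq, n_features, d_model=256, n_head=2,
                 n_enc_layers=1, kernel_size=3, dropout=0.05):
        super().__init__()
        # 1D convolution across the n_seq input channels, projecting
        # each time step into d_model channels; padding keeps length.
        self.conv = nn.Conv1d(n_seq, d_model, kernel_size,
                              padding=kernel_size // 2)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_head, dropout=dropout,
            batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer,
                                             num_layers=n_enc_layers)

    def forward(self, x):
        # x: (batch, n_seq, n_features)
        h = self.conv(x)        # (batch, d_model, n_features)
        h = h.transpose(1, 2)   # (batch, n_features, d_model)
        return self.encoder(h)  # (batch, n_features, d_model)


if __name__ == "__main__":
    model = SOTCSketch(n_seq=8, n_features=64)
    out = model(torch.randn(4, 8, 64))
    print(out.shape)  # torch.Size([4, 64, 256])
```

With `batch_first=True` the encoder consumes `(batch, sequence, d_model)` tensors directly, which avoids the extra transposes that the default `batch_first=False` would require here.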
