Oct. 24, 2023, 9 a.m. | Laura Case

InfoWorld Analytics www.infoworld.com



For years, data teams worked with simple data pipelines. These generally consisted of a few applications or data feeds that converged into a standard extract, transform, and load (ETL) tool, which fed data into a centralized data warehouse. From that warehouse, data flowed to a small, fixed set of destinations, such as a reporting tool or spreadsheets. As a result, data protection was relatively straightforward: there simply was not as much data to protect, and the locations of the data …
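The classic architecture the article describes can be sketched in a few lines: several feeds converge into one extract/transform/load step that appends to a central warehouse table. This is a minimal illustration only; the feed names, schema, and in-memory "warehouse" list are hypothetical, not from the article.

```python
# Sketch of a simple pipeline: a few feeds -> one ETL step -> central warehouse.
# All names and data here are illustrative assumptions.

def extract(feeds):
    """Pull raw rows from each source feed into one stream."""
    return [row for feed in feeds for row in feed]

def transform(rows):
    """Normalize records into a single warehouse schema."""
    return [
        {"user": r["user"].strip().lower(), "amount": float(r["amount"])}
        for r in rows
    ]

def load(warehouse, rows):
    """Append transformed rows to the central warehouse table."""
    warehouse.extend(rows)
    return len(rows)

# Two small feeds converge into one warehouse table.
crm_feed = [{"user": " Alice ", "amount": "10.5"}]
billing_feed = [{"user": "BOB", "amount": "3"}]
warehouse = []
loaded = load(warehouse, transform(extract([crm_feed, billing_feed])))
```

With so few sources and one fixed destination, protecting the data means securing a handful of known locations, which is the simplicity the article contrasts with modern sprawling pipelines.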

