The Challenges of Legacy SIEM Models | Ali Ghodsi at RSAC 2026

Databricks · Advanced · 🧠 Large Language Models · 3w ago
Databricks CEO Ali Ghodsi outlines why traditional SIEM architectures are struggling to keep pace with modern cybersecurity needs. Key takeaways:

- Cost & Ingestion: Volume-based pricing and proprietary formats make comprehensive data ingestion slow and expensive.
- Retention Issues: High costs lead to short retention cycles, preventing longitudinal analysis of long-term threats.
- Data Gaps: Legacy systems often exclude multimodal data, such as audio, video, and LLM transcripts.
- Manual Operations: Detection and investigation remain highly manual, leaving SOC teams inundated.

Watch the full keynote: https://www.databricks.com/resources/webinar/its-time-leave-legacy-siem-behind?utm_source=youtube&utm_medium=organic-social

Related AI Lessons

What's new in Prompt Optimizer: latest features and improvements
Learn how to optimize prompts with the latest features and improvements in Prompt Optimizer, a crucial tool for effective LLM interactions
Dev.to AI
AI vs LLM vs AI Agents vs Automation — What’s the Real Difference?
Understand the differences between AI, LLM, AI Agents, and Automation to clarify their roles in technology
Dev.to AI
PagedAttention: vLLM’s Solution to GPU Memory Waste
Learn how PagedAttention solves GPU memory waste for large language models (LLMs) and improve your LLM serving efficiency
Medium · ChatGPT
From 30 to 60 Tokens/Second: How I Got vLLM Running on 2x RTX 3090
Learn how to install and run vLLM on 2x RTX 3090 to achieve 60 tokens/second, a significant performance boost for LLM applications
Medium · LLM
Up next
5 Levels of AI Agents - From Simple LLM Calls to Multi-Agent Systems
Dave Ebbelaar (LLM Eng)