<p>Architect and lead the modern data platform using <strong>Microsoft Fabric</strong>. You’ll define data models, pipelines, governance, and performance patterns that power analytics at scale.</p><p><strong>What You’ll Do</strong></p><ul><li>Design end‑to‑end architectures leveraging Fabric (OneLake, Lakehouses, Warehouses)</li><li>Define medallion/layered models, dimensional designs, and semantic layers</li><li>Lead ingestion and transformation with Dataflows Gen2 / Data Factory / Notebooks</li><li>Establish governance (data quality, lineage, security, RLS/OLS)</li><li>Optimize performance and cost; standardize reusable patterns</li><li>Mentor data engineers/analysts; review solutions and set best practices</li><li>Partner with business stakeholders to translate use cases into scalable models</li></ul>
We are looking for an experienced Data Engineer to join our team in Jacksonville, Florida. In this role, you will take the lead in designing and building a modern Azure lakehouse platform that enables business leaders to access analytics through natural language queries. The position combines hands-on technical expertise with leadership responsibilities, offering an opportunity to mentor a team of skilled engineers while driving innovation.<br><br>Responsibilities:<br>• Architect and develop a robust Azure lakehouse platform, utilizing Azure Data Lake Gen2, Delta Lake, and PySpark to build efficient data pipelines.<br>• Implement a semantic layer and metric store to ensure consistent data definitions and translation across the organization.<br>• Design and maintain real-time and batch data pipelines, incorporating medallion architecture, schema evolution, and data contracts.<br>• Build retrieval systems for large language models (LLMs) using Azure OpenAI and vectorized Delta tables to support chat-based analytics.<br>• Ensure data quality, lineage, and observability through tools like Great Expectations and Unity Catalog, while optimizing costs through partitioning and compaction.<br>• Develop automated anomaly detection and alerting using Azure ML pipelines and Event Grid.<br>• Collaborate with product and operations teams to translate complex business questions into actionable data models and queries.<br>• Lead and mentor a team of data and Python engineers, establishing best practices in CI/CD, code reviews, and documentation.<br>• Ensure compliance with security, privacy, and governance standards by designing and implementing robust data handling protocols.