
Data Architect #6632

1950Labs

Remote · Full time · Posted today · Lead
Tags: data architecture · data engineering · cloud engineering · azure · databricks

Client Description

The client is a global organization in the tourism industry, offering river, ocean, and expedition cruises for passengers worldwide and operating a large fleet of vessels.

The company is currently undergoing an extensive cloud data modernization and unification program. We support them across data architecture, BI, migration, and data platform development.

A key focus area is the migration to Databricks Unity Catalog, including:

  • Migrating all data layers (landing, raw, prepared, reporting, services) from Hive Metastore to Unity Catalog
  • Migrating DLT (Delta Live Tables) and Python/SQL jobs into Databricks
  • Migrating pipelines in Azure Synapse/ADF
  • Rebuilding and adapting metadata frameworks
  • Standardizing access, lineage, governance, and overall Lakehouse structure
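The first migration bullet above (moving tables out of Hive Metastore into Unity Catalog's three-level namespace) can be sketched as plain statement generation. This is a minimal illustration only: the catalog, layer, and table names are hypothetical, and a real migration would also weigh `SYNC` (for external tables) or deep clones against a simple CTAS copy.

```python
# Sketch: generate Unity Catalog migration statements for the data layers
# listed above. All catalog/schema/table names are hypothetical examples.

LAYERS = ["landing", "raw", "prepared", "reporting", "services"]

def uc_migration_sql(catalog: str, schema: str, table: str) -> str:
    """Build a CREATE TABLE ... AS SELECT that copies a managed Hive
    Metastore table into the Unity Catalog three-level namespace."""
    return (
        f"CREATE TABLE IF NOT EXISTS {catalog}.{schema}.{table} "
        f"AS SELECT * FROM hive_metastore.{schema}.{table}"
    )

def plan_migration(catalog: str, tables_by_layer: dict[str, list[str]]) -> list[str]:
    """Produce one statement per table, walking the layers in order."""
    stmts = []
    for layer in LAYERS:
        for table in tables_by_layer.get(layer, []):
            stmts.append(uc_migration_sql(catalog, layer, table))
    return stmts

plan = plan_migration("cruise_lakehouse",
                      {"raw": ["bookings"], "reporting": ["revenue_daily"]})
for stmt in plan:
    print(stmt)
```

In practice each generated statement would be executed against a Databricks SQL warehouse, with grants, lineage, and ownership handled separately as part of the governance standardization mentioned above.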

The client has very high technical expectations and is looking for top-level specialists capable of leading complex architectural initiatives.

Technical Requirements

  • Advanced knowledge of Microsoft Azure (data infrastructure, networking, authorization, cloud design)
  • Experience with Azure Synapse (especially Synapse Serverless and pipelines)
  • Strong expertise in Databricks (DLT, workflows, workspace administration)
  • Ability to design Data Lakehouse architectures (Medallion Architecture, Metadata-Driven ETL)
  • Very good knowledge of Python and code optimization
  • Strong SQL skills, including query optimization and SQL Server experience
  • Experience with Apache Spark (data processing workflows)
  • Experience building ETL/ELT processes and data warehouses
  • Experience with CI/CD processes (Azure DevOps, Git, branching strategies)
  • Experience implementing logging, monitoring, and optimization of data processes
  • Familiarity with Power BI and analytics workflows
  • Strong communication skills and documentation ability
  • Experience as a Lead Engineer (technical leadership, decision-making, stakeholder collaboration)
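The "Medallion Architecture, Metadata-Driven ETL" requirement above boils down to declaring pipeline steps as data and letting a generic runner execute them layer by layer. The sketch below shows the idea with plain Python lists of dicts; table names and transforms are hypothetical, and in Databricks each transform would be a Spark or DLT job rather than a lambda.

```python
# Sketch of metadata-driven ETL across Medallion (bronze/silver/gold)
# layers: steps are declared as data, a generic runner executes them.

from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class Step:
    source: str                                   # input table (previous layer)
    target: str                                   # output table (next layer)
    transform: Callable[[list[dict]], list[dict]]

def run_pipeline(tables: dict[str, list[dict]], steps: list[Step]) -> None:
    """Execute each declared step, writing the result to the target layer."""
    for step in steps:
        tables[step.target] = step.transform(tables[step.source])

# bronze: raw booking events; silver: cleaned; gold: aggregated
tables = {"bronze.bookings": [{"id": 1, "amount": "120"},
                              {"id": 2, "amount": None}]}
steps = [
    Step("bronze.bookings", "silver.bookings",
         lambda rows: [dict(r, amount=float(r["amount"]))
                       for r in rows if r["amount"] is not None]),
    Step("silver.bookings", "gold.revenue",
         lambda rows: [{"total": sum(r["amount"] for r in rows)}]),
]
run_pipeline(tables, steps)
print(tables["gold.revenue"])  # [{'total': 120.0}]
```

The point of the pattern is that adding a table means adding metadata, not writing a new pipeline, which is what makes the framework rebuild in the migration scope tractable.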

Scope of Responsibilities

  • Design and develop data architecture in Azure and Databricks environments
  • Participate in the Unity Catalog transformation (infrastructure, pipelines, frameworks, standards)
  • Migrate and modernize ETL/ELT processes (DLT, Python/SQL jobs, Synapse/ADF pipelines)
  • Design and implement Data Lakehouse solutions using Medallion architecture
  • Optimize data processing workflows (Python, SQL, Spark)
  • Build CI/CD processes and automation in Azure DevOps
  • Implement standards for logging, monitoring, and data quality
  • Lead the project from a technical perspective (Lead Engineer role)
  • Collaborate with business and technical stakeholders
  • Document solutions and mentor team members
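The CI/CD responsibility above typically takes the shape of an Azure DevOps YAML pipeline. This is a minimal, hypothetical sketch only (script names and the Asset Bundles deploy step are assumptions about the client's setup, not confirmed by the posting); a real pipeline would add environments, approvals, and branch policies.

```yaml
# Hypothetical azure-pipelines.yml sketch for a Databricks data project.
trigger:
  branches:
    include: [main]

pool:
  vmImage: ubuntu-latest

steps:
  - script: pip install -r requirements.txt
    displayName: Install dependencies
  - script: pytest tests/
    displayName: Run unit tests
  # Assumes the Databricks CLI and Asset Bundles are used for deployment.
  - script: databricks bundle deploy --target prod
    displayName: Deploy Databricks assets
```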

Originally posted on Himalayas
