KF – Data Engineer with Databricks – Job9999

Job Summary

We are looking for a highly experienced Data Solution Architect to design, implement, and optimize a scalable, secure, high-performing data architecture. This role supports business intelligence, analytics, and AI-driven initiatives, ensuring seamless integration across Databricks, SQL Server, and Azure Data Lake Storage Gen2 (ADLS Gen2) environments. The ideal candidate will own end-to-end data solutions, define strategic roadmaps, and collaborate with cross-functional teams to optimize data workflows while maintaining governance, security, and scalability.

Job Responsibilities

  • Set up, configure, and deploy a working Databricks environment in Azure, including provisioning resources, configuring compute clusters, and managing access controls, authentication, and network security.
  • Integrate Databricks with data lakes, databases, and cloud storage, install necessary libraries, ML frameworks, and Python dependencies, and create notebooks and workflows for data processing, analytics, and machine learning.
  • Design and implement robust data pipelines for ingestion, transformation, and synchronization.
  • Work with Azure SQL Server, Azure Data Factory, and ADLS Gen2 to optimize data movement, leveraging PySpark and SQL to improve data processing efficiency.
  • Evaluate and recommend emerging data technologies to enhance the data ecosystem.
  • Work closely with data engineers, analysts, and business stakeholders to align solutions with business objectives.
  • Develop and maintain comprehensive documentation and best practices.

Basic Qualifications

Must-Have Skills

  • Expert-level SQL (preferably with data warehouse experience) and query optimization.
  • Azure Databricks & PySpark: Proven experience with big data processing and analytics.
  • Azure Data Factory & Data Engineering Fundamentals.
  • Azure Data Lake Storage (ADLS Gen2): Experience with Parquet and Delta tables.
  • Data Governance & Security: Strong knowledge of access control, compliance, and best practices in cloud environments.
  • Enterprise Data Integration: Experience working with MS Dynamics and Workday, ensuring seamless data ingestion and processing.

Nice-to-Have Skills

  • Experience with Dynamics data and large-scale ERP migrations.
  • Understanding of data visualization and BI tools such as Tableau and Power BI.
  • Knowledge of Python, Scala, or Terraform for infrastructure automation and orchestration.

Soft Skills & Mindset

  • Strategic Thinking: Ability to align technical solutions with business goals.
  • Problem-Solving: Analytical mindset to tackle complex data architecture challenges.
  • Continuous Learning: Passion for staying current on data technologies and industry best practices.


  • Target Start Date: ASAP
  • Engagement Length: Confirmed until end of the year
  • Time Zone: EST
  • Working Hours: From 8:00 am to 5:00 pm
  • Country Restrictions: Candidates located in Venezuela or Cuba cannot be considered
  • Laptop Requirements: BYOD (bring your own device)


Job Type: Remote
Allowed Countries: LATAM
