Ma.Ya

DATA ENGINEER

November 28, 2025
Salary: negotiated directly

Description

Work model: Remote – Brazil

What your day-to-day will look like:

· Design, develop and maintain scalable data pipelines using Azure Data Factory, Azure Data Pipelines and Microsoft Fabric;
· Build robust ETL/ELT processes leveraging Python, SQL and cloud-native data tools;
· Integrate, transform and model data from multiple internal and external sources to support analytics, reporting and strategic insights;
· Ensure data quality, reliability and performance through monitoring, automated validation and observability practices;
· Collaborate closely with Data Analysts, Data Scientists, Product Managers and Engineering teams to deliver high-impact data solutions;
· Participate in architectural decisions involving data platform evolution, storage, ingestion standards and best practices;
· Implement and maintain automated workflows, CI/CD pipelines and version control for data assets using GitHub;
· Promote excellence in data engineering through documentation, standardization and continuous improvement of processes.

Required qualifications:

· Solid experience as a Data Engineer working with Python and advanced SQL;
· Hands-on expertise with Azure Data Factory, Azure Data Pipelines and Microsoft Fabric;
· Strong understanding of ETL/ELT processes, data modeling and data integration patterns;
· Experience working with cloud data platforms and distributed architectures;
· Knowledge of Git-based workflows and source control using GitHub;
· Familiarity with performance tuning, data governance fundamentals and monitoring tools;
· Ability to communicate clearly with cross-functional teams and translate business needs into technical solutions;
· Technical English for reading documentation.

Nice to have:

· Fluent English;
· Experience with Lakehouse architectures, Delta Lake or advanced Fabric capabilities;
· Knowledge of data-focused CI/CD (GitHub Actions, Azure DevOps Pipelines);
· Familiarity with data governance, security, encryption and compliance (LGPD preferred);
· Experience with event-based ingestion (Kafka, Event Hub, Service Bus);
· Prior experience supporting enterprise-grade data platforms or large-scale architectures;
· Exposure to Azure cloud services such as Databricks, Synapse, Functions and Storage Accounts.

What we offer:

· Hiring directly through BSafe (PJ contract model);
· 100% remote work, anywhere in Brazil;