Senior Data Engineer
Requirements:
- 5+ years of experience delivering production-grade ETL/ELT pipelines and data migrations.
- Strong SQL skills (query design, tuning, windowing, CTEs, partitioning) and solid database fundamentals (indexes, ACID, CDC).
- Hands-on experience with APIs for data extraction/migration (REST, OAuth 2.0, pagination, retries).
- Experience with ETL tools (e.g., Talend) and/or orchestration platforms (Fabric Data Pipelines, ADF, Airflow).
- Cloud proficiency (Azure preferred, AWS/GCP acceptable): storage, compute, networking, IAM, and cost control.
- Familiarity with Microsoft Fabric (OneLake, Lakehouse/Warehouse, Data Pipelines, Dataflows Gen2).
- Excellent communication skills and ability to work directly with CTO-level stakeholders.
- English: Upper-Intermediate+ for daily communication.
Responsibilities:
- Design and implement ETL/ELT pipelines to synchronize data across systems (Postgres, HubSpot, Jira, QuickBooks, and other SaaS apps).
- Lead data migration projects: mapping, cleansing, reconciliation, backfills, cutover strategies, rollback plans, and verification.
- Build the analytics layer on Microsoft Fabric (OneLake, Lakehouse, Dataflows Gen2, Data Pipelines) and deliver optimized models for BI.
- Integrate and harden external APIs (REST/OData/GraphQL), including authentication (OAuth 2.0), pagination, retries, and webhooks.
- Model data (3NF, Data Vault, Kimball star schema), implement SCDs, and optimize SQL for performance and cost.
- Establish data quality & observability standards (tests, anomaly detection, lineage, SLAs/SLOs) and CI/CD practices.
- Partner with client stakeholders (CTO, Finance, Ops, Sales Ops) to gather requirements, make architecture decisions, and provide documentation/runbooks.
- Champion security & compliance practices for sensitive/financial data (least privilege, masking, auditability, data retention).
Nice to have:
- dbt (models/tests/docs), Spark/Databricks, Snowflake/BigQuery/Redshift.
- Power BI semantic modeling.
- Data quality/observability tools (Great Expectations, Soda, Monte Carlo).
- Data lineage (OpenLineage/Marquez).
- Infra & DevEx: Docker, Terraform/Bicep, GitHub Actions, Azure DevOps.
- Experience with Jira, HubSpot, QuickBooks data models.
We offer:
Well-being:
- 10 working days of paid time off within an individual year.
- Up to 15 working days of unpaid time off within an individual year.
- Compensation for sports activities or life insurance coverage (up to $250 per year) – after the trial period.
Professional Growth:
- Sombra University courses – enjoy a range of learning opportunities through Sombra University, which offers educational courses and lectures on a variety of topics.
- Sombra Around Tech community – Sombra unites engineers and experts in several areas: Front-end, Back-end, QA, DevOps, and Business Analysis.
- Mentorship program – available on request.
- Udemy online course platform – stay up to date with the latest technologies and programming languages.
- English courses and Speaking Club – attend English classes twice a week in small groups.
Added advantages:
- Work equipment compensation (laptop, monitor, and small devices).
- Sombra’s referral program – if you know someone you believe is a good fit for cooperation with us, you can recommend them and get a reward.
- Public Holidays – celebrate 18 statutory holidays in Colombia.
- Sombra events – join Sombra’s traditional events (both online and offline).