Job Description
About the Client & Mission
Our client is the world’s largest environmental nonprofit focused on reforestation and sustainable development (Nature-based Solutions). We are building a modern cloud data platform on Azure and Snowflake that will serve as a single source of truth and enable faster, data-driven decision-making.
About the Initiative
This role supports a Data Warehouse initiative focused on tangible delivery impact: trusted data, clear and scalable models, and fast release cycles (1–3 months) with well-defined SLAs. You’ll work in a collaborative setup across Data Engineering ↔ BI ↔ Product, often handling 1–2 parallel workstreams with proactive risk and dependency management.
Core Stack
- ELT/DWH: Azure Data Factory + Azure Functions (Python) → Snowflake (see the loading sketch after this list)
- CI/CD: Azure DevOps pipelines + DL Sync (Snowflake objects and pipeline deployments)
- Primary data sources: CRM/ERP (Dynamics 365, Salesforce), MS SQL, API-based ingestion, CDC concepts
- Data formats: JSON, Parquet
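For context, here is a minimal sketch of the loading step this stack implies: staging a Parquet extract into Snowflake with COPY INTO through the Python connector. The database, stage, and table names are hypothetical placeholders, not the project's actual objects.

```python
# Minimal sketch: load a staged Parquet extract into Snowflake via COPY INTO.
# All object names (RAW_DB, LANDING, raw_events, @landing_stage) are hypothetical.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],  # in practice, from a secrets store
    database="RAW_DB",
    schema="LANDING",
)

# COPY INTO tracks load metadata per file, so re-running after a pipeline
# retry skips files that were already loaded.
conn.cursor().execute("""
    COPY INTO raw_events
    FROM @landing_stage/events/
    FILE_FORMAT = (TYPE = PARQUET)
    MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
""")
conn.close()
```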
Team (our side)
Data Engineering Lead, PM, DevOps, QA.
Your Responsibilities
- Design, build, and maintain incremental and full-refresh ELT pipelines (ADF + Azure Functions → Snowflake).
- Develop and optimize Snowflake SQL for the DWH and data marts (Star Schema, incremental patterns, basic SCD2; see the MERGE sketch after this list).
- Build production-grade Python code in Azure Functions for ingestion, orchestration, and lightweight pre-processing.
- Implement and maintain data quality controls (freshness, completeness, duplicates, late-arriving data).
- Support CI/CD delivery for Snowflake objects and pipelines across dev/test/prod (Azure DevOps + DL Sync).
- Contribute to documentation, best practices, and operational standards for the platform.
- Communicate clearly and proactively: status → risk → options → next step, ensuring predictable delivery.
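As referenced in the modeling bullet above, here is a minimal sketch of the watermark-plus-MERGE incremental pattern run through the Snowflake Python connector. All table and column names are hypothetical, and a real pipeline would persist the watermark rather than re-derive it from the target on every run.

```python
# Minimal sketch: watermark-based incremental upsert via Snowflake MERGE.
# Names (stg_customers, dim_customers, updated_at) are hypothetical.
import os

import snowflake.connector

MERGE_SQL = """
MERGE INTO dim_customers AS tgt
USING (
    -- watermark filter: only staged rows newer than what the target already holds
    SELECT customer_id, name, segment, updated_at
    FROM stg_customers
    WHERE updated_at > (SELECT COALESCE(MAX(updated_at), '1900-01-01'::TIMESTAMP)
                        FROM dim_customers)
) AS src
ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN UPDATE SET
    name = src.name, segment = src.segment, updated_at = src.updated_at
WHEN NOT MATCHED THEN INSERT (customer_id, name, segment, updated_at)
    VALUES (src.customer_id, src.name, src.segment, src.updated_at)
"""
# A basic SCD2 variant would instead expire the matched row
# (is_current = FALSE, effective_to = src.updated_at) and insert a new version.

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
)
conn.cursor().execute(MERGE_SQL)
conn.close()
```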
Requirements (Must-have)
- 4+ years in Data Engineering or related roles.
- Strong Snowflake SQL (CTEs, window functions, COPY INTO, MERGE).
- Hands-on experience with incremental loading (watermarks, merge patterns) and basic SCD2 (effective dating / current flag).
- Strong Python (production-ready code), including API integration (pagination, retries, error handling), logging, configuration, and secrets handling; see the ingestion sketch after this list.
- Solid experience with Azure Data Factory (pipelines, parameters, triggers) and Azure Functions (HTTP/Timer triggers, idempotency, retries).
- Understanding of ELT/DWH modeling (Star Schema, fact/dimension design, performance implications of joins).
- CI/CD familiarity: Azure DevOps and automated deployment practices for data platforms (DL Sync for Snowflake is a strong plus).
- Strong communication skills and a proactive, accountable approach to teamwork.
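To illustrate the Python bar set above, here is a minimal ingestion sketch with pagination, retries, and logging using requests. The response shape (an `items` list and a `next` cursor URL) is an assumption made for the example.

```python
# Minimal sketch: paginated API ingestion with retries and logging.
# The response fields "items" and "next" are assumed for illustration.
import logging

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingest")

session = requests.Session()
retry = Retry(total=5, backoff_factor=1.0,
              status_forcelist=[429, 500, 502, 503, 504])
session.mount("https://", HTTPAdapter(max_retries=retry))

def fetch_all(url: str) -> list[dict]:
    """Follow cursor-based pagination until the API returns no next page."""
    records, page_url = [], url
    while page_url:
        resp = session.get(page_url, timeout=30)
        resp.raise_for_status()  # non-retryable errors surface to the caller
        payload = resp.json()
        records.extend(payload["items"])
        page_url = payload.get("next")  # None on the last page
        log.info("fetched %d records so far", len(records))
    return records
```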
Nice to Have
- PySpark (DataFrame API, joins, aggregations; general distributed-processing understanding); see the mini-example after this list.
- Experience with D365 / Salesforce, MS SQL sources, API-based ingestion, and CDC patterns.
- Data governance/security basics, Agile/Scrum, and broader analytics tooling exposure.
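For the PySpark item above, a small illustration of the DataFrame API at roughly the expected level: a broadcast join plus an aggregation over inline, made-up data.

```python
# Minimal PySpark sketch: broadcast join + aggregation on inline, made-up data.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sketch").getOrCreate()

orders = spark.createDataFrame(
    [(1, "A", 100.0), (2, "B", 40.0), (3, "A", 60.0)],
    ["order_id", "customer_id", "amount"],
)
customers = spark.createDataFrame(
    [("A", "acme"), ("B", "globex")],
    ["customer_id", "name"],
)

# Broadcasting the small dimension table avoids a shuffle on the join.
revenue = (
    orders.join(F.broadcast(customers), "customer_id")
    .groupBy("name")
    .agg(F.sum("amount").alias("revenue"))
)
revenue.show()
```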
Selection Process (Transparent & Practical)
Stage 1 — Intro + TA + Short Tech Screen (40–60 min, Zoom):
- project context (multi-project setup, 1–3 month delivery cycles), must-haves for Azure/ELT, a short SQL/Python scenario;
- soft skills & culture-match discussion covering: Proactive communication with stakeholders, Critical thinking & judgment, Problem solving & systems thinking, Ownership & maturity.
Stage 2 — Deep-Dive Technical Interview (75–90 min, with 2 engineers):
Live SQL (CTE/window + incremental load/SCD2 approach), PySpark mini-exercises, Azure lakehouse architecture discussion, plus a mini-case based on a real delivery situation.
No take-home task — we simulate day-to-day work during the session.
What We Offer
- Competitive compensation.
- Learning and growth alongside strong leaders, deepening expertise in Snowflake / Azure / DWH.
- Opportunity to expand your expertise over time across diverse, mission-driven, and AI projects.
- Flexible work setup: remote / abroad / office (optional), gig contract (with an option to transition if needed).
- Equipment and home-office support.
- 36 paid days off per year: 20 vacation days + UA public holidays (and related days off, as applicable).
- Monthly cafeteria benefit: $25 to support your personal needs (learning, mental health support, etc.).
- Performance reviews: ongoing feedback, compensation review after 12 months, then annually.
- Paid sabbatical after 5 years with the company.
P.S. Dear fellow Ukrainians,
we kindly ask you to apply for this role in a professional and well-reasoned manner, clearly highlighting the experience that is most relevant to the position.
If you are unsure whether your background fully matches the requirements, please feel free to mention this openly in your application. This will not reduce your chances of being considered; it helps us review your profile fairly and prioritize candidates based on overall fit for the role.