As our Data Platform Engineer, you will design, provision, and operate the foundational data infrastructure that powers analytics and reporting across the organization. You'll own everything from spinning up our cloud data warehouse to building the initial ETL pipelines that get raw data into shape, laying the groundwork for self-service analytics and data-driven decision making.
What You’ll Bring
Cloud & Infrastructure Expertise
- 5+ years in software or data engineering, with hands-on experience provisioning and maintaining cloud data warehouses (e.g., Snowflake, Amazon Redshift, Google BigQuery)
- Proficiency with Infrastructure-as-Code tools (Terraform, CloudFormation, Pulumi) to automate data platform deployments (a sketch of what this can look like follows below)
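For a concrete flavor, here is a minimal Infrastructure-as-Code sketch in Python using Pulumi's AWS provider to stand up a small Redshift cluster. The resource names, node sizing, and the choice of the `pulumi_aws` provider are illustrative assumptions, not a description of our actual stack.

```python
# Minimal Pulumi sketch: provision a small Redshift warehouse.
# Assumes the pulumi and pulumi-aws packages are installed and AWS
# credentials are configured; names and sizing are illustrative only.
import pulumi
import pulumi_aws as aws

# Pull the admin password from Pulumi config secrets rather than source.
config = pulumi.Config()
master_password = config.require_secret("redshiftMasterPassword")

warehouse = aws.redshift.Cluster(
    "analytics-warehouse",
    cluster_identifier="analytics-warehouse",
    database_name="analytics",
    master_username="platform_admin",
    master_password=master_password,
    node_type="ra3.xlplus",        # sizing is a placeholder assumption
    number_of_nodes=2,
    cluster_type="multi-node",
    encrypted=True,                # encryption at rest
    skip_final_snapshot=True,      # acceptable for a sandbox, not prod
)

pulumi.export("warehouse_endpoint", warehouse.endpoint)
```

Keeping the admin password in Pulumi's secret config rather than in source is the kind of security default we expect by reflex.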
ETL & Data Modeling
- Strong SQL skills and experience building ETL pipelines in Python or Java/Scala
- Familiarity with orchestration frameworks (Airflow, Prefect, Dagster) or transformation tools (dbt); see the DAG sketch after this list
- Solid understanding of data modeling patterns (star, snowflake schemas) and best practices for partitioning, clustering and indexing
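To make the orchestration side concrete, a minimal Airflow sketch of a daily extract-transform-load DAG follows; the DAG name, tables, and task bodies are hypothetical placeholders rather than our actual pipelines.

```python
# Minimal Airflow sketch of a daily ETL DAG using the TaskFlow API.
# The table names and the extract/transform logic are hypothetical
# placeholders; only the orchestration shape is the point here.
from datetime import datetime

from airflow.decorators import dag, task


# "schedule" is the Airflow 2.4+ spelling; older versions use schedule_interval.
@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_daily():
    @task
    def extract() -> list[dict]:
        # Placeholder: would pull yesterday's orders from the source system.
        return [{"order_id": 1, "amount_cents": 1999}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Placeholder: cast, deduplicate, and conform to the target schema.
        return [{**r, "amount_usd": r["amount_cents"] / 100} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder: would merge the rows into the warehouse fact table.
        print(f"loaded {len(rows)} rows")

    load(transform(extract()))


orders_daily()
```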
Platform Reliability & Security
- Experience implementing data quality checks, monitoring (Datadog, Prometheus, Monte Carlo) and alerting for pipelines (a minimal quality-gate sketch follows this list)
- Knowledge of data governance and security—IAM, encryption at rest/in transit, PII redaction
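As a sketch of what we mean by data quality checks, here is a plain-Python quality gate over a DB-API connection; the connection object, table name, columns, and thresholds are all illustrative assumptions.

```python
# Minimal sketch of a pipeline data quality gate: fail loudly when
# volume or null-rate thresholds are breached. The connection object,
# table name, and thresholds are illustrative assumptions, and a real
# version would not interpolate table names from untrusted input.
def run_quality_checks(conn, table: str = "analytics.fact_orders") -> None:
    cur = conn.cursor()

    # Check 1: the table actually received rows for the latest load date.
    cur.execute(f"SELECT COUNT(*) FROM {table} WHERE load_date = CURRENT_DATE")
    (row_count,) = cur.fetchone()
    if row_count == 0:
        raise ValueError(f"{table}: no rows loaded for today")

    # Check 2: a key column's null rate stays under 1%.
    cur.execute(
        f"SELECT AVG(CASE WHEN customer_id IS NULL THEN 1.0 ELSE 0 END) "
        f"FROM {table}"
    )
    (null_rate,) = cur.fetchone()
    if null_rate > 0.01:
        raise ValueError(f"{table}: customer_id null rate {null_rate:.2%}")
```

Run as the final task of a pipeline, a gate like this fails the run instead of silently publishing bad data downstream.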
Collaboration & Documentation
- Excellent communicator—able to partner with analytics, product and engineering teams to translate requirements into scalable solutions
- Strong documentation skills to define data platform standards, runbooks and onboarding guides
What You’ll Own
Data Warehouse Provisioning
- Architect and spin up our production and sandbox data warehouse environments using IaC
- Define resource sizing, scaling policies and cost-optimization strategies (one possible encoding of such policies is sketched below)
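One way sizing and cost policies can be encoded in code, shown here with Pulumi's Snowflake provider so production and sandbox differ only by stack config; the provider choice and every parameter value are assumptions for illustration, and argument shapes vary somewhat across provider versions.

```python
# Sketch: encode warehouse sizing and cost controls as IaC so prod and
# sandbox differ only by configuration. Assumes the pulumi-snowflake
# provider; all sizes and timeouts are illustrative placeholders.
import pulumi
import pulumi_snowflake as snowflake

env = pulumi.get_stack()  # e.g. "prod" or "sandbox"

warehouse = snowflake.Warehouse(
    f"transform-wh-{env}",
    warehouse_size="LARGE" if env == "prod" else "XSMALL",
    auto_suspend=60,        # suspend after 60s idle: the main cost lever
    auto_resume=True,       # wake transparently on the next query
    initially_suspended=True,
)
```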
ETL Pipeline Development
- Build and deploy the first wave of ETL pipelines to ingest transactional, event and third-party data
- Implement transformations, schema migrations and incremental workflows (an incremental-load sketch follows this list)
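By way of example, a watermark-based incremental load sketched in Python over a DB-API connection; the table and column names are hypothetical, the MERGE syntax assumes a warehouse that supports it (Snowflake and BigQuery both do), and the parameter style follows the snowflake-connector-python convention.

```python
# Sketch of an incremental (watermark-based) load: only rows newer than
# the last successful load are merged. Table and column names are
# hypothetical placeholders.
def incremental_load(conn, since: str) -> None:
    conn.cursor().execute(
        """
        MERGE INTO analytics.fact_orders AS tgt
        USING (
            SELECT order_id, customer_id, amount_usd, updated_at
            FROM staging.orders
            WHERE updated_at > %(since)s   -- watermark from the last run
        ) AS src
        ON tgt.order_id = src.order_id
        WHEN MATCHED THEN UPDATE SET
            tgt.amount_usd = src.amount_usd,
            tgt.updated_at = src.updated_at
        WHEN NOT MATCHED THEN INSERT
            (order_id, customer_id, amount_usd, updated_at)
            VALUES (src.order_id, src.customer_id, src.amount_usd, src.updated_at)
        """,
        {"since": since},
    )
    conn.commit()
```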
Platform Operations & Monitoring
- Embed data quality tests and SLA tracking into every pipeline
- Set up dashboards and alerts to proactively detect failures, latency spikes or data drift (a metric-emission sketch follows this list)
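As one possible building block for that alerting, here is a sketch that emits per-run pipeline metrics through the datadog library's DogStatsD client; the metric names, tags, and agent address are illustrative assumptions.

```python
# Sketch: emit pipeline health metrics so dashboards and monitors can
# alert on failures, latency spikes, or data drift. Assumes the datadog
# package and a local DogStatsD agent; metric names are placeholders.
import time

from datadog import initialize, statsd

initialize(statsd_host="localhost", statsd_port=8125)


def report_run(pipeline: str, started_at: float, rows: int, ok: bool) -> None:
    tags = [f"pipeline:{pipeline}"]
    # Wall-clock latency of the run, for spotting latency spikes.
    statsd.histogram("etl.run.duration_seconds", time.time() - started_at, tags=tags)
    # Row volume: a cheap proxy for upstream data drift.
    statsd.gauge("etl.run.rows_loaded", rows, tags=tags)
    # Success/failure counter to drive failure-rate monitors.
    status = "ok" if ok else "failed"
    statsd.increment("etl.run.completed", tags=tags + [f"status:{status}"])
```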
Standards & Enablement
- Establish coding conventions, pipeline templates and best practices for all future data projects
- Mentor other engineers and analysts on using the data platform, from query optimization to CI/CD for analytics code
Cross-Functional Partnership
- Work closely with BI/analytics teams to understand reporting needs and iterate on data models
- Collaborate with application engineers to identify new data sources and automate their onboarding
About Us
Alternative Payments is a leading payments platform for service-based companies. We enable businesses to automate the entire accounts receivable process and reduce the overall cost of payment processing. We are dedicated to fostering growth and providing exceptional value to our clients.
Our Values
Transparency & Honesty: We operate transparently with our customers, investors, and other partners, every step of the way.
Dependability: We are dependable. We do what we say we are going to do and we do not cut corners.
Partnership: We are partners to our customers, investors, and each other, and we work together to solve exciting, massive problems.
Revolutionary & Bold: We are revolutionary and bold. We break down barriers and walls to rebuild our own in a stronger, safer, and simpler way.
Diversity & Inclusion: We work together with people of all backgrounds and seek different viewpoints to generate stronger partnerships and create a stronger, more inclusive company and world.
How to Apply
Apply via this link: https://www.alternativepayments.io/careers/?ashby_jid=a5e32e9d-dce0-4b04-beba-ae43cf2aeb76
We only accept candidates based in Brazil, with résumés in English.
Labels
Work Arrangement
- Remote
Contract Type
- PJ (Brazilian independent contractor)
Level
- Senior
- Specialist