Sr. Data Engineer

Credix

Data Science
Posted on Jul 23, 2025

Sr. Data Engineer 🇧🇷

About Credix

Credix is a FinTech company dedicated to growing businesses in Latin America. Building on our expertise, we now focus on providing a tailored Buy Now, Pay Later (BNPL) solution for B2B transactions in Brazil through our platform, CrediPay. CrediPay was created to help businesses grow their sales and improve their cash-flow efficiency through a seamless, risk-free credit offering. Sellers offer their buyers flexible payment terms at an attractive price point and receive upfront payments. We manage all credit and fraud risk on our clients' behalf, letting them focus on what matters: increased sales and profitability.
Learn more about our team, culture, and vision on our company page.
Why choose Credix?
Become part of a forward-thinking start-up where boldness and a commitment to excellence are paramount, and your personal and professional development is at the forefront.
Work alongside a dedicated team of bright individuals driven by an Olympian mindset to excel in every aspect of our operations. Together, we aim to build with velocity, utilizing innovative embedded finance strategies to expand business operations in Latin America.
Experience a close and supportive work atmosphere where collaboration thrives, wise judgment guides our decisions, and where you have the opportunity to learn, grow, and take on meaningful responsibilities.

About the job

As a Senior Data Engineer, you will be at the heart of Credix's data strategy, designing and building scalable pipelines and infrastructure that empower teams across the company. Your work will enable the Risk team to enhance predictive modeling, streamline data consumption for other departments, and help drive contextual underwriting and data-driven decision-making. You are passionate about leveraging data to solve complex challenges and revolutionize the B2B credit market in Brazil.

Qualifications

Fluent in Portuguese and English, both written and spoken.
Hands-on experience building ETL/ELT pipelines with dbt (must-have) and orchestration tools like Apache Airflow, Cloud Composer, or similar (see the pipeline sketch after this list).
Deep understanding of Google Cloud Platform services (e.g., BigQuery, Cloud Storage, Cloud Run, Dataflow).
Expertise in SQL and Python, with clean, well-documented coding practices.
Familiarity with data warehousing best practices, the medallion architecture, and analytics engineering principles.
Experience working with Terraform or similar infrastructure-as-code (IaC) tools for provisioning data infrastructure.
Bonus: Experience with streaming data ingestion (e.g., Pub/Sub, Kafka, or Dataflow).
Bonus: Familiarity with financial services data (installments, receivables, delinquency, credit scoring, etc.) and regional data sources in Brazil (Serasa, Receita, CNPJ enrichment).
Proactive, detail-oriented, and self-motivated, with a strong commitment to quality and delivery.
Ability to clearly communicate data design trade-offs and mentor junior engineers or analysts in best practices.
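
To make the pipeline qualifications concrete, here is a minimal, hypothetical sketch of the kind of work involved: a daily Airflow DAG (TaskFlow API) that lands a JSON extract in Cloud Storage and loads it into a BigQuery raw layer. The project, bucket, dataset, and the fetch_payments_as_jsonl extractor are illustrative assumptions, not Credix's actual stack.

```python
# A minimal sketch, not Credix's actual pipeline: a daily Airflow DAG that
# lands a JSONL extract in Cloud Storage, then appends it to a BigQuery raw
# table. All resource names below are hypothetical.
from datetime import datetime

from airflow.decorators import dag, task
from google.cloud import bigquery, storage


def fetch_payments_as_jsonl(ds: str) -> str:
    """Hypothetical extractor; in practice this would call the source API."""
    return '{"invoice_id": "demo", "amount_brl": 100.0}\n'


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def ingest_payments():
    @task
    def land_to_gcs(ds=None) -> str:
        # Airflow injects the logical date via the `ds` context variable,
        # so each run writes a date-partitioned, re-runnable landing file.
        blob_path = f"payments/{ds}.jsonl"
        client = storage.Client()
        blob = client.bucket("hypothetical-landing-bucket").blob(blob_path)
        blob.upload_from_string(fetch_payments_as_jsonl(ds))
        return f"gs://hypothetical-landing-bucket/{blob_path}"

    @task
    def load_to_bigquery(uri: str) -> None:
        # Append the landed file to the raw layer; unexpected schema drift
        # surfaces here as a failed load job.
        client = bigquery.Client()
        job_config = bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
            write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        )
        client.load_table_from_uri(
            uri, "my-project.raw_ingest.payments", job_config=job_config
        ).result()  # block until the load job finishes

    load_to_bigquery(land_to_gcs())


ingest_payments()
```

Splitting landing and loading into separate tasks keeps each step idempotent and independently retryable, which is the modularity the role calls for.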

Core Responsibilities

Build and Own Ingestion Pipelines: Design robust, modular pipelines to ingest structured and semi-structured data into Google Cloud Platform (GCP) environments.
Develop Clean, Analytics-Ready Layers: Use dbt to transform raw ingested data into curated datasets optimized for credit risk modeling and business intelligence consumption.
Operationalize the Data Lake: Manage the data lifecycle of our transactional data to support both real-time and historical querying needs.
Metrics & KPI Layer: Create a single source of truth for key business KPIs and credit risk metrics by building reliable and tested data marts.
Implement Data Quality Controls: Deploy automated testing frameworks (e.g., dbt tests, GCP Dataplex) to ensure 90%+ coverage and detect schema drift, nulls, and outliers (see the quality-gate sketch after this list).
Support API & 3rd Party Integrations: Develop ingestion frameworks for external APIs to enrich risk data.
Collaborate Across Functions: Work closely with Credit Risk, Operations, and Product teams to understand analytical needs and translate them into scalable data solutions.
Contribute to Platform Scalability: Design pipelines with reusability and modularity in mind to support onboarding new data sources and future expansion across regions or products.
Maintain Observability: Ensure logging, monitoring, and alerting are implemented across data flows for reliability and debugging (e.g., via GCP Logging, Cloud Monitoring, or third-party tools).
Documentation & Demo Ownership: Create clear, user-friendly documentation and visual diagrams of the data architecture and transformation layers.

What we offer

We believe that collaboration and team spirit thrive best in an in-office environment. Our office provides a vibrant and engaging workspace where team members can connect, innovate, and grow together. With access to our offices in São Paulo or Antwerp, you'll immerse yourself in a culture of innovation and collaboration.
But that's just the beginning - here's what else we offer:
A culture of learning and experimentation: You are encouraged to explore new ideas and technologies.
Competitive salary package: Your hard work deserves recognition, and we ensure you're well-rewarded for your contributions.
Equity stock options plan: Be a part of our journey towards success and share in the rewards.
Paid holidays: Enjoy the flexibility to recharge and rejuvenate.
Bi-annual off-sites: Awesome team building, unforgettable memories, and adventures guaranteed.

How to apply

Timeline

If we are interested in setting up a call, you can expect a reply within about a week of submitting the application form.
While we review every application carefully, we may not be able to respond to everyone personally. Rest assured that it's not because we don't value your interest - it's simply a result of our team being stretched thin as we work hard to bring our vision to life.