Israel

Senior Data Engineer

  • Operations
  • Advanced (5-8 Years)
  • Full-time

Description

Rapyd has unified payments, payouts and fintech on one worldwide platform, and we’re assembling the world’s best team to liberate global commerce. With offices in Tel Aviv, Amsterdam, Singapore, Iceland, London, Dubai, Hong Kong, and the U.S., the opportunities at Rapyd are limitless.

We believe in straight talk, quick decisions, strong execution and elegant solutions. Rapyd is where hard work pays off and careers take off. Join us and let’s build the future of fintech together.

Get the tools to grow globally at www.rapyd.net.

Responsibilities:

You will be responsible for the entire journey of data, from its origin in source systems to its analysis-ready form.

This includes:

  • Design, develop, and implement robust data pipelines using ETL/ELT tools to move data efficiently from various sources into our data warehouse (DWH); a brief illustrative sketch follows this list.
  • Ensure the quality and integrity of data throughout the pipeline by implementing data cleaning, transformation, and validation techniques.
  • Collaborate with stakeholders across the organization, including BI developers and data analysts, to understand their data needs and translate them into technical specifications.
  • Build and maintain the infrastructure for our data platform, ensuring scalability, security, and performance.
  • Develop and implement processes and tools to monitor data quality and identify potential issues.
  • Document data pipelines and processes for clear communication and future maintenance.
  • Help our DE team manage several production pipelines that feed customer-facing systems.
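
To give a concrete flavor of this work, below is a minimal, hypothetical sketch of such a pipeline as an Apache Airflow DAG (TaskFlow API, Airflow 2.4+). It is an illustration only, not Rapyd's actual code: the pipeline, task, and field names are invented placeholders.

    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def example_payments_elt():
        """Illustrative ELT flow: extract -> validate -> load into a DWH."""

        @task
        def extract():
            # Pull raw records from a hypothetical source system.
            return [{"payment_id": 1, "amount": 100.0, "currency": "USD"}]

        @task
        def validate(rows):
            # Simple data-quality gate: fail the run if any row has a
            # missing or negative amount, so bad data never reaches the DWH.
            bad = [r for r in rows if r.get("amount") is None or r["amount"] < 0]
            if bad:
                raise ValueError(f"{len(bad)} rows failed validation")
            return rows

        @task
        def load(rows):
            # Placeholder for a warehouse write (e.g. BigQuery on GCP).
            print(f"Loading {len(rows)} validated rows into the DWH")

        load(validate(extract()))


    example_payments_elt()

In practice the extract and load steps would typically use provider operators or hooks (for example, the Google provider's BigQuery operators, given the GCP environment mentioned in the requirements) rather than plain Python tasks.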

Requirements

  • 5+ years of experience working as a Data Engineer
  • Proven leadership and management skills – a plus
  • Hands-on experience in developing end-to-end ETL/ELT processes – must
  • Excellent SQL skills, including complex queries and stored procedures
  • In-depth knowledge of Python
  • Experience with dimensional data modeling & schema design in data warehouses
  • Experience with orchestration tools (Apache Airflow, Google Dataflow, etc.) – must
  • Strong familiarity with GCP environments – a big advantage
  • Bachelor’s degree in Engineering, Computer Science, Math, or an equivalent numerate/analytical field
  • Strong communication skills and excellent English are a must

Job Candidate Privacy Policy – https://www.rapyd.net/candidate-privacy-policy
