Senior Data Engineer (Data Warehouse Team)
London (United Kingdom)
Category: IT SysAdmin / DevOps
etl python amazon-web-services amazon-redshift sql
Senior Data Engineer
Who we are
We’re global, we’re growing and we’re going to need talent to keep up the pace. We’re making payments simpler for over 4m customers worldwide, in over 90 currencies. We’ve been around for 10 years, disrupting the market with a digital payment platform that aims to make sending money abroad as easy as sending a text message.
There are almost 1,200 of us already hard at work and we love welcoming new people. We’ve got offices across the world, from London to Sydney and 15 locations in between - they’re open for business, but right now lots of us are working from home. Want to be part of our global growth story? Read on…
About the role
Working in the Data Warehouse Team, you will be responsible for maintaining and building the next generation data platform for WorldRemit.
You will be in a highly visible role and take on challenges that will allow WorldRemit to scale and stay customer-focused.
As a member of WorldRemit’s Engineering team you will aim high, embrace challenge and always do what’s right; acting with integrity and building trust as you contribute to the company’s technical direction and long term decision making.
Reporting to the Senior Manager, you will:
Write great code. We understand that code is read more often than it is written, is better off tested, and must be maintainable.
Help shape what we build. You'll be working closely with product owners, designers and other engineers to design and refine our work. We work as a team and your input is key.
Own delivery. We're obsessed with shipping value; you'll own work beyond just a pull request. You'll care about bugs, scalability, uptime and other non-functional requirements.
Grow together. You'll review others' work and happily seek feedback on yours to ensure we build a better codebase and sharpen each other's skills.
What we’re looking for from you...
Proven data engineering experience developing scalable and reliable data systems.
Proven experience with at least one major Cloud Data Warehouse Solution (Redshift, Snowflake).
Experience working in a cloud environment preferably AWS.
Experience working with ETL tools such as Fivetran, Singer or Informatica is helpful.
Experience with Kafka, Spark Streaming or other related technologies.
Experience with an orchestration tool such as Airflow or NiFi is highly desirable.
Experience writing PySpark ETL pipelines and performance tuning.
Familiarity with Data Modeling techniques and Data Warehousing standard methodologies and practices.
Strong Python and SQL skills and ability to create queries to extract and build tables.
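As a rough illustration of the Python-and-SQL work the requirements above describe, here is a minimal extract-transform-load sketch using the standard library's sqlite3 module. The table and column names (raw_payments, payments_clean, amount_pence) and the pence-to-pounds conversion are hypothetical examples, not WorldRemit's actual schema.

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Extract raw payment rows, normalise them, and load a reporting table.

    Table names and the minor-to-major currency unit conversion are
    illustrative assumptions for this sketch.
    """
    cur = conn.cursor()
    # Extract: pull raw payment rows from the (hypothetical) source table.
    rows = cur.execute(
        "SELECT id, amount_pence, currency FROM raw_payments"
    ).fetchall()
    # Transform: convert minor units to major units and normalise currency codes.
    transformed = [
        (pid, amount / 100.0, currency.upper())
        for pid, amount, currency in rows
    ]
    # Load: build the reporting table idempotently and insert the cleaned rows.
    cur.execute(
        "CREATE TABLE IF NOT EXISTS payments_clean "
        "(id INTEGER PRIMARY KEY, amount REAL, currency TEXT)"
    )
    cur.executemany(
        "INSERT OR REPLACE INTO payments_clean VALUES (?, ?, ?)", transformed
    )
    conn.commit()
    return len(transformed)
```

In production this kind of step would typically run as a task in an orchestrator such as Airflow, against a warehouse like Redshift or Snowflake rather than SQLite.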
What you’ll get from us
Life assurance of 3 times your salary, should the worst happen.
Pension scheme offering 8% matched contributions.
Private medical and dental care plans.
25 days of holiday plus bank holidays, rising to 28 after 3 years.
Four recharge days per year, one per quarter.
Cycle-to-work scheme.
Various dining and shopping offers.
*Please note: due to remote working at present, some of our benefits may be temporarily suspended. For more information, please get in touch.