Data Engineer (Feeds)

This is a fantastic opportunity to build out and take ownership of end-to-end data pipelines and help to grow a burgeoning team in an industry with highly rewarding challenges.

Where: Brighton, UK / Hybrid
Key area: Data Team
Salary: £40k-£55k DOE
Attendance: Full-Time
About the role

We work primarily with a Python/Airflow/AWS data stack, using Python in our pipelines and Docker to manage our instances. You will have experience with these technologies, as well as fluency with more general data engineering tools and techniques.
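By way of illustration only, here is a minimal sketch of the kind of Airflow pipeline such a stack typically involves: one Python task extracts data from a source and a second task stages it in S3. The DAG id, task names, bucket and feed URL are all hypothetical and are not taken from the 15gifts codebase.

# Illustrative Airflow DAG: extract raw data, then stage it in S3.
# All identifiers below (dag_id, bucket, URL) are placeholders.
from datetime import datetime

import boto3
import requests
from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull raw data from an upstream service (placeholder URL).
    response = requests.get("https://example.com/feed.json", timeout=30)
    response.raise_for_status()
    return response.text


def load_to_s3(**context):
    # Stage the raw payload in S3, keyed by the execution date.
    payload = context["ti"].xcom_pull(task_ids="extract")
    boto3.client("s3").put_object(
        Bucket="example-raw-data",  # hypothetical bucket name
        Key=f"feeds/{context['ds']}.json",
        Body=payload.encode("utf-8"),
    )


with DAG(
    dag_id="example_feed_pipeline",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load_to_s3", python_callable=load_to_s3)
    extract_task >> load_task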

As an individual contributor, your primary focus will be on building and optimising data pipelines within our existing framework, as well as looking for ways to improve the framework itself.

You will be confident in taking functional and non-functional requirements and translating these into technical requirements, from which high-quality data products can be built.

The person we’re looking for
  • Passionate about designing and delivering solutions to complex problems
  • Inquisitive and keen to innovate in the realms of ELT, Data Modelling and Big Data
  • Curious with a desire for continuous learning
  • Friendly with a collaborative attitude to solving problems
  • An excellent communicator
You’ll be responsible for
  • Extracting and ingesting data from a variety of websites and data services
  • Maintaining and improving the 15gifts scraping framework
  • Identifying and designing process improvements, such as redesigning infrastructure for greater efficiency and scalability and automating manual data processes
  • Translating functional and non-functional requirements into technical requirements
  • Designing, building and upgrading clean, well documented data pipelines
  • Writing automated tests to validate data and data pipelines
  • Using version control and performing code reviews
  • Fixing bugs in a timely manner
Skills & Experience

Essential

  • Comfortable with ELT pipelines and the full data lifecycle
  • Comfortable evaluating both business requirements and technical requirements
  • A good understanding of techniques to deal with large datasets
  • Experience with Airflow and Docker (or similar technologies)
  • Strong Python and SQL skills and experience
  • Experience with cloud (AWS or similar)
  • Experience managing data pipelines over time and evolving them to meet new business requirements

Desirable

  • Familiarity with web scraping techniques and technologies
  • Experience with NoSQL databases (such as MongoDB, DocumentDB)
  • Experience with Graph databases (such as ArangoDB, Neo4j)
  • Experience handling real-time data
  • DevOps experience (CI/CD and automated tooling, bash/zsh scripting)
  • Cloud computing with AWS, GCP or Azure

 

Benefits we offer

  • Employee Assistance Programme (confidential counselling)
  • Medicash healthcare scheme (reclaim costs for dental, physiotherapy, osteopathy and optical care)
  • easitBrighton travel scheme (discounted public transport options)
  • Cycle to work scheme
  • Life Insurance scheme
  • 33 days holiday (including bank holidays)
  • Contributory pension scheme
Application deadline: 06/02/2023