Data Engineer

W0DE2

Department
Information Technology
Level
Mid-Senior Level
Employment type
Full-time

Job Description

Mission:

To build a data platform that helps our researchers and data scientists quickly and accurately answer the world’s hardest questions through quantitative research. 🙂

If you have experience shipping scalable software solutions that handle large data volumes and high throughput, and you can quickly learn new and emerging technologies – scroll down and learn more!

As a Data Engineer, you will contribute to the full development lifecycle, from data to pipelines, and support our colleagues on the Data Science and Research teams.

Your responsibilities will be:

  • Developing a cloud-based ML platform that helps data scientists deliver predictive models
  • Building robust, scalable data-processing pipelines that ensure high data quality and maintainability
  • Creating ETL data flows in a big-data ecosystem
  • Creating data models for data ingestion into the ML platform
  • Maintaining the data-processing pipelines and the machine-learning platform
  • Writing a well-documented, easy-to-maintain codebase
  • Writing unit and functional tests to keep the codebase high quality

Qualifications

What you will need:

  • At least 3 years of data engineering/backend software engineering experience
  • Graduate degree in a quantitative or analytical field with material exposure to coding
  • Fluent English
  • Good communication skills
  • Detail-oriented, with the ability to multitask and work in cross-functional teams using Agile methodologies
  • Ability to work both independently and in a team

Technologies:

Must-haves:

  • 3+ years of experience with Python (data processing, platform, and runtime) and cloud technologies (e.g. AWS)
  • Experience with data stores (databases, data warehouses, data lakes) and processing languages (e.g. SQL, Python, Spark)
  • Understanding of big-data concepts

Advantages:

  • Flask, machine learning, and AI experience
  • Go programming
  • AWS ecosystem (Fargate, S3, SQS, SNS, Lambda, RDS)
  • Docker (containers)
  • ZooKeeper (coordination/management)
  • NGINX (load balancing)
  • Ansible (application deployment)
  • JFrog Artifactory (repository manager)
  • CircleCI (continuous integration)
  • Log and APM tools
  • MongoDB and Redis (NoSQL)
  • PostgreSQL

Additional Information

What we can offer:

  • The chance to solve the most pressing challenges on a global scale
  • Professional development: books, online training
  • Opportunities to attend or speak at conferences
  • Opportunities for travel
  • Flexible working hours
  • Home office opportunity
  • Healthcare benefit package
  • Eyeglasses allowance
  • Excellent salary and compensation package
  • Your choice of working tools (Mac or Windows)
  • Relocation allowance

Place of work: Budapest