We need one more BIG thinker!

Data is useless without the skill to analyse it!

If you have this power, read on! We have an exciting opportunity for you, one where you can sharpen your skills even further!

Your tasks would be:

  • Select and integrate Big Data tools to deliver an analytics back end for the Risk Analytics project
  • Design and implement ETL processes to extract, transform, and load massive volumes of data into the Hadoop clusters
  • Deliver analytics solutions on the Hadoop clusters by implementing complex queries that run in parallel with high performance

We need these skills:

  • 2-3 years of relevant experience
  • Solid understanding of HDFS (the Hadoop Distributed File System)
  • Previous experience with data engineering projects and solutions in the Hadoop environment
  • Knowledge of Big Data ETL tools and messaging systems
  • Experience implementing components and queries using the various Spark modules
  • Knowledge of Big Data querying tools
  • Ability to implement functional modules in Python and/or R using Hadoop ecosystem frameworks

We need these technologies:

  • Hadoop Ecosystem
  • Cloudera Hadoop
  • Big Data ETL tools
  • Python or R programming

Nice-to-have technologies:

  • Flume
  • HBase
  • HDFS
  • Hive
  • Kafka
  • Impala
  • MapReduce
  • MongoDB
  • NoSQL
  • Pig
  • Spark

We can offer you:

  • Internal training sessions to get to know individual technology solutions and applications
  • Participation in international projects in collaboration with experts from abroad
  • Competitive salary, cafeteria plan, and other benefits
  • Friendly and modern work environment

Place of work: Budapest

So now you see the big picture! Do you find it exciting enough?

I hope your answer is yes, because we can’t wait to get to know you!

Do you have more professional experience with data? Click here for the senior job opportunity!