Why choose us?
  • You will have a stable, long-term job with us

  • You will have the opportunity to gain experience on exciting, innovative projects

  • We are a young and cohesive team

  • We support your development with language and professional training opportunities

  • We support a healthy lifestyle with our own Sports Association

  • We also offer a home office option

  • We offer attractive benefit packages

By applying, you accept our privacy policy. You can also apply by sending your professional CV by e-mail.

ETL/Hadoop Developer


Primary Responsibilities

• Design and development of ETL and Hadoop/Snowflake applications.
• Undertaking end-to-end project delivery (from inception to post-implementation support), including reviewing and finalizing business requirements, creating functional specifications and/or system designs, and ensuring that the end solution meets business needs and expectations.
• Providing deployment support, including occasional late-hour and weekend work.
• Development of new transformation processes to load data from source to target, and performance tuning of existing ETL code (mappings, sessions) and the Hadoop/Snowflake platform.
• Analysis of existing designs and interfaces and applying design modifications or enhancements.
• Coding and documenting data processing scripts and stored procedures.
• Providing business insights and analysis findings for ad-hoc data requests.
• Testing software components and complete solutions (including debugging and troubleshooting) and preparing migration documentation.
• Providing reporting-line transparency through periodic updates on project and task status.


Required Qualifications

• Bachelor's/Master's degree, preferably in Computer Science or Engineering.
• 3+ years of experience with the technical analysis, design, development, and implementation of data warehousing/data lake solutions.
• Strong SQL programming and stored procedure development skills.
• 2+ years of development experience with Informatica or another ETL tool.
• 2+ years of relational database experience.
• Strong UNIX shell scripting experience to support data warehousing solutions.
• Process-oriented, with a focus on standardization, streamlining, and a best-practice delivery approach.
• Excellent problem-solving and analytical skills.
• Excellent verbal and written communication skills.
• Experience in optimizing large data loads.


Preferred Qualifications

• Understanding of or experience with Hive, Impala, Spark, or Snowflake.
• Experience with Teradata is a big plus.
• Ability to architect an ETL solution and data conversion strategy.
• Exposure to an Agile development environment.
• Knowledge of the TWS Scheduler.
• Strong understanding of the data warehousing domain.
• Good understanding of dimensional modelling.
• A good team player.
