Why Choose Us?
    • You will have a stable, long-term job with us.
    • You will have the opportunity to gain experience on exciting, long-term, innovative projects.
    • Work in a multinational environment.
    • Work in a team of great engineers.
    • We support your development with language and professional training opportunities.
    • We offer an attractive benefits package (Cafeteria).


ETL/Hadoop Developer

Primary Responsibilities

• Design and development of ETL and Hadoop/Snowflake applications.
• Undertaking end-to-end project delivery (from inception to post-implementation support), including review and finalization of business requirements, creation of functional specifications and/or system designs, and ensuring that the end solution meets business needs and expectations.
• Providing deployment support, including late-hour and weekend work.
• Development of new transformation processes to load data from source to target, and performance tuning of existing ETL code (mappings, sessions) and the Hadoop/Snowflake platform.
• Analysis of existing designs and interfaces, and application of design modifications or enhancements.
• Coding and documenting data processing scripts and stored procedures.
• Providing business insights and analysis findings for ad-hoc data requests.
• Testing software components and complete solutions (including debugging and troubleshooting) and preparing migration documentation.
• Providing reporting-line transparency through periodic updates on project or task status.


Required Qualifications

• Bachelor’s or Master’s degree in Engineering, preferably Computer Science/Engineering.
• 3+ years of experience with the technical analysis, design, development, and implementation of data warehousing/data lake solutions.
• Strong SQL programming and stored procedure development skills.
• 2+ years of experience developing in Informatica or another ETL tool.
• 2+ years of relational database experience.
• Strong UNIX shell scripting experience to support data warehousing solutions.
• Process oriented, with a focus on standardization, streamlining, and a best-practices delivery approach.
• Excellent problem-solving and analytical skills.
• Excellent verbal and written communication skills.
• Experience in optimizing large data loads.


Preferred Qualifications

• Understanding of or experience with Hive, Impala, Spark, or Snowflake.
• Experience with Teradata is a big plus.
• Ability to architect an ETL solution and data conversion strategy.
• Exposure to an Agile development environment.
• Knowledge of the TWS scheduler.
• Strong understanding of the data warehousing domain.
• Good understanding of dimensional modelling.
• A good team player.


What We Offer

• The opportunity to gain experience on exciting, long-term, innovative projects.
• Flexible working arrangements (core hours and the opportunity to work from home).
• Work in a multinational team/environment.
• A team of great engineers.
• Cafeteria.

By applying, you accept our privacy policy. You can also apply by sending your professional CV by e-mail.
