
Big Data Engineer

71273

Our opportunity




We are seeking a Big Data Engineer to join our Enterprise Data Analytics & Architecture (EDAA) organization within Group Operations Business Transformation.

Your role


As Big Data Engineer, your main responsibilities will involve:

 

• Developing data processing pipelines using Spark/Scala.

• Developing and performing testing using Spark/Scala.

• Hands-on development and monitoring of the Azure cloud platform and its associated components for data ingestion, transformation, and processing.

• Diagnosing, isolating, and resolving complex problems in the data infrastructure, including performance tuning and optimization.

• Designing and writing programs according to functional and non-functional requirements.

• Developing and maintaining technical documentation.

• Following established configuration/change control processes.

• Generating insights from internal and external data sources using a wide range of tools and methodologies, from building complex data pipelines to applying semantics and/or machine learning.



Your Skills and Experience


As Big Data Engineer, your skills and experience will ideally include:

 

• University-level education or equivalent

• 3+ years of experience working as a Software Engineer

• 2+ years of experience in functional programming (Scala)

• 2+ years of experience with Apache Spark 2.0 (a big plus)

• Knowledge of design patterns

• Ability to write complex SQL queries

• A good understanding of distributed computation

• Bash scripting

• Git

• Experience in model development; in particular, Machine Learning and Natural Language Processing

• Some experience in text classification, entity extraction, clustering & topic modeling

 

Nice to have:

 

• Python

• Experience in R programming

• Experience with Hadoop technologies (e.g. Sqoop, Hive, Flume)

• Continuous Integration / Continuous Deployment (Azure DevOps)

• Experience in data exploration using notebooks (Databricks notebooks)

• Experience in automating routine tasks (e.g. data refreshes) using Bash and Python scripting

• Experience with the Azure cloud

• Experience with Databricks

• Experience with web scraping

 

 

The minimum salary offered is 2,550 EUR gross, plus other pay elements based on your experience.

Apply now »