Senior Data Engineer (Python/Spark) (Budapest)

Would you like to work at a company where professional growth goes hand in hand with a strong emphasis on work-life balance? Are you looking for new challenges?

If you're fascinated by the world of airlines and eager to gain insight into how a Senior Data Engineer contributes to the operations of an airline, then this opportunity is for you!

Senior Data Engineer (Python/Spark)

Tasks

As a senior data engineer...

  • You will have the autonomy to assess and decide on the best data architectures based on the unique needs of each business case. 
  • Whether transitioning from legacy data warehouse systems or building new data lake solutions, you will architect and implement scalable, robust data pipelines.
  • You’ll be a vital member of our Business Analytics Support (BAS) team, serving multiple business departments, such as Safety, Security, and Crisis Management.
  • You'll apply your expertise to deliver scalable, future-proof solutions, leveraging both structured and unstructured data.
  • After crafting these solutions, you'll oversee their seamless operation and ensure their sustained performance.
  • You'll do all this within an agile framework, preparing us for an upcoming cloud migration.
  • Your contributions will be crucial in powering data marts and providing actionable insights to our customers through dashboards.
  • You'll coach your junior teammates on technical topics.

Requirements

  • Bachelor's degree or higher in Computer Science, Software Engineering, or another relevant field (relevant work experience may compensate for formal education)
  • 5-10 years of experience building production-grade data processing systems as a Data Engineer

In-depth knowledge of:

  • Cloud platforms such as GCP (must have) and Azure (nice to have)
  • The Hadoop ecosystem
  • Building applications with Apache Spark, including Spark Streaming
  • Wide-column stores such as Apache HBase, including data modelling for column-oriented storage
  • Event streaming platforms such as Apache Kafka
  • Two or more server-side programming languages, e.g. Python, Java, or Scala (Scala nice to have)
  • Common algorithms and data structures
  • Databases and SQL
  • Conceptual, logical, and physical data modelling
  • Continuous integration/continuous deployment (CI/CD) techniques
  • Machine Learning and/or Operations Research concepts (affinity with these is a plus)

Offers

  • Competitive compensation package (cafeteria plan, home office support, bonus, discounted airline tickets)
  • Flexible working hours
  • Option to work from home 4 days a week
  • Professional training opportunities
  • Language courses
  • Motivating, friendly, and supportive team