KEY SKILLS & BEHAVIOURS REQUIRED
· Technical ability: has a high level of current technical competence in relevant technologies, and is able to independently learn new technologies and techniques as our stack changes.
· Clear communication: can communicate effectively, in both written and verbal forms, with technical and non-technical audiences alike.
· Complex problem-solving ability: structured, organised, process-driven and outcome-oriented. Able to draw on past experience to inform future solutions.
· Passionate about data: enjoys being hands-on and learning about new technologies, particularly in the data field.
· Self-directed and independent: able to translate general guidance and the overarching data strategy into practical next steps.
TECHNICAL SKILLS REQUIRED
· Significant experience designing and building data solutions on a cloud-based, distributed big data system.
· Hands-on software development experience with Python and experience with modern software development and release engineering practices (e.g. TDD, CI/CD).
· Comfortable writing and debugging efficient SQL.
· Understanding of the ML development workflow, and knowledge of when and how to use dedicated hardware.
· Significant experience with Apache Spark or other distributed data processing frameworks (e.g. Flink, Hadoop, Beam).
· Familiarity with Databricks as a data and AI platform, or with the Lakehouse architecture.
· Understanding of data management principles (security and data privacy) and how they can be applied to data engineering processes and solutions — e.g. access management and the handling of sensitive data under regulations such as GDPR.
DESIRABLE
· Experience building real-time data pipelines in a commercial production environment with Spark Streaming, Kafka or Beam.
· Understanding of the challenges faced in the design and development of a streaming data pipeline, and of the different options for processing unbounded data (pub/sub, message queues, event streaming, etc.).
· Understanding of the most commonly used Data Science and Machine Learning models, libraries and frameworks.
· Knowledge of the development lifecycle of analytical solutions using visualisation tools (e.g. Tableau, Power BI, ThoughtSpot).
· Hands-on development experience in the airline, e-commerce or retail industry.
· Experience working within the AWS cloud ecosystem.
· Experience building a data transformation framework with dbt.
What you’ll get in return
· Competitive base salary
· Up to 20% bonus
· 25 days holiday
· BAYE, SAYE & Performance share schemes
· 7% pension
· Life Insurance
· Work Away Scheme
· Flexible benefits package
· Excellent staff travel benefits
About easyJet
At easyJet our aim is to make low-cost travel easy – connecting people to what they value using Europe’s best airline network, great value fares, and friendly service.
It takes a real team effort to carry over 90 million passengers a year across 35 countries. Whether you’re working as part of our front-line operations or in our corporate functions, you’ll find people that are positive, inclusive, ready to take on a challenge, and that have your back. We call that our ‘Orange Spirit’, and we hope you’ll share that too.
Apply
Complete your application on our careers site.
We encourage individuality, empower our people to seize the initiative, and never stop learning. We see people first and foremost for their performance and potential and we are committed to building a diverse and inclusive organisation that supports the needs of all. As such we will make reasonable adjustments at interview through to employment for our candidates.