DATA ENGINEER
Lausanne
Key information
- Publication date: 08 December 2025
- Workload: 100%
- Contract type: Permanent position
- Language: English (Intermediate)
- Place of work: Lausanne
Expleo offers a unique range of integrated engineering, quality and strategic consulting services for digital transformation. At a time of unprecedented technological acceleration, we are the trusted partner of innovative companies. We help them develop a competitive advantage and improve the daily lives of millions of people.
Joining Expleo Switzerland means joining 19,000 colleagues in 30 countries:
- Technical and human support for each project, with effective career management
- Training to develop your professional skills
- Participation in special dedicated events
- A dynamic team
As part of the Data Engineering team, you will play a crucial role in maintaining and optimizing data pipelines within the Databricks platform. Your primary responsibilities will encompass addressing evolving business requirements, refining ETL processes, and ensuring the seamless flow of energy data across our systems.
Key Responsibilities:
1. Design, Develop, and Maintain Robust Data Workflows:
• Create and maintain scalable data workflows on Databricks integrated with AWS.
• Collaborate closely with cloud and frontend teams to unify data sources and establish a coherent data model.
2. Ensure Data Pipeline Reliability and Performance:
• Guarantee the availability, integrity, and performance of data pipelines.
• Proactively monitor workflows to maintain high data quality.
3. Collaborate for Data-Driven Insights:
• Engage with cross-functional teams to identify opportunities for data-driven enhancements and insights.
• Analyze platform performance, identify bottlenecks, and recommend improvements.
4. Documentation and Continuous Learning:
• Develop and maintain comprehensive technical documentation for ETL implementations.
• Stay abreast of the latest Databricks/Spark features and best practices, contributing to the continuous improvement of our data management capabilities.
Your profile:
Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Minimum of 5 years' experience as a Data Engineer, with a proven track record of implementing pipelines in Databricks.
• Experience with cloud environments (AWS or Azure) and streaming.
• Previous experience in the energy trading domain is a nice-to-have.
• Ability to work effectively in a fast-paced, collaborative environment.
• Detail-oriented with effective task prioritization skills.
• Fluent English is required; advanced French is mandatory.