EFG Bank AG
Geneva
DataOps Engineer
- Publication date: 18 October 2025
- Workload: 100%
- Contract type: Permanent position
- Place of work: Geneva
Job summary
Join EFG International as a DataOps Engineer in Geneva! Be part of a dynamic team.
Tasks
- Build and optimize data pipelines in our lakehouse ecosystem.
- Ensure reliability and performance of data operations in production.
- Collaborate with teams to deliver impactful data products.
Skills
- Experience in SQL, Python, and CI/CD pipelines required.
- Strong knowledge of data streaming patterns and containerization.
- Familiarity with data governance and security best practices.
About the job
DataOps Engineer
Job Description
- Department: Data Ops
- Work-time percentage: 100%
- Location: Geneva
EFG International is a global private banking group, offering private banking and asset management services. We serve clients in over 40 locations worldwide. EFG International offers a stimulating and dynamic work environment and strives to be an employer of choice.
EFG is committed to providing an equitable and inclusive working environment that is founded on the principle of mutual respect. Joining our team means experiencing a supportive environment, where your contributions are valued and recognised. We strongly believe that the diversity of our teams gives us a competitive advantage by fostering better decision-making and greater innovation.
Our Purpose and Mission
Empowering entrepreneurial minds to create value – today and for the future.
We are a private bank, offering personalised solutions on a global scale to private and institutional clients. Our sustainable success is based on our talents and on how we partner with our clients and communities to create lasting value.
Job Description
We are looking for a motivated and proactive DataOps Engineer to join our team. In this role, you will contribute to the development and operation of our two most important data platforms, ensuring reliable, scalable, and efficient data pipeline operations and product deliveries. You will work closely with data engineers, analysts, and business stakeholders to operate and support our data integration and data lakehouse platforms, delivering trusted data products to the organization in a timely manner.
Main responsibilities:
- Build, operate, and optimize data pipelines and workflows across our lakehouse ecosystem.
- Ensure reliability, scalability, and performance of data operations in production environments.
- Collaborate with cross-functional teams to deliver data products that enable business insights and advanced analytics.
- Implement and maintain CI/CD pipelines for data systems and workflows.
- Act as a proactive problem solver and propose new ideas to improve data architecture and operations.
- Contribute to automation and documentation, and promote good DataOps (DevOps) practices.
- Monitor, troubleshoot, and continuously improve data systems and processes.
- Participate in standby duty.
- Mindset & Soft Skills:
- Strong motivation, curiosity, and willingness to learn.
- Proactive, autonomous, and solution-oriented.
- Ability to take initiative and bring new ideas to the table.
- Technical Skills:
- Strong knowledge of SQL (ideally Microsoft SQL Server).
- Good experience with data streaming patterns.
- Good experience with Python and scripting languages; PySpark is a strong plus.
- Good experience with containerization (Docker, Kubernetes) and cloud platforms.
- Proven experience with CI/CD pipelines.
- Familiarity with lakehouse architectures (Delta Lake, Iceberg, …).
- Understanding of Data Product concepts and modern data management practices.
- Ability to read and understand Java, Apache Camel, Talend ESB, and PowerShell.
- Nice to Have:
- Experience with Microsoft Fabric or Databricks.
- Experience with observability tooling and practices.
- Knowledge of data governance and security best practices.
- Familiarity with BI or analytics tools.
- We offer:
- An opportunity to work with modern data technologies in a collaborative and forward-thinking environment.
- The ability to shape and influence our DataOps practices and architecture.
- Our values:
- Accountability: Taking ownership of tasks and challenges, and seeking continuous improvement
- Hands-on: Being proactive to rapidly deliver high-quality results
- Passionate: Being committed and striving for excellence
- Solution-driven: Focusing on client outcomes and treating clients fairly with a risk-aware mindset
- Partnership-oriented: Promoting collaboration and teamwork. Working together with an entrepreneurial spirit.
- Please make sure to attach a cover letter to your CV when submitting your application.
- Should you wish to apply for this position, please use the application link.