Data Engineer
PRODYNA (Schweiz) AG
Zürich
Key information
- Publication date: 11 July 2025
- Workload: 100%
- Contract type: Permanent position
- Place of work: Zürich
Job summary
Join PRODYNA as a Data Engineer, where we create innovative software solutions. Be part of a dynamic team focused on digitalization and cloud computing.
Tasks
- Design modern data platforms for large-scale data integration.
- Plan and implement backend services for data utilization.
- Optimize data pipelines and ensure quality on cloud platforms.
Skills
- Several years in software development with data-driven projects.
- Proficient in SQL and programming languages like Python.
- Knowledge of data integration tools like Azure Data Factory.
Data Engineer
At PRODYNA we design, implement, and operate custom software applications for mid- to large-sized enterprises. We're committed to offering our customers innovative and future-proof solutions through digitalization and cloud computing strategies. As a member of the Cloud Native Computing Foundation (CNCF), we promote speed, agility, and scalability in software development.
Your tasks
As a Data Engineer, your responsibilities will include:
- Designing and conceptualizing modern data platforms capable of integrating and processing large volumes of data from systems at scale.
- Planning and implementing backend services using modern frameworks for data provisioning and utilization.
- Independently optimizing data pipelines and integrating them into cloud platforms.
- Implementing tools and solutions to ensure data quality, data cleansing, and provision of aggregated data for analysis.
- Gathering and specifying requirements from our clients.
Your profile
- Several years of experience in software development, particularly in data-driven projects, data warehousing, or data platform construction.
- Knowledge of pipeline development and data integration (ELT/ETL, Azure Data Factory, dbt, stored procedures, or similar tools).
- Familiarity with data modeling and experience with Data Vault, Data Mesh, Kimball, or Inmon methodologies.
- Proficiency in SQL and at least one programming language (e.g., Python, Scala).
- Experience with cloud platforms (e.g., Azure, AWS, or GCP).
- Strong analytical and problem-solving skills.
Nice to have
- Experience with Microsoft Fabric or understanding of the Fabric ecosystem.
- Knowledge of Delta Lake, Lakehouse architecture, and data mesh principles.
- Familiarity with CI/CD for data pipelines.
- Exposure to data cataloging tools (e.g., Purview, Alation).
- Understanding of privacy regulations (e.g., GDPR, CCPA).
Your benefits
- Employee education
- 25 vacation days
- Free hardware selection
- Private health insurance
- Health support and wellbeing
- Team events
- International network
- Fruits in the office
- Employee referral program