Senior Software Engineer - Distributed Systems
Decentriq
- Zurich
- Permanent contract
- Full-time
- Own, Design & Operate Data Pipelines – Take full responsibility for all pandas- and Spark-based pipelines, from development through production and monitoring.
- Advance our ML Models – Improve and productionise models for AdTech use-cases such as lookalike modelling, audience expansion, and campaign measurement.
- Engineer for the Invisible – Because data inside confidential enclaves is literally invisible (even to root), build extra-robust validation at the data source, exhaustive test coverage, and self-healing jobs to guarantee reliability.
- Collaborate Cross-Functionally – Work closely with data scientists, backend engineers (Rust), and product teams to ship features end-to-end.
- AI-Powered Productivity – Leverage LLM-based code assistants, design generators, and test-automation tools to move faster and raise the quality bar. Share your workflows with the team.
- Drive Continuous Improvement – Profile, benchmark, and tune Spark workloads, introduce best practices in orchestration & observability, and keep our tech stack future-proof.
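Because data inside a confidential enclave cannot be inspected after ingestion, validation has to happen at the source. A minimal sketch of what such a pre-ingestion check might look like in pandas (column names and rules are illustrative assumptions, not Decentriq's actual data model):

```python
import pandas as pd

# Hypothetical schema for an audience-data feed; names are illustrative only.
EXPECTED_COLUMNS = ["user_id", "segment", "score"]

def validate_source(df: pd.DataFrame) -> list[str]:
    """Return a list of validation errors; an empty list means the frame is clean.

    These checks run *before* the data enters the enclave, since it
    cannot be inspected afterwards.
    """
    errors = []
    missing = set(EXPECTED_COLUMNS) - set(df.columns)
    if missing:
        errors.append(f"missing columns: {sorted(missing)}")
        return errors  # later checks require these columns to exist
    if df["user_id"].isna().any() or df["user_id"].duplicated().any():
        errors.append("user_id must be unique and non-null")
    if not df["score"].between(0.0, 1.0).all():
        errors.append("score must lie in [0, 1]")
    return errors

clean = pd.DataFrame(
    {"user_id": ["a", "b"], "segment": ["x", "y"], "score": [0.2, 0.9]}
)
dirty = pd.DataFrame(
    {"user_id": ["a", "a"], "segment": ["x", "y"], "score": [0.2, 1.5]}
)
```

In production such checks would typically gate the pipeline: a non-empty error list rejects the batch before it is encrypted and uploaded, which is the only point where a human can still see why it failed.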
- (Must have) Bachelor/Master/PhD in Computer Science, Data Engineering, or a related field and 5+ years of professional experience.
- (Must have) Expert-level Python plus solid hands-on experience with pandas, PySpark/Scala Spark, and distributed-data processing.
- (Must have) Proven track record building resilient, production-grade data pipelines with rigorous data-quality and validation checks.
- (Must have) Experience running workloads in Databricks, Spark on Kubernetes, or other cloud/on-prem big-data platforms.
- (Plus) Working knowledge of the ML lifecycle and model serving; familiarity with techniques for audience segmentation or lookalike modelling is a big plus.
- (Plus) Exposure to confidential computing, secure enclaves, homomorphic encryption, or similar privacy-preserving tech.
- (Plus) Rust proficiency (we use it for backend services and compute-heavy client-side modules).
- (Plus) Data-platform skills: operating Spark clusters, job schedulers, or orchestration frameworks (Airflow, Dagster, custom schedulers).
- Join Decentriq's Engineering team as an individual contributor and take on growing responsibilities.
- The opportunity to create, shape, and benefit from a young company.
- An amazing and fun team that is distributed all over Europe.
- Competitive salary.
- Plenty of opportunities for self-development.