Smartbroker

(Sr.) Data Platform Engineer (m/f/d)

Data Engineer · Berlin, Hybrid · 27 April 2026
Our rapidly growing Data, AI & MarTech Department is looking for an experienced Data Platform Engineer. The position is responsible for continuously improving and developing our cloud-based data platform – the heart of Smartbroker's technical data infrastructure for business analytics & insights. Join us and play a major role in promoting and enabling a truly data-driven culture across the organisation!

Job description:
  • Develop and improve our cloud-based data platform for data analytics and business insights using the most innovative data technologies
  • Build end-to-end data pipelines from raw data ingestion to consumable data: prepare and clean structured and unstructured data and develop high-quality data models for advanced analytics and AI use cases
  • Implement data quality monitoring to ensure accuracy and reliability of data pipelines
  • Architect, code, and deploy data infrastructure components
  • Collaborate closely with highly ambitious data engineers and analysts in our growing Data, AI & MarTech Department as well as product technology colleagues
  • Stay up to date with the latest market developments in cloud data architecture and share your knowledge
Your profile:
  • University degree in computer science, mathematics, natural sciences, or a similar field
  • Several years of experience in data engineering and strong know-how in building data-native, robust, scalable, and maintainable data platforms
  • Significant hands-on experience designing and operating data pipelines on cloud-based data platforms (AWS, GCP) using data-native services (S3, Athena, BigQuery…)
  • Experience in data warehousing and containerization, e.g., Kubernetes, Docker…
  • Advanced knowledge of cloud networking & security (IAM, security groups…)
  • Proficient and experienced with Infrastructure as Code
  • Deep understanding of software engineering best practices: requirements specification, version control, CI/CD, testing, deployment, and monitoring of data pipelines and services
  • Excellent SQL skills and strong programming skills in Python, ideally including Airflow and PySpark
  • Strong knowledge in data streaming technologies like Kafka, Kinesis, Flink…
  • Excellent English communication skills, German is a plus
  • Interest in the finance and fintech industry and a sense of humor