Expleo offers a unique range of integrated engineering, quality and strategic consulting services for digital transformation. At a time of unprecedented technological acceleration, we are the trusted partner of innovative companies. We help them develop a competitive advantage and improve the daily lives of millions of people.
Joining Expleo Switzerland means joining a group of 19,000 people across 30 countries and benefiting from:
- Technical and human support on every project, along with effective career management
- Training to develop your professional skills
- Participation in dedicated special events
- Membership in a dynamic team
As a valued member of the Data Engineering team at a company based in the French-speaking part of Switzerland, you will play a crucial role in maintaining and optimizing data pipelines on the Databricks platform. Your primary responsibilities will include addressing evolving business requirements, refining ETL processes, and ensuring the seamless flow of energy data across our systems.
Key Responsibilities:
1. Design, Develop, and Maintain Robust Data Workflows:
• Create and maintain scalable data workflows on Databricks integrated with AWS.
• Collaborate closely with cloud and frontend teams to unify data sources and establish a coherent data model.
2. Ensure Data Pipeline Reliability and Performance:
• Guarantee the availability, integrity, and performance of data pipelines.
• Proactively monitor workflows to maintain high data quality.
3. Collaborate for Data-Driven Insights:
• Engage with cross-functional teams to identify opportunities for data-driven enhancements and insights.
• Analyze platform performance, identify bottlenecks, and recommend improvements.
4. Documentation and Continuous Learning:
• Develop and maintain comprehensive technical documentation for ETL implementations.
• Stay abreast of the latest Databricks/Spark features and best practices, contributing to the continuous improvement of our data management capabilities.
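For illustration only, the sketch below shows the kind of PySpark workflow this role involves on Databricks with AWS; all bucket paths, table names, and columns are hypothetical placeholders, not references to our actual systems.

# Minimal illustrative sketch: ingest hypothetical energy readings from S3,
# apply a basic quality filter, and write a Delta table on Databricks.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("energy-readings-etl").getOrCreate()

# Ingest raw readings landed in S3 by upstream systems (path is a placeholder).
raw = spark.read.format("json").load("s3://example-bucket/raw/energy_readings/")

# Basic cleansing: drop duplicates and records with missing or negative values.
cleaned = (
    raw.dropDuplicates(["meter_id", "reading_ts"])
       .filter(F.col("value_kwh").isNotNull() & (F.col("value_kwh") >= 0))
       .withColumn("reading_date", F.to_date("reading_ts"))
)

# Persist as a Delta table partitioned by date for downstream consumers.
(
    cleaned.write.format("delta")
           .mode("append")
           .partitionBy("reading_date")
           .saveAsTable("analytics.energy_readings_clean")
)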
Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
Technical Skills:
• Strong expertise in PySpark.
• Proficiency in SQL and scripting languages (e.g., Python).
• Excellent analytical and problem-solving skills.
• Strong communication skills in French (both written and verbal) and fluency in English.
Additional Preferred Skills:
• Familiarity with industry-specific regulations and compliance requirements.
• Previous experience in the energy trading domain is a nice-to-have.
Personal Attributes:
• Ability to work effectively in a fast-paced, collaborative environment.
• Detail-oriented with effective task prioritization skills.
• Demonstrated adaptability and a keen willingness to learn new technologies and tools.
• Strong customer orientation.
Experience:
• Minimum of 5 years as a Data Engineer, with a proven track record of implementing pipelines in Databricks.
• Experience in cloud environments (AWS or Azure) is a plus.
• Fluent English is required; advanced French is mandatory.