Job description
I'm seeking a Data Engineer for our German company to maintain and optimize our Microsoft Fabric data platform.
Key Responsibilities:
- Data Pipeline Maintenance: Ensure seamless operation of data pipelines.
- ETL Processes: Design and optimize complex ETL workflows for integrating diverse data sources.
- Data Engineering: Deliver scalable, efficient data solutions.
Data Sources You'll Work With:
- SQL Databases: Proficiency required.
- External APIs: Handle data extraction from OData services and REST APIs.
Required Skills & Tools:
- Azure Data Factory: Hands-on experience in creating and managing pipelines.
- Microsoft Fabric: Advanced knowledge preferred; experience with Databricks or a similar platform is also acceptable.
- Python: Strong skills for data manipulation and scripting.
- Additional Tools: Experience with MySQL/MariaDB, Google BigQuery, and flat-file formats such as Excel/CSV is a plus.
- Languages: Fluency in English or German.
What We're Looking For:
- Analytical Mindset: Ability to structure complex tasks and ensure high-quality results.
- Collaboration: Strong communication skills and a team-oriented approach.
- Proven Track Record: Demonstrable experience in maintaining and developing BI architectures using Microsoft technologies.
Project Details:
Duration: Long-term engagement (> 12 months)
Estimated Workload: 4-6 hours per day, Monday to Friday
Location: Fully remote work possible