4 Data Engineer Jobs in Südtirol
YOUR RESPONSIBILITIES
- Design and maintain scalable data pipelines and ETL processes
- Develop APIs using Python (e.g. FastAPI) for data access and integration
- Ensure data quality, performance optimisation and security
- Apply knowledge of NoSQL databases
- Collaborate with data scientists and analysts to enable data-driven decisions
- Maintain and document data architecture and best practices
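The data-quality duty above can be made concrete with a minimal validation sketch for one ETL step. This is illustrative only: the field names ("order_id", "amount") are assumptions, not part of the posting.

```python
# Minimal data-quality check for records flowing through an ETL step.
# Field names ("order_id", "amount") are illustrative assumptions.

def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality issues found in one record."""
    issues = []
    if not record.get("order_id"):
        issues.append("missing order_id")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        issues.append("invalid amount")
    return issues

def filter_valid(records: list[dict]) -> tuple[list, list]:
    """Split records into valid rows and (row, reasons) rejects."""
    valid, rejected = [], []
    for record in records:
        problems = validate_record(record)
        if problems:
            rejected.append((record, problems))
        else:
            valid.append(record)
    return valid, rejected
```

Rejected rows are kept together with their reasons so they can be logged or routed to a quarantine table rather than silently dropped.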
Full-time
Bolzano, Milan
07.11.2025
Brands of Oberalp-Group
- Design and implement data pipelines to extract, transform, and load data from various sources, including databases, cloud storage, and APIs.
- Maintain and improve existing data pipelines and models, ensuring they are efficient and reliable.
- Support IT and business colleagues in Business Intelligence and Advanced Analytics modelling.
- Collaborate with data scientists and analysts to support data-driven decision making.
- Identify opportunities to optimize data pipelines and suggest new approaches to improve efficiency and reliability.
- Write and maintain documentation for data pipelines and processes.
- Implement CI/CD processes for data pipelines and work with DevOps teams to ensure smooth deployment and operation (DataOps).
- Drive system problem resolution and root cause analysis
- Ensure and maintain integrity of code base during concurrent development cycles
- Collaborate with experts in a variety of technologies to come up with the best overall solutions
- Collaborate with cross-functional teams to resolve data quality and operational issues
- Identify opportunities for team standardization in coding, deployments, documentation, and related areas, and establish those standards
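The extract-transform-load duties listed above can be sketched as a small stdlib-only pipeline. The table and column names ("items", "sku", "price") are illustrative assumptions; a real pipeline would read from the databases, cloud storage, and APIs the posting mentions.

```python
import csv
import io
import sqlite3

def extract(csv_text: str) -> list[dict]:
    """Extract: parse rows from a CSV source (could equally be an API or DB)."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: normalise identifiers and cast types."""
    return [(r["sku"].strip().upper(), float(r["price"])) for r in rows]

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: upsert into the target table; returns the number of rows written."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS items (sku TEXT PRIMARY KEY, price REAL)"
    )
    conn.executemany(
        "INSERT INTO items (sku, price) VALUES (?, ?) "
        "ON CONFLICT(sku) DO UPDATE SET price = excluded.price",
        rows,
    )
    conn.commit()
    return len(rows)
```

The upsert makes the load step idempotent, so re-running the pipeline after a failure does not duplicate rows, which is one way to keep pipelines "efficient and reliable" as the posting asks.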
Full-time
Bozen, Marostica
04.12.2025
- Manage and optimise deployment, monitoring, and automation processes on Microsoft Fabric
- Support the Data Engineering team in building scalable and reliable data pipelines
- Define and implement CI/CD practices for Fabric components (Lakehouse, Dataflows Gen2, Notebooks, Pipelines)
- Monitor the platform's performance, costs, and security
- Develop supporting tools and scripts (e.g. PowerShell, Python, Terraform/Bicep)
- Document and promote DevOps/DataOps best practices within the team
YOUR RESPONSIBILITIES
- Generation, verification, and consolidation of data and information for operative and strategic decision-making in the Procurement department
- Monitoring of price and market developments
- Creation of reports and monitoring of non-financial targets to be met by the procurement organization
- Continuous development of models and approaches to determine target costs for projects, sourcing categories, components and KPIs
- Verification, safeguarding, and improvement of data quality by adjusting spend data and conducting plausibility checks
- Capturing and consolidating relevant data and information for procurement risk management, in collaboration with the Controlling department
- Ensure items are classified correctly for reporting, compliance, and operational efficiency
- Work closely with Commodity Management to update Supplier-Item Relations in a timely manner
- Perform periodic audits of item master data to identify and resolve discrepancies
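The item-master audit described above amounts to comparing extracts from two systems and reporting discrepancies. A minimal sketch, assuming records keyed by item number with an illustrative "category" attribute (neither the key nor the attribute comes from the posting):

```python
# Minimal item-master audit: compare two extracts keyed by item number.
# The "category" attribute is an illustrative assumption.

def audit_item_master(system_a: dict[str, dict],
                      system_b: dict[str, dict]) -> dict:
    """Report items missing from either extract and attribute mismatches."""
    missing_in_b = sorted(set(system_a) - set(system_b))
    missing_in_a = sorted(set(system_b) - set(system_a))
    category_mismatch = sorted(
        item for item in set(system_a) & set(system_b)
        if system_a[item].get("category") != system_b[item].get("category")
    )
    return {
        "missing_in_b": missing_in_b,
        "missing_in_a": missing_in_a,
        "category_mismatch": category_mismatch,
    }
```

Running such a check on a schedule turns the "periodic audit" duty into a repeatable report whose three buckets map directly to resolution actions.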