| General Information | |
| Country: | Poland, Ukraine |
| City: | unknown |
| Employer: | TechMagic |
| Field: | Data Science & Analytics |
| Contract type: | Full-Time |
| Salary: | from |
| Job Description | |
Part-time (2–3 days/week) | Remote | Long-term

A fast-growing Swiss-based consulting and technology company specializing in data analytics, AI implementations, and ERP systems is looking for a BI Consultant / Lead to strengthen its team. The company builds intelligent data and analytics solutions for international clients, combining modern cloud technologies with open-source flexibility. Founded in 2021 and headquartered in Switzerland, it partners with organizations across industries to help them unlock the value of their data, from data integration and modeling to analytics and AI-driven insights. You'll join a dynamic environment where experts in Azure Fabric, data engineering, and BI work side by side with business stakeholders to deliver measurable business impact.

We mainly run two stacks:
• Azure Fabric: OneLake/Lakehouse, Fabric/Synapse pipelines, SQL endpoints, Power BI.
• Open-Source: Kafka, Apache NiFi, Iceberg, dbt, ClickHouse, Postgres.

Scope/impact:
• Own the conversation with business, shape the analytics roadmap/KPIs, and define the semantic model.
• Build the critical pieces hands-on or steer our (very capable) data engineering team end-to-end.
• Translate questions into robust data models, data contracts, and reliable dashboards; set QA/governance standards.

Must-haves (either track):
• Client-facing BI leadership; ability to frame problems and land simple, valuable solutions.
• Strong SQL and dimensional modelling (star/snowflake); comfortable defining metrics/semantics.
• Backlog writing for engineers (dbt tasks/ELT, data contracts, acceptance criteria).

Azure Fabric track (nice-to-haves become musts if choosing this track):
• Fabric/Synapse pipelines, Lakehouse/OneLake, SQL endpoints.
• Power BI (dataflows, DAX, tabular modeling, Row-Level Security).
• CI/CD for BI (deployment pipelines, versioning, environments).
Open-Source track (nice-to-haves become musts if choosing this track):
• Kafka event streams, Apache NiFi flows (ingest/enrichment/routing).
• dbt (models/tests/docs), ClickHouse for analytics at scale; familiarity with OLAP engines.
• Apache Superset (dashboards, permissions) and basics of orchestration (e.g., Airflow/Argo): nice to have.

Cross-cutting pluses:
• Python for data wrangling; testing (dbt tests, unit/contract tests).
• Data quality & governance (SLAs, monitoring, lineage); GDPR awareness.
• Finance analytics background (P&L, margin, cash, working capital) is a strong plus.

Work format: B2B contract, paid days off (holidays, sick leave, public holidays), budget for a laptop.
| Candidate Qualifications | |
| unknown | |
| Contact | |
Found on: Jobicy.com