Responsibilities:
- Maintain and troubleshoot data integration pipelines to ensure stable data flow into AI and analytics systems
- Support model development by assisting with training, validation, and optimization of machine learning workflows
- Conduct data analysis to extract insights and provide clear reports supporting R&D research questions
- Solve technical challenges related to data access, pipeline performance, and software limitations
- Ensure continuity of ongoing projects by aligning closely with the core team and delivering on timelines
- Perform image analysis and prepare datasets required for scientific and ML use cases
- Manage and improve ETL processes to ensure data quality, structure, and availability
- Document workflows, pipeline changes, and analytical steps to ensure clarity and reproducibility

Your profile:
- Academic background in computer science, data science, engineering, or a related quantitative field
- Strong proficiency in Python with expertise in scientific and analytical libraries
- Skilled in SQL and working with relational databases
- Understanding of ETL concepts and practical experience with data pipelines
- Solid foundation in machine learning principles and the model lifecycle
- Ability to perform image analysis for scientific or research applications
- Strong communication and interpersonal skills with the ability to collaborate in a technical team
- Independent, structured problem-solver with a commitment to clear documentation and FAIR data practices

What's offered:
- Opportunity to contribute directly to active R&D projects with immediate real-world impact
- Hands-on involvement in AI, machine learning, and data integration challenges in a scientific environment
- Close collaboration with a small, highly skilled technical team

Your contact: Reference number 863771/1. Phone: +41 44 225 50 00. Email: positionen@hays.ch. Employment type: freelance, project-based.
Azure ML, Azure Databricks) or other cloud solutions. Several years of project experience with data and analytics architectures, plus initial hands-on experience in AI projects (e.g. forecasting, classification, NLP, computer vision). Comprehensive technological expertise and deep domain understanding, gained across numerous projects in a range of industries. Desirable: solid (certified) know-how in agile product and (multi-)project management (e.g.
Tasks:
- Spot potential: analyze customer requirements and develop data-driven solutions, from analytics through machine learning to GenAI.
- Develop strategies: design tailored data & AI strategies for a wide range of use cases.
- Scale solutions: design future-proof architectures for data platforms, ML pipelines, and AI systems.
- Evaluate technologies: test and recommend suitable technologies and support delivery teams with clear roadmaps.
- Grow the business: support pre-sales activities, market analyses, and the expansion of data-driven business areas.
- Shape proposals: contribute your ideas to solution concepts, calculations, and presentations.

Your profile:
- Academic background: successfully completed degree in (business) informatics, computer science, or a comparable qualification.
- Experience: several years of experience designing and implementing modern data platforms, ML solutions, and AI systems.
- Technology skills: solid experience with Azure, Databricks, or AWS, as well as Python, SQL, Docker, CI/CD pipelines, and Infrastructure as Code.
- Domain know-how: deep understanding of modern data architectures, MLOps, and data governance.
- Working style: a combination of conceptual strength, analytical thinking, and a pragmatic hands-on mentality.
- Communication: the ability to convey complex topics clearly and convincingly, in German and English.

What we offer:
- Flexibility: flexible working models with a generous flextime arrangement, either at one of our locations or up to 100% remote within Germany.
- Onboarding & development: your individual onboarding and career development matter to us.
YOUR TASKS
- Design and maintain scalable data architectures and pipelines
- Collaborate with cross-functional teams on data requirements
- Implement data quality and governance processes
- Drive adoption of modern data engineering technologies
- Guide and coach junior data engineers

YOUR PROFILE
- Degree in Computer Science, Engineering, or a related field
- At least 5 years of experience in data engineering, including architecture
- Expertise in ETL, data lakes, and data warehousing
- Strong SQL, SSIS, SSAS, and Azure SQL/Databricks skills
- Experience with CI/CD (Azure DevOps, Git)
- Programming skills in R, Python, or Scala
- Very good English and strong collaboration skills

YOUR BENEFITS
Nordex offers a range of attractive benefits – here is a selection of what you can look forward to.
What makes you stand out
- You hold a degree in Business Informatics, Data Science, Computer Science, Industrial Engineering, Business Administration, or a comparable field.
- You have several years of experience building, operating, or further developing data-driven products, ideally in a sales, marketing, or customer service context.
- You have solid knowledge of designing and developing cloud-based data products such as data warehouses, semantic layers, analytical models, and reporting and analytics solutions.
- You possess strong knowledge of data architectures, data engineering, data science, and data governance.
- You bring experience in leading interdisciplinary teams both functionally and disciplinarily, ideally including Data Engineers, Data Scientists, Product Owners, and Data Governance roles.
- You demonstrate strong analytical and conceptual thinking and the ability to present complex topics in a structured, understandable way.
- You communicate effectively and can bridge different target groups (business & tech).
- You work with strong execution and results orientation while ensuring high data quality.
- You have excellent German and English skills, both written and spoken.
Become part of the successful MBDA Group. WHAT YOU CAN CONTRIBUTE Are you currently studying computer science, computer vision, statistics, or data analytics and interested in an exciting working-student position? Are you interested in various methods from statistics and data mining?
What You'll Do:
Collaborate in an agile, international team
- Work closely with colleagues from Romania, Germany, and Ukraine
- Design, estimate, develop, and implement software solutions aligned with business needs
- Actively communicate progress, risks, and technical decisions to stakeholders

Build scalable data solutions
- Develop agnostic data products within a modern, cloud-native data ecosystem
- Support use cases across BI, Advanced Analytics, AI, and ML
- Translate business requirements into robust technical architectures
- Continuously enhance the performance, quality, and cost-efficiency of solutions
- Proactively suggest improvements and best practices

What makes you stand out
- Degree in Computer Science, Economics, or a comparable qualification
- Minimum 3 years of experience as a BI Engineer or Data Engineer, focused on cloud-based architectures
- Strong expertise in Snowflake and dbt (Data Build Tool)
- Solid knowledge of SQL and data lakehouse architectures; Python is nice to have

Communication is key
- Excellent communication skills in English (written and spoken) are mandatory
- Ability to clearly explain technical concepts to both technical and non-technical stakeholders
- Strong stakeholder management and collaboration skills
- Comfortable working in cross-border, multicultural teams

We look forward to your application and to applicants who enrich our diverse culture!
What makes you stand out
- You hold a degree in Business Informatics, Data Science, Computer Science, Industrial Engineering, Business Administration, or a comparable field.
- You bring experience in Revenue Operations, Sales Operations, Sales Analytics, or a similar commercial analytics role.
- You have a strong understanding of sales processes, pipeline management, forecasting, and revenue metrics, and you can translate them into technical requirements for engineers.
- You are a great stakeholder manager who can talk to sales and engineering alike.
- You are highly proficient in Power BI and experienced in building dashboards that drive action.
- You are comfortable working with CRM data and sales systems (e.g.
Your responsibilities:
In this role your main tasks will include:
- Creation and maintenance of ETL processes, including all necessary BI artefacts in SAP BW/4HANA and SAP Datasphere or Google Cloud BigQuery, as well as the corresponding front-end applications in SAP Analytics Cloud, Analysis for Office, and Google Looker Studio
- Fine-tuning requirements with users and defining the best solution approach in line with the reporting and analytics platform concept
- Training employees in the use of the reporting tools SAP Analytics Cloud and Analysis for Office
- Application monitoring and administration in BW/4HANA and SAP Datasphere
- User maintenance in BW/4HANA and SAP Datasphere

Your profile:
- Education: Bachelor's or Master's degree in Computer Science, Statistics, or Business Informatics/Administration/Economics with a quantitative background, or comparable
- Experience: at least 3 years of BI and process management experience
- Language skills: excellent written and spoken English
- Technical skills: at least 1 year of experience with S/4, BW/4, and Datasphere; at least basic knowledge of SQL; experience using statistical methods and applying them to business processes; experience with Google Cloud, especially BigQuery and Data Studio, is appreciated; project management experience and working with Scrum is a plus
- Soft skills:
  - Business acumen: a quick grasp of business processes is a must
  - Problem-solving: strong ability to work out solutions and a keen eye for detail
  - Communication: excellent written and spoken skills to explain technical topics clearly
  - Attitude: a proactive mindset, taking initiative, and a strong desire to learn and grow your career in database administration
  - Teamwork: ability to work effectively both independently and as part of a larger team