Workload: 80 - 100%
Location: Zurich
Main Task
• Design, develop, and maintain data pipelines and workflows using Apache Airflow for efficient data ingestion, transformation, and loading into the CMDB.
• Develop and optimize PL/SQL queries and stored procedures for data manipulation and retrieval within the CMDB environment.
• Utilize NoSQL databases for handling and processing large volumes of configuration data.
• Integrate data from various sources into the CMDB using MuleSoft and other integration platforms.
• Conduct data reconciliation activities to ensure data accuracy and consistency across multiple systems and sources.
• Develop and implement inventory data models based on the Common Information Model (CIM) to accurately represent IT assets and their relationships.
• Design and implement Extract, Transform, and Load (ETL) processes to populate and update the CMDB with accurate and up-to-date information.
• Collaborate with cross-functional teams to understand data requirements and ensure the CMDB meets business needs.
• Troubleshoot and resolve data-related issues, ensuring data integrity and availability.
• Document data processes, data models, and configurations to maintain knowledge and facilitate collaboration.
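The ingest-transform-load responsibilities above can be sketched in plain Python. This is a minimal illustration of the ETL flow an Airflow DAG would orchestrate, not an actual pipeline; the source records and the dict-based CMDB are hypothetical stand-ins.

```python
# Sketch of the extract -> transform -> load flow an Airflow DAG would
# orchestrate for the CMDB. All structures here are illustrative stand-ins.

def extract(source_records):
    """Pull raw configuration items from a (stand-in) source system,
    dropping records without a usable identifier."""
    return [r for r in source_records if r.get("id") is not None]

def transform(records):
    """Normalize attribute names so records fit a CIM-style inventory model."""
    return [
        {
            "ci_id": r["id"],
            "ci_class": r.get("type", "unknown").lower(),
            "name": r.get("hostname", "").strip(),
        }
        for r in records
    ]

def load(cmdb, records):
    """Upsert transformed configuration items into the (stand-in) CMDB."""
    for r in records:
        cmdb[r["ci_id"]] = r
    return cmdb

source = [
    {"id": 1, "type": "Server", "hostname": " app01 "},
    {"id": None, "type": "Server", "hostname": "ghost"},  # dropped: no id
    {"id": 2, "type": "Switch", "hostname": "sw-core"},
]
cmdb = load({}, transform(extract(source)))
print(cmdb[1]["name"])      # "app01"
print(cmdb[2]["ci_class"])  # "switch"
```

In a real deployment each of the three functions would typically become its own Airflow task, so failures can be retried per stage.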
Skills
Skills Must have:
• Proven experience in data engineering and data modeling.
• Scripting languages such as Python, Perl, and others.
• Strong understanding of Service Asset and Configuration Management (SACM) principles and best practices, using systems such as Micro Focus Asset Manager, Peregrine AssetCenter, or similar (not the ITSM part).
• In-depth knowledge of the Common Information Model (CIM) from DMTF.org.
• Proficiency in Apache Airflow for workflow orchestration and automation.
• Experience building web frontends as well as frontend and backend loading mechanisms.
• Knowledge of container solutions such as iKube 2.0 (preferred), Kubernetes, or others.
• Extensive experience with PL/SQL for database operations and data manipulation.
• Experience working with NoSQL databases (e.g., MongoDB).
• Hands-on experience with MuleSoft or other integration platforms.
• Strong data reconciliation and data quality management skills.
• Expertise in inventory data modeling and implementation.
• Solid understanding of Extract, Transform, and Load (ETL) processes using various tooling.
• Basic Anchor Modeling skills.
• Excellent problem-solving and analytical skills.
• Strong collaboration abilities.
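The data reconciliation skill listed above boils down to comparing configuration items across sources and reporting gaps and mismatches. A minimal sketch, using hypothetical inventory data:

```python
# Sketch of data reconciliation between two inventory sources.
# Source names and records are hypothetical stand-ins.

def reconcile(source_a, source_b):
    """Report CIs missing from either source and CIs whose attributes differ."""
    a_ids, b_ids = set(source_a), set(source_b)
    return {
        "only_in_a": sorted(a_ids - b_ids),
        "only_in_b": sorted(b_ids - a_ids),
        "mismatched": sorted(
            ci for ci in a_ids & b_ids if source_a[ci] != source_b[ci]
        ),
    }

discovery = {"app01": {"os": "linux"}, "db01": {"os": "linux"}}
asset_db = {
    "app01": {"os": "linux"},
    "db01": {"os": "windows"},  # attribute conflict with discovery
    "sw01": {"os": "ios"},      # present only in the asset database
}
report = reconcile(discovery, asset_db)
print(report)
# {'only_in_a': [], 'only_in_b': ['sw01'], 'mismatched': ['db01']}
```

The same three-bucket report (missing here, missing there, conflicting) is the usual starting point for data quality dashboards over a CMDB.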
Skills Nice to have:
• Expert community lead for most or all of the system environments to be used
• System and service integration development expertise based on Salesforce MuleSoft
• Experience with infrastructure inventorization of global public cloud provider and hyperscaler services and systems
• Located in Switzerland (or willing to relocate)
• Payrolling