
Data analyst (T-Nap)

Location and employment type

Saint Petersburg · Full-time

Company

An international IT company and the strategic IT division of Deutsche Telekom

Job description

Working conditions

We represent Move-On, a new and strategically important program in Deutsche Telekom IT that aims to build an innovative company-wide infrastructure orchestrator for managing a huge set of DT IT communication equipment.

One of the main Move-On projects is T-NAP (Telekom Network Automation Platform).

The goal of the T-NAP project is to automate network infrastructure operations by means of the open-source software platform ONAP ( https://www.onap.org ).
Here are our values and mission:
• We believe in Agile and organize our teams as an Agile Release Train.
• We use state-of-the-art technologies such as cloudification, microservices and open APIs.
• Our platform provides E2E automation for Access Disaggregation, our first customer.
• With O-RAN Town we enable NFV (Network Functions Virtualization) and make Deutsche Telekom independent of classical vendors.
• We believe in open source and are becoming one of the top contributors to the ONAP platform.

We are looking for a new colleague to join the Data Analytics Team in T-NAP.

Responsibilities:

  • Apply DevOps practices for DCAE (Data Collection, Analytics and Events)
  • Work with multiple data sources: collect, clean, study and analyze data in a corporate information environment
  • Visualize the obtained results
  • Create diagrams, graphs and dashboards
  • Develop new features and microservices for data correlation
  • Perform real-time analytics, including AI/ML
  • Cooperate efficiently in distributed international cross-functional teams following Agile methodologies (Scrum, SAFe) and DevOps practices

Requirements:

  • Experience with Big Data as a developer or data analyst
  • Experience with the Elastic Stack ecosystem (Elasticsearch, Logstash and Kibana)
  • Experience with data-stream processing frameworks (such as Apache Kafka Streams, Apache Spark, Storm or Flink)
  • Good knowledge of Python
  • English language at intermediate level or higher