Analytics Engineer

Location and Employment Type

Full-time. Remote work available.

Company

Locals is all about social discovery.

Job Description

Working Conditions

Locals.org is a new app with a mission to connect people in real life to enjoy time together. We help our users make new, meaningful connections by hosting and joining real-life, offline experiences.

We are a growing startup with a fully remote team of 40+ and a few thousand active users, and we are looking to hire our first Analytics Engineer. You will work closely with the two-person Personalisation team and take ownership of the data stack.

Responsibilities

Your mission will be to deliver clean, reliable, and timely data to internal stakeholders:

  • Provide relevant data for machine learning engineers to explore, build models and feed signals to personalisation systems (feed ranking, push notifications, friend suggestions, etc.).
  • Enable business users to understand how the product is performing and empower them to do their own data exploration by feeding the right data to BI reporting, product analytics and CRM tools.

Our current data stack: PostgreSQL, Cloud Firestore, Fivetran, Segment, BigQuery, dbt, Amplitude, Metabase. Most of our infrastructure is in GCP. We prefer to use modern managed cloud-based solutions to maximise dev efficiency.

We expect you to take on a blend of data engineering and data analytics tasks to achieve that:

  • Communicate with cross-functional stakeholders to reach alignment on goals.
  • Work with mobile and backend engineering teams to make sure that the right customer data is collected.
  • Ensure that all first- and third-party data is landed in a data warehouse.
  • Build data pipelines to transform the raw data into reusable models.
  • Maintain data documentation and definitions.
  • Implement data quality control with monitoring and alerting.
  • Apply software engineering best practices to analytics code: version control, code reviews, testing, continuous integration.
  • Implement regulatory compliance (GDPR, CCPA, etc.), such as anonymisation of PII.
  • Propose and implement changes in data architecture in collaboration with the Personalisation team.

Requirements

We expect you to:

  • Be proficient in SQL. Write clean and efficient code to extract reusable signals from raw data.
  • Communicate clearly and effectively, including in writing. Be able to write clear proposals, seek alignment, and request and act on feedback.
  • Be proactive, take ownership, and be able to tolerate and resolve ambiguity.

Would be nice to have:

  • Experience working with dbt models to transform data in the warehouse.
  • Good Python skills.
  • Comfort working with git.
  • Experience in building stream processing pipelines (e.g. GCP Dataflow, Spark Streaming, Flink).
  • Experience with implementing workflow orchestration (e.g. Airflow or Prefect).
  • Experience with implementing data quality control (e.g. Great Expectations).
  • Experience in designing and building a data stack.
  • Experience in backend infrastructure design. Understanding of container and orchestration technologies: Docker, Kubernetes.
  • Experience working with data scientists and machine learning engineers, understanding of ML workflows.

Benefits

  • Fully remote work
  • Care bonus for medical insurance, fitness, education, etc.
  • Flexible working hours
  • The project is connected with charity work; we are changing the world for the better

How to Apply

Please send your resume to k.pustovalova@locals.org.
Questions 👉 telegram: @krisspus