Data Engineer (Media Measurement sphere)


ISsoft has an open position for a Data Engineer to work on a new project in the Media Measurement sphere.

Company Background

Our client is a global leader in digital measurement and analytics, delivering insights on web, mobile and TV consumer behavior that enable clients to maximize the value of their digital investments.

Through its Audience Analytics, Advertising Analytics, and Enterprise Analytics product suites, the client provides its customers with a variety of on-demand software, real-time analytics and custom solutions to succeed in a multi-platform world. The company leverages a world-class technology infrastructure to capture trillions of digital interactions a month and power big data analytics on a global scale for its more than 2,000 clients, which include leading companies such as AOL, Baidu, BBC, Best Buy, Carat, Deutsche Bank, ESPN, France Telecom, Financial Times, Fox, LinkedIn, Microsoft, MediaCorp, Nestle, Starcom, Terra Networks, Universal McCann, Verizon, ViaMichelin and Yahoo!.

Project Description

Our client is a data company. We collect and process more than 60 billion events each day, keep tens of petabytes online, and each month our processes read nearly an exabyte. We use this capability to provide our clients with deep insights that no other company can match. Data Engineering is the team responsible for managing this vast dataset, and we are looking for a Data Engineer to join it.

Essential Duties and Responsibilities:
  • We build data processing pipelines that handle 100+ terabyte datasets;
  • We automate as much as we can so that we can stay focused on writing code;
  • We troubleshoot and quickly resolve issues;
  • We work with Analysts and Data Scientists to design and implement new methodologies.

Previous experience with the following technologies is not required. This is a great opportunity for someone looking for a new challenge.

  • Hadoop – MapR;
  • Apache Pig;
  • Spark;
  • HBase;
  • Java;
  • SQL (CTEs, window functions, UDFs, etc.);
  • An internal framework for job scheduling and execution (similar to Airflow).

Job Requirements

Core Qualifications:


  • 6–10 years of professional programming experience;
  • You have a solid understanding of Computer Science fundamentals;
  • You write good code and take pride in that fact regardless of which language you are currently using;
  • You have a strong affinity towards working with data;
  • You enjoy working as a member of a team and consider feedback a learning opportunity;
  • You are comfortable in an environment that values quickly providing our customers with solutions;
  • You have an innate drive to grow and develop;


The following are considered a plus:


  • You treat performance as a feature and not an afterthought;
  • You have experience writing analytical queries that run on MPP databases;
  • You are comfortable reading query execution plans;
  • You can describe multiple MapReduce join strategies and their tradeoffs;
  • You have experience scaling Machine Learning algorithms in a distributed environment;

Write to us. We will definitely reply!