Data Operative
Location: London, Westgate
Posted: 2 days ago
Job Type: Regular
Ref: R088819

Data Operative

Kantar Media

Location: Westgate, W5 1UA 

Department: Kantar, Media Audiences, BARB Service 

Introduction 

As people increasingly move across channels and platforms, Kantar Media’s data and audience measurement, targeting, analytics and advertising intelligence services unlock insights to inform powerful decision-making.

Working with panel and first-party data in over 80 countries, we have the world’s fastest growing cross-media measurement footprint, underpinned by versatility, scale, technology and expertise, to drive long-term business growth for our clients and partners.

BARB, the Broadcasters' Audience Research Board, is responsible for measuring television viewership in the UK on behalf of broadcasters and advertisers. BARB contracts research companies to collect viewing data from a panel of 6,000 homes daily. 

We are seeking a highly motivated Data Operative to join our Data Operations Team in Westgate, London. This role will suit someone whose skills sit between DevOps and Data Engineering. The successful candidate will play a critical role in the daily production of industry-standard data files, as well as the maintenance, security, and ongoing optimization of BARB's data processing infrastructure. Responsibilities include deploying new software releases into the live production environment, ensuring the seamless operation of this essential service for the UK television industry. 

This position offers a unique opportunity to play a vital role in the delivery of critical data services for the UK television industry. If you are a highly motivated and technically adept individual with a passion for data and a flexible, results-oriented mindset, we encourage you to apply. 

Principal Responsibilities  

  • Software Implementation: Install, test, and validate new software releases, collaborating with QC to ensure optimal performance and expected output. 
  • Data Processing Control: Monitor and manage data input into the Atria batch processing system, maintaining data integrity throughout all production stages. 
  • Live Environment Deployment: Manage the deployment of new software releases into the live environment, following rigorous testing and approval protocols. 
  • System Configuration: Implement and test configuration changes on non-production servers, ensuring seamless integration with existing systems. 
  • Data Oversight: Ensure completion of the data collection process as well as the audio matching system.  
  • Quality Assurance: Collaborate closely with the Quality Control (QC) team to ensure adherence to rigorous data standards, from validation to final client file production. 
  • Client File Delivery: Guarantee timely and accurate production and delivery of all client data files, adhering to strict schedules and industry standards. 
  • System Monitoring: Maintain 24/7 vigilance, promptly responding to system alerts, messages, and emails to ensure uninterrupted operations. 
  • Cross-Functional Collaboration: Foster effective communication and collaboration with QC, developers, support teams, and help desk personnel to address issues promptly and efficiently. 
  • Independent Problem-Solving: Assess and resolve complex issues quickly and resourcefully, even under minimal supervision. 

Additional Requirements: 

  • Availability: Due to the 24/7 nature of our operations, the successful candidate must be readily available at short notice, including outside of standard business hours. 
  • Initiative: We seek a proactive individual who takes initiative, identifies opportunities for improvement, and contributes to the ongoing optimization of our data processes. 
  • Adaptability: The ability to embrace change and maintain composure in the face of unexpected challenges is essential for success in this role. 

Technical Skills: 

  • Proven experience with Azure cloud services and Kubernetes. 
  • Linux skills and good command-line experience for navigating the filesystem and working with data files.  
  • Strong proficiency in Python and Bash scripting. 
  • Hands-on experience with Airflow or similar workflow orchestration tools is a plus. 
  • Familiarity with Infrastructure as Code (IaC) principles and tools. 
  • Experience with monitoring and logging tools (e.g., Azure Monitor, Prometheus, Grafana). 
  • Knowledge of Oracle database structures and data manipulation, including PL/SQL.

