Senior Data Management Professional - Data Quality, Commodities Data
Location: London
Business Area: Data
Ref #: 10049590

Description & Requirements

Bloomberg runs on data. Our products are fueled by powerful information. We combine data and context to paint the whole picture for our clients, around the clock – from around the world. In Data, we are responsible for delivering this data, news and analytics through innovative technology - quickly and accurately. We apply problem-solving skills to identify innovative workflow efficiencies, and we implement technology solutions to enhance our systems, products and processes.

What’s the role?

The Senior Data Management Professional will own the data quality and automation strategy across our commodities data landscape, embedding robust controls and tooling across the full commodities data lifecycle. The role combines strategic direction-setting with hands-on technical leadership, working closely with a distributed team of data professionals and with engineering teams to deliver scalable, automated solutions.

We’ll trust you to:

  • Define and own the data quality and automation strategy for commodities data, aligned to product/commercial outlook and regulatory priorities.
  • Establish a target-state data control and DataOps operating model, clarifying roles across data ops, engineering, and business partners.
  • Set and maintain data quality policies, standards, and targets for critical data domains (market data, reference data, fundamentals).
  • Communicate the strategy effectively and build alignment with partners.
  • Supervise and mentor data team members on data quality best practices, serving as a subject matter expert and thought leader in the field.
  • Build a culture of continuous improvement, automation-first approach, and strong collaboration with engineering and internal users.
  • Work with data ops to translate business data issues into engineering-ready requirements for pipelines, controls, and monitoring enhancements.
  • Partner with data engineering leaders to embed software engineering standards (version control, automated testing, CI/CD) into data quality and data ops workflows.
  • Coordinate with platform and engineering teams to ensure tooling, infrastructure, and observability support proactive detection and resolution of data issues.
  • Design and oversee implementation of automated data quality checks (completeness, accuracy, timeliness, consistency, schema changes) across key pipelines.
  • Define frameworks and guardrails for automated imputation, including when to impute vs. reject, approved methods, and mandatory flagging of imputed values to downstream systems (a minimal illustrative sketch follows this list).
  • Govern the end-to-end data lifecycle for key commodities domains: ingestion from vendors and internal systems, normalization, enrichment, versioning, and controlled distribution to internal endpoints.
  • Own the data quality issue management framework (logging, triage, root-cause analysis, remediation plans, and tracking to closure) with clear accountability across teams.
  • Define and report data quality SLAs and metrics to senior partners, highlighting risk, trends, and remediation progress.
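The items above on automated quality checks and flagged imputation are, in practice, implemented in code, and the posting names Python and SQL as the relevant tools. Purely as a minimal, hypothetical sketch (not Bloomberg's actual tooling), the Python below shows the kind of check-and-flag logic such a role oversees: a completeness and timeliness check over a batch of price records, and a forward-fill imputation that marks the imputed value for downstream consumers. All record shapes, symbols, and thresholds are illustrative assumptions.

# Illustrative sketch only: hypothetical check-and-flag logic for a
# commodities price feed. Names, thresholds, and the record shape are
# assumptions, not a description of Bloomberg's systems.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

@dataclass
class PriceRecord:
    symbol: str
    as_of: datetime
    price: Optional[float]   # None means the value was missing on arrival
    imputed: bool = False    # mandatory flag carried to downstream systems

def check_completeness(records: list[PriceRecord], expected: set[str]) -> set[str]:
    """Return expected symbols that are missing or arrived without a price."""
    seen = {r.symbol for r in records if r.price is not None}
    return expected - seen

def check_timeliness(records: list[PriceRecord], max_age: timedelta) -> list[PriceRecord]:
    """Return records older than the agreed freshness window (SLA breach)."""
    cutoff = datetime.now(timezone.utc) - max_age
    return [r for r in records if r.as_of < cutoff]

def impute_with_flag(record: PriceRecord, last_good_price: float) -> PriceRecord:
    """Carry forward the last good price, but only with the imputed flag set."""
    record.price = last_good_price
    record.imputed = True
    return record

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    batch = [
        PriceRecord("BRENT_FRONT_MONTH", now, 82.15),
        PriceRecord("WTI_FRONT_MONTH", now, None),                        # missing price
        PriceRecord("TTF_FRONT_MONTH", now - timedelta(hours=6), 31.40),  # stale record
    ]
    missing = check_completeness(batch, {"BRENT_FRONT_MONTH", "WTI_FRONT_MONTH", "TTF_FRONT_MONTH"})
    stale = check_timeliness(batch, max_age=timedelta(hours=2))
    print("missing prices:", missing)
    print("stale records:", [r.symbol for r in stale])
    # Imputation is a governed decision: forward-fill and flag, never silently fill.
    wti = next(r for r in batch if r.symbol == "WTI_FRONT_MONTH")
    impute_with_flag(wti, last_good_price=79.90)
    print("imputed:", wti.symbol, wti.price, "flagged:", wti.imputed)

In a production setting, checks of this kind would typically run inside the pipeline's orchestration layer and feed the SLA metrics and issue-management workflow described above.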
You’ll need to have:
Please note: we use years of experience as a guide, but we will consider applications from all candidates who can demonstrate the skills necessary for the role.

  • 4+ years' experience in data management, data operations, or data controls, with a strong track record in data quality ownership; commodities, energy, or trading environments strongly preferred.
  • Demonstrable experience leading and/or mentoring a data team, setting clear priorities, and managing stakeholder expectations at senior levels. 
  • Solid technical grounding: advanced SQL; extensive experience with Python or similar for implementing data checks and imputation logic; familiarity with modern data platforms and data quality/observability tooling.
We’d love to see:

  • Solid understanding of commodities data (market data, curves, fundamentals) and common platforms (cloud, data management systems, workflow tools). 
  • Experience defining data quality and automation strategy and operating models in a complex, regulated environment. 
  • Good understanding of DataOps principles and how to embed them into data engineering workflows to improve reliability and speed of delivery. 
  • Comfortable operating as a bridge between data ops, engineering, and the business, able to move between high-level strategy and hands-on detail when needed. 
  • Experience designing and implementing AI/ML solutions, with a well-defined validation framework.
Does this sound like you?

Apply if you think we're a good match. We'll get in touch to let you know what the next steps are!


Discover what makes Bloomberg unique - watch our podcast series for an inside look at our culture, values, and the people behind our success.
Apply Now