Senior Data Management Professional - Data Engineer - Economics Data
Location: London
Business Area: Data
Ref #: 10051525

Description & Requirements

Bloomberg runs on data. Our products are fueled by powerful information. We combine data and context to paint the whole picture for our clients, around the clock - from around the world. In Data, we are responsible for delivering this data, news, and analytics through innovative technology - quickly and accurately. We apply problem-solving skills to identify workflow efficiencies and implement technology solutions to enhance our systems, products, and processes.

Our Team:

The Economics Data team is responsible for onboarding, modelling, maintaining, and improving Economics datasets that are fit for purpose for our clients. Our data supports workflows across the Bloomberg Terminal, BQL, Enterprise, and other Bloomberg products. We manage macroeconomic, government, survey, forecast, time-series, and vendor-supplied datasets. Our focus is to deliver Economics data that is accurate, timely, scalable, well-structured, and ready to use.

What’s the Role:

The Economics Data team is looking for a Senior Data Management Professional - Data Engineer to help modernize our data platform and build scalable, resilient data workflows for critical Economics datasets.

This role is focused on designing, building, and improving data pipelines, workflow orchestration, automation, monitoring, and technical infrastructure. You will reduce technical debt, modernize legacy processes, and embed quality controls directly into data pipelines and systems.

You will work closely with Data, Engineering, Product, and Domain experts to deliver reliable data solutions that improve speed, scalability, observability, and maintainability across the Economics data lifecycle.

We’ll trust you to:
  • Build, maintain, and optimize scalable data pipelines for critical Economics datasets.
  • Modernize legacy workflows, reduce technical debt, and improve performance, reliability, and maintainability.
  • Design automated pipeline controls for validation, monitoring, schema change detection, exception handling, and data integrity.
  • Develop workflow orchestration, alerting, observability, and remediation processes.
  • Translate business and client needs into engineering-ready requirements and scalable technical solutions.
  • Partner with Engineering on platform evolution, architecture, tooling, system design, and reliability.
  • Apply automation, AI, machine learning, or statistical techniques to improve ingestion, enrichment, validation, and monitoring.
  • Own data migrations, workflow redesigns, and technical transformation initiatives.
  • Establish best practices for pipeline design, code quality, testing, documentation, version control, and operational handover.
  • Influence data modelling, metadata, lineage, and lifecycle management practices from a technical implementation perspective.
  • Mentor team members and raise the bar for technical execution, design thinking, and engineering discipline.

You'll need to have:
  • A bachelor’s degree or above in Computer Science, Engineering, Statistics, Mathematics, Economics, Quantitative Finance, or equivalent experience.
  • 4+ years of experience designing and building scalable data solutions, ETL pipelines, data workflows, and monitoring frameworks.
  • Strong hands-on experience with Python or similar programming/scripting languages.
  • Experience querying structured, semi-structured, and unstructured datasets.
  • Experience with workflow orchestration, observability, monitoring, alerting, and scalable architecture design.
  • Ability to analyze, refactor, and modernize legacy systems.
  • Strong understanding of data lifecycle management, data integration, data modelling, data profiling, and data governance.
  • Experience building automated controls and reliability frameworks into data pipelines.
  • Strong communication skills with the ability to collaborate across Data, Engineering, Product, Vendors, and other stakeholders.
*Please note: years of experience are a guide; we will consider applications from all candidates who can demonstrate the skills necessary for the role.

We’d love to see:
  • Experience with Economics, macroeconomic, government, survey, forecast, time-series, or vendor-supplied datasets.
  • Bloomberg Terminal, BQL, Enterprise, or Bloomberg data workflow experience.
  • Experience productionizing AI, machine learning, anomaly detection, NLP, classification, or LLM-assisted workflows.
  • Experience with cloud platforms, CI/CD, automated testing, version control, metadata management, lineage, or modern DataOps practices.
  • Project management experience with Agile delivery, backlog management, JIRA, or similar tools.
  • CDMP certification, or progress toward it, is a plus.

If this sounds like you:
Apply if you think we're a good match! We'll get in touch to let you know the next steps.


Discover what makes Bloomberg unique - watch our podcast series for an inside look at our culture, values, and the people behind our success.