Senior Data Management Professional - Data Engineer - Carbon Markets
Location: London
Business Area: Data
Ref #: 10048751
Description & Requirements
Bloomberg runs on data. Our products are fueled by powerful information. We combine data and context to paint the whole picture for our clients, around the clock – from around the world. In Data, we are responsible for delivering this data, news and analytics through innovative technology - quickly and accurately. We apply problem-solving skills to identify innovative workflow efficiencies, and we implement technology solutions to enhance our systems, products and processes.
Carbon-related data has become increasingly crucial for our clients, providing deeper insights into risks and opportunities for corporates and financial players on the global path to net-zero. Our team delivers high-quality, transparent data and analytics on carbon markets — including emissions, carbon credits, renewable energy credits, and related instruments — across the Bloomberg Terminal and Enterprise products.
Our mission is to provide fit-for-purpose, trustworthy carbon data that improves market transparency and empowers clients to make informed decisions about carbon pricing, risk management, and sustainability strategy.
What's the role?
As a Data Engineer within the Carbon Data team, you’ll focus on the technical backbone of our carbon and environmental data products — designing, building, and optimizing the workflows that move data from various sources to our platforms.
You’ll apply sophisticated programming, data modeling, and automation skills to ensure our datasets are consistent, scalable, and ready for downstream use across analytics and applications. Working closely with Product, Engineering, and other data teams, you’ll find opportunities to streamline processes, improve quality, and expand the breadth of carbon-related content.
This role combines hands-on engineering with leadership responsibilities: driving standard methodologies, mentoring peers, and influencing data architecture decisions across the organization.
We'll trust you to:
- Develop, scale, and maintain robust data pipelines and workflows that support ingestion, transformation, and delivery of carbon market data.
- Build and implement scalable data models and processes to ensure accuracy, completeness, and consistency across multiple data sources and systems.
- Provide strategic and technical guidance on transforming and validating complex carbon and related datasets to maintain data integrity throughout processing stages (a brief illustrative sketch follows this list).
- Collaborate with Product, Engineering and Commercial teams and other partners to align data architecture and workflow automation with business and client needs.
- Generate insights through statistical analysis and visualization, communicating trends and anomalies effectively to technical and non-technical partners.
- Lead or contribute to project execution using agile methodologies and supporting tools (e.g., JIRA, QlikSense) to ensure work is aligned, transparent, and on schedule.
- Stay on top of market and technology developments in carbon trading, registry data standards, and environmental data engineering to inform future improvements and generate high-value insights and content.
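
To give a flavour of the work, here is a minimal, purely illustrative sketch of the kind of ingestion-and-validation step described above. It is written in Python with Pandas; the file layout, column names, and thresholds are hypothetical and do not reflect any actual Bloomberg schema or pipeline.

```python
import pandas as pd

# Hypothetical feed of carbon credit issuances; column names are illustrative only.
REQUIRED_COLUMNS = ["project_id", "registry", "vintage_year", "credits_issued"]

def load_and_validate(path: str) -> pd.DataFrame:
    """Load a carbon credit issuance file and apply basic integrity checks."""
    df = pd.read_csv(path)

    # Schema check: fail fast if the source dropped or renamed a column.
    missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
    if missing:
        raise ValueError(f"Missing required columns: {missing}")

    # Consistency checks: vintages must be plausible, issuance volumes non-negative.
    df = df[df["vintage_year"].between(1990, 2100)]
    df = df[df["credits_issued"] >= 0]

    # Deduplicate on the natural key before handing off to downstream systems.
    return df.drop_duplicates(subset=["project_id", "registry", "vintage_year"])
```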
You'll need to have:
*Please note we use years of experience as a guide, but we will certainly consider applications from all candidates who can demonstrate the skills necessary for the role.
- Proven proficiency in programming within development or production environments, applying Python, SQL, or similar data-focused languages to design and maintain large-scale datasets.
- 4+ years of professional experience in software or data engineering, with a solid foundation in data pipeline design, automation, and workflow orchestration.
- Demonstrated experience working with carbon markets, sustainability data, or other financial market datasets in a technical capacity.
- Strong expertise in data management principles and technologies, including data modeling, transformation, and analysis.
- Experience with SQL and NoSQL databases, including schema design, query optimization, and data modeling.
- Excellent problem-solving skills, with a focus on improving efficiency and process scalability.
- Superb communication and interpersonal skills, with the ability to convey complex technical concepts to diverse audiences.
- Effective project management and prioritization skills, ensuring technical initiatives remain aligned with requirements and delivered on schedule.
- Bachelor’s degree or higher in Computer Science, data science, or another STEM field (or equivalent professional experience).
We'd love to see:
- A Master’s degree or professional certification such as DAMA CDMP or DCAM.
- Demonstrated experience in data profiling and analysis using Python (Pandas), SQL, or equivalent tools.
- Demonstrable ability to define and implement data quality metrics as part of a broader data architecture or governance framework (one such metric is sketched after this list).
- Experience with semantic modeling to enhance interoperability across carbon, commodities, and ESG reporting datasets.
- Familiarity with visual analytics and BI tools (QlikSense, Power BI) and agile delivery tools (JIRA, Confluence).
- Understanding of ESG and sustainability data integration, particularly as it relates to carbon pricing, ESG scoring, and climate-related financial disclosures.
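
As a purely illustrative example of the data profiling and quality metrics mentioned above, the sketch below computes per-column completeness with Pandas. The sample data is invented for the example and is not a real dataset.

```python
import pandas as pd

def completeness_by_column(df: pd.DataFrame) -> pd.Series:
    """Share of non-null values per column: a simple data quality metric."""
    return df.notna().mean().sort_values()

# Illustrative usage with an invented carbon price snapshot.
sample = pd.DataFrame(
    {
        "instrument": ["EUA", "CCA", None],
        "price": [72.5, 41.2, 38.9],
        "currency": ["EUR", "USD", "USD"],
    }
)
print(completeness_by_column(sample))
```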
If this sounds like you:
Apply if you think we're a good match. We'll get in touch to let you know what the next steps are, but in the meantime feel free to have a look at this: https://www.bloomberg.com/professional
Discover what makes Bloomberg unique - watch our podcast series for an inside look at our culture, values, and the people behind our success.