Senior Data Management Professional - Data Engineering - Foreign Exchange Data
Location: Skillman
Business Area: Data
Ref #: 10039564

Description & Requirements

Bloomberg runs on data. Our products are fueled by powerful information. We combine data and context to paint the whole picture for our clients, around the clock – from around the world. In Data, we are responsible for delivering this data, news and analytics through innovative technology - quickly and accurately. We optimize the value of our data by combining our domain and technical expertise to make our data fit-for-purpose, timely and accurate. We apply our problem-solving skills to identify innovative workflow efficiencies and we implement technology solutions to better manage our data.


Our Team:
Our Foreign Exchange (FX) Data team is seeking a Senior Data Engineer to help us drive the dataset forward. The team provides customers with transparency into OTC markets by generating market-reflective, proprietary FX rates across 180+ currencies globally. We are responsible for maintaining and enhancing Bloomberg composites based on market rates from 1,800 third-party pricing sources. FX pricing is a foundational service and an integral part of many client workflows, including trade execution, asset valuation, risk management, back-office operations, and benchmarking.


What’s the Role?
As a Data Engineer on the FX team, you will be expected to understand the data requirements, specify the modeling needs of datasets, and use existing tech stack solutions to build efficient data ingestion workflows and data pipelines. You will implement technical solutions using programming, machine learning, AI, and human-in-the-loop approaches to make sure our data is fit-for-purpose for our clients. You will work closely with our Engineering partners, our Data Product Manager, and Product teams, so you need to be able to coordinate with multi-disciplinary, regional teams and have experience in project management and stakeholder engagement. You will need to be comfortable working with large datasets and be able to demonstrate strong experience in data engineering.


We trust you to:
  • Build and maintain robust, scalable ETL data pipelines to support the ingestion, transformation, and loading of vast amounts of data from various sources, using Bloomberg tech stack components such as Bloomberg Data Services, Dataflow recipes, and Data Technologies Pipelines, or an equivalent tech stack such as Amazon S3, SQL, and Python Pandas
  • Analyze internal processes to find opportunities for improvement, and engineer efficient and innovative workflows using programmatic machine learning approaches
  • Set up business rules and visualizations to measure and ensure the accuracy, timeliness, and completeness of the FX dataset, using Bloomberg tech stack components such as the Data Science Platform and QlikSense
  • Use your deep understanding of the FX market & data, including trading and analytics workflows, to create comprehensive and transparent solutions that fit the use case of our internal and external clients
  • Collaborate with partners in creating data manipulation frameworks and establishing standard methodologies using Bloomberg's tech stack
  • Apply your proven project management skills to keep all technical projects on track and aligned with the right requirements
  • Be responsive, resourceful, flexible, and an excellent collaborator, partnering with our Product, Technology, and Data Management Lab teams to ensure consistent principles are applied, tools are fit for purpose, and results are measurable
  • Balance the best of technical and product knowledge to craft solutions for customers.

You'll need to have:
  • 4+ years of programming and scripting in a production environment (Python, JavaScript, etc.)
  • 4+ years of experience working with databases using SQL. Experience working with extremely large data volumes is a plus.
  • Strong understanding of data warehousing methodologies, ETL processing and dimensional data modeling.
  • A bachelor's degree or higher in Computer Science, Mathematics, or a relevant data technology field, or degree-equivalent qualifications
  • Exceptional problem-solving skills, numerical proficiency and high attention to detail
  • Ability to work independently as well as in a distributed team environment
  • Ability to communicate and present concepts and methodologies effectively to diverse audiences

We'd Love to See:
  • Data Management Association (DAMA) Certified Data Management Professional or Data Capability Assessment Model (DCAM) certification
  • Experience in crafting and developing data quality metrics and reporting as part of a broader data architecture framework
  • Demonstrable experience in Data Profiling/Analysis using tools such as Python, R, or SQL
  • Knowledge of business intelligence reporting tools such as QlikSense, Power BI
  • Familiarity with Iceberg and Trino solutions 
  • Knowledge of streaming technologies like Kafka

Does this sound like you? Apply if you think we're a good match. We'll get in touch to let you know what the next steps are.
Salary Range: $110,000 - $170,000 USD annually + Benefits + Bonus

The referenced salary range is based on the Company's good faith belief at the time of posting. Actual compensation may vary based on factors such as geographic location, work experience, market conditions, education/training and skill level.


We offer one of the most comprehensive and generous benefits plans available, with a range of total rewards that may include merit increases, incentive compensation [Exempt roles only], paid holidays, paid time off, medical, dental, vision, short- and long-term disability benefits, 401(k) + match, life insurance, and various wellness programs, among others. The Company does not provide benefits directly to contingent workers/contractors and interns.
