Data Engineer
VG Recruiting Agency
Yerevan | Full-time | Hybrid
OVERVIEW
OneMarketData is a leading provider of high-performance data management and analytics solutions for the financial services industry. Our flagship platform, OneTick, built by Wall Street veterans, enables financial institutions to efficiently manage and analyze vast volumes of time-series data—driving faster and smarter decision-making in today’s data-driven landscape.
Headquartered in the United States, OneMarketData has operated its Armenian office (“OMD”) in Yerevan since 2007. The Armenian team is a vital part of our global operations, contributing to the development, enhancement, and support of our core products and services.
Our Cloud Operations and Data Engineering teams play a critical role in the end-to-end data delivery lifecycle—from ingesting raw data from vendors to delivering high-quality datasets to our clients.
Before submitting your application, please review the CONSENT NOTICE FOR HR AND RECRUITING provided by OneMarketData.
JOB DESCRIPTION
We are looking for a skilled and motivated Data Engineer to join our expanding Data Engineering and Cloud Operations team. If you have a strong technical background, enjoy working with data, and thrive on solving complex problems using modern technologies, this role could be a great fit.
You will be responsible for building and maintaining ETL pipelines that process financial market data from over 100 global exchanges. This includes working with technologies like Python, Apache Airflow, and Linux-based systems in a cloud-native environment.
Key Responsibilities
- Design, build, and maintain scalable ETL pipelines for ingesting, transforming, and delivering financial market data
- Develop and manage Airflow DAGs to orchestrate complex data workflows
- Monitor and support production data pipelines, helping the operations team troubleshoot and resolve issues
- Optimize storage and data access patterns for large-scale time-series datasets
- Collaborate with the Operations, Data, and Product teams to ensure timely and accurate data delivery in line with SLAs
- Support initiatives around automation, observability, and performance improvements
- Document pipeline logic, technical workflows, and operational procedures
General Requirements
We are seeking someone with a strong understanding of data pipelines and cloud-based solutions, analytical thinking, and a problem-solving mindset, paired with excellent communication and collaboration skills.
Qualifications
- Bachelor’s degree or higher in Computer Science, Engineering, Mathematics, Physics, or a related discipline
- Hands-on experience with Python for data processing and scripting
- Solid understanding of ETL concepts and experience with tools such as Apache Airflow (or similar)
- Proficiency with Linux-based systems
- Familiarity with cloud platforms (preferably AWS)
- Experience with SQL and relational databases such as PostgreSQL
- Familiarity with DevOps tools and methodologies (Docker, Kubernetes, etc.) is a plus
- Experience with monitoring tools (e.g., Prometheus, Grafana) is a plus
- Strong analytical skills, attention to detail, and a passion for working with data
- A continuous learning mindset and eagerness to explore new technologies
Required candidate level: Senior
Contact information
Qualified candidates are welcome to apply for the job by following this link.
Please clearly mention that you heard about this job opportunity through the VG Recruiting Agency Telegram channel.