Established in 2017, QCP Capital is a digital assets trading firm and global market maker in options, headquartered in Singapore.
As one of the first digital assets trading firms in Singapore, QCP Capital brings clients deep expertise gained from thriving through multiple market cycles. Our mission is to unlock new opportunities for clients at the forefront of crypto markets by providing institutional-grade liquidity, infrastructure and research.
An active early-stage crypto and blockchain investor, QCP Capital's portfolio includes core trading infrastructure, exchanges, data and token ecosystems. QCP Capital is supported by over 70 professionals in trading, business development, operations, risk and compliance teams. More information can be found at qcp.capital.
Responsibilities
- You will join a small, elite team that works very closely with our research department.
- Design, build and maintain our data architecture (AWS ecosystem) in the area of market data, trade capture and blockchain data.
- Develop and maintain ETL pipelines to ensure our market data is accurate and up to date.
- Develop and maintain ETL pipelines to extract, transform, and load data from blockchain networks into our data warehouse.
- Optimise database performance and ensure data integrity. Implement processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders.
- Troubleshoot and resolve issues with data pipelines, ETL jobs and data quality.
- Work with data scientists, analysts, researchers and other stakeholders to ensure that data is easily accessible and meets their needs
- Collaborate with other technology and research teams to improve data models that feed various tools, automated systems and dashboards to improve data-driven decision making across the organisation
- This position sits within the technology operations team, and all team members are required to join an on-call rotation to address any technology ops issues.
- Ability to work with Apache Parquet and AWS S3/Athena/Glue/Redshift/Kinesis.
- Ability to work under time constraints to push out urgent bug fixes and functionality.
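As an illustrative sketch of the extract-transform-load responsibilities above (the table, column names and sample data here are hypothetical, not taken from this posting), a minimal market-data ETL flow using only the Python standard library might look like:

```python
import csv
import io
import sqlite3

# Hypothetical raw market-data feed; in practice this would arrive from
# an exchange API or an S3 object rather than an inline string.
RAW_CSV = """symbol,price,ts
BTC-USD,64000.5,2024-01-01T00:00:00Z
ETH-USD,3400.25,2024-01-01T00:00:00Z
BTC-USD,not_a_number,2024-01-01T00:01:00Z
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse the raw CSV feed into row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: coerce prices to float, dropping malformed rows."""
    clean = []
    for row in rows:
        try:
            clean.append((row["symbol"], float(row["price"]), row["ts"]))
        except ValueError:
            continue  # a production pipeline would log/quarantine bad rows
    return clean

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: write clean rows into the warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS ticks (symbol TEXT, price REAL, ts TEXT)"
    )
    conn.executemany("INSERT INTO ticks VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")  # stand-in for the real warehouse
load(transform(extract(RAW_CSV)), conn)
n = conn.execute("SELECT COUNT(*) FROM ticks").fetchone()[0]
print(n)  # the malformed BTC row is dropped, leaving 2 clean rows
```

In a real deployment the load step would target the AWS stack named above (e.g. Parquet files on S3 queried via Athena) rather than an in-memory SQLite database; the extract/transform/load separation is the same.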
Requirements
- BS or MS degree in computer science or a related technical field
- 3+ years of development experience in Python or Java
- 3+ years of experience working with databases
- Knowledge of schema design and dimensional data modelling (a technique for organising and structuring data so that it is easy to query and analyse)
- Experience working on data warehouse development
- The team is highly diverse, and we communicate globally in English
- Experience working in an agile team and methodologies
- Strong troubleshooting and documentation skills are essential to succeed in this role
- Comfortable working within the AWS data pipelines and analytics ecosystem
- Knowledge of blockchain and the ability to handle market data are a significant advantage
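To make the dimensional-modelling requirement concrete, here is a minimal star-schema sketch (all table, column names and figures are illustrative assumptions, not taken from this posting): a fact table of trades surrounded by dimension tables, so that slicing and aggregating by instrument or period is a simple join.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Dimension tables: descriptive attributes, one row per entity.
CREATE TABLE dim_instrument (
    instrument_id INTEGER PRIMARY KEY,
    symbol        TEXT,
    asset_class   TEXT
);
CREATE TABLE dim_date (
    date_id INTEGER PRIMARY KEY,
    day     TEXT,
    month   TEXT
);
-- Fact table: one row per trade, keyed to the dimensions.
CREATE TABLE fact_trade (
    trade_id      INTEGER PRIMARY KEY,
    instrument_id INTEGER REFERENCES dim_instrument(instrument_id),
    date_id       INTEGER REFERENCES dim_date(date_id),
    quantity      REAL,
    price         REAL
);
""")
conn.execute("INSERT INTO dim_instrument VALUES (1, 'BTC-USD', 'crypto')")
conn.execute("INSERT INTO dim_date VALUES (1, '2024-01-01', '2024-01')")
conn.execute("INSERT INTO fact_trade VALUES (1, 1, 1, 0.5, 64000.0)")
conn.execute("INSERT INTO fact_trade VALUES (2, 1, 1, 0.25, 64100.0)")

# Analytical queries stay simple: join facts to dimensions and aggregate.
notional = conn.execute("""
    SELECT d.month, i.symbol, SUM(f.quantity * f.price)
    FROM fact_trade f
    JOIN dim_instrument i ON i.instrument_id = f.instrument_id
    JOIN dim_date d       ON d.date_id = f.date_id
    GROUP BY d.month, i.symbol
""").fetchone()
print(notional)  # ('2024-01', 'BTC-USD', 48025.0)
```

The same star-schema shape carries over directly to a warehouse such as Redshift or Athena-over-Parquet; SQLite is used here only to keep the sketch self-contained.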
Benefits
The Environment We Offer
As a growing firm with a tightly-knit team, we respect and listen to all our employees. You will get the chance to make an impact by having your voice heard by everyone, including the management.
Our employees enjoy a high level of autonomy at work. We focus on substance, not form - as long as you can perform, you will be recognized and rewarded. We are also dedicated to supporting our staff and ensuring they develop holistically to maximize their potential in the long-term.
We also provide flexible working arrangements as required, and a casual and fun environment to boot!