Architect, design, build, develop, maintain, and implement new data systems where needed, applying architecture best practices to keep the data infrastructure reliable and scalable.
Responsible for deploying secure and well-tested software that meets privacy and compliance requirements, and for maintaining and improving the CI/CD pipeline.
Proactively identify opportunities to manage data and deliver data solutions.
Evaluate the data architectures available in the market and apply them to develop data solutions that meet business requirements.
Drive the delivery of data products and services into systems and business processes, in compliance with internal and regulatory requirements.
Oversee the review of internal and external business and product requirements for data operations and activity, and suggest changes and upgrades to systems and storage to accommodate ongoing needs.
2. Data Integration:
Review and analyze existing data systems; recommend improvements to maintenance and other factors, automate manual processes, optimize data delivery, and re-design infrastructure for greater scale and performance.
Monitor and perform quality assurance within the system (servers, services, ETL, networking, security).
Design, build, and maintain optimized data pipelines and ETL solutions that support analysis and real-time analytics platforms for critical decision-making; support data and analytics projects; and integrate new data sources into our central data warehouse to meet business requirements. Build and optimize big data architectures for read-and-write performance.
Ensure data assets are organized and stored efficiently so that information is high-quality, reliable, and flexible.
Manage and enhance the data warehouse (DWH), data marts, and BI tools to ensure they operate stably through aligned business, system, and data processes.
Make data accessible to the required personnel so they can easily understand and reach the data needed to make informed business decisions.
Experience building reports, data visualizations, and mappings using data from the data warehouse or other sources is helpful (but not required).
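To illustrate the extract-transform-load work this section describes, here is a minimal sketch in Python. The source data, table name, and cleaning rules are all hypothetical, and an in-memory SQLite database stands in for a real data warehouse:

```python
import csv
import io
import sqlite3  # stands in for the warehouse in this sketch

# Hypothetical raw export from a business application.
RAW_ORDERS = """order_id,amount,currency
1001, 250.00 ,usd
1002,99.50,USD
1003,,usd
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse the raw CSV export into row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: drop incomplete rows, normalize types and casing."""
    clean = []
    for row in rows:
        if not row["amount"].strip():
            continue  # example data-quality rule: skip rows missing an amount
        clean.append({
            "order_id": int(row["order_id"]),
            "amount": float(row["amount"]),
            "currency": row["currency"].strip().upper(),
        })
    return clean

def load(rows: list[dict], conn: sqlite3.Connection) -> None:
    """Load: write the cleaned rows into a warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER PRIMARY KEY, amount REAL, currency TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO orders VALUES (:order_id, :amount, :currency)",
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_ORDERS)), conn)
summary = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(summary)  # (2, 349.5)
```

In a production pipeline the same three stages would typically be orchestrated by a scheduler such as Airflow, with each stage monitored and retried independently.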
Requirements:
Bachelor's degree (or higher) in Computer Science, Software Engineering, Information Technology, MIS, Economics, Finance, Statistics, Mathematics, or a related field, preferably with a background in Financial Services, Banking, Insurance, or Securities.
At least 3 years' experience (typically 5+ years) with Data Engineering, ETL, and analytics tools (R, Python, Pentaho, dbt) and BI & visualization systems (Metabase, Power BI, OBIEE, SAS VA, IBM, Yellowfin, Tableau, Qlik), plus in-depth knowledge of BI technologies and methodologies and their application (e.g., Metabase, Power BI, Tableau, Looker).
Experience building data pipelines from various business applications (Salesforce, Marketo, NetSuite, Workday, Odoo) and databases (MSSQL, Oracle, IBM, SAP, MySQL, AWS Redshift).
Experience designing and implementing large-scale distributed data pipelines and working with cloud infrastructure services such as AWS, GCP, or Azure (preferably AWS), including BigQuery, Dataflow, Amazon EKS, EMR, Docker/ECS, and Redshift.
Experience with LANs or VLANs and Active Directory, and with digital organizing tools like Action Network, ActionKit, or Blue State Digital, is a plus (but not required).
Experience with big data tools: Hadoop, Spark, Kafka (including Spark Streaming and Kafka Streams), Python, Scala, Talend, Azkaban, Luigi, Airflow, etc.
Strong math skills and advanced SQL knowledge, with experience working with relational databases; working familiarity with a variety of databases is a big plus.
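As a rough illustration of the advanced SQL this role calls for, here is a minimal sketch combining a CTE with a window function, run against an in-memory SQLite database. The table and figures are invented for the example, and SQLite 3.25+ (bundled with recent Python versions) is assumed for window-function support:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, month TEXT, amount REAL);
INSERT INTO sales VALUES
  ('north', '2024-01', 100), ('north', '2024-02', 150),
  ('south', '2024-01', 80),  ('south', '2024-02', 120);
""")

# A CTE aggregates per region and month; a window function then
# computes a running total within each region.
query = """
WITH monthly AS (
  SELECT region, month, SUM(amount) AS amount
  FROM sales
  GROUP BY region, month
)
SELECT region, month,
       SUM(amount) OVER (PARTITION BY region ORDER BY month) AS running_total
FROM monthly
ORDER BY region, month;
"""
rows = list(conn.execute(query))
for row in rows:
    print(row)
# ('north', '2024-01', 100.0)
# ('north', '2024-02', 250.0)
# ('south', '2024-01', 80.0)
# ('south', '2024-02', 200.0)
```

The same pattern (CTEs feeding window functions) carries over to the warehouse engines listed above, such as Redshift and BigQuery, with only minor dialect differences.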