
Senior Data Engineer
- Hybrid
- New York, New York, United States
- $60 - $65 per hour
- Information Technology
Senior Data Engineer role in NYC (Hybrid, 3 days onsite). Strong experience required in Python, SQL Server, Databricks, Snowflake, ETL, and CI/CD, with recent Banking or Financial Services domain experience.
Job Description
We are seeking a highly skilled Senior Data Engineer to join our team in New York City. The ideal candidate will have strong experience in data engineering, Python development, and modern data platforms. This role involves designing, developing, and maintaining scalable data solutions while working closely with business stakeholders and technical teams.
Location: New York City, NY (Hybrid – 3 days onsite)
Interview Process: In-person interview required in NYC.
Note: Candidates must have recent Banking / Financial domain experience.
Key Responsibilities
Understand and analyze technical specifications and business requirements.
Collaborate with business analysts and business users to gather requirements and deliver data solutions.
Develop applications and data pipelines using Python, SQL Server, Snowflake, and Databricks.
Design and maintain data models and schemas to support data integration and analytics.
Implement data validation and quality checks to ensure data accuracy and consistency.
Perform Unit Testing (UT) and System Integration Testing (SIT) with business analysts.
Support User Acceptance Testing (UAT) with business users.
Provide production support and ongoing maintenance of data platforms and applications.
Job requirements
Required Qualifications
General
12+ years of IT industry experience.
Strong communication and collaboration skills.
Experience working in Agile environments and SDLC processes.
Experience with system design and architecture.
Experience working in global delivery models (onshore/offshore teams).
Strong analytical and problem-solving skills.
Self-driven, collaborative team player able to work with minimal supervision.
Technical Skills
Mandatory Skills
Python
SQL Server and relational database concepts
Azure Databricks
Snowflake
ETL development
Scheduler tools (Autosys / Control-M)
CI/CD pipelines
Nice to Have
PySpark
Experience in financial systems, capital markets, credit risk, or regulatory applications
