
Data Engineer
- Hybrid
- Phoenix, Arizona, United States
- $40 - $45 per hour
- Information Technology
Data Engineer role in Phoenix, AZ (Hybrid) for a 6-month W2 contract. Requires 6-9 years of experience with GCP, Python, SQL, PySpark, Spark, Hive, Big Data, and functional testing.
Job Description
Job Title: Data Engineer
Location: Phoenix, AZ (Hybrid; local candidates only)
Duration: 6-month W2 contract
US Citizens and Green Card holders only
Video Interview
Overview:
We are seeking a skilled and motivated Data Engineer to join our team. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure to support advanced analytics and business intelligence initiatives. This role requires strong expertise in cloud technologies, big data frameworks, and programming languages.
Key Responsibilities
- Develop and maintain robust data pipelines using Python, PySpark, and SQL.
- Design and optimize data workflows on Google Cloud Platform (GCP).
- Implement and manage data storage solutions using Hive and other big data technologies.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Conduct functional testing to ensure data accuracy and reliability.
- Utilize Rally for agile project tracking and task management.
- Ensure data quality, integrity, and security across all systems.
Job requirements
Required Skills & Qualifications
- Proficiency in Python, SQL, and PySpark.
- Hands-on experience with Google Cloud Platform (GCP) services.
- Strong understanding of Hive and big data ecosystems.
- Experience with functional testing in data engineering environments.
- Familiarity with Rally or similar agile project management tools.
- Bachelor’s degree in Computer Science, Engineering, or a related field.
Must Have Skills:
- 6-9 years of experience
- GCP
- Python
- SQL
- Spark
- Rally
- PySpark
- Hive
- Functional Testing
- Big Data