AWS Engineer

Job Type: Contract
Work Flexibility: Hybrid
Location: Austin, TX; Seattle, WA
Required Skills: AWS Infrastructure, CloudFormation, EC2, EMR, Glue, Redshift, S3, SageMaker

Role: AWS Engineer
Location: Seattle, WA/Austin, TX
Duration: 6+ months
Pay Rate: $60 to $70

(Multiple roles, multiple levels)

Summary: We are seeking a highly skilled AWS Engineer to join a cutting-edge data platform team. The ideal candidate will have deep experience in AWS infrastructure, data lake architecture, and large-scale data pipeline development. This role demands hands-on expertise in AWS services such as Glue, EMR, Redshift, S3, and SageMaker, along with strong SQL, Python, and PySpark skills.
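
To give candidates a concrete sense of the day-to-day work, the short PySpark sketch below shows the flavor of a typical data lake transformation: reading raw events from S3, rolling them up, and writing partitioned Parquet back to a curated zone. Every bucket name, path, and column here is a hypothetical placeholder, not a detail of the actual platform.

    # Minimal sketch only: bucket names, paths, and columns are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily-event-rollup").getOrCreate()

    # Read raw JSON events from a hypothetical S3 landing zone.
    events = spark.read.json("s3://example-landing-bucket/events/")

    # Roll events up to one row per user per day.
    daily = (
        events
        .withColumn("event_date", F.to_date("event_timestamp"))
        .groupBy("user_id", "event_date")
        .agg(F.count("*").alias("event_count"))
    )

    # Write partitioned Parquet to the curated zone of the data lake.
    daily.write.mode("overwrite").partitionBy("event_date").parquet(
        "s3://example-curated-bucket/daily_user_events/"
    )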

Key Responsibilities:

  • Architect, develop, and maintain scalable AWS-based data lake and ETL/ELT solutions.
  • Leverage AWS Glue, EMR, CloudFormation, Development Endpoints, S3, Redshift, and EC2 to build distributed and secure data platforms (a minimal Glue job sketch follows this list).
  • Set up and optimize Jupyter/SageMaker Notebooks for advanced analytics and data science collaboration.
  • Develop robust data pipelines using Spark clusters, ensuring performance, fault-tolerance, and maintainability.
  • Build connectors to ingest and process data from distributed sources using various integration tools and frameworks.
  • Write efficient, production-grade SQL, Python, and PySpark code for data transformation and analysis.
  • Lead proof-of-concept (PoC) efforts and scale them into production-ready systems.
  • Stay current with emerging data and cloud technologies, offering guidance on how to apply them effectively to solve complex technical and business challenges.
  • Collaborate with cross-functional teams, including data scientists, analysts, and product stakeholders.
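
As a rough illustration of the Glue work mentioned above, here is the skeleton of a Glue ETL job in Python. The database, table, and output path are hypothetical; a real job would be parameterized and deployed through the team's own tooling.

    import sys

    from awsglue.context import GlueContext
    from awsglue.dynamicframe import DynamicFrame
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext
    from pyspark.sql import functions as F

    # Standard Glue job boilerplate: resolve arguments and set up contexts.
    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext.getOrCreate())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read from a hypothetical Glue Data Catalog table.
    source = glue_context.create_dynamic_frame.from_catalog(
        database="example_db", table_name="raw_orders"
    )

    # Drop obviously bad rows using plain Spark SQL functions.
    clean_df = source.toDF().filter(F.col("order_id").isNotNull())

    # Write curated Parquet back to a hypothetical S3 path.
    glue_context.write_dynamic_frame.from_options(
        frame=DynamicFrame.fromDF(clean_df, glue_context, "clean_orders"),
        connection_type="s3",
        connection_options={"path": "s3://example-curated-bucket/orders/"},
        format="parquet",
    )

    job.commit()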

Required Skills:

  • Proven experience setting up and managing AWS infrastructure with CloudFormation, Glue, EMR, Redshift, S3, EC2, and SageMaker (see the deployment sketch after this list).
  • Strong knowledge of Data Lake architecture and data ingestion frameworks.
  • 5+ years of experience in Data Engineering and Data Warehouse development.
  • Advanced proficiency in SQL, Python, and PySpark.
  • Experience designing and optimizing complex Spark-based data pipelines on AWS.
  • Ability to troubleshoot performance bottlenecks and production issues in large-scale distributed systems.
  • Strong leadership in taking PoCs to production through structured engineering practices.
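
By way of illustration for the first bullet above, CloudFormation provisioning is often scripted from Python via boto3; the sketch below launches a stack from a template and waits for it to finish. The stack name, template URL, and parameters are hypothetical placeholders.

    import boto3

    cfn = boto3.client("cloudformation", region_name="us-east-1")

    # Launch a hypothetical data-platform stack from a template stored in S3.
    cfn.create_stack(
        StackName="data-platform-dev",
        TemplateURL="https://example-bucket.s3.amazonaws.com/data_lake.yaml",
        Capabilities=["CAPABILITY_NAMED_IAM"],
        Parameters=[
            {"ParameterKey": "Environment", "ParameterValue": "dev"},
        ],
    )

    # Block until creation completes; the waiter raises if the stack fails.
    cfn.get_waiter("stack_create_complete").wait(StackName="data-platform-dev")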

Preferred Qualifications:

  • AWS certifications (e.g., AWS Certified Data Analytics – Specialty, Solutions Architect).
  • Prior experience with enterprise-scale clients such as Amazon or other FAANG companies.
  • Familiarity with DevOps practices and tools like Terraform, Jenkins, Docker, and Git.

Apply for this position
