
AWS and Snowflake Data Engineer
Portland, OR


Direct Hire


We have been engaged to find an AWS and Snowflake Data Engineer. In this role, you will be part of a fast-paced team designing, developing, testing, integrating, and supporting technically innovative solutions for our Fortune 500 customers.

AWS and Snowflake Data Engineer overview:

  • Design and build reusable components, frameworks and libraries at scale to support analytics products. 
  • Design and implement product features in collaboration with business and technology stakeholders. 
  • Identify and solve issues concerning data management to improve data quality.
  • Clean, prepare and optimize data for ingestion and consumption.
  • Collaborate on the implementation of new data management projects and re-structure of the current data architecture.
  • Implement automated workflows and routines using workflow scheduling tools. 
  • Build continuous integration, test-driven development and production deployment frameworks.
  • Analyze and profile data for designing scalable solutions. 
  • Troubleshoot data issues and perform root cause analysis to proactively resolve product and operational issues.

AWS and Snowflake Data Engineer requirements:

  • Hands-on experience with AWS services such as EMR (Hive, PySpark), S3, and Athena, or an equivalent cloud platform
  • Familiarity with Spark Structured Streaming
  • Working experience with the Hadoop stack, handling large volumes of data in a scalable fashion
  • Hands-on experience with SQL, ETL, data transformation and analytics functions
  • Hands-on Python experience, including batch scripting, data manipulation, and building distributable packages.
  • Experience working with batch orchestration tools such as Apache Airflow or an equivalent; Airflow preferred.
  • Working with code versioning tools such as GitHub or BitBucket; expert level understanding of repo design and best practices
  • Familiarity with deployment automation tools such as Jenkins
  • Hands-on experience designing and building ETL pipelines; expertise in data ingestion, change data capture, and data quality; hands-on experience with API development.
  • Experience designing and developing relational database objects; knowledge of logical and physical data modeling concepts; some experience with Snowflake
  • Familiarity with Tableau or Cognos use cases
  • Familiarity with Agile; working experience preferred.

AWS and Snowflake Data Engineer certifications:

  • AWS
  • Snowflake
  • ETL
  • Python

Industry: Data Engineer, AWS, Snowflake

Job Code: j-2014


