Data Engineer - CAPY Tech Solutions
- Portland, OR 97232, United States
- Contract
- In-person
- USD $60 per hour
Posted 3 days ago
CAPY Tech Solutions is looking for a Data Engineer to join its dynamic team. The role is critical in building robust data pipelines and leveraging Databricks on AWS to streamline data processes. As part of this team, you will help maintain efficient data workflows and ensure accurate data management.
In this role, you can expect the following responsibilities:
- Design and implement scalable data pipelines using PySpark and Python.
- Develop and optimize SQL queries for ETL processes.
- Utilize Databricks on AWS for building and managing workflows.
- Read and comprehend legacy code.
- Prepare scripts for data validation.
- Monitor and troubleshoot data pipelines.
- Document data processes for future reference.
- Collaborate with team members to understand requirements and deliver solutions.
This role requires you to have:
- Proficiency in PySpark and Python programming.
- Extensive experience with SQL and database management.
- Familiarity with AWS services such as S3, Lambda, Glue, and Redshift.
- Strong communication and collaboration skills.
You would benefit from having:
- Knowledge of AtScale.
- Problem-solving skills and attention to detail.
- Understanding of data warehousing concepts.
Job types include contract and temporary opportunities.
The content on this page is not written or managed by Alooba. Please reach out to CAPY Tech Solutions directly for any additional information regarding this role.