Proxify AB: Senior Data Engineer (AWS & Python)
Role highlights
Full Time
Permanent
Senior
Remote
This Senior Data Engineer role requires 5+ years of professional experience building cloud-native data platforms, with a strong emphasis on AWS and Python. The candidate must be expert in Python (including Pandas and PySpark) for data manipulation, scripting, and developing scalable ETL/ELT pipelines. Deep hands-on experience with AWS core services (S3, Glue, Lambda, EMR, Kinesis/MSK, Redshift, and Aurora) is essential, alongside proven expertise with modern cloud data warehouses such as Snowflake, Amazon Redshift, Google BigQuery, or Azure Synapse. The role demands strong SQL skills, including complex query writing and performance tuning on large datasets, as well as experience designing robust data models (star schema, snowflake schema, and data vault). Familiarity with containerization and orchestration, specifically Docker and Kubernetes, is also important. Additional competencies include implementing infrastructure as code with Terraform or CloudFormation, integrating pipelines into CI/CD workflows, and applying data quality monitoring, logging, alerting, and governance standards. Preferred but not mandatory skills include orchestration tools such as Apache Airflow, data streaming technologies such as Kafka, Kinesis, or Flink, and AWS certifications. Excellent English communication skills, availability within the CET timezone (+/- 3 hours), and at least a bachelor's degree are required. This senior-level position offers cutting-edge projects, predictable hours, and flexible benefits, making it well suited to experienced data engineers seeking advanced cloud and data platform challenges.
About the role
We are seeking a Senior Data Engineer specializing in modern, cloud-native data platforms, with a strong focus on AWS and Python. You will design, build, and optimize scalable ETL/ELT pipelines and data warehouses to support analytics, machine learning, and business intelligence for clients.
Responsibilities
- Architect, implement, and maintain scalable data pipelines (ETL/ELT) using Python and AWS services
- Ingest data from various sources (APIs, databases, streaming services) into data lakes and warehouses
- Serve as subject matter expert for AWS data services (S3, Glue, EMR, Kinesis/MSK, Redshift, Aurora)
- Design robust and efficient data models (star schema, snowflake, data vault) for analytics and reporting
- Perform performance tuning and query optimization on large datasets in cloud data warehouses
- Implement infrastructure as code (Terraform or CloudFormation) and integrate pipelines into CI/CD processes
- Establish data quality monitoring, logging, alerting, and governance standards
Requirements
- 5+ years of professional experience in data engineering
- Expert proficiency in Python (Pandas, PySpark) for data manipulation, scripting, and pipeline development
- Deep hands-on experience with AWS core services for data ingestion, storage, and processing (S3, Glue, Lambda, EMR)
- Proven experience with modern data warehouses (Snowflake, Amazon Redshift, Google BigQuery, or Azure Synapse)
- Solid expertise in SQL and complex query writing/optimization
- Strong understanding of containerization and orchestration (Docker, Kubernetes)
- Fluent English communication skills
- Located in the CET timezone (+/- 3 hours); applications from other time zones will not be considered
Nice to Have
- Experience with Infrastructure as Code (Terraform or CloudFormation)
- Proficiency with orchestration tools (Apache Airflow)
- Familiarity with data streaming technologies (Kafka, Kinesis, Flink)
- AWS certifications
Benefits
- On-time monthly payments with flexible withdrawal options
- Predictable project hours (consistent 8-hour working days)
- Up to 24 flex days off per year (for full-time positions)
- Career-accelerating positions at cutting-edge companies
- Hand-picked, personally matched opportunities
- One seamless contracting process for multiple opportunities
- Consistent monthly pay for positions landed through Proxify
How to Apply
Apply at: https://weworkremotely.com/remote-jobs/proxify-ab-senior-data-engineer-aws-python-1
More roles from Proxify
Proxify AB: Senior Next.js Developer
Flexible location
Join Proxify AB as a Senior Next.js Developer to lead innovative projects, enjoy flexible remote work, and grow your career...
Proxify AB: Senior Fullstack Developer (Python)
Flexible location
Join Proxify AB as a Senior Fullstack Developer (Python) and advance your career with innovative projects. Apply now to...
Proxify AB: Senior Ruby on Rails Developer
Flexible location
Join Proxify AB as a Senior Ruby on Rails Developer, work remotely in CET timezone, and grow your career with flexible hours...