Senior AWS Data Engineer
Published on: 25 November 2025

At Cyclad we work with top international IT companies to boost their potential in delivering outstanding, cutting-edge technologies that shape the world of the future. We are seeking a highly skilled AWS Data Engineer with solid experience in designing and implementing solutions for processing large-scale and unstructured datasets to join a remote team. The expected background includes Big Data or Cloud projects, with hands-on experience in AWS services such as Storage and Compute (including serverless), Networking, and DevOps tools.

 

Project information:

  • Type of project: IT services, consulting
  • Office location: Poland
  • Work model: Remote
  • Budget: B2B, 150-180 net per hour
  • Project length: long-term
  • Eligibility: EU citizenship and residence in Poland required
  • Start date: ASAP (depending on candidate's availability)

 

Project scope:

  • Design and implement solutions for processing large-scale and unstructured datasets (Data Mesh, Data Lake, or Streaming Architectures).
  • Develop, optimize, and test modern DWH/Big Data solutions based on the AWS cloud platform within CI/CD environments.
  • Improve data processing efficiency and support migrations from on-premises systems to public cloud platforms.
  • Collaborate with data architects and analysts to deliver high-quality cloud-based data solutions.
  • Ensure data quality, consistency, and performance across AWS services and environments.
  • Participate in code reviews and contribute to technical improvements.

Requirements:

  • Proven 6+ years of experience in Big Data or Cloud projects involving processing and visualization of large and unstructured datasets, across various phases of the SDLC
  • Hands-on experience with the AWS cloud, including Storage, Compute (and Serverless), Networking, and DevOps services
  • Solid understanding of AWS services, ideally supported by relevant certifications.
  • Familiarity with several of the following technologies: Glue, Redshift, Lambda, Athena, S3, Snowflake, Docker, Terraform, CloudFormation, Kafka, Airflow, Spark
  • Basic proficiency in at least one of the following programming languages: Python, Scala, Java, or Bash
  • Very good command of English (German language skills would be an advantage)

 

Nice to Have:

  • Experience with orchestration tools (e.g., Airflow, Prefect)
  • Exposure to CI/CD pipelines and DevOps practices
  • Knowledge of streaming technologies (e.g., Kafka, Spark Streaming)
  • Experience working with Snowflake or Databricks in a production or development environment
  • Relevant certifications in AWS, data engineering, or big data technologies

 

We offer:

  • Fully remote working model
  • Full-time engagement based on a B2B or employment contract
  • Private medical care with dental care (covering 70% of costs) and a rehabilitation package; family package option available
  • Multisport card (also for an accompanying person)
  • Life insurance
