Posted on: 20 February

Mid-level Data Engineer - Europe and Latin America

Company

Arc.dev

Arc.dev is a freelance platform that only connects developers who pass its vetting process. NOTE: Jobs posted by Arc.dev are freelance opportunities provided by its clients, and you will likely communicate with Arc.dev when applying for a job.

Remote Hiring Policy

Arc.dev is a fully remote freelance platform.

Job Type

Full-time

Allowed Applicant Locations

Serbia, Worldwide

Job Description

About the company

Pass_by delivers actionable and verifiable insights into any store's performance. We help retailers in the US market understand consumer retail behavior, all powered and enhanced by AI.

What we are looking for

We are looking for a Data Engineer to design and build scalable, efficient data pipelines, optimize database performance, and ensure the reliability of our data infrastructure. This role involves working with BigQuery, PostgreSQL, and Apache Airflow to process large-scale datasets, implementing CI/CD pipelines, and using Terraform to manage infrastructure as code. You'll also play a key role in improving our datasets based on client feedback and collaborating with cross-functional teams to deliver high-quality data solutions.

What would you be doing?

● Design and build orchestrated data pipelines that handle large-scale data processing and transformation (see the sketch after this list).
● Develop and optimize queries in BigQuery and PostgreSQL to ensure efficient data operations.
● Design and maintain database schemas that prioritize performance, scalability, and reliability.
● Perform database tuning and make optimization adjustments to enhance overall performance.
● Apply DevOps practices to implement CI/CD pipelines for data infrastructure.
● Work collaboratively with cross-functional teams to understand business requirements and deliver effective solutions.
● Use Terraform to provision and manage infrastructure as code (IaC).
● Create and manage cron jobs for scheduled tasks and background processing.
● Implement adjustments to our datasets based on client feedback.
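
For a flavor of the orchestration work, here is a minimal sketch of a daily pipeline, assuming Apache Airflow 2.x and its TaskFlow API. The DAG name, task logic, and data are hypothetical and purely illustrative; they are not taken from the actual role.

    # Minimal illustrative Airflow DAG (all names hypothetical).
    from datetime import datetime

    from airflow.decorators import dag, task

    @dag(
        schedule="@daily",                # run once per day
        start_date=datetime(2025, 1, 1),
        catchup=False,                    # do not backfill past runs
    )
    def store_metrics_pipeline():
        @task
        def extract() -> list[dict]:
            # In practice: pull raw store-performance events from an upstream source.
            return [{"store_id": 1, "visits": 120}, {"store_id": 2, "visits": 85}]

        @task
        def transform(rows: list[dict]) -> list[dict]:
            # Example transformation: flag low-traffic stores.
            return [{**row, "low_traffic": row["visits"] < 100} for row in rows]

        @task
        def load(rows: list[dict]) -> None:
            # In practice: write to BigQuery, e.g. via the Google provider's
            # BigQuery operators or the google-cloud-bigquery client.
            print(f"Loading {len(rows)} rows")

        load(transform(extract()))

    store_metrics_pipeline()

In a real deployment, extract and load would talk to actual sources and sinks, with retries, alerting, and backfill behavior configured per task.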

Who we’re looking for

● Strong proficiency in Python and SQL
● Experience with data orchestration tools (e.g. Apache Airflow)
● Knowledge of cloud platforms (e.g. GCP, AWS)
● Familiarity with infrastructure-as-code (e.g. Terraform)
● Experience with relational databases (e.g. PostgreSQL), including writing high-performance queries and optimizing database performance
● Proven ability to design and maintain database schemas that prioritize performance, scalability, and reliability
● Experience with data analysis and visualization

What we offer

● A competitive salary and the ability to join our option pool;
● 25 annual paid leave days, excluding bank holidays;
● Remote-first working with access to co-working spaces;
● A range of employee benefits and perks (pension, flexible working hours, and an annual company retreat abroad).

Apply Here