Posted on: 30 March

Senior Data Engineer

Company

Apollo.io

Apollo.io is a Walnut, CA-based B2B sales intelligence platform offering tools for lead generation, email management, and CRM integration to help businesses enhance their sales engagement strategies.

Job Type

Full-time

Allowed Applicant Locations

Poland

Job Description

**This is a permanent EoR (Employer of Record) role, not a B2B contract.**

Your Role & Mission

As a Senior Data Engineer, you will be responsible for maintaining and operating the data warehouse and for connecting Apollo’s data sources to it.

Daily Adventures and Responsibilities

  • Develop and maintain scalable data pipelines and build new integrations to support continuing increases in data volume and complexity.

  • Implement automated monitoring, alerting, and self-healing (restartable tasks, graceful failures) while building the consumption pipelines (see the sketch after this list).

  • Implement processes and systems to monitor data quality, ensuring production data is always accurate and available.

  • Write unit and integration tests, contribute to the engineering wiki, and document your work.

  • Define company data models and write jobs to populate them in our data warehouse.

  • Work closely with all business units and engineering teams to develop a strategy for long-term data platform architecture.
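
By way of illustration, the sketch below shows one way a restartable, self-alerting pipeline step like the one described above might look. It is a minimal sketch, not Apollo's actual code; `with_retries`, `send_alert`, `load_events`, and the backoff settings are all hypothetical.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")


def send_alert(message: str) -> None:
    # Placeholder: in practice this might page on-call or post to Slack.
    log.error("ALERT: %s", message)


def with_retries(step, max_attempts=3, base_delay=5.0):
    """Wrap a pipeline step so transient failures are retried with backoff."""
    def wrapper(*args, **kwargs):
        for attempt in range(1, max_attempts + 1):
            try:
                return step(*args, **kwargs)
            except Exception as exc:
                log.warning("%s failed (attempt %d/%d): %s",
                            step.__name__, attempt, max_attempts, exc)
                if attempt == max_attempts:
                    send_alert(f"{step.__name__} exhausted retries: {exc}")
                    raise
                time.sleep(base_delay * 2 ** (attempt - 1))  # exponential backoff
    return wrapper


@with_retries
def load_events(batch_id: str) -> None:
    # Hypothetical idempotent load step: safe to re-run after a failure.
    log.info("loading batch %s", batch_id)


if __name__ == "__main__":
    load_events("2024-03-30")
```

Keeping each step idempotent is what makes blind retries safe: re-running a failed batch must not duplicate data.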

Competencies

  • Excellent communication skills for working with engineering, product, and business owners to define key business questions and to build data sets that answer them.

  • Self-motivated and self-directed

  • Inquisitive, able to ask questions and dig deeper

  • Organized and diligent, with great attention to detail

  • Acts with the utmost integrity

  • Genuinely curious and open; loves learning

  • Critical thinking and proven problem-solving skills required

Skills & Relevant Experience

Required:

  • 5+ years of experience in data engineering or in a data-facing role

  • Experience in data modeling, data warehousing, and building ETL pipelines

  • Deep knowledge of data warehousing with an ability to collaborate cross-functionally

  • Bachelor's degree in a quantitative field (Physical/Computer Science, Engineering, or Mathematics/Statistics)

  • Proven experience leveraging AI tools: fluency in integrating AI-driven solutions into your workflows and a willingness to stay current with emerging AI technologies

Preferred:

  • Experience using the Python data stack

  • Experience deploying and managing data pipelines in the cloud (preferably AWS or GCP)

  • Experience working with technologies like Airflow, Hadoop, and Spark (see the sketch after this list)

  • Understanding of streaming technologies such as Kafka and Spark Streaming
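
For a rough sense of the orchestration work involved, here is a minimal Airflow DAG for a daily extract-and-load job. The DAG id, callables, and settings are hypothetical (and assume Airflow 2.4+ for the `schedule` argument); the `retries`/`retry_delay` defaults are one common way to get the restartable, graceful-failure behavior named in the responsibilities.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(ds, **context):
    # Hypothetical: pull one day's partition from a source system.
    print("extracting partition", ds)


def load(ds, **context):
    # Hypothetical: write the extracted partition to the warehouse.
    print("loading partition", ds)


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
    default_args={
        "retries": 3,                        # restart on transient failures
        "retry_delay": timedelta(minutes=5),
    },
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load runs only after a successful extract
```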

Apply Here