Posted on: 23 January

Data Engineer - REMOTE WORK

Company

Merico

Merico is a San Francisco-based B2B software development company specializing in contribution analysis tools for developers, leveraging advanced technologies like Python and AI.

Remote Hiring Policy:

Merico is a fully remote company that hires developers from various regions worldwide, fostering a collaborative environment across time zones.

Job Type

Full-time

Allowed Applicant Locations

Romania, Worldwide

Salary

$55 to $60 per hour

Apply Here

Job Description

Data Engineer - REMOTE WORK - 61482

We have an immediate, long-term opportunity with one of our prime clients for the position of Data Engineer, working on a remote basis.

Pay Rate: $55-$60/hr.

Important Note: Python is an area of need, but above all, candidates should be prepared to dig into the client's customized SaaS integrations across the board.

Must Have:

Python Programming

SaaS integration

GCP/Azure (either one)

Engineer (from the Hiring Manager): 5+ years of experience

  • The partner hosts in Azure; the client is in GCP. GCP experience is highly desirable, if not mandatory, so candidates can understand their platform.
  • The platform is built in Python; API-driven integrations and data movements are implemented in Python with Dataflow.
  • No UI work is in scope; another UI developer handles that.
  • Python is required; other programming languages (e.g., Java, Scala) are nice to have.
  • Heavier API development immediately: software API integrations, endpoint to endpoint; strong knowledge of integration patterns, API gateways, and orchestration (they use Apigee); will also need to perform some data engineering work.
  • Data integrations, GCS and S3 bucket connections, data exchange
  • Open to having one engineer stronger in API integrations, Python, and a software background, and one more focused on data engineering with GCP, GCS, Dataflow, or Apache Beam.
  • Candidates need to be strong communicators and proactive, not heads-down developers. The team is highly collaborative and bounces ideas off each other; engineers should thrive in that dynamic.

Overview: The primary initiative is integrating with another commercial company's software (a SaaS provider). Torchlight and this company serve the same customers, and a successful integration will expose the partner's capability through Torchlight's product. The work centers on behavioral analytics: data that tracks mobile devices and where people are, with use cases around tracking drug routes, border protection, European tracking ("what's going on there"), and commercial asset protection (e.g., who is in and around facilities, which employees are in competitor facilities). The team has done similar integrations and has an API, but it is not necessarily optimized or scalable; they will need to evolve and improve upon existing solutions for the long term. The team is small: it has a Product Owner and is looking to bring on an Architect and two Engineers capable of supporting integration work and data engineering/workflow development, which consists of developing data ingestion and ETL processes.

Responsibilities

  • Design, develop, and maintain data pipelines to integrate data from various cloud platforms (e.g., AWS, Azure, Google Cloud).
  • Build integration points between data in one environment and software in another to ensure seamless data flow and interoperability.
  • Collaborate with cross-functional teams to understand data requirements and ensure data quality and integrity.
  • Implement data transformation and cleansing processes to ensure accurate and reliable data.
  • Monitor and optimize data workflows for performance and scalability.
  • Troubleshoot and resolve data integration issues in a timely manner.
  • Ensure compliance with data governance and security policies.
  • Document data integration processes and workflows.

Qualifications

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • Proven experience as a Data Engineer or in a similar role.
  • Strong knowledge of cloud platforms (AWS, Azure, Google Cloud) and their data services.
  • Proficiency in SQL and experience with ETL tools (e.g., Apache NiFi, Talend, Informatica).
  • Experience with programming languages such as Python, Java, or Scala.
  • Familiarity with data warehousing concepts and technologies (e.g., Redshift, BigQuery, Snowflake).
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and collaboration skills.
  • Preferred: Active Secret Clearance.
  • Preferred (but not required): Previous experience working with the Department of Defense (DoD).

Preferred Qualifications

  • Experience with containerization and orchestration tools (e.g., Docker, Kubernetes).
  • Knowledge of data governance and security best practices.
  • Certification in cloud platforms (e.g., AWS Certified Data Analytics, Google Cloud Professional Data Engineer).
  • Note: All successful candidates for this position are required to work directly for PRIMUS. No agencies please; W2 only.

For Immediate Consideration, Please Contact

Nathan

PRIMUS Global Services

Phone: 972-798-2669 / 972-753-6500 ext. 206

Email: jobs@primusglobal.com

Apply Here