Posted on: 15 March
Senior Big Data Engineer - Java & PySpark
Company
Arc.dev
Arc.dev is a freelance platform that only connects developers who pass their vetting process. NOTE: Jobs posted by Arc.dev are freelance opportunities provided by their clients, and you will likely communicate with Arc.dev when applying for a job.
Remote Hiring Policy:
Arc.dev is a fully remote freelance platform.
Job Type
Full-time
Allowed Applicant Locations
Lithuania, Worldwide
Job Description
We are looking for a highly skilled Software Engineer with extensive experience in large-scale data processing and migration projects. The ideal candidate has a strong foundation in Spark/Databricks and a proven track record in big data application development. This role calls for forward-looking expertise in tooling and architecture to optimize and scale migration efforts.
Key Responsibilities:
- Convert Java-based data processing applications to PySpark, ensuring high performance and scalability.
- Work on large-scale data processing and migration projects, handling complex data pipelines.
- Leverage Spark/Databricks expertise to develop and optimize big data applications.
- Design and implement scalable and automated solutions to minimize manual migration efforts.
- Collaborate with cross-functional teams to drive best practices in big data processing.
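The first responsibility above, porting Java data-processing code to PySpark, usually amounts to mapping Java Streams or MapReduce-style operators onto Spark DataFrame operations. A minimal illustrative sketch of that mapping (plain Python stands in for the driver logic; the record fields and function names here are hypothetical, and the PySpark equivalents are noted in comments):

```python
from collections import defaultdict

# Hypothetical input rows, e.g. what a Java job would read from Parquet.
orders = [
    {"customer": "a", "amount": 30.0},
    {"customer": "b", "amount": 20.0},
    {"customer": "a", "amount": 50.0},
]

# Java Streams version of this pipeline:
#   orders.stream().collect(
#       groupingBy(Order::getCustomer, summingDouble(Order::getAmount)))
# PySpark DataFrame equivalent:
#   df.groupBy("customer").agg(F.sum("amount"))
def total_by_customer(rows):
    """Group rows by customer and sum the amounts."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["customer"]] += row["amount"]
    return dict(totals)

print(total_by_customer(orders))  # {'a': 80.0, 'b': 20.0}
```

In a real migration the per-row loop is replaced by Spark's distributed `groupBy`/`agg`, which is what makes the converted pipeline scale beyond a single machine.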
Required Qualifications & Skills:
- Strong programming skills with expertise in Java and PySpark.
- Hands-on experience with big data processing frameworks such as Apache Spark and Databricks.
- Proven track record of working in large-scale data migration projects.
- Ability to architect and scale solutions to streamline migration processes.
- Experience in optimizing big data applications for performance and efficiency.
- Knowledge of cloud-based data platforms (AWS, Azure, or GCP) is a plus.
Benefits:
- 401(k)
- Dental insurance
- Health insurance
- Paid time off
- Referral program
- Relocation assistance
- Vision insurance
Schedule:
- Day shift
- Monday to Friday
Experience:
- Databricks: 5 years (Required)
- Java: 6 years (Required)
- PySpark: 5 years (Required)
- Big data: 3 years (Preferred)
Work Location: Remote