Big Data Platform Engineer - Geylang
APAR TECHNOLOGIES PTE. LTD. | Geylang | Permanent
Job Description
We are seeking a passionate Big Data Platform Engineer to join our Global Data Platform team.
Responsibilities
- Operate Global Data Platform components (VM servers, Kubernetes, Kafka) and applications (Apache stack, Collibra, Dataiku, and similar).
- Implement automation of infrastructure, security components, and Continuous Integration & Continuous Delivery for optimal execution of data pipelines (ELT/ETL).
- Develop solutions that build resiliency into data pipelines through platform health checks, monitoring, and alerting mechanisms, improving the quality, timeliness, recency, and accuracy of data delivery.
- Apply DevSecOps and Agile approaches to deliver a holistic, integrated solution in iterative increments.
- Liaise and collaborate with enterprise security, digital engineering, and cloud operations to gain consensus on architecture solution frameworks.
- Review system issues, incidents, and alerts to identify root causes and continuously implement features to improve platform performance.
- Stay current on the latest industry developments and technology trends to effectively lead and design new features/capabilities.
Requirements
- Bachelor’s degree in Engineering/Information Technology/Computer Science, or a related discipline
- 5-7 years of experience building or designing large-scale, fault-tolerant, distributed systems (for example: data lakes, delta lakes, data meshes, data lakehouses, data platforms, data streaming solutions, etc.)
- In-depth knowledge and experience in one or more large-scale distributed technologies, including but not limited to the Hadoop ecosystem, Kafka, Kubernetes, and Spark
- Experience migrating storage technologies (e.g. HDFS to S3 object storage)
- Experience integrating streaming and file-based data ingestion/consumption (Kafka, Control-M, AWA)
- Experience in DevOps, data pipeline development, and automation using Jenkins and Octopus (optional: Ansible, Chef, XL Release, and XL Deploy)
- Expertise in Python and Java, or another language such as Scala or R; Linux/Unix scripting; Jinja templates; Puppet scripts; and firewall configuration rule setup
- VM setup and scaling, Kubernetes scaling (pods), managing Docker with Harbor, and pushing images through CI/CD
- Experience using data formats such as Apache Parquet, ORC, or Avro
- Experience in machine learning algorithms is a plus
- Hands-on experience in integrating Data Science Workbench platforms (e.g. Dataiku)
- Cloud migration experience is a plus
- Experience with agile project management and methods (e.g., Scrum, SAFe)
- Knowledge of the financial sector and its products is beneficial
- Exceptional communication skills to interact with both technical teams and business stakeholders.
- Detail-oriented mindset with the ability to manage multiple tasks and prioritize effectively
- Proactive, collaborative, and customer-focused approach to solving problems and driving adoption
- Passion for enabling a data-driven culture and empowering users through data solutions.
EA Number: 11C4879