Hadoop Engineer


Encora


Job Summary


Salary
S$7,270 - S$9,547 / month (estimated)

Job Type
Permanent

Seniority
Senior / Mid

Years of Experience
3-9 years

Tech Stacks
Amazon S3, Analytics, Spark Streaming, Teradata, Avro, Apache Presto, Hive, Spark, JSON, Hadoop, Java

Job Description

The position exists to continuously enhance the efficiency of existing data pipeline jobs and frameworks, and to establish best practices for Big Data projects planned on the BI & Analytics platform.

Key Accountabilities:
1. Review and tune code for the Spark framework to increase throughput.
2. Establish best practices and guidelines to be followed by tech teams when writing Spark code.
3. Create a framework to continuously monitor performance and identify resource-intensive jobs.

Job Duties & Responsibilities:
· Able to write code, services and components in Java, Apache Spark and Hadoop.
· Responsible for systems analysis, design, coding, unit testing, CI/CD and other SDLC activities.
· Proven experience working with the Apache Spark streaming and batch frameworks.
· Proven experience in performance tuning of Java and Spark based applications is a must.
· Knowledge of working with different file formats such as Parquet, JSON and Avro.
· Data warehouse experience working with RDBMS and Hadoop; well versed in the concepts of change data capture (CDC) and SCD implementation.
· Hands-on experience writing code to efficiently handle files on Hadoop.
· Ability to work on projects following a microservices-based architecture.
· Knowledge of S3 is a plus.
· Ability to work proactively, independently and with global teams.
· Strong communication skills; able to communicate effectively with stakeholders and present the outcomes of analysis.
· Working experience on projects following Agile methodology is preferred.

Required Experience:
- At least 3 to 9 years of working experience, preferably in banking environments.
- Expert knowledge of technologies such as Hadoop, Hive, Presto, Spark and Java, an RDBMS such as Teradata, and file storage such as S3 is necessary.
- Candidate must speak and write well.
- Degree holder.

Core Competencies:
· Experienced with Hadoop-ecosystem frameworks such as Hive and Spark, and with Java.
· Strong knowledge of relational database systems and data warehousing techniques.
· Strong listening and communication skills, with the ability to clearly and concisely explain complex technical issues.
