
Job Summary


Job Type
Permanent

Seniority
Senior

Years of Experience
At least 10 years

Tech Stacks
Docker
Shell Script
Analytics
SOA
Shell
Git
Hive
Spark
Kafka
SQL
Scala
Hadoop
Linux
Go
Python
AWS
Java

Job Description


You will use your extensive data engineering experience to architect, develop, and maintain data pipeline and data warehouse systems and tools.

Responsibilities
  • Design and build end-to-end data pipelines: ingest third-party data sources, build APIs for receiving data from internal applications, and design and implement the data warehouse and other data stores
  • Communicate design ideas through written, verbal, and visual communication to both technical and non-technical colleagues
  • Troubleshoot complex issues and provide root cause analysis
  • Work cross-functionally with Product, Analytics, Business, and other teams to ensure successful execution of projects/products
  • Manage large volumes of data, covering storage, job optimization, data insights, data security, and resource allocation

Must-have Knowledge And Experience
  • Experience in designing, analyzing and implementing highly available and scalable big data platforms
  • In-depth knowledge of the AWS cloud
  • Experience developing in at least one of the following languages in the context of data engineering: Scala, Python, Go, Java, Shell scripting
  • Understanding of modern software engineering best practices such as design patterns, microservices, and service-oriented architecture (SOA)
  • An agile mindset, living by the principle of “value-first”
  • Media domain experience

Requirements
  • 10+ years of IT experience with data architecture skills
  • 4+ years with Java, Python, or a similar language
  • 4+ years of experience across the big data and Hadoop ecosystem, including Hadoop distributed systems
  • Expertise in Hadoop, Spark, Scala, Hive, and Kafka
  • Expertise in designing and implementing best-practice processes for loading data
  • Experience working on complex distributed systems
  • Advanced knowledge of SQL
  • Experience working with business teams/analysts to gather requirements
  • Expert level understanding of normalization techniques and when and how to use them
  • Experienced with schema design, data warehousing, and business intelligence best practices
  • Ability to debug unfamiliar systems
  • High level of comfort in Linux environments on the command line
  • High level of comfort using a variety of AWS products
  • Comfortable working with Docker, virtual machines, or other containerized environments
  • Experienced with common Git best practices
  • Excellent communication skills (verbal, written and visual)

NodeFlair Insights of Singapore Press Holdings