
Job Summary


Job Type
Permanent

Seniority

Years of Experience
Information not provided

Tech Stacks
Oracle
Analytics
Flow
Elastic
CI
Logstash
Kibana
Kafka
MSSQL
SQL
C++
C
Linux
Python

Job Description


  • Design and develop software for data management, data processing, data analytics and data visualization of big data
  • Build data pipelines to ensure quality end-to-end data flow
  • Implement CI/CD pipelines to speed up delivery and improve product quality, and set up monitoring dashboards for system health tracking

Requirements

  • Bachelor's degree in Information Systems, Computer Science, Computer Engineering, Information Technology, or equivalent
  • Prior experience in software development, system integration, testing and production deployment preferred
  • Proficient in one or more programming languages such as C, C++, or Python on Linux
  • Experience with Kafka Streams/Connect, SQL/NoSQL databases (e.g. Oracle, MS-SQL, Mongo), enterprise content management (e.g. SharePoint), enterprise search (e.g. Elasticsearch), data transformation (e.g. Logstash), and data visualisation (e.g. Tableau, Kibana) preferred
  • Familiarity with DevOps practices and the agile development cycle is an added advantage
  • Good interpersonal, analytical and problem-solving skills
  • Able to work independently and as part of a team
  • Working location is in Science Park II
