
Job Summary


Salary
$5,500 - $7,100 SGD / Monthly

Job Type
Permanent

Seniority
Mid

Years of Experience
At least 3 years

Tech Stacks
Amazon S3
Route53
Analytics
IAM
VPC
EMR
CI
Dynatrace
EC2
EBS
Bamboo
Chef
Puppet
New Relic
Git
Apache
Kibana
Grafana
Prometheus
Kubernetes
Elasticsearch
Ansible
Terraform
Ruby
Jenkins
MySQL
Linux
Python
AWS

Job Description


We are looking for a DevOps Engineer to join us and own release engineering as well as our production and development environments. Working with the product development and consulting delivery teams, this role plays a critical part in our mission of building and delivering robust data platforms that incorporate data science and machine learning algorithms and models. This is a great opportunity to grow your software release skills and apply your infrastructure management expertise.

At DataSpark, you get to work with rich and diverse datasets and cutting-edge technology, and you see the impact of your results in real business and government decisions, which in turn deliver positive social benefit to consumers at large scale. As a startup within Singtel, DataSpark offers an enviable work environment that combines a trailblazing startup spirit with industrial backing. Working alongside creative, energetic and passionate teammates from around the world, you will be part of our exciting growth journey as we take the company to the next level.

Responsibilities

  • Define, scope, size, implement, test, and deploy existing and new infrastructure, for both clients and internal teams, that processes hundreds of terabytes of data each day and continues to grow
  • Develop, support, and improve tools for continuous integration, automated testing and release management
  • Install, configure and customize DataSpark software according to project requirements, including data ingestion, algorithms, APIs, UIs and security
  • Design, implement, operate and troubleshoot the automation and monitoring of our infrastructure in multiple environments and multiple data centers owned or rented from cloud providers
  • Serve as the subject matter expert on infrastructure performance for the company as well as for our clients
  • Perform system integration tests, performance tests, technical acceptance tests, and user acceptance tests to ensure proper functioning of deployed systems
  • Troubleshoot and resolve issues in multiple environments
  • Improve our infrastructure capabilities, optimizing for cost, simplicity, and maintainability
Requirements

  • Experience building and deploying CI/CD platforms such as Jenkins, Artifactory, GitHub and Bamboo, and the ability to support applications both on-premises and on AWS
  • Deep technical expertise in DevOps automation tools and scripting, e.g. Python, Ruby
  • Strong experience with open-source platforms, particularly Kubernetes and containers
  • Experience with configuration management and deployment tools such as Puppet, Ansible, Chef, Terraform, etc.
  • Experience in logging, monitoring, and tracing, e.g. CloudWatch, Elasticsearch/Kibana (ELK), Prometheus/Grafana, New Relic, Datadog, Dynatrace, etc.
  • Professional experience operating the AWS cloud (EMR, EC2, VPC, VPN, EBS, S3, Route53, IAM, AWS CLI, etc.)
  • Demonstrated experience with software product life cycles, whether traditional enterprise software development or agile internet data product development
  • Working knowledge of network security, web and network protocols and standards
  • Knowledge of information security issues is a plus
  • Good knowledge of monitoring systems

Desired Skills and Experience
Nagios, Git, Puppet, DevOps, MySQL, Bash, Analytics, Python, Apache, Linux
