Are you looking to be a part of the most influential company in the blockchain industry and contribute to the cryptocurrency revolution that is changing the world?
Responsibilities
- Ensure performance and availability of big data infrastructure
- Deploy and operate big data components according to business scenarios
- Respond to incidents promptly and identify potential issues before they occur
- Keep learning big data technologies and continuously optimize performance, availability and scalability
Qualifications
- 3+ years of Big Data DevOps experience
- Familiar with data pipelines, including batch ETL, real-time streaming and job scheduling
- Hands-on experience with big data components including Hadoop, Hive, Spark, Spark Streaming, Presto, HBase, Elasticsearch, Kafka, ZooKeeper, Redis, Airflow
- Experienced in deploying, operating, monitoring, optimizing and troubleshooting large scale big data infrastructure
- Proficient in shell scripting, Python and SQL
- Experience with AWS big data services is a plus
- Team player with an analytical mind; optimistic, willing to take on challenges and quick to respond to incidents
- Due to the nature of the business and for smooth collaboration within teams, bilingual proficiency in English and Mandarin is a must
Conditions
• Do something meaningful; be a part of the future of financial technology and the No. 1 company in the industry
• Fast moving, challenging and unique business problems
• International work environment and flat organisation
• Great career development opportunities in a growing company
• Possibility for relocation and international transfers mid-career
• Competitive salary
• Flexible working hours and casual work attire