
Job Summary


Job Type
Permanent

Seniority

Years of Experience
Information not provided

Tech Stacks
Analytics
Pandas
Scrapy
Spark
Airflow
Jenkins
Hadoop
Python
AWS

Job Description


We are looking for a Data Engineer to join our Operations Team. Playing an integral part in Castlery, you will be responsible for collecting data, building a data warehouse, and generating insights to drive growth.

  • Define, collect, and model data from business processes or third parties to generate insights and drive strategic or continuous improvement initiatives for the whole company.
  • Work with technology teams to design and implement reliable and scalable data pipelines.
  • Work with functional stakeholders to build modern data infrastructure (e.g. data warehouse, data lake) and solve any data-related technical issues.
  • Be the ‘Go-To’ expert and constantly improve data infrastructure in Castlery.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Computer Engineering, or an equivalent field, with outstanding academic achievement.
  • Proven track record working as a Data Engineer, Data Scientist, Data Analyst, or equivalent.
  • Good command of database structures and query languages.
  • Strong Python programming skills and familiarity with popular libraries such as Pandas and Scrapy.
  • Familiarity with AWS data analytics services and with setting up data pipelines, e.g. Airflow, Jenkins.
  • Knowledge of big data frameworks and tech stacks, e.g. Spark, Hadoop, is a plus.
  • Passion and curiosity for solving challenging problems by leveraging the right technology.
  • Ability to work collaboratively in a multicultural team environment and lead change.
  • Ability to work effectively with people at all levels in an organization.
  • Ability to communicate sophisticated ideas effectively in English, both verbally and in writing.
