Takealot.com, a leading South African online retailer, is looking for a highly talented DataOps Engineer to join our DataOps team in Cape Town.
We are a young, dynamic, hyper-growth company looking for smart, creative, hard-working people with integrity to join us. We offer a market-related Total Remuneration Package that allows full flexibility according to your needs, a great work environment, and a promise that you won’t be bored, as long as you are prepared for a challenge and want to build something great.
At Takealot, our DataOps Team is focused on delivering value faster by creating predictable delivery and change management of data, data models and related artifacts. DataOps uses technology to automate the design, deployment and management of data delivery with appropriate levels of governance, and it uses metadata to improve the usability and value of data in a dynamic environment. Takealot is growing quickly, which brings a number of unique and interesting challenges. Data is growing quickly within the organization, and there is a lot of opportunity to shape the tools, technologies and culture of data in the company.
This position reports to the Data Platform Team Lead.
Your responsibilities will include:
Using Terraform to manage cloud infrastructure and Chef to manage virtual servers
Building and deploying systems for metrics, monitoring, and logging
Operating Kafka, Kubernetes, and other platform services
Maintaining CI/CD build systems so our teams can deploy frequently and safely
Code management and review
Hardening servers and building security into the platform
Developing automation so we can focus on the hard problems
Implementing features, technology, and processes that move us towards industry best practices, improving on scalability, efficiency, reliability, and security
Responding to incidents and requests
The ideal candidate:
Is passionate about technology and enjoys keeping up to date with the industry
Is a team player who can also work independently
Has excellent communication skills
Shows solid reasoning and decision-making
Has a deep understanding of database engines
Is confident in their abilities and skills
Qualifications and experience:
Bachelor’s Degree or Advanced Diploma in Information Systems, Computer Science, Mathematics, or Engineering, plus a minimum of 3 years of DataOps experience in a software/technology environment.
In the absence of a bachelor’s degree or advanced diploma in one of these fields, a minimum of 6 years of DataOps experience in a software/technology environment is required.
An understanding of computer science fundamentals, including Linux/operating systems and networking
Solid grasp of development fundamentals such as data structures and algorithms
Can write code (preferably in Python)
Experience with open source relational database systems (MySQL, PostgreSQL)
Practical experience with other database systems such as BigQuery, Redis and Elasticsearch will be beneficial
Has experience with Kafka, PubSub, or other event-based systems
Has experience with Google Cloud, or another cloud provider (architecture, operations)
Understands the cost and implications of scaling
Has experience managing Kubernetes Clusters (certificates, users, kubeadm, kubectl etc.)