JOB PURPOSE
Responsible for supporting data science projects and solutions by leveraging data and software engineering experience to solve a variety of use cases across the Telkom Group and for its customers. Expected to be highly skilled in batch and real-time data ingestion and data transformation, with the ability to design, build and scale data pipelines for efficient data integration and processing.
RESPONSIBILITIES
• Build data integration pipelines used to move data between databases, file systems and other locations while tracking data quality according to measures set by the data owner.
• Provide data support for the specialised data-centric services we provide to grow our data maturity. These include, but are not limited to: data modelling, database design, data quality assessments, data format conversion, data transformation and data tool selection.
• Enable big data engineering by creating and managing code and tools created to efficiently obtain and move vast amounts of data between file systems and/or databases while ensuring data integrity.
• Enable live/real-time data engineering by creating and managing code and tools built to efficiently obtain and move real-time data while ensuring data integrity.
• Engage with stakeholders to support the design and delivery of data science projects and solutions.
• Use data engineering techniques to solve business problems.
• Work with a team of data engineers to develop and maintain our cloud-based development and production platforms.
• Support ETL/ELT processes and data ingestion for exploratory data analysis and solution development.
• Lead and develop a team of junior data engineers.
• Contribute to our agile way of work and our innovation culture.
• Maintain up-to-date knowledge of data platforms and related technologies.
• Translate business requirements into system requirements.
• Consistently document all implemented data integration pipelines and processes.
• Support tools, data integration applications and infrastructure lifecycles via standard service management principles and processes.
MINIMUM REQUIREMENTS
• Data and cloud certifications (GCP, Azure, AWS) will be advantageous, as will certifications for other products in our stack (Airflow, Beam, Cloud DataFlow, Apache Arrow, Apache Arrow Flight, Apache Iceberg, Tableau, Alteryx).
QUALIFICATIONS
• 3-year degree/diploma (NQF level 6), preferably in Computer Science, Mathematics, Statistics, Data Engineering or a related field. A relevant postgraduate degree will be an added advantage.
EXPERIENCE
• 3-5 years' relevant experience, of which at least 2 years must have been in a data engineering environment.
• Experience in ICT/Telecommunications will be an advantage.
• Experience with system and process analysis and design.
SPECIAL REQUIREMENTS
• Experience with Google Cloud Platform.
• Expected to stay abreast of new data engineering frameworks and developments and to put them into practice.
FUNCTIONAL KNOWLEDGE
• Foundational knowledge of relational and non-relational databases
• Data engineering knowledge base
• Data integration concepts and terms
• Software and data engineering knowledge (Python, Scala, Shell scripting, JavaScript, Firebase)
• Cloud data streaming knowledge (Kafka, Sqoop, Pub-Sub based models/ services, Redis, RabbitMQ, Airflow, Beam, Cloud DataFlow, Apache Arrow, Apache Arrow Flight, Apache Iceberg)
• Knowledge of DataOps, CI/CD and cloud deployment best practices
• Cloud computing and platform management (GCP, Azure, AWS, etc.)