Posted: Apr 10, 2023
Deadline: Not specified

    Since our establishment in 1918, Sanlam has been a prominent part of the South African business landscape. We have always held a long-term view of how business adapts to the demands of the environment in which it operates. Today, in a dynamic world, we see an evolving set of social, economic, political and environmental imperatives that require our skilfu...

Senior Big Data Engineer: SBI

    Role Description

Do you have a passion for working with big data and developing data pipelines that can handle large volumes of data? Are you experienced with Hadoop Distributed File System (HDFS) and Cloudera, and familiar with Java, Python or Spark? If you answered yes, then we want you to join our data warehouse and analytics team!

We are one of the largest insurers in Africa, and we are looking for a highly skilled Big Data Engineer to help us make sense of our data. In this role, you will be responsible for designing, developing, and maintaining data pipelines to ingest and persist large volumes of data. You will collaborate with data scientists and data analysts to understand business requirements and translate them into technical specifications. You will also troubleshoot data pipeline issues and optimise performance, and contribute to the design and implementation of our cloud strategy.
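The ingest-validate-aggregate-persist workflow described above can be sketched in miniature. This is an illustrative toy only, in plain Python standing in for a Spark/HDFS job; the record fields (`policy_id`, `amount`) and function name are hypothetical, not taken from the posting:

```python
import json
from collections import defaultdict

def run_pipeline(raw_lines):
    """Toy ingest -> transform -> persist pipeline.

    Ingest:    parse JSON event records from raw text lines.
    Transform: drop malformed rows, aggregate amounts per policy.
    Persist:   return the aggregate (a real job would write to
               HDFS or a warehouse table instead).
    """
    totals = defaultdict(float)
    for line in raw_lines:
        try:
            event = json.loads(line)               # ingest
            policy = event["policy_id"]            # validate schema
            totals[policy] += float(event["amount"])  # aggregate
        except (json.JSONDecodeError, KeyError, ValueError):
            continue                               # skip bad records
    return dict(totals)                            # persist stand-in

events = [
    '{"policy_id": "P1", "amount": 100.0}',
    '{"policy_id": "P1", "amount": 50.0}',
    'not json',
    '{"policy_id": "P2", "amount": 20.0}',
]
print(run_pipeline(events))  # {'P1': 150.0, 'P2': 20.0}
```

In a Spark environment the same shape would typically be a `read` → `filter`/`groupBy` → `write` chain over a DataFrame, with the malformed-record handling done by the reader's permissive mode rather than a try/except.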

What do we offer?

    • Opportunity to work with a talented and dynamic team of professionals.
    • Competitive salary and benefits package.
    • Opportunity to contribute to our cloud journey.
    • Access to cutting-edge technologies and tools.
    • Opportunity for growth and career advancement.

    What will make you successful in this role?

    Requirements:

    • Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
    • At least 3 years of experience in designing and developing data pipelines.
    • Experience with HDFS and Cloudera.
    • Proficient in scripting languages such as Python, Bash, or Perl.
    • Familiarity with Spark or similar environments.
    • Strong analytical and problem-solving skills.
    • Excellent communication and collaboration skills.

    Knowledge and Skills

• Object and data models
• Defining, designing and building dimensional databases
• Translating business needs into long-term architecture solutions
• Developing data warehousing blueprints
• Data warehouse management

    Core Competencies

    • Being resilient - Contributing independently
    • Collaborates - Contributing independently
    • Cultivates innovation - Contributing independently
    • Customer focus - Contributing independently
    • Drives results - Contributing independently

    Closing Date: 30 August 2023

    Method of Application

    Interested and qualified? Go to Sanlam Group on careers.sanlamcloud.co.za to apply


