
HADOOP SPECIALIST – Design Big Data Solutions for Leading Financial Services Org - R1.5Mil/a CTC, JHB

Job Title: HADOOP SPECIALIST – Design Big Data Solutions for Leading Financial Services Org - R1.5Mil/a CTC, JHB
Contract Type: Permanent
Location: Johannesburg
Salary: R1.5Mil/annum CTC
Contact Name: Ashleigh Watermeyer
Contact Email:
Job Published: October 24, 2018 04:21

Job Description

Fantastic opportunity for a HADOOP SPECIALIST to join a leading Financial Services company and take the lead in designing & developing new solutions on Hadoop, building new data pipelines and developing end-to-end BIG DATA solutions. You will need to be highly skilled in Hive, Pig, Spark, Impala, Oozie, Sqoop and MapReduce to provide expert insights to the business.

This Hadoop Specialist position is Johannesburg-based and pays R1.5Mil/annum CTC.

This leading Wealth Manager provides a range of Investment, Insurance and Wealth Management solutions to the private, commercial and institutional markets. This company is changing the future of savings. By joining this company, you will have the opportunity to work in agile-driven teams on highly complex projects and will also be part of a big-name brand.

As Hadoop Specialist, with skills in Hive, Pig, Spark, Impala, Oozie, Sqoop and MapReduce, you will build new data pipelines, perform advanced manipulation of large volumes of data and develop Big Data solutions that enable technology capabilities within the organisation. You will lead the solution architecture – analysing and enhancing the architecture of current operations and implementing initiatives to improve application performance. As the Big Data SME within the business, you will need a good understanding of the financial services/banking industry in order to deliver well-thought-out insights and come up with innovative ways to optimise the use of technology within the business.

• Relevant Tertiary Education – BSc / BCom / Computer Science / Information Systems / Hadoop Development Certification
• Solid experience in Hadoop – Hive, Pig, Spark, Impala, Oozie, Sqoop, and MapReduce
• Good knowledge of database structures, theories, principles, and practices
• Ability to write Pig Latin scripts
• Hands on experience in HiveQL
• Familiarity with data loading tools like Flume, Sqoop and Kafka
• In-depth knowledge of Data Warehouse and Big Data best practices
• Java development experience
• Financial Services experience
• Analytical and problem-solving skills applied to Big Data domain

If you qualify for this role, please email your CV directly to:

Ashleigh Watermeyer or ash [at]
021 801 5001
We appreciate every application and give each due consideration, but if you have not had a response to your application within 14 days, please consider it unsuccessful. To help us respond swiftly, please read the requirements and ensure that your application contains the relevant details for the position you are applying for.