Hadoop Developer - Job Details

A role in the 'Business Intelligence, Data Warehousing and Database Administration' discipline, 'Business Intelligence Consultant' category


Advertised Details
Job title: Hadoop Developer
Reference code: KLC0007
Job category: Business Intelligence Consultant
Position status: Filled
Position type: Contract / Permanent
Min salary (p/m): Market related
Max salary (p/m): Market related
City: Johannesburg  
Description
PBT Group has a requirement for a Hadoop Developer to interpret business requirements and produce effective Big Data solutions.

The role includes programming for code transformations and integrating the Big Data solution with existing systems, developing information solutions from a variety of sources for both structured and unstructured data, and taking technical ownership of Big Data solutions for structured and unstructured data.

DUTIES:
- Develop and implement big data models and solutions
- Design and implement ETL methodologies and technologies, and their integration with big data
- Conduct root cause analysis on production issues
- Provide technical leadership of the entire information management process for both structured and unstructured data
- Provide ongoing support and enhancements to the ETL system
- Optimize the information solutions
- Implement machine learning algorithms
- Configure the Hadoop infrastructure and environment for optimal performance
- Collaborate with statistical and actuarial analysts to build models
- Produce relevant technical documentation and specifications
- Estimate time and resource requirements for business requirements
- Integrate big data solutions with existing reporting and analytical solutions
- Develop data processing functions (DPFs) using Java and Python
Skills Required
- Strong experience in Hadoop – Hive, Pig, Spark, Impala, Oozie, Sqoop, and MapReduce.
- Writing high-performance, reliable and maintainable code.
- Ability to write MapReduce jobs.
- Good knowledge of database structures, theories, principles, and practices.
- Ability to write Pig Latin scripts.
- Hands-on experience with HiveQL.
- Familiarity with data loading tools like Flume, Sqoop and Kafka.
- Knowledge of workflow/schedulers like Oozie.
- Analytical and problem-solving skills applied to the Big Data domain.
- Proven understanding of Hadoop, HBase, Hive, and Pig.
- Good aptitude for multi-threading and concurrency concepts.
- Must have Java experience.
- Financial Services experience.
- In-depth knowledge of Data Warehouse and Big Data best practices.
- Knowledge and technical appreciation of the interconnections and interfaces between various technical platforms, operating systems and processes.
- Good understanding of ITIL.
- Must understand the need to align the IT and business strategies.
Additional
- Tertiary qualifications with majors in at least one of the following: Computer Science, Information Systems or similar
- Certification in Hadoop Development.

With in-depth knowledge of business intelligence solutions and experience spanning more than two decades in over 25 countries, PBT Group has honed its expertise through engagements with most of the top 100 companies, answering diverse needs to give clients not only a competitive edge but also a sustainable advantage.
