Solution Architect and Hadoop Engineer
Experience: 5+ Years
Location: Canberra
Hiring Mode: Contract
Contract: 5+ Months
Salary: Open
Description:

The requirements for the Solution Architect role are as follows:

Essential Criteria:

  1. Demonstrated ability to architect solutions supporting ITD projects using the System Development Life Cycle (SDLC) and Agile development practices and processes, maximising the EDW environment and the connectivity between different technical capabilities
  2. Experience with one or more EDW technologies, such as Cloudera Hadoop, Teradata, SAS, Informatica PowerCenter, Informatica Big Data Management, R, Python and SQL
  3. Demonstrated knowledge of application, data and infrastructure architecture disciplines
  4. Experience in developing and documenting advice on the use of big data tools relevant to a government EDW environment
  5. Demonstrated interpersonal skills and the capacity to communicate effectively with all stakeholders
  6. Ability to contribute towards knowledge and skills transfer

Desirable Criteria:

  1. Demonstrates leadership by defining and meeting the expectations of the role
  2. Relevant qualifications and/or certification as a Solution Architect

 

The requirements for the Hadoop Engineer role are as follows:

Essential Criteria:

  1. Demonstrated ability to develop high-quality, scalable big data solutions for ingesting, transforming, cleansing, storing and managing large data sets from various sources using the Hadoop ecosystem
  2. Demonstrated ability in the creation, review and development of data models and views, and in undertaking ELT development activities using Informatica / BDM and associated tools
  3. Advanced knowledge of application, data and infrastructure architecture disciplines
  4. Experience with Hadoop distributions, in particular Cloudera
  5. Experience using Hadoop-based technologies (e.g. MapReduce, Spark, YARN, Hive, Impala)
  6. Experience with and knowledge of Java, Scala, Python and R
  7. Experience with SQL-based technologies (e.g. Teradata) and knowledge of NoSQL technologies
  8. Experience in developing appropriate documentation, such as design specifications, as-built and handover documentation
  9. Demonstrated interpersonal skills and the capacity to communicate effectively with all stakeholders
  10. Ability to contribute towards knowledge and skills transfer

Desirable Criteria:

  1. Demonstrates leadership by defining and meeting the expectations of the role
  2. Relevant computer science qualifications and/or Hadoop industry certification