Job description
Posted 22 September 2021

Some careers have more impact than others.

If you’re looking for a career where you can make a real impression, join HSBC and discover how valued you’ll be.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.


We are currently seeking an experienced professional to join our team in the role of Big Data Engineer.


Business: Enterprise Technology

Open positions: 1

Role Title: Senior Hadoop Admin

Location (Country / City): UK / Sheffield

Pay rate: £850/day PAYE or £1,090/day Umbrella

Duration: 6 Months (Potential for contract extension)


Principal responsibilities

  • Perform CDP 7.1.x installations
  • Migrate data and content from HDP to the CDP platform
  • Perform in-place major/minor upgrades on HDP/CDP clusters
  • Troubleshoot CDP issues and configuration changes
  • Tune cluster performance
  • Carry out cluster expansions and HA/DR setups
  • Set up alerting and health checks
  • Automate procedural tasks
  • Research and adopt new tooling in the CDP ecosystem
  • Provide level 3 support for the BAU team


Requirements

  • Deep understanding and technical knowledge of HDP/CDP product offerings and ecosystem tools across the technology stack
  • Solid background in HDP/CDP platform installation, administration and maintenance
  • Good experience performing large-scale, complex upgrades from HDP (2.6.x) to CDP (7.1.x) platforms on on-premise infrastructure
  • Experience with product upgrades of the core big data platform, cluster expansion, setting up high availability for core services, etc.
  • Good Linux administration skills
  • Comfortable working with the shell and with scripting, integrated authentication stacks (AD/Linux integration, Kerberos- and LDAP-based directory services, SSO identity services), file systems, security, secure shell, permissions, process management and regular maintenance tasks
  • Experience setting up disaster recovery solutions for clusters using platform-native tools and custom code, depending on requirements
  • Good understanding of enterprise security practices and solutions such as LDAP, Kerberos, KMS, Ranger and Knox
  • Shell/Python scripting and automation knowledge is an added advantage
  • Role-relevant qualifications, e.g. HWX/Cloudera Certified Platform Administrator, are desirable but not essential
  • Bachelor's/Master's degree in an IT/technology domain
  • 10+ years of overall IT experience
  • 3+ years of relevant big data platform engineering experience