- Job Title: Hadoop Administrator
- Location: Bloomington, MN 55439
Since our start in 2009, Conexess has established itself in 3 markets, employing nearly 200 individuals nationwide. Operating in over 15 states, we serve a client base ranging from Fortune 500/1000 companies to small and mid-sized companies. Many of the small and mid-sized companies work with us exclusively because of our outstanding staffing track record.
Who We Are:
Conexess is a full-service staffing firm offering contract, contract-to-hire, and direct placements. We have a wide range of recruiting capabilities, extending from help desk technicians to CIOs. We are also capable of offering project-based work.
Conexess Group is aiding a Bloomington-based client in their search for a Hadoop Administrator. This is a long-term opportunity with a competitive compensation package.
******We are unable to work C2C on this role******
The Hadoop Administrator is responsible for maintaining and supporting all Hadoop clusters in all environments, ensuring the environment is stable and accessible for all tenants.
- This role is empowered and accountable for installing new Hadoop components, setting up security, rebalancing YARN resource pools, monitoring jobs, and fixing issues.
- Ongoing support is also a major part of the workload, which includes diagnosing issues, providing support to application teams, onboarding new users, and other applicable tasks.
- Professional growth of knowledge across both technical and operational domains is required.
Qualifications:
- Hands-on experience and proficiency working with the Hadoop ecosystem, including: HDFS, Hive, HBase, YARN, Sqoop, Oozie, Spark, Ambari, and Ranger
- Passion for big data/clustered environments
- Scripting experience in both Python and Bash
- Experience working in Linux/Unix environments
- Experience with SQL and/or NoSQL
- Understanding of relational data models
- Strong analytical and problem-solving skills
- Ability to work in a team environment to solve complex problems with little direction
- Experience with Git/Jenkins
- Experience administering the Hadoop stack, particularly HDP (Hortonworks)
- Experience working with Kerberized environments
- Theoretical knowledge of big data/analytics concepts
- Experience developing and troubleshooting ETL
- Understanding of networking concepts