Job Title: Java Programmer/Big Data
Location: San Diego, CA 92110
Position Requirements: Eligibility for a DoD security clearance is required (U.S. citizenship is therefore required)
Technology Unlimited Group (TUG) is a customer-focused system engineering and software development company dedicated to providing innovative and cost-effective solutions. TUG is dedicated to making a positive impact on each of our customers, our employees, and our community. Our principal San Diego-based customers are the Department of Defense and other prime contractors supporting the Department of Defense.
At TUG we realize that our ability to accomplish goals and be successful depends on the hard work, dedication, and integrity of our team members. Accordingly, TUG fosters a work environment that provides you with opportunities to develop your professional skill set, advance your career, and achieve your goals.
TUG is always searching for innovative and creative team members who are dedicated to the pursuit of excellence. If you share our commitment to providing quality service while maintaining balance in life and are interested in joining our team, we want to talk to you about a role at TUG!
For more general information see our website at www.4tug.com.
Required Skills and Qualifications:
The candidate should have a BS degree in Computer Science (or another applicable discipline) and coursework and/or experience in the following:
- Ability to work with large volumes of data to derive business intelligence
- Ability to analyze data, uncover information, derive insights, and propose data-driven strategies
- Knowledge of object-oriented programming (OOP) languages such as Java
- Knowledge of database theory, structures, categories, properties, and best practices
- Knowledge of installing, configuring, maintaining, and securing Hadoop
- An analytical mind; a quick learner and self-starter
Eligibility for a security clearance is required (U.S. citizenship is therefore required)
Desired Skills and Qualifications:
- Proficient understanding of distributed computing principles
- Experience managing a Hadoop cluster and all of its included services
- Ability to troubleshoot ongoing issues with cluster operation
- Proficiency with Hadoop v2, MapReduce, and HDFS
- Experience building stream-processing systems using solutions such as Storm or Spark Streaming
- Good knowledge of Big Data querying tools, such as Pig, Hive, and Impala
- Experience with Spark
- Experience with integration of data from multiple data sources
- Experience with NoSQL databases, such as HBase, Cassandra, MongoDB
- Knowledge of various ETL techniques and frameworks, such as Flume
- Experience with various messaging systems, such as Kafka or RabbitMQ
- Experience with Big Data ML toolkits, such as Mahout, SparkML, or H2O
- Good understanding of Lambda Architecture, along with its advantages and drawbacks
- Experience with Hadoop distributions such as Cloudera, MapR, or Hortonworks