
Hadoop Engineer

Company: Octo Consulting Group
Location: Chantilly
Posted on: May 26, 2023

Job Description:

Company Information
Octo, an IBM company, is an industry-leading, award-winning provider of technical solutions for the federal government. At Octo, we specialize in agile software engineering, user experience design, cloud services, and digital strategy services that address the government's most pressing missions. Octo delivers intelligent solutions and rapid results, yielding lower costs and measurable outcomes.
Our team is what makes Octo great. At Octo you'll work beside some of the smartest and most accomplished staff you'll find in your career. Octo offers fantastic benefits and an amazing workplace culture where you will feel valued while you perform mission-critical work for our government. Voted one of the region's best places to work multiple times, Octo is an employer of choice!

Job Description

Are you a self-motivated and experienced Hadoop Engineer? Are you looking for a significant career growth opportunity? You will be joining the team that is deploying and delivering a cloud-based, multi-domain Common Data Fabric (CDF), which provides data sharing services to the entire DoD Intelligence Community (IC). The CDF connects all IC data providers and consumers. It uses fully automated policy-based access controls to create a machine-to-machine data brokerage service, which is enabling the transition away from legacy point-to-point solutions across the IC enterprise.
We were founded as a fresh alternative in the government consulting community and are dedicated to the belief that results are a product of analytical thinking and agile design principles, and that solutions are built in collaboration with, not for, our customers. This mantra drives us to succeed and act as true partners in advancing our clients' missions.
Program Mission...
The CDF program is an evolution for the way DoD programs, services, and combat support agencies access data by providing data consumers (e.g., systems, app developers, etc.) with a "one-stop shop" for obtaining ISR data. The CDF significantly increases the DI2E's ability to meet the ISR needs of joint and combined task force commanders by providing enterprise data at scale. The CDF serves as the scalable, modular, open architecture that enables interoperability for the collection, processing, exploitation, dissemination, and archiving of all forms and formats of intelligence data. Through the CDF, programs can easily share data and access new sources using their existing architecture. The CDF is a network and end-user agnostic capability that enables enterprise intelligence data sharing from sensor tasking to product dissemination.

Skills & Requirements
You will be primarily responsible for operating and sustaining a large clustered system in a secure environment. In this role, you will:

  • Deploy and maintain platform in accordance with network, compute, and storage infrastructure requirements
  • Monitor and manage the fielded platform to include any newly developed data integration capabilities, ensuring the highest availability factor for the platform
  • Coordinate with appropriate vendors for all required professional services needed in support of software / tools / technologies deployed within the system
  • Aid Developers in designing solutions for customer required data integrations
  • Document SOPs related to instance deployments, configurations, transforms/crosswalks, maintenance, and monitoring requirements
  • Develop automation to support the repeatable installation, configuration, monitoring, and management of the system

    What we'd like to see...

  • An active TS/SCI security clearance and a willingness to obtain a polygraph-level clearance are required
  • DoD 8570 IAT Level II Certification (e.g. Security+)
  • Experience with systems operations in a large, clustered environment
  • Experience using Hortonworks Data Platform (HDP) or Cloudera Data Platform (CDP)
  • Experience with cloud infrastructure management
  • Advanced organizational skills with the ability to handle multiple assignments
  • Strong written and oral communication skills


  • Experience with installation, configuration, and management of Cloudera Data Platform (CDP) and/or Hortonworks Data Platform (HDP)
  • Working understanding of Kafka, Knox, HBase, WebHDFS, Apache ZooKeeper, and Cloudera Manager or Ambari
  • Experience on most phases of application development activities with minimal supervisory guidance
  • Experience with DevOps/SecDevOps in a government cloud infrastructure

    Years of Experience: 8 or more years
    Education: Bachelor's degree in systems engineering, computer engineering, or a related technical field (preferred)
    Location: Chantilly, VA
    Clearance: Active TS/SCI w/ ability to obtain CI Poly

    Octo is an Equal Opportunity/Affirmative Action employer. All qualified candidates will receive consideration for employment without regard to disability, protected veteran status, race, color, religious creed, national origin, citizenship, marital status, sex, sexual orientation/gender identity, age, or genetic information. The selected applicant will be subject to a background investigation.
    Octo has been acquired by IBM and will be integrated into the IBM organization; Octo will be the hiring entity. By proceeding with this application, you understand that Octo will share your personal information with other IBM affiliates involved in your recruitment process, wherever they are located. More information on how IBM protects your personal information, including the safeguards in case of cross-border data transfer, is available here:


Click here to apply!
