Are you looking for an opportunity to develop a data platform that will have an impact on the rapid exploitation and sharing of multi-INT information across the intelligence community? Solid platform development is a critical part of any program’s success, and you know how to do it right – scalable design with baked-in security. That’s why we need you, a developer with the skills to build a platform that will transform the integrated intelligence mission.
As a data ingest developer on our team, you’ll design and develop the core data ingestion architecture and specific data source ingestion flows for our project, from initial development to operational deployment. You’ll work with customers and end users to understand their mission, current architecture, and security requirements. With a focus on the customer’s goals, you’ll build a design that will scale to meet their evolving needs. Your technical expertise will be vital as you recommend tools and capabilities based on your research of the current environment and new technology. Your design will set the standard for future development, so you’ll craft an architecture that works smoothly with existing infrastructure without compromising security. As a technical contributor, you’ll integrate and optimize multiple data flows in the system to help your customers meet their toughest challenges. This is a chance to use your deep understanding of the ingestion and transformation of multi-INT data sources and broaden your skill set into areas like Cloud computing; large-scale data ingestion and processing; multiple data store types, including graph, time series, spatial, and other NoSQL; high-performance data streaming; data science; and supporting the integration of novel mission applications and analytics. Join us as we develop software-based solutions to make a difference for the integrated intelligence mission.
Empower change with us.
You Have:
- 2+ years of experience with Java or ETL engineering
- 1+ years of experience working in data ingestion, processing, and distribution, including ETL activities using Apache NiFi
- Experience in developing and deploying data ingestion, processing, and distribution systems using AWS technologies and pub/sub messaging technology, such as Apache Kafka
- Experience in working with IC data sets and NoSQL databases, including ElasticSearch and HBase
- Experience with AWS datastores, including RDS Postgres, S3, or DynamoDB, within Agile software development practices
- Top Secret clearance required
- Ability to obtain Security+ CE, SSCP, CCNA-Security, or GSEC Certification within 6 months of start date

Nice If You Have:
- Experience with cognitive computing, data integration, data mining, Natural Language Processing, Hadoop platforms, or automating machine learning components
- Experience with data mining using current methods and tools
- Experience with graph data stores, time series databases, and other NoSQL technologies
- Knowledge of one or more of the following: Jira, Git, Kafka, Kubernetes, Rancher, or Docker
- Knowledge of data science tools and their integration with big data stores
- Knowledge of data security policies and guidelines
- TS/SCI clearance with polygraph preferred
- BA or BS degree preferred; MA or MS degree a plus
- AWS Certification
- Security+ CE, SSCP, CCNA-Security, or GSEC Certification
Applicants selected will be subject to a security investigation and may need to meet eligibility requirements for access to classified information; Top Secret clearance is required.
We’re an EOE that empowers our people—no matter their race, color, religion, sex, gender identity, sexual orientation, national origin, disability, veteran status, or other protected characteristic—to fearlessly drive change.