Full Job Description
Black Canyon Consulting (BCC), in partnership with Medical Science and Computing (MSC), is searching for a strong DevOps Inventory Engineer Analyst to join our DevOps team to work on the Enterprise Inventory of software products, development, and operations at the National Center for Biotechnology Information (NCBI) and to help internal teams adopt the DevOps platform. This opportunity is full time and onsite at NCBI in Bethesda, MD.
Enterprise Inventory is a collection of software artifacts and services that we develop, build, and operate. The Inventory is part of our software process modernization effort and supports the transition of software development and operations to a modern DevOps-based infrastructure that automates software builds and testing, deploys services to a service mesh, and automatically monitors operations. In the Inventory we track project and product teams and link software development activities both with our staff database and with a formal portfolio of service offerings. The Inventory supports adoption of the DevOps platform, including analysis and mapping of legacy infrastructure to the DevOps platform to allocate costs and plan for platform adoption.
NCBI is part of the National Library of Medicine (NLM) at the National Institutes of Health (NIH). NCBI is the world's premier biomedical center, hosting over six million daily users who seek research, clinical, genetic, and other information that directly impacts biomedical research and public health – at NCBI you can literally help to accelerate cures for diseases! NCBI's wide range of applications, platforms (Node, Python, Django, C++, you name it) and environments (big data [petabytes], machine learning, multiple clouds) serve more users than almost any other US Government Agency according to https://analytics.usa.gov/.
We attract the best people in the business with our competitive benefits package that includes medical, dental and vision coverage, 401k plan with employer contribution, paid holidays, vacation, and tuition reimbursement. If you enjoy being a part of a high performing, professional service and technology focused organization, please apply today!
Duties & Responsibilities
A successful candidate for this position will work in a small team to:
Develop software to operate an enterprise-wide software inventory
Design data warehouse schema and ETL processes to represent legacy software processes in Inventory
Instrument software delivery pipeline with logging to track software delivery activity in Inventory
Index, integrate, and report on software development and operational metrics based on data in Inventory
Research, select and operate data warehousing and business intelligence reporting tools
Script to integrate Inventory with software development tools (source code control, continuous integration server, issue tracker, incident management system, time-series database, etc.)
Analyze legacy software infrastructure and integrate it with Inventory
Plan migration of legacy systems to the new infrastructure. The legacy software base consists of tens of thousands of source files, comprising millions of lines of code in C++ and other languages, developed over the past 30 years, and supported by several thousand on-premises hosts. Many software build and deployment processes are unique to each project team.
Index and catalog legacy software build and deployment processes
Associate legacy software with the hosts where it runs, with the development teams responsible for it, and with the new formal service offering portfolio
Interview existing development teams to characterize build processes used by their teams and their adherence to DevOps platform principles
Script analyses to associate artifacts and services with the portfolio
Expand NCBI DevOps capabilities.
Requirements
Customer-focused, team-oriented disposition
At least three years of professional experience
Excellent communication and soft skills for working with customers, peers, and management
Good judgement, sense of integrity and responsibility
Expertise in at least one programming language, such as Golang, Python, Java, Kotlin, Scala, C, C++
Solid understanding of CI/CD and experience with Git/SVN, GitHub/BitBucket/GitLab, and Jenkins/TeamCity
Solid understanding of data warehouse technologies (Cloudera/RedShift) and business intelligence reporting tools (Tableau/QuickSight)
Solid Linux skills
Understanding of C++ build processes (makefiles, linking, library builds, etc.)
Experience with AWS, GCP, or Azure
Solid understanding of web development (no need to be an expert web developer)
Solid understanding of distributed systems and microservices
Good understanding of data warehousing and modern DevOps practices and technologies
Ability to research, select and operate data warehousing and business intelligence reporting tools
BS in a STEM field (Engineering, Computer Science, Mathematics, Physics)
OR equivalent industry experience in Software Development
Preferred Qualifications
Legacy systems analysis skills
Solr or ElasticSearch experience
Python Django experience
Prior DevOps experience
Monitoring products: TICK/TIGK stack (Telegraf, InfluxDB, Grafana, Kapacitor), Prometheus, Stackdriver, etc.
Logging products (ELK, Splunk, etc.)
Containers (Docker, rkt, etc.)
Cluster schedulers and container orchestrators (Kubernetes, Nomad, or Apache Mesos)
HashiCorp products (Consul, Terraform, Vault, Packer, Vagrant)
Service Mesh (Linkerd, Istio/Envoy, Consul Connect)
Spinnaker, Blue/Green and Canary Deployments
SQL and NoSQL databases