TO BE CONSIDERED FOR THIS POSITION YOU MUST HAVE AN ACTIVE TS/SCI W/ POLYGRAPH SECURITY CLEARANCE (U.S. CITIZENSHIP REQUIRED)
This is a position on the DATAWAVE product development team. The candidate will focus primarily on supporting all aspects of agile software design and development for the open-source Accumulo product, which is integrated into Data Warehouse Platform systems on large-scale compute clusters. Duties include development of new capabilities, testing, optimization and performance tuning, operation, and sustainment.
Core Competencies and Skills:
• Willingness to be a committer/contributor to open source applications
• Java programming for distributed systems, with experience in networking and multi-threading
• Apache Hadoop
• Apache Accumulo
• Apache NiFi
• Agile development experience
• Well-grounded in Linux fundamentals, with knowledge of at least one scripting language (e.g., Python, Ruby, or Perl)
• Experience with source code management practices and tools
• Enabling tools: Git, Maven, Jira
• Continuous Integration / Continuous Testing: Bamboo, Jenkins, GitLab CI/Pipelines
• Continuous Monitoring: ELK Stack (Elasticsearch, Logstash, and Kibana), Nagios
• Familiarity with the microservices development style and with container orchestration (e.g., Kubernetes)
As a DataWave Software Engineer, you are expected to perform requirements analysis, software development, installation, integration, evaluation, enhancement, maintenance, testing, and problem diagnosis/resolution with a high level of proficiency and independence. You are also expected to communicate direction and provide guidance to more junior programmer/analysts as required. DataWave Software Engineers must be adept at developing solutions that integrate or extend COTS or GOTS products. Additionally, Software Engineers may be responsible for evaluating project needs, determining tasks and durations, and generating and reviewing designs for technical accuracy and completeness.
Provide cloud software research, development, and engineering services to include requirements analysis, software development, installation, integration, evaluation, enhancement, sustainment, testing, validation, and issue diagnosis/resolution.
• (U//FOUO) Shall have at least eight (8) years of experience in software development/engineering, including requirements analysis, software development, installation, integration, evaluation, enhancement, maintenance, testing, and problem diagnosis/resolution.
• (U//FOUO) Shall have at least six (6) years of experience developing software with high-level languages such as Java, C, or C++.
• (U//FOUO) Shall have demonstrated ability to work with open-source (NoSQL) products that support highly distributed, massively parallel computation needs, such as HBase, Accumulo, Bigtable, etc.
• (U//FOUO) Shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc.
• (U//FOUO) Shall have demonstrated work experience with the Hadoop Distributed File System (HDFS)
• (U//FOUO) Shall have demonstrated work experience with serialization formats such as JSON and/or BSON.
• (U//FOUO) Shall have demonstrated work experience developing RESTful services.
• (U//FOUO) Shall have at least five (5) years of experience developing software for UNIX/Linux (Red Hat versions 3-5) operating systems.
• (U//FOUO) Shall have demonstrated work experience in the requirements analysis and design of at least one object-oriented system.
• (U//FOUO) Shall have demonstrated work experience developing solutions integrating and extending FOSS/COTS products.
• (U//FOUO) Shall have at least three (3) years of experience in software integration and software testing, to include developing and implementing test plans and test scripts.
• (U//FOUO) Shall have demonstrated technical writing skills and shall have generated technical documents in support of a software development project.
• (U//FOUO) Hadoop/Cloud Developer Certification or comparable cloud certification.
• (U//FOUO) In addition, the candidate will have demonstrated work experience in at least four (4) of the desired characteristics.
• (U//FOUO) Experience deploying applications in a cloud environment.
• (U//FOUO) Experience developing and deploying data-driven analytics; event-driven analytics; and sets of analytics orchestrated through rules engines.
(U//FOUO) A Bachelor's Degree in Computer Science or a related technical field is highly desired and will be considered equivalent to two (2) years of experience. A Master's Degree in a technical field will be considered equivalent to four (4) years of experience. NOTE: A degree in Mathematics, Information Systems, Engineering, or a similar field will be considered a technical field.
(These desired qualities are representative of the types of experience sought for this labor category, but are not limited to those listed herein. Target desired qualifications will be listed in each task order Statement of Work):
• (U//FOUO) Understanding of Big-Data Cloud Scalability (Amazon, Google, Facebook)
• (U//FOUO) Experience designing and developing automated analytic software and techniques.
• (U//FOUO) Experience developing and deploying: analytics that include foreign language processing; analytic processes that incorporate/integrate multimedia technologies, including speech, text, image, and video exploitation; analytics that function on massive data sets, for example, more than a billion rows or larger than 10 petabytes; analytics that employ semantic relationships (e.g., inference engines) between structured and unstructured data sets; analytics that identify latent patterns between elements of massive data sets, for example, more than a billion rows or larger than 10 petabytes; and analytics that employ techniques commonly associated with Artificial Intelligence, for example, genetic algorithms.
• (U//FOUO) Experience with taxonomy construction for analytic disciplines, knowledge areas, and skills.
• (U//FOUO) Experience with linguistics (grammar, morphology, concepts).
• (U//FOUO) Experience developing and deploying analytics that discover and exploit social networks.
• (U//FOUO) Experience documenting ontologies, data models, schemas, formats, data element dictionaries, software application program interfaces, and other technical documentation.
• (U//FOUO) Experience developing and deploying analytics within a heterogeneous schema