This is a position within the DATAWAVE product development team. The candidate will have a primary focus on supporting all aspects of agile software design and development for the DATAWAVE ingest and query framework running on the Content Warehouse (CWH) large-scale compute clusters, to include development of new capabilities, testing, optimization and performance tuning, operation, and sustainment functions. This entails leveraging corporate tools to interrogate data, as well as the enabling tools that assist software development. The candidate will collaborate with the Research Directorate and other contracts.

The candidate will be responsible for addressing requirements and supporting ongoing Data Warehouse Platform (DWP) modernization activities, which may include transitioning and decommissioning legacy system capabilities into the modernized DWP architecture. This role has an expectation of after-hours / on-call support.

Core Competencies and Skills:
• Java programming for distributed systems, with experience in networking and multi-threading
• Apache Hadoop
• Apache Accumulo
• Apache NiFi
• Agile development experience
• Well-grounded in Linux fundamentals and knowledge of at least one scripting language (e.g., Python, Ruby, Perl)
• Experience with source code management practices and tools
• Enabling tools: Git, Maven, Jira
• Willingness to be a committer/contributor to open source applications
• Continuous Integration / Continuous Testing: Bamboo, Jenkins, GitLab CI/Pipelines
• Continuous Monitoring: ELK Stack (Elasticsearch, Logstash, and Kibana), Nagios
• Familiarity with microservices development techniques and container orchestration (e.g., Kubernetes)
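By way of illustration, the day-to-day work centers on Accumulo client code along the lines of the following minimal sketch. It assumes the classic Accumulo 1.x client API; the instance name, ZooKeeper hosts, credentials, and table name are hypothetical:

```java
import java.util.Map;

import org.apache.accumulo.core.client.Connector;
import org.apache.accumulo.core.client.Scanner;
import org.apache.accumulo.core.client.ZooKeeperInstance;
import org.apache.accumulo.core.client.security.tokens.PasswordToken;
import org.apache.accumulo.core.data.Key;
import org.apache.accumulo.core.data.Range;
import org.apache.accumulo.core.data.Value;
import org.apache.accumulo.core.security.Authorizations;

public class ShardScanExample {
    public static void main(String[] args) throws Exception {
        // Connect to a (hypothetical) Accumulo instance via ZooKeeper.
        ZooKeeperInstance instance = new ZooKeeperInstance("cwh", "zk1:2181,zk2:2181");
        Connector connector = instance.getConnector("scanner", new PasswordToken("secret"));

        // Scan one day's worth of rows in a (hypothetical) shard table with empty auths.
        Scanner scanner = connector.createScanner("shard", new Authorizations());
        scanner.setRange(Range.prefix("20240101"));
        for (Map.Entry<Key, Value> entry : scanner) {
            System.out.println(entry.getKey() + " -> " + entry.getValue());
        }
        scanner.close();
    }
}
```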
Provide cloud software research, development, and engineering services to include requirements analysis, software development, installation, integration, evaluation, enhancement, sustainment, testing, validation, and issue diagnosis/resolution.
· A Bachelor's Degree in Computer Science or a related technical field is highly desired and will be considered equivalent to two (2) years of experience. A Master's Degree in a technical field will be considered equivalent to four (4) years of experience. NOTE: A degree in Mathematics, Information Systems, Engineering, or a similar field will be considered a technical field.
· Shall possess the Hadoop/Cloud Developer Certification
· Experience deploying applications in a cloud environment.
· Understanding of big-data cloud scalability as practiced at large providers (e.g., Amazon, Google, Facebook)
· Experience designing and developing automated analytic software, techniques, and algorithms.
· Experience with taxonomy construction for analytic disciplines, knowledge areas and skills.
· Experience developing and deploying: data-driven analytics; event-driven analytics; and sets of analytics orchestrated through rules engines.
· Experience with linguistics (grammar, morphology, concepts).
· Experience developing and deploying analytics that discover and exploit social networks.
· Experience documenting ontologies, data models, schemas, formats, data element dictionaries, application programming interfaces (APIs), and other technical specifications.
· Experience developing and deploying analytics within a heterogeneous schema environment.
- Shall have at least five (5) years of experience in software development/engineering, including requirements analysis, software development, installation, integration, evaluation, enhancement, maintenance, testing, and problem diagnosis/resolution.
- Shall have at least four (4) years of experience developing software with high-level languages such as Java, C, or C++.
- Shall have demonstrated ability to work with open-source NoSQL products that support highly distributed, massively parallel computation needs, such as HBase, Accumulo, Bigtable, etc.
- Shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc. (a minimal MapReduce sketch follows this list).
- Shall have demonstrated work experience with the Hadoop Distributed File System (HDFS)
- Shall have demonstrated work experience with serialization formats such as JSON and/or BSON.
- Shall have demonstrated work experience developing RESTful services (a minimal RESTful/JSON sketch also follows this list).
- Shall have at least three (3) years of experience developing software for UNIX/Linux (Red Hat versions 3-5) operating systems.
- Shall have demonstrated work experience in the requirements analysis and design of at least one Object Oriented system.
- Shall have demonstrated work experience developing solutions integrating and extending FOSS/COTS products.
- Shall have at least three (3) years of experience in software integration and software testing, to include developing and implementing test plans and test scripts.
- Shall have demonstrated technical writing skills and shall have generated technical documents in support of a software development project.
- In addition, the candidate will have demonstrated experience, through work or college-level courses, in at least two (2) of the desired characteristics.
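As a point of reference for the MapReduce requirement above, here is a minimal word-count sketch against the standard Hadoop MapReduce API (the class name and input/output paths are illustrative):

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Map phase: emit (word, 1) for each token in the input split.
    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // Reduce phase: sum the per-word counts emitted by the mappers.
    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class); // combiner reuses the reducer logic
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```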
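Similarly, for the RESTful services and JSON serialization requirements, a minimal JAX-RS resource sketch is shown below. The resource path and record type are hypothetical, and a JAX-RS runtime with a JSON provider (e.g., Jackson) is assumed:

```java
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

// Hypothetical resource: GET /events/{id} returns an Event, serialized to JSON
// by the container's JSON provider (e.g., Jackson).
@Path("/events")
public class EventResource {

    public static class Event {
        public String id;
        public String type;

        public Event(String id, String type) {
            this.id = id;
            this.type = type;
        }
    }

    @GET
    @Path("/{id}")
    @Produces(MediaType.APPLICATION_JSON)
    public Event getEvent(@PathParam("id") String id) {
        // A real service would query a data store; this sketch echoes the id.
        return new Event(id, "example");
    }
}
```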