Required – Shall have at least eight (8) years of experience in software development/engineering, including requirements analysis, software development, installation, integration, evaluation, enhancement, maintenance, testing, and problem diagnosis/resolution.
Required – Shall have demonstrated experience working with open-source NoSQL products that support highly distributed, massively parallel computation needs, such as HBase, CloudBase/Accumulo, Bigtable, etc.
Required – Shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc.
Required – Shall have demonstrated work experience with the Hadoop Distributed File System (HDFS).
Required – Shall have demonstrated work experience with serialization formats such as JSON and/or BSON.
Required – Shall have demonstrated work experience in the requirements analysis and design of at least one object-oriented system.
Required – Shall have demonstrated work experience developing solutions integrating and extending FOSS/COTS products.
Required – Shall have at least three (3) years of experience in software integration and software testing, to include developing and implementing test plans and test scripts.
Required – Shall have demonstrated technical writing skills and shall have generated technical documents in support of software development projects.
Required – Experience developing and deploying data-driven analytics, event-driven analytics, and sets of analytics orchestrated through rules engines.
Required – In addition, the candidate shall have demonstrated work experience in at least four (4) of the following desired characteristics:
Understanding of big-data cloud scalability (e.g., Amazon, Google, Facebook).
Experience with linguistics (grammar, morphology, concepts).
Experience developing and deploying analytics within a heterogeneous schema environment.
Experience documenting ontologies, data models, schemas, formats, data element dictionaries, application programming interfaces (APIs), and other technical specifications.
Experience developing and deploying analytics that discover and exploit social networks.
Experience with taxonomy construction for analytic disciplines, knowledge areas, and skills.
Experience designing and developing automated analytic software, techniques, and algorithms.
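As an illustration of the MapReduce programming model named in the required skills above, the following is a minimal, single-JVM word-count sketch in Java (the class and method names are illustrative only and are not part of this posting; Hadoop distributes the same map and reduce phases across a cluster):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

public class WordCount {
    // "Map" phase: split each input line into individual words.
    // "Reduce" phase: group identical words and sum their occurrences.
    public static Map<String, Long> count(List<String> lines) {
        return lines.stream()
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+")))
                .filter(word -> !word.isEmpty())
                .collect(Collectors.groupingBy(Function.identity(), Collectors.counting()));
    }

    public static void main(String[] args) {
        Map<String, Long> counts = count(List.of("to be or not to be"));
        System.out.println(counts.get("be")); // prints 2
    }
}
```

In a real Hadoop job the map and reduce steps would be separate Mapper and Reducer classes executed in parallel over HDFS blocks; the single-process version above only demonstrates the programming model itself.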
(U) Core Competencies and Skills:
• Java programming for distributed systems, with experience in networking and multi-threading
• Apache Hadoop
• Apache Accumulo
• Apache NiFi
• Agile development experience
• Well-grounded in Linux fundamentals and knowledge of at least one scripting language (e.g., Python, Ruby, Perl)
• Experience with source code management practices and tools
• Enabling tools: Git, Maven, Jira
• Willingness to be a committer/contributor to open source applications
• Continuous Integration / Continuous Testing: Bamboo, Jenkins, GitLab CI/Pipelines
• Continuous Monitoring: ELK Stack (Elasticsearch, Logstash, and Kibana), Nagios
• Familiarity with the microservices software development style and container orchestration (e.g., Kubernetes)
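The core competencies above call for Java programming with multi-threading experience. As one hedged illustration (class and method names are hypothetical, not from this posting), the sketch below partitions an array across a fixed thread pool and combines the partial sums, the same divide-and-combine pattern that distributed computation frameworks apply across machines:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelSum {
    // Split the input into roughly equal chunks, sum each chunk on a
    // worker thread, then combine the partial results on the caller.
    public static long sum(long[] data, int workers)
            throws InterruptedException, ExecutionException {
        ExecutorService pool = Executors.newFixedThreadPool(workers);
        try {
            int chunk = (data.length + workers - 1) / workers;
            List<Future<Long>> futures = new ArrayList<>();
            for (int start = 0; start < data.length; start += chunk) {
                final int lo = start;
                final int hi = Math.min(start + chunk, data.length);
                futures.add(pool.submit(() -> {
                    long partial = 0;
                    for (int i = lo; i < hi; i++) partial += data[i];
                    return partial;
                }));
            }
            long total = 0;
            for (Future<Long> f : futures) total += f.get(); // blocks until each worker finishes
            return total;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        long[] data = new long[1000];
        for (int i = 0; i < data.length; i++) data[i] = i + 1;
        System.out.println(sum(data, 4)); // prints 500500
    }
}
```

The `ExecutorService`/`Future` pattern shown here is standard-library Java; production distributed systems layer networking, serialization, and fault tolerance on top of the same decomposition.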