Contribute to a Software Design and Development team that will engineer, configure, performance-tune, and efficiently diagnose and repair issues across all aspects of the Event Warehouse (EWH) and Content Warehouse (CWH) large-scale computer clusters.
This position will have a primary focus on supporting all aspects of software design and development for a key product that enables ingest and query of IC data, primarily through parsing and transformation of disparate data sets. Responsibilities will include development of new capabilities, testing, optimization and performance tuning, operation, and sustainment functions.
Required – Shall have at least eight (8) years of experience in software development/engineering, including requirements analysis, software development, installation, integration, evaluation, enhancement, maintenance, testing, and problem diagnosis/resolution.
Required – Shall have demonstrated experience working with open-source NoSQL products that support highly distributed, massively parallel computation needs, such as HBase, CloudBase/Accumulo, Bigtable, etc.
Required – Shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc.
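For context, the MapReduce model named above splits a computation into a map phase (emit key–value pairs), a shuffle (group values by key), and a reduce phase (aggregate each group). A rough, framework-free sketch of the model in plain Python follows; the function names and toy data are illustrative only, not Hadoop's API:

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit a (word, 1) pair for each word in each input record."""
    for record in records:
        for word in record.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    """Shuffle: group emitted values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: aggregate (here, sum) the values for each key."""
    return {key: sum(values) for key, values in groups.items()}

records = ["the quick fox", "the lazy dog"]
counts = reduce_phase(shuffle(map_phase(records)))
# counts == {"the": 2, "quick": 1, "fox": 1, "lazy": 1, "dog": 1}
```

In Hadoop the same three roles are filled by Mapper and Reducer classes, with the shuffle handled by the framework across the cluster.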
Required – Shall have demonstrated work experience with the Hadoop Distributed File System (HDFS).
Required – Shall have demonstrated work experience with serialization formats such as JSON and/or BSON.
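Serialization in this sense means converting in-memory structures to an interchange format and back without loss. A minimal JSON round-trip sketch using Python's standard json module (the event fields shown are hypothetical examples, not a schema from this program):

```python
import json

# A hypothetical parsed record, as it might exist in memory after ingest.
event = {"id": "evt-001", "type": "ingest", "tags": ["parsed", "normalized"]}

# Serialize the dict to a JSON string suitable for storage or transport.
payload = json.dumps(event)

# Deserialize it back; the round trip preserves structure and values.
restored = json.loads(payload)
assert restored == event
```

BSON is a binary superset of this idea (used by MongoDB, among others) that adds types such as dates and raw bytes; the serialize/deserialize round-trip discipline is the same.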
Required – Shall have demonstrated work experience in the requirements analysis and design of at least one object-oriented system.
Required – Shall have demonstrated work experience developing solutions integrating and extending FOSS/COTS products.
Required – Shall have at least three (3) years of experience in software integration and software testing, to include developing and implementing test plans and test scripts.
Required – Shall have demonstrated technical writing skills and shall have generated technical documents in support of a software development project.
Required – Experience developing and deploying data-driven analytics, event-driven analytics, and sets of analytics orchestrated through rules engines.
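One common reading of "analytics orchestrated through rules engines" is a set of predicate/action pairs evaluated against each incoming event, with matching rules dispatching the corresponding analytic. A minimal sketch under that assumption (the rule names, thresholds, and event fields are invented for illustration; a production rules engine would add priorities, conflict resolution, and externalized rule definitions):

```python
# Each analytic annotates the event it is given.
def flag_large(event):
    return {**event, "flags": event.get("flags", []) + ["large"]}

def flag_external(event):
    return {**event, "flags": event.get("flags", []) + ["external"]}

# A rule pairs a predicate (when to fire) with an analytic (what to run).
RULES = [
    (lambda e: e["size"] > 1000, flag_large),
    (lambda e: e["source"] == "external", flag_external),
]

def run_rules(event, rules):
    """Apply, in order, every analytic whose predicate matches the event."""
    for predicate, analytic in rules:
        if predicate(event):
            event = analytic(event)
    return event

result = run_rules({"size": 2048, "source": "external"}, RULES)
# result["flags"] == ["large", "external"]
```

The same pattern scales from this toy loop up to dedicated engines (e.g., Drools-style systems), where the orchestration logic lives in rule definitions rather than application code.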
Required – In addition, the candidate shall have demonstrated work experience in at least four (4) of the following desired characteristics:
Understanding of big-data cloud scalability (e.g., Amazon, Google, Facebook).
Experience with linguistics (grammar, morphology, concepts).
Experience developing and deploying analytics within a heterogeneous schema environment.
Experience documenting ontologies, data models, schemas, formats, data element dictionaries, software application programming interfaces (APIs), and other technical specifications.
Experience developing and deploying analytics that discover and exploit social networks.
Experience with taxonomy construction for analytic disciplines, knowledge areas, and skills.
Experience designing and developing automated analytic software, techniques, and algorithms.