A Bachelor's Degree in Computer Science or a related technical field is highly desired and will be considered equivalent to two (2) years of experience. A Master's degree in a technical field will be considered equivalent to four (4) years of experience. NOTE: A degree in Mathematics, Information Systems, Engineering, or a similar discipline will be considered a technical field.
· Understanding of big-data cloud scalability (e.g., Amazon, Google, Facebook)
· Experience designing and developing automated analytic software, techniques, and algorithms.
· Experience developing and deploying:
  - analytics that include foreign language processing;
  - analytic processes that incorporate/integrate multimedia technologies, including speech, text, image, and video exploitation;
  - analytics that function on massive data sets (e.g., more than a billion rows or larger than 10 petabytes);
  - analytics that employ semantic relationships (i.e., inference engines) between structured and unstructured data sets;
  - analytics that identify latent patterns between elements of massive data sets (e.g., more than a billion rows or larger than 10 petabytes);
  - analytics that employ techniques commonly associated with artificial intelligence (e.g., genetic algorithms).
· Experience with taxonomy construction for analytic disciplines, knowledge areas and skills.
· Experience with linguistics (grammar, morphology, concepts).
· Experience developing and deploying analytics that discover and exploit social networks.
· Experience documenting ontologies, data models, schemas, formats, data element dictionaries, software application programming interfaces (APIs), and other technical specifications.
· Experience developing and deploying analytics within a heterogeneous schema environment.
- Shall have at least eight (8) years of experience in software development/engineering, including requirements analysis, software development, installation, integration, evaluation, enhancement, maintenance, testing, and problem diagnosis/resolution.
- Shall have at least six (6) years of experience developing software in high-level languages such as Java, C, or C++.
- Shall have demonstrated ability to work with open-source NoSQL products that support highly distributed, massively parallel computation needs, such as HBase, Accumulo, Bigtable, etc.
- Shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc.
- Shall have demonstrated work experience with the Hadoop Distributed File System (HDFS)
- Shall have demonstrated work experience with serialization formats such as JSON and/or BSON
- Shall have demonstrated work experience developing RESTful services
- Shall have at least five (5) years of experience developing software for UNIX/Linux (Red Hat versions 3-5) operating systems.
- Shall have demonstrated work experience in the requirements analysis and design of at least one object-oriented system.
- Shall have demonstrated work experience developing solutions integrating and extending FOSS/COTS products.
- Shall have at least three (3) years of experience in software integration and software testing, including developing and implementing test plans and test scripts.
- Shall have demonstrated technical writing skills and shall have generated technical documents in support of a software development project.
- Hadoop/Cloud Developer Certification, or a comparable cloud system/service certification
- In addition, the candidate shall have demonstrated work experience in at least four (4) of the desired characteristics.
- Experience deploying applications in a cloud environment.
- Experience developing and deploying data-driven analytics; event-driven analytics; and sets of analytics orchestrated through rules engines.