
Cloud Software Engineer III
Reflexive Concepts, Annapolis Junction, MD, United States
Reflexive Concepts is seeking a skilled Cloud Software Engineer to join our team!
Specifically, we are looking for a Cloud Software Engineer with expertise in Big Data, Hadoop Ecosystem, Java, and Distributed Computing to develop, maintain, and enhance large-scale cloud-based back-end processing, analytics, and indexing systems.
Qualifications:
Twelve (12) years of experience as a software engineer (SWE) in programs and contracts of similar scope, type, and complexity, four (4) years of which must be in programs utilizing Big-Data cloud technologies and/or Distributed Computing
Bachelor's degree in Computer Science or related discipline from an accredited college or university
Four (4) years of cloud software engineering experience on projects with similar Big-Data systems may be substituted for a bachelor's degree
A Master's degree in Computer Science or a related discipline from an accredited college or university may be substituted for two (2) years of experience
Cloudera Certified Hadoop Developer certification may be substituted for one (1) year of Cloud experience
Tech-Stack / Architecture Familiarity:
NiFi
Testing frameworks (JEST)
Grafana
InfluxDB
Elasticsearch
Redis
MySQL
Apache Superset
RabbitMQ/Kafka/Message Fabrics
Ansible/Salt
Terraform (for lane development)
WebSockets
JBlocks
CASPORT
Neo4j
Agency Security Labels (CAMKEY, LAC, COI, etc.)
Required Skills:
JavaScript (and eventually some TypeScript)
React/JSX
Node.js/JavaScript
Rust (for high-performance or secure portions)
Python
Shell scripting
Kubernetes
Helm Charts
Micro-Services deployment patterns
Service-mesh
Multi-site applications
Load Balancing topologies
Cloud Migration
The following Cloud-related experience is required:
Two (2) years of experience in Cloud and/or Distributed Computing Information Retrieval (IR)
One (1) year of experience implementing code that interacts with a Cloud Big Table implementation
One (1) year of experience implementing code that interacts with a Cloud Distributed File System implementation
One (1) year of experience implementing complex MapReduce analytics
One (1) year of experience implementing code that interacts with Cloud Distributed Coordination Frameworks
One (1) year of experience in architecting Cloud Computing solutions
One (1) year of experience in debugging problems with Cloud-based Distributed Computing Frameworks
One (1) year of experience in managing a multi-node Cloud-based installation
Experience in Computer Network Operations:
Utility Computing, Network Management, Virtualization (VMware or VirtualBox), Cloud Computing
Multi-Node Management and Installation: management and installation of Cloud and Distributed Computing software on multiple nodes using Python, CFEngine, Bash, Ruby, or related technologies
Experience in Information Assurance:
Securing Cloud-based and Distributed applications through industry-standard techniques such as firewalls, PKI certificates, and server authentication, with experience in corporate authentication service(s)
Experience in Information Technology:
Object-Oriented Design and Programming, Java, Eclipse or a similar development environment, Maven, RESTful web services
Cloud and Distributed Computing Technologies: at least one or a combination of several of the following areas - YARN, J2EE, MapReduce, ZooKeeper, HDFS, HBase, JMS, Concurrent Programming, Multi-Node implementation/installation, and other applicable technologies
Cloud and Distributed Computing Information Retrieval: at least one or a combination of several of the following areas - HDFS, HBase, Apache Lucene, Apache Solr, MongoDB
Ingesting, Parsing, and Analysis of Disparate Data Sources and Formats: XML, JSON, CSV, binary formats, Sequence or Map Files, Avro, and related technologies
Aspect-Oriented Design and Development
Debugging and Profiling Cloud and Distributed Installations: Java Virtual Machine (JVM) memory management, Profiling Java Applications
UNIX/Linux, CentOS
Experience in SIGINT:
Experience with at least one SIGINT collection discipline area (FORNSAT, CABLE, Terrestrial/Microwave, Overhead, or ELINT)
Geolocation, emitter identification, and signal applications
Joint program collection platforms and dataflow architectures; signals characterization analysis
Experience with Other:
CentOS, Linux/RedHat
Configuration management tools such as Subversion, ClearQuest, or Razor