
Data Engineer SME - Hybrid

Iron EagleX, Inc.
Oct 11, 2025

Job Locations

US-VA-Crystal City


Job ID
2025-2776

Clearance Level
Top Secret SCI



Overview

Iron EagleX (IEX), a wholly owned subsidiary of General Dynamics Information Technology, delivers agile IT and Intelligence solutions. Combining small-team flexibility with global scale, IEX leverages emerging technologies to provide innovative, user-focused solutions that empower organizations and end users to operate smarter, faster, and more securely in dynamic environments.



Responsibilities

Job Description:

We are seeking a Data Engineer SME to design, build, and maintain data collection, processing, and integration pipelines that centralize and structure data from various web-based applications. You will work closely with software developers, data analysts, and product teams to ensure seamless data flow, high-quality data storage, and efficient retrieval methods. This role is ideal for someone passionate about data-driven decision-making and optimizing data aggregation processes while exploring and identifying right-sized technologies and computing resources.

This is not a Data Science, Big Data, ML-Ops, or AI-related role. Instead, this position focuses on wrangling disparate data sources into unified, consumable data APIs for internal and external stakeholders. This is a hybrid position in Crystal City, VA (3 days onsite, 2 remote).

Job Duties Include (but are not limited to):

  • Design, develop, and implement scalable data pipelines and ETL processes using Apache Airflow, with a focus on data for AI applications.
  • Develop messaging solutions utilizing Kafka to support real-time data streaming and event-driven architectures.
  • Build and maintain high-performance data retrieval solutions using ElasticSearch/OpenSearch.
  • Implement and optimize Python-based data processing solutions.
  • Integrate batch and streaming data processing techniques to enhance data availability and accessibility.
  • Ensure adherence to security and compliance requirements when working with classified data.
  • Work closely with cross-functional teams to define data strategies and develop technical solutions aligned with mission objectives.
  • Deploy and manage cloud-based infrastructure to support scalable and resilient data solutions, including AWS S3.
  • Optimize data storage, retrieval, and processing efficiency, focusing on big data and the use of Parquet files.
  • Utilize Trino for distributed SQL query processing.
  • Leverage OpenMetaData for data discovery and governance.
  • Implement APIs using FastAPI for streamlined data interaction.
  • Ensure proper implementation of PKIs & ACCMs.
  • Integrate DIAS solutions within the data pipeline framework.


Qualifications

Required Skills & Experience:

  • Extensive experience with Apache Airflow for workflow orchestration.
  • Strong programming skills in Python.
  • Familiarity with OPA and Trino.
  • Experience with ElasticSearch/OpenSearch for data indexing and search functionalities.
  • Understanding of vector databases, embedding models, and vector search for AI applications.
  • Expertise in event-driven architecture and microservices development.
  • Hands-on experience with cloud services, particularly AWS S3 and MinIO, including data storage and compute resources.
  • Strong knowledge of Rego (the OPA policy language) and its application within data environments.
  • Understanding of big data technologies and file formats like Parquet.
  • Expertise in data pipeline orchestration and workflow automation.
  • Working knowledge of Linux environments and database optimization techniques.
  • Strong understanding of version control with Git.
  • Familiarity with OpenMetaData for data management.
  • Experience implementing FastAPI for building scalable APIs.
  • Due to US Government Contract Requirements, only US Citizens are eligible for this role.

Education & Certifications:

  • Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field (or equivalent experience). Advanced degrees are a plus.

Security Clearance:

  • An active TS/SCI security clearance is REQUIRED, and candidates must have or be willing to obtain a CI Poly. Candidates without this clearance will not be considered.

Equal Opportunity Employer / Individuals with Disabilities / Protected Veterans
