Data Engineer

Posted: Wednesday, 11 February 2026
Valid Thru: Friday, 13 March 2026

Location: Alexandria, VA, 22301, US

Industry: Technology
Occupational Category: 15-0000.00 - Computer and Mathematical Occupations
Type of Employment: FULL_TIME

KeyLogic, LLC is hiring!

Description:

Location: Remote
Clearance: Must be able to obtain a Public Trust (U.S. Citizen/Green Card)
Salary: $120,000 - $125,000/year

Overview

KeyLogic is seeking a Data Engineer to support mission-critical data engineering initiatives. This role is ideal for someone who can operate independently, lead major technical assignments, and design reliable, scalable data solutions that improve how data powers business operations.

You’ll own complex data pipeline development and database performance/maintenance work across Oracle and PostgreSQL, while helping modernize orchestration, governance, and cloud-based analytics tooling.


What You’ll Do

  • Provide leadership on major tasks or technical assignments
  • Work independently on high-visibility or mission-critical aspects of one or more products
  • Establish goals and plans that meet product objectives
  • Apply domain expertise and expert technical knowledge; may supervise others

Key Qualifications

  • Ability to independently manage Oracle and PostgreSQL databases by designing efficient schemas, optimizing complex SQL queries, and automating ETL/ELT pipelines
  • Experience with performance tuning (indexing, vacuuming), implementing data quality checks, managing backups/restores, and using psql for database administration
  • Demonstrated success developing data pipelines — independently building robust, efficient pipelines, managing schema evolution, handling late-arriving data, and implementing backfill strategies
  • Demonstrated ability to design efficient table structures, implement star/snowflake schemas, and utilize data lake technologies
  • Familiarity with database schema change management tools such as Liquibase
  • Technical Proficiency: Advanced SQL, strong Python, and experience with big data technologies such as Apache Spark, Kafka, and Airflow for orchestration
  • Cloud Infrastructure & DevOps: Utilizing cloud services (e.g., Databricks), infrastructure-as-code tools like Terraform, and container orchestration with Kubernetes
  • Data Quality & Governance: Implementing data quality checks, testing with Pytest, and managing data lineage to ensure reliability
  • Collaboration: Working with stakeholders to translate business requirements into technical solutions and documenting data processes
  • Data Maintenance: Demonstrated ability to create and follow low-risk processes to handle business-requested data maintenance activities
  • Business Context: Understands how data drives business processes; able to collaborate with stakeholders to design/modify data processes and procedures to meet operational needs
  • Experience using AI-assisted development tools
  • Ability to obtain a Public Trust clearance

Education & Experience Requirements

Must meet one of the following:

  • BA/BS + 10 years of experience
  • MA/MS (or higher) + 8 years of experience
  • 14 years of experience in lieu of a degree


Apply Now