Cloud & Data Engineer - On-site Only

Posted: Monday, 25 August 2025
Valid Thru: Wednesday, 24 September 2025

Location: Canonsburg, PA, 15317, US

Industry: Mining
Occupational Category: 15-1061.00 - Computer and Mathematics
Type of Employment: Full-time

CONSOL Mining Company LLC is hiring!

Description:

Role Summary:

The Cloud & Data Engineer supports the design, build, and maintenance of Core’s cloud infrastructure, data lake environment, and backend service integrations. Reporting to the Technology Strategist & Cloud Engineering Manager, this role delivers production-grade systems that enable analytics, automation, and reliable operations across enterprise platforms.

Key Responsibilities

  • Accept, embrace, and promote the following Core Values of Core Natural Resources: Safety, Sustainability, Continuous Improvement
  • Implement and maintain cloud infrastructure using infrastructure-as-code and CI/CD pipelines (Terraform, Git-based workflows)
  • Develop, test, and monitor ETL/ELT data pipelines for ingestion, transformation, and analytics-ready data flows (see the illustrative sketch following this list)
  • Maintain and scale Core’s cloud-native data lake to support business reporting, data quality, and financial operations
  • Build, manage, and operationalize backend APIs and service integrations, ensuring secure and stable data exchange
  • Support deployments of internal applications and operational tooling, focusing on automation, availability, and performance
  • Implement observability tools (e.g., logging, tracing, monitoring, alerts) across infrastructure and data systems
  • Participate in incident resolution, root-cause analysis, and ongoing operational improvement
  • Create and maintain technical documentation: pipeline architecture, API contracts, infrastructure diagrams, and data lineage
  • Collaborate with internal teams to deliver consistent and auditable cloud solutions aligned to enterprise standards
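
To give a flavor of the pipeline work described above, the following is a minimal, hypothetical Python sketch of an S3-based ingest-and-transform step; the bucket names, key layout, and cleaning logic are illustrative placeholders rather than Core’s actual pipelines.

    import csv
    import io

    import boto3

    s3 = boto3.client("s3")

    RAW_BUCKET = "example-raw-zone"          # hypothetical bucket name
    CURATED_BUCKET = "example-curated-zone"  # hypothetical bucket name

    def transform_object(key: str) -> None:
        """Read a raw CSV object, normalize it, and write it to the curated zone."""
        body = s3.get_object(Bucket=RAW_BUCKET, Key=key)["Body"].read().decode("utf-8")
        rows = list(csv.DictReader(io.StringIO(body)))

        # Example transformation: normalize column names and drop fully empty rows.
        cleaned = [
            {k.lower().strip(): (v or "").strip() for k, v in row.items()}
            for row in rows
            if any((v or "").strip() for v in row.values())
        ]
        if not cleaned:
            return

        out = io.StringIO()
        writer = csv.DictWriter(out, fieldnames=cleaned[0].keys())
        writer.writeheader()
        writer.writerows(cleaned)
        s3.put_object(Bucket=CURATED_BUCKET, Key=key, Body=out.getvalue().encode("utf-8"))

In practice, a step like this would typically run inside Glue, Lambda, or a Step Functions state machine and be wired into the observability and alerting described above.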

Required Qualifications

  • Bachelor’s degree in Computer Science, Software Engineering, Data Engineering, or a related technical discipline
  • Professional experience in cloud or data engineering roles, with demonstrated ownership of production-grade deliverables
  • AWS Certified Solutions Architect - Associate certification (or higher) required; applicants without it must obtain the Associate certification within 60 days of hire
  • Experience working in environments with multi-account AWS architectures, enterprise security controls, and cost optimization practices
  • Ability to independently deliver end-to-end cloud/data solutions from architecture to production with minimal oversight
  • Demonstrated ability to integrate AWS data pipelines with at least one ERP or enterprise financial platform in production
  • Hands-on experience with AWS services including S3, Lambda, RDS, IAM, and ETL tools such as Glue and Step Functions (an illustrative sketch combining several of these follows this list)
  • Proficiency in Python and SQL for data processing, transformation, and scripting
  • Demonstrated experience developing, integrating, or consuming RESTful APIs in a backend context
  • Proven experience deploying infrastructure using Terraform in a team-based GitOps workflow
  • Solid understanding of data lake architecture, data modeling principles, and pipeline orchestration (batch and streaming)
  • Experience with monitoring and observability tooling (e.g., CloudWatch, Prometheus, Grafana), including alerting and dashboarding
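
As a concrete but purely illustrative example of how several of the services above fit together, here is a minimal Python Lambda handler that reacts to an S3 event notification and publishes a custom CloudWatch metric; the function name, metric namespace, and metric name are hypothetical.

    import boto3

    cloudwatch = boto3.client("cloudwatch")

    def handler(event, context):
        # S3 put-notification events carry one or more records describing new objects.
        records = event.get("Records", [])
        for record in records:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            print(f"New object received: s3://{bucket}/{key}")

        # Emit a simple ingestion counter so dashboards and alerts can track activity.
        cloudwatch.put_metric_data(
            Namespace="ExamplePipeline",  # hypothetical namespace
            MetricData=[
                {"MetricName": "ObjectsIngested", "Value": len(records), "Unit": "Count"},
            ],
        )
        return {"ingested": len(records)}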

Preferred Qualifications

  • 1–3 years of professional experience in production-grade cloud, data, or backend engineering roles.
  • Hands-on experience with orchestration platforms such as Apache Airflow, dbt, or AWS-native equivalents for automated pipeline management (see the sketch following this list).
  • Proficiency in containerization and backend service deployment workflows, including Docker and orchestration on ECS or EKS, with knowledge of CI/CD integration.
  • Experience developing and maintaining backend applications using Python frameworks such as Django or FastAPI, or Node.js frameworks such as Express.js.
  • Familiarity with microservices architecture and service-to-service communication using REST or gRPC.
  • Understanding of event-driven architectures using AWS SNS, SQS, Kinesis, or Kafka.
  • Experience with unit testing, integration testing, and continuous testing practices (e.g., PyTest, Jest).
  • Knowledge of relational and NoSQL databases (e.g., PostgreSQL, DynamoDB, MongoDB) and ORM frameworks (e.g., Django ORM, SQLAlchemy).
  • Experience deploying and integrating machine learning models into production systems using frameworks such as scikit-learn, TensorFlow, or PyTorch.
  • Familiarity with MLOps workflows — model packaging, versioning (e.g., MLflow), and monitoring in a cloud environment.
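
For orchestration of the kind mentioned above, a scheduled pipeline might be expressed as an Airflow DAG along the following lines. This is a minimal sketch assuming a recent Airflow 2.x release; the DAG name and task logic are placeholders, not an actual Core pipeline.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract(**context):
        # Placeholder: pull raw records from a source system.
        return [{"id": 1, "value": 42}]

    def load(**context):
        # Placeholder: write transformed records to the data lake or warehouse.
        records = context["ti"].xcom_pull(task_ids="extract")
        print(f"Loading {len(records)} records")

    with DAG(
        dag_id="example_daily_ingest",  # hypothetical DAG name
        start_date=datetime(2025, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task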

Apply Now