Integration Data Engineer

Posted: Tuesday, 05 August 2025
Valid Thru: Thursday, 04 September 2025

Location: Jersey City, NJ, 07097, US

Industry: Advertising and Public Relations
Occupational Category: 13-0000.00 - Business and Financial Operations
Type of Employment: FULL_TIME

Mizuho Americas Services LLC is hiring!

Description:

Join Mizuho as an Integration Data Engineer!

We are seeking an Integration Data Engineer with a background in SQL and data warehousing for enterprise-level systems. The ideal candidate is comfortable working with business users and brings business-analyst expertise.

Major Responsibilities:
  • Design, develop, and deploy Databricks jobs to process and analyze large volumes of data.
  • Collaborate with data engineers and data scientists to understand data requirements and implement appropriate data processing pipelines.
  • Optimize Databricks jobs for performance and scalability to handle big data workloads.
  • Monitor and troubleshoot Databricks jobs, identify and resolve issues or bottlenecks.
  • Implement best practices for data management, security, and governance within the Databricks environment.
  • Design and develop Enterprise Data Warehouse solutions.
  • Demonstrate proficiency with data analytics and data insights.
  • Write SQL queries and programs, including stored procedures, and reverse engineer existing processes.
  • Leverage SQL, a programming language (Python or similar), and/or ETL tools (Azure Data Factory, Databricks, Talend, SnowSQL) to develop data pipeline solutions that ingest and exploit new and existing data sources.
  • Perform code reviews to ensure fit to requirements, optimal execution patterns and adherence to established standards.


Skills:

• 5+ years - Enterprise Data Management

• 5+ years - SQL Server-based development with large datasets

• 5+ years with data warehouse architecture, including hands-on experience with the Databricks platform and extensive PySpark coding experience. Snowflake experience is a plus

• 3+ years of Python (NumPy, pandas) coding experience

• 3+ years' experience in the finance/banking industry, with some understanding of securities and banking products and their data footprints

• Experience with Snowflake utilities such as SnowSQL and Snowpipe is a plus

• Experience in data warehousing: OLTP, OLAP, dimensions, facts, and data modeling

• Previous experience leading an enterprise-wide Cloud Data Platform migration with strong architectural and design skills

• Capable of discussing enterprise level services independent of technology stack

• Experience with Cloud based data architectures, messaging, and analytics

• Superior communication skills

• Cloud certification(s) preferred

• Experience with regulatory reporting is a plus

Education

• At minimum, a bachelor's degree in an engineering or computer science discipline

• Master's degree strongly preferred

The expected base salary ranges from $77,000 to $135,000. Salary offers are based on a wide range of factors, including relevant skills, training, experience, education, and, where applicable, certifications and licenses obtained. Market and organizational factors are also considered. In addition to salary and a generous employee benefits package, including medical, dental, and 401(k) plans, successful candidates are also eligible to receive a discretionary bonus.

#LI-Hybrid
