Senior Data Engineer
Job Number: 25-04373

Want to be part of the Energy Industry? ECLARO is looking for a Senior Data Engineer for our client in White Plains, NY.

ECLARO's Client is America's largest state power organization and is a national leader in energy efficiency and clean energy technology. If you’re up to the challenge, then take a chance at this rewarding opportunity!

Position Overview:
  • Client's current on-premises Enterprise Resource Planning (ERP) system, SAP ECC 6.0, has been in use for over a decade and is nearing technological obsolescence; as a result, Client must select and implement a new ERP system.
  • The objective in deploying a new ERP system is to implement a platform that integrates all business functions, including finance, operations, and human resources, into a cohesive whole.
  • This implementation aims to enhance organizational efficiency, improve data accuracy, and provide real-time reporting capabilities.
  • The goal is to streamline processes, reduce operational costs, and support informed decision-making across all departments.

Responsibilities:
  • Cloud Data Engineering & Integration:
    • Design and implement data pipelines across AWS, Azure, and Google Cloud.
    • Develop SAP BTP integrations with cloud and on-premises systems.
    • Ensure seamless data movement and storage between cloud platforms.
  • ETL & Data Pipeline Development:
    • Develop and optimize ETL workflows using Pentaho, Microsoft ADF, or equivalent ETL tools.
    • Design scalable and efficient data transformation, movement, and ingestion processes.
    • Monitor and troubleshoot ETL jobs to ensure high availability and performance.
  • API Development & Data Integration:
    • Develop and integrate RESTful APIs to support data exchange between SAP and other platforms.
    • Work with API gateways and authentication methods such as OAuth, JWT, and API keys.
    • Implement API-based data extractions and real-time event-driven architectures.
  • Data Analysis & SQL Development:
    • Write and optimize SQL queries, stored procedures, and scripts for data analysis, reporting, and integration.
    • Perform data profiling, validation, and reconciliation to ensure data accuracy and consistency.
    • Support data transformation logic and business rules for ERP reporting needs.
  • Data Governance & Quality (Ataccama, Collibra):
    • Work with Ataccama and Collibra to define and enforce data quality and governance policies.
    • Implement data lineage, metadata management, and compliance tracking across systems.
    • Ensure compliance with enterprise data security and governance standards.
  • Cloud & DevOps (AWS, Azure, GCP):
    • Utilize Azure DevOps and GitHub for version control, CI/CD, and deployment automation.
    • Deploy and manage data pipelines on AWS, Azure, and Google Cloud.
    • Work with serverless computing (Lambda, Azure Functions, Google Cloud Functions) to automate data workflows.
  • Collaboration & Documentation:
    • Collaborate with SAP functional teams, business analysts, and data architects to understand integration requirements.
    • Document ETL workflows, API specifications, data models, and governance policies.
    • Provide technical support and troubleshooting for data pipelines and integrations.

Required Skills:
  • 7+ years of experience in Data Engineering, ETL, and SQL development.
  • Hands-on experience with SAP BTP Integration Suite for SAP and non-SAP integrations.
  • Strong expertise in Pentaho (PDI), Microsoft ADF, and API development.
  • Proficiency in SQL (stored procedures, query optimization, performance tuning).
  • Experience working with Azure DevOps, GitHub, and CI/CD for data pipelines.
  • Good understanding of data governance tools (Ataccama, Collibra) and data quality management.
  • Experience working with AWS, Azure, and Google Cloud (GCP) for data integration and cloud-based workflows.
  • Strong problem-solving skills and ability to work independently in a fast-paced environment.

Preferred Experience:
  • Experience working on SAP S/4HANA and cloud-based ERP implementations.
  • Familiarity with Python and PySpark for data processing and automation.
  • Experience with Pentaho, Microsoft ADF, or equivalent ETL tools.
  • Knowledge of event-driven architectures.
  • Familiarity with CI/CD for data pipelines (Azure DevOps, GitHub Actions, etc.).

Pay Rate: $60 - $70/Hour

If hired, you will enjoy the following ECLARO Benefits:
  • 401k Retirement Savings Plan administered by Merrill Lynch
  • Commuter Check Pretax Commuter Benefits
  • Eligibility to purchase Medical, Dental & Vision Insurance through ECLARO

If interested, you may contact:
Cedric Ceballo
cedric.ceballo@eclaro.com
646-357-1237
Cedric Ceballo | LinkedIn

Equal Opportunity Employer: ECLARO values diversity and does not discriminate based on Race, Color, Religion, Sex, Sexual Orientation, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status, in compliance with all applicable laws.