About the position
We are looking for a Data Engineer with experience in designing, building, and maintaining scalable data pipelines to help us develop logistics optimization solutions for clients in the transportation and distribution sectors.
The ideal candidate should have experience integrating multiple data sources (EDI, APIs, databases) and developing systems that support advanced analytics and route optimization algorithms.
Responsibilities
• Design and build data pipelines that automate order intake from various sources (EDI, email, calls, APIs, etc.).
• Integrate data with existing platforms such as CoreX, QuickBooks, and customer communication tools.
• Collaborate with the engineering team to design route optimization and resource allocation algorithms.
• Develop and maintain ETL/ELT workflows that ensure data quality and real-time availability.
• Implement cloud solutions (AWS, GCP, or Azure) for scalability and security.
• Support the development of dashboards and analytical tools that help dispatchers make fast, data-driven decisions.
• Apply best practices in data security, governance, and documentation.
Must Haves
- Data-focused API integrations (extract/store/report data, not just transactional)
- NetSuite integration experience
- EDI integration experience
- Excellent English communication
- On-prem SQL Server experience
- Cron-type automation tools (e.g., VisualCron)
- PowerShell scripting
Nice to Haves
- Experience with files.com (file transfers/integrations)
- Python for automation/data manipulation
- Webhooks (especially with NetSuite/WMS)
- Dashboard/reporting tools (Power BI, Tableau, Looker, SSRS)
- Cloud platform exposure (AWS, Azure)
Desirable
• Previous experience in logistics, transportation, or supply chain.
• Experience in projects involving dispatch and routing process automation.
• Knowledge of machine learning applied to resource optimization.