About the position:
We are looking for a Data Engineer with experience in designing, building, and maintaining scalable data pipelines to help us develop logistics optimization solutions for clients in the transportation and distribution sectors.
The ideal candidate should have experience integrating multiple data sources (EDI, APIs, databases) and developing systems that support advanced analytics and route optimization algorithms.
Responsibilities:
• Design and build data pipelines that automate order intake from various sources (EDI, email, phone calls, APIs, etc.).
• Integrate data with existing platforms such as CoreX, QuickBooks, and customer communication tools.
• Collaborate with the engineering team to design route optimization and resource allocation algorithms.
• Develop and maintain ETL/ELT workflows that ensure data quality and real-time availability.
• Implement cloud solutions (AWS, GCP, or Azure) for scalability and security.
• Support the construction of dashboards and analytical tools that enable dispatchers to make quick, data-driven decisions.
• Apply best practices in data security, governance, and documentation.
Requirements:
• Proven experience as a Data Engineer (3+ years).
• Solid knowledge of SQL, Python, and data processing frameworks (e.g., PySpark, Pandas, Airflow, dbt).
• Experience with REST APIs, webhooks, and data integration services.
• Experience with relational and non-relational databases (MySQL, PostgreSQL, MongoDB, etc.).
• Familiarity with logistics or route optimization algorithms (desirable).
• Experience in cloud computing (AWS, GCP, Azure).
• Knowledge of visualization tools (e.g., Power BI, Tableau, Looker, or similar).
• Intermediate to advanced English (written and spoken).
Desirable:
• Previous experience in logistics, transportation, or supply chain.
• Experience with projects automating dispatch and routing processes.
• Knowledge of machine learning applied to resource optimization.