

Data Engineer | 21058 | Phoenix, AZ | 1/19/2026 2:34 PM

IT
FTE - Client

Job Description

We're looking for a skilled Data Engineer to join our team in the transportation sector. In this role, you'll work with modern cloud technologies to build and maintain data pipelines that support analytics, reporting, and operational insights. You'll be part of a highly collaborative product engineering team focused on delivering reliable, scalable data solutions that help drive smarter decision-making across the organization.
This role operates within an Agile environment using Scrum with strong XP engineering practices, emphasizing small, frequent deliveries, continuous improvement, Test-Driven Development, and shared ownership. You will collaborate daily with product managers, QA, and fellow engineers through standups, backlog refinement, sprint planning, reviews, and retrospectives. Strong verbal and written communication skills are essential, as active participation in design discussions, pairing, and problem-solving is core to how the team operates.
This is a great opportunity for someone with solid experience in backend data systems who enjoys solving real-world problems, working in fast feedback loops, and contributing to continuously evolving data platforms. The position is hybrid, based at our office in downtown Phoenix.
Key Responsibilities
  • Design, develop, and maintain cloud-native data pipelines leveraging Databricks, Microsoft Azure Data Factory, and Microsoft Fabric to support robust data integration and analytics solutions.
  • Implement incremental and real-time data ingestion strategies using a medallion (bronze/silver/gold) architecture for data lake storage (see the sketch following this list).
  • Write and optimize complex SQL queries to transform, integrate, and analyze data across enterprise systems.
  • Support and troubleshoot legacy data platforms built on SSIS and SQL Server, ensuring high availability and performance of critical data processes.
  • Develop features with a focus on scalability, maintainability, testability, and long-term operability within a continuous delivery mindset.
  • Troubleshoot and resolve data integration and quality issues, ensuring reliable data delivery in production environments.
  • Participate in proof-of-concept projects, technical spikes, and design discussions, providing technical analysis and pragmatic recommendations.
  • Collaborate daily with product owners, QA, and engineers through Scrum ceremonies including standups, backlog refinement, sprint planning, sprint reviews, and retrospectives.
  • Apply XP engineering practices such as pairing, incremental delivery, continuous refactoring, and shared code ownership to maintain high-quality, evolvable systems.
  • Contribute to automated validation, CI/CD pipelines, and observability practices to support fast feedback and safe releases.
  • Clearly communicate technical ideas, risks, tradeoffs, and progress to both technical and non-technical stakeholders.
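
To make the pipeline work above concrete, here is a minimal PySpark sketch of incremental ingestion into the bronze layer of a medallion architecture on Databricks, using Auto Loader and Delta Lake. The storage paths, table name, and schema settings are hypothetical placeholders, not details of our environment.

```python
# Minimal sketch (hypothetical paths and table names): incremental ingestion
# into a bronze Delta table using Databricks Auto Loader (cloudFiles).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

SOURCE_PATH = "abfss://landing@examplelake.dfs.core.windows.net/trips/"          # hypothetical
SCHEMA_PATH = "abfss://meta@examplelake.dfs.core.windows.net/schemas/trips"      # hypothetical
CHECKPOINT  = "abfss://meta@examplelake.dfs.core.windows.net/checkpoints/trips"  # hypothetical

# Auto Loader picks up only files that arrived since the last run, which gives
# incremental ingestion without manual file tracking.
raw = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", SCHEMA_PATH)
    .load(SOURCE_PATH)
)

# Append the raw records plus ingestion metadata to the bronze table.
(
    raw.withColumn("_ingested_at", F.current_timestamp())
    .writeStream.format("delta")
    .option("checkpointLocation", CHECKPOINT)
    .outputMode("append")
    .trigger(availableNow=True)  # drain the backlog once, then stop (batch-style run)
    .toTable("bronze.trips")
)
```

Silver and gold tables would then be derived from this bronze table with SQL or PySpark transformations.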

Job Requirements

Required
  • 5+ years of experience designing and building data solutions.
  • Strong proficiency in SQL and Python for data analytics and transformation (a short illustrative sketch follows this list).
  • Experience with ETL pipeline development and automation.
  • Solid understanding of Data Lake architecture and design principles.
  • Experience working on Agile teams (Scrum, XP, or similar) with regular participation in standups, sprint planning, refinement, reviews, and retrospectives.
  • Comfort operating in highly collaborative environments with frequent verbal and written communication across engineering, product, and QA.
  • Ability to break work into small, iterative deliverables and adapt quickly based on feedback and changing priorities.
  • Strong ownership mindset, including accountability for quality, reliability, and maintainability of delivered solutions.
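
As a hedged illustration of the SQL and Python transformation work listed above, the sketch below incrementally upserts new bronze records into a curated silver Delta table with a MERGE statement issued from PySpark. The table and column names (trips, trip_id, _ingested_at) are hypothetical placeholders.

```python
# Minimal sketch (hypothetical tables/columns): incremental upsert from bronze
# to silver using a Delta Lake MERGE issued from PySpark.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Stage only the records that arrived since the last load of the silver table.
spark.sql("""
    CREATE OR REPLACE TEMP VIEW bronze_new AS
    SELECT trip_id, vehicle_id, pickup_ts, dropoff_ts, fare_amount, _ingested_at
    FROM bronze.trips
    WHERE _ingested_at > (SELECT COALESCE(MAX(_ingested_at), TIMESTAMP '1900-01-01')
                          FROM silver.trips)
""")

# Upsert: update trips that already exist, insert the ones that don't.
spark.sql("""
    MERGE INTO silver.trips AS t
    USING bronze_new AS s
      ON t.trip_id = s.trip_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```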
Preferred
  • Experience with Azure Cloud services and cloud-based ETL tools.
  • Familiarity with data visualization tools such as Power BI or Tableau.
  • Understanding of event-driven architectures, including queues, batch processing, and pub/sub models (see the sketch following this list).
  • Exposure to NoSQL databases like MongoDB or Cassandra.
  • Experience working on product engineering teams delivering customer-facing or operationally critical systems.
  • Familiarity with modern engineering practices inspired by XP, such as automated testing, pairing, refactoring, and continuous integration.
  • Experience operating systems in production environments with uptime, reliability, and observability expectations.
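
As a hedged sketch of the queue and pub/sub patterns mentioned above, the snippet below pulls messages from an Azure Service Bus queue with the azure-servicebus Python SDK (v7). The connection-string environment variable, queue name, and message fields are hypothetical placeholders.

```python
# Minimal sketch (hypothetical queue and message shape): consuming events from
# an Azure Service Bus queue with the azure-servicebus SDK.
import json
import os

from azure.servicebus import ServiceBusClient

CONN_STR = os.environ["SERVICEBUS_CONNECTION_STRING"]  # hypothetical env var
QUEUE_NAME = "trip-events"                             # hypothetical queue

with ServiceBusClient.from_connection_string(CONN_STR) as client:
    with client.get_queue_receiver(queue_name=QUEUE_NAME) as receiver:
        # Receive a small batch, process each message, then settle it so it is
        # not redelivered.
        for msg in receiver.receive_messages(max_message_count=20, max_wait_time=5):
            event = json.loads(str(msg))  # str(msg) yields the decoded body
            print(event.get("trip_id"), event.get("status"))
            receiver.complete_message(msg)
```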
Bonus Points For
  • Experience in Data Science or Machine Learning, particularly in model deployment or feature engineering.
  • Experience contributing to engineering standards, documentation, or continuous improvement initiatives within an Agile team.