Senior Data Engineer
Description
Our NSW Government client is hiring a highly capable Senior Data Engineer to join the organisation. Working in a highly agile environment, the team is building a wider digital platform that will impact not just the department but a broad range of service providers and people across NSW.
The Senior Data Engineer plays a critical role in designing and implementing the management, monitoring, security, and privacy of data using the full stack of data services available to the organisation. This includes the ingestion of data from multiple external source systems across the sector. In conjunction with the enterprise data platform team, the Senior Data Engineer will be responsible for designing, developing and overseeing the Azure environment and the data collection and analytics solution.
Role Title: Senior Data Engineer
Contract Length: Until 31/12/2025
Daily Rate: $900 - $1,000 per day + Super.
Location: Parramatta (2150). Hybrid working - three days per week in the office / WFH.
Role Requirements:
- Understand the business context and requirements.
- Collaborate with other team members and enterprise functions to achieve business objectives.
- Conduct feasibility analysis by verifying requirements, ingesting and profiling data.
- Define technical design, data models, loading approaches and source to target mapping documents.
- Design and establish a comprehensive solution for master reference data management and administration.
- Develop Extract, Load and Transform (ELT) pipelines for Azure Synapse Analytics (using Azure Data Factory/Synapse Pipelines, Spark Notebooks and Python).
- Assess the provision of data to downstream solution components, including designing and implementing interfaces across varied environments.
- Develop, manage and deploy SQL and PySpark source code using Azure DevOps.
- Develop and conduct unit testing.
- Conduct initial root cause analysis and defect triage.
- Remediate defects.
- Identify data patterns and gaps.
- Conduct and contribute to data validation testing.
Knowledge and Experience Required:
- 5 years' experience working in data warehouse, big data or data lake environments.
- 5 years' experience working with Microsoft SQL databases and Azure Synapse Analytics.
- Working experience in Azure Data Factory/Synapse Pipelines, Spark Notebooks, Azure Functions and ADLS Gen2 Storage.
- Comprehensive SQL knowledge and experience.
- Working knowledge of Python.
- Experience working with large, complex data warehouses using the Microsoft BI stack within Azure or another cloud-based environment.
- Working knowledge and experience in Agile (Scrum) processes and practices.
- Familiarity with Microsoft Fabric and Power BI.
- Familiarity with Azure Cosmos DB.
Candidates must be based in or around the Greater Sydney Area (NSW) and must hold existing full working rights (Australian Permanent Resident or Citizen).
If you feel this opportunity matches your skills and experience, please apply with your CV.