Data Engineer

Posted 23 September 2021
Salary £500 - £525 per day + Inside IR35
Location England
Job type Contract
Reference BBBH123340_1632482316
Contact Name Lauren Pocknell

Job description

A large Government client is seeking a Data Engineer on an initial 6-month contract to work with them on their technology transformation. This role falls inside the IR35 regulations, and the successful candidate must have the skills and experience set out below.


The post holder will support the development, delivery and management of common, consistent and high-quality data and information services across the client: implementing and maintaining data flows to connect operational data systems, supporting the development of data streaming solutions, writing ETL scripts and code to process data in Azure services, and building data models to support user needs. The individual will also support the maintenance of the client's mission-critical heritage data systems and platform.
The individual will be required to work regularly and closely with teams across the agency to understand and interpret their needs, and with the rest of the Digital Team to turn those requirements into practical solutions for improving and managing data.


Key relationships
The post holder is expected to develop and maintain good working relationships with:

  • The wider Digital team, supporting the delivery of new technical services and platforms, maintaining existing data connections within applications, and ensuring that appropriate data standards are met.
  • Business teams and data practitioners across the client, identifying data services and tools which would improve their activities, and supporting them to make the best use of data.
  • System owners across the business, such as Legal and Finance, ensuring the data connects with other data services within the agency.

Key accountabilities and responsibilities
1. Collect, manage, clean and organise data ensuring changes to data systems are effectively implemented.
2. Work with Data and Digital colleagues to ensure that data models and structures used in internal and external applications and services are fit for purpose and follow agreed data standards.
3. Maintain, expand and continually review the core data assets and environment, manage new and existing data updates, while continuing to improve data workflows, automation and cross-system connectivity.
4. Implement agreed standards (e.g. information security, data quality) to ensure data is secure and controlled, adopting appropriate change management processes for data and systems changes and management of associated risk.
5. Support the maintenance and development of a range of core data services, automating data pipelines and ensuring that key data is consistent and accurate across the whole data ecosystem.
6. Be a technical point of contact for specific legacy data systems, providing support, guidance and maintenance, and supporting the decommissioning of services.
7. Work closely with stakeholders to define data requirements, working with the architecture community to design and create prototypes.
8. Actively participate in a data practitioner community of interest, sharing knowledge and industry best practice with others.
9. Actively contribute to work across the Digital team to develop common and scalable data services.
10. Adhere to development standards and principles.


Key skills and knowledge
Experience

  • Experience of working with Azure cloud technologies such as Azure Data Factory, Azure Databricks, Azure Data Lake, Azure VMs, Azure App Services, Azure Functions, Azure Log Analytics, Azure Monitor, etc.
  • Experience of coding in Python, R, Spark, .NET, PowerShell or similar.
  • Experience of working with APIs, SFTP and Gateway, and with JSON and Parquet file formats.
  • Experience of working with Power Platform services such as Power Apps, Power Automate and Power BI.
  • Proven experience of working with both structured and schema-less databases/storage such as Delta Lake, Cosmos DB, MongoDB, Azure SQL, Oracle and SQL Server.
  • Experience of working with DevOps principles, Azure DevOps (ADO) Git repos, IaC release processes, automated CI/CD pipelines, etc.
  • Proven experience of working with large and complex structured and unstructured data.
  • Experience of coding and automating data pipelines and building data validation processes.
  • Experience developing and maintaining effective working relationships with technical and non-specialist colleagues.
  • Evidence of adaptability; able to learn new technologies and methodologies quickly.
  • Demonstrable experience in a role that includes problem solving, analytical and logical skills.

Skills

  • Efficiently collecting, manipulating and validating large and complex datasets.
  • Designing and building data models and services.
  • Comfortable giving definition to ambiguous problems, and able to do so independently with limited guidance.
  • Solid understanding of software design principles and Agile best practice.
  • Verbal and written communication and data presentation skills, including experience of communicating effectively with both business and technical teams.
  • Strong analytical skills with the ability to collect, organise, analyse, and disseminate significant amounts of information with attention to detail and accuracy.

Qualifications:

  • Degree level qualification and/or appropriate professional qualifications/membership or equivalent experience (5+ years of delivering in an enterprise environment).

Please send your CV to Lauren.Pocknell@Investigo.co.uk