Careers

Principal Data Engineer

Role Overview:

 As a Principal Data Engineer at Seriös Group, you will oversee the development of our data products, ensuring highly scalable, high-performance, and cost-effective data processing and analytics solutions. 

 In this role, you will provide strategic leadership and technical expertise in the development of cloud data platforms, IoT analytics, and data integration and migration projects. You will set the standards for Data Engineering best practices across the organisation, leading the creation of robust coding standards and development practices, while being accountable for their adherence. 

You will champion continuous integration, deployment, automated testing, and monitoring practices, ensuring that our data products are resilient with minimal downtime.  

 Additionally, you will play a key role in evaluating new and emerging technologies, helping to shape the organisation's approach to data engineering. 

As part of our Technical Design Authority, you will work across teams to provide technical guidance and assurance, ensuring best practices are followed.  

You will lead a team of Senior Data Engineers, Data Engineers, and Graduate Data Analysts, with a strong focus on skills development in data engineering. 

You will work in a technology-agnostic manner, leveraging market-leading technologies to shape the best-fit solutions for our clients.  

Your role will involve close collaboration with clients, so excellent client-facing skills are essential. 

 Lastly, you will have a natural passion for technology and data engineering, staying up-to-date with the latest trends and inspiring your team to continuously improve and master the craft of building efficient and effective data transformation pipelines. 

Duties: 

  • Shape the organisation's approach to developing data solutions using cloud technologies such as AWS or Azure. 

  • Serve as a technical leader, providing guidance to all Data Engineering staff, promoting best practices, and fostering a culture of collaboration and innovation. 

  • Set coding standards for Python/PySpark development across all data projects, ensuring consistency and quality. 

  • Act as a quality gate, ensuring that all technical deliveries meet the required standards for data quality, design, reliability, scalability, and cost-effectiveness. 

  • Lead the promotion of Agile delivery methods, encouraging efficiency and adaptability. 

  • Manage mid-level to senior team members, providing both line management and coaching/mentoring support. 

  • Stay up to date with the latest technologies, methodologies, and best practices in cloud and data, in line with our technology-agnostic approach. 

  • Obtain and maintain relevant certifications to support your expertise and credibility. 

  • Build and maintain strong client relationships, while coaching junior team members in soft skills. 

  • Proactively promote best practices and inspire others to continually improve. 

  • Collaborate with the learning and development function to create training plans for junior staff. 


The above list is non-exhaustive; you may also be required to carry out ancillary duties in relation to your role. 

Person Specification:

  • Ability to analyse complex business problems and develop creative technical solutions. 

  • Ability to lead technical teams and provide guidance to junior team members. 

  • Ability to professionally present and communicate technical solutions and concepts, including to non-technical stakeholders. 

  • Self-motivation and a proactive approach to work. 

  • Exceptional verbal and written communication skills, with the ability to communicate effectively, efficiently and appropriately with others at all levels of the team. 

  • Ability to develop strong client relationships through gaining trust as a respected technical authority. 

  • Ability to understand and address the needs of multiple clients. 

  • Ability to adapt to changing requirements and business needs. 

  • Ability to prioritise workload and work to deadlines. 

  • Ability to consider how solutions fit within the company’s wider portfolio of work. 

Required:

  • Extensive, technology-agnostic ETL and data pipeline design experience. 

  • Experience implementing cloud data platforms in Azure or AWS, or significant on-premises design experience. 

  • 5+ years' prior experience in data engineering or similar roles. 

  • Experience creating and ensuring adherence to coding standards for Python and PySpark development. 

  • Experience developing CI/CD pipelines and deep knowledge of DevOps and source control best practices. 

  • Strong knowledge of DataOps practices and awareness of MLOps practices. 

  • Database administration experience, including query optimisation and indexing strategies. 

  • Knowledge of NoSQL databases (e.g. DynamoDB, CosmosDB, MongoDB). 

  • Experience implementing solutions using at least one of the following technologies: Azure Data Factory, Azure Event Hubs, Azure Data Lake Storage, Azure Function Apps, Azure Synapse Analytics, AWS Glue, AWS S3, AWS Lambda, AWS Redshift, Databricks, Snowflake, Google BigQuery, Alteryx, SSIS, Informatica. 

  • Understanding of data warehouse and data lake principles. 

  • Demonstrable data modelling capabilities using Kimball, Inmon, or Data Vault methodologies. 

  • Experience working with relational database platforms. 

  • Expert SQL skills. 

  • Experience processing and managing true “Big Data” scale datasets. 

  • Awareness of AI/ML methodologies. 

  • Experience with data visualisation tools such as Power BI or Tableau. 

  • Experience of leading or mentoring a team of junior data engineers or BI developers. 

  • Experience as technical lead on project delivery using Agile and/or DevOps methodologies, including peer reviews and continuous improvement to optimise new and existing data solutions. 

  • Experience using backlog management tools such as Jira or Azure DevOps. 

  • Experience implementing source control using Git and Git-based cloud repositories such as Azure DevOps Repos and GitHub. 

Desirable:

  • Experience working with IoT sensor technologies would be highly advantageous. 

  • Experience working in an agile consulting team. 

  • Experience writing Infrastructure as Code using ARM Templates, PowerShell, CloudFormation, or Terraform. 

This role profile is not exhaustive; it will be subject to periodic review and may be amended to meet the changing needs of the business. The post holder will be expected to participate in this process, and we would aim to reach agreement on any changes. 


Apply now
