City, Leeds
The Bridge IT Recruitment
Knowledge and Skills:
Knowledge
– Broad data management technical knowledge, enabling work across the full data lifecycle.
– Proven experience working with AWS data technologies (S3, Redshift, Glue, Lambda, Lake Formation, CloudFormation), GitHub and CI/CD.
– Coding experience in Apache Spark, Apache Iceberg or Python (pandas).
– Experience in change and release management.
– Experience in data warehouse design and data modelling.
– Experience managing Data Migration projects.
– Cloud data platform development and deployment.
– Experience of performance tuning in a variety of database settings.
– Experience of infrastructure-as-code practices.
– Proven ability to organise and produce work within deadlines.
Skills
– Good project and people management skills.
– Excellent data development skills.
– Excellent data manipulation and analysis skills using a variety of tools, including SQL, Python, AWS services and the MSBI stack.
– Ability to prioritise, with the flexibility to change those priorities at short notice.
– Commercial acumen.
– Able to demonstrate a practical approach to problem solving.
– Able to provide appropriate and understandable data to a wide-ranging audience.
– Well-developed and professional communication skills.
– Strong analytical skills – ability to create models and analyse data in order to solve complex problems or reinforce commercial decisions.
– Able to understand business processes and how these are achieved and influenced by technology.
– Must be able to work as part of a collaborative team to solve problems and assist other colleagues.
– Ability to learn new technologies, programs and procedures.
Technical Essentials:
– Expertise across data warehouse and ETL/ELT development, preferably in AWS, with experience in the following:
– Strong experience with several AWS services, such as Redshift, Lambda, S3, Step Functions, Batch, CloudFormation, Lake Formation, CodeBuild, CI/CD, GitHub, IAM, SQS, SNS and Aurora.
– Good experience with dbt, Apache Iceberg, Docker and the Microsoft BI stack (nice to have).
– Experience in data warehouse design (Kimball, lakehouse, medallion and data vault) is a definite preference, as are strong SQL skills and knowledge of other data tools and programming languages such as Python and Spark.
– Experience in building data lakes and CI/CD data pipelines.
– Candidates are expected to demonstrate experience across the delivery lifecycle and to understand both Agile and Waterfall methods, and when to apply each.
Experience:
This position requires several years of practical experience in a similar environment. We require a good balance of technical and softer personal skills so that successful candidates can be fully effective immediately.
– Proven experience in developing, delivering and maintaining tactical and enterprise data management solutions.
– Proven experience in delivering data solutions using cloud platform tools.
– Proven experience in assessing the impact of proposed changes on production solutions.
– Proven experience in managing and developing a team of technical experts to deliver business outcomes and meet performance criteria.
– Exposure to energy markets and the energy supply industry sector.
– Experience in developing and implementing operational processes and procedures.
