
Data Engineer

Location Alderley Edge, United Kingdom
Posted 01-June-2020
Description

This is a fantastic opportunity to come and work for a company voted 5th in the Great Place to Work (Best Large Workplaces) awards 2018, and Number 15 in Europe in 2017! We have over 450 stores and over 310 Groom Rooms, and we're the UK's number one pet care business. Our business is fast-paced, innovative and fun, and it's our people that make the difference.

Since the appointment of the Chief Data Officer onto the Executive team we have outlined a strategy to establish 'Pet Care Analytics for All'. Analytics will become the lifeblood of our Pet Care ambitions across the Group. We will establish an internal set of capabilities (people, process, technology and culture) that will allow the Group to:

- Get more impactful, relevant and timely insight out to colleagues and partners
- Increase the velocity and effectiveness of our VIP activity and customer experience journeys
- Leverage advanced analytics to optimise our supply chain, customer spend, workload, colleague rotas, pricing, next best action and vet practice profitability
- Introduce further artificial intelligence to improve pet welfare and diagnosis

To support this, we are looking for a Data Engineer to be based at our Support Office in Handforth (Cheshire).

We are at the early stages of our journey and, as a Data Engineer, you will have a lot of opportunity to influence the technical and strategic direction of the team. You will also have the opportunity to take ownership of parts of the existing platform and new greenfield development. This role sits within the newly formed Pet Care Analytics Group (PCAG), specifically the Data Engineering team reporting to our Lead Data Engineer.

Key Responsibilities:

- Develop, maintain and improve our cloud data platform, and help plan, design, monitor, communicate and execute data projects
- Assist the analytics teams in the implementation of their machine learning use-cases
- Evangelise about our data platform, products and Data Engineering capabilities with other departments, in order to bring more relevant data into our ecosystem and develop future data products that solve real business problems
- Maintain simple but useful technical documentation, as it is of key importance that our services and applications are easy to understand and use by the analytics community
- Deliver software that is scalable, highly available and fault-tolerant
- Drive automation, particularly in continuous integration pipelines, infrastructure management and configuration
- Ensure that data is of the highest quality and legality coming into and going out of the analytical platform
- Adopt and improve software development patterns and best practices, particularly around open-source components
- Apply continuous delivery and DevOps experience to drive the Data Engineering team in infrastructure automation, monitoring, logging, auditing and security implementation and practices
- Offer improvements to software development patterns and best practices for an analytical platform
- Conduct code reviews, pair programming and knowledge-sharing sessions

Skills / Competencies Required:

- Demonstrable experience of working with and designing a cloud-based analytical platform, including best practices around data ingestion at industrial scale (batch and streaming ETL/ELT) and turning data science/machine learning algorithms into production-grade products
- Strong software development skills (particularly in Python) - object-oriented and/or functional design, coding and testing patterns, the relevant DevOps principles, and the ability to document in a clean manner
- Solid knowledge of data modelling and structures, and experience with data lake and warehousing tools and techniques (BigQuery, Spanner, Snowflake, Redshift etc.)
- Hands-on experience with ingesting and processing streaming data - messaging queues (RabbitMQ, Kafka, Pub/Sub etc.) and data flow orchestration (Dataflow, Apache NiFi, Airflow, Luigi etc.)
- Strong understanding of the end-to-end deployment process of data products (from raw code to scalable deployment), the relevant CI/CD tools (Jenkins, Spinnaker, TeamCity) and containerisation (Docker, Kubernetes, Helm)
- Strive to create the simplest solution to any problem, using the right tool for the job
- Focus on high-quality, reliable and fault-tolerant software/systems
- Adapt to new technologies and technical challenges
- Support project delivery with pragmatic estimates and progress tracking
- Work collaboratively and support colleagues with areas of weakness
- Excellent verbal and written communication

The Benefits:

- Competitive salary plus bonus
- 36 days paid annual leave (including bank holidays), rising to 38 days after two years
- Birthday Leave - 1 day extra leave to celebrate your birthday!
- The option to buy extra holidays
- An extra day's holiday when you become a new pet parent (Dog/Cat/Horse)
- Pension Scheme
- Colleague Discount - 20% discount for you (plus one family member). This can be used in Pets at Home Stores, Groom Rooms, Companion Care Veterinary Services, and Pet Plan Insurance!
- 'Treats' benefits - an online range of offers and discounts which are available exclusively to Pets at Home Colleagues
- Life assurance
- Contributory Private Health Care
- Charity Leave - You are entitled to one paid day's leave each year to work for your favourite animal-related charity
- Celebrating Special Celebrations? You'll get 1 extra week off and a gift to celebrate your wedding or civil partnership, and a gift from us if you are expecting or adopting a baby!
