Description: Marsh McLennan Innovation Centre welcomes candidates for a hands-on Principal Data Engineer role based in our Dublin office. Be the technical expert for the global team, helping us establish the architectural and engineering standards, best practices, and processes that will guide the scaled development of a new technology platform.

What can you expect:
- To help shape this critical technology initiative, working with senior engineers around the world to deliver a large-scale platform that will transform our business.
- To join an engineering team that is helping build the next-generation analytics suite of digital solutions, delivering our industry-leading data and actionable insights through an intuitive, interactive, and engaging platform.
- The chance to help deliver a completely new offering based on a microservices framework of reusable components, enabling the rapid and low-cost delivery of insurance solutions on a global scale.

What you need to have:
- Significant experience as a Data Engineer on successful enterprise-scale applications or projects implementing Big Data Lake architecture using AWS services/data pipelines.
- Hands-on experience with AWS (Lambda, Glue), Databricks, Spark, and data pipelines.
- Experience in data architecture, data warehouse concepts, and dimensional modelling.
- Ability to develop, debug, and analyse SQL queries/Python scripts for Data Warehouse/Big Data Lake (BDL)/BI reports, and to run walkthrough sessions for stakeholders to review them.
- Experience of working on at least one Data Warehouse project.
- Software development lifecycle experience.
- Strong technical skills in databases and scripting (MongoDB and Postgres).

What makes you stand out:
- Your technical expertise, paired with a personal curiosity for technology, trends, and standards – someone that other engineers want to work with.
- Great communication and interpersonal skills; comfortable working in a distributed environment, with an ability to form great working relationships across teams, locations, and time zones.

As an application engineer, you will be part of a highly technical team tasked with delivering various features on an R&D automation platform using the latest tech stack.

We will rely on you to:
- Provide hands-on technical expertise to the distributed technology teams building and maintaining data pipelines and APIs using AWS Lambda, AWS Glue, Databricks, and Apigee.
- Design highly performant platforms, core software components, and enterprise-wide applications.
- Collaborate closely with cross-functional teams to define data requirements, design scalable data solutions, and contribute to the company's overall data strategy.
- Define data models, structures, and standards to ensure consistency and accuracy of data across the organization.
- Collaborate with business stakeholders, analysts, and technical teams to understand data requirements and objectives.
- Be proficient in Python and cloud-native architecture patterns in order to support junior team members.
- Continuously learn about new technologies and evolving best practices.

What you will be rewarded with: Our employees enjoy excellent benefits such as:
- A hybrid working model that nurtures a collaborative working environment: three days per week in the office, with two days per week spent working remotely.
- Discretionary annual bonus
- Up to 12% pension contribution
- Family healthcare cover
- Study support
- Flexible working arrangements
- Sports & social clubs / wellness programme
- Optional travel ticket, bike-to-work, and dental schemes
- Subsidised canteen
- 35-hour working week