Location: Dublin 2
Other locations: Primary Location Only
Salary: Competitive
Date: Apr 15, 2026

Job description
Requisition ID: 1623026
At EY, we’re all in to shape your future with confidence.
We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go.
Join EY and help to build a better working world.
FS Technology Consulting – AI and Data – Data Engineer – Senior Consultant – Dublin
General Information
Location: Dublin
Available for VISA Sponsorship: Yes
Business Area: Data & Analytics
Contract Type: Full Time – Permanent
Key Responsibilities
- Design, implement, and manage robust and scalable data pipelines and solutions for core systems and applications.
- Collaborate with IT and business teams to understand business and technical requirements, and to identify, prioritise, and document data requirements for management and business insights.
- Design, build, and maintain data solutions that transform raw data into structured formats suitable for analytics.
- Develop and manage data models and ETL processes to ensure effective and efficient data ingestion and transformation.
- Conduct gap analysis of existing data architectures to identify impact areas and design solution options.
- Conduct thorough analysis of existing data processes and systems, identifying areas for improvement. Work closely with Data Stakeholders (e.g. Product and System Owners, Technical Leads, Business and Change Leads, Data and Solution Architects) to communicate and influence implementation of the identified technical enhancements.
- Lead performance optimisation by identifying data pipeline bottlenecks and areas for improvement. Optimise query performance in data storage solutions and implement best practices for data processes and transformations to improve the efficiency of ETL workflows.
- Implement security measures to protect sensitive data throughout the data lifecycle, including encryption, access controls, and data masking.
- Ensure compliance with data governance policies and regulatory requirements, such as GDPR, BCBS239, IFRS9, IRB or HIPAA.
- Monitor data quality and integrity throughout the transformation process, implementing controls and fixes as needed and maintaining compliance with the organisation’s Data Governance and Data Management policies and procedures.
- Design data architecture that can scale horizontally and vertically to accommodate growing data volumes and user demands.
- Support the design and implementation of integrated technology solutions with data expertise, ensuring alignment with business goals and compliance with internal and external policies, procedures, and regulations.
- Design and implement robust reporting frameworks that facilitate timely and accurate financial status reporting.
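The data-masking responsibility above can be illustrated with a minimal, platform-agnostic sketch. The field names, salt handling, and truncation length below are hypothetical; a real pipeline would source its PII classification and key management from the organisation's data governance tooling rather than hard-coded values.

```python
import hashlib

# Hypothetical PII fields; in practice these would come from a
# data-classification catalogue, not a hard-coded set.
PII_FIELDS = {"name", "email"}

def mask_record(record: dict, salt: str = "demo-salt") -> dict:
    """Return a copy of `record` with PII fields replaced by salted SHA-256 tokens."""
    masked = {}
    for key, value in record.items():
        if key in PII_FIELDS:
            digest = hashlib.sha256((salt + str(value)).encode("utf-8")).hexdigest()
            masked[key] = digest[:12]  # truncated digest as a pseudonymous token
        else:
            masked[key] = value  # non-PII fields pass through unchanged
    return masked

raw = {"customer_id": 42, "name": "Jane Doe", "email": "jane@example.com", "balance": 150.0}
masked = mask_record(raw)
```

Hashing with a salt (rather than dropping the field) preserves joinability across datasets while keeping the raw identifier out of downstream analytics layers.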
Skills and Attributes Required for this Role:
- Experience in one or more of the following Cloud Platforms:
  - Microsoft Azure:
    - Core Azure SQL and Microsoft SQL Server.
    - Deep understanding of Azure-native services for data engineering.
    - Experience with Azure Key Vault, Azure Monitor, and RBAC.
  - AWS:
    - Experience with S3, Glue, Redshift, and Lambda.
    - Familiarity with IAM, CloudWatch, and Athena.
  - Google Cloud Platform (GCP):
    - Experience with BigQuery, Cloud Storage, and Dataflow.
    - Familiarity with Cloud Functions, IAM, and Vertex AI (optional).
- Experience in one or more of the following Data Platforms:
  - Microsoft:
    - Experience with Azure Data Factory for orchestrating data workflows.
    - Proficiency in Azure Synapse Analytics and SQL Server.
    - Familiarity with Azure Data Lake Storage (Gen2) and Azure Blob Storage.
    - Knowledge of Power BI integration and data modelling.
    - Understanding of Azure Functions and Logic Apps for automation.
  - Snowflake:
    - Strong SQL skills and experience with Snowflake’s architecture (virtual warehouses, storage, cloud services).
    - Proficiency in Snowflake Streams & Tasks for CDC and automation.
    - Experience with Snowflake Secure Data Sharing and Snowflake Marketplace.
    - Familiarity with Snowpark for Python/Java-based transformations.
    - Understanding of role-based access control, data masking, and time travel features.
  - Databricks:
    - Hands-on experience with Apache Spark and Databricks Runtime.
    - Proficiency in Delta Lake for ACID-compliant data lakes.
    - Experience with Structured Streaming and Auto Loader.
    - Familiarity with MLflow, Feature Store, and Model Registry.
    - Use of Databricks notebooks for collaborative development in Python, SQL, or Scala.
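Snowflake Streams & Tasks and Databricks Auto Loader, both named above, implement incremental (CDC-style) processing: each run picks up only the changes since the previous run. A platform-agnostic sketch of the underlying idea in plain Python, with illustrative data and a hypothetical high-water-mark offset, might look like:

```python
# Minimal CDC-style sketch: process only rows added since the last run,
# tracked by a high-water mark on a monotonically increasing id.
# Snowflake Streams and Databricks Auto Loader handle this bookkeeping natively;
# the table contents and offset here are illustrative only.
source_table = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": 25.5},
    {"id": 3, "amount": 7.25},
]

def incremental_load(source, high_water_mark: int):
    """Return rows newer than the stored high-water mark, plus the updated mark."""
    new_rows = [row for row in source if row["id"] > high_water_mark]
    new_mark = max((row["id"] for row in new_rows), default=high_water_mark)
    return new_rows, new_mark

# First run after mark 1 picks up rows 2 and 3; a rerun picks up nothing.
batch, mark = incremental_load(source_table, high_water_mark=1)
```

Persisting the mark between runs is what makes the load idempotent: rerunning against unchanged source data yields an empty batch rather than duplicated rows.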
What We Look For
- Someone who is passionate about reaching their full potential and excelling in their career
- Someone with energy, enthusiasm and courage who enjoys solving complex problems and variety in their day-to-day working life
- Someone who enjoys working as part of a community which values integrity, respect, teaming, and inclusiveness
If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible.
Join us in building a better working world. That’s Why, EY.
Apply now.
IMPORTANT: Where Agency assistance is required, our Talent Team will engage directly with suppliers. CVs / Profiles should not be shared directly with Hiring Managers. Unsolicited CVs / Profiles supplied to EY by Recruitment Agencies will not be accepted for this role.
EY | Building a better working world
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.
Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow.
EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.