At GMS, we've been helping Canadians get the health and travel insurance they want and need for more than 75 years. The same pioneering spirit that started our story is what drives us to do things differently today. Insurance, honestly, is our promise, and it's what we do at GMS. We care about our customers, our community and each other. As a non-profit organization, we're proud to reinvest our profits into the health of the communities that we serve and that have supported us since 1949.
We want our employees to feel good about coming to work and being in a workplace that promotes flexibility, growth and a healthy work-life balance. If you'd like to be part of a team that truly takes care of our customers, our communities, and each other, this could be your chance.
Here's the role
The Data Engineer is responsible for building and maintaining data integration services in GMS's data platform. This role is also heavily involved in developing the database architectures and data models used in GMS's information management services, and applies database best practices to automate data flows and support the design of a scalable, reliable data management platform. To be successful, the Data Engineer will research and document the data requirements that inform data collection processes from sources such as Azure Storage services and SQL databases. The Data Engineer plays a key role in developing and implementing data administration and governance policies and standards, as well as the data modeling essentials for GMS's Extract-Transform-Load (ETL) process. This role is also responsible for writing SQL scripts, building stored procedures and triggers, and operating database management systems.
Position Responsibilities
• Research and document data requirements and data collection processes.
• Collect and document GMS's user requirements for data storage services.
• Develop and implement data administration policies, standards and data models to be applied to the ETL process.
• Apply best practices to data cleansing and validation steps in data pipelines.
• Collect data from different sources such as GMS SQL databases.
• Conduct research and provide advice to other information systems professionals and business SMEs regarding the collection, availability, security and suitability of GMS data.
• Support the design and implementation of data integration services between GMS's core Cloud infrastructure and third-party vendors.
• Write scripts for stored procedures and triggers in GMS's Cloud SQL Database.
• Develop database architecture for information systems projects.
• Modify GMS's data models and database management systems on Azure to improve efficiency.
• Identify and implement internal process improvements, such as automating manual data processes and optimizing data delivery.
• Export data from database management systems to perform data mining analysis.
• Collaborate with the Data team to build strategies around data management and lifecycle.
• Build and test data pipelines and ETL processes from a wide variety of data sources, including structured, semi-structured and unstructured data, using Azure data tools such as Azure Data Factory, Azure SQL, Microsoft Fabric and more.
• Maintain GMS's data pipelines and expand them to meet new requirements as the business grows.
• Research and explore ways to enhance data quality, reliability and efficiency.
• Identify and evaluate methods and technology solutions to improve data flows.
• Work with business stakeholders to resolve data-related technical issues and support their data infrastructure needs.
Competencies
• Impact and Influence: Uses active listening skills to understand others' perspectives and opinions; respectfully considers differing perspectives; shares opinions and thoughts openly, whether or not others agree.
• Quality Orientation: Completes tasks with a high level of accuracy; routinely checks inputs, outputs, tasks and processes to ensure they are error free.
• Critical Thinking: Uses data and experience to understand and develop solutions to a range of business problems; validates inputs, assumptions and outputs for reasonableness; sees connections between actions and their consequences.
• Problem Solving: Actively explores and analyzes options and solutions to make effective customer and business decisions; considers causal relationships and the impacts of risk decisions within their area of knowledge and expertise.
Education & Experience
• Post-secondary education in Computer Science, Software Engineering, Statistics, or an equivalent combination of education and experience.
• 3+ years' experience developing and maintaining ETL processes and data pipelines.
• Highly proficient in SQL, Azure Data Factory and database management systems.
• Proficient in working with structured, semi-structured and unstructured data.
• Advanced experience working with relational databases, as well as familiarity with non-relational databases.
• A successful history of extracting, manipulating, processing and analyzing data from multiple disconnected datasets.
• Experience performing root cause analysis on data processes and ETLs to answer specific business questions, troubleshoot issues and identify opportunities for improvement.
• Experience with Azure data storage and processing services.
Are we a fit?
If you think so, please apply by November 20, 2024. We'd love to reach out to everyone who applies, but we just don't have enough hands! If you're selected for an interview, we'll be in touch. If not, please consider us again in the future.