Location: Ireland or Remote (if remote, you must be willing to work in the CET/GMT timezone)
Gambling.com Group is a multi-award-winning provider of digital marketing services for the global online gambling industry. The Group publishes free-to-use information portals that offer comparisons and reviews of regulated online gambling websites around the world. Gambling.com Group is looking for talented individuals to join its diverse and growing team in Europe.
The Group was founded 15 years ago and has positioned itself as one of Europe’s primary affiliate marketing companies, leading the way in responsible player acquisition across the regulated, global online gambling industry. Through a portfolio of brands including Gambling.com, Bookmakers.co.uk, Casinosource.co.uk and many more, Gambling.com Group helps online gamblers start their consumer journey with confidence by delivering best-in-class content, including expert analysis, reviews, news, tips, odds comparisons and more.
Gambling.com Group has offices in Dublin (E.U. headquarters) and Malta, as well as a new and growing U.S. headquarters in Charlotte, North Carolina. As online sports betting in the U.S. continues to roll out at an exponential pace, Europe remains a key market of focus for the Group, which is experiencing a period of exciting growth on both sides of the Atlantic.
We have an opening for a BI Data Engineer with a focus on providing accurate and consistent data for analysis. This person will have the opportunity to work with bespoke data-capture tools, collecting and processing that data into a state-of-the-art data warehouse, primarily using Python and AWS technologies.
The successful candidate will be expected to perform, and have a positive impact on, a broad range of tasks, including but not limited to:
- Design, develop, maintain, and document Python-based web scraping, data processing, and ETL jobs
- Ensure data collection is complete and accurate at all times
- Ensure failures are minimised and corrected efficiently and in a timely manner
- Ensure data pipelines are optimised to minimise costs, whilst remaining efficient enough to meet the reporting needs of the business
- Ensure that requirements are captured successfully
- Utilise leading-edge technologies to ensure pipelines use the best and most appropriate tools to achieve company goals
- Communicate results of technical projects to both technical and non-technical users
- Work with a globally distributed team and be comfortable with Slack/Zoom for meetings and communication, and with programming remotely
Requirements
- A Bachelor’s Degree in Computer Science or a related field, or equivalent work experience
- 3+ years of commercial development experience using Python
- Experience building and maintaining data processing pipelines, ETL/ELT processes, and web scraping processes using Python
- Experience of working effectively in a distributed team environment
- Experience working with a major relational database such as Snowflake, Oracle, Teradata, SQL Server, MySQL, etc.
- Demonstrable experience in data exploration and data visualisation
- Clear ability to work in teams, contribute to technical decisions and learn/suggest new technologies
- Ability to be a creative problem solver with a solution-focused attitude
- Familiarity with data warehousing methodologies
- Comfortable with GitHub and the gitflow model
- Experience with AWS
- Possess a strong sense of ownership, urgency, and drive
- Document all code and features for maintainability
- Excellent written and oral communication skills in English
- Experience with cloud-based databases is advantageous but not essential
Perks & Benefits
- Comprehensive private Healthcare Insurance
- Flexible work environment and home office available
- Home office allowance
- Gym & Leisure Allowance
- All the hardware and software you need to be successful
- Regular company events and social outings, activities, Spot Awards, and a Monthly Social Club
- Access to courses for Personal and Career Development
- Company Paid Volunteer Day