Skills:
- dbt (regular)
- Data architecture (regular)
- Azure Data Lake (regular)
- Kubernetes (regular)
- Kafka (regular)
- Airflow (regular)
- GitLab (advanced)
- Snowflake (advanced)
- Python (advanced)
- SQL (advanced)

Aspire Global is a leader in the iGaming space, offering solutions spanning game aggregation, sportsbook and managed services, delivered through an industry-leading core platform.

The Data & Analytics team owns the group's core data stack, spanning Data Engineering, Business Intelligence and Analytics & Insights, and leads the group's data-driven modernisation both internally and for its partners.

The Data Engineer will play a vital role in a cross-functional squad, developing data pipelines and the data lake to ingest, structure and expose data from a central location for integration, reporting, analytics and robotic process automation.

The chosen candidate should be passionate about building scalable data models and architecture for use by every team, with the aim of making it easy for BI, Analytics, Product and other data consumers to build data-driven solutions, features and insights.

Responsibilities:
- Create data pipelines to ingest data from disparate sources, with attention to performance, reliability and monitoring.
- Serve data models as a product to the entire organisation through implementation and debugging.
- Carry out research and development, and work on PoCs to trial and adopt new technologies.
- Collaborate with other teams to address data sourcing and provisioning requirements.
- Coordinate with the Product & Technology teams to ensure all platforms collect and provide appropriate data.
- Liaise with the other Data & Analytics teams to ensure reporting and analytics needs can be met by the central data lake.
- Support the Data Quality and Security initiatives by building the necessary data access, integrity and accuracy controls into the architecture.

Requirements:
- 3+ years of experience in Data Engineering
- Degree in Computer Science, Software Development or Engineering
- Proficiency in Python; past exposure to Java will be considered an asset
- Understanding of RDBMS, columnar and NoSQL engines and their performance characteristics
- Experience with cloud architecture and tools: Microsoft Azure, AWS or GCP
- Strong background in stream data processing technologies such as NiFi, Kinesis or Kafka
- Experience with orchestration tools such as Apache Airflow and dbt
- Prior exposure to the Snowflake ecosystem will be considered an asset
- Familiarity with DevOps methodologies and concepts
- Understanding of distributed logging platforms, ideally the ELK stack

Skills Requirements:
- Fluency in spoken and written English is essential
- Passionate about data and on the lookout for opportunities to optimise
- Passionate about technology and eager to recommend new platforms
Data Engineer (relocation to Malta) in Constanţa