SQL (regular), PySpark (regular), Spark (regular), AWS (regular), Azure (regular), Python (advanced)

We are a team of nerds dedicated to collecting other nerds who happen to be the best in their respective fields across the broadly understood IT services and analytics space. Together with them we go out of our way to create IT and Analytical Hubs that bring together the best experts from the IT world to help our partners become truly data-driven organizations.

Currently we are looking for a Data Engineer, as we are supporting our partner in developing a Global Analytics unit: a global, centralized team with the ambition to strengthen data-driven decision-making and to develop smart data products for day-to-day operations. The team is meant to be a nucleus radiating a data-driven, entrepreneurial culture by acting as an incubator that realizes ideas by simply doing: creating smart data products in all areas of the business, be it sales, marketing, purchasing, logistics or any other part of the company. The team has a lot of freedom to shape this, especially in the choice of tools and technology, but also by introducing new concepts, solutions and ways of working.

If you want to:
- take part in the development and implementation of a complex system of smart data solutions
- have the opportunity to work on bleeding-edge projects
- have a chance to see your visions come true
- be a member of an international and diverse team of ground-breaking data scientists, data engineers, ML engineers, software developers, BI developers and business analysts
- carry out projects which address real business challenges
- have a real impact on the projects you work on and the environment you work in
- have a chance to propose innovative solutions and initiatives
- have the opportunity and tools to grow, develop and drive your career forward,

it's probably a good match.
Moreover, if you like:
- flexible working hours
- a casual working environment with no corporate bureaucracy
- access to benefits such as Multisport and private medical care
- working in a modern office in the centre of Warsaw with good transport links, or working remotely as much as you want
- a relaxed atmosphere at work where your passions and commitment are appreciated
- vast opportunities for self-development (e.g. online courses and a library, experience exchange with colleagues around the world, partial funding of certification),

it's certainly a good match!

If you join us, your responsibilities will include:
- structuring whole processes of data extraction, transformation and storage, using serverless cloud services to deploy models and analytical solutions
- handling activities such as quality assurance, data migration and integration, and solution deployment to ensure the business gets the best value
- building, developing and maintaining data pipelines
- writing and maintaining ETL processes in Python, designing database systems and developing tools for real-time and offline analytic processing
- implementing, maintaining and further developing the functionality of the Python packages for ETL processes, data lineage and operator inputs, including building logic
- troubleshooting software and processes for data consistency and integrity
- integrating large-scale data from a variety of sources so that business partners can generate insights and make decisions
- designing and implementing data flows
- writing unit and integration tests for Python modules
- participating in mission-critical processes of the data pipeline
- providing technical support in understanding business problems and designing smart data products

We expect:
- significant commercial experience in a similar position
- strong data analytics skills using Python
- working knowledge of Azure services (the standard stack plus Terraform) and AWS (Lambda, Redshift, Glue, Kinesis, S3, Aurora)
- excellent software engineering skills (including unit testing, integration testing, OOP)
- proficiency in Python
- experience working with large datasets through Spark and RDBMS
- solid knowledge of PySpark, with the ability to write Spark applications and analyze data in a distributed environment
- very good SQL skills
- fluency in extracting information from databases
- the ability to write clean, efficient, documented and scalable code
- experience working in organizations with an agile culture
- being a team player
- fluent English, as you will communicate in English almost all the time

If interested, please let us get to know you by sending your CV using the "Apply" button. Please add the following clause to your CV: "I hereby agree to the processing of my personal data included in my job offer by hubQuest spółka z ograniczoną odpowiedzialnością located in Warsaw for the purpose of the current recruitment process." If you want to be considered in future recruitment processes, please also add the following statement: "I also agree to the processing of my personal data for the purpose of future recruitment processes."
Cloud Data Engineer in Constanţa