
Data Engineer - Analytical Infrastructure

Location: Gothenburg, Sweden
Salary: 500 kr - 600 kr per hour
Sector: Consultancy
Type: Freelance
Reference #: CR/058691_1549904639

Data Engineer - Analytical Infrastructure / 10 months / Gothenburg / Start ASAP

Job description:

We are looking for a Data Engineer to work with data modelling and build analytics platforms for large enterprises. You will collect, store, process, and analyse very large data sets. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them.

Tasks in the assignment:

* Design, implement, and support analytical infrastructure
* Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL, scripting, and big data technologies
* Explore and learn the latest technologies to provide new capabilities and increase efficiency
* Improve data quality through testing, tooling, and continuous performance evaluation
* Collaborate with other software engineers, data analysts, data scientists, and decision-makers, such as product owners, to build solutions and gain novel insights
* Act as the bridge between our backend team and analysts, working on data cataloguing/management and building/maintaining crucial data pipelines
* Help continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers
* Participate in developing strategy recommendations

Required skills:

* 3+ years of industry experience in software development, data engineering, business intelligence, data science, or a related field, with a track record of manipulating, processing, and extracting value from large datasets
* Demonstrated strength in data modelling, ETL/ELT development, and data warehousing
* Experience building data pipelines and analytics systems on distributed data systems
* Experience in Scala or Python scripting or equivalent
* Experience with cloud services (Azure, AWS) - experience with the Azure ecosystem is a plus
* Experience using big data technologies (Hadoop, Hive, HBase, Spark, etc.) is a plus
* Knowledge of data management fundamentals and data storage principles
* Knowledge of distributed systems as they pertain to data storage and computing