Data Engineer
Location: Anywhere
Direct Job Listing
We are Addepto, a company with a startup atmosphere! We believe that the only constant in life is change, so we keep developing and improving to become better at what we do every day. We think outside the box to create and deliver the best solutions in Big Data, AI, and Business Intelligence.
For our team, based in Warsaw and working remotely, we are looking for a Data Engineer who will focus mainly on designing and building data processing architectures.
We are open to candidates at different experience levels (Mid/Senior/Lead) who want to further develop their skills and experience in this role.
Some of our recent Big Data projects:
- Data lakes that store terabytes of data and run machine learning tasks for a large telecom company
- Streaming applications that serve data analytics in real time for manufacturing companies
- Systems that support decision-making and help analyze data in a unified format for controlling and operations departments
- Real-time machine learning predictions on massive datasets that prevent losses for pharmaceutical companies
- And more!
What we offer:
- Work in a well-coordinated team of passionate enthusiasts of Big Data & Artificial Intelligence
- Fast career path and the opportunity to develop your qualifications through sponsored training, conferences, and many other development opportunities in various areas
- Challenging international projects for global clients and innovative start-ups
- Friendly atmosphere, outstanding people and great culture – autonomy and supportive work environment are crucial for us
- Flexible working hours – you can adjust your schedule to better fit your daily routine
- Work-life balance – we respect your private life so you don’t have to work overtime or on weekends
- Possibility of both remote and office-based work – modern office space available in Warsaw, Cracow, Wroclaw, Bialystok or coworking space in any place in Poland if needed
- Your choice of employment form – we offer B2B, an employment contract, or a contract of mandate
- Paid vacation – 20 fully paid days off if you choose B2B or contract of mandate
- Other benefits – e.g. great team-building events, language classes, training & workshops, knowledge-sharing sessions, and a medical & sports package
Responsibilities:
- Design and construction of scalable data processing architecture
- Using Big Data and BI technologies (e.g. Spark, Kafka, Hadoop, SQL)
- Building applications that aggregate, process, and analyze data from various sources
- Cooperation with the Data Science department in the field of Machine Learning projects (including text/image analysis, building predictive models)
- Managing distributed database systems such as ClickHouse, BigQuery, Teradata, Oracle Exadata, and PostgreSQL + Citus
- Data modeling with Star and Snowflake schemas
- Developing and organizing data transformations in DBT and Apache Airflow
- Gathering requirements from the business and translating them into technical code
- Ensuring the best possible performance and quality of the delivered packages
- Managing business users' expectations
Requirements:
- Higher education in a technical or mathematical field (or being in the last year of studies)
- Commercial experience in the implementation, development, or maintenance of Business Intelligence or Big Data systems
- Knowledge of Python, Java, or Scala
- Experience in SQL
- Good command of the English language (min. B2)
- Independence and responsibility for delivering a solution
- Excellent knowledge of dimensional data modeling
- Good communication and soft skills
- Ability to lead discussions and requirements sessions, and to comprehend, summarize, and finalize requirements
- Knowledge of Spark, NiFi, Docker, AWS or Azure, Splunk
Added on: 10-01-2023