- Date Posted
- May. 27, 2021
- Data Science
Farfetch exists for the love of fashion. We believe in empowering individuality. Our mission is to be the global platform for luxury fashion, connecting creators, curators and consumers.
We’re a diverse and global community made up of Farfetchers, our partners and our customers, which we believe is at the heart of our success. Everything we do is centred around our values - Be Human, Think Global, Be Revolutionary, Todos Juntos, Be Brilliant, Amaze Customers - which define our beliefs and our actions. We welcome difference and foster a consciously inclusive environment for everyone. We are Farfetch For All.
We’re on a mission to build the technology that powers the global platform for luxury fashion. We operate a modular end-to-end technology platform purpose-built to connect the luxury fashion ecosystem worldwide, addressing complex challenges and enjoying it. We’re empowered to break traditions and disrupt, with the freedom and autonomy to make a real impact for our customers all over the world.
Our Porto office is located in Portugal’s vibrant second city, known for its history and its creative yet cosy environment. We welcome new ideas, and a large number of our people are based here. From Account Management to Technology and Product, whatever your skills are, you’ll find your fit here. You can have an informal meeting in the treehouse or play the piano in your lunch break!
We are looking for someone to join the Data Engineering team, helping to maintain and improve the BI architecture and tools.
What you’ll do
- Design and build scalable & reliable data pipelines (ETLs) for our data platform;
- Constantly evolve data models & schema design of our Data Warehouse to support self-service needs;
- Work cross functionally with various teams, creating solutions that deal with large volumes of data;
- Work with the team to set and maintain standards and development practices;
- Be a keen advocate of quality and continuous improvement.
Who you are
- Have 5+ years of experience building and maintaining data pipelines with a custom or commercial ETL tool (SSIS, Talend, Informatica, ...)
- Have 5+ years in a Data Warehouse environment with varied forms of data infrastructure, including relational databases, Hadoop, and column stores
- Be an expert in SQL
- Be proficient in creating and evolving dimensional data models & schema designs to improve the accessibility of data and provide intuitive analytics
- Have basic knowledge of the Hadoop/Big Data ecosystem (HDFS, Hive, ...)
- Be familiar with continuous delivery principles: version control, unit and automated testing
- Have skills in one of the following programming languages: C#, Java, Python
- Have 2+ years working with a BI reporting tool (Tableau, QlikView, PowerBI, Looker, …)
- Be fluent in English, both written and spoken
- Have good analytical and problem-solving skills, the ability to work in a fast-moving operational environment, and an enthusiastic, positive attitude