Data Engineer opening in São Paulo - SP - Liv Up
Data Engineer, posted on February 26, 2019
Liv Up is the leading direct-to-consumer healthy food company in Brazil. We have been changing the way people relate to food by delivering the best customer experience in the food industry. By making healthy food easy and tasty while offering a transparent and humanized service, we are helping an ever-growing number of people build better eating habits. From our office in the Pinheiros region of São Paulo, we have been growing over 15% month over month since our launch in March 2016, and we are backed by Kaszek Ventures (Latin America's top VC firm).
Why work here
• Work in an informal and cooperative environment that will push your personal and professional growth
• Be part of a high-impact project that affects lives throughout the whole food system (from end customers to local organic farmers and everyone in between)
• Work in a horizontal, transparent team that not only values your ideas but expects you to actively build the company with us
• Have autonomy and responsibility
• Learn a lot
Data Analytics at Liv Up
A core part of our strategy is data analytics. Our data team is responsible for driving business decisions throughout the entire organization and for developing state-of-the-art data products. To achieve our mission, we centralize our data in a single integrated data warehouse that feeds our business and operations algorithms as well as our business intelligence tools. Our data warehouse is a PostgreSQL database integrating data from 10+ sources. We ETL data using Stitch, custom Python scripts, and Pentaho Data Integration transformations. On top of this infrastructure, we visualize data using Tableau and Metabase, and we integrate data products with our systems through APIs.

Data Engineer responsibilities:
• Design our data warehouse architecture. Understand business and operations data needs and plan our architecture to support every decision.
• Integrate different sources of data. We gather data from internal sources (MongoDB) as well as external sources (through APIs and other methods). We expect you to orchestrate ETL processes that keep our data warehouse updated in near real time.
• Ensure the service level of the algorithms and dashboards that support business and operations.
• Research and develop new processes and technologies to build a state-of-the-art data analytics stack.
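To give a flavor of the kind of ETL work described above, here is a minimal Python sketch of an extract-transform-load step. This is a hypothetical illustration, not Liv Up's actual pipeline: the sample order documents stand in for records pulled from MongoDB, the `orders` table and its field names are invented for the example, and the standard-library `sqlite3` module stands in for the PostgreSQL warehouse so the snippet is self-contained.

```python
import sqlite3

# Hypothetical source documents, standing in for records extracted from MongoDB.
SOURCE_DOCS = [
    {"_id": "a1", "customer": "Ana", "total_cents": 4500},
    {"_id": "b2", "customer": "Bruno", "total_cents": 7200},
]

def transform(doc):
    """Flatten a source document into a relational row (cents -> currency units)."""
    return (doc["_id"], doc["customer"], doc["total_cents"] / 100)

def load(rows, conn):
    """Upsert transformed rows into the warehouse table, so re-runs are idempotent."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (id TEXT PRIMARY KEY, customer TEXT, total REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

# sqlite3 stands in for the PostgreSQL warehouse in this sketch.
conn = sqlite3.connect(":memory:")
load([transform(d) for d in SOURCE_DOCS], conn)
print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # → 2
```

In a production pipeline, an orchestrator such as Airflow would schedule the extract, transform, and load steps and handle retries; the upsert keyed on the document id is what lets repeated runs refresh the warehouse without duplicating rows.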
We need you to:
• Be motivated by hard challenges
• Have strong coding skills in Python (or another scripting language)
• Have full command of SQL, data architecture, and relational data models
• Know ETL processes and technologies (Airflow, Luigi, Pentaho Data Integration, Talend)