- Location
- Curitiba, Brazil
- Last Published
- Dec. 12, 2024
- Sector
- Fintech
- Function
- Data Science
đź‘€ Who are we?
Let’s Swile for a fulfilling work environment 🚀
Swile is the first employee super-app that offers a unified, personalized and modern experience that strengthens engagement at work!
But it's also a smart card that brings together all your benefits: reinvented meal vouchers, gift vouchers to spoil your employees all year round, and a mobility benefit to reduce your carbon footprint.
By combining the best of human and technological approaches, Swile is committed to driving the ongoing transformation of the labor market, positioning itself as a leader in worktech.
Welcome to Swile! 🎉 Innovation is our heartbeat! Join our team to sculpt exceptional products that redefine employee benefits and worktech, bringing daily joy to our users – that's the Swile Touch! ✨
Responsibilities:
Translate business needs into data solutions and products aligned with the data platform strategy.
Negotiate an impact-driven data strategy with stakeholders.
Mentor Swilers on data usage and techniques to improve data literacy and support data decentralization.
Develop ETL/ELT routines and data pipelines.
Implement CI/CD routines for data quality (illustrated in the sketch after this list).
Create, manage, and document data models using dbt (Data Build Tool) to ensure data quality and efficiency.
Use Metabase to create visual dashboards and reports that provide actionable insights for business teams.
Design and implement efficient data storage solutions to support analytics, including the use of Snowflake.
Collaborate with cross-functional teams to understand business needs and translate them into clear technical requirements.
Monitor the integrity and performance of data systems and make necessary adjustments.
Ensure compliance with data security and privacy policies.
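To make the ETL/ELT and data-quality responsibilities above concrete, here is a minimal, hedged sketch in Python (an illustration, not Swile's actual codebase): a post-load quality check against Snowflake using the snowflake-connector-python package. The table name (RAW_ORDERS), the warehouse/database/schema names, and the environment variables are all hypothetical.

```python
import os

import snowflake.connector  # pip install snowflake-connector-python


def run_quality_check() -> None:
    """Fail loudly if the hypothetical RAW_ORDERS table is empty
    or contains NULL order ids after a load."""
    conn = snowflake.connector.connect(
        user=os.environ["SNOWFLAKE_USER"],         # hypothetical env vars
        password=os.environ["SNOWFLAKE_PASSWORD"],
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        warehouse="ANALYTICS_WH",                  # hypothetical names
        database="ANALYTICS",
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        cur.execute(
            "SELECT COUNT(*), COUNT_IF(order_id IS NULL) FROM RAW_ORDERS"
        )
        total_rows, null_ids = cur.fetchone()
        if total_rows == 0:
            raise ValueError("RAW_ORDERS is empty: upstream load likely failed")
        if null_ids > 0:
            raise ValueError(f"RAW_ORDERS has {null_ids} rows with NULL order_id")
    finally:
        conn.close()


if __name__ == "__main__":
    run_quality_check()
```

A check like this typically runs as a CI/CD step or as a pipeline task right after a load, so broken data is caught before it reaches dashboards.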
Required Qualifications:
Bachelor’s degree in Computer Science, Engineering, or a related field.
Intermediate/Advanced English proficiency.
Advanced experience in data modeling with dbt (Data Build Tool).
Proven experience as an Analytics Engineer or Data Engineer.
Experience in the analytics workflow (understanding business requirements, data transformation, quality checks, building pipelines, analyses, and dashboards).
Advanced SQL knowledge with a proven track record, plus experience with relational and non-relational databases (see the example after this list).
Hands-on experience with data integration tools like Stitch.
Ability to create intuitive reports and dashboards using Metabase (Looker or Tableau experience is a plus).
Strong analytical skills with the ability to collect, organize, analyze, and disseminate meaningful information.
Excellent communication and collaboration skills to work effectively with multidisciplinary teams.
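As a hedged illustration of the "advanced SQL" bar (my example, not from the posting): the kind of window-function query this role writes routinely, kept as a Python constant so pipeline code can reuse it. The orders table and its columns are hypothetical.

```python
# Illustrative only: running revenue per customer, computed with a
# window function over an aggregated query. Table and column names
# are hypothetical.
MONTHLY_RUNNING_REVENUE = """
SELECT
    customer_id,
    DATE_TRUNC('month', ordered_at) AS order_month,
    SUM(amount)                     AS monthly_revenue,
    SUM(SUM(amount)) OVER (
        PARTITION BY customer_id
        ORDER BY DATE_TRUNC('month', ordered_at)
    )                               AS running_revenue
FROM orders
GROUP BY customer_id, order_month
ORDER BY customer_id, order_month
"""
```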
Preferred Qualifications:
Knowledge of programming languages such as Python.
Experience with data orchestrators such as Airflow (see the sketch after this list).
Knowledge of AWS data stack.
Infrastructure as Code (IaC) with Terraform.
Relevant certifications in data engineering or data analysis.
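And a sketch of how the preferred skills could fit together (assumptions: Airflow 2.4+, the dbt CLI installed on the worker, and a hypothetical project path): a small Airflow DAG that builds dbt models daily and only runs tests once the build succeeds.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical location of the dbt project on the Airflow worker.
DBT_DIR = "/opt/airflow/dbt"

with DAG(
    dag_id="daily_dbt_build",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",          # the `schedule` argument requires Airflow 2.4+
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"dbt run --project-dir {DBT_DIR} --profiles-dir {DBT_DIR}",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"dbt test --project-dir {DBT_DIR} --profiles-dir {DBT_DIR}",
    )

    dbt_run >> dbt_test  # run tests only after the models build successfully
```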