Python - 3000 euro
- Financial / Accounting
- Argeș, RO
About you:
You are a seasoned Data Engineer, who is comfortable with both PySpark and SQL.
You have strong communication skills, form good relationships with colleagues, and enjoy working with people.
You are a pragmatic problem solver with a bias to action and an appetite for getting things done.
You are passionate about data and the ever-evolving technology that is used to leverage it. You keep up to date with the latest trends and tools in Data Engineering.
You work with a DevOps mindset and have experience working in agile teams.
Day-to-day, you will:
Write Python (mainly PySpark) and SQL code to access and transform data.
Use modern data technologies like Azure Databricks and Fivetran to build high-quality data products.
Design data models, considering factors like performance, ease of use and data security.
Adhere to and influence our DevOps engineering practices.
Work independently or with other data professionals on projects.
Build great relationships with stakeholders across the business; understand their needs and translate them into technical tasks.
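As a flavour of the "access and transform data" work described above, here is a minimal sketch. It uses Python's standard-library sqlite3 purely for illustration; in the role itself this kind of transformation would typically be written in PySpark or SQL against Databricks tables, and all table and column names here are hypothetical.

```python
import sqlite3

# Hypothetical example: aggregate raw transactions into a per-account
# summary. In practice this would run as PySpark/SQL on Databricks,
# not sqlite3; the pattern (read raw data, apply a SQL transform) is
# what the role description refers to.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (account TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?)",
    [("A", 100.0), ("A", -25.0), ("B", 40.0)],
)

# SQL transformation: total balance per account.
rows = conn.execute(
    "SELECT account, SUM(amount) AS balance FROM transactions "
    "GROUP BY account ORDER BY account"
).fetchall()
print(rows)  # [('A', 75.0), ('B', 40.0)]
```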
Your Knowledge and Skills will include:
Python, particularly PySpark.
SQL and databases.
Lakehouses, preferably Databricks.
Cloud providers, preferably Azure.
DevOps practices, including version control, testing and CI/CD.
Agile software development.
Data modelling.
You may have experience in some of the following:
Data visualisation tools like Tableau, Looker or PowerBI.
Infrastructure as code.