Analytics Engineer at Watu Credit Limited
- Kenya
- Permanent
- Full-time
- Build & Deploy Pipelines: Design, build, and deploy robust data ingestion pipelines that continuously feed the Data Warehouse with raw data from diverse sources, using cloud-hosted ingestion tools.
- Data Transformation: Create and maintain efficient processes to transform raw data into clean, organized, and analyst-ready datasets using dbt, SQL, and Google Cloud tools (Dataflow, Datastream).
- Data Quality & Security: Act as the guardian of the Analytics data, taking full responsibility for data quality, consistency, and the implementation of strict security protocols.
- Governance Implementation: Define and implement data governance rules to ensure data integrity and compliance across the organization.
- Warehouse Administration: Manage the administration of the Data Warehouse, ensuring optimal performance, organization, and accessibility.
- Tooling & Infrastructure: Implement, develop, and maintain the necessary Analytics Engineering tools and infrastructure to support the wider data team.
- Architecture Support: Actively assist in defining and evolving the data architecture to ensure it remains scalable and efficient as the business grows.
- AI Development & Automation: Work on the company's initial AI initiatives by developing custom agents, tools, and intelligent workflows that leverage our data foundation to automate complex business processes.
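The transformation responsibility above (turning raw ingested rows into clean, analyst-ready datasets) can be sketched in plain Python. The schema (`loan_id`, `amount`, `created_at`), the cleaning rules, and the dedup policy are illustrative assumptions, not Watu's actual data model; in practice this logic would live in dbt/SQL models over warehouse tables.

```python
from datetime import datetime, timezone

def clean_loan_records(raw_rows):
    """Turn raw ingested rows into an analyst-ready, deduplicated dataset.

    Hypothetical example schema: each row is a dict with loan_id, amount,
    and an ISO-8601 created_at timestamp. Rules applied here:
    drop rows missing a key or with malformed fields, normalize timestamps
    to UTC, and keep only the latest record per loan_id.
    """
    cleaned = {}
    for row in raw_rows:
        loan_id = row.get("loan_id")
        if not loan_id:
            continue  # drop rows with no primary key
        try:
            amount = float(row["amount"])
            created = datetime.fromisoformat(row["created_at"])
        except (KeyError, ValueError):
            continue  # drop malformed rows
        if created.tzinfo is None:
            created = created.replace(tzinfo=timezone.utc)  # normalize to UTC
        prev = cleaned.get(loan_id)
        if prev is None or created > prev["created_at"]:
            # simple dedup rule: latest record per loan_id wins
            cleaned[loan_id] = {
                "loan_id": loan_id,
                "amount": round(amount, 2),
                "created_at": created,
            }
    return sorted(cleaned.values(), key=lambda r: r["loan_id"])
```

The same keep-latest-per-key pattern maps directly onto a dbt incremental model with a `ROW_NUMBER()` window in SQL.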
- Experience: At least 3 years of proven experience working in a Data Engineering or Back-end Engineering role.
- Core Languages: Advanced proficiency in SQL and strong coding skills in Python.
- Data Warehousing: A deep understanding of modern Data Warehouse technologies, architectural patterns, and industry best practices.
- Data Operations: Expertise with the latest tools and processes for data ingestion, transformation, and management (ETL/ELT).
- Cloud Stack: Hands-on experience with Google Cloud solutions, specifically Cloud Storage, BigQuery, Datastream, and Dataflow.
- Modern Transformation: Practical experience with dbt (data build tool).
- AI Engineering Concepts: Familiarity with modern AI patterns, specifically vector databases for retrieval, retrieval-augmented generation (RAG) for managing context, and equipping LLMs with tools to perform actions and interact with external APIs.
- Big Data: Experience working with non-relational databases or Big Data technologies.
- Data Streaming: Familiarity with data streaming analytics and real-time data processing.
- Version Control: Proficiency with Git.
- Polyglot: Knowledge of programming languages beyond Python and SQL.
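The vector-retrieval piece of the AI requirement above can be illustrated with a minimal in-memory sketch: score documents by cosine similarity against a query vector and return the top matches. In a real RAG setup the embeddings would come from an embedding model and live in a managed vector database, and the retrieved texts would be placed in the LLM's context window; both are assumed away here, and all vectors and document names are made up for illustration.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length numeric vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def retrieve(query_vec, corpus, top_k=2):
    """Return the top_k document texts most similar to the query vector.

    `corpus` is a list of (text, embedding) pairs -- a stand-in for a
    vector database index keyed by precomputed embeddings.
    """
    scored = sorted(
        corpus,
        key=lambda doc: cosine_similarity(query_vec, doc[1]),
        reverse=True,
    )
    return [text for text, _ in scored[:top_k]]
```

For example, `retrieve([1.0, 0.0], corpus)` ranks documents whose embeddings point in the same direction as the query highest, regardless of vector length.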
- Precision: Rigorous attention to detail, with a high standard for data quality and accuracy.
- Autonomy: The ability to work independently and proactively; you anticipate problems before they happen.
- Drive: You are a self-starter and target-oriented, capable of managing your own roadmap to meet delivery goals.
- Collaboration: A dedicated team player and good communicator, able to bridge the gap between technical complexity and business needs.