
AI & Analytics Engineer

January 2026

Job Description

We're looking for a seasoned AI & Analytics Engineer who can architect and build enterprise-scale data systems that power intelligent decision-making. If you thrive on solving complex technical challenges at the intersection of AI/ML, analytics, and data engineering, this role is for you.

At TACo, we're building HYDRA, our AI-powered operational and sustainability intelligence platform serving property and infrastructure customers across Africa. We need someone who can build AI and analytics capabilities on top of a robust data engineering foundation, turning millions of data points into intelligent automation and actionable insights.

This isn't just about connecting APIs or running notebooks. This is about building AI-powered features, architecting data pipelines, creating analytics capabilities, and designing scalable data systems that deliver real business value. You'll be the technical authority on AI, analytics, and data engineering, working directly with our CTO and CEO to shape our technical architecture.

You'll be part of a lean, high-agency startup team where your work directly impacts customers making better decisions about their properties and infrastructure.

Position Details

Position: AI & Analytics Engineer

Hours: Full-time position on a long-term fixed-term contract

Reporting: Reports to the CTO, works closely with the CEO on strategic decisions

Salary Range: R75,000 - R95,000 (gross per month)

Location: Johannesburg (Fourways office 3x per week); remote considered for exceptional candidates

Current Tech Stack

Frontend: Vue.js, Vuetify, Tailwind CSS, libraries and custom visualisations, Mapbox.js

Backend: C# .NET, Razor MVC, Azure Functions, Entity Framework

Data: SQL, NoSQL databases

Cloud: Azure

Languages: Python, C#, JavaScript, Dart (Flutter)

In this role, you'll work primarily with data engineering tools and frameworks, with opportunities to influence and evolve our stack.

What you'll do

Data Architecture & Engineering

  • Design and build HYDRA's data intelligence and analytics layer
  • Implement ontology and semantic data models for complex operational data
  • Architect scalable data pipelines that handle millions of records daily from diverse sources: telemetry data, WhatsApp, field agents, and third-party systems
  • Optimize data storage, retrieval, and processing for enterprise performance

Analytics & Data Intelligence

  • Build advanced analytics capabilities that power HYDRA's operational and sustainability insights
  • Design and implement analytics pipelines that process and structure data for intelligent decision-making
  • Create data models that enable sophisticated analysis across operational, energy, water, and security domains
  • Develop analytics features that drive customer value and differentiation

AI/ML Systems Development

  • Build AI-powered features (predictive analytics, smart automation, anomaly detection)
  • Leverage AI tooling and foundation model APIs (Claude, GPT, etc.) for both development acceleration and product features
  • Implement ML models in production (not just notebooks - real systems)
  • Create rules engines for automated insights and decision support
  • Develop intelligent data processing that learns and adapts

Collaboration & Technical Leadership

  • Collaborate with our Lead Platform Developer on integrations and customer-facing features
  • Partner with the CTO and CEO on strategic product direction and technical architecture
  • Set technical standards for data engineering, analytics, and AI/ML work

About you

Critical Skills (Must-have)

7-10 years of experience building scalable data systems and analytics platforms in production environments

Deep expertise in data engineering: data pipelines, data modelling, storage optimisation, and ETL processes

Strong analytics background: designing analytics systems that turn data into actionable insights

Autonomous worker who thrives in lean, high-agency startup environments with a "figure it out" mentality

Excellent technical communicator who can explain complex concepts to both engineers and business stakeholders

Important Skills

AI/ML experience: built and deployed models in production (not just notebooks)

Cloud infrastructure experience with Azure

Experience with distributed systems patterns: caching, queues, event-driven architecture

Ontology and semantic data modelling experience

API architecture for enterprise-scale applications

Experience with rules engines and business logic systems

Nice to have

Experience with semantic web technologies and knowledge graphs

Background in industrial IoT, property tech, or infrastructure analytics

Experience building multi-tenant SaaS platforms

Contributions to open-source data engineering or analytics projects

Experience working in African tech ecosystems

Passion for using technology to solve sustainability and operational challenges

Dealbreakers

This role is not a fit if:

You have only frontend or basic backend experience (we need deep data/analytics expertise)

You have no experience with data engineering, pipelines, or analytics systems

You need hand-holding or detailed specs (we're a startup - you'll need to figure things out)

You're not comfortable with ambiguity or rapid iteration

Your First 90 Days

Month 1: Understand, Assess & Design

  • Deep dive into HYDRA's existing data, AI, and analytics architecture, including SQL, Azure AI Search, Azure OpenAI, and ingestion pipelines (IoT, WhatsApp, media, documents).
  • Understand the current ontology-driven template system and how structured operational data is captured, indexed, and analysed across domains (security, drones, workforce, assets, compliance).
  • Audit data quality, pipeline reliability, embedding strategies, semantic search performance, and technical debt across the intelligence layer.
  • Work closely with product, engineering, and customers to understand how analytics and AI insights are consumed and where current limitations exist.
  • Map end-to-end data flows from raw ingestion through semantic enrichment, ontology alignment, and downstream analytics outputs.
  • Propose improvements and extensions to the AI semantic architecture, including how templates, fields, and ontologies should scale across new data types and customers.

Month 2–3: Build, Embed & Deliver

  • Design and implement production-ready semantic and analytics pipelines using Azure OpenAI, Azure AI Search, and HYDRA's ontology framework (an illustrative sketch of this kind of pipeline follows this list).
  • Build AI-driven semantic layers for additional data modalities, including media, documents, and video, enabling unified querying and analysis across structured and unstructured data.
  • Extend the ontology system to support both Level 1 outputs (direct, structured extraction and summarisation) and Level 2 outputs (deeper analysis, correlations, trends, and derived insights).
  • Deliver at least one customer-facing AI or analytics capability that demonstrates clear operational or business value (for example, automated reports, semantic search, or cross-domain analysis).
  • Define and implement best practices for embeddings, indexing, semantic ranking, data modelling, and AI evaluation.
  • Document architecture, standards, and patterns so the intelligence layer is robust, maintainable, and extensible by the wider team.
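
To make this concrete, here is a minimal, illustrative sketch of the kind of semantic indexing and retrieval flow described above, using Azure OpenAI embeddings and Azure AI Search. It is a sketch under assumptions, not HYDRA's actual pipeline: the environment variables, the embedding deployment name, the "hydra-ops" index, and its id/content/embedding fields are hypothetical placeholders, and the vector index is assumed to already exist.

```python
# Illustrative sketch only - endpoints, keys, deployment and index names below
# are placeholders, not HYDRA's real configuration.
import os

from openai import AzureOpenAI
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from azure.search.documents.models import VectorizedQuery

# Azure OpenAI client used to generate embeddings for operational records.
aoai = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_KEY"],
    api_version="2024-02-01",
)

# Azure AI Search client pointed at a (hypothetical) pre-created vector index.
search = SearchClient(
    endpoint=os.environ["AZURE_SEARCH_ENDPOINT"],
    index_name="hydra-ops",  # assumed index with id, content, embedding fields
    credential=AzureKeyCredential(os.environ["AZURE_SEARCH_KEY"]),
)

def embed(texts: list[str]) -> list[list[float]]:
    """Embed a batch of texts with an Azure OpenAI embedding deployment."""
    resp = aoai.embeddings.create(model="text-embedding-3-small", input=texts)
    return [item.embedding for item in resp.data]

# 1. Index a few operational records together with their embeddings.
records = [
    {"id": "evt-001", "content": "Pump station 4 flow rate dropped 30% overnight."},
    {"id": "evt-002", "content": "Guard patrol missed two checkpoints on the east perimeter."},
]
vectors = embed([r["content"] for r in records])
search.upload_documents(
    documents=[{**r, "embedding": v} for r, v in zip(records, vectors)]
)

# 2. Hybrid query: keyword search plus vector similarity over the same index.
query = "unexpected drop in water flow"
results = search.search(
    search_text=query,
    vector_queries=[
        VectorizedQuery(vector=embed([query])[0], k_nearest_neighbors=5, fields="embedding")
    ],
    top=5,
)
for hit in results:
    print(hit["id"], hit["@search.score"])
```

In a real implementation, the record schema and field names would come from HYDRA's ontology-driven templates rather than the ad hoc dictionaries shown here.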

Success by Day 90

  • Shipped at least one production AI or analytics feature that customers actively use.
  • Established a scalable, ontology-driven semantic and analytics foundation that supports structured data, media, documents, and video.
  • Defined a clear technical roadmap for HYDRA's intelligence layer, including deeper Level 2 analytical capabilities.
  • Become the internal go-to person for AI, semantic modelling, and analytics architecture across the company.

Self Assessment

Have you built data engineering systems that process millions of records daily?

Have you designed and implemented analytics platforms that drive business decisions?

Have you built and deployed AI/ML features in production systems?

Can you architect scalable data pipelines and optimize data storage for performance?

Are you comfortable being the technical authority in your domain?

Does building AI-powered analytics for sustainability/operations excite you?

Can you work autonomously in a lean startup with high agency?

What We Offer

Technical Impact - Architect core data and analytics systems - your decisions shape the platform

Hard Problems - Work on genuinely challenging technical problems: ontology, analytics at scale, ML in production

Real-World Impact - Power sustainability and operational decisions across Africa's property and infrastructure sectors

Startup Velocity - Fast-moving environment with ownership, autonomy, and learning

Technical Authority - Be THE expert on AI, analytics, and data systems - shape technical strategy with the CTO

Flexibility - 3 office days per week (Fourways); remote considered for exceptional candidates

Elite Team - Work with ambitious, capable people who value thoughtful technical design

Growth Opportunity - Join a funded startup at an inflection point with clear growth trajectory

About TACo

The Awareness Company (TACo) is a South African AI startup building HYDRA: AI for the physical world. We're creating the operational intelligence platform that helps property and infrastructure portfolios make better decisions through live data, smart analytics, and automated insights.

We serve B2B customers across Africa: property investors, asset managers, and infrastructure operators, helping them optimize operations, improve sustainability performance, and unlock financial value from their physical assets. From energy and water management to security and workforce optimization, HYDRA turns operational data into strategic advantage.

Our positioning: While most AI companies focus on digital-first use cases, we're tackling the harder problem: bringing intelligence to the physical world of buildings, infrastructure, and industrial operations. We combine IoT integration, data engineering, and AI/ML to create a platform that works in the real world, not just in the cloud.

Why now? We're scaling our technical foundation to serve enterprise customers across the continent. You'll be joining at an inflection point where your technical decisions have outsized impact.

If you answered yes to most of the self-assessment questions above, we want to talk. Apply now to architect the future of AI-powered analytics at TACo!