Find N Keep Talent

Python Developer (Data heavy)

UrbanBite Analytics | Singapore (Tanjong Pagar / Central) | Hybrid
Type: Full-time | Level: Mid–Senior | Salary: S$5,000 – S$8,000 per month

About the role

UrbanBite Analytics is the data & insights arm of a fast-growing Singapore F&B group operating cafes, a central kitchen and a handful of quick-service restaurants across the island. We turn POS, inventory and delivery telemetry into actionable dashboards, demand forecasts and automation that reduce waste and improve margins.

The Python Developer will focus on end-to-end data engineering: designing ETL pipelines, processing high-volume time-series and transaction data, producing clean datasets for analysts and ML models, and helping deploy monitoring and CI/CD for data workflows. You will work closely with ops managers, supply chain and product teams to translate business needs into robust technical solutions.

This role suits someone who enjoys both hands-on coding and system design, and who can move comfortably between pandas/Spark jobs, SQL optimisation and containerised deployment. We offer a hybrid working model with a central office in Tanjong Pagar, frequent collaboration with restaurant operations, and opportunities to shape analytics that directly affect kitchen efficiency and menu decisions.

About UrbanBite Analytics

UrbanBite Analytics supports a multi-concept F&B group (cafes, quick-service restaurants and a central kitchen) by building data products for forecasting, stock optimisation and outlet performance. Small cross-functional team, product-driven, with strong operational exposure.

What you can expect

  • Direct impact on multi-outlet operations and procurement savings
  • Hybrid work (3 days office / 2 days remote)
  • Central office in Tanjong Pagar with easy MRT access
  • Opportunities to work on forecasting, real-time analytics and ML pipelines

Key responsibilities

  • Design, build and maintain robust ETL/ELT pipelines in Python to ingest POS, inventory and delivery data
  • Author and optimise complex SQL queries and data models for analytical consumption
  • Implement batch and near-real-time processing (e.g. Kafka streaming or scheduled jobs) as needed
  • Package data processes into containerised services (Docker) and support deployments with CI/CD
  • Instrument metrics, logging and monitoring for data jobs; respond to incidents and data quality issues
  • Collaborate with product managers, operations and supply chain to translate business requirements into technical designs
  • Produce clean, documented datasets and APIs for analysts and ML engineers
  • Participate in code reviews, maintain coding standards and create technical documentation

Requirements

  • 3+ years professional experience writing production Python for data pipelines or backend services
  • Strong SQL skills with experience optimising queries on analytical databases (Postgres, Redshift, BigQuery or similar)
  • Hands-on experience with at least one workflow/orchestration tool (Airflow, Prefect) or demonstrated scheduling approach
  • Familiarity with data processing libraries (pandas) and at least one distributed processing framework (Spark, Dask) is preferred
  • Experience with containerisation (Docker) and basic CI/CD pipelines; comfortable with Git
  • Knowledge of cloud platforms (AWS, GCP or Azure) and related data services
  • Able to work flexibly across business hours to coordinate with operations teams and support deployments
  • Clear communication skills and ability to translate business problems into data solutions

Benefits

  • Hybrid work arrangement (3 days in office, 2 days remote)
  • Competitive monthly salary with performance bonus
  • Medical insurance and annual leave above statutory minimum
  • Staff meal allowances and occasional outlet discounts
  • Training/professional development budget and conference support
  • Central office with MRT access; occasional travel to outlets/central kitchen

Work schedule

Typical week: 5 days per week (core business hours), hybrid with occasional extended hours for deployments or data incidents.

  • Typical office hours: 09:30–18:30 (flexible start within core hours)
  • Occasional early/late shifts for rollout windows or site visits
  • On-call rotation for critical pipeline incidents (infrequent; planned)

How to apply

Send your CV, a short note about your relevant data projects, and links to your GitHub or portfolio to [email protected], indicating your earliest available start date.

