Kavod Technologies
Claire Belle: Building a Beauty Marketplace with AR Try-On

Aisha Okafor

January 5, 2026 · 12 min read
Product Engineering Lead

Aisha bridges product and engineering at Kavod, designing systems that balance technical elegance with user delight.

A New Standard for Beauty Tech in Africa

The global beauty industry generates over $500 billion annually, yet the technology powering online beauty shopping was built primarily for lighter skin tones and Western facial feature distributions. When we began developing Claire Belle, we set out to build an AR-powered beauty marketplace that works flawlessly for the full spectrum of African skin tones, facial structures, and beauty traditions.

This post covers the technical architecture behind Claire Belle's AR try-on experience, our product recommendation engine, and the vendor platform that connects thousands of beauty brands with millions of consumers.

AR Face Mapping for Diverse Skin Tones

The core innovation in Claire Belle is our face mapping and product rendering pipeline. Traditional AR beauty filters notoriously fail on darker skin tones -- lipstick colors appear washed out, foundation shades look ghostly, and contouring effects are invisible. We rebuilt the entire pipeline from the ground up.

Face Detection and Landmark Extraction

We use a custom face detection model trained on a dataset of over 2 million facial images spanning the Fitzpatrick I-VI skin tone scale, with deliberate overrepresentation of types IV-VI. Our 468-point facial landmark model captures:

  • Lip geometry with sub-millimeter precision, including the vermilion border that defines lip shape
  • Eye contour mapping for eyeshadow, liner, and lash placement
  • Facial bone structure estimation for contouring and highlight simulation
  • Skin texture analysis at the pixel level for realistic product blending
These signals are packaged into a single mesh result that the rendering layer consumes:

interface FaceMeshResult {
  landmarks: Float32Array;        // 468 x 3 (x, y, depth)
  skinToneClassification: {
    fitzpatrick: number;           // 1-6
    undertone: "warm" | "cool" | "neutral";
    hexSample: string;             // median skin color
  };
  faceRegions: {
    lips: Polygon;
    leftEye: Polygon;
    rightEye: Polygon;
    cheekLeft: Polygon;
    cheekRight: Polygon;
    forehead: Polygon;
    jawline: Polygon;
    nose: Polygon;
  };
  lightingEstimate: {
    direction: Vec3;
    intensity: number;
    colorTemperature: number;
  };
}
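As a toy illustration of how a field like hexSample could be derived, here is a minimal sketch that takes a region's already-extracted pixels (a hypothetical input shape, not our actual pipeline code) and computes a median skin color:

```python
import numpy as np

def median_skin_hex(region_pixels: np.ndarray) -> str:
    """Return the median color of a face region as a hex string.

    `region_pixels` is an (N, 3) uint8 RGB array sampled from inside a
    detected region polygon. The median is robust to stray specular
    highlight and shadow pixels that would skew a simple mean.
    """
    r, g, b = np.median(region_pixels, axis=0).astype(int)
    return f"#{r:02x}{g:02x}{b:02x}"

# Example: a mostly mid-brown patch with a few bright highlight pixels
patch = np.array([[141, 85, 36]] * 97 + [[255, 255, 255]] * 3, dtype=np.uint8)
print(median_skin_hex(patch))  # -> "#8d5524"
```

The median keeps the highlight pixels from biasing the sampled tone, which matters when estimating skin color under uneven lighting.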

Physics-Based Rendering for Cosmetics

Simply overlaying a color on detected face regions produces cartoonish results. Real cosmetics interact with skin in complex ways -- lipstick has specular highlights, foundation has varying coverage and finish (matte, dewy, satin), and powder products scatter light differently than cream products.

We developed a physically based cosmetic rendering model that simulates:

  • Subsurface scattering: Light penetrates the top layer of skin and scatters before exiting, giving skin its characteristic translucency. Our renderer accounts for this when applying sheer products like tinted moisturizers.
  • Specular reflection: Glossy lip products and highlighters produce specular highlights that must respond to the lighting environment detected in the camera feed.
  • Texture mixing: Products do not replace skin texture; they sit on top of it. Our blending model preserves the underlying skin texture (pores, fine lines, natural sheen) while applying the product's color, opacity, and finish properties.

The result is an AR try-on that looks genuinely realistic, even on video calls and in varying lighting conditions.
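To make the texture-mixing idea concrete, here is a minimal NumPy sketch (not our production shaders, which run in OpenGL ES) of a texture-preserving blend: split the skin patch into a base tone and a high-frequency detail layer, tint only the base, then restore the detail:

```python
import numpy as np

def apply_product(skin: np.ndarray, product_rgb, opacity: float) -> np.ndarray:
    """Texture-preserving product blend (illustrative sketch).

    Splits an RGB skin patch into a base tone (here, simply the patch
    mean) and a high-frequency detail layer (pores, fine lines, sheen).
    The product color is mixed into the base only; the detail is added
    back on top, so the product tints the skin without erasing texture.
    """
    skin = skin.astype(np.float32)
    base = skin.mean(axis=(0, 1), keepdims=True)   # low-frequency tone
    detail = skin - base                           # skin texture layer
    mixed = (1.0 - opacity) * base + opacity * np.asarray(product_rgb, np.float32)
    return np.clip(np.rint(mixed + detail), 0, 255).astype(np.uint8)
```

At opacity 0 the patch is returned unchanged, and at full opacity the product color replaces the base tone while the pixel-level variation survives -- the property that keeps sheer and full-coverage products from looking painted on.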

Training Data and Bias Mitigation

Building an inclusive face mapping system required a deliberate, structured approach to data:

  • We partnered with beauty schools and cosmetics professionals across Nigeria, Kenya, South Africa, Ghana, and Senegal to collect annotated facial imagery with proper consent and compensation.
  • Every model is evaluated against fairness metrics that measure accuracy parity across skin tone groups. If lip detection accuracy for Fitzpatrick VI drops below our threshold, the model does not ship.
  • We run monthly bias audits using held-out evaluation sets, and our model cards document performance breakdowns by demographic group.
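A ship/no-ship gate of the kind described above can be sketched as follows -- the group labels, 0.95 floor, and 0.02 gap here are illustrative placeholders, not our production thresholds:

```python
def passes_fairness_gate(per_group_accuracy: dict[str, float],
                         floor: float = 0.95,
                         max_gap: float = 0.02) -> bool:
    """Every skin tone group must clear an absolute accuracy floor, and
    the spread between the best- and worst-served groups must stay
    within max_gap. Either violation blocks the release."""
    worst = min(per_group_accuracy.values())
    best = max(per_group_accuracy.values())
    return worst >= floor and (best - worst) <= max_gap

eval_results = {"I-II": 0.981, "III-IV": 0.978, "V": 0.972, "VI": 0.969}
print(passes_fairness_gate(eval_results))  # True: floor met, gap 0.012
```

Checking the gap as well as the floor matters: a model can clear an absolute threshold for every group and still serve some groups noticeably worse than others.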

The Product Recommendation Engine

Finding the right shade of foundation or the perfect lipstick color online is notoriously difficult. Claire Belle's recommendation engine combines AR try-on data with collaborative filtering to solve this.

Shade Matching

When a user tries on a foundation virtually, our system:

  1. Extracts their skin tone from multiple face regions (forehead, cheek, jawline, neck) to account for natural variation
  2. Maps the extracted tone into a standardized color space calibrated against physical Pantone skin tone swatches
  3. Queries our product database for foundations, concealers, and tinted products whose shade profiles match within a configurable delta-E threshold

This process runs in real time on-device, so users see accurate shade recommendations as they browse products.
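The shade query in step 3 can be sketched like this, assuming shades are stored as CIE Lab triples. For brevity the sketch uses the simple CIE76 delta-E (Euclidean distance in Lab); production shade matchers often use the more perceptually uniform CIEDE2000 formula, which is considerably more involved:

```python
import math

def delta_e76(lab1, lab2) -> float:
    """CIE76 color difference: Euclidean distance in Lab space."""
    return math.dist(lab1, lab2)

def matching_shades(user_lab, catalog: dict[str, tuple], threshold: float = 3.0):
    """Return catalog shades within `threshold` delta-E of the user's
    tone, closest first. A delta-E around 2-3 is near the limit of what
    most observers can distinguish."""
    hits = [(delta_e76(user_lab, lab), name) for name, lab in catalog.items()]
    return [name for d, name in sorted(hits) if d <= threshold]

# Hypothetical shade entries for illustration
catalog = {
    "Deep Ebony 540": (32.0, 12.0, 18.0),
    "Rich Mahogany 520": (34.5, 11.0, 17.5),
    "Golden Honey 330": (58.0, 14.0, 30.0),
}
print(matching_shades((33.0, 11.5, 18.0), catalog))
# -> ['Deep Ebony 540', 'Rich Mahogany 520']
```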

Collaborative Filtering with Visual Features

Beyond shade matching, we use a hybrid recommendation model that combines:

  • Visual similarity: Products tried on and saved by the user are embedded in a visual feature space, and similar products are surfaced.
  • Purchase co-occurrence: Users who bought product A also frequently bought product B -- classic collaborative filtering adapted for beauty verticals.
  • Skin-type compatibility: User-reported and algorithmically-detected skin type (oily, dry, combination, sensitive) filters recommendations to compatible formulations.
  • Trend signals: We track trending products within geographic and demographic cohorts to surface culturally relevant recommendations.
In simplified form, the ranking pipeline that combines these signals looks like this:

class BeautyRecommender:
    def __init__(self):
        self.visual_encoder = ProductVisualEncoder(dim=256)
        self.cf_model = NeuralCollaborativeFilter(n_users=2_000_000, n_items=85_000)
        self.skin_filter = SkinCompatibilityFilter()

    def recommend(self, user: UserProfile, context: BrowsingContext) -> list[Product]:
        visual_candidates = self.visual_encoder.similar_to(user.saved_products, k=200)
        cf_candidates = self.cf_model.predict(user.id, k=200)
        merged = self.merge_and_rank(visual_candidates, cf_candidates, context)
        filtered = self.skin_filter.apply(merged, user.skin_profile)
        return filtered[:50]

The Vendor Platform

Claire Belle is not just a consumer app -- it is a full marketplace connecting beauty brands with consumers. Our vendor platform handles everything from product listing to fulfillment.

Vendor Onboarding

Brands upload their product catalog with structured metadata:

  • Product images: We require a minimum of four angles plus swatch images on multiple skin tones
  • Shade data: We provide brands with a calibration tool that maps their shade range into our standardized color space, ensuring accurate AR try-on
  • Ingredient lists: Parsed and indexed for allergy filtering and skin-type compatibility
  • Pricing and inventory: Real-time sync with vendor inventory systems via our REST API or CSV bulk upload

Analytics Dashboard

Vendors get access to a rich analytics suite showing:

  • Try-on rates per product: How many users virtually tried the product
  • Try-to-cart conversion: The percentage of try-ons that resulted in add-to-cart
  • Shade distribution: Which shades are most tried and most purchased, informing production decisions
  • Geographic demand heatmaps: Where interest is concentrated, helping vendors plan distribution
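The two headline funnel metrics compose straightforwardly from raw event counts; a minimal sketch (field names are illustrative, not our dashboard schema):

```python
def funnel_metrics(views: int, try_ons: int, add_to_carts: int) -> dict:
    """Per-product funnel rates: what share of viewers tried the product
    on, and what share of try-ons led to an add-to-cart."""
    return {
        "try_on_rate": try_ons / views if views else 0.0,
        "try_to_cart": add_to_carts / try_ons if try_ons else 0.0,
    }

m = funnel_metrics(views=10_000, try_ons=2_400, add_to_carts=312)
print(m)  # {'try_on_rate': 0.24, 'try_to_cart': 0.13}
```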

Fulfillment Network

We built a distributed fulfillment network with warehouses in Lagos, Nairobi, Johannesburg, and Accra, with last-mile delivery partnerships in 8 countries. Vendors can fulfill orders themselves or opt into our Fulfillment by Claire Belle program for hands-off logistics.

Performance and Scale

The AR pipeline needs to run at 30fps on mid-range Android devices that constitute the majority of our user base. We achieved this through:

  • Model quantization: INT8 quantization of our face mesh model reduces inference time by 3x with negligible accuracy loss
  • GPU shader optimization: Product rendering runs as custom OpenGL ES shaders, offloading the CPU for UI responsiveness
  • Progressive loading: Products load a low-resolution preview instantly, then progressively enhance as the full texture atlas downloads
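To see why INT8 quantization costs so little accuracy, here is a toy round-trip using symmetric per-tensor quantization -- our actual pipeline uses the toolchain's quantizer, but the error bound is the same idea: each weight survives the int8 round-trip with error at most half the quantization step:

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Symmetric per-tensor quantization: map the largest-magnitude
    weight to +/-127 and round everything else to the nearest step."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
weights = rng.normal(0, 0.1, size=(256,)).astype(np.float32)
q, scale = quantize_int8(weights)
err = np.abs(dequantize(q, scale) - weights).max()
print(f"max reconstruction error: {err:.5f} (step = {scale:.5f})")
```

Because the reconstruction error is bounded by half a step, a network whose activations are not hypersensitive to small weight perturbations degrades very little, while int8 arithmetic runs far faster on mobile NPUs and DSPs.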

Since launch, Claire Belle has facilitated over 50 million virtual try-ons and processed 2.3 million orders. Our try-on-to-purchase conversion rate is 4.2x higher than standard product photography alone, validating the thesis that realistic AR experiences drive commerce.

Looking Ahead

We are actively developing video try-on for social sharing, letting users record short clips wearing virtual products and share them to social media or with friends for opinions. We are also expanding into hair color and nail art AR categories, each with their own rendering challenges and training data requirements.

Claire Belle proves that when you build beauty tech with African consumers as the primary design target rather than an afterthought, you create something fundamentally better for everyone.

#ar #beauty #marketplace #claire-belle #computer-vision





Kavod Technologies Limited © 2026. All rights reserved.
