Google Expands Try-On Tool To Shoes: Set To Roll Out in Australia, Canada & Japan

Google has expanded its AI-powered virtual try-on tool to include shoes. The tool lets users visualise how different shoes would look on them using just a full-body photo. Previously limited to the U.S., the feature is now rolling out to Australia, Canada, and Japan.

Online shopping has always had one major flaw: you can’t see how something actually looks on you. You can browse photos, check reviews, and guess your size using charts and augmented reality (AR), but that doesn’t replace actually trying things on.

Last year, Google launched its virtual try-on tool for clothes (shirts, dresses, skirts, etc.), and now it is expanding to footwear. Covering sneakers, heels, and boots, the tool helps answer the timeless question: “Will these shoes actually suit me?”

Even better, Google isn’t keeping it limited to the U.S. In the coming weeks, the company will expand the feature to shoppers in Australia, Canada, and Japan, making it a global fashion tool powered by AI. This move is significant for both consumers and retailers. 

Here’s everything you need to know, how it works, why it matters, what challenges lie ahead, and what this means for the future of e-commerce.

How Google’s Virtual Try-On Works

Google’s try-on experience uses advanced AI to merge real images with product visuals in a realistic way, and it’s far simpler for users than most AR experiences.

1. A Photo Is All You Need

To try something on, you simply select a product listing in Google Shopping, tap “Try It On”, and upload a full-length photo of yourself. Within seconds, Google uses AI/image generation models to overlay or “replace” the footwear (or clothing) in that image.

It aligns the result with the photo’s perspective, depth, and lighting, so no fully interactive 3D avatar needs to be built for each user, which makes the feature more scalable and user-friendly.

2. AI Understands Depth and Shape

The new technology doesn’t just paste an image on top. Google’s AI model detects the pose, depth, shape, and lighting in your photo. It accurately maps how the shoes should sit on your feet and how shadows should appear.

This depth-aware rendering makes the final image look remarkably natural. Within moments, you’ll see what you might look like in those heels or sneakers.
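Google has not published its rendering pipeline, but the final compositing step the article describes can be illustrated with a toy sketch. The example below alpha-blends a “shoe” patch into a photo region and scales its brightness to match the surrounding pixels; the mask and patch location are hard-coded assumptions here, whereas a real system would derive them from pose and depth estimation.

```python
import numpy as np

def composite_shoe(photo, shoe, mask, top_left):
    """Alpha-blend a shoe patch into a photo, matching local brightness.

    photo:    H x W x 3 float array (0..1), the user's full-body photo
    shoe:     h x w x 3 float array (0..1), the product image
    mask:     h x w float array (0..1), 1 where the shoe is opaque
    top_left: (row, col) where the patch goes (in practice, found by
              pose/depth estimation; hard-coded here for illustration)
    """
    r, c = top_left
    h, w = shoe.shape[:2]
    region = photo[r:r + h, c:c + w]

    # Match the patch's mean brightness to the target region so the
    # shoe picks up the photo's lighting instead of looking pasted on.
    gain = region.mean() / max(shoe.mean(), 1e-6)
    lit_shoe = np.clip(shoe * gain, 0.0, 1.0)

    # Standard alpha compositing: shoe where mask=1, photo elsewhere.
    alpha = mask[..., None]
    out = photo.copy()
    out[r:r + h, c:c + w] = alpha * lit_shoe + (1 - alpha) * region
    return out

# Tiny demo: a dark 8x8 "photo" and a bright 4x4 "shoe" patch.
photo = np.full((8, 8, 3), 0.2)
shoe = np.full((4, 4, 3), 0.9)
mask = np.ones((4, 4))
result = composite_shoe(photo, shoe, mask, (4, 2))
```

Even this crude version shows why lighting matters: without the brightness match, the bright product photo would visibly float on top of the darker user photo.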

3. Machine Learning and Image Generation

Experts suggest that the tool is “powered by a custom image generation model”, likely combining image-warping models (to fit the shoe to your unique pose) with generative diffusion techniques (to fill in missing details and adjust lighting).

This type of hybrid method is common in state-of-the-art “virtual try-on” systems like DualFit or DM-VTON, designed to merge product realism with human authenticity.
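As an illustration of the warping half of such a hybrid (the diffusion half is far heavier and learned end-to-end), here is a minimal inverse-mapped affine warp in plain NumPy. Real try-on systems use learned, non-rigid warps, so this is only a sketch of the idea: for each output pixel, look up where it came from in the source image.

```python
import numpy as np

def affine_warp(img, matrix, out_shape):
    """Warp a 2D image with an affine transform, using inverse mapping:
    for each output pixel, look up its source pixel in the input image.
    """
    inv = np.linalg.inv(matrix)
    h, w = out_shape
    out = np.zeros(out_shape, dtype=img.dtype)
    ys, xs = np.mgrid[0:h, 0:w]
    coords = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    src = inv @ coords  # homogeneous source (x, y) for every output pixel
    sx = np.round(src[0]).astype(int)
    sy = np.round(src[1]).astype(int)
    valid = (0 <= sx) & (sx < img.shape[1]) & (0 <= sy) & (sy < img.shape[0])
    out[ys.ravel()[valid], xs.ravel()[valid]] = img[sy[valid], sx[valid]]
    return out

# Demo: translate a small "shoe" patch by (x+2, y+1) into a larger canvas.
shoe = np.arange(16, dtype=float).reshape(4, 4)
shift = np.array([[1, 0, 2], [0, 1, 1], [0, 0, 1]], dtype=float)
warped = affine_warp(shoe, shift, (6, 6))
```

A pose-conditioned try-on model replaces the fixed matrix with a dense, per-pixel flow field predicted from the photo, but the inverse-mapping principle is the same.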

4. Connected to Google’s Shopping Graph

The try-on tool integrates with Google’s Shopping Graph, a live database of more than 50 billion product listings (with metadata such as reviews, colors, and availability) from across the web.

This means every try-on is tied directly to real products, prices, and retailers, giving users instant access to buy what they see.

Google’s AI Mode (launched in 2025) lets users navigate shopping with generative assistance: when to buy, product suggestions, price tracking, and inspiration visuals. Thus, when you see shoes (or clothes) in a listing, the “Try It On” button may appear. Upload your image, and the system responds within moments with a preview.
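Google’s Shopping Graph API is not public in this form, but the linkage the article describes can be modeled simply: a try-on preview carries a pointer back into the product catalog, so it always resolves to live price, retailer, and stock data. All names and fields below are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Listing:
    """One catalog entry (illustrative fields only)."""
    product_id: str
    title: str
    price_usd: float
    retailer: str
    in_stock: bool

@dataclass
class TryOnResult:
    """A generated preview, tied back to the live listing by id."""
    product_id: str
    preview_url: str

# A toy slice of the catalog, keyed by product id.
catalog = {
    "sku-123": Listing("sku-123", "Retro Runner Sneaker", 89.99,
                       "ExampleShoes", True),
}

def resolve(result, catalog):
    """Follow a try-on result back to its live listing."""
    return catalog[result.product_id]

preview = TryOnResult("sku-123", "https://example.com/previews/abc.png")
listing = resolve(preview, catalog)
```

Keeping the preview and the listing joined by an id, rather than baking price or availability into the generated image, is what lets the “buy what you see” step stay current.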

The Shoe Addition: Why It’s Harder (and Why It Matters)

Adding shoes to virtual try-on is a complex leap for Google. Let’s see why it’s tougher and why it matters for shoppers and sellers.

Here are the top challenges for footwear:

  • Small scale and fine detail
    Shoes are smaller than clothes and often contain intricate detail (laces, textures, heel contours), so any misalignment is more perceptible.
  • Angles and perspective
    Legs and feet can appear at tricky angles, and the shoe overlay must convincingly convey depth (toe versus heel).
  • Lighting and texture realism
    From glossy leather to mesh sneakers, the model must preserve fine detail and consistent lighting.
  • Occlusion and overlap
    Pants, skirts, or shadows can hide parts of the shoes, and the AI must decide which parts to blend or reconstruct.
  • Foot vs shoe distinction
    Many users’ photos already show real shoes; the model must convincingly remove, replace, or mask the existing footwear.

Overcoming these hurdles shows that Google’s model has matured. 

If it can make shoes look realistic on millions of diverse user photos, it’s ready for large-scale global deployment.

Why It Matters for Shoppers and Sellers

  • Lower return rates: Footwear has one of the highest online return rates. Seeing shoes “on your feet” could cut that dramatically.
  • Confidence and style experimentation: Shoppers may be more willing to try bolder colors or designs when they can preview them virtually.
  • Social sharing: Google reports that users share their try-on results far more often than normal product listings, creating organic buzz.
  • Brand differentiation: Retailers using this feature can stand out with interactive, engaging listings that boost conversions.

Global Expansion: From the U.S. to the World

So far, Google’s try-on tool has been available only in the United States, but the company is now taking it international.

New Markets

In the coming weeks, the clothing and footwear try-on features will reach Australia, Canada, and Japan. This rollout marks the start of a broader global expansion strategy.

Local Adaptation

Each country brings unique challenges:

  • Diverse body types and cultural standards: What looks “stylish” or acceptable varies by culture.
  • Local inventory & catalog coverage: To support try-on, the target markets must have local retailers.
  • Image/photo norms and quality: Users in different countries may have different norms in how they photograph themselves.
  • Regulation, privacy, and data security: Image upload features raise concerns over how long user photos are stored.
  • Bandwidth, latency, and compute constraints: Image generation must be fast and seamless.

Google will likely expand to European and Asian markets next.

What It Means for Users and Retailers

For Consumers
  • More confidence: Pick styles that suit your aesthetic without second-guessing.
  • Reduced guesswork: Being able to “see” how the shoes or clothes look on you reduces guesswork.
  • Easy to use: No measuring, scanning, or new apps to download. Just upload one photo.
  • Shareable images: Users can share try-on images of their virtual looks for opinions or fun.
  • Personalised style: Users might experiment more boldly with outfits or shoe styles they wouldn’t try otherwise. 

For Retailers and Brands

  • Higher engagement & conversion: Interactive visuals can boost time-on-page, CTRs, and the likelihood of purchase.
  • Better data & attribution: Google can provide insights (e.g., which shoe styles had high try-on rates or conversion ratios) to brands.
  • Fewer returns or disappointment costs: More accurate visualisation means fewer disappointed buyers.
  • New creative ads & formats: Brands might run “try-on-first” ad campaigns: interactive experiences rather than static banners.
  • Better for smaller sellers: Local or regional sellers can benefit from the immersive presentation without building their own AR infrastructure.

Use Cases, Scenarios & Tips for Users

How to Use Google’s Try-On Tool for Shoes
  • Search for a shoe on Google Shopping or within Search.
  • Tap the “Try It On” option on eligible product listings.
  • Upload a full-length photo of yourself (well-lit, clear, and standing pose).
  • Within seconds, you’ll see what those shoes look like on you.
  • You can save, share, or compare your try-on results before buying the product.
Tips for Best Results
  • Use a well-lit environment (natural light and less harsh shadows).
  • Stand in a straight-on pose against a simple background.
  • Keep your legs and feet visible in the photo.
  • Avoid existing footwear that might confuse the overlay.
  • Try different poses if the option is available for multiple angles.

What’s Next for Google Shopping

This move into shoes is likely just the beginning. Future expansions could include:

  • Accessories (bags, hats, jewelry)
  • Eyewear and watches
  • Home decor try-ons (products in your environment)
  • Real-time AR experiences

The long-term goal is clear: merge AI creativity, personal data, and e-commerce into a single, seamless visual shopping journey from discovery to checkout.

Google’s decision to add shoes to its try-on feature and to expand globally signals a major evolution: it bridges the gap between digital browsing and real-world experience, giving shoppers unprecedented confidence in what they buy.

By using advanced AI to generate realistic previews, Google is transforming how people interact with products online. As the company continues expanding this feature to more markets and product categories, the line between shopping and experiencing will grow increasingly thin.

Soon, shopping online might feel almost indistinguishable from shopping in person, and all you’ll need is a photo.

Need a fresh perspective? Let’s talk.

At 360 OM, we specialise in helping businesses take their marketing efforts to the next level. Our team stays on top of industry trends, uses data-informed decisions to maximise your ROI, and provides full transparency through comprehensive reports.

Get Your Performance Marketing Audit
Unlock the growth of your digital marketing strategy.